This document provides comprehensive guidelines for writing and running tests for the Cachier package.
- Test Suite Overview
- Test Structure
- Running Tests
- Writing Tests
- Test Isolation
- Backend-Specific Testing
- Parallel Testing
- CI/CD Integration
- Troubleshooting
The Cachier test suite is designed to comprehensively test all caching backends while maintaining proper isolation between tests. The suite uses pytest with custom markers for backend-specific tests.
- Memory: In-memory caching (no external dependencies)
- Pickle: File-based caching using pickle (default backend)
- MongoDB: Database caching using MongoDB
- Redis: In-memory data store caching
- SQL: SQL database caching via SQLAlchemy (PostgreSQL, SQLite, MySQL)
- Core Functionality: Basic caching operations (get, set, clear)
- Stale Handling: Testing the `stale_after` parameter
- Concurrency: Thread-safety and multi-process tests
- Error Handling: Exception scenarios and recovery
- Performance: Speed and efficiency tests
- Integration: Cross-backend compatibility
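To illustrate the stale-handling category, here is a minimal sketch of a `stale_after` test (the function names are illustrative; a zero timedelta makes every cached entry immediately stale, which forces recomputation on each call):

```python
import datetime

from cachier import cachier


def test_stale_after_recomputes():
    """A zero stale_after should force recomputation on every call."""
    calls = []

    @cachier(stale_after=datetime.timedelta(seconds=0))
    def tracked(x):
        calls.append(x)
        return x + 1

    tracked.clear_cache()
    assert tracked(1) == 2
    # The first cache entry is already stale, so this call recomputes.
    assert tracked(1) == 2
    assert len(calls) == 2
    tracked.clear_cache()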
```
tests/
├── conftest.py                 # Shared fixtures and configuration
├── requirements.txt            # Base test dependencies (includes pytest-rerunfailures)
├── requirements_mongodb.txt    # MongoDB-specific test dependencies
├── requirements_redis.txt      # Redis-specific test dependencies
├── requirements_postgres.txt   # PostgreSQL/SQL-specific test dependencies
│
├── test_*.py                   # Backend-agnostic test modules
├── mongo_tests/                # MongoDB-specific tests
│   └── test_mongo_core.py
├── sql_tests/                  # SQL-specific tests
│   └── test_sql_core.py
├── test_redis_core.py          # Redis backend tests
├── test_memory_core.py         # Memory backend tests
├── test_pickle_core.py         # Pickle backend tests
├── test_general.py             # Cross-backend tests
└── ...
```
Tests are marked with backend-specific markers:
```python
@pytest.mark.mongo   # MongoDB tests
@pytest.mark.redis   # Redis tests
@pytest.mark.sql     # SQL tests
@pytest.mark.memory  # Memory backend tests
@pytest.mark.pickle  # Pickle backend tests
@pytest.mark.maxage  # Tests involving stale_after functionality
@pytest.mark.flaky   # Flaky tests that should be retried (see Flaky Tests section)
```

```bash
# Run all tests
pytest

# Run tests for a specific backend
pytest -m mongo
pytest -m redis
pytest -m sql

# Run tests for multiple backends
pytest -m "mongo or redis"

# Exclude specific backends
pytest -m "not mongo"

# Run with verbose output
pytest -v
```

The recommended way to run tests with proper backend setup:
```bash
# Test a single backend
./scripts/test-local.sh mongo

# Test multiple backends
./scripts/test-local.sh mongo redis sql

# Test all backends
./scripts/test-local.sh all

# Run tests in parallel
./scripts/test-local.sh all -p

# Keep containers running for debugging
./scripts/test-local.sh mongo redis -k
```

Tests can be run in parallel using pytest-xdist:
```bash
# Run with automatic worker detection
./scripts/test-local.sh all -p

# Specify number of workers
./scripts/test-local.sh all -p -w 4

# Or directly with pytest
pytest -n auto
pytest -n 4
```

```python
import pytest

from cachier import cachier


def test_basic_caching():
    """Test basic caching functionality."""
    # Define a cached function local to this test
    @cachier()
    def expensive_computation(x):
        return x**2

    # First call - should compute
    result1 = expensive_computation(5)
    assert result1 == 25

    # Second call - should return from cache
    result2 = expensive_computation(5)
    assert result2 == 25

    # Clear cache for cleanup
    expensive_computation.clear_cache()
```

```python
@pytest.mark.mongo
def test_mongo_specific_feature():
    """Test MongoDB-specific functionality."""
    from tests.test_mongo_core import _test_mongetter

    @cachier(mongetter=_test_mongetter)
    def mongo_cached_func(x):
        return x * 2

    # Test implementation
    assert mongo_cached_func(5) == 10
```

**Never share cachier-decorated functions between test functions.** Each test must have its own decorated function to ensure proper isolation.
Cachier identifies cached functions by their full module path and function name. When tests share decorated functions:
- Cache entries can conflict between tests
- Parallel test execution may fail unpredictably
- Test results become non-deterministic
```python
# Good: each test defines its own cached function
def test_feature_one():
    @cachier()
    def compute_one(x):  # Unique to this test
        return x * 2

    assert compute_one(5) == 10


def test_feature_two():
    @cachier()
    def compute_two(x):  # Different function for a different test
        return x * 2

    assert compute_two(5) == 10
```

```python
# DON'T DO THIS!
@cachier()
def shared_compute(x):  # Shared between tests
    return x * 2


def test_feature_one():
    assert shared_compute(5) == 10  # May conflict with test_feature_two


def test_feature_two():
    assert shared_compute(5) == 10  # May conflict with test_feature_one
```

- Pickle Backend: Uses the `isolated_cache_directory` fixture, which creates unique directories per pytest-xdist worker
- External Backends: Rely on function namespacing (module + function name)
- Clear Cache: Always clear the cache at test end for cleanup
- Define cached functions inside test functions
- Use unique, descriptive function names
- Clear cache after each test
- Avoid module-level cached functions in tests
- Use fixtures for common setup/teardown
```python
@pytest.mark.mongo
def test_mongo_feature():
    """Test with MongoDB backend."""
    @cachier(mongetter=_test_mongetter, wait_for_calc_timeout=2)
    def mongo_func(x):
        return x

    # MongoDB-specific assertions
    assert mongo_func.get_cache_mongetter() is not None
```

```python
@pytest.mark.redis
def test_redis_feature():
    """Test with Redis backend."""
    @cachier(backend="redis", redis_client=_test_redis_client)
    def redis_func(x):
        return x

    # Redis-specific testing
    assert redis_func(5) == 5
```

```python
@pytest.mark.sql
def test_sql_feature():
    """Test with SQL backend."""
    @cachier(backend="sql", sql_engine=test_engine)
    def sql_func(x):
        return x

    # SQL-specific testing
    assert sql_func(5) == 5
```

```python
@pytest.mark.memory
def test_memory_feature():
    """Test with memory backend."""
    @cachier(backend="memory")
    def memory_func(x):
        return x

    # Memory-specific testing
    assert memory_func(5) == 5
```

- pytest-xdist creates multiple worker processes
- Each worker gets a subset of tests
- Cachier's function identification ensures natural isolation
- Pickle backend uses worker-specific cache directories
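The per-worker directory idea can be sketched roughly like this. This is an illustrative helper, not the actual `isolated_cache_directory` fixture from `conftest.py`; pytest-xdist exposes the worker id via the `PYTEST_XDIST_WORKER` environment variable:

```python
import os
import tempfile


def worker_cache_dir(base=None):
    """Return a cache directory unique to the current pytest-xdist worker.

    PYTEST_XDIST_WORKER is set to e.g. "gw0", "gw1" inside xdist workers;
    outside parallel runs it is unset, so we fall back to "main".
    """
    base = base or tempfile.gettempdir()
    worker = os.environ.get("PYTEST_XDIST_WORKER", "main")
    path = os.path.join(base, f"cachier_test_cache_{worker}")
    os.makedirs(path, exist_ok=True)
    return path
```

Because each worker resolves to a distinct directory, pickle cache files written by one worker can never collide with another's.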
```bash
# Automatic worker detection
./scripts/test-local.sh all -p

# Specify workers
./scripts/test-local.sh all -p -w 4

# Direct pytest command
pytest -n auto
```

- Resource Usage: More workers mean more CPU/memory usage
- External Services: Ensure Docker has sufficient resources
- Test Output: May be interleaved; use `-v` for clarity
- Debugging: Harder with parallel execution; use `-n 1` for debugging
The CI pipeline runs a matrix job per backend. Each backend uses the commands below:
```bash
# Local backends (memory, pickle, and other non-external tests)
pytest -m "not mongo and not sql and not redis and not s3"

# MongoDB backend
pytest -m mongo

# PostgreSQL/SQL backend
pytest -m sql

# Redis backend
pytest -m redis

# S3 backend
pytest -m s3
```

Note: local tests do not use pytest-xdist (`-n`) in CI. External backends (MongoDB, PostgreSQL, Redis, S3) each run in their own isolated matrix job with the corresponding Docker service started beforehand.

- `CACHIER_TEST_VS_DOCKERIZED_MONGO`: Use real MongoDB in CI
- `CACHIER_TEST_REDIS_HOST`: Redis connection details
- `SQLALCHEMY_DATABASE_URL`: SQL database connection
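For local runs against already-running services, these variables can be exported before invoking pytest. The values below are illustrative placeholders, not required settings; adjust them to match your local containers:

```shell
# Illustrative values only - point these at your local services
export CACHIER_TEST_VS_DOCKERIZED_MONGO=true
export CACHIER_TEST_REDIS_HOST=localhost
export SQLALCHEMY_DATABASE_URL="postgresql://postgres:password@localhost:5432/postgres"
```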
- Import Errors: Install backend-specific requirements

  ```bash
  pip install -r tests/requirements_redis.txt
  ```

- Docker Not Running: Start Docker Desktop or the Docker daemon

  ```bash
  docker ps  # Check if Docker is running
  ```

- Port Conflicts: Stop conflicting services

  ```bash
  docker stop cachier-test-mongo cachier-test-redis cachier-test-postgres
  ```

- Flaky Tests: Usually due to timing issues
  - Increase timeouts
  - Add proper waits
  - Check for race conditions

- Cache Conflicts: Ensure function isolation
  - Don't share decorated functions
  - Clear cache after tests
  - Use unique function names
Some tests, particularly in the pickle core module, may occasionally fail due to race conditions in multi-threaded scenarios. To handle these, we use the pytest-rerunfailures plugin.
```python
@pytest.mark.flaky(reruns=5, reruns_delay=0.1)
def test_that_may_fail_intermittently():
    """This test will retry up to 5 times with a 0.1s delay between attempts."""
    # Test implementation
```

- `test_bad_cache_file`: Tests handling of corrupted cache files with concurrent access
- `test_delete_cache_file`: Tests handling of missing cache files during concurrent operations
These tests involve race conditions between threads that are difficult to reproduce consistently, so they're configured to retry multiple times before being marked as failed.
- Run a Single Test:

  ```bash
  pytest -k test_name -v
  ```

- Disable Parallel Execution:

  ```bash
  pytest -n 1
  ```

- Check Logs:

  ```bash
  docker logs cachier-test-mongo
  ```

- Interactive Debugging:

  ```python
  import pdb

  pdb.set_trace()
  ```
- Test Speed: Memory/pickle tests are fastest
- External Backends: Add overhead for Docker/network
- Parallel Execution: Speeds up test suite significantly
- Cache Size: Large caches slow down tests
- Always define cached functions inside test functions
- Never share cached functions between tests
- Clear cache after each test
- Use appropriate markers for backend-specific tests
- Run full test suite before submitting PRs
- Test with parallel execution to catch race conditions
- Document any special test requirements
- Follow existing test patterns in the codebase
When adding new tests:
- Follow existing naming conventions
- Add appropriate backend markers
- Ensure function isolation
- Include docstrings explaining test purpose
- Test both success and failure cases
- Consider edge cases and error conditions
- Run with all backends if applicable
- Update this documentation if needed
- Check existing tests for examples
- Review the main README.rst
- Open an issue on GitHub
- Contact maintainers listed in README.rst