[TOC]
Least Recently Used (LRU):
Advantages:
- Easy Implementation
- Efficient Use of Cache
- Adaptability
Disadvantages:
- Strict Ordering
- Cold Start Issues
- Memory Overhead
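These trade-offs are easiest to see in code: the policy keeps a strict recency ordering and pays a small per-entry bookkeeping cost for it. A minimal sketch in Python, assuming this block describes Least Recently Used (the class and its interface are illustrative, not from the original text):

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: evicts the least recently used entry when full."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._data: OrderedDict = OrderedDict()  # insertion order = recency order

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict the least recently used entry
```

The `OrderedDict` is the "memory overhead" the disadvantage list refers to: every entry carries linked-list bookkeeping beyond the cached value itself.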
Least Frequently Used (LFU):
Advantages:
- Adaptability to Varied Access Patterns
- Optimized for Long-Term Trends
- Low Memory Overhead
Disadvantages:
- Sensitivity to Initial Access
- Difficulty in Handling Changing Access Patterns
- Complexity of Frequency Counters
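A small sketch makes the frequency-counter bookkeeping, and its weaknesses, visible. This assumes the block above describes Least Frequently Used; the linear-scan eviction is for illustration only (production LFU implementations use more elaborate structures, and ties here are broken arbitrarily):

```python
from collections import defaultdict

class LFUCache:
    """Minimal LFU sketch: evicts the key with the lowest access count."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._data = {}
        self._freq = defaultdict(int)  # the per-key frequency counters

    def get(self, key):
        if key not in self._data:
            return None
        self._freq[key] += 1
        return self._data[key]

    def put(self, key, value):
        if key not in self._data and len(self._data) >= self.capacity:
            # O(n) scan for the least-frequently-used victim (sketch only)
            victim = min(self._data, key=self._freq.__getitem__)
            del self._data[victim]
            del self._freq[victim]
        self._data[key] = value
        self._freq[key] += 1
```

Note how a key that piled up a high count early stays hard to evict even after its popularity fades, which is the "difficulty in handling changing access patterns" listed above.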
First In, First Out (FIFO):
Advantages:
- Simple Implementation
- Predictable Behavior
- Memory Efficiency
Disadvantages:
- Lack of Adaptability
- Inefficiency in Handling Variable Importance
- Cold Start Issues
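Both the simplicity and the blindness to item importance follow from evicting purely in insertion order. A minimal sketch, assuming this block describes FIFO (names are illustrative):

```python
from collections import OrderedDict

class FIFOCache:
    """Minimal FIFO sketch: evicts in insertion order, ignoring access order."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key):
        return self._data.get(key)  # no reordering on access: pure FIFO

    def put(self, key, value):
        if key not in self._data and len(self._data) >= self.capacity:
            self._data.popitem(last=False)  # evict the oldest insertion
        self._data[key] = value
```

Heavy use of an entry does not protect it, which is the "inefficiency in handling variable importance" above.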
Random Replacement (RR):
Advantages:
- Simplicity
- Avoids Biases
- Low Overhead
Disadvantages:
- Suboptimal Performance
- No Adaptability
- Possibility of Poor Hit Rates
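Random replacement needs almost no bookkeeping, which is exactly why its hit rates can be poor. A minimal sketch, assuming this block describes random replacement (the seeded RNG is only there to make behavior reproducible):

```python
import random

class RandomCache:
    """Minimal random-replacement sketch: evicts a uniformly random entry."""

    def __init__(self, capacity: int, seed=None):
        self.capacity = capacity
        self._data = {}
        self._rng = random.Random(seed)

    def __len__(self):
        return len(self._data)

    def get(self, key):
        return self._data.get(key)

    def put(self, key, value):
        if key not in self._data and len(self._data) >= self.capacity:
            victim = self._rng.choice(list(self._data))  # no usage signal at all
            del self._data[victim]
        self._data[key] = value
```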
Caching APIs can significantly improve performance in system design by addressing several key factors:
- Faster Data Retrieval
- Reduced Database Load
- Minimized Network Latency
- Enhanced Throughput
- Improved User Experience
- Resource Optimization
- Decreased API Rate Limiting
- Scalability
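Several of the factors above (faster retrieval, reduced backend load, rate-limit relief) can be realized with a simple time-to-live cache in front of an expensive API call. A hedged sketch; the decorator name, TTL value, and `fetch_user` function are illustrative, not from the source:

```python
import functools
import time

def ttl_cache(ttl_seconds=60.0):
    """Decorator sketch: memoizes a call for ttl_seconds, so repeat
    requests inside the window never reach the slow backend."""
    def decorator(fn):
        store = {}  # args -> (result, fetched_at)

        @functools.wraps(fn)
        def wrapper(*args):
            entry = store.get(args)
            if entry and time.monotonic() - entry[1] < ttl_seconds:
                return entry[0]            # cache hit: no backend call
            result = fn(*args)             # cache miss: call the API/backend
            store[args] = (result, time.monotonic())
            return result
        return wrapper
    return decorator
```

Anything wrapped this way answers repeat requests from memory until the entry ages out, at which point one request refreshes it for everyone.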
Caching APIs reduces server load in system design through several mechanisms:
- Serving Repeat Requests from Cache
- Decreasing Database Queries
- Reducing Computational Work
- Handling Spikes in Traffic
- Efficient Use of Resources
- Enhanced System Stability and Reliability
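The load reduction is easy to quantify: only cache misses reach the origin, so backend traffic scales with (1 − hit ratio). A small illustration with hypothetical numbers:

```python
def backend_requests(total_requests: int, hit_ratio: float) -> int:
    """Only cache misses fall through to the origin server or database."""
    return round(total_requests * (1 - hit_ratio))

# Hypothetical example: at 100,000 requests/min and a 95% cache hit
# ratio, the backend serves only 5,000 requests/min.
```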
Client-Side Caching:
Benefits:
- Reduces server load by storing responses directly on the client.
- Decreases latency since the data is fetched from the client's local storage.
Use Cases:
- Static assets like images, CSS, and JavaScript files.
- API responses that change infrequently, such as user profile data.
Server-Side Caching:
Benefits:
- Reduces the need to recompute responses for repeated requests.
- Can handle a large number of requests efficiently.
Use Cases:
- Frequently accessed data like product catalogs or news feeds.
- API responses that are resource-intensive to generate.
CDN/Edge Caching:
Benefits:
- Caches responses at the network edge, reducing latency and load on the origin server.
- Improves response times for end-users.
Use Cases:
- Publicly accessible APIs with high traffic volumes.
- Content delivery networks (CDNs) for static and dynamic content.
Distributed Caching:
Benefits:
- Spreads the cache across multiple nodes, improving scalability and fault tolerance.
- Maintains data availability in the event that a node fails.
Use Cases:
- Large-scale applications with significant amounts of data to cache.
- Systems requiring high availability and reliability.
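Distributed caches commonly spread keys across nodes with consistent hashing, so a failed node remaps only its own share of the keys rather than reshuffling everything. A simplified sketch; node names and the virtual-node count are arbitrary:

```python
import bisect
import hashlib

class ConsistentHashRing:
    """Sketch of consistent hashing: each node owns arcs of a hash ring,
    so adding or removing a node only remaps that node's keys."""

    def __init__(self, nodes, vnodes=100):
        self._ring = []  # sorted list of (hash, node)
        for node in nodes:
            # Virtual nodes smooth out the key distribution across nodes.
            for i in range(vnodes):
                self._ring.append((self._hash(f"{node}#{i}"), node))
        self._ring.sort()

    @staticmethod
    def _hash(s: str) -> int:
        return int(hashlib.md5(s.encode()).hexdigest(), 16)

    def node_for(self, key: str):
        """Walk clockwise from the key's hash to the next node on the ring."""
        h = self._hash(key)
        idx = bisect.bisect(self._ring, (h, "")) % len(self._ring)
        return self._ring[idx][1]
```

Clients hash each cache key onto the ring and talk directly to the owning node, which is how the cache scales horizontally.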
Application-Level Caching:
Benefits:
- Customizable caching strategies based on application logic.
- Can be integrated directly into the application code.
Use Cases:
- Specific parts of an application that require fine-grained control over caching.
- Scenarios where data validity and freshness need to be closely managed.
Database Caching:
Benefits:
- Offloads database queries, improving database performance.
- Can cache query results or specific database rows.
Use Cases:
- Frequently queried database tables.
- Complex queries that require significant computation.
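The usual pattern here is cache-aside: check the cache, fall back to the database on a miss, then populate the cache. A sketch keyed on the query text; class and function names are illustrative, and a real system would also handle TTLs and invalidation on writes:

```python
class QueryCache:
    """Cache-aside sketch for database reads."""

    def __init__(self, run_query):
        self._run_query = run_query  # function: sql -> rows
        self._cache = {}             # sql -> cached rows

    def query(self, sql: str):
        if sql in self._cache:
            return self._cache[sql]  # hit: the database is never touched
        rows = self._run_query(sql)  # miss: execute the real query
        self._cache[sql] = rows
        return rows

    def invalidate(self, sql: str):
        self._cache.pop(sql, None)   # call after writes that change the result
```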
TODO
TODO
TODO
TODO
TODO
Cache Miss Attack: This refers to the scenario where the requested data exists neither in the cache nor in the database. Every such request falls through to the database, defeating the purpose of the cache. If a malicious user issues a large number of queries for such keys, the database can easily be overloaded.
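One common mitigation is negative caching: remember that a key is missing, with a short TTL, so repeated queries for nonexistent keys stop reaching the database (Bloom filters over the set of valid keys are another option). A sketch; the class and parameter names are illustrative:

```python
import time

_MISSING = object()  # sentinel marking a "known missing" key

class NegativeCache:
    """Caches both hits and misses; misses get a short TTL so a burst of
    queries for nonexistent keys cannot hammer the database."""

    def __init__(self, db_lookup, miss_ttl=30.0):
        self._db_lookup = db_lookup  # function: key -> value, or None if absent
        self._store = {}             # key -> (value, expires_at or None)
        self._miss_ttl = miss_ttl

    def get(self, key):
        entry = self._store.get(key)
        if entry is not None:
            value, expires_at = entry
            if expires_at is None or time.monotonic() < expires_at:
                return None if value is _MISSING else value
        value = self._db_lookup(key)
        if value is None:
            # Negative entry: remember the miss briefly.
            self._store[key] = (_MISSING, time.monotonic() + self._miss_ttl)
            return None
        self._store[key] = (value, None)
        return value
```

The short TTL matters: if the key is later created, the stale "missing" answer expires quickly.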