
Caching Design

[TOC]

caching_arch_example

Eviction Strategies

Least Recently Used (LRU)

lru

Advantages:

  • Easy Implementation
  • Efficient Use of Cache
  • Adaptability

Disadvantages:

  • Strict Ordering
  • Cold Start Issues
  • Memory Overhead
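
The trade-offs above can be seen in a minimal LRU cache built on an ordered map (a sketch for illustration, not a production implementation):

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: evicts the least recently used key when full."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)  # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used
```

The ordered map gives O(1) gets and puts, but the per-entry ordering metadata it maintains is exactly the memory overhead noted above.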

Least Frequently Used (LFU)

lfu

Advantages:

  • Adaptability to Varied Access Patterns
  • Optimized for Long-Term Trends
  • Low Memory Overhead

Disadvantages:

  • Sensitivity to Initial Access
  • Difficulty in Handling Changing Access Patterns
  • Complexity of Frequency Counters
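
A minimal LFU sketch shows where those frequency counters come from (the linear scan for the victim is for clarity; real LFU implementations use frequency buckets for O(1) eviction):

```python
from collections import defaultdict

class LFUCache:
    """Minimal LFU cache: evicts the key with the lowest access count.
    Ties are broken arbitrarily; real implementations often fall back to LRU."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = {}
        self.freq = defaultdict(int)  # the per-key frequency counters

    def get(self, key):
        if key not in self.store:
            return None
        self.freq[key] += 1
        return self.store[key]

    def put(self, key, value):
        if key not in self.store and len(self.store) >= self.capacity:
            victim = min(self.store, key=lambda k: self.freq[k])
            del self.store[victim]
            del self.freq[victim]
        self.store[key] = value
        self.freq[key] += 1
```

The "sensitivity to initial access" disadvantage is visible here: a key that was hot early keeps its high count and survives eviction long after it stops being accessed.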

First-In-First-Out (FIFO)

fifo

Advantages:

  • Simple Implementation
  • Predictable Behavior
  • Memory Efficiency

Disadvantages:

  • Lack of Adaptability
  • Inefficiency in Handling Variable Importance
  • Cold Start Issues
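
A minimal FIFO sketch makes the lack of adaptability concrete: reads never affect eviction order, so a hot key is evicted as soon as it becomes the oldest entry.

```python
from collections import deque

class FIFOCache:
    """Minimal FIFO cache: evicts in insertion order, ignoring access patterns."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = {}
        self.order = deque()  # insertion order only

    def get(self, key):
        return self.store.get(key)  # reads never change eviction order

    def put(self, key, value):
        if key not in self.store:
            if len(self.store) >= self.capacity:
                oldest = self.order.popleft()
                del self.store[oldest]
            self.order.append(key)
        self.store[key] = value
```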

Random Replacement

random_replacement

Advantages:

  • Simplicity
  • Avoids Biases
  • Low Overhead

Disadvantages:

  • Suboptimal Performance
  • No Adaptability
  • Possibility of Poor Hit Rates
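
Random replacement is almost trivial to implement, which is its main appeal; a sketch:

```python
import random

class RandomCache:
    """Minimal random-replacement cache: evicts a uniformly random key when full."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = {}

    def get(self, key):
        return self.store.get(key)

    def put(self, key, value):
        if key not in self.store and len(self.store) >= self.capacity:
            victim = random.choice(list(self.store))  # no recency/frequency bias
            del self.store[victim]
        self.store[key] = value
```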

Caching for APIs

Caching APIs can significantly improve performance in system design by addressing several key factors:

  • Faster Data Retrieval
  • Reduced Database Load
  • Minimized Network Latency
  • Enhanced Throughput
  • Improved User Experience
  • Resource Optimization
  • Decreased API Rate Limiting
  • Scalability

Caching APIs reduces server load in system design through several mechanisms:

  • Serving Repeat Requests from Cache
  • Decreasing Database Queries
  • Reducing Computational Work
  • Handling Spikes in Traffic
  • Efficient Use of Resources
  • Enhanced System Stability and Reliability

Client-Side Caching

client_side_caching

Benefits:

  • Reduces server load by storing responses directly on the client.
  • Decreases latency since the data is fetched from the client's local storage.

Use Cases:

  • Static assets like images, CSS, and JavaScript files.
  • API responses that change infrequently, such as user profile data.
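
A minimal sketch of client-side TTL caching, assuming a `fetch` callable that performs the actual network request (the name is illustrative):

```python
import time

class ClientCache:
    """Client-side TTL cache: serve API responses from local storage until they expire."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self.entries = {}  # key -> (value, fetched_at)

    def get_or_fetch(self, key, fetch):
        entry = self.entries.get(key)
        if entry is not None:
            value, fetched_at = entry
            if time.monotonic() - fetched_at < self.ttl:
                return value            # fresh: no network round-trip
        value = fetch(key)              # stale or missing: hit the server
        self.entries[key] = (value, time.monotonic())
        return value
```

In browsers the same effect is usually achieved declaratively with HTTP `Cache-Control` headers rather than application code.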

Server-Side Caching

server_side_caching

Benefits:

  • Reduces the need to recompute responses for repeated requests.
  • Can handle a large number of requests efficiently.

Use Cases:

  • Frequently accessed data like product catalogs or news feeds.
  • API responses that are resource-intensive to generate.
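
Server-side response caching can be sketched as a TTL memoization decorator around a request handler (the handler name below is illustrative; a real server would also bound the cache size and handle invalidation):

```python
import functools
import time

def cached_response(ttl_seconds):
    """Cache a handler's results for ttl_seconds, keyed by its arguments."""
    def decorator(handler):
        cache = {}  # args -> (result, computed_at)

        @functools.wraps(handler)
        def wrapper(*args):
            now = time.monotonic()
            hit = cache.get(args)
            if hit is not None and now - hit[1] < ttl_seconds:
                return hit[0]          # reuse the stored response
            result = handler(*args)    # recompute only on miss/expiry
            cache[args] = (result, now)
            return result
        return wrapper
    return decorator
```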

Reverse Proxy Caching

reverse_proxy_caching

Benefits:

  • Caches responses at the network edge, reducing latency and load on the origin server.
  • Improves response times for end-users.

Use Cases:

  • Publicly accessible APIs with high traffic volumes.
  • Content delivery networks (CDNs) for static and dynamic content.

Distributed Caching

distributed_caching

Benefits:

  • Spreads the cache across multiple nodes, improving scalability and fault tolerance.
  • Maintains data availability in the event that a node fails.

Use Cases:

  • Large-scale applications with significant amounts of data to cache.
  • Systems requiring high availability and reliability.
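
A common building block for distributed caches is consistent hashing, which keeps most keys mapped to the same node when the node set changes. A minimal sketch (the virtual-node count and hash function are illustrative choices):

```python
import bisect
import hashlib

class ConsistentHashRing:
    """Map keys to cache nodes so that adding or removing a node
    only remaps a small fraction of keys."""

    def __init__(self, nodes, vnodes=100):
        self.ring = []  # sorted list of (hash, node)
        for node in nodes:
            for i in range(vnodes):  # virtual nodes smooth the distribution
                h = self._hash(f"{node}#{i}")
                self.ring.append((h, node))
        self.ring.sort()

    @staticmethod
    def _hash(s):
        return int(hashlib.md5(s.encode()).hexdigest(), 16)

    def node_for(self, key):
        h = self._hash(key)
        # first ring position clockwise from the key's hash
        idx = bisect.bisect(self.ring, (h,)) % len(self.ring)
        return self.ring[idx][1]
```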

Application-Level Caching

application_level_caching

Benefits:

  • Customizable caching strategies based on application logic.
  • Can be integrated directly into the application code.

Use Cases:

  • Specific parts of an application that require fine-grained control over caching.
  • Scenarios where data validity and freshness need to be closely managed.
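
In Python, for example, the standard library's `functools.lru_cache` provides exactly this kind of in-process, application-level caching (the `render_sidebar` function below is a hypothetical example):

```python
import functools

@functools.lru_cache(maxsize=256)
def render_sidebar(user_id: int) -> str:
    # Stand-in for an expensive template render or data aggregation.
    return f"<aside>widgets for user {user_id}</aside>"
```

Because the cache lives in application code, freshness can be managed directly, e.g. by calling `render_sidebar.cache_clear()` when the underlying data changes.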

Database Caching

database_caching

Benefits:

  • Offloads database queries, improving database performance.
  • Can cache query results or specific database rows.

Use Cases:

  • Frequently queried database tables.
  • Complex queries that require significant computation.
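
A minimal sketch of query-result caching with coarse invalidation, using an in-memory SQLite database as a stand-in for a real datastore (real systems typically invalidate per-table or rely on TTLs instead of clearing everything):

```python
import sqlite3

class QueryCache:
    """Cache query results keyed by (sql, params); invalidate on writes."""

    def __init__(self, conn):
        self.conn = conn
        self.results = {}

    def query(self, sql, params=()):
        key = (sql, params)
        if key not in self.results:  # only hit the database on a miss
            self.results[key] = self.conn.execute(sql, params).fetchall()
        return self.results[key]

    def execute_write(self, sql, params=()):
        self.conn.execute(sql, params)
        self.conn.commit()
        self.results.clear()  # coarse invalidation: drop all cached results
```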

Caching Strategy

caching_strategy

Cache Aside

The application code manages the cache explicitly. On a read, it checks the cache first; on a miss, it loads the value from the database and writes it into the cache. On a write, it updates the database and invalidates (or updates) the cached entry.
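
Under the usual definition of cache-aside, the flow looks like this (in-memory dicts stand in for a real cache and database):

```python
cache = {}
database = {"user:1": {"name": "Ada"}}  # stand-in for the source of truth

def read(key):
    value = cache.get(key)
    if value is None:                  # cache miss
        value = database.get(key)      # the application loads from the database
        if value is not None:
            cache[key] = value         # and populates the cache for next time
    return value

def write(key, value):
    database[key] = value
    cache.pop(key, None)               # invalidate so the next read refetches
```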

Read Through

The cache sits in line between the application and the database. The application always reads from the cache; on a miss, the cache itself loads the value from the database, stores it, and returns it. Application code stays simpler than with cache-aside because the loading logic lives in the cache layer.

Write Around

Writes go straight to the database and bypass the cache; the cache is only populated when the data is later read. This keeps rarely-read data out of the cache, at the cost of a guaranteed cache miss on the first read after each write.

Write Back

Writes are applied to the cache and acknowledged immediately; the cache flushes dirty entries to the database asynchronously, often in batches. This gives low write latency and absorbs write bursts, but risks losing data if the cache fails before a flush.
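
A write-back sketch (again with dicts standing in for the cache and database; a real system would flush on a timer or under memory pressure):

```python
cache = {}
dirty = set()     # keys modified in the cache but not yet persisted
database = {}

def write(key, value):
    cache[key] = value
    dirty.add(key)                     # acknowledged now, persisted later

def flush():
    for key in list(dirty):
        database[key] = cache[key]     # often batched in real systems
    dirty.clear()
```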

Write Through

Every write is applied to both the cache and the database synchronously, as one operation. The cache is never stale relative to the database, at the cost of higher write latency.
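
A write-through sketch, for contrast with write-back above (dicts stand in for the cache and database):

```python
cache = {}
database = {}

def write(key, value):
    database[key] = value  # persist first (or transactionally with the cache)
    cache[key] = value     # cache is never stale relative to the database

def read(key):
    return cache.get(key, database.get(key))
```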

Challenges

Cache Miss Attack

cache_miss_attack

Cache Miss Attack: a flood of requests for keys that exist neither in the cache nor in the database. Because nothing is ever cached for such keys, every request falls through to the database, defeating the purpose of the cache. A malicious user issuing many queries with such keys can easily overload the database.
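
One common mitigation is negative caching: store the fact that a key does not exist, with a short TTL, so repeated lookups for nonexistent keys stop reaching the database (a Bloom filter over valid keys is another common option). A minimal sketch:

```python
import time

MISSING = object()   # sentinel distinguishing "cached miss" from "not cached"
NEGATIVE_TTL = 30    # seconds; keep short so newly inserted rows show up quickly

cache = {}           # key -> (value, expires_at)
database = {"user:1": "Ada"}  # stand-in for a real datastore

def lookup(key):
    entry = cache.get(key)
    if entry is not None and entry[1] > time.monotonic():
        value = entry[0]
        return None if value is MISSING else value
    value = database.get(key)
    if value is None:
        # Cache the absence itself, so repeated probes stop hitting the database.
        cache[key] = (MISSING, time.monotonic() + NEGATIVE_TTL)
        return None
    cache[key] = (value, time.monotonic() + 3600)
    return value
```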
