
Cache

A cache is a fast, temporary storage area that holds frequently accessed data so it can be served more quickly. Caches are used in hardware (e.g., the CPU cache) and in software (e.g., the web browser cache) to improve performance and reduce system load.

Reasons for considering a cache:

  • Fast access to frequently accessed data
  • Reduces latency and loading times
  • Reduces load on other systems
  • More efficient use of resources
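
As a small illustration of these points, the following Python sketch caches the results of an expensive lookup in an in-memory dictionary (the `slow_lookup` function is hypothetical and stands in for a database or API call):

```python
import time

# Hypothetical expensive data source (e.g., a database query or remote API call).
def slow_lookup(key):
    time.sleep(0.5)  # simulate latency
    return f"value-for-{key}"

_cache = {}  # in-memory cache: key -> value

def cached_lookup(key):
    # Serve frequently accessed data from memory; only fall back to the
    # slow source on a cache miss, which reduces latency and backend load.
    if key not in _cache:
        _cache[key] = slow_lookup(key)
    return _cache[key]

cached_lookup("user:42")  # slow: cache miss, hits the backend
cached_lookup("user:42")  # fast: served from the cache
```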

Further information

Link: What is a cache

Possible Problems with Caching

Challenge       | Problem                            | Possible Solution
----------------|------------------------------------|-----------------------------------
Consistency     | Cache shows stale data             | Write-through, event invalidation
Invalidation    | When should the cache be updated?  | TTL, LRU, LFU
Right strategy  | LRU, LFU, or FIFO?                 | Choose depending on the application
Storage space   | Too much data in the cache         | Limited size, Redis/Memcached
Cache warmup    | Cache starts empty                 | Preloading, persistent cache
Cache storms    | Too many requests on cache miss    | Locking, asynchronous loading
Security        | Sensitive data in the cache        | Encryption, private caches
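
Several rows in the table above mention TTL (time to live) as an invalidation mechanism. Below is a minimal sketch of TTL-based invalidation in Python (the class name `TTLCache` and the `ttl_seconds` parameter are illustrative, not a specific library API):

```python
import time

class TTLCache:
    """Entries expire ttl_seconds after they were stored."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._data = {}  # key -> (value, timestamp)

    def put(self, key, value):
        self._data[key] = (value, time.monotonic())

    def get(self, key, default=None):
        entry = self._data.get(key)
        if entry is None:
            return default
        value, stored_at = entry
        if time.monotonic() - stored_at > self.ttl:
            # Entry is stale: invalidate it and report a miss.
            del self._data[key]
            return default
        return value
```

Once an entry is older than the TTL, it is treated as a miss and removed, so the maximum staleness is bounded by the chosen TTL.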

Cache Strategies

How a cache is used depends heavily on the use case. Depending on the requirements, there are several strategies for deciding which entries are kept and which are evicted.

LRU - Least Recently Used

LRU (Least Recently Used) is a mechanism that removes the least recently used items to make room for new items. It is commonly used in operating systems, databases, and web caching systems.
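
A minimal LRU sketch in Python based on collections.OrderedDict (the class and parameter names are illustrative):

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key, default=None):
        if key not in self._data:
            return default
        # Accessing an entry makes it the most recently used.
        self._data.move_to_end(key)
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            # Evict the least recently used entry (the oldest in the order).
            self._data.popitem(last=False)
```

For memoizing function results, Python's standard library also provides functools.lru_cache, which applies the same policy.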

LFU - Least Frequently Used

LFU (Least Frequently Used) is a caching mechanism that removes the least frequently used item when the cache is full.
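
A minimal LFU sketch in Python (illustrative; eviction uses a simple O(n) scan over the access counts):

```python
class LFUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self._data = {}  # key -> value
        self._hits = {}  # key -> access count

    def get(self, key, default=None):
        if key not in self._data:
            return default
        self._hits[key] += 1
        return self._data[key]

    def put(self, key, value):
        if key not in self._data and len(self._data) >= self.capacity:
            # Evict the least frequently used key.
            victim = min(self._hits, key=self._hits.get)
            del self._data[victim]
            del self._hits[victim]
        self._data[key] = value
        self._hits.setdefault(key, 0)
```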

FIFO - First In, First Out

FIFO (First In, First Out) caching is a simple mechanism that removes the oldest item once the cache is full. To put it simply, it works like a queue: the first item added is the first to be deleted.
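
A minimal FIFO sketch in Python (illustrative). The only difference from the LRU sketch above is that reads do not change the eviction order:

```python
from collections import OrderedDict

class FIFOCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()  # insertion order = eviction order

    def get(self, key, default=None):
        # Unlike LRU, a read does not reorder entries.
        return self._data.get(key, default)

    def put(self, key, value):
        if key not in self._data and len(self._data) >= self.capacity:
            # Evict the oldest inserted entry (the front of the queue).
            self._data.popitem(last=False)
        self._data[key] = value
```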

LIFO - Last In, First Out

LIFO (Last In, First Out) caching is a mechanism that removes the most recently inserted element first when the cache is full. It works like a stack: the most recently added element is removed first.
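
A minimal LIFO sketch in Python (illustrative), keeping the insertion order in a list used as a stack:

```python
class LIFOCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self._data = {}   # key -> value
        self._stack = []  # insertion order, newest key last

    def get(self, key, default=None):
        return self._data.get(key, default)

    def put(self, key, value):
        if key not in self._data and len(self._data) >= self.capacity:
            # Evict the most recently inserted entry (top of the stack).
            victim = self._stack.pop()
            del self._data[victim]
        if key not in self._data:
            self._stack.append(key)
        self._data[key] = value
```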

Content Delivery Network (CDN)

A geographically distributed system for delivering (usually static) content. Content is delivered from a location as close as possible to the client, which minimizes response and loading times. A CDN also acts as a cache, reducing the load on the origin servers behind it.

Please note

  • Cache expiry: when should the latest version of the content be fetched?
  • Failover: what happens if the CDN goes down? (e.g., forward requests directly to the origin server).
  • Versioning of content, so that updated files are not served from stale cached copies (see the sketch below).
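
One common way to handle versioning is to embed a content hash in the file name, so a changed file gets a new URL and cached copies of the old URL can simply expire. A minimal sketch in Python (file names and paths are made up for illustration):

```python
import hashlib
from pathlib import Path

def fingerprinted_name(path):
    """Return a file name like 'app.3f7a2c1b.js' derived from the file's content."""
    p = Path(path)
    digest = hashlib.sha256(p.read_bytes()).hexdigest()[:8]
    return f"{p.stem}.{digest}{p.suffix}"

# Hypothetical asset: a changed app.js produces a new name (and thus a new URL),
# so clients and the CDN fetch the new version instead of a stale cached copy.
# print(fingerprinted_name("static/app.js"))
```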
