Caching

Storing frequently accessed data in memory (using patterns like Cache Aside) to avoid repeated slow database calls

Overview

Caching stores frequently accessed data in fast storage (usually memory) to avoid expensive operations like database queries or API calls. It's one of the most effective ways to improve performance and reduce load.

Different caching strategies (Cache Aside, Write Through, Write Behind) offer different trade-offs between consistency and performance.

Key Concepts

Cache Aside (Lazy Loading)

The application checks the cache first. On a miss, it loads the data from the database and writes it into the cache. This is the most common pattern.

Write Through

Write to cache and database simultaneously. Slower writes but cache always up-to-date.
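
The write-through flow can be sketched with plain Maps standing in for the cache and the database (both hypothetical stand-ins, not a real Redis or SQL client):

```javascript
// Write-through sketch: the write path updates the database and the cache
// in the same call, so a subsequent cache read is never stale.
const db = new Map();     // hypothetical system of record
const cache = new Map();  // hypothetical cache store

function saveUser(user) {
  db.set(user.id, user);               // durable write first
  cache.set(`user:${user.id}`, user);  // keep the cache in sync before returning
}
```

The call only returns after both writes complete, which is exactly why write-through is slower than cache-aside on the write path.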

Write Behind (Write Back)

Write to cache immediately, database asynchronously. Fast writes but risk of data loss.
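
A minimal write-behind sketch, assuming a hypothetical `db` store with a `set(key, value)` method, makes the trade-off visible: writes are acknowledged from memory, and anything still in the dirty queue when the process crashes is lost.

```javascript
// Write-behind sketch: writes land in memory and are acknowledged
// immediately; a background flush persists them in batches.
class WriteBehindCache {
  constructor(db) {
    this.db = db;           // hypothetical store with set(key, value)
    this.cache = new Map();
    this.dirty = [];        // writes not yet persisted
  }
  set(key, value) {
    this.cache.set(key, value);    // fast path: memory only
    this.dirty.push([key, value]); // remember what still needs persisting
  }
  flush() {
    // in a real system this runs on a timer or when the queue fills
    for (const [key, value] of this.dirty) this.db.set(key, value);
    this.dirty = [];
  }
}
```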

Cache Eviction

How to remove items when cache is full: LRU (Least Recently Used), LFU (Least Frequently Used), TTL (Time To Live).
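
LRU is the policy most often asked about, and it can be sketched in a few lines: a JavaScript Map remembers insertion order, so deleting and re-inserting a key on each access keeps the least recently used entry at the front, ready to evict when the cache is full.

```javascript
// LRU eviction sketch backed by a Map's insertion order.
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key);   // move key to the back (most recently used)
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      const oldest = this.map.keys().next().value; // front = least recent
      this.map.delete(oldest);
    }
  }
}
```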

How It Works

Cache Aside Pattern:

  1. Application needs data
  2. Check cache: GET user:123
  3. Cache miss? Query database
  4. Store result in cache: SET user:123 = {...}
  5. Return data to application
  6. Next request: Cache hit! Return immediately

Example with Redis:

  // Check cache
  user = cache.get('user:123')
  if (!user) {
    // Cache miss: load from the database
    user = db.query('SELECT * FROM users WHERE id = 123')
    cache.set('user:123', user, 3600) // TTL: 1 hour
  }
  return user
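
The cache-aside example above can be made runnable with an in-memory stand-in for Redis and a hypothetical `db` object that counts queries (neither is a real Redis or database client):

```javascript
// Minimal in-memory stand-in for Redis: a Map plus per-key expiry times.
class SimpleCache {
  constructor() { this.store = new Map(); }
  get(key) {
    const entry = this.store.get(key);
    if (!entry) return null;
    if (Date.now() > entry.expiresAt) { this.store.delete(key); return null; }
    return entry.value;
  }
  set(key, value, ttlSeconds) {
    this.store.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 });
  }
}

// Hypothetical database: every call counts as one slow query.
const db = {
  queries: 0,
  findUser(id) { this.queries++; return { id, name: `user-${id}` }; },
};

const cache = new SimpleCache();

// Cache-aside: check the cache, fall back to the database on a miss.
function getUser(id) {
  const key = `user:${id}`;
  let user = cache.get(key);
  if (!user) {                   // cache miss
    user = db.findUser(id);      // slow path: query the database
    cache.set(key, user, 3600);  // populate the cache, 1-hour TTL
  }
  return user;
}
```

Calling `getUser(123)` twice hits the database only once; the second call is served from the cache.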

Use Cases

Database query results (user profiles, product catalogs)

API responses (weather data, stock prices)

Computed results (reports, analytics)

Session data (user authentication)

Rate limiting (track API usage)

HTML fragments (rendered components)

Best Practices

Set appropriate TTL based on data freshness requirements

Implement cache warming for critical data

Use consistent hashing for distributed caches

Monitor cache hit ratio (aim for >80%)

Implement circuit breakers for cache failures

Use cache invalidation carefully (can be complex)

Consider cache-aside pattern as default

Size cache based on working set (frequently accessed data)

Interview Tips

What Interviewers Look For

  • Explain cache-aside as the most common pattern

  • Discuss cache invalidation challenges: "When does cached data expire?"

  • Mention common cache stores: Redis, Memcached

  • Talk about cache hit ratio and how to improve it

  • Explain cache stampede problem and solutions (locking, probabilistic expiration)

  • Discuss multi-level caching: browser cache, CDN cache, application cache, database cache

  • Know eviction policies: LRU, LFU, TTL
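
The probabilistic-expiration answer to cache stampede can be sketched as follows: each reader may treat a still-valid entry as expired shortly before its real TTL, with a probability that rises as expiry approaches, so recomputation is spread across requests instead of every reader hitting the database at once. The `computeMs` field (how long the value took to compute last time) and the `beta` tuning knob are assumptions of this sketch, in the spirit of the XFetch algorithm.

```javascript
// Probabilistic early expiration sketch: returns true when this reader
// should recompute the value even though the entry has not yet expired.
function shouldRecompute(entry, beta = 1.0, now = Date.now()) {
  // entry: { expiresAt, computeMs } (hypothetical fields)
  const early = entry.computeMs * beta * -Math.log(Math.random());
  return now + early >= entry.expiresAt;
}
```

Expensive-to-compute values (large `computeMs`) start recomputing earlier, which is exactly when a stampede would hurt most.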
