
Caching Strategies

1. What is it?

Caching is storing frequently accessed data in a fast-access location to avoid repeatedly fetching it from the original source. It dramatically improves performance and reduces server load.
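
At its simplest, a cache is just an in-memory lookup keyed by the request. Here is a minimal sketch using Python's built-in functools.lru_cache; slow_lookup and its one-second delay are purely illustrative stand-ins for an expensive call.

```python
import time
from functools import lru_cache


@lru_cache(maxsize=1024)
def slow_lookup(key: str) -> str:
    # Stand-in for an expensive operation (database query, API call, ...).
    time.sleep(1)
    return key.upper()


slow_lookup("milk")  # first call: ~1 second, fetched from the slow source
slow_lookup("milk")  # second call: near-instant, served from the cache
```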

2. Think of it like...

The Grocery Shopping Analogy

You don't go to the grocery store every time you need milk. You store it in your fridge (cache) for quick access. Only when it runs out do you go back to the store (database).

🏪 Grocery Store (Database): the original source
🧊 Fridge (Cache): quick-access storage
🍽️ Kitchen (App): where you use the data

3. Visual Flow

📱 Request (needs data) → Cache (check cache first) → 🗄️ Database (fallback source)

4. Where you see it

1. Request comes in: the app needs user profile data.
2. Check the cache first: look in Redis or an in-memory cache.
3. Cache hit or miss: found in the cache? Return it. Not found? Continue.
4. Fetch from the database: query the original data source.
5. Store in cache: save the result for next time (see the code sketch below).
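
Put together, these steps are the classic cache-aside pattern. Below is a minimal sketch in Python, assuming a local Redis instance reached through the redis-py client; get_user_from_db and the user:<id> key format are illustrative placeholders, not a real API.

```python
import json

import redis

r = redis.Redis(host="localhost", port=6379, db=0)
CACHE_TTL_SECONDS = 300  # keep entries for 5 minutes


def get_user_from_db(user_id: int) -> dict:
    # Stand-in for a real database query.
    return {"id": user_id, "name": "example"}


def get_user_profile(user_id: int) -> dict:
    key = f"user:{user_id}"

    # Steps 2-3: check the cache first; on a hit, return immediately.
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)

    # Step 4: cache miss, so fall back to the original data source.
    profile = get_user_from_db(user_id)

    # Step 5: store the result so the next request is a cache hit.
    r.setex(key, CACHE_TTL_SECONDS, json.dumps(profile))
    return profile
```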

5. Common Mistake

Wrong: "Caching is just for big websites" or "Cache everything forever."

Correct: Caching benefits any application, but it requires strategy: set appropriate expiration times, invalidate stale data, and cache the right things. Not all data should be cached.
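
The two controls mentioned above, expiration and invalidation, look roughly like this in practice. This is a minimal sketch, again assuming a Redis cache via redis-py; save_user_to_db and the key format are hypothetical placeholders.

```python
import json

import redis

r = redis.Redis(host="localhost", port=6379, db=0)


def save_user_to_db(user_id: int, profile: dict) -> None:
    # Stand-in for a real database write.
    pass


def cache_user_profile(user_id: int, profile: dict) -> None:
    # Expiration: attach a TTL so stale entries age out on their own
    # instead of sitting in the cache forever.
    r.setex(f"user:{user_id}", 300, json.dumps(profile))


def update_user_profile(user_id: int, new_profile: dict) -> None:
    # Invalidation: after writing to the source of truth, delete the
    # cached copy so the next read repopulates it with fresh data.
    save_user_to_db(user_id, new_profile)
    r.delete(f"user:{user_id}")
```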

💡 Real-World Example

Netflix using caching:

1. Popular shows are cached on CDN servers near you.
2. Instead of fetching from the main servers, you get the content from a nearby cache.
3. Latency drops from 500 ms to 20 ms.
4. This saves bandwidth and dramatically improves the user experience.
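
CDNs decide what to keep at the edge partly from the HTTP caching headers the origin server sends. As a rough illustration, here is a sketch of an origin endpoint marking a response as cacheable, assuming a Flask app; the route, file path, and one-day max-age are made up for the example and are not how Netflix actually does it.

```python
from flask import Flask, send_file

app = Flask(__name__)


@app.get("/videos/<video_id>/manifest")
def video_manifest(video_id: str):
    # Hypothetical endpoint serving a streaming manifest from disk.
    response = send_file(f"manifests/{video_id}.m3u8")
    # Allow CDNs and browsers to cache this response for one day.
    response.headers["Cache-Control"] = "public, max-age=86400"
    return response
```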