Prerequisites
- Basic understanding of JavaScript
- A TypeScript installation
- VS Code or your preferred IDE
What you'll learn
- Understand caching fundamentals
- Apply caching in real projects
- Debug common caching issues
- Write type-safe caching code
Introduction
Welcome to this tutorial on caching strategies! In this guide, we'll explore how caching can dramatically boost your TypeScript application's performance.
Ever waited for the same data to load repeatedly? That's where caching comes to the rescue. You'll discover how caching can take your TypeScript applications from sluggish to lightning-fast. Whether you're building web applications, APIs, or data-heavy services, understanding caching is essential for writing responsive, efficient code.
By the end of this tutorial, you'll feel confident implementing several caching strategies in your own projects. Let's dive in!
Understanding Caching
What is Caching?
Caching is like having a super-smart assistant who remembers everything. Think of it as a notepad where you jot down answers to avoid recalculating them every time, the same way you might save your favorite pizza order instead of spelling it out each time.
In TypeScript terms, caching stores the results of expensive operations so you don't have to repeat them. This means you can:
- Retrieve data instantly instead of waiting
- Reduce server load and API calls
- Improve user experience with faster responses
- Save money on computational resources
Why Use Caching?
Here's why developers love caching:
- Performance boost: serve repeated requests often 10-100x faster
- Resource efficiency: trade a little memory for far less CPU and network work
- Better UX: users get near-instant responses
- Cost savings: fewer API calls mean lower bills
Real-world example: imagine building an e-commerce site. With caching, product details load instantly instead of being fetched from the database on every request.
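Before diving into classes, the core idea fits in a few lines. Here's a minimal sketch (the slow function is illustrative, standing in for any expensive call such as a DB query or API request):

```typescript
// A minimal sketch of the core idea: pay the cost once, reuse the result.
const squareCache = new Map<number, number>();

function slowSquare(n: number): number {
  for (let i = 0; i < 5_000_000; i++) { /* simulate expensive work */ }
  return n * n;
}

function cachedSquare(n: number): number {
  const hit = squareCache.get(n);
  if (hit !== undefined) return hit; // instant on repeat calls
  const result = slowSquare(n);
  squareCache.set(n, result);
  return result;
}

console.time("first call (miss)");
console.log(cachedSquare(12)); // 144, computed
console.timeEnd("first call (miss)");

console.time("second call (hit)");
console.log(cachedSquare(12)); // 144, from cache
console.timeEnd("second call (hit)");
```

The second call skips the busy loop entirely - that gap is exactly what every pattern in this tutorial exploits.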
Basic Syntax and Usage
Simple Memory Cache
Let's start with a friendly example:

```typescript
// A minimal in-memory cache
class SimpleCache<T> {
  private cache: Map<string, T> = new Map();

  // Store data in the cache
  set(key: string, value: T): void {
    this.cache.set(key, value);
    console.log(`Cached: ${key}`);
  }

  // Retrieve from the cache
  get(key: string): T | undefined {
    // Use has() so falsy values (0, "", false) still count as hits
    if (this.cache.has(key)) {
      console.log(`Cache hit: ${key}`);
    } else {
      console.log(`Cache miss: ${key}`);
    }
    return this.cache.get(key);
  }

  // Empty the cache
  clear(): void {
    this.cache.clear();
    console.log("Cache cleared!");
  }
}

// Let's use it!
const cache = new SimpleCache<string>();
cache.set("greeting", "Hello TypeScript!");
console.log(cache.get("greeting")); // Cache hit!
```

Explanation: this simple cache stores key-value pairs in memory. The generic type parameter <T> makes it work with any data type.
Cache with TTL (Time to Live)
Here's a more advanced pattern with expiration:

```typescript
// A cache entry with a timestamp
interface CacheEntry<T> {
  value: T;
  timestamp: number;
  ttl: number; // time to live, in milliseconds
}

// A cache whose entries expire
class TTLCache<T> {
  private cache: Map<string, CacheEntry<T>> = new Map();

  // Store with an expiration time (default: 60 seconds)
  set(key: string, value: T, ttl: number = 60000): void {
    this.cache.set(key, {
      value,
      timestamp: Date.now(),
      ttl
    });
    console.log(`Cached ${key} for ${ttl / 1000}s`);
  }

  // Get the value if it hasn't expired
  get(key: string): T | undefined {
    const entry = this.cache.get(key);
    if (!entry) return undefined;

    // Evict the entry if it has expired
    if (Date.now() - entry.timestamp > entry.ttl) {
      this.cache.delete(key);
      console.log(`Cache expired: ${key}`);
      return undefined;
    }

    console.log(`Cache hit: ${key}`);
    return entry.value;
  }

  // Remove a single entry (we'll need this for invalidation later)
  delete(key: string): boolean {
    return this.cache.delete(key);
  }
}
```
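The expiry check is the heart of this pattern, and it becomes easy to test if you pass the clock in. A small standalone sketch (the `isExpired` helper and `TimedEntry` name are illustrative, not part of the class above):

```typescript
// The TTL expiry check in isolation, with an injectable clock for testing
interface TimedEntry<T> {
  value: T;
  timestamp: number;
  ttl: number; // milliseconds
}

function isExpired<T>(entry: TimedEntry<T>, now: number = Date.now()): boolean {
  return now - entry.timestamp > entry.ttl;
}

const entry: TimedEntry<string> = { value: "hello", timestamp: Date.now(), ttl: 1000 };
console.log(isExpired(entry));                         // false - just created
console.log(isExpired(entry, entry.timestamp + 2000)); // true - 2s later, past its 1s TTL
```

Injecting `now` means your tests never have to sleep to exercise the expiry path.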
Practical Examples
Example 1: API Response Cache
Let's build a real-world API cache, reusing the TTLCache class from above:

```typescript
// A product API with caching
interface Product {
  id: string;
  name: string;
  price: number;
  emoji: string;
}

class ProductService {
  private cache = new TTLCache<Product[]>();
  private apiCallCount = 0;

  // Fetch products, consulting the cache first
  async getProducts(category: string): Promise<Product[]> {
    // Check the cache first
    const cached = this.cache.get(category);
    if (cached) return cached;

    // Simulate an API call
    console.log(`Fetching ${category} from API...`);
    this.apiCallCount++;

    // Simulate network delay
    await new Promise(resolve => setTimeout(resolve, 1000));

    // Mock data
    const products: Product[] = [
      { id: "1", name: "TypeScript Book", price: 29.99, emoji: "📘" },
      { id: "2", name: "Coffee Mug", price: 12.99, emoji: "☕" },
      { id: "3", name: "Keyboard", price: 89.99, emoji: "⌨️" }
    ];

    // Cache for 5 minutes
    this.cache.set(category, products, 300000);
    console.log(`API calls made: ${this.apiCallCount}`);
    return products;
  }
}

// Let's test it!
const service = new ProductService();

// First call - hits the API
await service.getProducts("electronics"); // API call

// Second call - served from cache!
await service.getProducts("electronics"); // Cache hit!
```

Try it yourself: add a method that invalidates the cache when products are updated.
Example 2: Computation Cache (Memoization)
Let's cache expensive calculations:

```typescript
// Fibonacci with memoization
class FibonacciCalculator {
  private cache = new Map<number, bigint>();
  private calculations = 0;

  // Calculate with caching
  calculate(n: number): bigint {
    // Check the cache first
    if (this.cache.has(n)) {
      console.log(`Cache hit for fib(${n})`);
      return this.cache.get(n)!;
    }
    this.calculations++;

    // Base cases
    if (n <= 1) return BigInt(n);

    // Calculate and cache the result
    console.log(`Calculating fib(${n})...`);
    const result = this.calculate(n - 1) + this.calculate(n - 2);
    this.cache.set(n, result);
    return result;
  }

  // Print stats
  getStats(): void {
    console.log(`Calculation stats:`);
    console.log(`  Calculations: ${this.calculations}`);
    console.log(`  Cache size: ${this.cache.size}`);
    console.log(`  Cache efficiency: ${
      Math.round((this.cache.size / this.calculations) * 100)
    }%`);
  }
}

// See the difference for yourself!
const fib = new FibonacciCalculator();
console.time("Fibonacci");
console.log(`Result: ${fib.calculate(40)}`);
console.timeEnd("Fibonacci");
fib.getStats();
```
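The same idea generalizes beyond one class: a small `memoize` helper can wrap any pure function. A sketch, assuming the arguments serialize cleanly with JSON.stringify:

```typescript
// A generic memoize helper: caches results keyed by the serialized arguments
function memoize<A extends unknown[], R>(fn: (...args: A) => R): (...args: A) => R {
  const cache = new Map<string, R>();
  return (...args: A): R => {
    const key = JSON.stringify(args); // assumes JSON-serializable arguments
    if (cache.has(key)) return cache.get(key)!;
    const result = fn(...args);
    cache.set(key, result);
    return result;
  };
}

// Wrap any pure function
let calls = 0;
const double = (n: number): number => { calls++; return n * 2; };
const fastDouble = memoize(double);

console.log(fastDouble(21)); // 42, computed
console.log(fastDouble(21)); // 42, served from cache
console.log(calls);          // 1 - the second call never ran double
```

One caveat: recursive functions like fibonacci must call the memoized wrapper internally for the inner calls to be cached, which is why the class above does its caching inside `calculate` itself.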
Example 3: Multi-Level Cache
Advanced caching with layers:

```typescript
// A multi-level cache system
class MultiLevelCache<T> {
  private l1Cache = new Map<string, T>(); // hot tier (10 items)
  private l2Cache = new Map<string, T>(); // warm tier (100 items)
  private stats = { hits: 0, misses: 0, l1Hits: 0, l2Hits: 0 };

  // Write to the hot tier, demoting and evicting as needed
  set(key: string, value: T): void {
    // Add to L1 (the hot cache)
    this.l1Cache.set(key, value);

    // Keep L1 within its size limit
    if (this.l1Cache.size > 10) {
      const firstKey = this.l1Cache.keys().next().value!;
      const demoted = this.l1Cache.get(firstKey)!;
      this.l1Cache.delete(firstKey);
      // Move the oldest L1 entry down to L2
      this.l2Cache.set(firstKey, demoted);
      console.log(`Demoted ${firstKey} to L2`);
    }

    // Keep L2 within its size limit
    if (this.l2Cache.size > 100) {
      const firstKey = this.l2Cache.keys().next().value!;
      this.l2Cache.delete(firstKey);
      console.log(`Evicted ${firstKey} from L2`);
    }
  }

  // Multi-level retrieval
  get(key: string): T | undefined {
    // Check L1 first
    if (this.l1Cache.has(key)) {
      this.stats.hits++;
      this.stats.l1Hits++;
      console.log(`L1 hit: ${key}`);
      return this.l1Cache.get(key);
    }

    // Then check L2
    if (this.l2Cache.has(key)) {
      this.stats.hits++;
      this.stats.l2Hits++;
      console.log(`L2 hit: ${key}`);
      // Promote the entry back to L1
      const value = this.l2Cache.get(key)!;
      this.l2Cache.delete(key);
      this.set(key, value);
      return value;
    }

    // Cache miss
    this.stats.misses++;
    console.log(`Cache miss: ${key}`);
    return undefined;
  }

  // Performance stats
  getStats(): void {
    const hitRate = (this.stats.hits / (this.stats.hits + this.stats.misses)) * 100;
    console.log(`Cache performance:`);
    console.log(`  Hit rate: ${hitRate.toFixed(1)}%`);
    console.log(`  L1 hits: ${this.stats.l1Hits}`);
    console.log(`  L2 hits: ${this.stats.l2Hits}`);
    console.log(`  Misses: ${this.stats.misses}`);
  }
}
```
Advanced Concepts
LRU (Least Recently Used) Cache
When you're ready to level up, implement an LRU cache. The trick: a Map iterates its keys in insertion order, so re-inserting a key marks it as most recently used, and the first key is always the least recently used.

```typescript
// An LRU cache built on Map's insertion order
class LRUCache<T> {
  private cache = new Map<string, T>();
  private readonly maxSize: number;

  constructor(maxSize: number = 100) {
    this.maxSize = maxSize;
  }

  // Add with LRU eviction
  set(key: string, value: T): void {
    // Delete and re-add so the key moves to the end (most recent)
    if (this.cache.has(key)) {
      this.cache.delete(key);
    }
    this.cache.set(key, value);

    // Evict the least recently used entry if over capacity
    if (this.cache.size > this.maxSize) {
      const lru = this.cache.keys().next().value!;
      this.cache.delete(lru);
      console.log(`Evicted LRU item: ${lru}`);
    }
  }

  // Get and mark as recently used
  get(key: string): T | undefined {
    if (!this.cache.has(key)) return undefined;

    // Move the key to the end (most recent)
    const value = this.cache.get(key)!;
    this.cache.delete(key);
    this.cache.set(key, value);
    return value;
  }
}
```
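The delete-and-re-add trick relies entirely on Map preserving insertion order, which is worth verifying on its own:

```typescript
// Map preserves insertion order - the property the LRU trick depends on
const m = new Map<string, number>([["a", 1], ["b", 2]]);

m.delete("a");
m.set("a", 1); // re-adding moves "a" to the end (most recently used)

console.log([...m.keys()]);         // ["b", "a"] - "b" is now the oldest
console.log(m.keys().next().value); // "b" - the next eviction candidate
```

Because `keys().next()` always yields the oldest surviving key, eviction is O(1) with no extra bookkeeping.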
Cache Warming and Invalidation
Advanced patterns for production:

```typescript
// A production-style cache manager
class CacheManager<T> {
  private cache = new TTLCache<T>();
  private trackedKeys = new Set<string>();

  // Warm up the cache proactively
  async warmup(keys: string[], fetcher: (key: string) => Promise<T>): Promise<void> {
    console.log("Warming up cache...");
    const promises = keys.map(async (key) => {
      const value = await fetcher(key);
      this.cache.set(key, value);
      this.trackedKeys.add(key);
    });
    await Promise.all(promises);
    console.log(`Warmed up ${keys.length} items!`);
  }

  // Invalidate every key matching a pattern
  // (works because the cache exposes a delete method and we track our keys)
  invalidatePattern(pattern: RegExp): number {
    let invalidated = 0;
    for (const key of this.trackedKeys) {
      if (pattern.test(key)) {
        this.cache.delete(key);
        this.trackedKeys.delete(key);
        invalidated++;
      }
    }
    console.log(`Invalidated ${invalidated} items`);
    return invalidated;
  }
}
```
Common Pitfalls and Solutions
Pitfall 1: Memory Leaks

```typescript
// Wrong - unlimited cache growth!
class LeakyCache {
  private cache: Record<string, any> = {};

  set(key: string, value: any): void {
    this.cache[key] = value; // never cleaned up!
  }
}

// Correct - limit the cache size!
class BoundedCache<T> {
  private cache = new Map<string, T>();
  private maxSize = 1000;

  set(key: string, value: T): void {
    if (this.cache.size >= this.maxSize) {
      // Remove the oldest entry
      const firstKey = this.cache.keys().next().value!;
      this.cache.delete(firstKey);
    }
    this.cache.set(key, value);
  }
}
```
Pitfall 2: Stale Data

```typescript
// Dangerous - serving outdated data!
const userCache = new Map<string, { name: string }>();
userCache.set("user:123", { name: "Old Name" });
// The user updates their name...
// ...but the cache still returns the old data!

// Safe - invalidate on updates!
interface User { id: string; name: string }

class UserService {
  private cache = new TTLCache<User>();
  constructor(private db: { update(id: string, data: User): Promise<void> }) {}

  async updateUser(id: string, data: User): Promise<void> {
    // Update the database
    await this.db.update(id, data);
    // Invalidate the cache immediately!
    this.cache.delete(`user:${id}`);
    console.log(`Cache invalidated for user:${id}`);
  }
}
```
Pitfall 3: Cache Stampede

```typescript
// Problem - simultaneous requests all hit the DB!
// (cache, fetchFromDB, and Data are placeholders here)
async function getData(key: string): Promise<Data> {
  const cached = cache.get(key);
  if (cached) return cached;

  // Multiple simultaneous requests all hit the DB!
  const data = await fetchFromDB(key);
  cache.set(key, data);
  return data;
}

// Solution - request coalescing!
class SmartCache<T> {
  private cache = new Map<string, T>();
  private pending = new Map<string, Promise<T>>();

  async get(key: string, fetcher: () => Promise<T>): Promise<T> {
    // Return the cached value if available
    if (this.cache.has(key)) {
      return this.cache.get(key)!;
    }

    // Join the in-flight request if one exists
    if (this.pending.has(key)) {
      return this.pending.get(key)!;
    }

    // Fetch once on behalf of all waiters
    const promise = fetcher().then(data => {
      this.cache.set(key, data);
      this.pending.delete(key);
      return data;
    });
    this.pending.set(key, promise);
    return promise;
  }
}
```
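To see the coalescing pay off, here is a standalone sketch that counts fetches: three concurrent callers, one actual fetch (the delay and key are illustrative):

```typescript
// Request coalescing in isolation: concurrent callers share one in-flight promise
const pending = new Map<string, Promise<string>>();
let fetchCount = 0;

async function fetchOnce(key: string): Promise<string> {
  const inFlight = pending.get(key);
  if (inFlight) return inFlight; // join the request already in flight

  const promise = (async () => {
    fetchCount++; // only the first caller triggers a real fetch
    await new Promise(resolve => setTimeout(resolve, 50)); // simulated I/O
    return `data for ${key}`;
  })();

  pending.set(key, promise);
  try {
    return await promise;
  } finally {
    pending.delete(key); // clear the slot once the fetch settles
  }
}

// Three concurrent callers, one fetch:
const results = await Promise.all([fetchOnce("k"), fetchOnce("k"), fetchOnce("k")]);
console.log(results, fetchCount); // all three get the same data; fetchCount is 1
```

Clearing the pending slot in `finally` matters: if the fetch rejects, the next caller retries instead of awaiting a cached rejection forever.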
Best Practices
- Cache what matters: focus on expensive operations
- Set appropriate TTLs: balance freshness against performance
- Limit cache size: prevent memory issues
- Invalidate smartly: clear the cache when data changes
- Monitor performance: track hit rates and adjust
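The last practice is the easiest to start with: a thin wrapper that counts lookups gives you a live hit rate. A minimal sketch (the class name is illustrative):

```typescript
// A minimal instrumented cache for tracking the hit rate
class InstrumentedCache<T> {
  private cache = new Map<string, T>();
  private hits = 0;
  private lookups = 0;

  set(key: string, value: T): void {
    this.cache.set(key, value);
  }

  get(key: string): T | undefined {
    this.lookups++;
    if (this.cache.has(key)) this.hits++;
    return this.cache.get(key);
  }

  // Fraction of lookups served from the cache
  hitRate(): number {
    return this.lookups === 0 ? 0 : this.hits / this.lookups;
  }
}

const metrics = new InstrumentedCache<string>();
metrics.set("a", "1");
metrics.get("a"); // hit
metrics.get("b"); // miss
console.log(`Hit rate: ${(metrics.hitRate() * 100).toFixed(0)}%`); // 50%
```

A hit rate that drops over time is often the first sign that your TTLs or cache size need tuning.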
Hands-On Exercise
Challenge: Build a Smart Weather Cache
Create a weather service with intelligent caching.
Requirements:
- Cache weather data by city with a 10-minute TTL
- Support temperature, humidity, and conditions
- Track the cache hit rate per city
- Implement cache warming for popular cities
- Add weather emojis based on conditions!
Bonus points:
- Implement request coalescing
- Add multi-level caching (memory + localStorage)
- Create a cache visualization dashboard
Solution
Click to see the solution

```typescript
// A smart weather cache implementation (reuses TTLCache from earlier)
interface WeatherData {
  city: string;
  temperature: number;
  humidity: number;
  condition: string;
  emoji: string;
}

class WeatherService {
  private cache = new TTLCache<WeatherData>();
  private stats = new Map<string, { hits: number; misses: number }>();
  private pending = new Map<string, Promise<WeatherData>>();

  // Get weather with smart caching
  async getWeather(city: string): Promise<WeatherData> {
    // Initialize per-city stats
    if (!this.stats.has(city)) {
      this.stats.set(city, { hits: 0, misses: 0 });
    }

    // Check the cache
    const cached = this.cache.get(city);
    if (cached) {
      this.stats.get(city)!.hits++;
      console.log(`Cache hit for ${city}!`);
      return cached;
    }

    // Join a pending request if one is in flight
    if (this.pending.has(city)) {
      console.log(`Waiting for pending request: ${city}`);
      return this.pending.get(city)!;
    }

    // Fetch with request coalescing
    const promise = this.fetchWeatherData(city);
    this.pending.set(city, promise);
    try {
      const data = await promise;
      this.cache.set(city, data, 600000); // 10 minutes
      this.stats.get(city)!.misses++;
      return data;
    } finally {
      this.pending.delete(city);
    }
  }

  // Simulate an API call
  private async fetchWeatherData(city: string): Promise<WeatherData> {
    console.log(`Fetching weather for ${city}...`);
    await new Promise(resolve => setTimeout(resolve, 1000));

    // Random weather
    const conditions = [
      { condition: "sunny", emoji: "☀️", temp: 25 },
      { condition: "cloudy", emoji: "☁️", temp: 20 },
      { condition: "rainy", emoji: "🌧️", temp: 15 },
      { condition: "snowy", emoji: "❄️", temp: -5 }
    ];
    const weather = conditions[Math.floor(Math.random() * conditions.length)];
    return {
      city,
      temperature: weather.temp + Math.random() * 10,
      humidity: 50 + Math.random() * 40,
      condition: weather.condition,
      emoji: weather.emoji
    };
  }

  // Warm up popular cities
  async warmupCache(cities: string[]): Promise<void> {
    console.log("Warming up cache for popular cities...");
    const promises = cities.map(city => this.getWeather(city));
    await Promise.all(promises);
    console.log(`Cache warmed for ${cities.length} cities!`);
  }

  // Print cache statistics
  getCacheStats(): void {
    console.log("Cache statistics:");
    let totalHits = 0;
    let totalMisses = 0;
    this.stats.forEach((stats, city) => {
      const hitRate = (stats.hits / (stats.hits + stats.misses)) * 100;
      console.log(`  ${city}: ${hitRate.toFixed(1)}% hit rate`);
      totalHits += stats.hits;
      totalMisses += stats.misses;
    });
    const overallHitRate = (totalHits / (totalHits + totalMisses)) * 100;
    console.log(`  Overall: ${overallHitRate.toFixed(1)}% hit rate`);
  }
}

// Test the weather service!
const weather = new WeatherService();

// Warm up popular cities
await weather.warmupCache(["London", "New York", "Tokyo"]);

// Make some requests
await weather.getWeather("London"); // Cache hit!
await weather.getWeather("Paris");  // Cache miss
await weather.getWeather("London"); // Cache hit!
weather.getCacheStats();
```
Key Takeaways
You've learned a lot about caching! Here's what you can now do:
- Implement various caching strategies with confidence
- Avoid common caching pitfalls that trip up beginners
- Apply TTL and LRU patterns in real projects
- Debug cache-related issues like a pro
- Build performant applications with TypeScript!
Remember: caching is a powerful tool, but use it wisely. Not everything needs to be cached.
Next Steps
Congratulations! You've mastered caching strategies!
Here's what to do next:
- Practice with the weather cache exercise above
- Add caching to your existing projects
- Move on to our next tutorial: Memory Management
- Experiment with Redis or other cache stores!
Remember: every millisecond saved makes users happier. Keep optimizing, keep learning, and most importantly, have fun!
Happy caching!