LVL 03 — MID DEVELOPER | SESSION 068 | DAY 68

REDIS CACHING

🎫 PIXELCRAFT-055
Feature / ⚡ Performance | 🟠 Hard | Priority: 🟠 High

The API makes repeated identical database queries. Login attempts aren't rate limited. Session management is clunky. Redis solves all three: cache frequently accessed data, rate-limit API calls, and manage sessions.
CONCEPTS.UNLOCKED
Redis
In-memory key-value store — blazing fast, microsecond operations. Data lives in RAM, not on disk. Used by Twitter, GitHub, and Stack Overflow for caching and real-time features.
🧰
Use Cases
Caching, session storage, rate limiting, pub/sub, leaderboards. Anywhere you need speed: store frequently accessed data in Redis instead of hitting the database every time.
🔑
Key-Value Operations
GET, SET, DEL, EXPIRE, INCR. Simple commands, blazing speed. SET stores data. GET retrieves it. EXPIRE sets auto-deletion. INCR atomically increments a counter.
📋
Cache-Aside Pattern
Check cache → cache miss → query DB → store in cache → return. First request hits DB (slow). Every subsequent request hits cache (fast). The most common caching strategy.
🚦
Rate Limiting
Track request count per IP/user per time window. 100 requests per minute max. Prevents brute-force login attacks, API abuse, and denial-of-service.
TTL (Time to Live)
Cache entries automatically expire. SET key value EX 60 expires the key after 60 seconds. Stale data is automatically cleaned up. Balance freshness vs performance.
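The cache-aside flow and TTL expiry above can be sketched without a live Redis server — here a plain Map stands in for Redis, and `fetchFromDb` and the key names are illustrative, not part of the ticket's code:

```javascript
// In-memory stand-in for Redis: each key stores a value plus an expiry timestamp.
const cache = new Map();

function cacheSet(key, value, ttlSeconds) {
  cache.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 });
}

function cacheGet(key) {
  const entry = cache.get(key);
  if (!entry) return null;            // never cached
  if (Date.now() > entry.expiresAt) { // TTL elapsed — treat as a miss
    cache.delete(key);
    return null;
  }
  return entry.value;
}

let dbHits = 0;
function fetchFromDb(key) { // pretend slow database query
  dbHits++;
  return `row-for-${key}`;
}

// Cache-aside: check cache → miss → query DB → store with TTL → return.
function getUser(key) {
  const cached = cacheGet(key);
  if (cached !== null) return cached; // cache hit
  const value = fetchFromDb(key);     // cache miss
  cacheSet(key, value, 60);           // 60-second TTL
  return value;
}

getUser('user:1');   // miss → "database"
getUser('user:1');   // hit → cache
console.log(dbHits); // 1 — the second call never touched the database
```

The same shape maps directly onto ioredis: `cacheGet`/`cacheSet` become `redis.get` and `redis.set(key, value, 'EX', 60)`, and Redis handles expiry for you.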
HANDS-ON.TASKS
01
Set Up Redis
docker run --name pixelcraft-redis \
  -p 6379:6379 -d redis

npm install ioredis
02
Implement Cache-Aside Pattern
const Redis = require('ioredis');
const redis = new Redis();

// Cache-aside pattern for gallery
app.get('/api/images', authenticate, async (req, res) => {
  const cacheKey = `images:${req.userId}:page:${req.query.page || 1}`;

  // Check cache first
  const cached = await redis.get(cacheKey);
  if (cached) {
    return res.json(JSON.parse(cached));
  }

  // Cache miss — query database
  const images = await Image.find({ owner: req.userId })
    .sort('-createdAt')
    .limit(20);

  // Store in cache with 60-second TTL
  await redis.set(cacheKey, JSON.stringify(images), 'EX', 60);
  res.json(images);
});
03
Cache Invalidation
app.post('/api/images/upload', authenticate, async (req, res) => {
  // ... upload logic ...

  // Invalidate this user's gallery cache
  const keys = await redis.keys(`images:${req.userId}:*`);
  if (keys.length > 0) await redis.del(...keys);

  res.status(201).json(image);
});
When data changes, the cache must be invalidated. Otherwise users see stale data. This is the fundamental tradeoff: speed vs freshness.
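The write path can be exercised with an in-memory stand-in for Redis to show that invalidation removes exactly the stale keys (the key names and `invalidateUserImages` helper are illustrative):

```javascript
const cache = new Map(); // key → cached value (stand-in for Redis)

// Mirrors `redis.del(...keys)` over the `images:<userId>:*` pattern.
function invalidateUserImages(userId) {
  for (const key of cache.keys()) {
    if (key.startsWith(`images:${userId}:`)) cache.delete(key);
  }
}

cache.set('images:42:page:1', '[old page 1]');
cache.set('images:42:page:2', '[old page 2]');
cache.set('images:99:page:1', '[another user]');

invalidateUserImages('42'); // simulate an upload by user 42

console.log(cache.has('images:42:page:1')); // false — stale entries gone
console.log(cache.has('images:99:page:1')); // true  — other users untouched
```

One caveat on the real thing: KEYS blocks the Redis server while it walks the entire keyspace, so it is fine in development but production code usually iterates with SCAN instead.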
04
Rate Limiting Middleware
async function rateLimit(req, res, next) {
  const key = `ratelimit:${req.ip}`;
  const current = await redis.incr(key);

  if (current === 1) {
    await redis.expire(key, 60); // 60-second window
  }

  if (current > 100) { // 100 requests per minute
    return res.status(429).json({
      error: 'Too many requests. Try again in a minute.'
    });
  }

  next();
}

app.use('/api/', rateLimit);
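The fixed-window counter logic can be tested without Express or Redis — a Map stands in for INCR, using the ticket's 100-requests-per-minute limit (`allowRequest` and the IP are illustrative, and a real deployment would also EXPIRE the key to reset the window):

```javascript
const counters = new Map(); // key → request count (stand-in for Redis INCR)

// Returns true if the request is within the limit, false if over it.
function allowRequest(ip, limit = 100) {
  const key = `ratelimit:${ip}`;
  const current = (counters.get(key) || 0) + 1; // atomic INCR in real Redis
  counters.set(key, current);
  return current <= limit;
}

let allowed = 0;
let rejected = 0;
for (let i = 0; i < 105; i++) {
  allowRequest('203.0.113.7') ? allowed++ : rejected++;
}
console.log(allowed, rejected); // 100 5
```

Requests 1–100 pass; 101–105 would get the 429 response. Note that INCR is atomic on the server, which is why the Redis version is safe even with many Node processes behind a load balancer.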
05
Measure Performance
Request         | Without Redis    | With Redis
First request   | 50ms (DB query)  | 50ms (cache miss → DB)
Second request  | 50ms (DB again)  | 1ms (cache hit!)
06
Close the Ticket
git switch -c feature/PIXELCRAFT-055-redis
git add server/
git commit -m "Add Redis caching + rate limiting (PIXELCRAFT-055)"
git push origin feature/PIXELCRAFT-055-redis

# PR → Review → Merge → Close ticket ✅
CS.DEEP-DIVE

Cache hierarchies exist at every level of computing.

Each level trades capacity for speed. Cache invalidation is famously one of "the two hard things in computer science" (Phil Karlton's quip; the other is naming things).

// The cache hierarchy:

CPU L1          → 1ns  (fastest)
CPU L2          → 4ns
CPU L3          → 12ns
RAM             → 100ns
Redis           → ~0.1ms
Database        → ~10ms
Disk            → ~100ms (slowest)

// Your cache-aside pattern with TTL is
// the most common strategy: accept
// slightly stale data for dramatically
// better performance.
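The hierarchy's payoff can be made concrete with a weighted average: expected latency is the hit ratio times the hit cost plus the miss ratio times the miss cost. Using the order-of-magnitude figures above (~0.1ms Redis hit, ~10ms database miss), a quick sketch:

```javascript
// Expected latency = hitRatio * hitCost + (1 - hitRatio) * missCost
function avgLatencyMs(hitRatio, hitMs = 0.1, missMs = 10) {
  return hitRatio * hitMs + (1 - hitRatio) * missMs;
}

console.log(avgLatencyMs(0.0).toFixed(2));  // "10.00" — no cache, every request hits the DB
console.log(avgLatencyMs(0.9).toFixed(2));  // "1.09"  — 90% hit ratio
console.log(avgLatencyMs(0.99).toFixed(2)); // "0.20"  — 99% hit ratio
```

Even a 90% hit ratio cuts average latency nearly tenfold, which is why the misses on cold or just-invalidated keys barely matter.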
"Cache Lab"
[A]Add cache-aside to 3 more endpoints: /api/images/:id (single image), /api/auth/me (current user profile), /api/analytics/summary (dashboard stats). Measure the improvement for each.
[B]Implement a Redis-backed leaderboard: track top uploaders with ZADD/ZREVRANGE (sorted sets). Update scores on each upload. Display a real-time leaderboard in the dashboard.
[C]Research Redis pub/sub: when one user uploads an image, notify other connected users in real-time. Build a simple notification system using Redis PUBLISH/SUBSCRIBE channels.
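Option [B]'s sorted-set logic can be prototyped in plain JavaScript before wiring up ZADD/ZREVRANGE — here a Map of scores is sorted on read, and the function and user names are illustrative:

```javascript
const scores = new Map(); // member → score (stand-in for a Redis sorted set)

// ZINCRBY equivalent: bump a user's upload count by one.
function recordUpload(user) {
  scores.set(user, (scores.get(user) || 0) + 1);
}

// ZREVRANGE 0 n-1 WITHSCORES equivalent: top n members by score, descending.
function topUploaders(n) {
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, n);
}

recordUpload('ada');
recordUpload('ada');
recordUpload('grace');
console.log(topUploaders(2)); // [ [ 'ada', 2 ], [ 'grace', 1 ] ]
```

The Redis version wins at scale because sorted sets keep members ordered on insert, so reads are O(log N + n) instead of re-sorting everything on every request.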
REF.MATERIAL
ARTICLE
Redis Team
Official Redis guide: installation, data types, commands, persistence, and use cases. The definitive starting point.
REDIS · OFFICIAL · ESSENTIAL
VIDEO
Fireship
Ultra-fast Redis overview: in-memory storage, data structures, pub/sub, and why it powers the world's fastest applications.
REDIS · QUICK
ARTICLE
Redis Team
Cache-aside, write-through, write-behind patterns. When to use each, TTL strategies, and invalidation approaches.
CACHING · PATTERNS · OFFICIAL
VIDEO
Traversy Media
Practical Redis tutorial: strings, lists, sets, sorted sets, hashes. Connecting from Node.js with real-world examples.
REDIS · TUTORIAL
ARTICLE
Wikipedia
Cache theory: hierarchies, replacement policies, write strategies, coherence. The CS fundamentals behind every caching system.
CACHE · THEORY · CS
// LEAVE EXCITED BECAUSE
API responses went from 50ms to 1ms with caching. Brute-force login attacks are blocked by rate limiting. Redis is like giving your database a turbo boost.