LVL 03 — MID DEVELOPER · SESSION 079 · DAY 79

BACKGROUND JOBS

🎫 PIXELCRAFT-066
Feature | 🟠 Hard | Priority: 🟠 High

Image uploads block the API for 5+ seconds while generating thumbnails and extracting metadata. Move heavy processing to a background queue. Respond to the user instantly, process asynchronously.
CONCEPTS.UNLOCKED
📬
Task Queues
A queue of jobs waiting to be processed. Instead of doing work inline (blocking the response), push a job onto a queue. A worker process picks it up and handles it later.
🐂
BullMQ + Redis
One of the most popular Node.js job queues. Uses Redis as the message broker. Reliable: jobs survive crashes and support retries, priorities, delays, and rate limiting.
⚙️
Background Processing
Separate process from the API server. The API enqueues jobs. A worker process dequeues and executes them. The API stays fast — heavy work happens elsewhere.
📊
Job Status Tracking
waiting → active → completed/failed. Track job progress in real-time. Show the user: "Generating thumbnails... 60%" — not a spinning loader with no context.
🔄
Retry on Failure
Jobs can fail — and that's okay. Configure automatic retries with exponential backoff: retry after 1s, 2s, 4s, 8s. After max retries, move to a dead-letter queue for inspection.
⏰
Scheduled Jobs (Cron)
Jobs that run on a schedule. Clean up temp files every hour. Send weekly digest emails. Generate analytics reports at midnight. BullMQ supports cron expressions natively.
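The retry delays above (1s, 2s, 4s, 8s) follow a simple doubling rule. A minimal sketch of the math (the helper name `backoffDelay` is ours, not part of BullMQ):

```javascript
// Exponential backoff: the delay doubles with each failed attempt.
// attempt is 1-based; base is the first delay in milliseconds.
function backoffDelay(attempt, base = 1000) {
  return base * 2 ** (attempt - 1);
}

// Delays for attempts 1..4:
const delays = [1, 2, 3, 4].map((n) => backoffDelay(n));
console.log(delays.join(', ')); // → 1000, 2000, 4000, 8000
```

This is the same schedule BullMQ produces when you configure `backoff: { type: 'exponential', delay: 1000 }`.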
HANDS-ON.TASKS
01
Set Up BullMQ Queue
npm install bullmq

// queue.js — define the queue
const { Queue } = require('bullmq');

const imageQueue = new Queue('image-processing', {
  connection: { host: 'localhost', port: 6379 },
});

module.exports = { imageQueue };
02
Enqueue Jobs from API Route
const { imageQueue } = require('./queue');

app.post('/api/images/upload', authenticate, upload.single('image'),
  async (req, res) => {
    // Save image record immediately
    const image = await Image.create({
      owner: req.userId,
      originalPath: req.file.path,
      status: 'processing',
    });

    // Enqueue background job
    const job = await imageQueue.add('process-image', {
      imageId: image.id,
      filePath: req.file.path,
      userId: req.userId,
    }, {
      attempts: 3,
      backoff: {
        type: 'exponential',
        delay: 1000,
      },
    });

    // Respond immediately — don't wait
    res.status(202).json({
      image,
      jobId: job.id,
      message: 'Upload received, processing in background',
    });
  });
Status 202 (Accepted) means "I got your request and will process it later." The user gets an instant response. The heavy work happens in the worker.
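The enqueue-then-respond flow can be demonstrated without Redis at all. Here is a toy in-memory queue showing just the pattern (`MemoryQueue` and its methods are illustrative, not part of BullMQ):

```javascript
// Toy in-memory queue: the producer pushes and returns immediately;
// a "worker" drains the queue later.
class MemoryQueue {
  constructor() { this.jobs = []; }

  add(name, data) {
    const job = { id: this.jobs.length + 1, name, data };
    this.jobs.push(job);
    return job; // caller gets a jobId instantly, like res.status(202)
  }

  // Worker side: process everything currently queued.
  async drain(handler) {
    const results = [];
    while (this.jobs.length) {
      results.push(await handler(this.jobs.shift()));
    }
    return results;
  }
}

const queue = new MemoryQueue();
const job = queue.add('process-image', { imageId: 42 });
console.log('responded instantly with jobId', job.id); // jobId 1

// Later, in the "worker" process:
queue.drain(async (j) => `${j.name}:${j.data.imageId}`)
  .then((done) => console.log(done)); // [ 'process-image:42' ]
```

Redis replaces the in-memory array so jobs survive a crash of either process.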
03
Build the Worker
// worker.js — separate process!
const { Worker } = require('bullmq');
const sharp = require('sharp');

const worker = new Worker('image-processing', async (job) => {
  const { imageId, filePath } = job.data;

  // Step 1: Generate thumbnail
  await job.updateProgress(25);
  await sharp(filePath)
    .resize(200, 200, { fit: 'cover' })
    .webp({ quality: 80 })
    .toFile(`thumbnails/${imageId}.webp`);

  // Step 2: Generate preview
  await job.updateProgress(50);
  await sharp(filePath)
    .resize(800)
    .webp({ quality: 85 })
    .toFile(`previews/${imageId}.webp`);

  // Step 3: Extract metadata
  await job.updateProgress(75);
  const metadata = await sharp(filePath).metadata();

  // Step 4: Update database
  await Image.findByIdAndUpdate(imageId, {
    status: 'ready',
    thumbnailPath: `thumbnails/${imageId}.webp`,
    previewPath: `previews/${imageId}.webp`,
    width: metadata.width,
    height: metadata.height,
    format: metadata.format,
  });

  await job.updateProgress(100);
  return { imageId, status: 'ready' };
}, {
  connection: { host: 'localhost', port: 6379 },
});

worker.on('completed', (job, result) => {
  console.log(`Job ${job.id} completed:`, result);
});

worker.on('failed', (job, err) => {
  console.error(`Job ${job.id} failed:`, err.message);
});
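The worker's progress reporting can be sketched in isolation. This toy pipeline runs the same four steps and calls a progress callback after each one (no sharp, no Redis, just the shape of the logic; all names are ours):

```javascript
// Simulate the 4-step pipeline, reporting progress after each step.
async function processImage(job, onProgress) {
  const steps = ['thumbnail', 'preview', 'metadata', 'db-update'];
  for (let i = 0; i < steps.length; i++) {
    // ...the real work (sharp, DB writes) would happen here...
    await onProgress(Math.round(((i + 1) / steps.length) * 100));
  }
  return { imageId: job.imageId, status: 'ready' };
}

const reported = [];
processImage({ imageId: 7 }, async (p) => reported.push(p))
  .then((result) => console.log(reported, result));
// → [ 25, 50, 75, 100 ] { imageId: 7, status: 'ready' }
```

In the real worker, `job.updateProgress(n)` plays the role of `onProgress`, writing the value to Redis so the API can read it.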
04
Job Status API & Client Polling
// API: check job status
app.get('/api/jobs/:jobId', authenticate, async (req, res) => {
  const job = await imageQueue.getJob(req.params.jobId);
  if (!job) {
    return res.status(404).json({ error: 'Job not found' });
  }
  const state = await job.getState();
  const progress = job.progress;
  res.json({ state, progress });
});

// Client: poll for status
function useJobStatus(jobId) {
  const [status, setStatus] = useState('waiting');
  const [progress, setProgress] = useState(0);

  useEffect(() => {
    const interval = setInterval(async () => {
      const res = await fetch(`/api/jobs/${jobId}`);
      const data = await res.json();
      setStatus(data.state);
      setProgress(data.progress);
      if (data.state === 'completed' || data.state === 'failed') {
        clearInterval(interval);
      }
    }, 1000);

    return () => clearInterval(interval);
  }, [jobId]);

  return { status, progress };
}
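The states the client sees follow the lifecycle from the concepts above: waiting → active → completed/failed. A small validator for those transitions (our own sketch; BullMQ tracks states internally and also has extra states like delayed and paused, and retries move failed jobs back to waiting):

```javascript
// Valid transitions in the simplified job lifecycle.
const TRANSITIONS = {
  waiting: ['active'],
  active: ['completed', 'failed'],
  completed: [], // terminal
  failed: [],    // terminal (ignoring retries for this sketch)
};

function canTransition(from, to) {
  return (TRANSITIONS[from] || []).includes(to);
}

console.log(canTransition('waiting', 'active'));    // true
console.log(canTransition('active', 'completed'));  // true
console.log(canTransition('completed', 'active'));  // false
```

Modeling the lifecycle like this is useful on the client: it lets the UI reject impossible state changes from stale polling responses.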
05
Scheduled Cleanup Job
// Run every hour: clean temp files
await imageQueue.add('cleanup-temp-files', {}, {
  repeat: {
    pattern: '0 * * * *', // every hour, on the hour
  },
});
06
Close the Ticket
git switch -c feature/PIXELCRAFT-066-background-jobs
git add server/
git commit -m "Add BullMQ background processing for uploads (PIXELCRAFT-066)"
git push origin feature/PIXELCRAFT-066-background-jobs
# PR → Review → Merge → Close ticket ✅
CS.DEEP-DIVE

Every production system separates fast work from slow work.

The pattern is message queues — one of the most important concepts in distributed systems architecture.

// Message queue architecture:

Producer (your API)
  Receives request, enqueues job
  Responds immediately

Broker (Redis)
  Stores jobs reliably
  Guarantees delivery

Consumer (your worker)
  Picks up jobs, processes them
  Can be scaled independently

// At scale, this becomes:
RabbitMQ  → general messaging
Apache Kafka → event streaming
AWS SQS    → managed queue
BullMQ     → Node.js + Redis

// Same pattern, different scale.
// Netflix processes billions of
// events/day with this architecture.
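"Scaled independently" means you add consumers without touching the producer: each worker simply pulls the next available job from the shared broker. A toy demonstration with two consumers sharing one queue (all names are illustrative, assuming jobs are handed to whichever consumer is free next):

```javascript
// Distribute queued jobs across N consumers pulling from one queue.
// Adding a consumer changes consumerCount only — the producer is untouched.
function runConsumers(jobs, consumerCount) {
  const queue = [...jobs];
  const processedBy = {};
  for (let c = 0; c < consumerCount; c++) processedBy[c] = [];
  let turn = 0;
  while (queue.length) {
    const consumer = turn % consumerCount; // next free worker takes the job
    processedBy[consumer].push(queue.shift());
    turn++;
  }
  return processedBy;
}

const result = runConsumers(['a', 'b', 'c', 'd'], 2);
console.log(result); // { '0': [ 'a', 'c' ], '1': [ 'b', 'd' ] }
```

With BullMQ, this is literally just starting more copies of worker.js: they all connect to the same Redis queue and split the load.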
"Queue Lab"
[A]Add a BullMQ dashboard: install bull-board (npm install @bull-board/api @bull-board/express). Mount it at /admin/queues. Visualize active, completed, and failed jobs with retry controls.
[B]Replace client polling with WebSocket notifications: when a job completes, emit a Socket.io event to the specific user. No more polling — instant notification when processing finishes.
[C]Research: what is the difference between a message queue (RabbitMQ) and an event stream (Kafka)? When would you choose each? Write a comparison document with use cases.
REF.MATERIAL
ARTICLE
BullMQ Team
Official guide: queues, workers, jobs, events, repeatable jobs, rate limiting, and concurrency. The definitive Node.js queue library reference.
BULLMQOFFICIALESSENTIAL
VIDEO
IBM Technology
Why message queues exist, how they work, and when to use them. The architectural pattern explained clearly with diagrams.
QUEUESARCHITECTURE
VIDEO
Traversy Media
Practical tutorial: setting up Bull queues, creating workers, handling failures, and building a job dashboard in Node.js.
BULLMQTUTORIAL
ARTICLE
Wikipedia
The theory: message passing, producer-consumer pattern, guaranteed delivery, at-least-once vs exactly-once semantics.
QUEUESTHEORYCS
ARTICLE
The Pragmatic Engineer
Real-world lessons from operating queues and distributed systems at scale. The operational reality behind the architecture.
DISTRIBUTEDSCALE
// LEAVE EXCITED BECAUSE
Upload responds instantly — no more waiting 5 seconds. Thumbnails and metadata generate in the background. A progress bar shows "Processing... 75%". This is how every production app handles heavy work.