⚫ LVL 05 — SENIOR DEVELOPER · SESSION 125 · DAY 125

EXAM REVIEW

🔍 PEER REVIEW
Collaborative Learning | Review & Reflect

The exam is done. Now the real learning happens. Present your solutions. Review a peer's solutions. Discover different approaches to the same problems. Give and receive constructive feedback. The best engineers learn more from reviewing code than writing it.
CONCEPTS.UNLOCKED
🎤
Solution Presentation
Walk through each fix: diagnosis → root cause → fix → verification. Explain your reasoning. Show your commit messages. The presentation reveals your thought process, not just your solution.
👁️
Peer Code Review
Read someone else's solution to the same problem. Did they find the same root cause? Did they fix it differently? Is their approach better? Could they have missed an edge case? This is how teams get stronger.
🔄
Alternative Approaches
There's rarely one correct fix — there are tradeoffs. Your aggregation pipeline vs their $lookup. Your defensive check vs their schema validation. Compare approaches, understand tradeoffs, choose the best.
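As a concrete (and hypothetical) illustration of the tradeoff, here are two valid fixes for the upload crash, sketched in TypeScript. One guards defensively at the point of use; the other validates at the boundary so downstream code can assume a well-formed value. The `UploadedFile` shape and function names are assumptions, not the actual exam code.

```typescript
// Hypothetical upload metadata — some clients send files with no extension.
interface UploadedFile {
  originalname: string;
}

// Approach A: defensive check at the point of use.
// Files like "README" have no extension — guard instead of crashing.
function getExtensionDefensive(file: UploadedFile): string {
  const parts = file.originalname.split(".");
  return parts.length > 1 ? parts[parts.length - 1].toLowerCase() : "";
}

// Approach B: validate at the boundary. Fails fast with a clear message,
// so every later step can assume a non-empty extension exists.
function assertHasExtension(file: UploadedFile): string {
  const ext = getExtensionDefensive(file);
  if (ext === "") {
    throw new Error(`Upload rejected: "${file.originalname}" has no extension`);
  }
  return ext;
}
```

Neither is "the" answer: A keeps the handler tolerant, B keeps the invariant in one place. That's exactly the kind of tradeoff the review discussion should surface.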
💬
Constructive Feedback
Observe, question, suggest — never attack. "I noticed X — have you considered Y?" not "This is wrong." Good feedback makes the author better. Great feedback makes both reviewer and author better.
🪞
Reflection
What was hardest? What would you do differently? The gap between what you planned and what happened reveals growth areas. Seniors reflect systematically: they turn experience into principles.
REVIEW.TASKS
01
Present Your Solutions
// For each of the 6 exam tasks:

## 1. Walk Through Your Process
- "Here's how I diagnosed it..."
- "I found the root cause here..."
- "My fix was..."
- "I verified by..."

## 2. Show Your Work
- Git diff for each fix
- Test results
- Before/after screenshots
- Commit messages

## 3. Highlight AI Usage
- When did you use AI? Why?
- What did AI get right?
- What did AI get wrong?
- How did you verify AI's output?

## 4. Honest Assessment
- What took longest?
- What are you most confident about?
- What are you least confident about?
- What would you do differently with more time?

// Time: 15 minutes per person
// Audience: instructor + peers
02
Review a Peer's Solutions
// Code review framework:

FOR EACH OF THEIR 6 SOLUTIONS:

## Correctness
- Does the fix actually solve the root cause?
- Or does it mask the symptom?
- Are there edge cases missed?

## Completeness
- Is the fix tested?
- Does it handle error cases?
- Would it survive a code review at a real company?

## Quality
- Is the code clean and readable?
- Are the commit messages clear?
- Is the approach maintainable?

## Comparison
- Did they find the same root cause?
- Is their approach different?
- Which approach is better? Why?

// Write your review as comments:
// "Nice catch on the IDOR! I used
// the same $or approach. Did you
// consider adding an index on
// { userId: 1 } for the new query
// pattern?"
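The sample review comment above mentions an "$or approach" to the IDOR fix. Here is a minimal sketch of that authorization rule as a pure predicate. The field names (`userId` for the owner, `sharedWith` for shared access) are assumptions for illustration, not the actual exam code; in MongoDB the equivalent query filter would be `{ _id: docId, $or: [{ userId }, { sharedWith: userId }] }`.

```typescript
// Hypothetical document shape — illustrative, not the exam codebase.
interface Doc {
  _id: string;
  userId: string;        // owner
  sharedWith: string[];  // users the owner has shared with
}

// The "$or approach" expressed as a predicate: access is allowed if the
// requester is the owner OR appears in the sharedWith list.
function canAccess(doc: Doc, requesterId: string): boolean {
  return doc.userId === requesterId || doc.sharedWith.includes(requesterId);
}
```

If this becomes a new query pattern, the review comment's suggestion applies: indexes on `{ userId: 1 }` and `{ sharedWith: 1 }` (a multikey index) would keep the `$or` filter fast.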
03
Discussion: Best Approaches
// Group discussion topics:

BUG 1 (CSS):
- Who found it fastest? How?
- DevTools Inspect vs searching CSS?
- Did anyone use responsive mode?

BUG 2 (Redo):
- Where exactly was the off-by-one?
- Did anyone add a unit test?
- Could TypeScript have prevented it?

BUG 3 (Upload crash):
- Which file type triggered it?
- Defensive coding vs strict typing?
- Should we validate MIME types at the middleware level?

SECURITY (IDOR):
- Did everyone find the same vuln?
- Authorization at route vs middleware?
- How would you prevent IDORs systematically, project-wide?

PERFORMANCE (N+1):
- $lookup vs application-level join?
- Did anyone add database indexes?
- How would you detect N+1 problems automatically in the future?

FEATURE (Bookmarks):
- Compare data models
- Compare API designs
- Who wrote the most complete tests?
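For the BUG 2 discussion: redo off-by-ones almost always hide in the cursor bounds check. A minimal, hypothetical history stack (not the exam codebase) showing the correct arithmetic:

```typescript
// Minimal undo/redo history. `cursor` is the index of the current state.
class History<T> {
  private states: T[] = [];
  private cursor = -1;

  push(state: T): void {
    // Pushing after an undo drops the redo tail: keep [0..cursor], append.
    this.states = this.states.slice(0, this.cursor + 1);
    this.states.push(state);
    this.cursor = this.states.length - 1;
  }

  undo(): T | undefined {
    if (this.cursor <= 0) return undefined; // nothing earlier to go back to
    return this.states[--this.cursor];
  }

  redo(): T | undefined {
    // The classic off-by-one is writing `cursor < this.states.length` here,
    // which walks one index past the end and returns undefined.
    if (this.cursor >= this.states.length - 1) return undefined;
    return this.states[++this.cursor];
  }
}
```

One answer to "Could TypeScript have prevented it?": with the `noUncheckedIndexedAccess` compiler option, `states[i]` is typed `T | undefined`, which forces callers to confront the bad read instead of silently passing `undefined` along.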
04
Instructor Feedback
// Instructor reviews each solution
// against the rubric:

## Diagnosis Quality /20
- Found root cause, not symptom?
- Used proper debugging tools?
- Systematic approach?

## Fix Quality /20
- Correct and complete?
- Handles edge cases?
- Doesn't break other things?

## Communication /20
- Clear commit messages?
- Good documentation?
- Explained reasoning?

## AI Maturity /20
- Justified AI usage?
- Verified AI output?
- Could explain without AI?

## Feature Quality /20
- Data model appropriate?
- API design clean?
- Tests comprehensive?

## Total /100
- 90+: Exceptional
- 80+: Strong pass
- 70+: Pass with notes
- <70: Areas to strengthen
05
Reflection Journal
// Write honest answers:

## What was the hardest problem?
// Why? What made it hard?
// What skill gap did it reveal?

## What was the easiest?
// Why? When did that skill click?
// How long ago would this have
// been hard for you?

## What surprised you?
// Did a "simple" bug take forever?
// Did a "hard" one resolve quickly?

## What would you do differently?
// With unlimited time?
// If you could restart the exam?

## How did AI help / hinder?
// When was it useful?
// When did it lead you astray?
// What's your AI workflow now vs
// when you started the course?

## Biggest growth moment?
// Looking back at all 125 sessions,
// when did the biggest shift happen?
// When did you stop feeling like
// a student and start feeling like
// an engineer?
CS.DEEP-DIVE

Code review is the highest-leverage activity in software engineering.

Google's internal research found that code review catches classes of bugs that testing misses, and that the reviewer learns as much as the author. It's not overhead; it's the core of the engineering process.

// Code review benefits:

For the author:
  Someone catches your mistakes
  You learn better approaches
  You write cleaner code knowing
  someone will read it

For the reviewer:
  You learn the codebase
  You see different approaches
  You practice communication

For the team:
  Knowledge spreads (bus factor ↑)
  Quality increases over time
  Standards become shared

// Google internal data:
// - Average review time: 4 hours
// - Median change size: 24 lines
// - 35% of comments are about
// readability (not bugs)
// - Reviewers AND authors report
// learning from the process.
REF.MATERIAL
ARTICLE
Google
Google's public code review standards: how to write good reviews, how to handle review comments, and the principles behind effective code review.
CODE REVIEW · GOOGLE · ESSENTIAL
ARTICLE
Michael Lynch
The author's side of code review: small PRs, clear descriptions, self-review before submitting. Makes reviews faster and more productive.
CODE REVIEW · AUTHORING
VIDEO
Clément Mihailescu
Inside Google's code review process: the tools, the culture, and why every change requires at least one reviewer's approval.
GOOGLE · PROCESS
ARTICLE
Gergely Orosz
The art of constructive technical feedback: be specific, be kind, explain the "why." Feedback that makes people better, not defensive.
FEEDBACK · COMMUNICATION
ARTICLE
Ben Kuhn
Thoughtful reflection on engineering growth: what matters, what doesn't, and how to improve deliberately. A model for the reflection journal exercise.
CAREER · REFLECTION
// LEAVE EXCITED BECAUSE
You presented your solutions with confidence. You reviewed a peer's code and found things they missed — and they found things you missed. The group discussion revealed better approaches. Your reflection journal captures honest growth. This is how engineering teams learn.