Programme design · Retrospective

What we change after every cohort (and what we will not)

A peek at the retrospective process behind our curriculum updates — including the things participants ask for that we deliberately do not change.

Every cohort closes with a 90-minute retrospective. Three things make it useful: it is anonymous, it is structured, and it is read by the person who will rewrite the next cohort's curriculum.

After every retrospective the curriculum lead drafts a "change list" — a public-internal document that records what we will change, what we will not change, and why. The "will not" list is the harder one to write and the more useful one to read. Two examples from this year: first, participants regularly ask for graded assessments at the end of each module. We do not add them. Our position is that letter grades are weaker feedback than mentors' written reviews — they let learners convert effort into a number and stop reading the actual feedback. We are willing to be wrong about this; the data so far does not move us.

Second, participants ask for a "next-job" guarantee bundled with bootcamps. We do not add it. We are uncomfortable with the financial product structure (income-share agreements, deferred tuition tied to placement) for a regional KR market that does not have stable consumer protection norms for those instruments. We would rather charge a fair tuition and not pretend.

What we do change, every cohort: scenario realism, scheduling, and lab tooling. The week-3 alert-triage rubric has been rewritten four times. The Cloud Incident Handling track's OAuth scenario was rewritten twice in 2025. We will keep rewriting. The signal we are listening to is not satisfaction; it is "what would you do differently next time" — the question retros are designed to surface.