Product Practice #308
Dear Reader,
I once worked with a product team I was asked to help "improve their OKRs" during a technical migration. They were 12+ months away from shipping the first iteration to a pilot user segment. And yet, their OKRs featured a multi-year time horizon, including metrics like NPS, Conversion Rate, Support Requests, and Revenue. Nobody seemed to care that these metrics had been flat for months and would remain so. All that mattered was having a "customer-centered" set of OKRs in place for this project.
This was OKR theater, not using OKRs to measure a team's real progress. And yet, there were other options.
There's a valid case to be made that a solid Gantt chart and disciplined sprint reviews are practical tools for measuring the progress of a migration. There is no NEED to use OKRs in a scenario like this — and definitely not with made-up metrics that try to capture effects that only materialize weeks or months after the migration is done. These metrics are simply too lagging to serve as decision-making tools.
Revisiting last week’s idea that useful team-level OKRs sit within a product team’s Sphere of Influence or Zone of Control, metrics about the results created by the migration do not help measure progress during the migration.
So, what’s left for product teams to influence or control during such a project? Typically, you look at metrics like velocity changes, met sprint commitments, completed requirements, user acceptance tests, staging performance, dummy data processing, and so on.
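To make one of these leading indicators concrete, here is a minimal sketch of tracking "met sprint commitments" during a migration. The sprint data is entirely invented for illustration:

```python
# Hypothetical sketch: "met sprint commitments" as a leading indicator
# during a migration. The sprint numbers below are made up.

def met_commitment_rate(sprints):
    """Share of sprints where every committed item was completed."""
    met = sum(1 for s in sprints if s["completed"] >= s["committed"])
    return met / len(sprints)

sprints = [
    {"committed": 8, "completed": 8},
    {"committed": 10, "completed": 7},
    {"committed": 9, "completed": 9},
    {"committed": 8, "completed": 8},
]

print(f"{met_commitment_rate(sprints):.0%}")  # 3 of 4 sprints met -> 75%
```

Unlike post-migration NPS or revenue, a number like this moves every sprint, which is what makes it usable for in-flight decisions.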
For example, idealo published a blog post about their recent 2-year cloud migration project — including the OKRs they had set for the effort. Let’s review their KRs:
KR1: We migrate 100% of all applications into AWS until the 1.10.23.
This works as a high-level KR for the entire project. It’s essentially the trackable progress of the project. Maybe smaller chunks or contributing metrics could be more beneficial for quarterly-ish durations.
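One way to picture those smaller chunks: break the single "100% migrated" number into cumulative quarterly targets. All counts and targets below are invented assumptions, not idealo's actual figures:

```python
# Hypothetical sketch: tracking a "migrate 100% of applications" KR as
# cumulative quarterly chunks. App counts and targets are invented.

TOTAL_APPS = 120
quarterly_targets = {"Q1": 20, "Q2": 50, "Q3": 85, "Q4": 120}  # cumulative

def kr1_progress(migrated, total):
    """Overall share of applications migrated so far."""
    return migrated / total

def on_track(quarter, migrated):
    """Has the cumulative target for this quarter been met?"""
    return migrated >= quarterly_targets[quarter]

migrated_so_far = 46
print(f"Overall KR1: {kr1_progress(migrated_so_far, TOTAL_APPS):.0%}")
print("Q2 target met:", on_track("Q2", migrated_so_far))
```

The chunked version gives a team something to check in on every quarter, rather than a multi-year number that barely moves.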
KR2: The average time to recover (TTR) for AWS-related incidents levels 1 & 2 stays below 4h until the 1.12.23.
From what I can tell, this metric can only be measured after the migration is completed, making it less useful for measuring the progress of the migration itself.
KR3: idealo is staying within the allocated yearly AWS budget.
This reads more like an Objective than a KR since it lacks a specific measurement. A rephrased KR could read something like this: “x% of AWS operating weeks are costing below x€/team.” That said, this KR is at least semi-trackable throughout the migration: you have to complete a partial AWS migration before you can start measuring costs, but since the transition presumably happens gradually, you can begin measuring before the project's completion. Still, it reads more effect-like than progress-like.
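A minimal sketch of how that rephrased KR could be computed. The weekly costs and the 500€ threshold are invented placeholders for the "x" values, which the original deliberately leaves open:

```python
# Hypothetical sketch of the rephrased KR: "x% of AWS operating weeks
# are costing below x EUR per team." Costs and threshold are invented.

WEEKLY_BUDGET_PER_TEAM = 500  # EUR, assumed threshold

weekly_costs = [420, 480, 530, 410, 495, 610, 450, 470]  # EUR per team

under_budget = sum(1 for cost in weekly_costs if cost < WEEKLY_BUDGET_PER_TEAM)
share = under_budget / len(weekly_costs)

print(f"{share:.0%} of AWS operating weeks below budget")  # 6 of 8 -> 75%
```

Once even a handful of workloads run on AWS, this number becomes measurable week by week, which is what makes it semi-trackable during the migration.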
Note: I have no inside information about this project beyond the blog post, and this is a pure armchair critique with no intention at all to downplay the accomplishments of idealo’s product teams. Kudos to them for nailing this migration.
There might always be voices in your company that get nervous when they see non-Outcome OKRs being used. But guess what? OKRs are, first and foremost, about measuring progress, whatever that looks like. And if pragmatic OKRs for a migration feel like such a bad idea to your company, then maybe months- or years-long migrations, or forcing OKRs onto these projects, weren't a good idea in the first place.
Thank you for Practicing Product,
Tim
Learn the strategies and tactics you need to use OKRs in a way that helps product teams prioritize work based on user problems and business goals—instead of replicating existing feature backlogs.
My Outcome OKRs for Product Teams Course enables product teams to use OKRs as a tool for decision-making in the context of Product Strategy, Product Discovery, and Scrum. Without the fluff, but with a focus on practicality in everyday work.
Learn more about my OKRs Course
What did you think of this week's newsletter?
As a Product Management Coach, I guide Product Teams to measure the progress of their evidence-informed decisions.
I identify and share the patterns among better practices to connect the dots of Product Strategy, Product OKRs, and Product Discovery.
1 tip & 3 resources per week to improve your Strategy, OKRs, and Discovery practices in less than 5 minutes.