🛠️ How to predict the future of your OKRs (and why it matters)



HERBIG.CO · PUBLISHED: Oct 25, 2024 · READING TIME: 3 min & 32 sec

Dear Reader,

Ever sat in an OKR review meeting feeling like you're just going through the motions or reporting without meaning? You're not alone. As a Head of Product, I've been there too. But what if I told you there's a way to make your OKRs truly useful for decision-making? Let's dive in.

The power of predicting the future

One of my favorite ways of helping teams pressure-test their OKRs is by trying to predict the future. I ask them to answer this question: “Imagine a check-in 4 weeks down the road. With whom will you look at these OKRs, and what discussion emerges from looking at the changed values and confidence levels?”

This is similar to a premortem exercise. By predicting the future, teams can anticipate how useful their OKRs will actually be for making decisions.

Why typical OKRs fail the "check-in-ability" test

Let’s use two of the first (documented) KRs I wrote as a Head of Product 6+ years ago (don’t judge me):

KR1: Understand 'feature' map of product status quo and compare with user expectations.

KR2: Develop design guiding principles for product simplification.

Imagine a check-in four weeks down the road:

A: “How are we doing on KR1?”

B: “I think we better understood the feature map.”

A: “Ok, by how much?”

B: “I think I would put the confidence at 80%.”

A: “But…why?”

B: “Because it still feels on track.”

A: “And KR2?”

B: “We have scheduled a first meeting to discuss the design guiding principles. So, I estimate the progress to be about 20%.”

A: “Will you do anything differently based on the progress of these KRs?“

B: “No, I think everything works according to plan.”

A:

How would you rate the check-in-ability of these KRs? Spoiler: NOT GREAT. They can neither demonstrate real progress nor guide a conversation about priorities.

But fixing KRs like these by just adding numbers would be too short-sighted. After all, how check-in-able is a KR with a number, if you can’t influence it, if it’s too lagging, or if it’s a generic KPI?

Predicting the future of your OKRs isn't about being psychic. It's about understanding what they will help you do differently later, so they can be more useful now.

HOW TO PUT THIS THEORY INTO PRACTICE

  • Be clear on what OKRs should do for you. Articulate what change OKRs should create, and for whom. Is this primarily a reporting exercise or an actual prioritization tool for the team?
  • Role-play a check-in. How would the conversations of a check-in for this OKR go? Would there be meaningful conversations or just nodding in agreement as you blaze through the numbers?
  • Change the attributes. Which attribute is missing from your OKRs to make them more check-in-able? Do you need more precise numbers? Do the KRs need to be more leading? Do you need a more influenceable metric?

Did you enjoy this one or have feedback? Do reply. It's motivating. I'm not a robot; I read and respond to every subscriber email I get (just ask around). If this newsletter isn't for you anymore, you can unsubscribe here.

Thank you for Practicing Product,

Tim

Content I found Practical This Week

How to Set Effective Quarterly Goals and OKRs for Platform Engineering Teams: A Practical Guide

Once OKRs are agreed on, the initial problems they relate to, and the groupings, will lead to some high-level features or initiatives. Each initiative can then be refined further with the teams to identify the work that needs to be done, and how it might be broken down into small increments to develop and ship (in the example, you can see how it would link to a sprint planning priority and sprint goal).

The #1 Reason Why Most OKRs Suck!


How to track slow-moving OKRs

But if your lagging indicators are moving slowly (ex: you close 1 or 2 customers per quarter), then you should opt for leading indicators instead. You’ll need to find the best metrics that can help you not only assess your confidence, but also help you understand how to adjust your strategy. Using lagging metrics can sometimes feel like more art than science, but the key is to have something that can be observed on a weekly basis.

What did you think of this week's newsletter?

Who is Tim Herbig?

As a Product Management Coach, I guide Product Teams to measure the real progress of their evidence-informed decisions.

I focus on better practices to connect the dots of Product Strategy, Product OKRs, and Product Discovery.

Enjoy the newsletter? Please forward it. It only takes 2 clicks. Coming up with this one took 2 hours.

