How it feels being Interviewed by AI

Dear Reader,

Last week, I invited you to join an AI-led concept test of a side project idea. Here's what participants shared with me about their experience:

"It was like getting interviewed by someone who is doing this one of the first times."

"I felt free to say what I wanted without 'hurting' the interviewers. I felt listened (as the AI was repeating my points). But sometimes the conversation was cut off weirdly."

"The AI often interrupted me when I was thinking. In addition, it often started to respond, realized I wasn't finished, and then took a completely different approach, forgetting what it had said before. After two or three interruptions, I naturally lost the desire to speak naturally and fell into a telephone agent's way of speaking."

So, what can you make of this? Obviously, this is an n=3 sample, so it's not a definitive judgment on AI interviewers' capabilities. With that grain of salt, here's what I'm observing:
The main caveat I shared last week remains true: be intentional about where, and for what purposes, you insert AI into your discovery practices. "Would users actually adopt it?" is a red-flag question in interview-style research, regardless of scale, because humans suck at predicting their future behavior - especially when it comes to trading time or money for a new solution.

Concept testing provides insights into fundamental customer sentiment toward a visual design, or into gaps in understanding. But it won't help you predict what will get adopted or bought. To get strong evidence about feature adoption or willingness-to-pay, you need to turn to behavioral methods, not attitudinal methods like interviews (no matter whether a human or an AI conducts them).

As a result, here's how I would summarize the current state of AI tooling in the context of Product Discovery:

Oh, and in case you're curious: Here's the live link to the (messy and work-in-progress) prototype I used for concept testing: Pour Over Diary - A Platform for Specialty Coffee Enthusiasts.

Thank you for Practicing Product,
Tim

Get my Book

As a Product Management Coach, I guide Product Teams to measure the real progress of their evidence-informed decisions. I focus on better practices to connect the dots of Product Strategy, Product OKRs, and Product Discovery.
1 tip & 3 resources per week to improve your Strategy, OKRs, and Discovery practices in less than 5 minutes. Explore my new book on realprogressbook.com