How it feels being Interviewed by AI

Dear Reader,

Last week, I invited you to join an AI-led concept test of a side project idea. Here's what participants shared with me about their experience:

"It was like getting interviewed by someone who is doing this one of the first times."

"I felt free to say what I wanted without 'hurting' the interviewers. I felt listened (as the AI was repeating my points). But sometimes the conversation was cut off weirdly."

"The AI often interrupted me when I was thinking. In addition, it often started to respond, realized I wasn't finished, and then took a completely different approach, forgetting what it had said before. After two or three interruptions, I naturally lost the desire to speak naturally and fell into a telephone agent's way of speaking."

So, what can you make of this? Obviously, this is an n=3 sample, so it's not a definitive judgment on AI interviewers' capabilities. With that grain of salt, here's what I'm observing:
The main caveat I shared last week remains true: be intentional about where, and for what purpose, you insert AI into your discovery practices. "Would users actually adopt it?" is a red-flag question in interview-style research, regardless of scale, because humans suck at predicting future behavior - especially when it comes to trading time or money for a new solution.

Concept testing provides insights into fundamental customer sentiment toward a visual design, or reveals gaps in understanding. But it won't help you predict what will get adopted or bought. To get strong evidence about feature adoption or willingness to pay, you need to turn to behavioral methods, not attitudinal methods like interviews (no matter whether a human or an AI conducts them).

As a result, here's how I would summarize the current state of AI tooling in the context of Product Discovery:

Oh, and in case you're curious: here's the live link to the (messy and work-in-progress) prototype I used for concept testing: Pour Over Diary - A Platform for Specialty Coffee Enthusiasts.

Thank you for Practicing Product,
Tim

As a Product Management Coach, I guide Product Teams to measure the real progress of their evidence-informed decisions. I focus on better practices to connect the dots of Product Strategy, Product OKRs, and Product Discovery.
1 tip & 3 resources per week to improve your Strategy, OKRs, and Discovery practices in less than 5 minutes.

Explore my new book on realprogressbook.com