[ Interview ] Lean UX Co-Author Jeff Gothelf on How Product Teams Should Do Product Discovery in 2024

A few weeks back, I spoke to Lean UX co-author Jeff Gothelf for The Lean B2B Podcast. We talked about UX, product management, innovation, experiment design, and the importance of doing effective product discovery.

You can watch the full interview below, or access it on iTunes or Spotify.

Interview Transcript

Jeff Gothelf – Product Discovery

Etienne Garbugli: My guest today is Jeff Gothelf. Jeff is the co-author of the books Lean UX: Design Great Products with Agile Teams and Sense & Respond. He’s also the author of Lean vs Agile vs Design Thinking. Along with Josh Seiden, Jeff co-founded Sense & Respond Press, a publishing house focused on bringing innovation, digital transformation, product management, and design books to market.

Jeff is also a coach, speaker, and consultant helping organizations build better products. Jeff, welcome to the podcast.

Jeff Gothelf: Thanks so much for having me on, Etienne.

Etienne Garbugli: Maybe as a first question: when you’re working with product teams, and when you speak with product teams, do you feel that teams are generally sufficiently user- or customer-driven?

Jeff Gothelf: I think that most organizations pay a lot of lip service to being customer-centric and customer-driven. I think a significantly smaller percentage of those organizations actually work that way. That’s not to say that the product managers, the designers, the engineers, the QA folks on those teams are not customer-centric or don’t want to be.

I’ve just seen that many organizations say they are, or say they aspire to be, but the majority of them don’t put the effort into actually being that way, despite having really smart, customer-centric designers, product managers, and engineers on staff.

Etienne Garbugli: So, in that case, what do you feel are the biggest challenges, or the biggest hurdles, that these teams or these people within organizations face in attempting to be more user- or customer-driven?

Jeff Gothelf: I mean, there are lots, not the least of which is that in successful companies, even high-growth companies (they don’t have to be old companies, but companies that have found product-market fit and are scaling quickly), there’s a belief that we know what’s best for the customer, right?

Look, we’re big. We’re successful. We’ve been around a hundred years or 20 years, or, you know, we’re a hundred-million-dollar company. We’re a $500 million company. We know what customers want. I don’t need to talk to them. I don’t need to ask them. Because look at what we’ve done on our own without that so far.

So, there’s a lot of that. The other thing is that being customer-centric means that your measure of success changes significantly. It changes from outputs to outcomes. Now, that’s an easy thing to say, so I’m going to unpack it for just a second. Outputs are the things that we make: the features, the products, the services, the apps, the devices, those types of things.

Most organizations manage to outputs for a variety of legacy reasons, but really, most of all, because it is a binary measure. You shipped the product or you did not ship the product. And because it’s binary, it’s easy to measure, and because it’s easy to measure, it’s easy to manage and easy to incentivize and reward.

If you’re truly being customer centric, you should be managing to outcomes. Outcomes are the customer behaviors that we see once we give our customers the product, the service, the app, the system, whatever it is. And if their behavior doesn’t change in a way that makes them more successful and then ultimately us more successful, then we have to update that system.

And so the measure of success changes dramatically from a feature-centric one (we built a thing) to a customer-centric one when you manage to outcomes, which is: we positively impacted the lives of our customers. And we know that because now they’re doing things differently that benefit them more and that benefit us more.

And so that’s really the biggest obstacle to being customer-centric.

Etienne Garbugli: And so, working with a lot of organizations or interacting with a lot of organizations, like how do you see that trend evolving?

Jeff Gothelf: Again, I see a lot of really smart people in the trenches, in the individual contributor roles, in the team manager roles, in the discipline leadership roles, the design leader, the product leader, that type of thing.

But I rarely see that conversation elevated to a leadership or an executive level outside of customer satisfaction or Net Promoter Score. We should absolutely be worried about the satisfaction of our customers, but using that one metric as the sole measure of it is incredibly risky and not terribly valuable.

And so that’s the trend. I think you’d be hard-pressed to find a designer who doesn’t believe they are customer-centric. I mean, sure, there are going to be some genius designers out there who believe they know best, but for the most part, I think the overwhelming majority of interaction designers, UX designers, those kinds of folks, are customer-centric by default as part of the profession.

I think product managers, modern product managers, people who subscribe to the modern process of software development and delivery and understanding the market (product discovery), many of those folks believe they are, and are, customer-centric. So you’ll be hard-pressed to find somebody who says, no, I’m not customer-centric.

They exist, though. There are some inward-facing product owners or product managers who don’t really believe in customer centricity. I think there are probably more of those than there are UX designers like that. And so I think the trends there are positive at the individual contributor level.

I think even engineers really get it. There are probably more engineers than product managers or designers who don’t really care about the customer that much, but nevertheless, I think modern software developers really do understand that customer centricity is key.

So, I think those trends are headed in the right direction. I think at the C-suite and the leadership level, there is a belief in maintaining competitiveness, maintaining competitive advantage. And they see organizations like Apple, Netflix, Facebook, Google, Amazon really dominating not only their own spaces, but expanding fairly broadly into other domains. And they want to be like them. And unfortunately, the only real takeaway they get from those organizations is faster time to market, right?

How can we get more stuff to market more quickly? And this is not true across the board for those big five tech organizations, but what they fail to take away is how customer-centric Amazon is, how customer-centric Netflix is.

How focused on the user experience they are. Sadly, Facebook is too, to its own ends; they’re not really focused on optimizing the customer’s experience so much as their own bottom line. And I think that’s something that’s missing.

So, they’ll focus on things like agile, for example. Right? Agile is what makes these companies successful. We need agile. And they see it as a recipe for delivering more software more quickly, which, again, is managing the outputs, not managing the outcomes.

Etienne Garbugli: All right. So, going in that direction, so say I just got hired by an organization, how would you recommend I go about diagnosing the product organization?

So, how we work and what’s the current situation in terms of how things get delivered? Are we output-driven, are we outcome-driven? Is it working in terms of agile? Is it working as an organization? And then, how would you recommend someone within an organization shift that mindset towards more of a learning, more of an outcome discussion?

Jeff Gothelf: I think how you have that conversation will vary by where in the organization you got hired, right? At what level in the organization. But the conversation you’re trying to have, generally speaking, is the same. It’s a conversation that asks the question, why? Why are we working on this? Right?

What customer is it for? Um, what benefit does it bring to the customer? If the customer is successful, how does that translate into business success for us? How does that fit into the broader product portfolio that we’re building into the strategic plans for the organization for the next year or two years, or whatever it is?

That’s the conversation you want to have. Now, again, asking why is increasingly risky to your career the lower in the organization you are, typically speaking, in larger organizations, right? And so, we want to build the conversation to ask why in a way that doesn’t limit your career in that organization.

But that’s the conversation you’re trying to have. So, I think that, you know, at the individual contributor level, it’s about asking, what are the assumptions that went into making the decision to build X, right? Or to design it in this particular way? Who’s the target audience? Right?

When’s the last time we talked to anybody from that target audience? Can I go talk to somebody from that target audience? Right? And then, as you start to collect more customer feedback, more quantitative data, and as you bring that back into the team, really seeing how the team, the product manager, the business lead, the business line leader, the executives, the stakeholders react to that data, because that data is never going to be 100% reflective in a positive way of everything that you’re doing, right?

There’s always going to be some contradiction. Hey, we’re making it blue; everyone says they like red, and whatever, right? So, how people react to that becomes really interesting, right? It’s a good indicator of the culture of an organization if they’re like, oh yeah, yeah, that’s what the customers say, but we know better. Right?

Or: that builds up a better understanding of our customer. Let’s explore and see why that is, and maybe through the next iterations we can make some improvements, right? It’s two very different conversations.

Etienne Garbugli: So, I saw that, I think last week or a couple of weeks back, you released the Hypothesis Prioritization Canvas, which I really like.

So, when you’re working on a project, when you’re working as part of a team, so how do you pick which questions to address and how do you figure out what the most important things to learn are?

Jeff Gothelf: Yeah. So, um. The Hypothesis Prioritization Canvas is a tool that I use with every team that I work with to help them determine where to focus their customer discovery work.

There is, sadly, not enough time in every sprint to do all the discovery work that we want, and frankly, we don’t need to be doing product discovery on everything. And that’s the purpose of the canvas: to say, here are all the things that we’re looking at for the next iteration, two iterations, quarter, whatever the timeframe is.

Where should we focus our learning activities? Our discovery work, our research, our customer conversations? The matrix in that canvas is based on the risk of the idea, the risk of the hypothesis, low to high, and the perceived value or the perceived impact that we believe this hypothesis will have on the customer, on the business, etc., again from low to high.

It’s perceived value because we’re assuming that this will have a big impact, right? Our educated guess tells us that, but we don’t really know. And then risk is contextual to the hypothesis or the idea itself, so it might be technical risk, it might be design risk, it might be brand risk or market risk.

The hypotheses that end up being high risk, but also high potential value or high perceived value, those are the ones we should spend our product discovery work on. If we’ve got those precious few hours every sprint where we can build experiments, talk to customers, do research, prototyping, whatever the activity is, focus on the hypotheses that are high perceived value and high risk.

Because if you get those wrong, you stand to do a lot of damage. But if you get those right, you stand to have a good impact on the customer and on the organization. So that’s what the canvas is for: to help you think through your ideas and where to focus your product discovery efforts.
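
To make that quadrant logic concrete, here is a minimal sketch in Python of how a team might sort a backlog of hypotheses the way Jeff describes, keeping discovery time for the high-risk, high-perceived-value ones. The hypothesis names, the low/high scale, and the function names are illustrative assumptions, not part of the canvas itself.

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    name: str
    risk: str             # "low" or "high"; contextual: technical, design, brand, market...
    perceived_value: str  # "low" or "high"; an educated guess, not a measurement

def needs_discovery(h: Hypothesis) -> bool:
    """High risk AND high perceived value: spend your discovery hours here."""
    return h.risk == "high" and h.perceived_value == "high"

backlog = [
    Hypothesis("One-click reorder", risk="high", perceived_value="high"),
    Hypothesis("Rename settings menu", risk="low", perceived_value="low"),
    Hypothesis("New onboarding flow", risk="high", perceived_value="low"),
]

for h in backlog:
    plan = "focus discovery work here" if needs_discovery(h) else "little or no discovery needed"
    print(f"{h.name}: {plan}")
```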

As far as determining risky assumptions and that type of thing, there are two key questions that I teach every team that I work with. The questions are boxes seven and eight in the Lean UX Canvas, the first canvas that we did a couple of years ago.

What’s the most important thing that we need to learn first and what’s the least amount of work we need to do to learn that? Now, the focus of your question was box seven, which is what’s the most important thing we need to learn first? And that’s the question that I ask every team at the beginning of a new iteration, right?

It’s a conversation about risk, and it’s a conversation about the risks that will derail the thing that we’re currently working on. Now, the interesting thing is that in a truly cross-functional setting, with product managers, designers, and engineers sitting together, you will get at least as many answers to that question as there are disciplines in the room, right?

The designers will talk about some designer challenges, some customer challenges. The product managers will talk about, you know, product-market fit or scalability or business model or business rules or whatever it is, right? The engineers will talk about feasibility, scalability, performance, security, and all of those are valid concerns.
They’re all valid risks to the success of the project. The question is, what do you need to learn right now? What’s the most important thing you need to learn right now? The thing that, if we get it wrong today, tomorrow, in this sprint, breaks the whole initiative or breaks the whole hypothesis. And that is an interesting conversation to have with your team.

At the beginning of an initiative, that’s going to be related to value, the value of the idea. Will people look for it? Will they find it? Will they try it? Will they sign up for it? That type of thing. Do they understand what it is? If you’re in a more mature version of the product, you’re going to move away from value into feasibility, scalability, security, performance, business viability, those types of things.

And so really thinking about where in the lifecycle your initiative, your idea, is helps you to understand the biggest risks. But at the earlier stages, always focus on value, because if nobody wants it, nobody will look for it, nobody will find it, nobody will try it. It doesn’t matter if you can build it.

Etienne Garbugli: So, if we talk about that specifically, one of the things I love about the Sense & Respond framework is that agility, but not focused on how you ship; it’s more of that corporate agility where you know when to adapt, change course, and change your direction.

So, in that case, what are the trigger points that tell you it’s no longer about the value, that now we should be learning about the next thing, maybe feasibility or whatever it is? How do you know when to transition between, say, one learning goal and another?

Jeff Gothelf: It’s a great question. Look, the interesting thing about all this stuff is that we are inspired by science, by the scientific method. You hear us talk about assumptions and hypotheses and experiments, right? Collecting data and all that stuff, and we do all of that, but we’re not doing science, right?

In science, there are more or less absolute truths, right? We fed the bacteria X and they mutated into a monster. That happened or it didn’t happen, right? In our world, the answers are rarely black or white. They’re usually in some shade of gray in between. And so, how do you decide when to move on to the next set of assumptions, the next set of risks, to say, hey, that was good data, so we’re going to continue, we’re going to persevere with this hypothesis, versus pivoting or killing the idea?

There’s a level of evidence that you want to use, but at some point you’re going to need to augment that with some level of gut feeling, or really, confidence, right?

Confidence is the true test. My friend and colleague Jeff Patton talks about this concept of confidence in terms of bets, right? You ran an experiment, you learned something, and you say, okay, great. What are you willing to bet me that this is a good idea that we should keep working on, right?

Are you willing to bet me your lunch? Most teams will say yeah. Are you willing to bet me a week’s vacation? Maybe. Are you willing to bet me your retirement savings? Your house? Are you willing to bet me your left arm? And so, when you collect the data, qualitative and quantitative, and it lands in that gray area and you still can’t get to that really definitive yes, we should move forward, or no, we should not, that’s a good conversation to have.

What are we willing to bet that this is still a good idea? And that really gives you a sense. If people are saying, look, I’m not willing to bet you lunch that we should keep moving, that’s a good indication that we’ve got to change course, right?

And if people are saying, I’m willing to bet you my left arm that this is a good idea, and that’s the collective sentiment, that’s a good indication that the team’s got enough data to move forward. And that’s the non-scientific part of this, right? That’s the shades-of-gray part of this.

Etienne Garbugli: So, in that direction, in one of your talks you discuss, say for example, a team focused on improving the conversion rate. You hit 15%. Maybe you can get to 20%, but the effort between 15% and 20%, would that be worth your time?

Obviously, if you were able to always make the right decisions and always move to the next assumption at the right time, there would be a lot of value there, because you would probably be growing faster. Are there telltale signs that tell you, we’ve probably milked that objective enough for now, let’s try and address the next thing?

Jeff Gothelf: Yeah, it’s interesting, right? The theory of the local maximum is always an interesting one. We keep putting effort into this, and we’re not seeing the kind of returns on it. Again, as far as I know, there’s no formula that says you have now tried squeezing more out of this five times, and that’s enough, right?

There might be a formula; I’m just not aware of one. And so it’s really a thing of: hey, we ran this experiment and we couldn’t get this to move. We tried this this week and got a 1% lift out of it. We did this and got a 0.2% lift out of it.

Okay, we’ve been at this for a month, we can’t seem to budge off of a 15% conversion rate, 16% conversion rate. That’s a good conversation to come back to your stakeholders, your clients, whoever it is, and say, look, we spent a month trying to move this. We got to 15 pretty quickly. Getting beyond 15 has proven a challenge.

There are some big things we could try and kind of pick up and move over, but we don’t believe the return is worth the investment at this point. So I think that, again, is one of those gut feelings. But it’s tough, because remember, we all love our ideas, and letting them go is difficult.
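
There is no formula here, as Jeff says, but a team can at least make the trend visible before having that conversation with stakeholders. A hypothetical sketch, where the lift numbers and the 0.5-point cutoff are invented purely for illustration:

```python
# Successive experiment results against the same conversion goal,
# expressed as percentage-point lifts (invented numbers).
lifts = [3.0, 1.5, 0.4, 0.2, 0.1]

def diminishing_returns(lifts, window=3, cutoff=0.5):
    """True if each of the last `window` experiments moved the metric
    by less than `cutoff` percentage points."""
    recent = lifts[-window:]
    return len(recent) == window and all(l < cutoff for l in recent)

if diminishing_returns(lifts):
    print("Recent lifts are tiny; time to discuss moving to the next objective.")
else:
    print("Still seeing meaningful movement; keep iterating.")
```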

Etienne Garbugli: All right. So, within this, what’s the role of experimentation and what’s the role of maybe more in-depth research when you’re doing product discovery? How do you see them best working together?

Jeff Gothelf: Well, this is how we learn, right? Experimentation and product discovery are how we learn.

Now, experiments, for me, is an umbrella term for anything that we do to build learning into our product discovery and delivery process, right? So it can be anything from a customer interview to a paper prototype, to a live-data prototype, to beta testing, price testing, A/B tests. All these things are experiments.

They’re learning activities to help us determine if we should continue investing in a particular direction. And so, to me, they are a crucial part of “the work.” In other words, these are not disposable elements of modern product development that we can skip when we’re tight on time, right?

The work, sadly, is primarily perceived as software engineering in most organizations, but in reality, “The Work,” capital T, capital W, is software engineering, design, product discovery, research, product management. All of these things are part of “the work.”

I feel successful when an organization understands that, when they understand that the work can’t be done without all of those components. And so, to me, it’s a crucial role, as crucial a role as software engineering. And people say, well, that’s not true, because software engineering is the actual making of the thing.

And while I agree with that, if you make the wrong thing or an unusable thing, then who cares? And so informing what work needs to get done is just as important as expertly crafting the experience, in both a design and a development sense.

Etienne Garbugli: So, in that case, if you use experiments as an umbrella term like that, do you always recommend setting exit criteria? For example, if you’re doing interviews or other product discovery activities that are more customer development or user research, would you say there need to be exit criteria for those activities as well?

Jeff Gothelf: Define what you mean by exit criteria.

Etienne Garbugli: More like: you have an experiment, you’re running a test, so you would have evaluation criteria, like if we reach a certain threshold, it passes. Would you do the same thing with interviews as well, or with other activities like that?

Jeff Gothelf: Yeah, yes, you need to go in with a sense of what success looks like and what failure looks like, right? There needs to be some kind of threshold with the team that says, we’re going to talk to 10 people today; if fewer than three of them tell us that there’s any value in this thing, then we’ve got to go refigure this, we’ve got to figure this out.

Now look, you may end up with, like, two and a half, right? Two people said they hated it, and one person said, you know, I don’t love it, I’m not sure I’d use it. You don’t get that perfectly clear answer.

And that’s where the challenges begin with all of this, because you don’t always get those very clear responses, that very clear black-or-white, it’s a pass or it’s a fail. But you absolutely need to go in with some kind of success threshold that, as a team (back to our conversation about bets and confidence), gives you enough confidence to move forward, right?

So, for example, let’s say you’re doing a feature fake, where there’s a link, kind of a button to nowhere. It looks like a feature on your site, but when you click it, you get nothing, or a 404 page, or a coming-soon message, right? And 10,000 people come through that workflow on a daily basis.

As a team, how many people are going to have to click on that to give you the confidence to move forward? You start off with ridiculous numbers to get the team to commit: you say, look, would two people out of 10,000 be enough? And they’d be like, no way. Do we need 9,000 out of 10,000? No way.

Okay, so it’s between two and 9,000, right? You start to narrow that down. Is it a thousand? Is it 500? Is it 2,500? That becomes the kind of conversation that you want to have, and you want to be very clear that if we hit this, we feel confident enough to move forward. That’s it.
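
As a concrete illustration of writing that criterion down before the experiment runs: the 10,000 daily visitors come from Jeff’s example, but the 2,500-click threshold, the observed click count, and the function name below are hypothetical.

```python
def feature_fake_result(clicks: int, visitors: int, agreed_threshold: int) -> str:
    """Compare observed interest in a 'button to nowhere' against the
    threshold the team agreed on before running the experiment."""
    rate = clicks / visitors
    if clicks >= agreed_threshold:
        return f"PASS: {clicks} clicks ({rate:.1%}); confident enough to build it."
    return f"MISS: {clicks} clicks ({rate:.1%}); below the bar, so discuss what we're willing to bet."

# Jeff's example: 10,000 people a day come through the workflow.
# The 2,500-click threshold is the number this hypothetical team negotiated up front.
print(feature_fake_result(clicks=1_800, visitors=10_000, agreed_threshold=2_500))
```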

Etienne Garbugli: So, it’s about increasing the level of confidence you have to move forward to the next step?

Jeff Gothelf: Yeah. Because, like I told you, it’s not science. Every now and again you’ll get lucky, and nine out of 10 people will find the thing, tell you they love it, pay you for it. Awesome, right? But I’ll give you a real-life example. I was working with a team that was building a new subscription-based business for an organization that had never had subscription-based businesses before.

So, they didn’t even have a sense of what success looks like for a subscription, but they knew what they thought they were going to charge for the service, and they knew how much money they needed to make in order to build this into a viable business.

And so, they did some quick spreadsheets for scaling this up and scaling this down, and they got a sense that about a 75% retention rate month over month was what they needed, right? So they ran this experiment with this new subscription model idea, and as long as people converted at or around 75%, continuously, on a month-to-month basis for three months, that was a good indication that they were delivering value.

So, for them, that number was 75% and it may not be the number for your organization, but that’s what they needed to hit to be able to make a business case for further investment in this idea.
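
The back-of-the-envelope math behind a target like that might look something like the sketch below. Only the 75% month-over-month retention and the three-month check come from Jeff’s story; the price, the starting subscriber count, and the revenue target are invented for illustration.

```python
def projected_revenue(subscribers: int, price: float, retention: float, months: int) -> float:
    """Total revenue over `months`, assuming `retention` of the base renews each month."""
    total = 0.0
    active = float(subscribers)
    for _ in range(months):
        total += active * price
        active *= retention  # e.g. 75% of this month's subscribers stay for the next
    return total

# Hypothetical numbers: 1,000 initial subscribers at $20/month.
target = 45_000  # the revenue this hypothetical business case needs over the period
revenue = projected_revenue(subscribers=1_000, price=20.0, retention=0.75, months=3)
print(f"Projected 3-month revenue at 75% retention: ${revenue:,.0f} (target: ${target:,})")
```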

Etienne Garbugli: That’s super interesting. I don’t want to take too much of your time, but thanks for taking the time, Jeff.

It’s really appreciated. Where can people go to learn more about your work?

Jeff Gothelf: Absolutely, it’s my pleasure to be here. So, I’m super easy to find; that’s by design. If you go to jeffgothelf.com, you’ll find everything there: my blog, what I do, how I do it, links to upcoming events.

I’ve got lots of in-person and online classes available, as well as links to Sense & Respond Press. And so that’s a great place to go: jeffgothelf.com, start there.

Etienne Garbugli: I’ll make sure to share the Hypothesis Prioritization Canvas, as well as the revised Lean UX Canvas that you guys have.

Jeff Gothelf: Perfect.

Etienne Garbugli: Thank you very much. That’s really appreciated.

Jeff Gothelf: Thank you, Etienne. That was great.

More on Product Discovery

Download the First 4 Chapters Free

Learn the major differences between B2B and B2C customer development, how to think about business ideas, and how to assess a venture’s risk in this 70-page sampler.

Enjoyed this Episode?

Subscribe to The Lean B2B Podcast for more: