On our latest podcast, we welcomed product thinker John Cutler to discuss the important but complicated role of data in B2B product organizations.
As you no doubt know, John is everywhere in the world of product. He’s the Head of Product Research and Education at Amplitude, as well as a coach, mentor, author, and all-round advocate of making excellent products.
In case you missed the webinar, here’s a lightly edited version of the conversation.
I think it’s a function of their early success. I tend to frame things by assuming there’s not a lot of negligence out there. It’s usually the inertia of your success that creates challenges down the road.
“It’s usually the inertia of your success that creates challenges down the road.”
You see successful B2B teams getting super close to their customers at first. There’s a company here in Santa Barbara whose early team went up to Portland and lived with their customers when they were kicking off their property management product tier. Here at Amplitude, if you ask our CEO what an early team should do, his advice would be to get out there and talk to customers.
So I think the challenge comes when this inertia builds. Then what happens is your product becomes more complex. You start attracting more personas. At first, you had the early adopters, but now you’ve got these multi-sided user problems.
As a founder – and I’ve talked to other founders about this – you hit this point where your intuitions about customers start to fail. Even if you’re out there talking to customers, you have this inertia around what you thought they did or wanted. But if the company is growing, no one can really pin down the customer.
“You hit this point where your intuitions about customers start to fail.”
So I’m a huge advocate for the use of qualitative data. If a PM asks me what they should do first when they join a company, I’d say get on the phone for 20 hours a week if you can.
At Amplitude, our most “mature” customers learned very early that you’re just peeling away layers of learning. This idea that you’re going to go in and find a magic metric that’s going to unlock whole parts of your business – if it were that obvious, you’d probably have a strong intuition about it already. It would be staring you in the face.
Once you pull a layer off your product, you find out how often you’re wrong, or how often you’re misjudging your customers, or that the things you did last quarter didn’t work. But that’s where the learning happens. And if you stick with it and integrate measurement into how you think and work as a company, it’s not even being data-driven at that point. It’s just sort of data-infused. It’s measurement-infused. That’s when you get to have real success in what you’re doing.
“If you integrate measurement into how you think and work as a company, it’s not even being data-driven at that point. It’s just sort of data-infused. That’s when you get to have real success in what you’re doing.”
B2B is a bit more complex. But one of the advantages in B2B is that you’re there to help someone be more efficient at their job. And because you’re not trying to persuade someone to do something they don’t want to do, there are often clear measurement strategies you can apply.
Being data-driven is not a panacea. Product teams often go through the trough of disillusionment, where they figure out that a lot of their decisions weren’t what they thought they were. But if they stay at it, they can be successful, especially in B2B.
Standard BI or business reporting is built on the idea that someone will say, “What question do you have?” Then someone will go away from the BI team and answer that question. That’s a pipe dream – especially with product experiences that are moving so quickly.
By the trough of disillusionment, I mean when teams think they will figure out the questions they want to have upfront and get perfect answers back right away. In reality, it’s much more exploratory. At Amplitude, we had to build our product with that exploration in mind.
I do think you can get a general idea of the class of questions. It’s not like you’re flying completely blind. But I think the disillusionment comes when you start to cycle through and realize that what you thought was happening isn’t what’s happening. So then you refine the measurement, reach out to customers, look at qualitative feedback at scale, and tweak the measurement again.
Yeah, or learn-measure-build. I play around with the order. I think it’s interesting that in the agile community, at least, there’s this idea that we won’t know anything until we put something in production. I think that’s because it relates to solutions, not opportunities.
I would flip it and say that often we are able to explore the size and value of opportunities that exist. We might not know precisely how to solve them – and that’s where the trough of disillusionment comes in. We put something out there that doesn’t exactly solve a problem. But I do think that research, planning, and strategy are critical. And I don’t think we’re flying blind. I think you can be better or worse at it.
Most challenges probably relate to the complexity of the products, the duration and complexity of the experiences, and the complexity of the user personas. A colleague of mine at Amplitude says, “How do we make the thousand-and-first user of Amplitude successful?” That’s a pretty crazy B2B problem to solve.
Again, going back to that trough of disillusionment, I think that people often look at that problem and write it off. They’re like, “It’s not even worth starting.” That doesn’t fly with me. So when people get analysis paralysis, I ask them: Who are you building this for? What’s their current behavior in the product? What’s their new behavior going to be? How is it going to benefit them? How is it going to benefit the business? How are you planning to measure whether their behavior changes?
“Who are you building this for? What’s their current behavior in the product? What’s their new behavior going to be? How is it going to benefit them? How is it going to benefit the business? How are you planning to measure whether their behavior changes?”
I believe that in B2B, your whole company is the product. I remember looking over a problem that customer success was having, and an engineer said to me, “There’s nothing I could build that would be more important than helping that customer success person be amazing at their job.” You may think that the new feature you want to build is more important, but the customer success person is an extension of your product.
“I believe that in B2B, your whole company is the product.”
I have a couple of simple practices that I always suggest. Say you go to an executive, and they’re like, “Build this.” Well, there’s a bet in there somewhere. By saying “build this,” they’re making a bet that that’s going to do something. So even if your job as a B2B product manager is to build something else, there’s still a story behind it.
The executive isn’t stupid. They have some idea in their mind about what’s going to happen if you build that thing. And at its simplest level, in B2B, you can put a stake in the ground around that bet and then work measurement into that activity’s kick-off.
Even if you’re working in a feature factory where it’s all about shipping, make sure that you talk to your team about what you thought would happen once the product has been live for a while. Even if you find that the data was inconclusive or you didn’t measure the right things, that in itself helps the team learn. It’s important to go back over something you tried and talk about how it didn’t work.
“Make sure that you talk to your team about what you thought would happen once the product has been live for a while … It’s important to go back over something you tried and talk about how it didn’t work.”
In B2B, there’s so much pressure to ship. Shipping feels very tangible and real. So the only way you can break through that is to reflect on the efforts you’ve made. And there’s nothing stopping anyone on this call from doing that. No, you will not get fired if you do a presentation back to the team and say, “This is what we thought would happen, and this is what we learned.” It might annoy people, but you’re not going to get fired.
I love that! I always say to teams, if you could just nudge your decision quality up 10 or 20%, it could have amazing results – more than anything you could ship, more than hiring 30 new people.
This is the difference between measuring to learn and measuring to control. If all you’re doing with your team is measuring to control people, there will be all sorts of adverse effects downstream.
Assuming you can do it effectively in the small, the issue seems to be doing it effectively at scale. And that’s where some teams give up.
As someone who has pored over thousands of pieces of qualitative feedback, I think it comes down to how you keep the empathy and connection and how you position that feedback. For example, I always suggest that teams relate feedback to behavioral personas when possible, to understand what a person does in relation to what they say.
“As someone who has pored over thousands of pieces of qualitative feedback, I think it comes down to how you keep the empathy and connection and how you position that feedback.”
Yeah. I am excited to see products like Productboard that tackle that problem in a purpose-built way. But I think that some percentage of it is your commitment to the process. And this is the difference between data snacking versus working data collection into your mission and what you’re doing.
It’s one thing to have people snacking through qualitative data to put their case together. But sitting with your team and going through the qualitative feedback together, having a clear sense of the opportunity, and being focused on the decisions you can make – that’s another thing altogether.
“To me, data snacking is just pinching and grabbing any data you can to make whatever case you need at that moment. The opposite of that is integrating data into the way you work.”
My general philosophy is that the more you can start together with the people who are going to work on something, the more you can delay convergence on the effort and focus on the opportunity.
Sometimes I go opportunity-problem-solution. Because there’s an opportunity, which has a lot of nested problems, which have a lot of nested solutions. In many cases, I think that the role of product is to help frame that opportunity or at least the context around it. But when I talk about starting together, I mean trying to get a team of engineers, designers, data scientists, and marketers together for a while at the beginning – whether that’s three months or three quarters. It’s about peeling away the layers and understanding the problem together.
I use the test of the smartass in the back of the room. Imagine that the CEO is standing up and explaining how a particular number has gone up or down. And the person with a real understanding of the domain and product – the smartass – is at the back, pulling their hair out.
To me, that is success theater, because the metric doesn’t hold up under scrutiny. It doesn’t have any complexity to it. It’s not talking about the decisions that need to be made, or the value people create in the product. It just feels fake. That’s my test – whether people roll their eyes when they hear someone talk about a metric.
One sign is when you realize that your personas have multiplied, and your workflows have become more complex. The sign to me is when you can’t keep it in your head. Like, I can’t keep Amplitude in my head, because it’s a complex product in many ways. There are many layers to it, many personas. Another sign is when you talk to a customer, get feedback, and you’re like, “I’ve never heard anyone say that before.”
I think you’ll notice when your intuition starts breaking down. If you’re a founder, I usually suggest assigning someone in your company to call BS on you when your intuition breaks down, because founders are the last person in the room who wants to admit that. So, find an accountability partner in your company who will call you out when you’re no longer able to keep it all in your head.