Lenny's Podcast: The UX Research Reckoning Is Here | Judd Antin (Airbnb, Meta)
EVERY SPOKEN WORD
150 min read · 29,635 words
- 0:00 – 4:16
Judd’s background
- Judd Antin
User-centered performance refers to customer obsession or user-centered practice that is symbolic rather than focused on learning. It's hugely common, I would argue. It's work we do to signal to each other how customer-obsessed we are, not 'cause we want to make a different decision. If your, like, listeners are like, "I don't do that," I'm kind of like, think about it for a second. This is extremely common. Every time a PM comes to a researcher at the end of a product process and says, "Can you just run a quick user study, you know, just to validate our assumptions?" That's user-centered performance. It's too late to matter. We got to ship it, right? What they want is to check the box. One of my big kind of mantras was, we don't validate, we falsify. We are looking to be wrong. Many PMs, many designers are not in that place. They do not want to be wrong. They're looking to validate and that's user-centered performance.
- Lenny Rachitsky
(instrumental music) Today, my guest is Judd Antin. Judd helped build the user research practice at Facebook, he was a longtime head of research at Airbnb, and his direct reports have gone on to lead research teams at Figma, Notion, Slack, Robinhood, Duolingo, Faire, and other amazing companies. These days, Judd spends his time consulting, helping companies with organizational challenges, product strategy, design, research, hiring, onboarding and crisis management. In our conversation, we unpack a conclusion that Judd has come to recently about how the user research field is going through a reckoning and what needs to change both within the user research field and how companies leverage user research going forward. Judd shares what the user research field has gotten wrong over the last decade, how PMs and designers rely on user research too often and to answer the wrong questions, where user research will continue to provide significant value and how to best leverage your researchers, why it's important for researchers to think about the business goals more versus just what the users need, what to look for when you're hiring a user researcher, how PMs can be better partners to researchers, and also a phenomenon that I love that Judd describes and often witnesses that he calls user-centered performance, where everyone acts like they care about the user, but they're just doing it for show and already know what they want to do. This episode has a lot of spicy takes and will probably upset some people, but Judd is sharing some real talk here that I think we all need to hear. With that, I bring you Judd Antin after a short word from our sponsors.

This time of year is prime for career reflection and setting goals for professional growth. I always like to spend this time reflecting on what I accomplished the previous year, what I hope to accomplish the next year, and whether this is the year I look for a new opportunity. That's where today's sponsor, Teal, comes in.
Teal provides you with the tools to run an amazing job search. With an AI-powered resume builder, job tracker, cover letter generator, and Chrome extension that integrates with over 40 job boards, Teal is the all-in-one platform you need to run a more streamlined and efficient job search and stand out in this competitive market. There's a reason nearly one million people have trusted Teal to run their job search. If you're thinking of making a change in the new year, leverage Teal to grow your career on your own terms. Get started for free at tealhq.com/lenny. That's tealhq.com/lenny. This episode is brought to you by Vanta, helping you streamline your security compliance to accelerate your growth. Thousands of fast-growing companies like Gusto, Calm, Quora, and Modern Treasury trust Vanta to help build, scale, manage, and demonstrate their security and compliance programs and get ready for audits in weeks, not months. By offering the most in-demand security and privacy frameworks such as SOC 2, ISO 27001, GDPR, HIPAA, and many more, Vanta helps companies obtain the reports they need to accelerate growth, build efficient compliance processes, mitigate risks to their businesses, and build trust with external stakeholders. Over 5,000 fast-growing companies use Vanta to automate up to 90% of the work involved with SOC 2 and these other frameworks. For a limited time, Lenny's podcast listeners get $1,000 off Vanta. Go to vanta.com/lenny. That's V-A-N-T-A.com/lenny to learn more and to claim your discounts. Get started today.
- 4:16 – 7:33
Critiques and responses to Judd’s post “The UX Research Reckoning Is Here”
- Lenny Rachitsky
(instrumental music) Judd, thank you so much for being here. Welcome to the podcast.
- Judd Antin
Lenny, thanks for having me.
- Lenny Rachitsky
It's my pleasure. So we actually worked together at Airbnb for many years and as I was preparing for this, I realized how many of the people that you managed went on to do amazing things, so I'm just going to read a list of people that worked for you and what they do now. So we had Matt Gallivan who now leads research at Slack, we have Jana Bray who leads research at Notion, Celeste Ridlen who leads research at Robinhood, Rebecca Gray who leads research at Faire, Hannah Pileggi who I think was l- leading research at Duolingo, Luis Barrall who leads research at Figma, and then Noam who was leading research at Wealthfront, I- I think he moved on to something else. What a fricking crazy alumni community and group from this one team that you hired and incubated.
- Judd Antin
Now I've never looked at that list, but I'll tell you, I have been so privileged to work with all these amazing humans, I can't take credit for it, they're just outstanding people and I'm glad the diaspora is out there 'cause like these people, rock stars.
- Lenny Rachitsky
Okay, so the main reason that I wanted to do a podcast episode with you is that you wrote this piece that was titled The User Research Reckoning is Here, which I understand caused quite a stir in the research community and I think adjacent communities, and let me just read, uh, one of your takeaways at the top of your post to give people a sense of what it was about. So you wrote, "The user research discipline over the last 15 years is dying. The reckoning is here. The discipline can still survive and thrive, but we'd better adapt and quick." Before we get into the meat of the piece, could you share a bit about just the reaction to this piece and maybe if it was a surprise and what you expected would happen when you put this out?
- Judd Antin
Yeah, I-I mean, I was definitely surprised. Uh, I- I- I wrote it because I wanted to start a conversation about something I was thinking about. I didn't really know who would read it and, and in the end it turned out, uh, a lot of people read it. Uh, I learned that using the word reckoning, uh, may have been a mistake because it inspires a lot of drama in a conversation that I wanted to be really productive and positive. Overall, I would say though that like the response was, was very positive. Um, it, it seemed to resonate with a lot of people who reached out to me. I spent a lot of time talking to teams, to designers, to researchers. But there were also a ton of critiques. I would say some of it was like people thought I was throwing research or researchers under the bus, like it's, it's researchers' fault, we're doing it wrong, which I don't believe at all, and that I wasn't taking responsibility as a, as a research leader or a design leader myself. And like a real... Uh, the- the- the most interesting one I would say was the anti-capitalist crew because one of my points that we'll talk about is that I think researchers need to be more profit-focused and there are a lot of people out there who think that's not... I think they think that's not cool or not research's job and I- I'm kind of like, well, what are we doing then if we're not helping businesses succeed? Um, so but that was the, that was the most surprising critique for sure.
- Lenny Rachitsky
I've, I've worked with some of those people who are just like, "Why are we growing? Why do we focus so much on growth?"
- Judd Antin
(laughs)
- Lenny Rachitsky
"Why do we need to grow this business?" So I get, I get that.
- Judd Antin
Yeah. Maybe, maybe it's the wrong industry for them.
- Lenny Rachitsky
Yeah, yeah, not a fan of
- 7:33 – 8:53
The state of user research
- Lenny Rachitsky
that. Okay, so let's actually dig into the meat of your message and the big takeaway and kind of the conclusion of what you're finding is happening in user research. And I know a lot of this comes from a lot of user researchers have been laid off at a lot of companies. It was one of the hardest hit, uh, teams and so I think a lot of this comes from that. So yeah, so let's just start big and then see where it goes.
- Judd Antin
So yeah, I mean, everybody, uh, who's paying attention has noticed that there have been a bunch of layoffs and I, I think back in the summer I was thinking, listen, this seems to be hitting UX and UX research particularly hard. Is there something going on? Is there a bigger picture? Is it maybe the... The reason I use the word reckoning is 'cause to me that's like, hey, a moment to take stock, um, and triggered by the fact that a lot of wonderful humans may have lost their jobs and many more are afraid of losing their jobs. And so if it's a sign, the fact that research has been hit so hard, it's a sign of what? And so the thesis of my article is really, it's a sign that maybe the system is a little more broken than we think, and that research is not driving the value or impact that it should or could. And that's for a bunch of reasons, I think. Some of it is, is stuff that research can do better and a lot of it is how research is integrated and positioned in companies. And at the root of all that, I think, is that we're, we're just doing too much of what I would consider the wrong type of
- 8:53 – 14:05
Macro, middle-range, and micro research
- Judd Antin
research. And what I mean by the wrong type of research is I have this framework that's in the article, macro, middle-range, and micro research, at least three ways to talk about it, and the intuition of what those are is pretty simple. So macro res- research is like big picture, strategic business focus, forward-looking innovation, you know, look at the market, look at competitors, you know, long-term research to understand where the product should go next. Stuff like that. And then you have micro research, which a lot of really technical usability falls into, all the beautiful stuff that researchers do to enable like a really high-quality, excellent, pixel-perfect thing to go out the door, laser-focused research to understand A/B test results, stuff like that. And then you have this middle range, which is this globular place where the research questions are sort of, uh, middle altitude and a lot of the core kind of, let's say, user understanding questions fall here, and a lot of what research is doing is research in that space. It's sort of let's take a group of people and ask some questions about how they think, feel, behave, how they're using a product or not using a product, and it's just this devastating mix of really interesting to many, including researchers, and not impactful enough for the business. That's kind of the core thesis. Researchers do it because it's interesting, but honestly, and I think we should talk about this, Lenny, researchers also do it because it's the kind of work we most often get asked to do.
- Lenny Rachitsky
Yeah. That's exactly what I was thinking. That's what as a PM, like that's, that's what I want to get answers to is like how should we think about this one product? And, uh, I totally get this.
- Judd Antin
Yeah. It's, yeah, like the, the questions turn out to be really interesting and there are many cases, eh, and many companies where it's super impactful, but the problem with those types of questions is they tend to be really... Like, they trigger all the worst stuff that researchers experience, right? So they yield results which are interesting, but sometimes hard to operationalize. Um, they trigger the post-hoc bias really, really, really well where like a lot of people can say confidently like, "Oh, that was kind of obvious. We knew that already." And they fulfill this kind of need for us to feel and be customer obsessed, user centered without changing anything. And so doing too much of that research to me is a symptom of a broken system, right? And where, um, like companies are really different from each other. I heard from so many after this article and they're like, "Well, at my company in my industry it's like this or not like this." But in tech, we spent the last many years hiring, hiring, hiring researchers, but maybe I- I'm sure most of your listeners are familiar with the idea of a ZIRP, you know? Like, maybe it was a zero-interest-rate phenomenon where it was okay when the money was easy to hire researchers even though we were not setting them up properly. We were going to set them up to fail. We set them up as a service function. We didn't know what research was for. We didn't know how to d- how to really drive impact with it. And, and that's where the reckoning comes from, is like that era is over. Research, I think, is more crucial than ever. Researchers, good, great researchers are more impactful than ever, but it's, it's in a new space. We're in a new space now.
- Lenny Rachitsky
I want to make sure people understand this, this framework and specifically how would you best describe the difference between this middle range research and macro research?
- Judd Antin
So middle-range research, um, is usually focused on a more specific set of research questions or a constituency. So if macro is like, "Let's understand the overall competitive landscape. Let's do a concept car-type project where we really look ahead. Let's get involved with strategic planning," which is a wonderful thing for researchers to do. Um, you know, do TAM studies, other things like that, that stuff lives in the macro space. The middle-range space is, like, um... What's a good example? We want to know h- how Airbnb hosts feel about their payment options. That's, like, a really interesting reasonable question, right? And we can go out and do research on that. But it's not that specific. It's not really targeted at a business problem yet. It could be, right? Maybe that's a result of the research. But it yields these p- kind of like middle-range insights in which we learn things like, "Well, hosts want flexibility about their, uh, you know, payment option." I'm making this up but, you know. And that's a good example where it's like, it's not that that's not an interesting set of questions. It's just not quite pointed enough in order to, like, re- And it's not framed in the language of, like, the funnel or the, or the business strategy or the OKRs. It's not quite enough aligned enough to that. It's too globular in that middle level, and it ends up not driving impact.
- Lenny Rachitsky
I think it also leads to a lot of the things, as you've described, people don't like about research. It delays everything. You have to wait for the research to be done to, like, have an answer to make a clear decision. It also creates this issue that people complain about, that, like, PMs and product teams don't want to just make a decision on their own. They're like, "I will get this additional data point and make sure research tells us this is the right answer," instead of just trusting their... G- I guess, maybe along those lines, this may be going
- 14:05 – 15:46
What teams get wrong when it comes to research
- Lenny Rachitsky
off a little track, but what are your... or what's your advice there for, say, product managers or PMs, or uh, product teams of, to not necessarily rely on research for that middle research?
- Judd Antin
I think the reason why so many PMs ask for those middle-range questions is because they haven't really gotten deep with their researcher in a way which can leverage it for maximum impact. So if the question is like, "W- hey..." Judd, you just pointed out, like, a bunch of problems. Like, can you be more solutions-oriented? Well, the solution is simple, but not easy to me. It's that we need to restructure the way we make products in a way which integrates, uh, research much more fully. Like, it looks like consistent relationships in which researchers and the work and the insights they provide are a part of the process from beginning to end. And, like, I think, Lenny, you as a PM, that's how you worked, you know? I remember you. Like, I know who you worked with. You worked with great researchers. But honestly, most product processes are not that way. And so that's when you h- like, research is a service function, it gets called in right at the end. I- it's reactive in the sense that a researcher in the room listening and participating in the conversation could have a ton of impact on framing exactly the right question that will drive maximum business impact, maximum product improvement at that moment, and then go do it quick and get back, and we're onto the next. But they weren't there. The relationship wasn't there. They're not engaged in the, in the project from the beginning, and, and, and that's the number one root of the problem. As long as research is a service discipline, I think we're gonna be stuck in this spot.
- Lenny Rachitsky
When people might be hearing this, on the one hand, it's research has been not as helpful to teams as they thought, and researchers have been spending time
- 15:46 – 17:30
The importance of integrating research from the beginning
- Lenny Rachitsky
on the wrong thing. On the other hand, your advice is integrate research from the beginning, make them more involved throughout, and I think that might confuse people. What's your... Or how should people think about, like, research is actually more important, you should integrate them more deeply?
- Judd Antin
There's a vicious cycle that's been happening, right? It is from, from where I sit, and this is what I hear from, you know, many, many researchers and research leaders, which is, a lot of companies hired a lot of researchers with great intentions, didn't quite know how to integrate them, you know? And research is a, is a... UX research is a kind of a newer discipline, so maybe that's not surprising. We're still learning how to use it. Cool, let's evolve. But a lot of companies hired these people, but they hired them into kind of like a service discipline, very reactive, not in the room, not integrated in the way I said. And so they had less input on the questions to ask, uh, not in... Or they're included, but only at the end, right? And then they're unable to build those direct relationships to be there in the room to actually, like, drive the questions and insert insights, 'cause a, a good researcher is like the repository of insights you need for growth. But they're not there. They don't participate in the decisions, so they end up re- doing research. They have jobs to do, so they do research that is too reactive, it doesn't matter, and then it doesn't... You know, it's less impactful. Researchers or, um, executives kind of conclude that, therefore, researchers are not as impactful, and then they get sidelined or laid off, and the cycle continues. So I think the short circuit is the constant engagement. If you take a great researcher and you insert them consistently in a product process, I feel confident that researcher will drive, uh, product improvement, uh, metrics impact, growth, all the things that we- you want to see as a PM and a product leader. It's just r- That's the exception, not the norm these days.
- 17:30 – 19:53
Traits of great researchers
- Lenny Rachitsky
This may be a hard question to answer, but when people hear, "If you have a great researcher, here's how you approach it," what are signals that your researcher is great versus not great? What are some things people could look for to tell them like, "Oh, maybe I have the wrong researcher on my team"?
- Judd Antin
So the best researchers, I think, are, first of all, multimethod. The first iteration of user research was primarily a qualitative discipline. But a strong opinion that I have is that is largely one of those models that needs to evolve. It's not that qualitative user research is no longer important. It's that the best researchers have five tools. I think they have five tools, and those five tools are, um, number one, what we would call formative or generative user experience research, so looking ahead, innovation-focused, really open-ended, maybe more ethnographic. Let's go out into the field and talk to hosts and guests on Airbnb, uh-... um, let's see people using our product in the field, stuff like that. So that's formative. Um, the, um, the second type is evaluative, right? So more like usability testing. The third tool is basic rigorous survey design, right? It's the best scaled way to get, um, responses from communities small and large. You can get a lot out of really well-crafted surveys. But to do that, you have to have the fourth tool, which is applied statistics. The best researchers know a little bit of stats. Like, you can't interact in a world of A/B testing without knowing basic statistics. And then I, I, in the old version of this, the fifth tool was, um, was SQL, because I think good researchers need to be able to run their own queries. These days so much of that is dashboarded that the fifth tool may now be, um, prompt engineering, (laughs) you know, which is a thing we could talk about. But I think it's some, maybe that, maybe that's the fifth tool is somewh- is it technical skills that fall in between querying your own data, understanding it very well, and, and companies that are awash with data, and then interacting with generative AI.
- Lenny Rachitsky
Amazing, that's such a cool list. Okay, so just to play it back. Formative, generative, innovative, uh, skills to think bigger and, you know, and come up with new ideas. Um, usability.
- Judd Antin
Yep.
- Lenny Rachitsky
Yeah, usability. What did, how did you describe it? I have a different word here. Evaluate?
- Judd Antin
Evaluative, right.
- Lenny Rachitsky
Evaluative? Okay, okay, yeah.
- Judd Antin
So looking at, evaluating products and doing more... Really, that's kind of the micro level of this, yeah.
- Lenny Rachitsky
Survey design, being really rigorous about it, s- applied statistics, and then SQL/dashboard/prompt
- 19:53 – 21:10
Advice for evaluating user researchers
- Lenny Rachitsky
eng- engineering.
- Judd Antin
Right.
- Lenny Rachitsky
Maybe just one last question along this thread. Also a big question, but any advice for how to evaluate these skills/interview for them? I know this is its own, like, deep topic, but any advice for someone trying to find this person?
- Judd Antin
You know, I've interviewed hundreds or thousands of, of researchers, and the, the way I usually approach that is, you know, you want, you want a researcher who's a S- who's got a Swiss Army knife, because if all you have is a hammer then everything looks like a nail. And so if you give in the context of an interview let's say a researcher a pretty juicy open-ended research question, then you want to see how they handle it. And a good answer is usually multi-method, right? We're not going to handle it in any one way. We're going to say, "Well, here's a couple of ways we could deal with this. Here's how we could do this in a day or a week or a month." I mean, we usually don't have a month, but sometimes big research projects go on for that long. "And here are the different sets of methods that we can use." So see where they go. It's actually pretty simple. Most researchers are deeper in one than the other, and sometimes you can make up for those five tools with the team, right? So you have experts who are, who are kind of T-shaped but maybe deeper in one or several of those, those ways. But when I built a team at, at, uh, Meta and at Airbnb, that was my goal, is individually as researchers build up those tools, and then as a team build deep expertise that would fill all the gaps.
- 21:10 – 23:55
Balancing business and product focus
- Lenny Rachitsky
Coming back to the main premise of your post, one of your big takeaways is researchers need to be much more business-oriented, thinking about what helps the business versus the user, which I think to a lot of researchers will feel really weird. Can you just talk about your, your kind of takeaways there?
- Judd Antin
So much of user experience practice, not just research but design too, is focused on empathy, right, and very user centered. This is beautiful. I'm not saying that we should abandon that. I think what I'm saying is there's an overlapping Venn diagram where you have the user and profit, or the business. And what researchers need to do is be way more explicit about finding that overlap. So one thing I u- I, I often, when researchers ask for advice, they're like, "Well, what should I do to be more busi- prof- business or profit fo- focused?" I say something like, "Did you read the last quarterly report," if it's a public company. "Did you listen to the shareholder call, you know?" Um, and they're probably like, "No, you know, it's full of a bunch of, you know, language I didn't quite get," and I'm like, "Mm-hmm. Yeah." So there you go. That's the language you need to learn. Scour, scour your, um, you know, Google Drive folder, your internal folder, and look for all of the documents that are about this quarter or this half or next half strategy. What are the OKRs? Understand the metrics and the funnel, the conversion funnel. Like, know it back and forward, because then what you're doing is you're proposing, you're, you're, if you're in the active conversation you're saying, "Cool, I hear you asking that research question. I've identified this is exactly the spot in the funnel where I think we need to do work," right? There's an, there's an opportunity here, or that competitor is eating our lunch, um, with this group of users. Like, I know that because I, I read the competitive report and I understand it deeply. So, like, those are skills that many, some researchers have and a lot are building these days, but historically, like last 15 years, it hasn't been a thing we've been as, as focused on, and I think that's an evolution that needs to happen.
- Lenny Rachitsky
I think a lot of PMs listening to this are gonna be like, "Hallelujah. This is exactly what I've been trying to convince people of. It's what I've been trying to convince my researchers of." Design often falls into this.
- Judd Antin
But Lenny, the opposite is true too. Like, because you gotta take the average PM who lives in that land all day every day-
- Lenny Rachitsky
Yeah.
- Judd Antin
... and what they do is not in the Venn. You know, I think those are people who are also, like, performing customer centricity and performing user centeredness a lot when they're really not interested. And so this is, like, not about researchers. This takes two sides. Fixing this broken system takes everyone, researchers, PMs, designers, everyone at a company, but also the way that organization is structured, and integrating itself in a different way. Everybody's gotta come to the table.
- 23:55 – 26:42
User-centered performance
- Lenny Rachitsky
Such a good point. And you have this actual term that you call, uh, user centered performance, where it's a performance of being user centered. Can you talk about that, and then just what advice you'd give to PMs that hearing this are like, "Yes, I love everything you're saying," and then not realizing maybe they're too far in that extreme?
- Judd Antin
So user-centered performance is, uh, a term I made up, (laughs) 'cause it's fun to make up terms, and it refers to, um, customer obsession or user-centered practice that is symbolic rather than focused on learning, right? So it's hugely common, I would argue. It's work we do to signal to each other how customer-obsessed we are, not 'cause we want to make a different decision. And like I... If your, if your, like, listeners are like, "I don't do that," I'm kind of like, think about it for a second, right? Because there, there... This is extremely common. It shows up in explicit ways and implicit ways. So explicitly, I would say every time a PM comes to a researcher at the end of a product process and says, "Can you just run a quick user study, you know, just to validate our assumptions?" That's user-centered performance. It's too late to matter. That PM is not interested in being wrong (laughs) at all. It's too late in the game for that. We got to ship it, right? What they want is to check the box. So any check-the-box style research is a wild example of user-centered performance. I would argue every researcher has probably had to do executive listening sessions, you know? 'Cause a lot of PMs, founders, product people, but designers too, they want to get close to the customer, right? And so, like, "Can I do some focus groups? I want to be there. I want to ask them questions." This is 97% performance. It's well-intentioned, but it isn't focused on learning. It isn't going to drive better outcomes or more impact. And then there's all these implicit ways that people engage in that kind of user-centered performance too. A lot of it comes down to cognitive biases, confirmation bias, ego. Like, one of my big, kind of mantras was, we don't validate, we falsify, right? We are looking to be wrong. That is the mindset you should use when you're approaching insights and research. I want to be wrong. I want you to do research that shows we were off-base in the following ways.
Tell me exactly how and why in a way that allows me to fix it quickly. But many PMs, many designers are not in that place. They do not want to be wrong. They're looking to validate, and that's user-centered performance.
- Lenny Rachitsky
Oh, man. I think a lot of people are hearing this and feeling, feeling exposed.
- Judd Antin
Exposed.
- Lenny Rachitsky
(laughs) I feel like you're like this Deep Throat person coming from... Sharing these things people don't want to talk about.
- Judd Antin
I know.
- 26:42 – 30:15
The role of intuition in product development
- Lenny Rachitsky
There's this quote in your post I'm gonna read.
- Judd Antin
Mm-hmm.
- Lenny Rachitsky
"Product managers love to ask for middle range research that they can use to justify decisions they're reluctant to make on their own. User designers love to ask for middle range research because it fits their model of what proper design process should look like. Executives love to ask for middle range because they don't really understand what research is for. It helps them do performative user-centeredness. In the end, they will decide based on their own opinions."
- Judd Antin
Mm-hmm. There is an important place for intuition in, in product development, of course. Like, the best designers, researchers, product people develop strong intuition for the product. But like, you got to understand, intuition is where all of those biases lie. It's where all your blind spots are. And what great insights people do, what great researchers do when you're next to them all the time is they'll expose you. I don't have to be the Deep Throat, right? Because you have somebody whose professional job is... Keeping you honest is probably the wrong way to, to put it, but you know, somebody whose c- who- whose capabilities are about expanding your horizons, making it so that your intuition is constantly improving, you don't have to rely on it. When your intuition and the evidence sort of collide in a way that either affirms or falsifies the product decision you made, now something really good is happening. So, you know, and the other thing that is inherent in that quote is, you know, I, at Airbnb wore many hats over the years. I was head of research two different times. I was head of design for guest products, and my last job was I was head of the design studio. So UX research, UX design, writing, localization, they all reported up to me. So I've seen this from many disciplinary angles in the UX field, and researchers aren't the only ones who are guilty of this. Like, I would say design has a ton of performance, and it comes from the fact that we have, like, figured out user-centered design, this process or design thinking which, which IDEO popularized. Like, that's what we're supposed to do, right? Right? Bezos told us that we, as PMs, had to be customer-obsessed, right? So that's what we're supposed to do. It's a really common and damaging thing when we are not genuinely... Like, we don't genuinely have that growth learning mindset, and it's, like, easy to sideline researchers. Like, we don't need them in that situation. We, we've got our guts.
Isn't the gut where... Like, a great PM, a great founder needs to have that gut. And they do, but they need to be open to the fact that the gut is limited and biased and narrow and wrong sometimes.
- LRLenny Rachitsky
The two sides of this are: trust your gut, I don't need research, I don't need data, I have opinions and my own experience, I'm gonna use the product and go with what feels right to me, versus pure data-driven, research-driven. For designers that are maybe listening, for product managers, do you have any advice on where to fall on that spectrum and how to best leverage research to inform that opinion?
- JAJudd Antin
Yeah, I taught a class at UC Berkeley this semester on leadership, and, um, we talked about that a lot, um, because, you know, great leaders develop intuition, right? It's the pattern-matching part of experience, right? You develop heuristics which allow you to make good judgments even if you can't quite explain where that judgment came from, right? That's what the gut is. But it's also, like I said, where bias comes from, where all the cognitive biases live, all those thorny things that lead us astray, you know, that behavioral economists and social psychologists study. There's a list of 151 of them on Wikipedia; I won't name them. But those live in the
- 30:15 – 32:54
Checking your gut instincts
- JAJudd Antin
gut. And so the advice is, you know, when you are looking to check your gut, you have to do that thing. A lot of, a lot of your listeners have probably read, uh, Thinking, Fast and Slow. System 1, System 2, right?
- LRLenny Rachitsky
I have it here right under my laptop, actually. It's holding up my laptop screen.
- JAJudd Antin
That's so appropriate, Lenny. Um, so the secret is not that sexy. It's System 2, right? So you engage that slow, methodical process in which you do analytic thinking as a means of checking your gut. Slow in the grand scheme of things, right? Slow meaning not a split-second decision, but not, like, months of analysis. That's not what I mean. The other thing you can do, and there's really great research on this, is you bring in the wisdom of the crowd, right? So the wisdom of the crowd is a phrase a lot of people are familiar with, and it works in a specific situation. The wisdom of the crowd works when the people involved with a decision are bringing diverse sources of information and judgment to the table. Obviously, if everybody has the same sources of information, then it doesn't matter how many people are out there. So if you wanna check your gut, get a bunch of different guts together. Get a bunch of different people in the room, um, who can bring evidence and intuition to bear and have an open, like, direct and kind conversation in which we might disagree. You know who's great at that? Researchers.
- LRLenny Rachitsky
Mm-hmm. Leading those discussions essentially-
- JAJudd Antin
Yeah.
- LRLenny Rachitsky
... and getting a bunch of people's opinions.
- JAJudd Antin
Yeah.
- LRLenny Rachitsky
Yeah.
- JAJudd Antin
This is the structural solution I'm talking about, Lenny. Like, I never asked for research teams to have their own separate OKRs. I said two things. Number one: shouldn't the PMs, the engineers, the designers, and the researchers all have the same set of metrics for success? Right? Because either we're doing it together or we're not. And then I said, "My metric for success is when they won't have that meeting without you." That's my metric-
- LRLenny Rachitsky
Mm-hmm.
- JAJudd Antin
... for success. If they cannot have that decision-making meeting without the researcher there, that means you've developed influence and strong, trusting relationships. You're an active participant in the process, not just somebody who provides input into someone else's process. And that is when researchers can have huge impact.
- LRLenny Rachitsky
I think of the PM role in a similar way, even though people won't have these meetings without their PMs 'cause they're often at the center of a lot of the stuff, but you wanna be a PM that people want on their team. There's a lot of teams that are like, "We don't want any PMs. We don't need product managers. They just get in the way." And I find that that's only the case when the product manager's not great and not really good at their job, 'cause most great PMs just make everyone's life easier.
- JAJudd Antin
They do. Agreed, I love it.
- 32:54 – 41:02
Common tropes about PMs, from researchers
- LRLenny Rachitsky
You mentioned also, (laughs) before we started recording, that the biggest challenge for these researchers is their relationship with their product manager. Can you speak to that and what you've seen there?
- JAJudd Antin
I'm wary of over-generalizing, but I can tell you that from my experience and from what I hear, the product-research, or product-insights, relationship is one of the most challenged. And I think it comes from the fact that fundamentally, many researchers are just not included in the process that PMs are running. And then... (laughs) Actually, I did some asking around before this, uh, podcast, and I thought, you know, there are some tropes that researchers have about PMs that are worth PMs knowing, right? Like, just four or five of them, the things that researchers know PMs say which drive us nuts, because they're not true. So the first one is that research just slows us down. Research is too slow. This is bullshit. Um, a great research team can do research in a day, a week, or a month, you know? It just depends on what you wanna get out of it. How much detail do you need? How many people do we need to talk to? What is the depth or breadth? Do we need to go to seven different countries to talk about our constituencies in Latin America? Well, that's not gonna happen overnight, but we don't often need that. The other way to look at it is: is it slower to get it wrong and fix it than to take a hot second to do the work to get it right the first time, you know? So that's BS. Good research doesn't slow us down, it speeds us up.
- LRLenny Rachitsky
And also, just along those lines, a big part of your premise is you don't need to do as much research as people are doing, like this middle-range research that a lot of time is put into.
- JAJudd Antin
Yeah. Research can go super fast. I think especially... So the macro-level research, I hope, is tied to things like annual planning processes. We did a thing at Airbnb several years ago that we called, like, Insights 2019, Insights 2020. They were concept-car projects, and we spent quite a long time synthesizing the entire year's worth of insights from every place we could get them, and then developing with designers and engineers, like, a concept car for five years in the future. So that's a long process. But at the micro level, there's so much business value to be derived. So much business value, and it can go so fast, Lenny, right? You can have results in 48 hours on these things. Like, we did a thing at Airbnb, um... There's a famous story which I'll only tell in the abstract 'cause I don't want to out anything, but we call it the multi-million dollar button. Basically, we did research which revealed that people weren't going down the purchase funnel because the call to action on the button was making them afraid that it would initiate a purchase, when really it was just taking the next step, right? We changed the text on the button with help from our amazing content design, our UX writing team. We basically changed seven characters and made Airbnb millions of dollars, right? Because what we found out was really simple. It was just like, "Hey, this button feels scary. The CTA on the button feels scary." So that's a great example of micro-level research. And that happened in, like, 48 hours; we discovered that insight basically overnight. And we were like, "Hmm, maybe we should test some other CTAs." We did. The conversion... We added, like, 1%, which is really, really hard to do, you know?
So that's a quick example of how that type of quick research can drive a huge amount of business value.
- LRLenny Rachitsky
This episode is brought to you by Ahrefs. Many of you already know Ahrefs as one of the top tools for search engine optimization. It's used by thousands of SEOs and companies like IBM, Adidas, and eBay. What you may not know is that there's a free version that was made with small website owners in mind. It's called Ahrefs Webmaster Tools. It's free and it can help you bring more traffic to your website. Ahrefs Webmaster Tools will show you keywords that you rank for and backlinks that you can get. It also performs automated site audits to find what issues prevent your website from ranking higher on Google. Every detected issue comes with a detailed explanation and advice on how to fix it. Visit ahrefs.com/awt, set up a free account, connect your website and start improving it. That's A-H-R-E-F-S.com/awt.
So just to make this even clearer, I think this middle-research zone is the stuff that does slow people down, I imagine. It's like, what are the challenges hosts have with payments on Airbnb? What you're basically saying is: spend your time doing the micro stuff, like usability research, and then the bigger stuff is part of overall planning. That's part of the planning cycle. It's not like every project you're working on needs, like, a whole research project on it.
- JAJudd Antin
E-exactly. The micro research should be much more common. A lot of researchers think that that's sort of scut work, you know? That, like, usability is something junior researchers do. I completely disagree. Like, I think we need to get back there as an industry and be like, when you make a product easier to use, when you discover problems with functionality, business metrics we care about will go up. I've seen it happen, right? That's not just work for interns and new grads, that's for sure. And then the planning process, like, absolutely, if we're integrated from beginning to end, we can help. And you know, the thing about that middle range, I think you're right. Like, that's the stuff that creates the sort of stereotype that research is slow, and a lot of times it's also because it's just not pointed enough. The researcher can also say in that moment, "I have studied the business plan. I've seen the metrics trend. I have an idea about exactly where that's gonna go." We still need to do that middle-range research. The question is valuable, but it's now very pointed, and the time is worth it.
- LRLenny Rachitsky
Amazing. Okay, I want to hear the rest of these tropes.
- JAJudd Antin
Okay, so research is too slow is the first one. The second one: I can do my own research, why do I need researchers? And like, that's true. (laughs) As product people, I hope, you know, you are engaging with customers and listening well, but no offense, garbage in, garbage out. You know? Um, the thing is, anyone can talk to a user; that does not constitute research or insights work, you know? Because one user can be powerful, but one user can be idiosyncratic, and a researcher knows how to get to the heart of that really quick. They know how to take that conversation and understand and situate it. So sure, like, democratize research, that's happening. There are tools out there that will let anybody get customer feedback, voice-of-the-customer type stuff. But a researcher is there to help you turn, uh, garbage into something that's not garbage, and to avoid the bias that can come from, like, you just reaching out to your cousin's family and then doing whatever they thought you should do to the product. So that's-
- LRLenny Rachitsky
That's interesting.
- JAJudd Antin
... that's the second trope. The third one is A/B test everything, right? Um, and like, A/B tests are great, but one of my most painful things is to sit in a room full of PMs and data scientists who have just seen the results of an experiment that, like, flipped to stat sig, and they're like, "Cool, it was significantly down over this course of time for these users." And then they just start speculating about why that is, because the A/B test rarely tells you why it changed in the way it did. And then this endless flywheel of A/B testing goes, and I'm like, "Hey, you don't have to guess. I know somebody who can get you an answer, or at least evidence that addresses the question of why we saw the test result we did, in a very short amount of time." Or you could use your customers as guinea pigs and throw more experiments at them over and over, and spend a long time on it, and come to the
- 41:02 – 43:15
A/B testing vs. user research
- JAJudd Antin
same place in the end.
- LRLenny Rachitsky
I think a similar critique that PMs often have is: A/B testing is conclusive scientifically, statistically; user research is just talking to a bunch of people. Why would I trust that? What is your best way to help PMs realize that this is actually very valuable data and you should listen to it? That it's not just, you know, a story here and there?
- JAJudd Antin
Yeah. No, I think they're both right. A/B testing is as close as we can get to making causal claims about products, right? Research is usually not oriented towards making causal claims, or it should not be. But those causal claims rarely tell you how and why things happen. And if you want to not make that mistake again in the future, you need to know how and why. If you want to build a better product in a way that doesn't just answer the narrow question the A/B test answered, you need to know how and why, and so you kind of need both. Like, beautiful partnerships between data scientists and research and insights people are, I think, what we're gonna see in that next evolution. And if you set that virtuous cycle up, if you set up the engagement where those people are involved from the beginning, you don't make those mistakes. You get the causal relationships, which are valuable for one reason, and the hows and whys, which are valuable for other reasons.
- LRLenny Rachitsky
Awesome. Okay. I think there's two more tropes you had.
- JAJudd Antin
So one of them is, like, a simple one, which is that everyone loves to quote that, it turns out, totally apocryphal Henry Ford quote, you know, about "if I'd asked my users." It turns out, to the best of our knowledge, he did not say that. And-
- LRLenny Rachitsky
Really?
- JAJudd Antin
Yeah, I know. Isn't that sad?
- LRLenny Rachitsky
I didn't know that.
- JAJudd Antin
I know. Sorry to burst your bubble, Lenny.
- LRLenny Rachitsky
Oh, wow. Um, does anyone say anything? I feel like every quote is- (laughs)
- JAJudd Antin
Is apocryphal now? I know, I know.
- LRLenny Rachitsky
Yeah.
- JAJudd Antin
What is reality? Geez, can we... Well, let's just, oh, okay, maybe he said that. Like, he certainly believed that. That's what the historians say.
- LRLenny Rachitsky
Okay.
- JAJudd Antin
But the reason that makes researchers so angry is because... That's not research. That's not what researchers do. A researcher who's going to ask customers what they want is a bad researcher. You need a different researcher. Like, I've never done that in my career. No one on my team has ever run a study like that. You know, so that just makes researchers
- 43:15 – 44:55
Hindsight bias and narrative fallacy
- JAJudd Antin
mad. And then the last one is about post-hoc bias. It's we knew this already. That was obvious.
- LRLenny Rachitsky
Mm-hmm.
- JAJudd Antin
And, um, I think a lot about this book, which I would recommend to your listeners. The author is a sociologist at UPenn named Duncan Watts, and the title is Everything Is Obvious: Once You Know the Answer. It's about hindsight bias. He makes the argument that we rely too much on intuition, heuristics, and pattern matching in a way that is inappropriate to our experience, and it leads us astray. It's kind of like a form of self-gaslighting. And it happens because we end up selectively remembering things and then constructing narratives around them in a way which makes us feel like we already knew that, when we, in fact, did not. And he talks about one of those cognitive biases called the narrative fallacy, which is the idea that people love to make convenient, simple stories about the past. Like, if I asked you about your career, Lenny, and how you got to be this amazing podcast host, you'd be like, "Well, let me tell you about this series of events." And like, we do that. It's part of how we make sense of our lives and the information around us. But it would probably be a lie, in the sense that we all kind of twist the evidence we have to fit the narrative we want to be true, 'cause it's simple and lovely and makes us happy.
- LRLenny Rachitsky
This is gonna sound self-serving, but I find I'm the opposite. I'm like, "I have no idea how this all came about. Here are some things that kind of happened, and somehow I ended up here." But maybe I'm being very modest and trying to not give myself any credit.
- JAJudd Antin
That's beautiful.
- LRLenny Rachitsky
Thank you for these tropes, by the way. This was fun. I didn't know you were gonna do that. So that's a fun little collection we've got here.
- JAJudd Antin
Yeah. (laughs)
- 44:55 – 47:26
Making recommendations based on research
- LRLenny Rachitsky
Um, I wanted to ask about this tweet by Patrick Collison that I've brought up a couple times on this podcast that I think, uh, is really interesting. And his tweet is this: "In my opinion, the best product will stem from a very strong mental model of the domain and user. User research can help you get such a model and validate it along the way. But it's important to view the syllogism as: user research improves your mental model of the user, which informs what product you should build, versus user research tells you what product to build." Does that resonate in any way? Thoughts on that way of thinking about user research?
- JAJudd Antin
Yeah. I mean, there's a double-edged sword we talk about a lot in the research community, which is about making recommendations, recommendations for design, right? So the best research doesn't, like, leave it at that, right? It tells you the what, the so what, and then the then what. But the problem is, some researchers go too far in the other direction, where they're like, "We ran this study, it yielded these insights, and therefore this is what we should build." And everyone else on the team is like, "Whoa, whoa, whoa." (laughs) Right? "Glad to hear your thoughts on the matter, but, like, there's a lot going on here. Maybe we should talk about it." And that makes perfect sense. That's like a failure of communication, and I think it speaks to the thing that Patrick is saying, which is: good research can sometimes tell us exactly what the problem is and exactly how to fix it. An example of that is the multi-million dollar button I told you about. But for a lot of the bigger-picture questions, especially the macro ones and maybe also the really pointed middle-range ones, the point isn't really, "This is exactly what we should do, and this is exactly what we should build." It is, "Let us develop a framework which is based on actual evidence, and then together as a team figure out how we want to experiment our way to a successful product."
- LRLenny Rachitsky
To close the loop on this specific thread, what is your advice to teams, researchers, to help move out of this kind of reckoning and to move forward and help the field, both from a user researcher perspective and also from just, like, a company that maybe laid off a bunch of user researchers or is trying to decide what to do with the researchers?
- JAJudd Antin
Thank you for asking. Um, I, I think I said to you earlier, like, you know, and, and I feel some pressure as, uh, maybe the first conversation that you've had specifically about research on this podcast.
- LRLenny Rachitsky
Yeah. I think so.
- JAJudd Antin
And I want to help...
- 47:26 – 51:18
Advice for teams on how to leverage researchers
- JAJudd Antin
I believe so much in this discipline of research and insights. When I said the UX research discipline of the last 15 years is dying, I didn't mean that I think research is dying. Far from it. I think there's a version of it which we're now moving past and into a new version. We're going through an evolution, as many disciplines do. And so the question for me is: how can researchers, and the companies and other people with whom they work, create a new version, a different version, an evolution which is hugely impactful for the business? The advice I give to researchers about that is: develop diverse research skills, remembering, like, the five or five and a half, uh, tools (laughs) that I mentioned earlier. Really go deep on that business knowledge, speaking the language of product and business and metrics and understanding exactly how to use your insights like a scalpel. Build those strong relationships, which is not a thing researchers can do by themselves. It requires two-way engagement, and in a way which allows researchers to do fewer things better. Most researchers that I know are working on teams where they're like, "I'm the only researcher, and I have seven PMs and 20 designers, and I'm trying to do 10 projects," and no one's gonna do a good job that way. So researchers have to learn with their partners how to say no and focus on the most important things. But that's only half of it, right? That's the research side. I have, like, two thoughts about what companies should be doing. The first one is a little bit of an aside, but not really. One thing I learned through the responses to the article was everybody came out of the woodwork from the variety of insights disciplines that are out there.
'Cause I come from a tradition of user experience research, or user research. But there are many insights disciplines in many industries, and they all wanted to claim one type of research or another and say, like, "Oh, well, we over here in consumer insights or market research have been doing that well for years." And, you know, there are many insights disciplines, and generally, I think creating silos is stupid. Actually, I'm curious what you think, 'cause here's the number one thing I heard when I joined Airbnb, and you were there. I did, like, a quick listening tour where I talked to a bunch of product people, and, um, they all kind of said the same thing. They were like, "Listen, we have all these different people throwing insights over the transom. And it's great, we want to hear from the data scientists, from the product specialists, from the customer service people, and the voice-of-the-customer, whatever, all that stuff. But they're all coming over the side and we don't know what to make of it," right? It's like too much. And that, as much as anything, is an argument for companies to stop siloing research disciplines. So when I joined Airbnb, I set out to create an integrated insights function, where it was like, let's do UX research. Let's talk about the market and competitors when we have to. Let's integrate smartly with data science functions. Let's integrate all the stuff we're getting from customer service feedback. You know? We brought over what was then the NPS program and sort of said, like, "Hey, if we're getting customer feedback there, let's all just use it to fuel this one insights machine." So that's the first piece of advice I'd give companies. And the second one, without being a broken record, is to think differently about the broken cycle. So integrate researchers into a unified lean process.
So if the researcher is not there from beginning to end, if there are not strong relationships between product people, design people, and engineering people at every level and their insights partner, we're gonna fall back into this problem where research is just a service discipline. We're not extracting the maximum value. It comes too late. We don't know what questions to ask. We're ignorant about what research can do. And so creating that integrated lean process, where a researcher is arm in arm from the beginning, is the most important advice
- 51:18 – 56:53
How product managers can be better partners to user researchers
- JAJudd Antin
I'd give.
- LRLenny Rachitsky
That last piece may be the answer to this next question, but the question is: how can product managers be better partners to user researchers, and get more leverage out of user researchers?
- JAJudd Antin
I think that is in many ways the answer: making sure that they are creating a process for their products that integrates user researchers and insights from beginning to end. Also, being willing to partner with the researcher on the ruthless prioritization. Um, I used to say that a full plate for a researcher was probably three things: two big projects and a small project, right? Like a side project. More than that, and your researcher is probably not doing a very good job. A project may take 48 hours, and that's okay, but they need your help to prioritize. They need you to participate. Great PMs will take the time to be with researchers, to go into the field, even to travel. Did you ever do that, Lenny?
- LRLenny Rachitsky
I did. I went with, uh, Luis, who, uh, came up with this, who basically told me-
- JAJudd Antin
Yeah.
- LRLenny Rachitsky
... to chat with you about this topic.
- JAJudd Antin
Thanks, Luis.
- LRLenny Rachitsky
Thanks, Luis. We did a whole, uh, tour to Paris. The whole team, or the leads of our team, went to Paris to do a bunch of focus groups and a bunch of user research behind, like, actual mirrors. I'd never done that before-
- JAJudd Antin
Yeah.
- LRLenny Rachitsky
... on that trip. And it was amazing. We learned-
- JAJudd Antin
Yeah.
- LRLenny Rachitsky
... a ton.
- JAJudd Antin
Can I tell you a quick story about-
- LRLenny Rachitsky
Please.
- JAJudd Antin
... behind the mirror? So this is back from when I was at Facebook, um, and it was the high times there, like 2012, '13, and newsfeed is really taking off. The ads are going into newsfeed. And I was the leader of a team that was working, among other things, on how to address, uh, post quality, right? Like, how do we think about what's a good post, and how do we get feedback about it? And there was kind of a team of engineers who thought: one thing you can do on Facebook is hide a post, right? So they were like, "This is easy. Let's look at the posts that are hidden the most and use that as the signal of what's a good post on Facebook." Seems kind of reasonable, right? And, like, something tripped me on this one. So I did two things. The first thing I did is I looked at the distribution of hiding by user and found out that it's power-law distributed, like everything on the internet. There are a few people on Facebook who hide a ton, and then most people don't hide at all. And so then we called these people super hiders, and we said, "Let's find super hiders, uh, around the office, and we'll get a super hider in and we'll do, like, a really traditional user interview." We just wanted to see. So literally the first person who walked in, I remember, because this was a person who had those fingernails that are so long you don't know how they can use touch screens-
- LRLenny Rachitsky
Sure.
- JAJudd Antin
... um, but they did. They were amazing at it. And it was one of those rooms, uh, with the glass, and I insisted that the eng directors and the product people be there, and they were willing, whatever. So everybody's behind the glass, and I'm there with them, and the excellent researcher's in the room, and they come in, and we're just doing a traditional think-aloud study. And so we go, "Hey, can you open up your Facebook app? We would just love to see, you know, what your experience is like." And so they opened up Facebook, and we're looking, and they look at the first story and they hide it, and they go to the second story and they hide it. And this went on for a while. And like, she's definitely using Facebook, but every time she'd finish with a story, she'd hide it. And the people in the back room were starting to chatter, and they're like, "Wait, what? What is happening right now?" And like the good researcher that this person was, they let it continue, and they're like, "Oh, can you tell me what you're thinking right now?" Come to find out that she was like, um, "Well, I hid that story because I'd seen it already." The model she was going for was inbox zero.
- LRLenny Rachitsky
Mm-hmm.
- JAJudd Antin
Which was, like, sad, because it was infinitely scrolling; she would never get there. And the reason I like that story is, um, because the people in the back room had their minds blown. It was not that we assumed that was common behavior. Like, this person could have been unique. But it was enough. Because those people were there experiencing the research, that N of one allowed them to burst their own bubble and realize, "Okay, we can't think so naively about hides as a signal anymore," and we came up with a better solution.
- LRLenny Rachitsky
That is an awesome story and such a good example of how you don't need statistical significance to get massive insights. Like, one example just gives you a, "Wow, this might be exactly what's happening, let's go validate that," versus, like, "We are 100% confident this is what happened." I love that. It reminds me, actually, of the mirror study that I was talking about in Paris; there's a Facebook element to it too. We were trying to figure out how to help hosts feel more comfortable accepting guests who are booking instantly, and one of our theories was if they were connected on Facebook, they would be more comfortable letting someone book instantly.
- JAJudd Antin
Yeah.
- LRLenny Rachitsky
And we're just like, "Hey, why don't, what if you were to connect Facebook and see if they're friends?" And everybody in Paris was very afraid of connecting and giving Facebook any data, way ahead of, like, what the US hosts were feeling.
- JAJudd Antin
Yeah.
- LRLenny Rachitsky
So it just made us, made it very clear nobody wants to actually give Facebook any data. So it was very anti-Facebook at that point.
- JAJudd Antin
Yeah. That's so interesting. Germany and France were always our, like, bellwethers for what the rest of the world would be thinking on data privacy concerns.
- LRLenny Rachitsky
Oh, man. Okay. A couple more things.
- 56:53 – 59:43
The ideal ratio of researchers in a company
- LRLenny Rachitsky
So a lot of this started with a lot of layoffs within user research, and I think, reading between the lines, there's a sense that teams don't need as many researchers as they hired during the ZIRP era. I think the question on everyone's mind is, like, how many researchers do we need? What is a good ratio? I imagine there's not a simple answer here, but what's your general advice to companies on how many researchers is right?
- JAJudd Antin
So this is a thing I've thought a lot about, especially in my role as the head of the design studio. That was, like, my fundamental question: you have all these writers, designers, researchers. How do you structure them? How many, and where, and who works on what? And the organizing principle for me was always relationships. You know you have enough when the people who need a constant research partner have one. And I would much rather create pain in that situation than spread someone too thinly. So my advice was always, like, don't try to get one researcher to cover an entire product space. Pair a researcher up with somebody who's gonna involve them in a consistent, engaged process, let them go to work, and see the impact they're going to have, but protect their time. And then other people are like, "Wait a second, that person's doing great work. I want some of that." And creating that pain, because it's the pain of loss, is the number one way to grow headcount. That's how I always approached getting more headcount: not arguing abstractly for why research is important, but asking partners who wish they had it to do the arguing for me. And so you're right, there isn't a clean answer for, like, hey, this is the right ratio, because it really depends on the nature of the product. Is it an early stage product or a late stage product? Are we talking about a startup or a late stage company? But I would argue there's always room for a researcher. Lenny, I'll tell you, and I used this in a keynote talk I gave lately: you published recently a list of, I think it was about 20 B2B companies, and-
- LRLenny Rachitsky
Mm-hmm.
- JAJudd Antin
... their first 10 employees.
- LRLenny Rachitsky
Mm-hmm.
- JAJudd Antin
Do you remember doing that?
- LRLenny Rachitsky
A- absolutely.
- JAJudd Antin
Do you remember how many researchers are in, anywhere on that list? I'll give you a hint.
- LRLenny Rachitsky
(laughs) Not too many.
- JAJudd Antin
It's between zero and two. It's one. There is one researcher on that list anywhere.
- LRLenny Rachitsky
Mm-hmm.
- JAJudd Antin
Anywhere. And that's messed up to me. Now look-
- LRLenny Rachitsky
Mm-hmm.
- JAJudd Antin
... it's just these 20 companies, and each is in its own space, so I'm not going to over-generalize. But a researcher can drive incredible value no matter what stage a company is at, because a good researcher makes you go faster, not slower. And they drive impact because they answer questions which are impossible to answer in any other way. That's true if you're a startup, it's true if you're a late stage company. Now, if it's your first 10 employees, one researcher is going to go a long way. As you grow, making sure that you're matching up researchers so that they have strong partners in the key parts of the business is the best way to figure
- 59:43 – 1:03:39
Empowering user researchers to drive impact
- JAJudd Antin
out if you have enough.
- LRLenny Rachitsky
Interesting. So your advice is, as you're starting a company, that you'll have a lot more leverage and move faster hiring a researcher versus, say, another engineer, which is essentially what most of those first hires end up being.
- JAJudd Antin
(laughs) I am reluctant to over-generalize, but I would say... I know many founders in startup mode are like, "I know what I need to build. The problem is that I need people who can help me execute." And I think that's right, and everything's a trade-off. But imagine that you could have that Swiss army knife at your disposal.
- LRLenny Rachitsky
Mm-hmm.
- JAJudd Antin
Maybe you've got an MVP out the door and you're looking to make your first major iteration, or, like many startups, you need to pivot. This is where it's like, hey, you don't have to do that alone. We deify startup founders who pivot appropriately, but I think that's what we might call moral luck, where we deify the ones who got it right even though they made exactly the same decisions as the ones who got it wrong. And the fact of the matter is, if you have an insights person with you who has that Swiss army knife of tools, you're not in it alone. You don't have to guess. Ultimately it will still come down to a tough decision that you and your co-founders have to make, but you can have evidence bearing on that decision which you wouldn't be able to get any other way.
- LRLenny Rachitsky
To close out on this thread with just a couple more questions: I think one of your big messages to researchers is that you can be empowered. Like, it's up to you to do the right sort of research and to move your career in the right direction, not become a researcher people don't need. And there's this quote at the top of your post where, the way you put it is, "I know what you're thinking. 'They just don't get it. We're so misunderstood. Our plight is to deliver insights that users use to drive business value while we're forgotten. Never driving the roadmap, no seat at the table, consistently miscast, only to be laid off in the end.'" And what I'm hearing from you is that you can change that. You can push back on doing research that isn't actually contributing. But let me ask you, what's the lasting advice you would leave researchers with to-
- JAJudd Antin
Yeah.
- LRLenny Rachitsky
... to be successful?
- JAJudd Antin
Yeah. It's tough to be operating in a broken system, and so I feel that response, where you feel kind of powerless. But I think that's not likely to lead us past this moment to the next evolution of research. So I don't blame any researcher at all for being in the spot they're in. It's been a tough go, right? However, crying about our lot is not gonna get us anywhere. The point of the article for me, and this is advice I give companies all the time when I consult with them, is like, "Hey, we can set this up in a different way which responds to the current environment and drives a huge amount of impact." Now, that takes companies making the right choices. It also takes researchers owning up and developing skills, right? Pushing back, understanding where research can have the most value, developing the skills and the knowledge and the language around the business, becoming more influential, being excellent communicators. That's one of the things I would evaluate the most in hiring, especially research leaders, 'cause I needed them to show and teach by example. It isn't just rigorous research; otherwise it's like the tree falling in the forest with no one there to hear it. You need to communicate it effectively, and in a way that's appropriate to the audience, 'cause if I'm talking to you, Lenny, it's different than if I'm talking to Brian Chesky at Airbnb. I've got to be able to give that presentation effectively, get right to the heart of it, and speak the right language. So if you're a researcher, it's not hopeless. Actually, the future of the discipline is so bright, and we can help it along by continuing to develop these different skills as companies build a model that's more inclusive.
- LRLenny Rachitsky
Awesome.
- 1:03:39 – 1:06:48
The limitations of NPS as a metric
- LRLenny Rachitsky
Okay.
- JAJudd Antin
Cool.
- LRLenny Rachitsky
I have one just random tangential question about NPS. You have strong opinions about NPS, and I just wanted to hear your perspective on the value of NPS, your experience with NPS.
- JAJudd Antin
Yeah. I do have a strong opinion about NPS. I like to say NPS is the best example of the marketing industry marketing itself, and I know this threatens many people's livelihoods, 'cause there's an entire industry of consultants and software providers that want you to believe NPS is a useful and accurate metric. The problem is, the consensus in the survey science community is that NPS makes all the mistakes. It's a garbage in, garbage out problem. The likelihood-to-recommend question is bad for a whole variety of reasons. It's bad because it's a 0 to 10 scale. It's bad because it's usually unlabeled; at best the poles get labels, and that's not the gold standard for survey research. It's bad because it has 11 points, right? And there are a couple problems with that. Number one, we find that precision goes down after five points on average, maybe seven. Number two, especially on mobile, if you're taking this survey, what percentage of those options are below the fold, right? We are not gonna get accurate survey data. So from a survey perspective, it's really bad. There's also this intuition problem, which is like: how likely are you to recommend Windows 11 to your friends and family? I am not a person who goes around recommending operating systems. The question is fundamentally flawed. The argument is that that question is a good indicator of loyalty, but there's a really simple solution, Lenny. Customer satisfaction, a simple CSAT metric, is better. It has better data properties. It is more precise. It is more correlated to business outcomes. I wanted to prove this. This is something that survey scientists know and marketers don't want you to know, and so we did the work with Mike Murakami, who led survey science at Airbnb, and he's still there, great researcher.
And we basically redid all that work to find out if all that stuff was true just for Airbnb, and it is. It's simple. Don't ask NPS. Ask customer satisfaction.
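For readers who want the arithmetic behind the two metrics Judd is contrasting, here's a minimal sketch. The formulas are the standard public definitions (NPS: percent promoters minus percent detractors on a 0-10 scale; CSAT: percent of "satisfied" responses on a 1-5 scale); the ratings below are made up for illustration and are not data from the Airbnb study mentioned above.

```python
def nps(scores):
    """Net Promoter Score from 0-10 likelihood-to-recommend ratings.

    Promoters rate 9-10, detractors 0-6 (7-8 are passives);
    NPS = %promoters - %detractors, on a -100..100 scale.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

def csat(scores, satisfied_at=4):
    """CSAT from 1-5 satisfaction ratings: the percentage of responses
    that are 'satisfied' (4) or 'very satisfied' (5)."""
    return 100.0 * sum(1 for s in scores if s >= satisfied_at) / len(scores)

recommend = [10, 9, 8, 7, 7, 6, 3, 10, 9, 5]    # 0-10 "likely to recommend"
satisfaction = [5, 4, 4, 3, 4, 5, 2, 5, 4, 3]   # 1-5 "overall satisfaction"

print(nps(recommend))      # 4 promoters, 3 detractors -> 10.0
print(csat(satisfaction))  # 7 of 10 rated 4 or 5 -> 70.0
```

Note how NPS collapses an 11-point scale into three buckets before differencing, which throws away information; that loss is one reason a satisfaction mean or top-two-box CSAT tends to have the better statistical properties Judd describes.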
- LRLenny Rachitsky
And the customer satisfaction question, what's the actual question, for people to make up their minds?
- JAJudd Antin
Overall, how satisfied are you with your experience with Airbnb? Or it could be some version of that, like, overall, how satisfied are you with your experience with customer service when you had a problem? So there could be a more specific version of that question, but those questions have better properties. And a lot of people say, "Well, hey, everybody's using NPS, right? So at least it gives me a benchmark, because I can compare my NPS to industry NPS." The problem with that is the research shows that NPS is idiosyncratic. It goes up and down in ways that we don't understand, and there's a lot of inconsistency in how it's asked that creates variation in the data, which means it's not apples to apples. So you can't even compare your NPS meaningfully to somebody else's.
- LRLenny Rachitsky
I love these hot takes. I'm curious to see who comes out of the woodwork to confront-
- JAJudd Antin
People are gonna be so mad, Lenny.
- LRLenny Rachitsky
I love that. And yeah, I've heard this many times, and people don't talk about it. Okay.
- 1:06:48 – 1:08:51
The risks of dogfooding
- LRLenny Rachitsky
Is there anything else you want to share or leave people with before we get to our very exciting lightning round?
- JAJudd Antin
Yeah, can I add one thing? Because this has come up on your podcast a few times recently: the idea of people doing their own product walkthroughs. Like, should a PM just rely on their own dogfooding of the product and their own walkthrough to figure out how to fix it? A couple of times recently this has come up, and I think the consensus seems to be, sort of, yes, this is a good thing. And I have a contrarian opinion there too. I think it is really important for everyone to dogfood their own products. The problem is relying on your intuition about those products, and the thing most PMs have trouble with is realizing you are nothing like the user. You are nothing like them in ways that will bias how you think about what's good and bad in your product, in ways that you can't necessarily recognize. Some problems with a product, you only need a pulse to recognize, and most good PMs that I know have a pulse, so cool. But a lot of problems require context of use, priorities, constraints that you just don't have and can't imagine purely on the basis of your own usage. So what I think that means is you should definitely dogfood your own product. Doing product walkthroughs to identify lists of potential issues is a great thing to do. Prioritizing that list, figuring out which issues are more or less of a problem and for whom, is an area where you should be extremely wary of relying on your own opinion, expertise, or intuition when you are dogfooding your own product.
- LRLenny Rachitsky
Thank you for sharing that. It's definitely come up a bunch on this podcast, so I think that's an important lesson for people to take away. Anything else before we get to our very exciting lightning round?
- JAJudd Antin
I appreciate you, Lenny. Thanks for having me on.
- 1:08:51 – 1:14:34
Lightning round
- LRLenny Rachitsky
I appreciate you, Judd. Well, with that, we've reached our very exciting lightning round. Are you ready?
- JAJudd Antin
I am ready.
- LRLenny Rachitsky
What are two or three books that you've recommended most to other people?
- JAJudd Antin
I recently read a business book by Barbara Kellerman called Bad Leadership. And what I love about it is that we spend a lot of time talking about good leaders, and she really dives into the worst leaders and what makes them bad, in a way that I think is really valuable for everybody. I'd also recommend... I read a lot of fiction, so two recommendations there. One, a recent Pulitzer Prize winner, Demon Copperhead by Barbara Kingsolver. It's an outstanding read that is also really sad and moving and illustrative, especially if you want to understand rural poverty. And then, on the completely other side of the fiction spectrum, if you're interested in science fiction, which I am, read The Murderbot Diaries. It's about a sarcastic killer robot, and who doesn't love that?
- LRLenny Rachitsky
I love these fiction recommendations. I feel like we need more of these on the podcast, so thank you.
- JAJudd Antin
Yeah, everybody goes to business books.
- LRLenny Rachitsky
Yeah, absolutely. Uh, what is a favorite recent movie or TV show that you really enjoyed?
- JAJudd Antin
We recently watched The Last of Us, and it blew our minds. I watched it after playing the video game at long last. If you are a person who plays video games and you haven't played The Last of Us, play it. If you don't know, the show is based on the video game, not the other way around.
- LRLenny Rachitsky
Do you have a favorite interview question you like to ask candidates that you're interviewing?
- JAJudd Antin
Think of a topic that you had to explain lately that was the most complex, and then explain it to me like I'm five. There are a lot of ways to vary that question, but the reason I like it is... I've asked this question of VP and C-suite candidates in multiple disciplines, and sometimes it's related to the conversation. I might ask them to explain something complicated about quantum computing or music theory, or it could be a complex business decision, but I want to see if somebody can break a complex problem down in a really simple way and give me an intuition for it in a short amount of time. I think that is a differentiator between good and great for many people.
- LRLenny Rachitsky
Do you have a favorite product you recently discovered that you really like?
- JAJudd Antin
Yeah, this is a really weird one, but my whole family started indoor rock climbing recently, and there's a challenge you have when you belay a top-rope climber, which is that you're looking up all the time. So they make these glasses, called belay glasses, with an angled mirror embedded in the lens so that you can look straight ahead and the view you see is up towards the person you're belaying. And I just thought that product is so perfect for that. It's a niche problem, and there isn't a better way to solve it.
- LRLenny Rachitsky
Do you have a favorite motto that you often come back to that you share with friends, either in work or in life that you find useful?
- JAJudd Antin
Yeah. So this is going to seem like pandering, Lenny, but, um, I don't know if you remember a conversation that you and I had. It must have been eight years ago. I remember where we were sitting, and it was about stoicism. Do you remember this? Anyway, we had this conver-
- LRLenny Rachitsky
I don't, but I was into stoicism for a while.
- JAJudd Antin
I know you were, because we talked about it. So the motto comes from stoicism, and it's basically: focus on the things you can control and ignore the rest. A lot of people think of this as the serenity prayer, or the serenity saying, but that was a 20th century invention; Epictetus was writing about this nearly two thousand years ago. I think about it all the time. So much of the stress and pain and worry that we have in life comes from things we can't control. So I try to let those things go.
- LRLenny Rachitsky
Amazing. I learned that lesson from The 7 Habits of Highly Effective People: the importance of thinking about those circles of what you can control, what you can influence, and what you have no control over, and how there's no reason to dwell on that last one.
- JAJudd Antin
Absolutely.
- LRLenny Rachitsky
Judd, this is everything I hoped it would be. We got into some really good stuff. I'm excited to hear how people react. Two final questions: where can folks find you if they want to learn what you're up to these days, and how can listeners be useful to you?
- JAJudd Antin
Yeah. Thanks for asking those questions. People can find me at juddantin.com. That's the best way to find out what I'm up to. These days I'm a consultant. I help people with UX strategy, org design, and crisis management. Somehow I love dealing with other people's dumpster fires, and I've found that I'm constitutionally good at it. So juddantin.com is the place for that. I also write on Medium, at onebigthought.com, where you'll find a lot of the topics we talked about today, including the original post that started this. And if there's one thing I could ask your listeners to do, it's to get next to your researcher. If you build those relationships and involve a researcher, an insights person, early and often, beautiful things will happen for you and for the business. That's the thing everyone can do for me.
- LRLenny Rachitsky
I love that. I've always done that; I love the researchers I've worked with, many of them reporting to you. A beautiful takeaway. Judd, thank you so much for being here.
- JAJudd Antin
Lenny, thank you. It's been a pleasure.
- LRLenny Rachitsky
Bye, everyone. (instrumental music) Thank you so much for listening. If you found this valuable, you can subscribe to the show on Apple Podcasts, Spotify, or your favorite podcast app. Also, please consider giving us a rating or leaving a review, as that really helps other listeners find the podcast. You can find all past episodes or learn more about the show at lennyspodcast.com. See you in the next episode.
Episode duration: 1:14:34