Lenny's Podcast: Unpacking Amazon’s unique ways of working | Bill Carr (author of Working Backwards)
EVERY SPOKEN WORD
150 min read · 30,173 words
- 0:00 – 4:26
Bill’s background
- Bill Carr
... as Jeff would say, we took it as an article of faith. If we served customers well, if we prioritized customers and delivered for them, things like sales, things like revenue and active customers, and things like the share price and free cash flow would follow. So therefore, when we're making a decision thinking about a problem, we're gonna start with what's best for the customer and then come backward from there. That te- that informs, like, what's the work you have to do to then create this new solution for customers.
- Lenny Rachitsky
(instrumental music) Today my guest is Bill Carr. Bill is the co-author of the book Working Backwards, which is a synthesis of the biggest lessons that Bill and his co-author learned from their many years at Amazon. Bill joined Amazon just five years after it was founded, stayed there for 15 years where he worked on the books business, and then as VP of digital media, launched and managed the company's global digital music and video businesses including Amazon Music, Prime Video, and Amazon Studios. After Amazon, Bill was an executive in residence at Maveron, an early stage VC firm, then chief operating officer at OfferUp. And these days Bill runs a consulting firm called Working Backwards LLC where he and his co-author Colin Bryar help growth stage and public companies implement the many practices developed at Amazon. In our conversation, we go many levels deep on how to actually implement a number of the practices and ways of working that helped Amazon become the success that it is today, including the process of how to actually work backwards, how to organize your team with a single threaded leader, how to divide up your metrics into input and output metrics, how to practice disagreeing and committing, how to implement the Bar Raiser program in your hiring process, and so much more. Huge thank you to Ethan Evans for making this episode possible and introducing me to Bill. With that, I bring you Bill Carr after a short word from our sponsors. Today's episode is brought to you by AssemblyAI. If you're looking to build AI-powered features in your audio and video products, then you need to know about AssemblyAI, which makes it easy to transcribe and understand speech at scale. What I love about AssemblyAI is you can use their simple API to access the latest AI breakthroughs from top-tier research labs. 
Product teams at startups and enterprises are using AssemblyAI to automatically transcribe and summarize phone calls and virtual meetings, detect topics in podcasts, pinpoint when sensitive content is spoken, and lots more. All of AssemblyAI's models which are accessed through their API are production ready. So many PMs I know are considering or already building with AI, and AssemblyAI is the fastest way to build with AI for audio use cases. Now is the time to check out AssemblyAI, which makes it easy to bring the highest accuracy transcription plus valuable insights to your customers, just like Spotify, CallRail, and Writer do for theirs. Visit assemblyai.com/lenny to try their API for free and start testing their models with their no-code playground. That's assemblyai.com/lenny. This episode is brought to you by Coda. You've heard me talk about how Coda is the doc that brings it all together, and how it can help your team run smoother and be more efficient. I know this firsthand, because Coda does that for me. I use Coda every day to wrangle my newsletter content calendar, my interview notes for podcasts, and to coordinate my sponsors. More recently, I actually wrote a whole post on how Coda's product team operates, and within that post they shared a dozen templates that they use internally to run their product team, including managing the roadmap, their OKR process, getting internal feedback, and essentially their whole product development process is done within Coda. If your team's work is spread out across different documents and spreadsheets and a stack of workflow tools, that's why you need Coda. Coda puts data in one centralized location regardless of format, eliminating roadblocks that can slow your team down. Coda allows your team to operate on the same information and collaborate in one place. Take advantage of this special limited time offer just for startups. Sign up today at coda.io/lenny and get $1,000 starter credit on your first statement. 
That's coda.io/lenny to sign up, and get a startup credit of $1,000. Coda.io/lenny.
- 4:26 – 9:54
Amazon’s workplace evolution
- Lenny Rachitsky
Bill, thank you so much for being here and welcome to the podcast.
- Bill Carr
Thanks, Lenny. Thanks so much for having me. Pleasure to be here.
- Lenny Rachitsky
It's my pleasure. So I was reading your book and something that I recognized as I was going through this is just how many new ways of working Amazon contributed to the way tech and business runs. And I made this little list and I'm curious if there's anything I'm forgetting that's obvious. So obviously the idea of working backwards, the idea of one-way and two-way door decisions, the concept of disagreeing and committing, input and output metrics, using memos versus decks. There's this idea of two-pizza teams, and then I know that evolves into single threaded leaders. Is there anything else that's just, like, an obvious core thing that's maybe almost too obvious that I don't even think about that Amazon contributed?
- Bill Carr
The one that's sort of non-obvious and... is really the, the way in which Amazon created a set of leadership principles that were very real, and the way in which Amazon, uh, created a set of processes to reinforce them, and, uh, this is, I think... I certainly haven't encountered anything quite like that. It was very intentional. So that is also a distinctive element of it, uh, that, um, w- we try to point out in our book.
- Lenny Rachitsky
Awesome. Okay, so maybe we'll come back to that, 'cause that is also a really powerful mechanism. So the question I wanted to ask about this is, there are companies that are bigger than Amazon, that are more successful than Amazon, that have been around longer than Amazon, but I don't think any other company has contributed so many unique new ways of working and also been able to coin them into such shareable ways.... what would you say it is about Amazon that enables this sort of way of working, and also just making things so ... just proliferate through the culture?
- Bill Carr
That's actually one of the reasons why Colin and I set out to write our book. Because everyone knows about Amazon as an innovative product company. At least certainly during the time I was there, which was from 1999 through the end of 2014, the company rolled out all kinds of innovative products, the Kindle, AWS, Alexa, Echo, um, Prime, the Prime subscription itself is innovative. And, um-
- Lenny Rachitsky
I use all those things, by the way.
- Bill Carr
Yes, (laughs) a lot of people around the world use all those things. And, um, and obviously Jeff was a huge driver of those things. But what people don't realize is that Amazon was actually, to some degree, equally focused on process innovation. In many cases, by the way, we stood on other people's shoulders. We cannot take, you know, credit for having ... For most of these, there were other, uh, inspirations or we built on, uh, work that others had done. Uh, which by the way is what I think all great companies should do. And again, that's also why we wrote the book was because we would like to allow people to stand on Amazon's shoulders, uh, to learn w- what we learned and then take all or part of these things and, you know, build from there. But to sort of more directly (laughs) answer your question, how or why did this happen, so this period of both product and process innovation actually occurred in this, this one narrow window of 2003 to 2007. During that window of time, all the products I just mentioned, and all of the processes except for one, were all developed in this one four-year period.
- Lenny Rachitsky
Wow.
- Bill Carr
And this is the period actually where we were going from, um, hypergrowth stage, you know, zero to one company, to what I would call, you know, one to whatever, 1,000 infinity, um, you know, that, that next step that companies have to make where what happens is you become, things become very complex. We're no longer just a bookstore. We sell a lot of things. We actually branched out beyond just a retail business, we had a third party marketplace business. We were experimenting in those days with providing s- you know, uh, running websites for third party retailers in those days too. We were developing new things. We were in many countries around the world. So we've become, you know, very complex. And what happens at that point is that then, you know, you, you reach this point where the CEO can no longer be in every important meeting, can no longer be involved with hiring every person, and you need, you need a, a s- a system, uh, uh, a method to run the company effectively. And Jeff Bezos is, you know, fundamentally he's sort of a very scientific and analytical thinker. Um, you know, his undergraduate degree, uh, was in computer science I'm pretty sure, although I think he actually started off wanting to get a physics degree. (laughs) He ended up moving over to computer science. He spent his early days at D.E. Shaw as a, you know, a quant on Wall Street, very quantitative mind. So he applied this kind of when he thought about this problem, he said, "Well, I need to up- I need to, uh, be scientific about this. There needs to be some system or some approach, some mechanism for me to be able to manage such a company, so I'm going to experiment, like a scientist would, with, you know, different ideas, different hypotheses, implement them and see what works, and iteratively improve." So that was kind of the mindset which we took, which by the way applied both to process innovation but also, you know, product innovation.
- 9:54 – 11:44
Amazon’s “fitness function”
- Lenny Rachitsky
Awesome. I had Eric Ries on, and I thought about this with him as well. He contributed a lot of core concepts to the way tech works, and he actually brought up a couple of, uh, concepts that were left on the cutting room floor, basically things that he thought would be, like, things people adopt everywhere. And I'm curious, is there an example of that at Amazon where you built a process and had this clever term for it and it just never spread or never actually worked at Amazon? Anything come to mind?
- Bill Carr
You know, the, the dev team, the design team, the product team, they're all in one group. And they'll go operate autonomously, but not completely autonomously because we, the senior leadership team, Jeff and the S-team, want to know that they're on the right track, so we're gonna create something called a fitness function, which was let's figure out what are the four or five or six metrics that matter most for your particular area, let's give a weighting to all of them, and then let's create an index for those and we'll measure that index up and down, and that's the fitness function.
- Lenny Rachitsky
That is a very nerdy way of, uh, organizing teams. I lo- I love it.
- Bill Carr
S- yeah. It was super nerdy, but we realized, um, uh, after, you know, I don't know how long, several months or a year of doing this, that the fitness function was not a good idea. This is what I would describe as a compound metric, uh, where you try to take several important metrics and munge them into one. The problem is it actually becomes totally meaningless. When you're measuring things, you're trying to f- you're trying to understand, like, what actions or reactions are, are creating, you know, the good outputs that you want, revenue, customer growth. But by putting them all together, you basically obfuscate that, and what we really realized is we need to just break each one of these out individually and manage them each, uh, uh, in its own way. So I, today, I discourage teams and companies from creating any sort of compound metric.
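Bill's objection to compound metrics is easy to demonstrate with a small sketch. The metric names and weights below are hypothetical, not Amazon's actual fitness functions; the point is that a weighted index can look stable while its components move in opposite directions:

```python
# Hypothetical "fitness function": a weighted index over a team's key metrics.
# The metric names and weights are illustrative, not Amazon's real ones.
weights = {"revenue_growth": 0.4, "active_customers": 0.3, "page_load_score": 0.3}

def fitness(metrics):
    """Compound metric: weighted sum of the component metrics."""
    return sum(weights[name] * value for name, value in metrics.items())

# Two very different quarters...
q1 = {"revenue_growth": 0.50, "active_customers": 0.50, "page_load_score": 0.50}
q2 = {"revenue_growth": 0.80, "active_customers": 0.20, "page_load_score": 0.40}

# ...produce the same index, hiding the drop in active customers.
print(round(fitness(q1), 2))  # 0.5
print(round(fitness(q2), 2))  # 0.5
```

Breaking the index apart, as the team eventually did, surfaces the customer decline immediately instead of averaging it away.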
- 11:44 – 18:07
Single-threaded leadership
- Lenny Rachitsky
I've done that once, and it was a terrible idea as well, where we had six different metrics and every quarter we were gonna move a different metric that contributed to a higher metric. What we realized is we just never learned how to get good at one thing, and then it turns out there's always one thing that actually impacts the bigger goal more, so you just end up working on that thing anyway. Let's actually go deeper into the single-threaded leader piece, since you mentioned it. It's actually come up a lot on this podcast of people working this way, where they have a single-threaded leader. ... And so clearly, it's worked. And I guess let's just help people understand, what does a single-threaded leader actually mean? And then why is it such an effective way of working?
- Bill Carr
So the concept of, of single-threaded leadership was first bo- uh, you know, was born from this time of, of complexity at Amazon. And where, again, large co- you know, once you get to a certain scale, you get to a point where there are competing departments, competing interests, and they are competing for some centralized pool of resources. For all of you who are working for a tech company, this is this pool of engineering resources, or today, data science and AI resources. Um, there may be other constrained resources, often design is a constrained resource. But the point is now all these teams want, you know, that pool of resources to go build stuff for them, but they're in competition with each other. So most companies solve this by having, like, an intense, centralized, highly collaborative process. We decided to go in the other direction, becau- for the reasons I mentioned, which were that we just found that we were spending all our time in these meetings planning, and a lot of the work we were doing, the artifacts we'd create, the documents, the projections, were actually not very useful either, were kind of, uh, bureaucratic time-wasters, um, largely because a lot of the assumptions built into them were deeply flawed. So you're, you're debating numbers in these documents that are based on flawed assumptions, which is a waste of time. So what we realized instead was how do we, how do we get, you know... The three things we really wanted were ownership, speed, and agility. And so we experimented with that and said, "Let's create teams that can stand alone, where there's a single leader and the cross-functional resources that they need, you know, either directly report to them or are dedicated to them." So they don't necessarily have to be a straight-line direct report. In Amazon's case, for the most part it was. There were some dotted line.
But it could be all straight line, it could be all dotted line, it could be a mix of the two. But fundamentally, we moved from what we called a project orientation to a program orientation. So project orientation means, oh, we're gonna do this project to change our search, uh, result page and algorithm, and the project is defined in this way, and it's gonna take six months. The resources will come and swarm on that, and then they'll move off to some other thing in some other part of the company. The program-based orientation says, let's stick with the search example, that there's a team that works on search, and they always work on search. And instead of thinking about things on a project-by-project basis, they, they think holistically about what they need to do to improve search. They have a set of metrics that they're looking to drive, uh, largely ones that they can control. Things like what percent of the time is a customer clicking on one of the top three results in my search page? Or how many milliseconds does it take for, for the page to load in this browser type on this device type? Uh, et cetera, et cetera. And they then are, are running their own roadmap. They are deciding what are the most important things for us to go work on, and having a prioritized list of those things, and being able to sort of start at the top of the list and work their way down with the pool of resources that they have. Sometimes, um, most times they may want more resources to be able to tackle more, but they spend less time sort of in resource contention, resource fighting, and instead focus on building what they can build with the resources that they've got. And so, uh, you know, the benefit of this is that, you know, success or failure is, you know, really dependent on themselves now. The only, um, thing they could maybe argue, about how they could do better, is if they had more resources, which they can petition management for.
But this way it also solves a big management problem, which is instead of management, senior management refereeing every different, every item on a roadmap, they're refereeing which teams have how many resources, which is kind of more of, like, a once, or twice, or three times a year decision, versus refereeing everything on the product roadmap and then all of the resource contention issues. That's, like, a daily issue. And so it frees teams up then to actually go and, and sprint ahead. There's, there's a lot of work you have to do to get ready for this. So for example, in a software environment, when we first started and we had, you know, kind of a monolithic code base that was, you know, not pretty, we weren't ready to do this, uh, because you have all those d- interdependencies. Once we moved to, you know, a service-based architecture and then, you know, teams could own their code with, you know, defined end points, you know, stru- uh, you know, APIs that other, other teams could, you know, understand, that are well documented, then we could sort of move in that direction. And the other thing is we had to create what I would call countermeasures, because there's no, there's no free lunch in org structures. In any org structure you're trading off one thing for another thing. In this case, you're trading off potentially functional excellence. So in other words, if you no longer have every single engineer, or every single marketing person, or every product person, or every biz dev person reporting into a C-level leader of that s- particular function, uh, and instead they are, they are, they're spread out in small teams across the company reporting into some generalist who is, is probably not gonna have functional expertise in several of the functions that they're leading, you risk the problem of then the people in those teams not gaining functional competency. Uh, that's the downside. 
And, you know, we can talk more about this, but we created a lot of countermeasures to still enable us to have functional excellence while creating these single-threaded
- 18:07 – 20:16
Implementing a program orientation with single-threaded leadership
- Bill Carr
teams.
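The alternative to a compound index that Bill describes, managing each controllable input metric in its own right, can be sketched in a few lines. The search-team metrics echo the ones he mentions (top-three click share, page load time), but the exact names and target values here are made up for illustration:

```python
# A sketch of reviewing each input metric individually, each with its own
# target and direction. Metric names and targets are hypothetical examples
# in the spirit of the search-team metrics Bill describes.
targets = {
    "pct_clicks_in_top3": 0.60,  # share of searches with a top-3 result clicked
    "page_load_ms_p90": 300.0,   # 90th-percentile page load time, milliseconds
}

def review(actuals):
    """Return, per metric, whether the team is meeting its target."""
    status = {}
    for name, target in targets.items():
        value = actuals[name]
        # Lower is better for latency; higher is better for click share.
        ok = value <= target if name.endswith("_ms_p90") else value >= target
        status[name] = ("ok" if ok else "needs attention", value)
    return status

week = {"pct_clicks_in_top3": 0.64, "page_load_ms_p90": 410.0}
for name, (verdict, value) in review(week).items():
    print(f"{name}: {value} -> {verdict}")
```

Because each metric keeps its own target and direction, a regression in one (here, latency) can't be averaged away by progress in another.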
- Lenny Rachitsky
To drill into this a little bit further, is the, kind of, the origin of this, this kind of recognition at Amazon that the best stuff comes from one person's vision and just, like, one person driving and one person being on the line, versus the often, uh, decision-by-committee approach?
- Bill Carr
It, it's less about that. And I do wanna be clear, it's one leader and their team who are accountable and responsible. So with respect to, you know, what are we gonna go build? How are we gonna go measure success? All those things. This team and that leader are responsible for documenting that, writing their plan. Now, they don't just get to go off and do that. There's, there was an intense review process at Amazon where either, you know, a- at some level, whether it be the vice president, senior vice president, or all the way up to the Jeff level and his direct reports, called the S-team, this plan would be reviewed, uh, and scrutinized deeply as well, and there'd be a, a discussion, an interchange, um, uh, and basically getting alignment between the senior leadership team and each one of these single-threaded teams on that plan before the team could go t- go off and run. You know, the beauty of that, though, is that once we'd had those discussions, those interchanges, then the teams were free to sprint hard after their plan. They didn't have to worry about, am I aligned with my CEO? Am I aligned with my senior vice president? They, they could know that they were. But yes, this creates then clear... You know, i- if they're gonna deliver it or not, it's up to that owner and that team, whereas when you have this highly, like, cross-functional approach, and there's not, you know, one clear person who's responsible for this one project that's on this roadmap, I've seen many a CEO pull their hair out saying, like, "I don't, I have no ownership and accountability here. How do I have that there?" They're, they're pushing on a string, uh, because their, their different people and leaders are kind of part owning, half owning, you know, a long list of things instead of fully owning a short list of things.
- 20:16 – 21:31
The GM model vs. single-threaded leadership
- Lenny Rachitsky
I like that (clears throat) I like that metaphor of pushing on a string. Is this approach similar to just the GM model, or is there a big difference when someone's thinking about going GM model versus the single threaded leader approach?
- Bill Carr
Yeah, I mean, obviously there are probably different definitions of what people consider the GM model, but I would consider that meaning this person is a P&L owner. Uh, and you can, of course, create, you know, many P&Ls within a P&L. Like, for example, in the book business, we, we could have... And, I don't know, you know, most of the time we didn't do this, but we could have created a P&L owner just for fiction books or just for professional and technical books, which is a very large category with big differences from the others. And then you say, "Great," then that team, they have their own, you know, dedicated team. They're fully responsible for the revenue numbers and, and other numbers. But you have to be, you know, thoughtful about how you do this, because one of the three questions you have to ask when you establish one of these teams is, does the team have the resources, you know, within their control to effectively manage this, this part of the, this department, this product, this P&L? And sometimes, if you narrow things down too much, in some cases, then the answer is no. In other cases, the answer can be yes, very
- 21:31 – 25:22
Functional countermeasures needed for single-threaded leadership
- Bill Carr
easily. Like, uh, a great example of this was in Prime Video, one of the businesses that I managed. You know, we could create a, uh single threaded team who just were working on applications for TV sets like Samsung, Sony. We could create another team that was working on game consoles and another team that was working on mobile phones and tablets. And then within each one of those we could further, uh, break it down. We could have one team working on Xbox and another one on PlayStation, another one just on iOS. Uh, in those cases then it's, it's very clear how you can break the teams down and they can have, you know, very clear ownership.
- Lenny Rachitsky
Awesome. Let's go back to the countermeasures topic, and then even just a little more broadly. You talked about one thing that was important to put in place before you moved to the single threaded leader model, which is creating APIs and basically breaking apart this monolith. What are some other things that you think you need to put in place to be successful in trying to shift to this model?
- Bill Carr
The other thing was these, these functional countermeasures. So let's just, let's stick with the, the engineering for an example. So in, you know, 2004, 2005-ish, I started managing a single-threaded team. Uh, actually I managed two different ones, one for music and one for video, which are now, you know, Amazon Music and Prime Video. They weren't called that in those days. But, um, I started managing, uh, a small team of software engineers at that point. Well, I have never... Well, I have written lines of code, but that would be back in high school and, you know, we're talking about, like, uh, BASIC and Pascal. I have a master's in business. I have, you know, a background in sort of marketing. I'm a, I'm a, I'm a generalist, okay? So I'm not equipped to coach. I, I, I, I couldn't possibly conduct a code review. I couldn't possibly conduct an architectural review. I couldn't possibly coach or mentor an engineer on how to improve, you know, their, their craft. But I was one of many of these examples. And there could be reverse examples where instead of me being a business leader, I was purely an engineer and now I'm managing a team that does marketing and business development, and I, I, you know, I wouldn't know anything about those things if that had been my background. So what we did in the... I'll stick with the engineering example, is we, we came up with various countermeasures. Like, one example was that we still had, you know, a C-level leader of engineering in, you know, Rick Dalzell, and m- most of the core infrastructure, uh, and, uh, core services still reported into Rick. So it was things like payments or infrastructure, search. And Rick still could be a technical leader for the whole company, and he and his team could create things like, what are the standard ways that we're going to do code reviews? What are the standard ways across the company that we will eng- uh, interview and screen engineers? ... what does the promotion process look like?
What are the, what are the defined steps from getting from an SD1 to an SD2, SD3? How do we document and describe what's, what are the requirements? There were, you know, many things like this. And effectively, what it also meant is that anyone who was a, an engineering vice president, or in many cases, a director, they would often have something else beyond their day job of, of some sort of subject matter expertise area where they would also contribute to the company.
- Lenny Rachitsky
Mm-hmm.
- Bill Carr
A good example of this would be that they might sit on a panel for, um, uh, promotion from a certain level to another level in the engineering world, or they might be available to do code review outside of their organization for another organization. So people had other jobs in addition to their day job to build and maintain functional excellence, and there are a lot of examples like this across
- 25:22 – 30:22
Embracing the “disagree and commit” principle
- Bill Carr
the company.
- Lenny Rachitsky
Let's go in a different direction and talk about one of the, one of my favorite principles of Amazon, which is disagree and commit, and I think in the way I even describe it, I know is wrong, and I think people hear this term and they often use this principle incorrectly. For example, it actually starts with have backbone and then disagree and commit. So I'd love to just hear how you've seen this actually implemented well, and what people should do and think about when they're trying to implement something like this at their company?
- Bill Carr
So when I was at Amazon, there were 10 leadership principles, and they've since expanded them. But of those 10, this was always the least well-understood when I was at Amazon, too, and partly because it is actually the most nuanced and difficult to actually, um, use. So here's what it means. What it means is that have backbone and disagree, meaning when we are making any kind of important decision, if you are, you know, part of that team, part of that unit, it is your obligation to voice your point of view if you disagree with the approach that's been taken, um, and the point of that disagreement, by the way, is to, uh, provide usually additional information or a new point of view that people have not considered. So, um, I like to geek out a bit on, like, the, the, um, process of decision-making and have read more and more about this, and I think that, uh, Peter Drucker probably has the best writing on this topic. But as he would describe it, you know, good decisions are made by first understanding all the different, you know, points of view and pros and cons to, uh, uh, the, the potential issue at hand, or the potential direction, and that great leaders, what they do is they solicit these different points of view. They have a, a team that they, they work with to debate and discuss things. So another way to think about this is a king and their court. In an ideal world, you know, if you assume that there's no political motivations, the court is there to advise the king and help them think through different problems and f- provide different and opposing points of view to allow the king to arrive at the right decision. And this is sort of no different than that, which is, the disagree part is about bringing forth, you know, new information, new data, a new point of view that would be contrary to the current direction.
So that's the disagree part, and you're obligated to do it, um, all, you know, as we would describe, sort of all the way up the chain if necessary, if it's an important issue and people are not hearing or understanding your point of view. Now, the important point is f- first of all about hearing and understanding your point of view. What would often happen, I can tell you as someone in a leadership role, someone would come to me with a disagreement, and many times I'd appreciate it, by the way, 'cause they'd bring some point of view that was useful, but sometimes they'd bring the disagreement and cite the reasoning behind it, and I already knew that reasoning. I'd alread- we'd already thought of that reasoning. We'd already thought of that. In which case, I would say, "I hear your disagreement. I've al- we've- I- we have already considered that factor, but even though that factor is there, you know, here are these other factors that, like, outweigh that." Now, that is the point at which, as long as, as long as the disagreer is hearing back from the leader that they understand their point of view, understand, like, why they are pushing back, and seem to fully understand it, and they've taken that into consideration, that is the point for them to, to commit. Because the point is you provided your, your information, they've processed that information, and they've decided to go this way with the knowledge of that. Where people get confused is they don't maybe understand, like, when they're supposed to stop disagreeing, and so hopefully that, that explanation made it clear, like, this is when you're supposed to stop, um, and, uh, and then the commit part done well means that it's not just, like, "I'm gonna commit. I d- I, you know, I don't really agree with what we're gonna do, um, but I'm gonna, you know, get behind this." It's a... Ideally, it's, "Oh, I've, now I've heard the argument.
I've actually now thought about the argument," and hopefully that person has now understood why we're taking that direction, and so their commitment is based on that understanding, because then they can reflect that understanding back to their organization, too. 'Cause the worst thing to do is to say, "Yeah, we're committed to this. I don't really agree, and I still think it's wrong, but I'm committed to it." That's, that's not, that's not actually commitment. So it's really about... This is really about decision-making and understanding the facts and information that people are gonna use to make a decision and then being able to reflect that back.
- 30:22 – 32:41
Understanding disagreements
- LRLenny Rachitsky
I imagine there are many times I've gone through this where I still don't agree. What's your advice to a manager or a report of just, like, okay, when you actually still don't agree, how do you behave? Do you just... behave like, "Yes, I agree with this," and not really voice your concerns? Or something else?
- BCBill Carr
I worked with Jeff on all kinds of different new ideas. And Jeff has, uh, um... Jeff doesn't think like a normal person. Uh, you know, his level of sort of creativity and the way he thinks, the time scale at which he thinks, there's many things about the way he thinks that, you know, there was no one else (laughs) at Amazon that thought that way. And so there would be times when even after we'd had that discussion, I would maybe still disagree, but then what I would do is I'd focus on, "Okay. Well, what is the kernel or the core of, like, why Jeff thinks that we should do this?" And I would focus on that kernel. I got great advice actually from one of my managers at one point, Steve Kessel, who said, "You have to look for what that is, and then your job is to take that kernel and try to run with it and expand it," and try to see how I can take that, you know, that idea, that concept, and then make it into something viable. And it doesn't always work. But it's about then having that, um, understanding of what it is, not just sort of, like, going through the motions of, like, stomp, stomp, stomp through it. Like, that's not gonna work, and also, I've seen people who try that and, you know, their career doesn't go very far. You have to have, um, some degree of faith that, you know, there's something there and I'm gonna try to do the best I can to, you know, make that part... How would I productize that idea? How would I make that viable from a business point of view? Or whatever the different constraints are.
- LRLenny Rachitsky
Awesome. So the advice there is focus on the parts you agree with and think about how you can find out if it's actually right or not.
- BCBill Carr
Agree with, or even just, like... (laughs) you may not even be able to agree, but, like, what is the core of what that person is thinking is the big benefit or good guy, or what is the thinking vector that they're on that's causing them to want to go in this
- 32:41 – 35:25
Deciphering Amazon’s “Leaders are right, a lot” principle
- BCBill Carr
direction?
- LRLenny Rachitsky
Mm. Thinking vector. Love that term.
- BCBill Carr
(laughs)
- LRLenny Rachitsky
Along the same lines, another principle that I love is leaders are right a lot. And I feel like this is a term that... it almost goes unsaid. Like, you almost can't say this in a lot of companies.
- BCBill Carr
Yeah.
- LRLenny Rachitsky
And I'm curious just the origin of why that became an important principle and then how it's implemented at Amazon.
- BCBill Carr
Yeah. So going back to this last discussion, one, you know, fallacy we should all acknowledge is that when you're making these decisions, and you're trying to use data to make decisions, like, you can make the data kind of look however you want it to look to sort of try to meet your decision. You know, if I'm looking at some issue and I've got some big data set, I can (laughs) come up with, you know, ways of looking at a data set to support this idea and ways of looking at that data set to, you know, not support it. So the data, you know, rarely makes the decision for you. What is happening is then a lot of judgment and interpretation of the data, weighing various factors, to then come to a decision. And that is sort of the right a lot part. The right a lot part comes from, you know, having what we call sort of, you know, sound judgment, which generally comes... Some people maybe are born with this. Uh, and not a lot of them. Mostly they get it through experience. A lot of experience is actually about being wrong, by the way, about making mistakes, and by having looked at a lot of problems, made decisions or observed others making decisions, being a student of that, and then using that to understand how to weight different information when making a decision. So right a lot is that you're good at that, and that it then proves out... And generally speaking, people want to follow someone who ends up, by and large, like, you know, going in the right direction, right? You're the leader of a team. The team is, you know, petitioning you on multiple sides. And if you keep kind of going off in some direction where most of the team is scratching their head saying, like, "I don't think that was the right decision," they're not gonna want to follow you very far, and you're probably not gonna go very far.
So this is something that you develop through, you know, experience and I'd say from having the opportunities to ob- you know, observe and work for others that are good at this.
- LRLenny Rachitsky
I love that it's "a lot." I like that it's not just "leaders are right."
- BCBill Carr
(laughs)
- LRLenny Rachitsky
It's, like, right a lot.
- BCBill Carr
Yeah. Yeah. There'll still be... You know, no one is right every time. Um, that is totally unrealistic.
- 35:25 – 41:16
An explanation of the working backwards framework
- BCBill Carr
Yeah.
- LRLenny Rachitsky
Let's talk about the, uh, titular concept of your book, and that's a word I've never used but I think it's appropriate, which is Working Backwards. First of all, just what does it actually mean to work backwards versus working forwards?
- BCBill Carr
The title of the book comes from two things. One is, you know, one of the leadership principles, which is, um, you know, customer obsession, and the principle states, uh, something along the lines of, you know, great leaders start with the customers' needs and work backwards from there to sort of, you know, meet those needs or solve them. And then also because we created a process in this window I was talking about earlier, the 2004 through 2007 window, we created this process for new product innovation called the working backwards PR/FAQ process. And they both refer to the same idea, which is that your guiding, you know, star, or the point from which you're going to, uh, start, is: what are the customer's problems, or what are the customer's needs? And then figure out, okay, well, what would be the solution to that, or what are potential solutions to that? And to do those things starting without the constraints of my financial constraints, my resource constraints, my legal constraints, my engineering constraints, whatever all those constraints may be. Because the problem is what most of us do is we start with those constraints and work forward from there. We start with things like, "I gotta increase revenue. Uh, how do I increase revenue? I need to increase active customers. How do I increase active customers?" Versus customer-oriented behavior, we tend to start with those things, which may often lead you in the wrong direction. Whereas, as Jeff would say, we took it as an article of faith that if we served customers well, if we prioritized customers and delivered for them, then things like sales, things like revenue and active customers, and things like the share price and free cash flow would follow. And this is important because there's no... I still can't give you objective proof that that is true. I don't know who could.
And so it was saying, uh, this is an article of faith, that if we do that, we think those other things will work out. So therefore, when we're making a decision or thinking about a problem, we're gonna start with what's best for the customer and then come backward from there. And then in that coming backward process, we're gonna have to figure out, well, to do that, gee, I'm gonna have to solve this engineering problem, or I'm gonna have to figure out how to make this thing cost less or make this thing faster, or solve one or more problems. And that's the, that's the backwards process
- LRLenny Rachitsky
Mm-hmm.
- BCBill Carr
... that informs, like, what's the work you have to do to then create this new solution for customers.
- LRLenny Rachitsky
Awesome. So just to summarize. You start with, what are the customer's needs and problems? And I think a big part of Amazon's approach is, what are, like, the lasting problems they'll always have, which I think are, like, lower prices, faster shipping, and all those things, and then think with no constraints. When you work with companies to implement this idea of working backwards, is it always what is the customer problem and need, versus, like, revenue or growth or something like that, or are there other examples of what you work backwards from at different sorts of companies?
- BCBill Carr
Well, the working backwards part is, um, strictly about the customer's needs. Yeah, we don't wanna work backwards from revenue. I guess there is... we didn't really use this term for sort of other things like, um, cost structure.
- LRLenny Rachitsky
Mm-hmm.
- BCBill Carr
Cost structure was actually a part of working backwards from the customer: that if we had a low-cost structure, we could afford to give customers lower prices, therefore let's figure out how to have a low-cost structure. 'Cause in itself, driving out costs, you know, doing things more efficiently, uh, doesn't inherently benefit customers, 'cause you could just choose to take more profit. It only does if you decide that, in doing so, I'm going to lower my prices, uh, to customers or provide some other benefit. So, uh, no, we used it in this method of, like, I'm starting from the customer. And then very specifically, we used it in this method of, you know, new products and features that I'm gonna go build on behalf of customers.
- LRLenny Rachitsky
Awesome. This episode is brought to you by Wix Studio. Your agency has just landed a dream client, and you already have big ideas for the website, but do you have the tools to bring your ambitious vision to life? Let me tell you about Wix Studio, the new platform that lets agencies deliver exceptional client sites with maximum efficiency. How? First, let's talk about advanced design capabilities. With Wix Studio, you can build unique layouts with a revolutionary grid experience and watch as elements scale proportionally by default. No-code animations add sparks of delight while custom CSS gives total design control. Bring ambitious client projects to life in any industry with a fully integrated suite of business solutions, from e-commerce to events, bookings, and more. And extend the capabilities even further with hundreds of APIs and integrations. You know what else? The workflows just make sense. There's the built-in AI tools, the on-canvas collaboration, a centralized workspace, the reuse of assets across sites, the seamless client handover, and that's not all. Find out more at wix.com/studio.
- 41:16 – 44:47
PR FAQ process: Amazon’s innovation engine
- LRLenny Rachitsky
Okay, so then when you go work with a company to implement this idea of working backwards, what are the very tactical things that you do to help them here? I know PR/FAQ is a part of that, so let's chat about how to actually implement that. What are the steps to shift to working backwards?
- BCBill Carr
Yeah, so the first shift is to take this, you know... So that's just a concept, right? Working backwards. Well, how do I turn that concept into a scalable, repeatable process? And that's exactly where Jeff's mind went, and eventually, without getting into the origin story, we came up with this process called the PR/FAQ process. So what it means is that whenever we're, uh, devising a new product or feature, we're gonna start by writing a press release describing the feature, and describing it in a way that speaks to the customer, um, and to some degree, you know, the external press and world, where the idea is, in my description of this, it better jump off the page as something like, "Wow, you know, as a customer, I will really need this." And so what I do first is to say, "Okay, for your product development process, let's start by using this method as the method to decide what am I gonna go build?" And, oh, by the way, to use it as a method to sort between a lot of different choices of what you might build. You know, in summary, the way that process works is that in the PR, you're going to describe very carefully and clearly, like, who's the customer, what's their problem, and what's the solution that you're planning to build? That sounds really simple and easy, but it's actually very hard to do that well, to crisply and clearly define those. The first two things are the things that are hardest to define. Like, who's the customer? Anyone says, like, "All restaurants are my customer." Okay. Well, that's a mistake. No. I mean, like, which kinds of restaurants are your customers? In what kinds of cities? In what kinds of, uh, formats? Et cetera, et cetera.
And then what is the specific problem you are solving? And ideally, you would in some way have, like, quantified that problem, or there's some data or customer insights that have led you to understand that problem, to know that it is a meaningful and big problem. Ideally a problem that people would, you know, pay you money for if you could solve that problem for them, because you can just look at the economics of that problem and, you know, if instead they use your solution, this would be beneficial to them. So, I work to have them first implement this PR/FAQ process. That's the first step, and then the next step really is to go from there to say, "Okay. Writing PR/FAQs is one thing. Well, how do I actually use them? How do you actually develop them?" Because there's this iterative nature to writing PR/FAQs, where it's sort of a concentric circle review. Like, you start off small, like with one author, and with low fidelity writing these things, and then you start to share them with, you know, a small group and get feedback and improve it. A wider group, get feedback and improve it, and onward and onward until, you know, depending on the size and scale of your company, you get up to the CEO, as a way to strengthen, improve, and, um, really codify this idea and determine whether it's a great idea or not. So I help them understand, like, how does that work? How do you do this iterative process? And then once you've done that,
- 44:47 – 44:55
Deconstructing the PR FAQ structure
- BCBill Carr
um, then what do I do with these PR/FAQs like once I've got them? How do- how do I then think about that with respect to my roadmap?
- 44:55 – 47:52
The customer problem-solution statement
- LRLenny Rachitsky
Awesome. Okay. That was an awesome overview. I'm gonna fire off a couple of questions around the first part. Do you still suggest people do it as a press release? Feels like press releases aren't a thing anymore. Do you ever suggest people do it as a tweet or as a TikTok video or a blog post?
- BCBill Carr
Good question. So, the first thing is, it's not a real press release. Okay? We could, uh, change the nature of it, and if instead we wanted to (laughs) call it the customer problem-solution statement, right? We could just change it to that, because there really are, you know, three money paragraphs in this... 'Cause, first of all, yeah, it's not meant to be a real press release, so don't use the language you would use if you were sending an actual press release. This is like an internal document. Okay? So that's the first thing. The second thing is, the heart of it really is that, like, first paragraph, it's a short description. That second paragraph, that's the problem statement, and that third paragraph, that's the solution statement. If you wanted to, like, ditch the rest of it and the artifacts of the press release, you could. I think there are other benefits to it, like the headline. Is this headline, like, long and drawn out, and, like, I can't even tell what the heck this thing is from just reading this headline? That's generally a bad sign. (laughs) If you used a tweet, that wouldn't work very well. The date is also a meaningful thing when you write the press release. The date is meant to be a hypothetical timing on which you're envisioning, you know, launching this thing, which tells the reader something. Are you thinking that this is something that's so simple and easy we're gonna launch it next month, or so complex that we're gonna launch it a year from now? So there are some other, uh, directional, uh, cues within it. Like I said, with everything, these are tools that people can use, and maybe... and, you know, I'm sure that companies will find other ways to improve upon these tools. But if you don't use those parts of them correctly, you're kind of missing out on, like, what's the main benefit that you're getting out of this.
- LRLenny Rachitsky
And do you try to write it in a way that it would be announced, like a press release feel? Or is it mostly just like, "Who is the customer? What is..." Like, do you try to pitch it as a part of this experience?
- BCBill Carr
So you try to write it in that way, but the one thing is you don't want to use, like, hyperbole.
- LRLenny Rachitsky
Mm-hmm.
- BCBill Carr
And it would be, like, very factual, with, like, you know, numbers, like a data-rich document too. So, um, so again, not like a real press release. A lot of, like, internal confidential data would be, like-
- LRLenny Rachitsky
Uh-huh.
- BCBill Carr
... (laughs) in this press release.
- LRLenny Rachitsky
Got it.
- BCBill Carr
So it's, um, it's a tool, uh, that, you know, has a very specific, you know, use to it.
- LRLenny Rachitsky
Is there a template that we can point people to in the show notes to help them craft this? I think there's a version in your book maybe, but is there some online that we could point people to?
- BCBill Carr
Yeah. So we have a website related to the book which is www.workingbackwards.com and there's a resources section within there and you'll find
- 47:52 – 51:19
Create a product funnel, not a product tunnel
- BCBill Carr
a template.
- LRLenny Rachitsky
Amazing. Okay, and then the concentric circle piece. So the idea there is basically get feedback from an increasingly larger swath of the company and it sounds like a big part of that is also get buy-in as you go along the way.
- BCBill Carr
Yes and no. So first of all, there are some things where you may write it and you, the author, will, you know... if we were in the old world, would, like, take the piece of paper, crumple it up, and throw it in the trash can. Which is... on your own, you've realized, now that I've put this down on paper and read it, this is not actually that good of an idea, I'm gonna try something else. And by the same token, you may then have written one you think is a pretty good idea, and you show your peer or your manager and they give you feedback that makes you want to then ball it up and throw it into the trash can. So, part of this concentric circle thing is not just that every one you write lives on and gets all the way to the CEO. Like... uh, there are no, like, stats on this. But let's just say in some imaginary world where, yes, you're a product manager and you've got, you know, a director of product management you report to, who reports to some senior vice president of a division, who reports to a CEO. Well, if you truly run this out and you write 100 PR/FAQs in a year, you know, maybe 20 of those make their way to the CEO, right? The point is not every single one of them is destined to go that far. The numbers, you know, get narrower. And this sort of leads me to the concept of what you're really trying to create, which is a product funnel, not a product tunnel. With a funnel, meaning, you know, lots of things at the top, fewer things at the bottom. The tunnel means that everything that comes in is also gonna come out the other side. And the problem with that method is it means you're not actually having a method of, like, consideration and comparing it against other things that you might build, or how you deploy what is frankly in most companies your most precious resource, which is your engineering team.
You should be looking at various choices. You should think of yourself, honestly, as like a venture capitalist. They don't fund (laughs) every company that they meet with. They actually fund a very, very low percentage of them. And at Amazon, you know, we had lots and lots of PR/FAQs that were a great idea, but we didn't ship them, because we had other ones that were just a better idea, which had, you know, bigger potential impact. So you want to create this, like, corpus of ideas that are well thought out and select the best ones.
- LRLenny Rachitsky
It feels like a lot of these processes are basically just ways to stop stupid shit from happening. I think-
- BCBill Carr
(laughs)
- LRLenny Rachitsky
... the narrative is a good example where you have to, like, expose your thinking deeply. This is a great example of that.
- BCBill Carr
Yeah. It, um... And it's also, I would say, an example of where this is a process to prevent the other process, which is the product development process, from becoming the thing, right? Where you just get locked in on what are we doing in this sprint, you know, what are we trying to get done, and, like, focused on shipping stuff. What I recommend is you try to break that into two different processes. One is the process of deciding, like, what you should go build, and that's what the PR/FAQ is designed for. And then once you've decided that, then yes, by all means use all that good thinking to create... now how can I ship it, uh, efficiently and effectively with, you know, few to no bugs?
- 51:19 – 54:35
How Amazon promotes action vs. talk
- LRLenny Rachitsky
I was just reading this Harvard Business Review article, I think it's called The Thinking to Doing Gap, where a lot of companies just spend a lot of time talking about ideas and solutions and not actually doing anything. And so I'm curious how you tried to avoid that at Amazon, considering there's this period of just, like, "Let's explore, explore, explore, and we're fine."
- BCBill Carr
There's a couple of ways. And of course I'm, I'm somewhat having to imagine, you know, what are the problems in such companies where that, that's going on.
- LRLenny Rachitsky
Mm-hmm.
- BCBill Carr
So one such version of this problem is what I'd call the big-idea-that's-not-fleshed-out problem.
- LRLenny Rachitsky
Mm-hmm.
- BCBill Carr
So I'm sure that every single person listening to this podcast has either themselves done this or has witnessed others in their company who come up with a concept of, like, "Oh, well, you know, I think if we built this, boy, that would really, you know, solve things, or that would really work well, or that would really grow things." And it may sound good to you, to everyone. And then maybe you start working on building it, but the reality is that once you've spent some time looking at that idea more deeply, you then start to identify several roadblocks or maybe a fatal flaw with this idea. And in fact, no, you shouldn't waste any of your time going and building that thing, 'cause it has a fatal flaw. So one problem is that companies get stuck, I think, where they never actually go do that documentation, and so it's a debate and discussion about concepts that aren't really well fleshed out, and so people don't have a good way to actually evaluate them in any realistic way. And so in those situations, you know, what gets done is probably more a function of, you know, politics or will, or, um, you know, a culture that's, um, you know, completely, like, top down. Um, I think the, you know, the other way is where they're debating and discussing things and they just don't have, like, good methods where they can then take things and go build them. Meaning they probably don't have the right org structure or processes in place to then go take the good idea, assign it to someone who will own it, go look at it, and after they have owned it and gone and looked at it, if it works, then they and their team, you know, can go actually build it.
Um, uh, what I always found, uh, you know, once I became more senior in the company and my role became bigger and bigger, is that when something came up, some idea that didn't neatly fit within my org structure, I couldn't necessarily delegate it to someone. There were only two things I could possibly do, which is just sort of set it aside altogether, 'cause otherwise it would just be a real distraction to people, or I had to decide this was a compelling enough idea that we were going to take a resource, could be one person, could be a whole team, depending on the idea, and I'm gonna have to assign that resource to actually go look at this and work at this. Otherwise, it will never happen.
- LRLenny Rachitsky
I have been through those many times.
- BCBill Carr
(laughs)
- 54:35 – 1:00:51
Amazon’s flywheel and input metrics
- LRLenny Rachitsky
Okay. So there's two more concepts I want to try to touch on before we wrap up. The next one is the idea of input and output metrics. This is something we super implemented at Airbnb, and it became a very de facto way of thinking. And actually there's a lot of Amazonians that ended up at Airbnb, a lot of leadership.
- BCBill Carr
Yeah.
- LRLenny Rachitsky
So there's a lot of stuff they ended up doing, like the memos. And so on the input-output metrics, could you just describe what that is and why that's so important, why people often think about metrics in the wrong way?
- BCBill Carr
Yeah. So the origin of this one really was, again, kind of in our early years at Amazon, '99, 2000, 2001. We were a public company then, we were growing. But then, you know, growth started to... it wasn't just all, like, up and to the right and, like, woohoo. You know, every company's gonna hit a wall eventually. If you're so lucky to have even been in a company where it's just, like, going up (laughs) and to the right with no gravity, good for you, because most people never experience that. What most people experience is the reality that there's a lot of gravity pulling against your revenue numbers, and you've put a plan out there and you wanted to grow 15% or 20% or 75%, whatever it was, and now you look like you're not gonna hit that number this quarter. And so what ensues then is the, "We're not gonna hit our number. What should we do about that to hit our number?" And this often happens when, well, you know, there's, like, a month or a month and a half left in the quarter.
- LRLenny Rachitsky
Yep.
- BCBill Carr
And then we would run around like chickens with our heads cut off and come up with a bunch of ideas that tended to be, like, promotional in nature or tended to be, like, price reduction in nature or, like, we'll send this extra email or-
- LRLenny Rachitsky
(laughs)
- BCBill Carr
... extra ad or whatever it might be.
- LRLenny Rachitsky
Another crying baby.
- BCBill Carr
Right. And the reality is we went through that (laughs) enough times, several quarters. We started to realize, "Huh, these fire drills don't really work." We didn't really, like, get meaningful progress against the number with these, like, last-minute things we decided to go do. And oh, by the way, they were kind of a big distraction. If they did work at all, they probably pulled revenue that might've just come in the next month or next quarter into this one. So it was really kind of a zero-sum game there. And we realized, you know, we're not really actually working on things that matter to customers that are gonna, like, move the needle over the long term. And this is about the same time when Jeff and the S-team were reading the book Good to Great. And, you know, you'd have to ask Jeff, but if you ask me, I think that this was the single most influential and effective management book, uh, for our company. Because of what it caused Jeff to do, and I won't describe what, you know... Most of your viewers probably know what it is. If you don't know what it is, go read Good to Great. It is, in my opinion, the best, most important management book you'll ever read, um, because what it did is help us codify our growth flywheel. Meaning, what are the inputs that if we improve these things... which in our case, uh, was, you know, how do we have broad selection? How do we have great customer experiences in retail, things like how easy was it to find what you wanted to buy? How easy was it to buy it, and how fast did it get to you? Were the prices low? Um, do we have lots of merchants on our platform? And by the way, could we drive out costs? So we identified these things on our flywheel.
And this identification of these things was, uh, such a critical moment for the company, because then we realized, "Okay, well, what we need to do is spend our time focusing on, like, how do I measure each one of those things and then how do I improve each one of those things?" So it shifted our focus away from, like, this short-term thinking of pushing the revenue number up to this longer-term thinking that if we just improve these things... There's no day that people will wake up 10 years, 20 years, 30 years from now and say, "All else equal, I'd rather shop at a store with fewer items than more items, or a store with higher prices than low prices, or a store where things get to me more slowly versus, you know, more quickly." So if we can just improve these things, you know, this is our path to winning. So those were all inputs, inputs to the customer experience, and so we then figured out ways to measure them, creating a set of input metrics. And so then when we would develop our operating plans and review our business each week and set our goals, we were hyper-focused on those inputs and the input metrics. As a simple example, there was one tool that Jeff and the leadership team, the S-team, used called S-team Goals, which is effectively a list of what they would harvest, like, "Here are the most important goals for the company that I've harvested from all of our operating plans." And I can't remember exactly what year, something around, like, 2007, 2008, they, you know, looked at that list, which was about 500 items long, by the way, and they counted it up. And of that list, only 10 of them actually had a financial metric in it, like revenue or free cash flow or, like, gross profit. These other things were, generally speaking, all, you know, one of those inputs, like I mentioned to you, about low prices and selection and speed, uh, of the customer experience.
So the point was, again, we took it as an article of faith that if we can just improve these inputs, the outputs will take care of themselves. The inputs are the things that drive the outputs, which are revenue, customer activity, free cash flow. And so one of Amazon's, you know, it's not really a secret, but one of (laughs) Amazon's great strengths was that they would focus on those things, make continuous improvement on each one of them, and measure them rigorously.
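Bill's input-versus-output distinction can be sketched as data, for readers who want to try it on their own metrics. This is a minimal illustration, not anything from Amazon; the metric names, the `drives` lists, and the `weekly_review_focus` helper are all hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Metric:
    name: str
    kind: str     # "input": a controllable lever; "output": a downstream result
    drives: list  # output metrics this input is believed to influence


# Hypothetical metric set for illustration only.
metrics = [
    Metric("in_stock_selection_pct", "input", ["revenue", "free_cash_flow"]),
    Metric("avg_delivery_days", "input", ["revenue", "active_customers"]),
    Metric("price_index_vs_market", "input", ["revenue"]),
    Metric("revenue", "output", []),
    Metric("free_cash_flow", "output", []),
    Metric("active_customers", "output", []),
]


def weekly_review_focus(ms):
    """The idea above: manage the inputs in the weekly review; the outputs follow."""
    return [m.name for m in ms if m.kind == "input"]


print(weekly_review_focus(metrics))
# ['in_stock_selection_pct', 'avg_delivery_days', 'price_index_vs_market']
```

The point of the structure is that the outputs carry no `drives` list: you cannot act on revenue directly, only on the levers believed to drive it.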
- LRLenny Rachitsky
The flywheel, you reminded me, it feels like that's another concept Amazon proliferated through companies everywhere. Everyone's trying to create their own little flywheel, and I imagine everyone has that image of the Amazon flywheel in their head, with the little orange circle in the center and the black arrows.
- 1:00:51 – 1:04:23
Signs you’ve got a good input metric
- LRLenny Rachitsky
On the topic of input metrics, just briefly, what is an example of a good input metric? 'Cause I'm imagining people are listening like, "Oh shit, I gotta think about my metrics as input and output now."... what's, what's a sign it's a good input metric?
- BCBill Carr
A sign that it's a good input metric... well, first of all, map your end-to-end customer experience. Like, I never worked at Airbnb, but okay, step one is they clicked on some ad somewhere and showed up on the website or in the app. Now you're in the app, now you're looking at this first screen. Well, the first thing they're doing is browsing and/or searching. Okay. How are we measuring the speed, quality, and ease of that browsing and searching? Now they've gotten to a detail page for an individual property. How are we measuring the speed, ease, and quality of the different actions they may take, like, you know, reserve? Forgive me if I get any of my terminology wrong, I'm not an Airbnb expert.
- LRLenny Rachitsky
Y- you are, but it doesn't matter. It's close enough.
- BCBill Carr
And then, you know, so then you've reserved, and now you have interactions with a property owner. How do I measure the quality of those? How many messages go back and forth? Is a lot of messages a good thing? Is that a bad thing? At first, you may not know the answer to that question. Same thing every step of the way. Then there's the actual rental experience. How do I instrument and measure every part of the customer experience? So, you know it's an input metric if it is measuring something with respect to the customer experience. Which ones are the right metrics? Which ones are the most causal to the outputs? I couldn't begin to tell you. This is actually what you're getting paid for (laughs) if you work at Airbnb: to figure that out, basically through an iterative process of measuring, observing, improving, and looking at what the effect is on your outputs. So again, we didn't really create this concept. This is a concept from Six Sigma called DMAIC, which is: I have a process, there's an output of this process, but the inputs are a black box to me. So how do I understand those inputs? Well, DMAIC stands for define, um, oh boy, define, measure, uh... the A is gonna come back to me in a minute. Um, improve and control, and... oh gosh, the A is lost, I've lost it for a second here. But, um-
- LRLenny Rachitsky
Oh, here it is. Uh, I'm looking at Wikipedia: define, measure, analyze, improve-
- BCBill Carr
Analyze. Thank you.
- LRLenny Rachitsky
... and control.
- BCBill Carr
Duh. Yeah, duh, analyze. So, (laughs) you know, we just used that process. And by the way, the way we think about it first is, well, you need to throw a lot of things at the wall. You don't really know which of these things are going to be the most causal. So-
- LRLenny Rachitsky
Mm-hmm.
- BCBill Carr
... you know you have an input metric if, first, you control it. Meaning, can you apply resources to make this thing better or worse? And does it touch customers? It doesn't always have to touch customers, but if it is affecting the customer experience, it almost certainly is an input. And then, for the ways you're going to measure that input, you need to try more than one way. Because, again, we tell a story in the book about one of our most important input metrics, which was how much selection do we have. And (laughs) we were actually not measuring that right for several years. We had to refine that measurement.
- 1:04:23 – 1:06:54
How mistakes can still be made with working backwards
- LRLenny Rachitsky
So, I don't know if you saw this, but I asked on Twitter what questions I should ask you and told people you were coming on. And something that came up a bunch is: with working backwards, obviously some products Amazon has launched have not worked out. The Fire Phone is a classic example. What have you learned from that process? Like, what are signs maybe this won't work out, also knowing many things are not gonna work out and there's no way to really know?
- BCBill Carr
Yes. So the one important thing to share is that all these tools described in this book that Amazon uses, whether it's using documents in meetings, the PR/FAQ process, or input metrics, is that none of these things give you the answer. (laughs) They are tools to help you make decisions. So sometimes you're going to make the wrong decision. Fire Phone is a great example that comes up often. People ask, "Well, if you've got this great PR/FAQ process, how did you get Fire Phone?"
- LRLenny Rachitsky
Mm-hmm.
- BCBill Carr
So I was sort of tangential to the Fire Phone team, and I didn't work on it closely. And different people have different opinions, so I'll just share my opinion. Think about, again, how does the PR/FAQ process work? Well, there's a customer problem. So what was the problem the Fire Phone was seeking to solve for customers? I would argue this is a case where we made the mistake of having a technology solution in mind, which was 3D effects, and we took that solution and were then in search of a problem. I don't think it solved any meaningful problems for customers. You know, we had to build a version of the music application and the Prime Video application for this phone, and I couldn't figure out how this 3D part would make it better for customers to discover, watch, or play back any of these media. Maybe there were games that could have been a great solution. I don't know. But I think the simplest place to go when you see a failed product is to ask yourself: what problem did you solve? And I could get into all kinds of other examples outside of Amazon too, but nine times out of 10, I think that's where it is. If it wasn't poor execution, if the product was executed correctly, then what was wrong with the concept of the product?
- LRLenny Rachitsky
I imagine there was a lot of, uh, disagreeing and committing on that concentric circle process.
- 1:06:54 – 1:08:02
Why disagreements aren’t necessarily signs products will fail
- LRLenny Rachitsky
Is there anything you found, like the number of disagree-and-commits in this PR/FAQ filtering process, that tells you maybe this is not a good idea?
- BCBill Carr
Not necessarily.
- LRLenny Rachitsky
Mm-hmm.
- BCBill Carr
You see... So I'll tell you partly also why the Fire Phone happened, from my point of view. I think we had had a number of successful products where, in some cases, there were a lot of people who doubted whether they would work. A lot of people inside Amazon doubted that the Kindle was going to be a good idea. I remember contentious board meetings on this topic. So even within a company that was considered innovative, you would have a lot of people that would doubt things. I can tell you that for years, working on Prime Video, I would tell people what our end vision was: you know, watching on your TV set, and "We're going to have our own motion picture studio. We'll make our own movies and TV shows." (laughs) And they would laugh at me. They thought that was crazy. So that's not necessarily the sign of whether the product is right or wrong. And that's a problem, actually. That makes it harder to know.
- 1:08:02 – 1:09:55
Examples of failed Amazon projects
- LRLenny Rachitsky
Yeah. And I think something Amazon's incredibly good at is being okay with a lot of failures, and I think that's part of the reason there's been so much innovation. Is that... Is that true?
- BCBill Carr
I'd say it's partially true. I mean, uh, again, it's hard for me to do a compare and contrast-
- LRLenny Rachitsky
Mm-hmm.
- BCBill Carr
... with other companies.
- LRLenny Rachitsky
Yeah.
- BCBill Carr
Because, you know... But I can tell you, did we have a lot of things that we launched that failed? Yes. Some of them are very public and obvious. I'll give you one that people don't really realize. We had a feature in the early 2000s called Slots, and what it was, was basically that third parties could bid on different search terms and put a little ad in there.
- LRLenny Rachitsky
Mm. Sounds familiar.
- BCBill Carr
Well, that didn't work. You know, obviously that works now on Amazon, but it didn't work then. It didn't work then because we simply didn't have the scale-
- LRLenny Rachitsky
Mm-hmm.
- BCBill Carr
... that Amazon has today. So a lot of times (laughs) a product idea is a perfectly good idea, you just have the wrong time or the technology isn't there. I mean, Jeff wrote about a product, a puck that sat in your kitchen that you would talk to and ask for things and, you know, could shop from. He wrote about that in 2004.
- LRLenny Rachitsky
Mm.
- BCBill Carr
Well, the technology wasn't there to create that little puck, which one day would become Echo. It was sort of a decade away. But we had a lot of things we launched that failed. We were not afraid to take what we considered a well-calculated risk. I think many, many companies are less willing to do so, less committed to product innovation. They do fear failure, and they're really focused on their near-term financial goals. It's sort of not their fault. It's kind of the way a public company and Wall Street interact with each other that creates this dynamic.
- 1:09:55 – 1:13:57
Cultivating risk-taking and accepting failure
- LRLenny Rachitsky
Just to pull on that thread a little bit more, it feels like a lot of companies talk about, "We are okay failing, we're okay launching things that don't work." But then in practice, their performance reviews are impacted. Teams get shut down, budgets get pulled. Is there something that you recommend to companies that want to actually improve at this? Like, what could they actually change to do this well?
- BCBill Carr
Yeah. I actually just spoke with a senior executive at a well-known Silicon Valley company about this topic the other day and said, "Well, what is it we had structurally at Amazon, especially from a people point of view, that would enable or encourage people to take these risks?" Because yes, in a lot of companies, if you go work on the project that fails, then your career is in the garbage can, and/or with your compensation system you're gonna lose out on that bonus. So there were two things. One was our compensation system. There were no performance bonuses. So if I was running the book business and I had a killer year from a financial point of view, there was no extra kicker for me. And if I ran the book business and it had a bad year, there was no financial penalty for me either, because our compensation was based on the stock price. So we all had an incentive to do what's right for the company, frankly over the long term, because trying to win off of short-term fluctuations on Wall Street is kind of a losing proposition.
- LRLenny Rachitsky
Mm-hmm.
- BCBill Carr
Which meant that... You know, I had that situation. I moved off of working on our largest P&L, the book business, to go work on digital media, music and video. There was zero business there. (laughs) This might not work. Well, my compensation didn't change as a result of that, one way or the other. We also had a performance management system that would adjust compensation based on evaluating what you actually delivered, kind of more in an input method. We cared about the outputs too, but there are plenty of people that could be in a business that's up and to the right but has nothing to do with them. And so we tried to focus more on, well, what did you actually build and contribute? Ways you improved selection, or lowered prices, or whatever that might be. So those things about compensation mattered a lot. And then the second thing was having a CEO who was really committed to it, and it wasn't something that they delegated to someone else. Safi Bahcall, I think, writes about this in his book Loonshots, where part of the conditions necessary for innovation to occur are that you actually create different structures of decision-making, of approvals, of all kinds of things, if you create some team that's gonna go build something new and innovative. Because most of the structures inside a big company are kind of designed to crush (laughs) and impede a small, innovative team that's trying to go build something new. They need speed, but an approval here, an approval there, is gonna get in their way. ... And we solved that two ways.
One was, when we went to go build digital media and AWS, we put two of our smartest leaders in the company on those things, Steve Kessel and Andy Jassy. And number two, they were meeting with Jeff regularly. Jeff was deeply engaged with them, reviewing what we were going to go build, part of the decision of what to go build. And so, between their seniority and of course him being the CEO, they could run interference on these sorts of things too. So even if you want to have innovation, even if you really do crave it and you're willing to take the risk, if you don't set up the organization in the right way, you're just not gonna get it.
- LRLenny Rachitsky
Amazing. I'm glad we got into that. I wasn't planning to talk about that and I'm glad we did.
- 1:13:57 – 1:18:21
Amazon’s “bar-raiser” practice for hiring
- LRLenny Rachitsky
Final topic: this concept of bar raisers. It feels like it's been such a core way of allowing Amazon to scale successfully, and I think that's something a lot of people can implement. It's a very self-contained thing you could just add at your company. Can you talk about what this idea of a bar raiser is in the hiring process, and then what people can do if they wanted to add this to their own hiring process?
- BCBill Carr
So the bar raiser hiring process is actually one of the first processes described in the book that Amazon established. It was pretty early in the company's history, back in 1999. And we created it for a simple reason. To quote one senior leader at Amazon, "We had new people hiring new people, hiring new people."
- LRLenny Rachitsky
(laughs)
- BCBill Carr
We were in our hypergrowth phase, okay? The company was only, what, three, four years old, and we were growing like a weed at that point. So this started off actually in our tech org, and what our senior leaders in tech realized is, "My gosh, we hire some new engineering leader, and then the next thing is their job is to go hire the senior managers, and they'll go hire managers, and all these people have been here for, you know, (laughs) like a week. They don't really even know our company yet. They don't know our culture yet. They don't know our standards yet. So what information are they using to make these hires?" And the information they were using was obviously just their own personal judgment, combined with whatever criteria they used at prior companies they worked for. So let's say they came over from Microsoft: if Microsoft had some methodology or criteria, they probably would just apply that. Well, is that methodology or criteria relevant to our company? Because every company has a different culture, and I'm here to tell you that the fact that someone's been a super successful vice president at Microsoft does not mean they can be super successful at Amazon, or at Google, or Facebook. Sometimes they can, but these companies are very different. They all do work very differently. The way leadership happens and decisions are made is very different. So how do we fix this problem, other than letting it run rampant and basically hiring a bunch of people who we don't know fit our culture, and who we don't know meet the high standards we have for what we expect of engineering leaders or engineers?
So they created this bar raiser process, which by the way they borrowed from Microsoft, which had a process called As Appropriate. And the concept was that on every interview loop, there's one person who is not the hiring manager, who doesn't report to the hiring manager, and who's not the recruiting manager. They're in the business: they're a software development manager, or they're a marketing manager, and they are on the interview loop as a bar raiser. Which means when we get to the debrief meeting, they will run that meeting, not the hiring manager, not the recruiter. And it also means that they technically have veto power over the hiring manager, which, by the way, a good bar raiser never uses, or I never saw a bar raiser use. I was a bar raiser, and in my 15 years at Amazon, I never used it, never saw it used. And then finally, which actually was not true in 1999 but later became true, once we established our leadership principles, we created a set of objective criteria and an interview methodology that would be used in every interview: the objective criteria would be our leadership principles, and the methodology would be behavioral-based interviewing. So this bar raiser would basically be a subject matter expert on how this process worked. They'd conduct the debrief to make sure we were actually adhering to the process, that people were sticking to the objective criteria, rather than saying, "I don't think we should hire this person because... I don't know, they don't seem to want to work here enough." Maybe that's a valid reason, but it is actually not part of our objective criteria.
And so the bar raiser was also there to act as a balance on the urgency bias that every hiring manager has, which is, "I gotta fill these roles." But rather than filling them with the next warm body they find, make sure they fill them with people who actually fit our culture, meet our standards, and meet our standards for functional excellence too.
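The structural rules Bill lists (exactly one bar raiser per loop, who is not the hiring manager, does not report to the hiring manager, and is not the recruiter) are concrete enough to express as a check. This is a minimal sketch with hypothetical names and fields; it is not Amazon's actual tooling, and a real system would also model the debrief and the veto.

```python
from dataclasses import dataclass, field


@dataclass
class Interviewer:
    name: str
    manager: str
    is_bar_raiser: bool = False


@dataclass
class Loop:
    hiring_manager: str
    recruiter: str
    interviewers: list = field(default_factory=list)


def validate_loop(loop):
    """Check the bar raiser independence rules described above."""
    brs = [i for i in loop.interviewers if i.is_bar_raiser]
    if len(brs) != 1:  # exactly one bar raiser on every loop
        return False
    br = brs[0]
    return (br.name != loop.hiring_manager      # not the hiring manager
            and br.manager != loop.hiring_manager  # doesn't report to them
            and br.name != loop.recruiter)      # not the recruiter

loop = Loop(
    hiring_manager="hm",
    recruiter="rec",
    interviewers=[
        Interviewer("hm", manager="dir"),
        Interviewer("peer", manager="hm"),
        Interviewer("br_from_other_org", manager="other_dir", is_bar_raiser=True),
    ],
)
print(validate_loop(loop))  # True under these hypothetical names
```

The check captures why the role works: the one person with authority over the debrief has no reporting-line stake in filling the role quickly.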
- 1:18:21 – 1:20:41
Selecting Amazon’s bar raisers
- LRLenny Rachitsky
Such a cool process. Two questions along these lines. One is, who has the final decision in hiring? Is it, uh, the hiring manager? Y- and this is just-
- BCBill Carr
Yes.
- LRLenny Rachitsky
... advice from the bar raiser?
- BCBill Carr
Yeah. So this often gets confused. The decision-maker is the hiring manager. The whole interview loop and the bar raiser are actually just there to help the hiring manager make the right decision.
- LRLenny Rachitsky
Mm-hmm.
- BCBill Carr
Now oftentimes, the hiring manager could feel like this is actually a bureaucratic process, a group of people that I have to sell who are just in my way (laughs) between me and hiring this person, which is kind of a natural feeling to have. But one piece of feedback I would always give managers who are new to this is: "No, no, no. That's not the way to think about it. These people are helping you. Because the amount of time you're gonna put into the hiring process may seem like a lot, but if you hire the wrong person, boy, the amount of time you're gonna have to deal with managing that person is gonna be a lot more, plus the impact on the team, the impact on you. So making a great decision here is important. They're here to help you." So yes, the final decision was with the hiring manager. Technically speaking, the bar raiser could block a decision to hire someone. But done well, they would instead help the hiring manager see the reasons not to hire the person through kind of a Socratic method in how they guided the discussion.
Episode duration: 1:33:27
Transcript of episode S9WHQa_AJQo