Lenny's Podcast: Becoming evidence-guided | Itamar Gilad (Gmail, YouTube, Microsoft)
EVERY SPOKEN WORD
135 min read · 27,071 words
- 0:00 – 4:35
Itamar’s background
- IGItamar Gilad
... you fake it, you do a fake door test. You do a smoke test, Wizard of Oz test. We used a lot of those in the tabbed inbox, by the way. One of the first early versions was actually, we showed the tabbed inbox working to people but it wasn't really Gmail. It was just a façade of HTML and behind the scenes, and according to the permissions that the users gave us, some of us moved just the subject and the sender into the right place. So initially, the interviewer kind of distracted them and then they showed them their inbox, and in it, the top 50 messages were sorted to the right place, more or less, if we got it right. And people were like, "Wow, this is actually very cool." But it gave us some evidence to go and say, "Hey, we should try and build this thing."
- LRLenny Rachitsky
(instrumental music) Welcome to Lenny's Podcast, where I interview world-class product leaders and growth experts to learn from their hard-won experiences building and growing today's most successful products. Today my guest is Itamar Gilad. Itamar is a product coach, author, speaker and former long-time product manager at Google, where he worked on Gmail, Identity and YouTube. He also just published an awesome new book called Evidence-Guided: Creating High-Impact Products in the Face of Uncertainty. Itamar has an important perspective on why and also how you can push your team and organization from an opinion-based decision-making process to a- a more evidence-guided approach. In our conversation, Itamar shares a number of very practical and handy frameworks to do just that, including the confidence meter, metrics trees, gist and the gist board, plus his take on how people often misuse ICE for prioritizing ideas, also how you can make your OKRs more effective, and so much more. Enjoy this episode with Itamar Gilad after a short word from our sponsors. This episode is brought to you by Ezra, the leading full body cancer screening company. I actually used Ezra earlier this year, unrelated to this podcast, completely on my own dime, because my wife did one and loved it, and I was super curious to see if there's anything that I should be paying attention to in my body as I get older. The way it works is, you book an appointment, you come in, you put on some very cool silky pajamas that they give you that you get to keep afterwards. You go into an MRI machine for 30 to 45 minutes, and then about a week later you get this detailed report sharing what they found in your body. Luckily, I had what they called an unremarkable screening, which means they didn't find anything cancerous, but they did find some issues in my back which I'm getting checked out at a physical next month, probably because I spend so much time sitting in front of a computer. 
Half of all men will have cancer at some point in their lives, as will one third of women. Half of all of them will detect it late. According to the American Cancer Society, early cancer detection has an 80% survival rate, compared to less than 20% for late stage cancer. The Ezra team has helped 13% of their customers identify potential cancer early and 50% of them identify other clinically significant issues such as aneurysms; disc herniations, which maybe is what I have; or fatty liver disease. Ezra scans for cancer and 500 other conditions in 13 organs using a full body MRI powered by AI, and just launched the world's only 30 minute full body scan which is also their most affordable. Their scans are non-invasive and radiation-free and Ezra is offering listeners $150 off their first scan with code Lenny150. Book your scan at ezra.com/lenny. That's E-Z-R-A.com/lenny. This episode is brought to you by Vanta, helping you streamline your security compliance to accelerate your growth. Thousands of fast-growing companies like Gusto, Calm, Quora, and Modern Treasury trust Vanta to help build, scale, manage, and demonstrate their security and compliance programs, and get ready for audits in weeks, not months. By offering the most in-demand security and privacy frameworks such as SOC2, ISO 27001, GDPR, HIPAA, and many more, Vanta helps companies obtain the reports they need to accelerate growth, build efficient compliance processes, mitigate risks to their businesses, and build trust with external stakeholders. Over 5,000 fast-growing companies use Vanta to automate up to 90% of the work involved with SOC2 and these other frameworks. For a limited time, Lenny's Podcast listeners get $1,000 off Vanta. Go to vanta.com/lenny. That's V-A-N-T-A.com/lenny to learn more and to claim your discounts. Get started today.
- 4:35 – 8:35
How his time working on Gmail shaped his philosophy of “opinion-based” development
- LRLenny Rachitsky
Itamar, thank you so much for being here. Welcome to the podcast.
- IGItamar Gilad
It's a pleasure being here. Thank you for inviting me.
- LRLenny Rachitsky
It's my pleasure. I thought we'd start with the story of your work on Google+ and Gmail, and how those experiences formed your perspective on how to build successful product. Can you share that story?
- IGItamar Gilad
Google+ was my first experience at Gmail. I joined Gmail in August 2011 and the first thing they asked me was, "Let's connect Gmail with Google+." If you're hazy about the, the story, back then Facebook was massive. It's still massive but then it was growing like mushrooms. People were spending hours. It really freaked out Google and the solution, the obvious solution was to launch a social network of Google called Google+. And we all believed in this thing, it really caught on very well initially. We all used it, we all believed in it, so our mission was to build this thing and Google really spared no expense. It created a whole new division within Google and it created a whole strategy around Google+, and we had to connect Gmail and, uh, YouTube and Search to Google+ to make them more personalized in a sense and more social. So that was the, the idea and we went on and we launched a series of features in Gmail for a couple of years honestly, and Google+ itself became this massive project, very feature-rich and with a lot of redesigns and iterations. And none of it worked. Turned out people actually didn't need another social network, people didn't love it, people didn't use it. Eventually in, uh, Gmail, we rolled back all the Google+ integrations a few years later and Google+ itself was shut down in 2019. So putting aside all the tremendous waste that went into this, eh, all the millions of person hours, eh, and person weeks, in hindsight not only did Google bet on the wrong thing, it missed much easier opportunities. So just not far from Google's headquarters there was WhatsApp, not very famous in the US but they actually created massive impact. Hundreds of millions of people were using their stuff and they became a threat to Facebook much more than Google was. So Google missed the opportunity of social mobile apps like WhatsApp, like Snapchat, et cetera. And for me, this story kind of was the epitome of what I call today opinion-based, uh, development. 
We come up with an idea, we believe in it, we... All the indications show it's good, maybe the early tests show it's good, then we just go all in and we try to implement it. And I made this very mistake many times as a product manager, I was the guy pushing for the ideas. Uh, so for me this was kind of a turning point. I felt we need to adopt a different system.
- LRLenny Rachitsky
And just before you move on to the next story, how big was the team roughly or how many years was spent on this area? Just to give people a sense of the- the waste as you said.
- IGItamar Gilad
So there was a tremendous earthquake inside Google to create the Google+ team. Teams and entire divisions were kind of torn apart and reformatted, and I think at its peak it was about a thousand people inside Google.
- LRLenny Rachitsky
Wow. Oh my god.
- IGItamar Gilad
It was a division the size of Android and Docs, a really sizable thing. They were in their own building. It was... Yeah, it was taken from the playbook of Steve Jobs, you know, create this whole secretive project inside and just run like hell.
- LRLenny Rachitsky
Yeah. I remember though, Facebook was really scared. I remember they shut everything down, it was like a Code- DEFCON 1 situation too, so it really scared Facebook at the same time.
- IGItamar Gilad
Yeah, it's true. Uh, but at the end of the day, neither Google's advertising revenue was affected, nor was, uh, Facebook. So it turned out this idea was not that necessary after
- 8:35 – 13:40
Lessons from developing Gmail’s tabbed inbox
- IGItamar Gilad
all.
- LRLenny Rachitsky
Yeah. Yeah. Okay. So that's an example of something that didn't work because it was opinion-based, uh, development, I think, is the phrase you used, and then there's a different experience with, uh, with tabs, I think with Gmail?
- IGItamar Gilad
That's right. So Google is a very successful company, it's not for me to criticize it or to in hindsight kind of, uh, say, "Y- you guys need to be better." And some of the people that were behind Google+ were some of the smartest leaders and I still think they are, uh, despite this, uh, story. If you look back at the history of Google, how things started in the first decade or so, Google was what I call an evidence-guided company. So essentially it put a high premium on focusing on customers, on coming up with a lot of ideas, on looking at the data, looking at how these ideas actually worked out. They weren't shy about launching, you know, betas and things that were very rough and incomplete and learning from that, and then they expected people to take action based on the- on the results. So, uh, fail fast is a very famous, uh, paradigm. And so you had to kill your project or pivot it seriously if it didn't work out, and I think ha- had we kept fail fast, it would really have helped, eh, Google+ if we had this mentality. But for some reason with Google+, Google put this playbook aside and used a different playbook which I call plan- plan and execute essentially. Uh, but I think inside Google, the DNA still existed, so inside Gmail, the next project after Google+ was the tabbed inbox. So it started with the... It was kind of the reverse of Google+. It started as a very small idea that no one believed in, and we started looking what's behind this idea? What's the goal? What's the problem actually we're trying to solve? It turned out that a lot of people were get- re- receiving social notifications and promotions, et cetera, and most of them were very passive. They weren't clearing their inbox, they were just living in this world of clutter, and I came up with an idea how to fix this. I was sure it was great, I wanted to push it, you know, plan and execute, but my colleagues were like, "Hold on. We actually tried this. 
We have a bunch of ideas to help people organize their inbox, they're not using it. Why- why is your idea good?" So that sent us kind of, me and my team, into searching, into researching these users, into establishing a goal that was much more user-centric, and then thinking of other ideas. And then we started testing them much more rigorously, uh, and basically we started testing on our own inboxes and then we recruited other dogfooders, other Googlers to- to test the same inbox, then we put it outside for external testers. We did usability studies, we did data, we- we built a whole data, eh, mining team and a whole, um, machine learning team to build the right categorization, and we ended up with a solution that turned out to be very successful for a lot of these passive users. And this was a surprise to a lot of people because most of my colleagues and most of the people I talk with actually know how to manage their inbox. So for them, that solution makes no sense at all, like splitting promotions and social to the side sounds like it's the stupidest idea. But there's about 85% of the population, 85 to 88%, that absolutely love it, and today Gmail has about 1.8 billion active users according to Google... most of these users are using this feature. So it was a pretty high-impact feature as well.
- LRLenny Rachitsky
And the feature specifically, just in case people aren't totally getting it, is the promotions folder and the, uh, social, I think, and then the regular-
- IGItamar Gilad
Yeah, there, there are a couple more that you can enable in settings if you, if you like.
- LRLenny Rachitsky
Mm. Yeah, I use it. I love it, except it puts my newsletter in people's Promotions folder. Who do I talk to about that?
- IGItamar Gilad
Yeah, newsletters are a very complicated, uh, scenario for the categorization engine.
- LRLenny Rachitsky
Yeah, we just need an exception for my newsletter and then we're good. Okay, but go on.
- IGItamar Gilad
So in hindsight, I was asking myself, "Why was this project so different?" And I think the reason is that we didn't have that much confidence in our opinions. We had opinions, we had ideas but we didn't just go all in and just, "Let's build it." We actually used an evidence-guided system and I think that's not unique just to Google. I think every successful product company out there that you look at, Amazon, Airbnb, anyone you, you will check, at least in their best periods, they found a way to balance human judgment with evidence. They didn't try to obliterate human judgment and opinion, just to supercharge them with evidence and they came up with very different models. Apple is another example, um, but the principle still holds in all these companies.
- LRLenny Rachitsky
Awesome. So you took that experience and all the experience you've had from coaching product leaders, working with
- 13:40 – 14:30
A brief overview of Itamar’s book, Evidence-Guided
- LRLenny Rachitsky
companies, and you wrote this book called Evidence-Guided, which people on YouTube can see sitting there behind you, and so I want to talk through some of these stories and then some of these other lessons and frameworks that emerged. But maybe just to start, what's, what's the elevator pitch for this book?
- IGItamar Gilad
So this is a book for people like us, product people who want to bring evidence-guided thinking, or modern product management if you like, into their organizations. There's a lot of challenges, it's not simple. We all read the books, we all know the theory, we all know some parts of the system. The book tries to give you a system for how to do that. It's a meta-framework that kind of helps you lift your organization in the direction of evidence guidance, if that's what you want to do.
- LRLenny Rachitsky
So going back to the story briefly before we get into the frameworks and lessons of the book. In the first example of Google+,
- 14:30 – 17:32
Balancing founder creativity with an evidence-based approach
- LRLenny Rachitsky
basically it came top down, "Hey, we need to build a social network. Go build it." Obviously, that happens at a lot of companies. I, I don't know if there's an easy answer to this, but are there cases where it does make sense to approach it that way? Obviously Apple is a classic example of Steve Jobs, is like, "We need to build an iPhone." I don't know if that's exactly how it went, but are there instances where it is worth just approaching new product ideas that way, based on kind of the experience and creativity and insights of the founder? Or is your thinking it should always come from this evidence-based approach?
- IGItamar Gilad
I, I think the founders are very important, especially in the startup and scale-up phases. They come up with many of the most important ideas and it's super important that they have the, the space to express them and to push the organization to look at those. However, it's not about shutting them down, it's about looking at them critically. You need to create the environment in your organization where, when the leader comes and says, "You know what? I talked to these three customers. I figured it out. Here's what we need to do in the next five years," you can ask, "Where's your evidence?" And by the way, the example you give, that's a classic example. Steve Jobs, he just brainstormed the iPhone in his, I don't know, kitchen, uh, and then just told the team to build it. That's the story Steve Jobs told but it's not the, the real story at all. Now we know what actually happened and the iPhone has a, actually a story of discovery, of trial and error. Multiple projects to do with, uh, multi-touch, with, with phones. Most of them failed. Steve Jobs was the architect, he kind of managed to connect the dots and eventually come up with this perfect device. Uh, but he wasn't actually the, the creator, it wasn't his brainchild. He was actually against it for a while but over time as he saw the evidence, as he saw what this thing could do, as he saw the demos, he was able to piece together something that was very useful.
- LRLenny Rachitsky
Mm. That's really an important insight. People that are hearing this might feel like, okay, I like this idea of pushing back and encouraging the founders to make it more evidenced guided. In the case of say Google+, was, was it even possible? Could you have come to Larry and Sergey and be like, "Here's all this data I've gathered that tells us this is not going to work." Do you have any ins- advice for how to push back and encourage the founders and execs to like really take that, the counter point seriously or really kind of vet their idea?
- IGItamar Gilad
So another nice thing about Google is that it's a very open culture and people are not shy to tell even Sergey and Larry that they're wrong and they do this all the time in certain forms, right? It's not, uh... You need to know the right channels. But there was a very big discussion about Google+ and whethe- whether it's the right thing to create a clone of Facebook, there was very public internal discussion. I think what I would change is not have this discussion based on opinions because when you have the discussion, you come with your own opinions,
- 17:32 – 19:36
Advice on how to push back against founders
- IGItamar Gilad
usually the s- the most senior person's opinions will win. That's just the way it is. If we had come with data, hard data, and we said, "Listen. Things are not actually panning out the way you guys are all, all expecting. What can we do? Should we continue? Should we pivot this?" I think the discussion would've gone better. Now I'm doing a huge disservice, I was not in all the discussions. I know probably in Google+ there were very s- serious discussions happening along these lines. Uh, but just as a general trend, I find that evidence is very empowering for us smaller people in the organization, or mid-level managers, to challenge the opinions.
- LRLenny Rachitsky
Is there anything tactically you've found to be useful and effective in getting people... Say they don't work at Google, they work at companies where founders and bosses and execs are not as open to challenge. Is there anything tactical you've found about how to present a counter proposal or like, "Hey, I have this data that we should really pay attention to."
- IGItamar Gilad
I think if you come with data, if you run a secret experiment and you come back and you show them, you usually get one of two, uh, results. Either they get extremely mad at you and they tell you to get back to work and to do what you were told, and in that case, probably you need to start polishing your resume and look for another place, either inside the organization or outside it, because that person is not being reasonable, to be honest. But the more common case is they're, they're pleasantly surprised and that's what happened with Steve Jobs as well. He was against phones but then people showed him all sorts of evidence that Apple can make a phone. He was against multi-touch initially but then he changed his mind. There, there was a lot of like back and forth. So even Steve Jobs, given evidence, was willing to, to, to flip. And I've seen this in many organizations, so evidence is so powerful, that's why this is the principle I, I based the book on.
- LRLenny Rachitsky
You know, this concept of being evidence-guided, people listening may feel like, "Hey, we're evidence-guided, we run experiments, we make decisions using data."
- 19:36 – 21:13
Signs you aren’t as evidence-guided as you may think
- LRLenny Rachitsky
Oftentimes they aren't actually, and so what are signs that maybe you're not actually that evidence-guided, as evidence-guided as you think you are?
- IGItamar Gilad
I think there's a few telltale signs that I look for. Uh, first the goals are very unclear. Either there are many or they're very kind of obscure and vague, or they're about output, there's misalignment, so the goals part is not there. Usually this goes hand in hand with, uh, metrics, missing metrics or- or just using, you know, revenue and business metrics but there are no user-facing metrics, so that's another telltale sign. Then there is a lot of time and effort spent on planning, especially on roadmapping, creating the perfect roadmap, which really can consume a lot of time from top management and PMs, et cetera. Then as, as you go down you see there's not a lot of experimentation, and if there is experimentation, there's not a lot of learning. And finally another telltale sign is that the team is disengaged, so the engineers are kind of getting the signal that what they need to do is deliver. They're focused on, on output, that's what they're measured on, so they're kind of disengaged. They're disengaged from the users, from the business, they don't care that much. That's usually a sign of, uh... or it's usually something that you can fix by adopting a more evidence-guided system.
- LRLenny Rachitsky
Okay, so let's dive into your approach to becoming more evidence-guided. In the book you
- 21:13 – 23:51
Itamar’s GIST model for becoming more evidence-guided
- LRLenny Rachitsky
share this model that you call the GIST model which is kind of this overarching approach to building product that almost forces you to be more evidence-guided. So let's just start with, what's the simplest way to understand this GIST model?
- IGItamar Gilad
Um, with your permission I can show a few slides and-
- LRLenny Rachitsky
Oh, let's do it.
- IGItamar Gilad
... maybe that, that will help?
- LRLenny Rachitsky
Here we go, yeah.
- IGItamar Gilad
But-
- LRLenny Rachitsky
And then yeah, good excuse to go check it, check us out on YouTube.
- IGItamar Gilad
All right, you're seeing this, uh, so this is the GIST model: goals, ideas, steps and tasks. Essentially it tries to break the change, which is a really big change for a lot of companies, into four slightly more manageable parts. They're still big but each one you can tackle on its own, and that's kind of the reason I split it. Goals are about defining what we're trying to achieve, ideas are hypothetical ways to achieve the goals, steps are ways to implement the idea and validate it at the same time, so essentially build-measure-learn loops, and tasks are the things we manage, you know, in Kanban and Jira and all these good tools. These are the things that your development team is usually very focused on. And just listening to this, a lot of it will sound familiar to you because GIST is not a brand new invention, it's a meta-framework that puts in place a lot of existing methodologies. It's based on lean startup, on design thinking, product discovery, growth. There's a lot of all these things here, it just tries to put them all into one framework or one model.
- LRLenny Rachitsky
So what's the simplest way to think about what this model is meant for? Is this how you think about your roadmap? Is this how you plan? What is this trying to tell people to do differently in the way they build product broadly?
- IGItamar Gilad
I would say these are four areas that you need to look at and ask, "Are we doing the right thing in each?" In each one you may need to change or even transform and as I go and explain each one of those I'll, I'll give you basically three things. I- in each chapter in the book I try to, to touch on three things. The principles behind them, the frameworks or models that implement the, the, the principles and then process. And the process honestly is the most brutal part and the mo- the one that you will need to change and adapt to your company because no two companies are exactly the same. And it's very tempting when you write a book not to give any process but that's the part that people actually want the most, so it's included as well but just be aware that you will have to change this process.
- LRLenny Rachitsky
Awesome. Okay, so we're going to talk about each of these four layers. Before we do that, where do like vision and strategy fit
- 23:51 – 28:45
How to set overarching goals using his “value exchange loop”
- LRLenny Rachitsky
into this? Do they bucket into one of these four layers and how do you think about strategy and vision?
- IGItamar Gilad
That's a great question, so there's this whole strategic context that is outside of GIST, GIST is not trying to tackle that. It assumes it's in place. Uh, there's another huge blob which is research. GIST is not about research, it's more about kind of discovery and delivery. Uh, but strategy is extremely important and you can use some of the tools we will talk about to develop your strategy as well... in many companies the strategy is just a roadmap on steroids, it's-
- LRLenny Rachitsky
Hm.
- IGItamar Gilad
... small plan and execute just on a grand scale and Google+ again was a strategic choice actually, uh, if you think about it. So in the book there is a chapter where I, I touch on strategy and I explain how the same evidence guided methods are being used by companies to develop their strategy as well.
- LRLenny Rachitsky
Awesome. Maybe one last context question, so people might be seeing this and thinking, "Okay, cool, I have goals, I have ideas, steps, I have tasks, I'm already doing this." What is this kind of a counter or reaction to? What are people probably missing when they're seeing this and they're like, "Oh, I see, this is, like, what we're not doing and this is the most important... This is something we should probably change"? And then we'll go through these in detail too.
- IGItamar Gilad
I, I think talking about each one will-
- LRLenny Rachitsky
Okay.
- IGItamar Gilad
... will help.
- LRLenny Rachitsky
Let's do it. Let's do it.
- IGItamar Gilad
But, uh, we can talk about in each level what's actually being done so when-
- LRLenny Rachitsky
Awesome.
- IGItamar Gilad
... people say, "I have goals," usually they, they take the goals layer and use it as a planning session. They talk about, "What shall we build, by when, what are the resources?" And that's actually not goals at all, that's planning work. And-
- LRLenny Rachitsky
Cool. Let's talk about goals. And I know-
- IGItamar Gilad
All right.
- LRLenny Rachitsky
... part of this is OKR related too, so I'm excited to hear your take on OKRs.
- IGItamar Gilad
Oh, that's a whole different discussion, uh, (laughs) you had, uh, Christina, you had a, a real expert on, over there, so I, I doubt I can add more to that. But it's true, OKR is all part of it. But let's start with goals what, what, what's-
- LRLenny Rachitsky
Let's do it.
- IGItamar Gilad
... are goals supposed to be? Goals are supposed to paint the end state, to define where we want to end up and the evidence will not guide you unless you w- you know where you want to go, and in many companies, what you have is goals at the top for revenue, market share, whatever it is, and then a bunch of, uh, siloed goals for each department, there's engineering goals, there's design goals, there's marketing goals, et cetera. And that actually pushes people into different vectors and it's really hard to decide. And I would argue that in evidence-guided companies, and you worked for a few so probably you've seen this, they use models in order to construct overarching goals for the entire organization. One of the models I show in the chapter about goals is the value exchange, eh, loop, where basically the organization is trying to deliver as much value as it can to the market and to capture as much value back, and by creating a feedback loop between these two you are actually able to grow very fast. Now, I would argue that you want to measure both of these and to put a metric on each, and the metric we usually use to measure value delivered is called the North Star metric. I know you wrote an article, a very good article about it.
- LRLenny Rachitsky
(laughs) Thank you.
- IGItamar Gilad
And in it you listed dozens and dozens of companies, like leading companies, and what they considered a North Star metric. Super interesting. I would argue that what they told you is what is the most important metric we measure, what is the number one metric for us? But it's not what I call the North Star metric. The North Star metric measures how much value we create for the market. For example, let's take WhatsApp. WhatsApp for a very long time measured messages sent, because every message sent is a little increment of value for the sender and the receiver, it's free, it's rich media, you can send it from anywhere in the world. Compared to SMS, that's huge value. So if in year one we have a, a billion messages being sent and in year two, two billion, probably we doubled the amount of value. In Airbnb I think one of your key metrics or the real North Star metric was nights booked-
- LRLenny Rachitsky
Mm-hmm.
- IGItamar Gilad
... I don't know if it was still the case while you were there?
- LRLenny Rachitsky
Yeah. Yeah. Absolutely.
- IGItamar Gilad
And there are examples like this, uh, in, um, Amplitude for example, uh, they measure lear- active learning users, or weekly-
- LRLenny Rachitsky
Yeah.
- IGItamar Gilad
... active, eh, learning users, which are users that found in the tool some insight that was so important that they shared it with at least two other users who consumed it. So it's a very powerful thing to point at this metric and say this is the most important metric, combined with the value metric that we want to capture, revenue, market share, whatever it is. Once you have these two you can, uh, further break them down into what I call metrics trees. So there's a metric tree for the North Star metric and there's a metric tree for the top KPI, the top business metric, which you see here on the, on the left side in blue,
- 28:45 – 33:47
North star metrics vs. KPIs
- IGItamar Gilad
and usually they overlap, so you might find in the middle some metrics that are super, super important because moving them actually moves the needle on everything else.
- LRLenny Rachitsky
Can you clarify again the difference between what you called this top KPI versus North Star metric?
- IGItamar Gilad
So the North Star metric, uh, is measuring how much value we're creating for the user, the core value that they're getting. In this case this is some productivity suite, so this is number of documents created per month, for example. Because we think that every document created, maybe it's a small document, I don't know, AI is in fashion now, is a little incremental value, so that's the number we're trying to grow. The top KPI is what we expect to get. It should be re- revenue or profit or-
- LRLenny Rachitsky
I see. This is the value exchanged. I see. One is what users are getting, one is what you're getting back from them?
- IGItamar Gilad
Exactly. Yeah.
- LRLenny Rachitsky
Basically what the business is... How the business is benefiting. Awesome. I think this is a really important concept, the metric tree. I think a lot of people think they have something like this in mind where they're just like, "Oh, here's our North Star metric, here's the levers and things that we can work on to move that." But I think actually mapping it out the way you have it here where it kind of goes layers and layers deep to all of the different variables that impact this metric, not only is a way to think about impact and goals and things like that, but also helps you estimate the impact of the experiment you're potentially thinking about running. So if you're gonna work on something at the bottom here like activation rate, like, say you move that 10%, how much is that going to impact this global metric? It's probably a very small amount.
- IGItamar Gilad
This is a very important one and, and we will talk about impact assessment shortly. This helps with it. It also helps with alignment, because the entire organization is trying to move these two metrics, it's the two sides of our mission essentially. We have the mission, that's the top objective of the company, and these are the two... top-most key results if you like, the top-most things. So when you go and work with another team and you say, "Hey, why don't you work on my project?" They might say, "You know, this other idea actually might move the North Star metric more than your idea." And that helps you guys align, and I've seen cases where team B put aside their own ideas to jump on the ideas of team A because of this model. It also creates an opportunity to give some sub-metrics to, to teams to own on an ongoing basis, um, so it creates a little sense of ownership as well, and mission, within the tree.
- LRLenny Rachitsky
It also helps you figure out what teams you should have, which teams are, have the biggest potential to impact the metric.
- IGItamar Gilad
Another thing that happens in a lot of organizations: the team topology reflects, you know, the structure of the, of the software, or some hierarchical model where we want to organize the, the organization in a particular way. But if you start with a metrics tree, you can try to arrange the topology around goals, and sometimes you need to readjust. It's not a constant re-org, but from time to time you will realize the goals have changed and we need to reorganize. So the tree helps visualize that as well.
- LRLenny Rachitsky
I think for people that are listening to this and thinking about this, I think the simplest way to even think about this is basically there's a formula, there's like a math formula that equals your North Star metric or your revenue or whatever you're trying to do. And if you don't have some ideally really clear sense of what that math formula is, you should work on that because that will inform so much of how you think about where to invest, what teams to have, where to invest more resources, less resources.
- IGItamar Gilad
Right.
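The "math formula" framing above can be made concrete with a tiny sketch. The metric names and numbers below are invented for illustration, not from the episode; the point is that in a tree like this, moving a bottom-level lever such as activation rate by 10% moves the global North Star by much less, because it only touches one branch.

```python
# Hypothetical metrics tree flattened into a formula: the North Star
# (documents created per month) is the sum of two branches, and the
# activation-rate lever only affects the new-user branch.

def north_star(signups, activation_rate, docs_per_new_user,
               retained_users, docs_per_retained_user):
    new_user_docs = signups * activation_rate * docs_per_new_user
    retained_docs = retained_users * docs_per_retained_user
    return new_user_docs + retained_docs

baseline = north_star(10_000, 0.40, 2, 50_000, 5)   # ≈ 258,000 docs/month
# Move activation rate by a relative 10% (0.40 -> 0.44):
improved = north_star(10_000, 0.44, 2, 50_000, 5)   # ≈ 258,800 docs/month
lift = improved / baseline - 1
print(f"{baseline:,.0f} -> {improved:,.0f} docs/month ({lift:.2%} lift)")
```

A 10% win on the leaf metric yields only about a 0.31% lift on the North Star here, which is exactly the kind of impact estimate the tree makes easy to eyeball.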
- LRLenny Rachitsky
Imagine a place where you can find all your potential customers and get your message in front of them in a cost-efficient way. If you're a B2B business, that place exists and it's called LinkedIn. LinkedIn Ads allows you to build the right relationships, drive results and reach your customers in a respectful environment. Two of my portfolio companies, Webflow and Census are LinkedIn success stories. Census had a 10X increase in pipeline with a LinkedIn startup team. For Webflow, after ramping up on LinkedIn in Q4, they had the highest marketing source revenue quarter to date. With LinkedIn Ads, you'll have direct access to and can build relationships with decision-makers including 950 million members, 180 million senior execs and over 10 million C-level executives. You'll be able to drive results with targeting and measurement tools built specifically for B2B. In tech, LinkedIn generated 2 to 5X higher return on ad spend than any other social media platforms. Audiences on LinkedIn have two times the buying power of the average web audience and you'll work with a partner who respects the B2B world you operate in. Make B2B marketing everything it can be and get $100 credit on your next campaign. Just go to linkedin.com/podlenny to claim your credit. That's linkedin.com/podlenny. Terms and conditions apply. Okay, so, uh, metrics trees. What comes next?
- IGItamar Gilad
All right, so next we need to go to the ideas layer and the ideas layer is there to help us sort through the many
- 33:47 – 37:39
Using “ICE” to assess the value of ideas
- IGItamar Gilad
ideas we might encounter, and they may come from, as you said, the founders, the managers, the stakeholders, from the team, from research, from competitors, from... We're flooded with ideas, and what usually happens inside an organization is some sort of battle of opinions or, or some sort of politics sometimes, or highest paid person's opinion, HiPPO. You had Ronny Kohavi, who coined this term, on your show. Uh, what doesn't happen is very rational, logical decisions, these are the, the best ideas-
- LRLenny Rachitsky
Mm-hmm.
- IGItamar Gilad
... because it's really, really hard to predict, honestly. There is so much uncertainty in the needs of the users, in the changes in the market, in our technology, in our product, in our ow- own organization. It's almost impossible to say, "This idea is going to be the best." But we do say this because we have cognitive biases that kind of convince us that this idea is far superior to anything else and it's definitely the right choice. In order to avoid this, what we want to do is to evaluate ideas in a much more objective and consistent and transparent way. In the book I suggest using ICE: impact, confidence and ease. I think I have a, a slide coming on this. So impact, confidence and ease, which is basically a way to, to assign three values to each idea. The impact tries to assess how much impact it will have on the goals, and that's why it's so important that we have very clear goals, and not many. Are we measuring the ideas on the North Star metric, on the top business KPI, on a local metric of the team, whatever it is? Let's be clear about it and then let's evaluate the ideas against this thing. Ease is basically the opposite of effort, how easy or hard it's going to, to be. But both of those are guesstimates, both of those are things we need to estimate. I would argue that just by breaking the question into these two questions, we usually have a slightly better discussion than just my idea is better than yours. But then there's the third element, which is confidence, which tries to assess how sure are we, or should we be, about our first guesstimates, about the impact and the ease.
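As a sketch, ICE fits in a few lines of Python. The ideas and scores below are invented; impact and ease are 1-10 guesstimates, and confidence should come from the evidence you hold. Multiplying the three values is one common convention (averaging them is another):

```python
# A minimal ICE prioritization sketch. Ideas and numbers are invented.
# Multiplying impact x confidence x ease is one common convention
# (averaging is another); either way, the score bubbles ideas to the top.

ideas = [
    {"name": "onboarding wizard", "impact": 8, "confidence": 3, "ease": 4},
    {"name": "dark mode",         "impact": 3, "confidence": 7, "ease": 6},
    {"name": "AI assistant",      "impact": 9, "confidence": 1, "ease": 2},
]

for idea in ideas:
    idea["ice"] = idea["impact"] * idea["confidence"] * idea["ease"]

ranked = sorted(ideas, key=lambda i: i["ice"], reverse=True)
for idea in ranked:
    print(f'{idea["name"]:>20}  ICE = {idea["ice"]}')
```

Note how the high-impact but low-confidence "AI assistant" sinks to the bottom until evidence raises its confidence.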
- LRLenny Rachitsky
It's interesting you use the word ease, because I think it's usually effort. You kind of make it, uh, positive. Is that an intentional tweak you made?
- IGItamar Gilad
I'm, I'm using the, the definitions, the, of Sean Ellis. Sean invented-
- LRLenny Rachitsky
Oh, interesting.
- IGItamar Gilad
... uh, uh, ICE. You know Sean, I don't know if you, if you've had him yet, but he's-
- LRLenny Rachitsky
I haven't had him on yet.
- IGItamar Gilad
Yeah, for, for the people who don't know him, Sean is amazing. He's like one of the fathers of the growth movement, he coined the term growth hacking and he very pop- he popularized the, the concept of product market fit.
- LRLenny Rachitsky
Yeah.
- IGItamar Gilad
He created ICE, he created a bunch of things that we use in product that we don't even know-
- LRLenny Rachitsky
Wow, I didn't know he came up with ICE. Okay, cool. So the original version of ICE is ease instead of effort?
- IGItamar Gilad
... exactly, yeah.
- LRLenny Rachitsky
Fun fact.
- IGItamar Gilad
Uh, a, a lot of your viewers are w- wondering where's the R, 'cause there's another variant of this called RICE, where there's-
- LRLenny Rachitsky
Yeah.
- IGItamar Gilad
... reach as well. I prefer ICE, 'cause I prefer to fold the reach into the I for various reasons. But both are valid, both are equivalent in a sense.
- LRLenny Rachitsky
I'm in your boat, that's exactly how I think about it. I think people overcomplicate this stuff and try to get so many math formulas involved with estimating impact, and I feel like these are just simple heuristics to kind of bubble the best ideas to the top. It doesn't have to be a perfect estimate of impact and confidence and all those things. So, I think the simpler the better, and it always ends up being a spreadsheet anyway. People always have these tools to estimate these things but, I feel like a spreadsheet, Google Sheets, is great.
- IGItamar Gilad
So yeah, y- you're actually leading me to my next point. So when you come to estimate impact, you will realize it's the hardest part. So sometimes it's just a gut feeling and it's a guess, and sometimes it's
- 37:39 – 44:28
Itamar’s confidence meter
- IGItamar Gilad
based on some spreadsheet or some analysis, some back-of-envelope calculation you've done, and I think that's legitimate. Sometimes these things do show you some, uh, things you didn't think of, and sometimes, the best case, it's based on tests. You actually tested it, you interviewed 12 customers, you showed them this thing and out of those, only one actually liked it. You should reduce your impact based on that usually, or you do other types of tests, and we'll talk about testing in a second. What happens is that people tend to just go with gut instinct and then give themselves a high confidence. They say, "It's an eight and I'm pretty convinced, so it's eight for confidence." And I found this a bit disturbing 'cause it kind of subverts the whole system, so I wanted to help people realize when they have strong evidence in support of their guesses and when it's weak evidence, how to calculate confidence in a sense. And for that I created a tool called the, the confidence meter, which you can see here, this colorful thing. Uh, should I go on explaining it?
- LRLenny Rachitsky
Yeah, let's do it and then again, if you're, uh, just listening to this you can check this out on YouTube and you can see the actual slide.
- IGItamar Gilad
All right, awesome. So basically I constructed it a bit like a thermometer. It goes from very low confidence, which is the blue area on the upper right, all the way to high confidence, which is the red area. And you can see the numbers going from zero to 10, uh, where zero is very low confidence, we don't know basically anything, we're just guessing in the dark, and 10 is full confidence, you know for sure this thing is a success, no doubt about it. And across the, the circle I put various classes of evidence you might find along the way. So for example, starting at the top right, all of this blue area is about opinions. It could be your own self-confidence in the idea, your self-conviction, you feel that it's a great idea. Guess what? Behind every terrible idea that was ever implemented, someone thought it was great. That gives you 0.01 out of 10. Maybe you created a shiny pitch deck or a six-page document that explains in detail why this is a great idea, slightly harder to do but still very low confidence. Maybe you connected it to some theme, you know, it's about the blockchain. Well, sorry, the blockchain is out of fashion, what's hot right now?
- LRLenny Rachitsky
AI.
- IGItamar Gilad
Exactly, AI. It's about AI. That makes it a good idea? Absolutely not. Or the strategy of the company, that's another thematic support. Thousands and thousands of terrible ideas are being implemented right now as we speak, based on these themes. So all those things combined can give you a maximum of 0.1 out of 10 according to the tool, if you, if you follow it.
- LRLenny Rachitsky
(laughs)
- IGItamar Gilad
Then we move into slightly harder tests. One is reviewing the idea with your colleagues, your managers, your stakeholders. They don't know it either, they don't have a crystal ball, they're usually not the users, they cannot predict, but they can evaluate it in a slightly more objective way and maybe find flaws in your idea. On the other hand, uh, groups tend to have biases too, politics, groupthink. So groups can actually arrive sometimes at worse decisions than individuals. There is some research on that. Next are estimates and plans, so you may do some sort of back-of-an-envelope calculation or your colleagues might go out and try to evaluate the ease a little bit better. That gives you a little bit more confidence but still, we're at- we're at the level of guesswork at this point. Next we're moving to data, and data could be anecdotal, so you find a few data points dotted across your data, or you talk to a handful of customers, or maybe one competitor has that same idea. In many companies I meet, if the leading competitor has this feature and we think it's a good idea, validation is done, let's launch it, that's it. It's a great idea, we, we need to do it. Never works, honestly. You should not assume that your competitor actually knows what they're doing any more than you do.
- LRLenny Rachitsky
(laughs)
- IGItamar Gilad
Data could be also what I call market data that comes from surveys, from, uh, assessing a lot of, of your data by doing a deep competitive analysis and there are other methods where you create a larger dataset and you contrast your idea against it. Finally, to gain medium and high confidence, you really need to build your idea and test it and that's where the red area is. So there's various forms of tests, we will talk about them i- if we have time and they give you various levels of confidence.
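The meter can be read as a lookup from the strongest class of evidence you hold to a 0-10 confidence score. In this sketch, only the 0.01 (self-conviction) and 0.1 (opinions combined) values come from the conversation; the scores for the higher rungs are placeholders I've invented to show the shape of the tool:

```python
# A sketch of the confidence meter: map evidence classes to scores and
# take the strongest class you actually hold. Only 0.01 and 0.1 are
# quoted in the conversation; the rest are illustrative placeholders.

CONFIDENCE_METER = [
    ("self conviction",       0.01),
    ("pitch deck / thematic", 0.1),
    ("reviews and estimates", 0.5),   # placeholder
    ("anecdotal data",        1.0),   # placeholder
    ("market data",           3.0),   # placeholder
    ("test results",          6.5),   # placeholder
    ("launch results",        10.0),  # placeholder
]

def confidence(evidence_held):
    """Return the score of the strongest evidence class we actually have."""
    scores = [score for name, score in CONFIDENCE_METER if name in evidence_held]
    return max(scores, default=0.0)

print(confidence({"self conviction", "pitch deck / thematic"}))  # prints 0.1
```

The design point is that opinions cap out near zero no matter how many you stack; only tests and launches move you into the high-confidence range.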
- LRLenny Rachitsky
Awesome. This is a very cool visual. We'll link to an image of this in the show notes too if people want to check it out. I think what's awesome about this is you could just use this as a little tool on your team, of just like, "Where are we along the spectrum?" Like, "We think the impact of this is very high but we're probably in this blue area of confidence, and so let's just make sure we understand that." And it's really clear language to help people understand: "I see, if we had this, we'd be a lot more confident."
- IGItamar Gilad
So you can also tie your investment into the idea based on the level of, uh, confidence you have found, essentially. Uh, so early on you want to do the cheap stuff just to get more confidence, and then you can go and invest more. If it's a really cheap idea, you can jump straight to a high-confidence test. You can do an A/B experiment, early adopter program, whatever it is, and then launch it. Some ideas you don't need to test. Sometimes expert opinion is enough. If you're just changing the order of the settings, no one sees this or no one will, will be impacted, the, the risk is low, you can launch it without testing. So part of the trick is also knowing when to stop, not just trying to force your way all the way up when you don't have to.
- LRLenny Rachitsky
That's a really important point. The other important point here is just a big part of a PM's job is to say no and to stop stupid shit from happening, and this is an awesome tool to help you do that. To be like okay, here's this idea you have, just like let's just be real, how confident are we with this? And okay, it's going to take us three months to do this, maybe, maybe we should think about something different, maybe we should work up the confidence meter before we actually commit to this.
- IGItamar Gilad
Yeah. This, this is a real world usage that I hear about a lot.
- LRLenny Rachitsky
Mm-hmm.
- IGItamar Gilad
When people use this to kind of do a, an objective way to say no and gently, or to say we will think about it but look at these other ideas we have and how their impact and ease and confidence stack up.
- LRLenny Rachitsky
Classic PM move, just like that was a great idea but what about this better idea? Coming back to something that we talked a bit about at the beginning, say you, say you have a founder who's like actually very smart and experienced, say even at a startup where you don't really have the time to build
- 44:28 – 46:14
Speed of delivery vs. speed of discovery
- LRLenny Rachitsky
tons of evidence for ideas, do you have a different perspective on how much time to spend building confidence in ideas versus just like, "Cool, they're, they actually have really good ideas, let's just see what happens?"
- IGItamar Gilad
So there's always like a trade-off between speed of delivery and speed of, uh, discovery, and, and that actually leads to the next layer of how do we combine the two? Because people tend to think it's an either/or. Either we are building very fast, or we're learning and then we're building very slow, but I think we're using the wrong metric. The metric is not how fast can we get the bits into production. When there's a lot of uncertainty, and we all face uncertainty, startups especially, it's not about getting the bits to production, it's about getting the right bits to production. It's about creating the outcomes that you need, the impact, and so it's about time to outcomes. And I would argue that the evidence-guided method is far more impactful, is far faster, is far more resource-efficient than the opinion-based method, because opinion-based methods tend to waste a lot more of your resources building the wrong things, or discovering, learning too late, while evidence-guided helps you learn earlier. Plus it is a fallacy that if you learn, you don't build. Good teams know how to do both at the same time, and that's actually what the steps layer is meant to, to teach you or to help you do.
- LRLenny Rachitsky
Awesome. So maybe just to close off that loop, say someone listening is at a bigger company, say Netflix versus a series A, series B startup, is
- 46:14 – 49:09
How to apply Itamar’s frameworks based on company type and stage
- LRLenny Rachitsky
there something you'd recommend about them approaching this differently? Any kind of guidance there of just how to take what you're sharing differently if you're at different sorts of companies like that?
- IGItamar Gilad
Absolutely. I, I think the, the concept we talked about of the North Star metric, the value created, uh, versus the value captured, is very important in every company. Building your entire metrics trees may be overkill, doing heavyweight OKRs may be overkill for, for early stage. Early stage companies don't even know how they create value, so they need to iterate, and their goal is really to find product market fit. Beyond that, what happens is that you need to start building your business model, so that's your goal and you iterate towards that and you need to put metrics on that. And then when you move into scale you need to try to create order, because when you scale up... And all this is covered in the book, there's a special chapter just about these questions. When you scale up you get a lot of people and a lot of money and, uh, everything is happening at the same time, so there you need order, evaluating ideas in a very systematic way. In a company like Netflix... By the way, I don't know if they need this specific method. They're a very efficac-
- LRLenny Rachitsky
(laughs) Yeah, maybe that was a bad example. They're probably doing things pretty well.
- IGItamar Gilad
One thing I discovered, by the way: there's two types of companies that really benefit from this technique. One is those companies that are kind of emerging into modern product development. They have product teams, they have product managers, they have OKRs, they're starting to do agile... They're starting to do experimentation, but they're, they're struggling to put it all together. Every CPO is building their own little framework. And the other type is those companies that used to be evidence-guided and they regressed, and that happens way too often. Change of management, change of culture, and then all of a sudden they need to rediscover, to rekindle that spirit that was lost, uh, a la Google Plus. So some of the people that actually respond to this strongest are actually, surprisingly, in these companies.
- LRLenny Rachitsky
Mm-hmm. What I love about your frameworks and kind of all these things we're talking about, is these are just kind of a... You can almost think of them as a grab bag set of tools to make you more evidence guided as a company. You could start with thinking about the confidence meter, you could start using ICE more, you could start using the metrics tree and all these things just move- push you closer and closer to being more evidence guided. You don't have to adopt this whole thing all at once.
- IGItamar Gilad
Absolutely. I, I would recommend that you don't try because if the transformation is way too big, you will get fatigued and you will just create a lot of, uh, process for a lot of people and you will not see the results and after a quarter you will give up. So exactly what you suggested is the right approach.
- LRLenny Rachitsky
What would be the first thing you'd suggest if people were trying to move closer to being less opinion oriented and more evidence based? Which of these frameworks are-
- IGItamar Gilad
... models would you recommend first? I recommend that they discuss internally
- 49:09 – 50:21
First steps in becoming more evidence-guided
- IGItamar Gilad
where is the biggest problem that they're facing. If the, the goals are unclear, there's misalignment, we, we keep chasing the wrong things, start at the goals layer. Try to establish your North Star metric, your top business metric, your metrics trees, start assigning teams with their own area of responsibility. If you're spending a lot of time in debates and you're constantly fighting and changing your mind, start with the ideas layer and establish, uh, ICE or whatever prioritization model you like, but involve evidence in it. I think the confidence meter is, is a good tool to use irrespective. If you're building too much and you're not learning enough, start adopting the steps layer, which we haven't seen yet, and if your team is very disengaged, you have one of these teams where the developers are very into Agile, very into quality, very into launching things, start working on the tasks layer.
- LRLenny Rachitsky
Awesome. Okay, let's keep going.
- IGItamar Gilad
All right, so steps. Steps are about kind of helping us learn and build at the same time, as we said, and
- 50:21 – 55:41
Next steps in testing
- IGItamar Gilad
one of the patterns I see is that organizations don't know that they can actually learn at a much lower cost. They believe they need to build this elaborate MVP, which is not minimal in any way, and then launch it and then they will discover, and basically it's what we used to call beta 20 years ago, just with a different name. What I'm trying to do here at the steps layer is to help companies realize there's a gamut of ways to validate your ideas, or more specifically to validate the assumptions in your idea, and I created a little model for this. It's called AFTER: assessment, fact-finding, uh, tests, experiments and release results. But again, it's, it's just putting together things that much smarter people invented. So, uh, in assessment you have very easy things, things that don't require a lot of work. You check if this idea that you have in your hand aligns with the goals, you do maybe some business modeling, you do ICE analysis, you do assumption mapping, which is a great tool by David J. Bland, or you talk to your stakeholders one-on-one just to see if there are any risks, et cetera. These are usually not expensive things and they can teach you an awful lot about the impact and the ease of your idea. The next step is to dig into data, and usually that goes hand-in-hand with this, so you can find data in your data analysis, through surveys, through competitive analysis, through user interviews and through field research observing your users. Obviously these last two are pretty expensive, so it's often good not to wait until you have the idea and then start doing your research. It's best to keep doing your research ongoing, and then you have some sort of data to rely on and to compare your idea against. But until now we didn't build anything. Now you're ready to start testing, building versions of the product and putting them in front of users and measuring the results.
But initially you don't build anything, you fake it. You do a fake door test, you do smoke tests, Wizard of Oz tests, uh, concierge tests, usability tests. We used a lot of those in the tabbed inbox, by the way. One of the first early versions was actually, we showed the tabbed inbox working to people but it wasn't really Gmail. It was just a facade of HTML, and behind the scenes, and according to the permissions that the users gave us, some of us moved just the subject and the sender into the right place. So initially the interviewer kind of distracted them and then they showed them their inbox, and in it, the top 50 messages were sorted to the right place, more or less, if we got it right, and people were like, "Wow, this is actually very cool." And that gave us a lot of evidence.
- LRLenny Rachitsky
That's an awesome story. So that was in the user research, it wasn't like rolled out to people, it was a manual, in- individual?
- IGItamar Gilad
The, there wasn't a single line of code written, this was just cooked up by our u- user researchers-
- LRLenny Rachitsky
That's awesome.
- IGItamar Gilad
... and our designers. Yeah. Um, but it gave us some evidence to go and say, "Hey, we should try and build this thing."
- LRLenny Rachitsky
Love that.
- IGItamar Gilad
So initially you fake it. Mid-level tests are about building a rough version of it. It's, it's not complete, it's not polished, it's not scalable, but it's good enough to give to users to start using, so those are early adopter programs, alphas, longitudinal user studies and fishfood. Fishfood is testing on your own team.
- LRLenny Rachitsky
Fishfood? I haven't heard that term before.
- IGItamar Gilad
So it's dogfooding but more, uh, more local to your team. I think it's a Googly thing, but some people told me that they use fishfood as well in their company, the, the name. So I'm using it, I don't know if there's a better name for it.
- LRLenny Rachitsky
I wonder why it's called fishfood, because it's like little, it's like little gentle, little clicks?
- IGItamar Gilad
It could be, yeah. I don't know. (laughs)
- LRLenny Rachitsky
Wow, okay. Super cool. I'm learning a lot here.
- IGItamar Gilad
So the next stage is to actually build a kind of more complete version of this, and then you can dogfood it, then you can give this to your, to your users, uh, internally. When I joined Microsoft many years ago, the first thing I noticed was that Outlook was very buggy, and I asked people, "What's going on?" And they told me, "We are all dogfooding the next version of Outlook that hasn't come out yet." And that's a very common practice in Silicon Valley. Uh, you can do previews, you can do betas, you can do labs. So those are tests. Now, there's a special class of tests which are experiments, because they have a control element, so A/B tests, multivariate tests, those are all experiments. I'm using the word experiment the way data scientists use it, although people tend to call everything that you see here an experiment. And finally, even at the release, you can do a staged release, you can do percent launches, you can do holdbacks. All of these things help you further validate your assumptions. Sometimes you need to roll back and change things, but it's another opportunity to learn. So the key point is you don't have to start at the right-hand side, which is expensive. You can start early on, and that leads to parking a lot of ideas very quickly. You realize they are not as good as you thought, and then you can invest more effort into the good ideas. If they generate positive evidence you can go further and further until that point where you feel you're ready for delivery.
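The left-to-right logic of AFTER can be sketched as a simple loop: run the cheapest validation first and park the idea as soon as the evidence turns negative. The stage names follow the acronym as described; the example idea and the pass/fail function are hypothetical:

```python
# Walk the AFTER gamut from cheap to expensive, stopping early on
# negative evidence. run_stage is any callable that returns True when
# a stage produced positive evidence for the idea.

AFTER_STAGES = ["assessment", "fact-finding", "tests",
                "experiments", "release results"]

def validate(idea, run_stage):
    for stage in AFTER_STAGES:
        if not run_stage(idea, stage):
            return f"parked at {stage}"
    return "ready for delivery"

# Hypothetical idea whose fake-door test (inside the 'tests' stage) fails:
outcome = validate("onboarding wizard", lambda idea, stage: stage != "tests")
print(outcome)  # prints: parked at tests
```

Because the cheap stages come first, most ideas get parked before a line of production code is written, and only ideas that keep generating positive evidence earn the expensive stages on the right.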
- LRLenny Rachitsky
Okay. So we've talked about goals, we've talked about ideas, we're talking about steps here. Is there anything else along steps? And then next thing I know comes tasks.
- IGItamar Gilad
No. Th- this is it for s- for step. There's a lot more but, uh, we will not
- 55:41 – 1:02:54
The task layer in the GIST framework
- IGItamar Gilad
go into all of it-
- LRLenny Rachitsky
Okay.
- IGItamar Gilad
... here.
- LRLenny Rachitsky
That sounds good. Let's talk about tasks and what you mean there.
- IGItamar Gilad
All right, awesome. So in, in many organizations there's these two worlds. There's the planning world, where basically you have, you know, the managers, the stakeholders, some of the PMs really sit and think about what we need to launch, and that's where we create the strategies and the roadmaps and the projects. But guess who is not invited to the party? The people who are actually doing the work. They live in Agile world, they're, they're very focused on moving tickets to the done state, on burning down story points, you know, pushing stuff into production, and there's a big gap between these two worlds. They don't understand each other, they don't see eye to eye. There's a lot of mistrust built sometimes against the, the plans, or the managers feel that the teams are just not being very effective. We- we've seen all this, and the solution, kind of the stopgap, is to put a PM in the middle. The PM is supposed to make all of this work, deliver on the roadmap like a project manager, fill the Agile machine with perfectly prioritized product backlogs and stories, and it just doesn't work, honestly. And the PMs I meet are very tired, and they have to spend so much time in planning and roadmap discussions and they're very busy, they don't have time to do research or to, to test ideas. So I suggest changing this and bringing the developers a little bit out of their kind of Agile cage, if you like. And no disrespect to Agile, it's a great thing, but let's let them do more than just develop. Let's let them discover as well. And the... One of the tools I suggest, and again this is a process, is what I call the gist board. So it's basically the top three layers of GIST. The goals are on the right, these are just the key results, and usually per team I suggest not more than four. So you create a gist board per team.
Then the ideas we're working on right now, sometimes with their ICE scores, and then the next few steps that we might want to pursue in order to validate these ideas. And this is a very dynamic thing, it changes all the time. The, the team leads need to update it, and the team needs to meet around it, uh, at least once every other week to sync, to talk about what's going on. Are we still following the right ideas? How are we doing on the, on the goals? What are the next steps? What's blocking us from completing the most important steps? And this is a discussion that is not happening today, 'cause most of the discussion happens at the roadmap level and then there's a lot of discussion at the task level, but this middle layer of what actually are we trying to achieve and how well are we doing on it doesn't exist. If you do have this, uh, you create a lot more context in the minds of your team and then they need to ask you fewer questions. You need to tell them less what to do. They know what success is and they are able to actually do a lot more on their own.
- LRLenny Rachitsky
Is the way to think about the gist board as the way you should be roadmapping, or is this more of a strategy framework to think about where you should be prioritizing broadly?
- IGItamar Gilad
The way I see this is at the beginning of the quarter, the team defines its goals. The, the leads of the team define the goals but they review it with the team, they review it with the managers of course, with the stakeholders, everyone's in agreement these are the maximum four key results and the one or two objectives you guys need to work on. Teams cannot deliver on more than that. You copy these key results into the gist board, then you start looking at your idea bank or you start generating ideas and say, "How can we achieve these key results?"
- LRLenny Rachitsky
And to clarify, the thing you copy is the key result as the goal?
- IGItamar Gilad
Yes, exactly. You can write the objectives, uh, alongside that if you... To remind people what we are trying to achieve, but the key results are the thing we show here. Then you pick some ideas, the ones that look most promising, and as counterintuitive as this sounds, I would recommend that you let the team pick these ideas. The manager, the stakeholders can propose the ideas, everyone can propose, but the team should use the ICE process, and especially the product manager is very important here, to choose which ideas to test first. And then the team together needs to decide which steps should we run. How can we validate this? Some of the steps will be done by the PM, some by the data analyst, some by the user researcher, but some will involve the team. There'll be some coding, there'll be some running of experiments, and so there's some ownership around the steps. A sub-team owns each one of these steps, and we will change the board very actively. So if an idea turns out to be bad we will take it off the board and put another idea in its place, or maybe we achieve the goal, we don't need to work on this anymore, we can focus on something else. So it's a project management tool in a sense.
- LRLenny Rachitsky
Awesome. And as I'm looking at a- at it, and I think maybe the most important piece of this is that steps aren't just like a project, like, uh, launch a better onboarding or add this step to onboarding. It's, you want to emphasize the steps that you're going to take to get to more and more confidence, essentially, and more and more, uh, evidence-guided thinking, versus just, well, let's figure out how to launch this feature idea.
- IGItamar Gilad
Exactly. It- it's not an engineering milestone or a design milestone, it's a learning milestone. So we build something and along the way we actually... grow the scope of what we build. We, we are building the product i- in the process, uh, and we learn. So the two have to come hand-in-hand.
- LRLenny Rachitsky
And just to give, for folks that aren't watching this on YouTube (laughs), just to walk through one example, do it real quick. So one of your goals here is average onboarding time, you want your goal to be the average onboarding time less than two days, currently five-and-a-half days. An idea there is an onboarding wizard, and then the steps are a usability test with mock-ups, and then a usability test as a prototype, and then an A/B test.
- IGItamar Gilad
Yeah, basically. Uh, and you can alter this as you go along. Sometimes you can run multiple steps in parallel, it's not always sequential. But that's basically the process, yeah.
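The walkthrough above (goal: average onboarding time under two days; idea: onboarding wizard; steps: mock-up usability test, prototype usability test, A/B test) can be sketched as a small data structure. The shapes, field names, and statuses here are illustrative assumptions, not a representation from the book:

```python
# A hypothetical sketch of one column of the GIST-board example discussed
# above. Steps are learning milestones with a status and, once done, the
# evidence they produced. All names and the example evidence are made up.
from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    PLANNED = "planned"
    RUNNING = "running"
    DONE = "done"

@dataclass
class Step:
    name: str
    status: Status = Status.PLANNED
    evidence: str = ""  # what we learned when the step finished

@dataclass
class IdeaCard:
    name: str
    steps: list  # ordered validation steps (they can also run in parallel)

goal = "Average onboarding time < 2 days (currently 5.5 days)"
wizard = IdeaCard("Onboarding wizard", steps=[
    Step("Usability test with mock-ups", Status.DONE, "4/5 users completed setup"),
    Step("Usability test with prototype", Status.RUNNING),
    Step("A/B test", Status.PLANNED),
])

# The board changes actively: if a step fails, the idea comes off the
# board; otherwise the remaining steps tell you what to validate next.
remaining = [s.name for s in wizard.steps if s.status is not Status.DONE]
print(remaining)
```

Each completed step either raises confidence (keep going, widen the build) or kills the idea, which is what makes this a project-management view of learning rather than of delivery.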
- LRLenny Rachitsky
Awesome. So kind of like what, again, what you're trying to emphasize here as a team is just, we're not just gonna launch this onboarding wizard and we're not gonna figure it out later. It's like let's be upfront about the steps we're gonna take to build more and more confidence this is something we should keep investing more and more in, which is really interesting.
- IGItamar Gilad
Yeah. And something, another interesting thing that happens, every time you run a step if it's successful you have evidence and you can go back to the managers and tell them and share. And say, "You know, with this idea we thought it was great but we got this result. What do you think that means?" And sometimes that manager that proposed it would say, "You know, I think the, the test failed, let's rerun it." Or sometimes they will say, "You know, maybe it's not as strong as, as I thought." The discussion just becomes that much more nuanced and objective, if you like.
- 1:02:54 – 1:04:56
Thoughts on roadmapping
- LRLenny Rachitsky
Maybe just to close out this framework, how does this relate to a roadmap that they may have in a spreadsheet or in Jira or in Asana or something like that? Does this sit on top of that? Is this replacing a roadmap somewhere else?
- IGItamar Gilad
Uh, I would say that release roadmaps where you're just saying, "By Q3 we, we want to la- launch this." Or, "By October we have to launch that." They're kind of competing with this. If you're doing that and people know that the goal is to launch that thing by October, forget about learning.
- LRLenny Rachitsky
Mm-hmm.
- IGItamar Gilad
For- forget about evidence-guided. I recommend using outcome roadmaps saying, "By October we want to achieve this outcome. By Q4 we want to launch in three other countries." Or, "We want to grow our usage in India by that much. By this time, we need to tackle the problem of churn." And how we achieve this, sometimes we know, we have a concrete idea that is high confidence that we already tested, we switch into delivery, then we can put it on the roadmap and say, "Yeah, we're going to build this thing and we'll aim for October." But otherwise, you want to keep it open, and roadmaps can kind of suffocate this process if you decide upfront, with low confidence, that this particular idea must be launched.
- LRLenny Rachitsky
Okay, so you're proposing people switch their roadmapping practice to this, which is very ambitious. I love it.
- IGItamar Gilad
Well, this is not a, a roadmap, this is just a tool for the team to manage the product, uh, the project. Uh, but I have a proposal for outcome roadmaps inside the, in the book.
- LRLenny Rachitsky
Oh, okay. Awesome. Okay. So I was gonna ask, if people wanted to try this approach, the book is the best way to fully understand the, the framework and how to implement it?
- IGItamar Gilad
That's one way. I, I have articles, I have resources on my site, um, but I try to condense much of what we just discussed and much more, a lot more nuance in the book. So-
- LRLenny Rachitsky
Great.
- IGItamar Gilad
... if you're interested in that, I would give it a go.
- 1:04:56 – 1:07:11
How OKRs fit into the whole picture
- LRLenny Rachitsky
Awesome. Maybe just on the topic of OKRs real quick, how do OKRs connect to all this? It sounds like broadly you kind of assume people keep working on here's our metric, our key results, our objectives, and then that plugs into this kind of gist framework.
- IGItamar Gilad
So the metrics trees, plus your mission, plus the individual missions of the teams give you most of what you need to populate your OKRs. There's, uh, of course, a process of alignment top down, bottom up, side to side, which I talk a little bit about as well. OKRs are a very rich topic, but those things are usually the core. There are usually some other OKRs that are about the health of the company, the health of the product, et cetera. Those are called supplementary OKRs; I talk about those as well. So yeah, I think OKRs are, are a helpful tool if you like them.
- LRLenny Rachitsky
And just zooming out again, basically you don't need to take all of these ideas and lump them all together and change the way you work as a business; you can start with picking some of these ideas and starting to become more and more evidence-guided. It sounds like this GIST board isn't where you probably want to start, but maybe it's once you have more and more experience using some of these tools? Or you tell me, do you sometimes go straight to this way of thinking about the roadmap and the plan?
- IGItamar Gilad
So, uh, it might not be the full board because you're missing some of the pieces, maybe your goals are not as good or your idea prioritization isn't as good. But if your team is very, very delivery focused (clears throat) and sometimes it's also the opposite, your, your, the managers are telling them what to build and, uh, and you want to break this kind of dynamic, you want to create a step backlog. So instead of a product backlog, let's create a (clears throat) backlog, uh, of steps which are just validation steps, betas and previews, et cetera. And that changes the dynamic pretty strongly.
- LRLenny Rachitsky
So by the time this podcast comes out, the book will be out. What is the best place to find the book?
- IGItamar Gilad
Hopefully on Amazon, you can search for it. You can go to my site, itamargilad.com, and it will be presented, uh, prominently there, and there's also the book landing, uh, page where you'll find everything you need to know about the book, evidenceguided.com.
- LRLenny Rachitsky
Well, with that we've reached our very exciting lightning round. Are you ready?
- IGItamar Gilad
Yes, let's go.
- 1:07:11 – 1:12:51
Lightning round
- LRLenny Rachitsky
What are two or three books you've recommended most to other people?
- IGItamar Gilad
So I'm going to cheat, I'm going to recommend a series of books, so two series.
- IGItamar Gilad
Uh, one is-
- LRLenny Rachitsky
Cheating is allowed.
- IGItamar Gilad
All right, cool. Uh, one... And, and there's... Those are obvious ones. One is the series published by SVPG, Silicon Valley Pro- P- Product Group. So Inspired, Empowered. Now I think Transformed has come out. I haven't read it yet, but I'm sure it's amazing. So this is Marty Cagan and his colleagues. They're... They write some tremendous books and every product manager should read them. The other series is a bit older. This is the Lean series: The Lean Startup, Lean Enterprise, Lean Analytics. There's gold in all these books. Lean UX. Really, really important books, and I think they're not as appreciated as they should be. Running Lean, that's another example.
- LRLenny Rachitsky
What is a favorite recent movie or TV show?
- IGItamar Gilad
I'm not really a big TV or movie buff. I, I just put on whatever comes up. Uh, I'd... I'm discovering that YouTube is actually becoming one of my sources of, uh, information and entertainment. I'm learning a lot of Spanish recently, so I discovered this channel called Dreaming Spanish which is... If you're learning Spanish, it's incredible. So that's my recommendation.
- LRLenny Rachitsky
That's a unique choice. I love it. Favorite interview question you like to ask candidates?
- IGItamar Gilad
I like to ask them to design something for a niche audience. So a navigation system for el- elderly people or, um, some sort of laptop for people with, uh, vision impairment, et cetera. So those are good questions to see their customer empathy, their creativity, their ability to evaluate multiple ideas, their ability to f- to find flaws in their own ideas. So, uh, there's a lots of room to dig in there and kind of see how this person is thinking as a product person.
- LRLenny Rachitsky
What is a favorite product you recently discovered that you love?
- IGItamar Gilad
It's a cliché but it's AI. There's a company called ElevenLabs-
- LRLenny Rachitsky
Mm-hmm.
- IGItamar Gilad
... uh, that do voices and v- like the best voices, synthetic voices you heard. But they can also replicate your own voice, so you can create a voice signature. If you're American, you can use their kind of default, uh, free version or cheap version to replicate your own voice and that could be pretty useful if you need to, I don't know, narrate an audiobook or do some online course. Uh, so I'm, I'm finding this service very interesting.
- LRLenny Rachitsky
This is all part of my big retirement plan, find all these components together that can replace me eventually. Got AI generating content, we'll have this voice thing, I love it. It's all happening.
- IGItamar Gilad
Yeah, yeah. There's, there's an AI version of you, right? I can ask you questions now with a-
- LRLenny Rachitsky
Oh, there is. Lennybot.com.
- IGItamar Gilad
Right.
- LRLenny Rachitsky
It's all part of the plan.
- IGItamar Gilad
Cool.
- LRLenny Rachitsky
Okay. What is a favorite life motto that you repeat most to yourself, that you share with others?
- IGItamar Gilad
That's a big one. Albert Einstein I think said, um, "Strive not to be a success but to be of value." And I think that's a great motto for people and for companies. Uh, it's something that kind of guides me and this whole concept of the value exchange, et cetera, is, is kind of loosely connected to that.
- LRLenny Rachitsky
I love that. That's such an important point for people putting out content online. So many people are just like, "I just want to be successful, get followers. Here's all these things I'm tweeting and showing," and the thing that actually works is deliver value, create valuable stuff that people really value and want. And I find the signal for that is you find it interesting and valuable, like if you're like, "Oh, wow. That's really interesting," oftentimes other people are going to find it interesting. So that's... I love that. Great choice. I'm going to look that one up. Two more questions. What's the most valuable lesson you learned from your mom or your dad?
- IGItamar Gilad
Uh, I think both of them in their own way. They had relatively modest jobs, you know, teaching or doing other things. Uh, but they always strived, again, to be the best they can and to deliver the most value they can, so it's very connected somehow.
- LRLenny Rachitsky
Mm-hmm.
- IGItamar Gilad
Uh, maybe I'm, I'm seeing the world through this lens.
- LRLenny Rachitsky
Mm-hmm.
- IGItamar Gilad
But, uh, they kind of taught me to strive to be the best I can at what I do.
- LRLenny Rachitsky
Final question. You're Israeli, for folks that can't tell. What is your favorite Israeli food that people should definitely check out or try to get whenever they can?
- IGItamar Gilad
Ooh. When I arrive in Israel I usually go for shawarma, which is like doner kebab if you know it, just better. Uh, so if you're in Israel, if you go visit Haifa which is the city where I grew up, definitely check out the shawarma.
- LRLenny Rachitsky
Awesome. Itamar, I hope people got the gist of your book from our conversation. What's the best way to find it? What's the best way to learn about you and reach out if they want to ask you any questions? And then also, how can listeners be useful to you?
- IGItamar Gilad
To find it, uh, you can go to itamargilad.com or to evidenceguided.com and you will find the book and you'll find me. Uh, best value to me? Try it out, just take some of these ideas, bring them back to your office, talk with your colleagues. Say, "What do you think we should do about this?" Just give it a go and reach back to me, tell me. I- I'm easy to find on my website. Tell me what happened. I'm, I'm really interested.
- LRLenny Rachitsky
Amazing. Itamar, thank you again so much for being here.
Episode duration: 1:12:51
Transcript of episode aJWSn-tz3jQ