Lenny's Podcast | Nicole Forsgren: How AI moves the bottleneck to code review
How AI accelerates coding but the bottleneck moves to review and trust; DORA creator anchors productivity on flow state, cognitive load, and feedback loops.
EVERY SPOKEN WORD
135 min read · 27,183 words
- 0:00 – 5:09
Introduction to Nicole Forsgren
- Lenny Rachitsky
A lot of companies are trying to measure productivity for their teams.
- Nicole Forsgren
Most productivity metrics are a lie. If the goal is more lines of code, I can prompt something to write the longest piece of code ever. It's just too easy to game that system.
- Lenny Rachitsky
How do I know if my eng team is moving fast enough, if they can move faster, if they're just not performing as well as they can?
- Nicole Forsgren
Most teams can move faster, but faster for what? We can ship trash faster every single day. We need strategy and really smart decisions to know what to ship.
- Lenny Rachitsky
One of the biggest issues we're gonna probably have with AI is learning how much to trust code that it generates.
- Nicole Forsgren
We can't just put in a command and get something back and accept it. We really need to evaluate it, you know, are we seeing hallucinations? What's the reliability? Does it meet the style that we would typically write?
- Lenny Rachitsky
So much of the time is now gonna be spent reviewing code versus writing code.
- Nicole Forsgren
There's some real opportunity there to not just rethink workflows, but rethink how we structure our days and how we structure our work. Now, we can also make a 45-minute work block useful, because getting into the flow is actually kind of handed off, at least i- in part to the machine, or the machine can help us get back into the flow by reminding us of context and generating diagrams of the system.
- Lenny Rachitsky
What's just, like, one thing that you think an eng team, a product team can do this week, next week to get more done?
- Nicole Forsgren
Honestly, I think the best thing you can do...
- Lenny Rachitsky
Today, my guest is Nicole Forsgren. With so much talk about how AI is increasing developer productivity, more and more people are asking, "How do we measure this productivity gain, and are these AI tools actually helping us or hurting how our developers work?" Nicole has been at the forefront of this space longer than anyone. She created the most used frameworks for measuring developer experience, called DORA and SPACE. She wrote the most important book in the space, called Accelerate, and is about to publish her newest book, called Frictionless, which gives you a guide to helping your team move faster and do more in this emerging AI world. Her core thesis is that AI indeed accelerates coding, but developers aren't speeding up as much as you think, because they still have to deal with broken builds and unreliable tools and processes, and a bunch of new bottlenecks that are emerging. In our conversation, we chat about her current best and very specific advice for how to measure productivity gains from AI, signs that your team could be moving faster, what companies get wrong when trying to measure engineering productivity, how AI tools are both helping and hurting engineers, including getting into flow states, her seven-step process for setting up a developer experience team at your company, how to get buy-in and measure the impact of a team like this, and a ton more. This episode is for anyone looking to improve the performance of their engineering teams. If you enjoy this podcast, don't forget to subscribe and follow it in your favorite podcasting app or YouTube. It helps tremendously. Also, if you become an annual subscriber of my newsletter, you get a year free of 15 incredible products, including Lovable, Replit, Bolt, n8n, Linear, Superhuman, Descript, Wispr Flow, Gamma, Perplexity, Warp, Granola, Magic Patterns, Raycast, ChatPRD, and Mobbin. Head on over to lennysnewsletter.com and click Product Pass. With that, I bring you Nicole Forsgren.
This episode is brought to you by Mercury. I've been banking with Mercury for years, and honestly, I can't imagine banking any other way at this point. I switched from Chase and holy moly, what a difference. Sending wires, tracking spend, giving people on my team access to move money around, so freaking easy. Where most traditional banking websites and apps are clunky and hard to use, Mercury is meticulously designed to be an intuitive and simple experience. And Mercury brings all the ways that you use money into a single product, including credit cards, invoicing, bill pay, reimbursements for your teammates, and capital. Whether you're a funded tech startup looking for ways to pay contractors and earn yield on your idle cash, or an agency that needs to invoice customers and keep them current, or an e-commerce brand that needs to stay on top of cash flow and access capital, Mercury can be tailored to help your business perform at its highest level. See what over 200,000 entrepreneurs love about Mercury. Visit mercury.com to apply online in 10 minutes. Mercury is a fintech, not a bank. Banking services provided through Mercury's FDIC-insured partner banks. For more details, check out the show notes. Here's a puzzle for you. What do OpenAI, Cursor, Perplexity, Vercel, Plaid, and hundreds of other winning companies have in common? The answer is they're all powered by today's sponsor, WorkOS. If you're building software for enterprises, you've probably felt the pain of integrating single sign-on, SCIM, RBAC, audit logs, and other features required by big customers. WorkOS turns those deal blockers into drop-in APIs with a modern developer platform built specifically for B2B SaaS. Whether you're a seed stage startup trying to land your first enterprise customer or a unicorn expanding globally, WorkOS is the fastest path to becoming enterprise ready and unlocking growth. They're essentially Stripe for enterprise features.
Visit workos.com to get started or just hit up their Slack support where they have real engineers in there who answer your questions super fast. WorkOS allows you to build like the best with delightful APIs, comprehensive docs, and a smooth developer experience. Go to workos.com to make your app enterprise ready
- 5:09 – 8:33
The concept of developer experience (DevEx)
- Lenny Rachitsky
today. Nicole, thank you so much for being here and welcome to the podcast.
- Nicole Forsgren
Thank you. It's so good to be here.
- Lenny Rachitsky
It's so good to have you back. I was just watching our first episode, which we did two and a half years ago. I was watching it, and I was both shocked and not shocked that we barely talked about AI. Uh, the episode was called How to Measure and Improve Developer Productivity and we got to AI barely, like an hour in, and we're just like, "Hmm, I wonder what's gonna happen with AI in productivity?" Does that just blow your mind?
- Nicole Forsgren
Yeah, because, I mean, it was just hitting the scene. It was the topic of so much conversation, and at the same time, so many things don't change, right? So many things are still important. So many things are the same. Um, yeah, it's also a little wild that it's been two and a half years. Where does, where does time go? Time is a social construct. (laughs)
- Lenny Rachitsky
(laughs) Yeah, it was like most of our conversation was just, like, questions. Like, "Well, how might this impact people? How will we change... the way we build product?" And now, it basically was not, it was barely a thing back then. Now, it's the only thing that I imagine people wanna talk about when they talk about engineering productivity. That's where-
- Nicole Forsgren
Oh, yeah.
- Lenny Rachitsky
... we're spending a lot of our time focusing on today. The reason I'm excited about this conversation, it feels like there's been so much money poured into AI tools, increasing productivity. All the fastest growing companies in the world are these engineering AI tools. And now more and more people are just asking this question of just, like, what gains are we getting out of this? How much is this actually helping us be more productive? How do we become more productive? You've been at the center of this world for longer than anyone. You've, uh, invented so many of the frameworks that people rely on now. So, I'm really excited to have you back and to talk about this stuff. I wanna talk, I wanna start with just, like, this term DevEx, which is a, something that comes up a lot in this, in this whole space. So we're gonna... And we're gonna hear this term a bunch in this conversation. Could you just explain what is DevEx, this term DevEx?
- Nicole Forsgren
So DevEx is a developer experience. And when we think about developer experience, it's w- we're really talking about what it's like to build software day-to-day for a developer, right? So the friction that they face, the workflows that they have to go through, um, any support that they have. And it's important because when DevEx is poor, everything else just isn't gonna help, right? The best processes, the best tools, the best whatever magic you have, right? If, if the DevEx is bad, everything kind of tanks.
- Lenny Rachitsky
And so within DevEx is productivity. And I think the key insight-
- Nicole Forsgren
Yeah.
- Lenny Rachitsky
... that you had and other folks in the space have had is not just, like, productivity, but there's also engineering happiness. And, and we're gonna-
- Nicole Forsgren
Yeah.
- Lenny Rachitsky
... get into a lot of these parts. But just maybe speak to if there's productivity and there's broader components to engineers being successful at a company.
- Nicole Forsgren
Yeah. And I love that point, right? Because productivity, first of all, is hard to define anyway. But if, if you're just looking at, like, output, you can get there a lot of different ways. But if you're getting there in ways that are high toil or high friction, then at some point a developer's gonna burn out. Or if it's, you know, super high cognitive load, if it's hard to even think about what you're doing 'cause you're concentrating on, like, the ma- the mechanics of, you know, the plumbing of something, then you don't have the brain space left to come up with, like, really innovative solutions and, and questions. And so I love that it's kind of this self-reinforcing loop in terms of you do more work, you do better work, and it, it's better for people, it's better for, you know, the systems, it's better
- 8:33 – 12:02
Flow state and cognitive load in the age of AI
- Nicole Forsgren
for our customers.
- Lenny Rachitsky
Something... I was gonna get to this later, but I wanna actually get to this right now. This idea of flow state for engineers. So I was an engineer actually early in my career. I went to school for computer science. I was an engineer for 10 years. The best part of the job, for me, was just this flow state you enter when you're coding and building and just things feel, like, so fun. It feels like AI is making that harder in a lot of ways because there's all these agents you're working with now. There's all this code that's kind of being written for you. Talk about just the importance of flow state to developer happiness, developer productivity, and just what you've seen AI impacting, how you've seen AI impacting that.
- Nicole Forsgren
A lot of times... Well, there are lots of different ways to talk about DevEx, right? One way to talk about it is kind of three key things that have components that are important of themselves, but they also kind of reinforce each other. So flow state is one of them, cognitive load is another, and then feedback loops are another. And so I think, you know, when you touch on this, your question about flow state is a really good one. And I'll, I'll admit, you know, we're just a few years into this. We're still figuring out what the best flow state and, and, like, cognitive, uh, requirements are for people in this because, to your point, sometimes we're getting interrupted all the time, right? You don't just get in the flow and lock down and write a whole bunch of code and do, like, the typing of a whole bunch of code as much anymore. Instead, you're kind of creating a prompt, uh, getting some code back and reviewing the code, trying to integrate what's happening in the system. Um, and that can really interrupt. At the same time though, it can contribute to flow if... And I've, I've seen some senior engineers pull together some tool chains that are really incredible where they figured out how to kind of keep the flow going, right? The fast feedback loops really, really w- work well for them. They can kind of assign out different pieces, uh, to agents. It helps them keep in the flow in terms of instead of details and line by line writing, they're in the flow in terms of, "What's my goal? What are the pieces that I need to get there? How quickly can I get there so then I can step back and kind of evaluate everything and then dive back in and, you know, fix some pieces?"
- Lenny Rachitsky
Is there anything more you could say about this engineer that figured out this really cool workflow, about just what that looks like?
- Nicole Forsgren
So I've spoken with a handful of them and I've kind of watched them work. I haven't built it myself yet. It's on my list. Um, so they've been able to set up this, like, really incredible workspace and workflow where... Like, right now a lot of us, you know, play around with tools and we'll, like, put in a prompt and we'll get a few lines back, or maybe we'll put in a prompt and we'll get, like, whole programs back. Well, what they can do is they can... Many times I'll see them say, uh, to kind of help, help prime it, you know, "This is what I want to build. It needs to have these basic architectural components. It needs to have this kind of stack. It needs to follow, like, this general workflow. Help me think that through." And it'll kind of design it for it. And then for each piece, it'll assign an agent to go work on each piece in parallel. And then it'll say, oh, and upfront, you know, "These need to be able to work together, make sure it's architected correctly, make sure we use, you know, appropriate, uh, APIs and conventions." Then at the end... Uh, and then they can, like, let it run for a few minutes and they can think through something else that's interesting or they anticipate is gonna be hairy and they come back to something that's, I mean, probably a little better than vibe coded, right? Because, like, because they were so systematic about it upfront, they're much closer to something that looks like production code.
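The prime-then-fan-out workflow described above can be sketched in a few lines. This is purely illustrative, not any tool from the episode: `run_agent` is a hypothetical stand-in for a call to a coding agent, and a real setup would invoke an LLM/agent API there with the shared design context.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for a coding-agent call; a real workflow would
# hit an agent API here, passing the shared constraints with each task.
def run_agent(piece, design_notes):
    return f"[draft for {piece!r} under constraints: {design_notes}]"

# Prime once with the overall goal, architecture, and conventions...
design_notes = "shared architecture, agreed APIs and conventions"
# ...then split the work into pieces and hand each piece to an agent.
pieces = ["auth module", "data layer", "HTTP handlers"]

# Fan the pieces out in parallel, then collect the drafts so the
# engineer can step back, review, and integrate them.
with ThreadPoolExecutor(max_workers=len(pieces)) as pool:
    drafts = list(pool.map(lambda p: run_agent(p, design_notes), pieces))
```

The design choice doing the work here is that every agent sees the same upfront constraints, which is what makes the drafts closer to production code than ad hoc prompting.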
- Lenny Rachitsky
So what I'm hearing is spending a little more time upfront planning what all these-
- Nicole Forsgren
Mm-hmm.
- Lenny Rachitsky
... AI engineers are doing versus just, like, pow- powering through and just figuring out as you go.
- 12:02 – 21:19
Challenges in measuring productivity with AI
- Nicole Forsgren
Yeah.
- Lenny Rachitsky
Okay, cool. Let me get to this kinda core question that I think is on a lot of people's minds. A lot of companies are trying to measure productivity for their teams. Is this improving our productivity? Is this hurting our productivity? So let me just start with this question. How are people doing this wrong currently when they try to measure their productivity gains with AI?
- Nicole Forsgren
I will say most productivity metrics are a lie. (laughs) Uh, you know, it's, it's really tricky because historically... Now look, lines of code has always been a bad metric, right? But many folks-
- Lenny Rachitsky
It has.
- Nicole Forsgren
... still use lines of code as some proxy, yeah, as some proxy for, uh, output or productivity or complexity or something, right? Well now, for many of the systems that they would sometimes like whisper and not super talk about that use lines of code, it's just blown out of the water because, w- what do you mean by lines of code? Uh, if the goal is more lines of code, I can prompt something to write, you know, the longest piece of code ever and add tons of comments and, you know, we know that agents and, and LLMs tend to be very verbose, uh, by definition. And so it's just too easy to game that system and then introduce complexity and technical debt into all of the work that you're doing. I will say there are some things that we can kind of watch and pay attention to because... So lines of code as a productivity metric isn't great, right? It's pretty bad. But now it's kind of more relevant if we can tease out which code came from people and which code came from AI-
- Lenny Rachitsky
Mm-hmm.
- Nicole Forsgren
... because now we can answer downstream questions. What is the code survivability rate? What is the quality of our code? Is our code being fed back in to train systems? And for that code that's, that's retraining systems later, especially if we're doing like fine-tuning and local tuning, how much of that is machine generated, right? What types of loops is that creating, and what types of patterns or biases might it be inadvertently introducing? So on the one hand, like, it's not good as a productivity metric, but it can be useful, right? And I'll, I'll even say the same for DORA, right? So the DORA metrics, they're speed metrics, they're stability metrics. If that's all you're looking at, it's, it's not gonna be sufficient anymore because AI has now changed the way we think about feedback loops. Right? They need to be much faster. Now what DORA is meant for, uh, you know, kind of assessing the pipeline overall and just speed and stability, it's still, that works. But we can't just blindly apply the existing metrics we've used before because we'll miss super important phenomena and, and changes in the way people work.
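The "code survivability" question she raises can be made concrete with a small sketch. Nothing here is from the episode: the line-level provenance records (which lines came from a human versus an AI, and whether each line still exists in the repo some weeks later) are a hypothetical data shape, assuming you have some way to tag origin, such as commit trailers or tool telemetry.

```python
# Hypothetical line-level provenance records: where each line came from
# ("human" or "ai") and whether it survived to a later snapshot of HEAD.
lines = [
    {"origin": "ai", "survived": True},
    {"origin": "ai", "survived": False},
    {"origin": "ai", "survived": True},
    {"origin": "human", "survived": True},
    {"origin": "human", "survived": True},
]

def survivability(records, origin):
    """Fraction of lines from `origin` still present later (None if no data)."""
    mine = [r for r in records if r["origin"] == origin]
    if not mine:
        return None
    return sum(r["survived"] for r in mine) / len(mine)

ai_rate = survivability(lines, "ai")        # 2 of 3 AI lines survived
human_rate = survivability(lines, "human")  # both human lines survived
```

Comparing the two rates over time is one way to turn lines of code from a gameable productivity metric into the downstream quality signal she describes.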
- Lenny Rachitsky
Interesting. So, so you, you invented DORA. That was kind of the main framework people used for a long time to measure productivity. And then there's SPACE. There's, uh, Core 4. There's probably others. So what I'm hearing here is all these are kind of out of date now where AI is contributing large portions of code.
- Nicole Forsgren
I will say if it is a prescriptive metric-
- Lenny Rachitsky
Mm-hmm.
- Nicole Forsgren
... it needs to be used only in the way it was prescribed.
- Lenny Rachitsky
Mm-hmm.
- Nicole Forsgren
So DORA4, there are four key metrics. There's, uh, two speed metrics, uh, deployment frequency and lead time, so code commit to code deploy. There's stability metrics, uh, MTTR and change fail rate. If those are used to assess the speed of the pipeline and the general performance of the pipeline, that's great. If you're trying to use those to understand, uh, 'cause implied in that is feedback loops, right? Because you used to kind of like to get feedback from customers, um, but we can't just use that blindly now when we're using AI as an example because we have feedback loops much earlier and not even just at like the local build and test phase. We have feedback loops throughout and even sometimes in the middle of some of the pipeline that we really wanna leverage in ways that weren't as useful before. I won't say they weren't possible, but like we just didn't really focus there. So those are prescriptive metrics. When we think about SPACE, SPACE is a framework. It doesn't tell you what metric to use. So I'll say sometimes people get real frustrated 'cause I didn't tell them what to measure, right? Uh, but now I think that's the power of it. We're actually seeing that SPACE applies fairly well in these new emergent contexts like AI, because we still wanna look at... So SPACE is an acronym, right? So we still wanna look at satisfaction. We still wanna look at performance. What's the outcome? Um, we still wanna look at activity. Yes, you know, in some ways lines of code and number of PRs can be useful for something, right? Uh, or number of alerts or number of, y- things, activities or accounts. C is communication and collaboration. This is also super important and useful because it's how our systems communicate with each other and also how our people do. You know, what proportion of work is being offloaded to a chatbot versus talking to the senior engineer on the team? More isn't always better. Less isn't always better. It depends. And then efficiency and flow. 
Can people get in the flow? How much time does it take to do things? What is the flow like through our system? And here I would probably add a couple dimensions, right? So, uh, chatting with some of the early authors to say, you know, trust. Not to say trust wasn't important before, but now it is very, very front of mind, right? Before you, you know, build your code, like if the co- if the compile comes back, you're fine, and like that's the way it is. LLMs are non-deterministic, right? Now we can't just put in a command and get something back and accept it. We really need to evaluate it. So, you know, are we seeing hallucinations? What's the reliability? Does it meet like the style that, that we would typically write? And if it doesn't meet, is that fine? So that's my kind of, kind of it depends answer. Prescriptive, y- you gotta make sure you're using it fit for purpose, right?
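The four DORA metrics she walks through (deployment frequency, lead time, change fail rate, and MTTR) can be sketched against a log of deploys. This is purely illustrative, not from the episode: the record shape, numbers, and seven-day window are all made up.

```python
from datetime import datetime
from statistics import median

# Hypothetical deploy records: commit time, deploy time, whether the
# change failed in production, and minutes to restore service if it did.
deploys = [
    {"committed": datetime(2024, 5, 1, 9), "deployed": datetime(2024, 5, 1, 15),
     "failed": False, "restore_minutes": 0},
    {"committed": datetime(2024, 5, 2, 10), "deployed": datetime(2024, 5, 3, 11),
     "failed": True, "restore_minutes": 45},
    {"committed": datetime(2024, 5, 4, 8), "deployed": datetime(2024, 5, 4, 9),
     "failed": False, "restore_minutes": 0},
]
days_observed = 7

# Speed: deployment frequency and median commit-to-deploy lead time.
deploy_frequency = len(deploys) / days_observed  # deploys per day
median_lead_hours = median(
    (d["deployed"] - d["committed"]).total_seconds() / 3600 for d in deploys
)

# Stability: change fail rate and mean time to restore (MTTR).
failures = [d for d in deploys if d["failed"]]
change_fail_rate = len(failures) / len(deploys)
mttr_minutes = (sum(d["restore_minutes"] for d in failures) / len(failures)
                if failures else 0.0)
```

Used this way, the numbers assess the pipeline overall, which is what she says DORA was prescribed for; they say nothing about the faster, in-the-loop feedback that AI-assisted work now needs.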
- Lenny Rachitsky
And we're gonna get to your current thinking on the best way to do this stuff. You have a book coming out that explains the how to do this well, so we're gonna get to that. One thing I wanted to highlight, in our last chat that we had, you, uh, you highlighted one of the biggest issues we're gonna ex- probably have with AI is trust. Understanding and learning how much to trust code that it generates and also how much, you said this two and a half years ago, that so much of the time is now gonna be spent reviewing code versus writing code. That's exactly what I'm hearing.
- Nicole Forsgren
I think it'll be interesting to see how that impacts the way we structure work moving forward. You know, we were talking about flow state and cognitive load... now that our attention has to focus on things at certain times and, and it's broken up from how we used to do it, um, I think there's some real opportunity there to not just rethink workflows, but rethink how we structure our days and how we structure our work.
- Lenny Rachitsky
Can you say more about that? Just what does that ... What do you, what are you thinking will be happening? Where do you think things go? What are you seeing working?
- Nicole Forsgren
It's, uh, so purely speculative. Um, but for example, uh, Gloria Mark has done some really good work on attention and deep work, and humans can get about four hours of good deep work a day, and like, that's about it.
- Lenny Rachitsky
Yeah.
- Nicole Forsgren
Like-
- Lenny Rachitsky
I feel that.
- Nicole Forsgren
... and that, that's like kind of the upper limit-ish for the most part. And I'm sure people are gonna be like, "Well, I am superhuman. I can do this." Yeah.
- Lenny Rachitsky
What if you take 20 grams of creatine?
- Nicole Forsgren
Right. (laughs)
- Lenny Rachitsky
Just kidding.
- Nicole Forsgren
What if we micro-dose?
- Lenny Rachitsky
Yeah, exactly. (laughs)
- Nicole Forsgren
Um, yeah. So in the context of knowing we have about four hours of good deep work, and I'm sure s- many of us have probably hit this, right? We're like, we have good periods, like maybe it's morning, maybe it's afternoon for folks, and then you hit a time where you're like, "I'm gonna clean out my inbox because that is all I can do right now." Right? Like, "I can be functional, but I'm not gonna come up with my best innovative, problem-solving, authoring, code-writing work." A lot of times, the way to do that and to get into it is to have these long chunks to get into flow and to get that deep work, right? And it's usually, this is I'm making ... I'm like hand waving, right? Two hoursish minimum, right? Like an hour can be tricky 'cause it can take time to get into that state. Okay, well when we think about what it used to be like back in the olden days, three years ago, three and a half years ago, we could block off four hours of time and we could probably get two or three hours of really good work done. Now, 'cause we were just focused, right? There were no interruptions, minimal interruptions. Now, the nature of writing code in systems itself is interrupt-driven or, or full of interruptions at least, right? Because you start something and then it interjects. And so how do we think about that? Does that mean that a four-hour work block is still useful? I mean, probably, but does that mean that now we can also make a 45-minute work block useful? Because getting into the flow is actually kind of handed off, at least in part, to the machine, or the machine can help us get back into the flow by reminding us of context and generating diagrams of the system and, you know, all the things. And so I think that's a really, really interesting area that's just ripe for questions and opportunity. And, and please folks, do this research and, and come back to me 'cause (laughs) it might not make my list, but it's such a great question.
- Lenny Rachitsky
That is so interesting. Essentially everyone, every engineer is turning into an EM, engineering manager, coordinating all of these junior AI engineers. And so your point is even if you have like a 30-minute block, you can't get deep into code, but you can unblock all these AI engineers that are running off doing tasks. Plus, your point is they give you, they remind you of just like, "Here's where you left off. Okay, you can just jump into this code, maybe make some tweaks."
- Nicole Forsgren
Yeah.
- Lenny Rachitsky
So interesting.
- 21:19 – 22:20
The importance of developer experience for business value
- Lenny Rachitsky
Let me zoom out a little bit and before we get into your framework for how to approach developer experience, the latest thinking you've got, beyond just like obviously engineering, engineers doing more is great, what's, what's your best pitch for why companies should really, really, really focus on developer experience?
- Nicole Forsgren
I hate to say return on investment, but like the business value, the opportunity here, is huge, right? In general, we write software, well, for fun and for hobbies, right? But we also have software because it meets a business need. It helps us with market share. It helps us, uh, attract and retain customers. It helps us do all of these things, and you know, I think DevEx is important because it enables all of that software creation. It enables all of that problem-solving. It enables the super rapid experimentation with customers that before, you know, you, you'd need a while for a prototype and maybe a little bit longer to actually flight it through an A/B test on a production system. I mean, you can do it in hours right now.
- 22:20 – 26:49
Common issues and solutions in developer experience
- Lenny Rachitsky
Getting maybe the opposite end of the spectrum, getting very tactical before we get into the larger framework. What's just like one thing that you think an eng team, a product team can do this week, next week to help their developer experience maybe get more done?
- Nicole Forsgren
Honestly, I think the best thing you can do is go talk to people and listen. And I love that, you know, the audience of this podcast is primarily PMs 'cause they tend to be really good at this. And I would say start with listening and not with tools and automation. So many times companies are like, "Well, I'm just gonna build this tool or gonna, I'm gonna build this thing." Often you build a thing that you yourself have had a challenge with or that like is easy to do, easy to automate. And if you just go talk to people and ask the developers, like, "Think of, think of yesterday. What did you do yesterday? Walk me through it. What were the points that were just delightful? What were the points that were really difficult? Where did you get frustrated? Where did you get slowed down? Where was there friction?" And if you go talk to a handful of people, a lot of times you can surface a handful of things that are a relatively low lift and still have impact. Um, or you can identify a process that's unnecessarily complex and slow.
- Lenny Rachitsky
So the listening tour here almost is you want to help your teams move faster, be happier eng teams. Your advice is just before you do anything, just like go ask them, "What is bothering you?"
- Nicole Forsgren
Go ask them.
- Lenny Rachitsky
(laughs)
- Nicole Forsgren
Yeah. And trust me, like most developers are gonna be more than happy to tell you what's broken and what's bad.
- Lenny Rachitsky
Mm-hmm.
- Nicole Forsgren
And yeah, I mean, I will, I'll say there was one company that I had worked with. I remember they s- they had a process that was like really difficult, and it was on an old mainframe system, and they were gonna have to like replatform the whole thing, and so they never went to work on it or talk about it. Uh, everyone hated it 'cause it was this huge delay. I mean, all they had to do was change a process. Sometimes all you have to do is change a process. And they changed it so that (laughs) instead of, I think it was someone had to like print it out and walk it down three or four flights and then get approval, and then someone else had to like walk it back up, and so it was just that interim. They didn't replatform anything. They didn't redesign anything major. They just had it send an email.
- Lenny Rachitsky
Let me push on that and I'm curious just what are the most common things people do? Like, if you were just starting on, okay, we need to focus on engineering experience, what do you find are the most, like, I don't know, two or three most common improvements companies need to make?
- Nicole Forsgren
I will say, you know, I'll (laughs) kind of echo that process. There's almost always a process that can be improved and-
- Lenny Rachitsky
Hm.
- Nicole Forsgren
... that can be approved, improved without a lot of engineering lift or a lot of engineering headcount, right? Uh, most large companies in particular have something that is several, several steps. It's the way it is because it's the way it is, but that's no longer the way it is, right? Um, and even small companies, sometimes it's just a little too YOLO and you don't know what it is and you're kind of chasing everyone around. So if you can create a very lightweight process, that can also be helpful. That can be one of the best places to start, especially if you have limited, uh, exposure to the whole rest of the org, right? Sometimes just a team process can help. Um, I will say from a business leader standpoint, a lot of what you can do is provide structure and support for this organizational change. Communicate what you're doing, communicate what the priorities are, communicate why this is important, celebrate wins. Because if folks try to do this just like a one-off side, fully isolated project, it's really challenging to get some good momentum and get people to care and to get them stay involved, right? Because it feels like just another interim internal project that isn't gonna matter or that isn't gonna get celebrated, but it has these huge, uh, upside, uh, potential returns for the business.
- Lenny Rachitsky
It's interesting. What I'm hearing here is nothing about tools or technologies. It's not like move to this cloud. It's not like install this new deployment system. It's processes and people and org and morale.
- Nicole Forsgren
Yeah. Now, there will be technical pieces that are very important, right? Uh, especially now with AI, right? We're, we're rethinking how build and test systems work. We're rethinking feedback to users so that it's very, very customized in terms of what is shared and when it is shared. There are a lot of technical pieces that are involved, but that's not the only thing, right? It's necessary, but not sufficient. And that doesn't have to be the place
- 26:49 – 29:52
Signs your eng team is moving too slow
- NFNicole Forsgren
that you start.
- LRLenny Rachitsky
I'm gonna ask you, I have a hard question I wanna ask you that I thought of as you were talking.
- NFNicole Forsgren
Okay.
- LRLenny Rachitsky
I feel like this is the question that most founders and heads think about, and the question is just like, how do I know if my eng team is moving fast enough, if they can move faster, if they're just not performing as well as they can? What are just maybe smells, signs that tell you, yeah, my team should be moving faster versus like, this is just the way it works. This is as fast as they can move.
- NFNicole Forsgren
Most teams can move faster, right? So, and also, uh, given what we know about cognitive load, not all speed gains are necessarily good, right? Um, or the upside is gonna be kind of limited, right? Once you hit kind of a certain point. Most people are not even near that point. Uh, I don't know a single team, frankly. But how do you know? Uh, you know if you're always hearing about builds breaking, flaky tests, uh, overly long processes if you have to request a new system or if you need to provision a new environment, or if it's really, really hard to switch tasks or switch projects, right? So if someone has the opportunity to go work in another part of an org and they don't, for reasons that are unclear and, like, not political, and anyone says anything about "the system," that's usually a pretty good smell, uh, that there's friction somewhere. Because once you finally figure out your system and you're able to get work done, you don't... The switching costs can often be really, really high to go anywhere else. And so sometimes people will do that, but, you know, I've worked with companies where switching orgs within the company, uh, you had to basically pay the same tax as a new hire because the systems were so different and they were so full of friction and it was so difficult to do so many things.
- LRLenny Rachitsky
I love the first part of your answer especially, which is you can always move faster. I think every founder is gonna love hearing that.
- NFNicole Forsgren
(laughs)
- LRLenny Rachitsky
(laughs) Uh, to your point, though, there's diminishing returns over time.
- NFNicole Forsgren
Yeah. And- and you don't know about the quality, right? So, like, I think that's the other side, is that you can always move faster, but faster for what? Are we making the right business decisions? And I think, you know, that's especially where PMs come in. We can ship trash faster every single day. We need strategy and really smart decisions to know what to ship, what to experiment with, what features we wanna do in what order and what rollout, right? The strategy is the core piece. And then think about speeding that up. If we don't have the other pieces in place, I mean, garbage in, garbage out.
- LRLenny Rachitsky
I'm gonna follow that thread, but before I do that, just to mirror back what you shared. So signs that your team... There's a lot of low-hanging fruit to improve this-
- NFNicole Forsgren
Mm-hmm.
- LRLenny Rachitsky
... the productivity of your team, is builds are always breaking, there's flaky tests that are constantly incorrect, false positives, uh, it's hard to context-switch between different projects, and you just hear people talking about how the system is just really hard to work with. Is that roughly
- 29:52 – 33:32
How AI is improving productivity
- LRLenny Rachitsky
right?
- NFNicole Forsgren
Yeah.
- LRLenny Rachitsky
Cool. Okay, so going back to the point you just made, there's a sense that AI is making teams so much faster because it's writing all this code for them, you're gonna have all these asynchronous agents, engineers working for you. Feels like a core part of your message is that's just one part of engineering work. There's so much more, including figuring out what to build, alignment internally. Maybe just speak to just, like, there is a lot of opportunity to improve engineering's performance, productivity, but there's so many other elements that are not improved through AI.
- NFNicole Forsgren
Yes. Uh, or- or could be in the future, right?
- LRLenny Rachitsky
Mm-hmm.
- NFNicole Forsgren
Like, I think there are a lot of ways that we can pull in AI tools to help us refine our strategy, refine our message, think about the experimentation methods, or, you know, uh... targets of experimentation, or think about our total addressable market, right? But we need to have that strategy and plan fairly well-aligned, right? Or at least have, like, two or three alternatives that you wanna test. Because now the engineering can go, or at least the prototyping especially can go, much, much faster, right? We can throw out prototypes. Uh, we can run A/B tests and experiments that are customer-facing, right? Assuming that, you know, we have the infrastructure in place, which allows us to learn and progress much faster than before, right? Like, it... Some places, it used to take, you know, months to get something through production, to do A/B testing and get feedback. We can do this in a day or two, right? Definitely under a week. But we wanna make sure that we're building and testing the right things. Are we partnering with the right fol- Do we have the data that we need, right? And I will say, AI can actually be a pretty good partner there if you have, like, a good conversation with it, and then also (laughs) check with your experts, right? What type of data should I be looking at? What type of instrumentation do I need? What type of analysis can I do? Uh, because then you can also go to your data science team and say, like, "I'm planning on doing this. I'd like to..." 'Cause let's not just yell, "A/B test," right? It's a shame to do a large test and end up disrupting users or disrupting customers or (laughs) breaking privacy or security, uh, protocols, and also end up with data that's unusable, right? Because you just can't get the signal that you're looking for. But now I'm also seeing people kind of accelerate that into a few days versus a few weeks.
And so they can kind of start those key stakeholder discussions from a much more informed, kind of filled-out space.
- LRLenny Rachitsky
Today's episode is brought to you by Coda. I personally use Coda every single day to manage my podcast and also to manage my community. It's where I put the questions that I plan to ask every guest that's coming on the podcast. It's where I put my community resources. It's how I manage my workflows. Here's how Coda can help you. Imagine starting a project at work, and your vision is clear, you know exactly who's doing what and where to find the data that you need to do your part. In fact, you don't have to waste time searching for anything, because everything your team needs, from project trackers and OKRs to documents and spreadsheets, lives in one tab, all in Coda. With Coda's collaborative all-in-one workspace, you get the flexibility of docs, the structure of spreadsheets, the power of applications, and the intelligence of AI, all in one easy-to-organize tab. Like I mentioned earlier, I use Coda every single day, and more than 50,000 teams trust Coda to keep them more aligned and focused. If you're a startup team looking to increase alignment and agility, Coda can help you move from planning to execution in record time. To try it for yourself, go to coda.io/lenny today and get six months free of the Team Plan for Startups. That's C-O-D-A dot I-O slash Lenny to get started for free and get six months of the Team Plan. Coda.io/lenny.
- 33:32 – 36:35
Real examples of productivity improvements
- LRLenny Rachitsky
I love that you work with a bunch of different companies and a bunch of different types of businesses. I think I'll... Very few people get to see inside a lot of different places. What kind of gains are you just seeing in terms of increased productivity with AI? Like, how real, like, how big of a gain have you seen?
- NFNicole Forsgren
I'd say it's real, and I would also say we don't have great measures for it yet. We're still trying to figure out what to measure and what that looks like. Uh, one of the best is going to be velocity, right? All the way through the system. How quickly can you get, uh, a feature or a product or something through the system so that you can then experiment and test, right? Either from, like, idea to, like, final end, or even kind of a, uh, feature or a piece through the system so we can test. That's really good. Now, that's also hard (laughs) to tie back directly to, like, a particular AI tool in the hands of a particular developer. Um, but there are some other things that we can look at, and that I've seen, is, again, this kind of rapid prototyping. Um, I hate lines of code, but I'm gonna use (laughs) lines of code. Um, I know I worked with, uh, some folks who had kind of a whole set of companies they were looking at, and they found that, uh, AI was generating, like, significantly more code for the people who were using it regularly. But then they also found that for folks who were, like, you know, regular users of AI coding environments, AI IDEs, the tool kind of gave them more code, and then for the engineers themselves, the increase was double what the, uh, coding agent had given them. And so one thing, and I'd say it's probably kind of a secondary or knock-on effect, or just a smell, right, is it can unblock you. It can speed up the work that you would already do, right? I know sometimes when I work, it's like the first few minutes, it's hard for me to start. But once I get started, I'm there. And so they're really good at unblocking and unlocking that.
- LRLenny Rachitsky
Something I've seen people on Twitter sharing is how good, uh, OpenAI Codex especially is at finding really gnarly bugs. And I think it was Karpathy that shared he was, like, so stuck on a bug and no AI tool could figure it out, and then the latest version of Codex spent, like, an hour or something (laughs) looking into it and found it for him.
- NFNicole Forsgren
Yeah. I'm, I'm hearing incredible things like that, right? Well, and even also, you know, writing unit tests and spinning up unit tests and-
- LRLenny Rachitsky
Mm-hmm.
- NFNicole Forsgren
... uh, creating documentation and cleaning up documentation, because I know now people are like, "Oh, well, we have agents. I don't need to, I don't need to read the docs 'cause there's the code there." Turns out, uh, agents rely on good data, right? Because it's, it's all about how they've been trained or how they've been grounded. Um, and better data gives you better outcomes. And some of that data includes documentation and comments.
- LRLenny Rachitsky
Mm-hmm.
- NFNicole Forsgren
And the better documentation and the better comments you have, the better performance you're gonna get out of your AI tools.
- LRLenny Rachitsky
And AI can help you write that documentation. I've been working with Devin a little bit and it's really good at that stuff.
- NFNicole Forsgren
Yeah.
- 36:35 – 43:40
Introducing her new book, Frictionless
- LRLenny Rachitsky
Okay. Let's talk about this framework, this book. So you're publishing a book called Frictionless, which sounds like a dream. How do you... create, uh, a, a dev team that's frictionless? It's called Frictionless: 7 Steps to Remove Barriers, Unlock Value, and Outpace Your Competition in the Age of AI. There's a seven-step process to this. Walk us through this. Maybe give us just context on this book, what, who it's meant for, what problem it solves, and then the seven steps.
- NFNicole Forsgren
So I will say, I, uh, also wrote this with Abi Noda, uh, of DX. He has incredible experience in this space, right? He's worked with hundreds of companies. And so it was kind of nice bouncing ideas off of him, and, you know, also thanks to all of the, uh, engineering leads and DevEx leads and CTOs and engineers that we talked to, to kind of make sure that our smells were right, right? And so who is this book for?
- LRLenny Rachitsky
Let me actually take a, let me take a tangent on Abi and DX, since you mentioned him. Uh, this is super interesting, and I think it connects so directly with this conversation. So Abi started this company called DX, which is such a great name for a company around developer experience. They just sold the company for a billion dollars to Atlassian. Uh, it's a very high multiple on their ARR. To me, it shows exactly why this conversation is so valuable, just how much value companies are putting into improving developer experience. Atlassian would spend a billion dollars on this, like, an early stage-ish startup. It was doing really well, and people loved it, but it was, like, early stage-ish. A billion dollars. Uh, and the idea is they have all these companies using Jira and other products, and they're all trying to figure out how do we measure productivity. Uh, it's worth a lot of money to them. And I know you were an early advisor to them too, so-
- NFNicole Forsgren
Yeah.
- LRLenny Rachitsky
... it just shows us how important this is.
- NFNicole Forsgren
Yeah. Well, and I think it also shows us, um, how much value you can get out of this, right?
- LRLenny Rachitsky
Mm-hmm.
- NFNicole Forsgren
Like, there's so much low-hanging fruit, there's so much unlocked potential, and it's hard to know where to start a lot of times. Even in... I've been at large companies that have a lot of expertise and a lot of really, really smart people. But if you haven't kind of been in this space and thinking about it this way, it's hard to know where to start, or it's easy to make simple mistakes upfront that mean, like, you kind of need to start over later. Um, so I guess, which kind of also brings us back to, you know, who is this book for? It's for anyone that cares (laughs) about DevEx, right? So definitely technology leaders, um, anyone who's trying to kick off a DevEx program or is working on a DevEx improvement program. Um, I think it's particularly relevant for PMs, because if you're PMing something that involves building and creating software, improving DevEx will only help your team. And also, uh, you have key skills and insights and instincts that are so important to DevEx that, many times, I will say, I've seen engineering teams just miss.
- LRLenny Rachitsky
Hm. Okay, what is the framework? What are the steps? Where do people start?
- NFNicole Forsgren
Uh, so the book goes through a seven-step process and then also kind of provides some key, uh, principles at the end. Step one is to start the journey, right? So assuming you're kicking off, you can start the journey, and this involves what we have already talked about, right? Go talk to people, have a listening tour, synthesize what you learn, uh, visualize the workflow and tools, right? Like, get a handle on kind of what the current state is. Uh, step two is to get a quick win, right? So start small, get a quick win, uh, pick the right projects, share out what you've done. Um, step three is using data to optimize the work, right? So kind of establish some of your data foundation, find the data that's there, start collecting new data, um, use some surveys for some really fast insights. We include, uh, example surveys. Uh, step four is to decide strategy and priority. Once you have some data, then you need to know, of all the things that are potentially broken, and you've already gotten your quick win, of all the things that are left, what should I do next? And so we walk through some evaluation frameworks there. Step five is to sell your strategy. Once you've decided, now you have to kind of convince everyone else, right? So now you want to get feedback, you want to share why this is the right strategy right now. Uh, step six is to drive change at your scale. So here, we address folks that have local scope of control, right? If you're starting on just a dev team, you wanna do a kind of grassroots effort yourself, uh, or global scope of control, right? If you're, you know, the VP of developer experience or something, like, there are some things that you can leverage from the top down. Um, and then, how do you drive change when you're kind of somewhere in the middle? 'Cause you can leverage both types of strategies. Uh, and then step seven is to evaluate your progress and show value, and then kind of, uh, loop back around.
And I will say that we wrote this so that you could kind of jump into any step, wherever you are right now, right? Like, if you're kicking off a team or an initiative, you'll probably wanna start at step one. You should definitely start at step one. If you're joining an existing initiative, you could jump into, uh, picking the priority or, uh, implementing the changes. So those are the seven steps. Um, there are a few practices that we also recommend. So thinking about resourcing it, uh, change management, uh, making technology sustainable, and then also bringing a PM lens to this, right? How can we think about developer experience as a product, and how do we think about the metrics that we have as a product?
- LRLenny Rachitsky
Awesome. Okay, I have questions. Point people to the book real quick. What, uh, what's the URL? How do they get it?
- NFNicole Forsgren
Yeah.
- LRLenny Rachitsky
When does it come out?
- NFNicole Forsgren
Yeah, developerexperiencebook.com. So right now, you can sign up for the mailing list. We'll let you know when it's out on pre-order, and we'll also be sharing pieces of the workbook. So we've got a nearly 100-page workbook that goes along with the book. Um, and then it should be out by end of year.
- LRLenny Rachitsky
Okay. So one piece of this is just this term developer experience feels very intentional in that it's not developer productivity, developer, uh, work. It's how do we make developer experiences better at our company? Which includes they get more done, but also they're happier, things like that. So I think that's an important element of this, right?
- NFNicole Forsgren
Yep.
- LRLenny Rachitsky
Okay, good.
- NFNicole Forsgren
Yeah, absolutely, because...Again, it's not just about productivity, right? We talked about this from, from kind of the frame and the lens of we need to be building the right thing, and you wanna be productive, but you also wanna be thinking about ... And this is what engineers are also just really incredibly good at. Give them a problem and don't tell them how to solve it, and then they can solve it better, right? They have, uh, the freedom, they have the innovation, they have the creativity so that they can solve this problem. If it's only about productivity, then it's just like lines of code or number of PRs or whatever, right? But we really wanna talk about value and how do we unlock value and how do we get value faster? And that involves, yes, making them more productive and removing friction because then they have the flow and the cognitive load and the things that
- 43:40 – 45:15
How to get started building a DevEx team
- NFNicole Forsgren
we kind of talked about.
- LRLenny Rachitsky
Awesome. Okay. And then say someone wants to start this team. What does it usually look like? At Airbnb, I remember this team forming and it was just like an engineer or two getting it started and taking charge. What do you recommend as the, the pilot team? And then what does it look like as it grows?
- NFNicole Forsgren
So there are... There are a few ways to do this, right? So if you're doing it yourself, you could do it with a couple engineers, uh, maybe a PM or a, or a PgM or a TPM, um, to kind of help communicate, because really, comms plans are just so important here. Uh, on a small scale, right, what we wanna do is look for those quick wins. Look for things that you can do at small scale. Are there... You know, some folks call them things like paper cuts. Are there small things that you can do to help people see the value and feel the benefit themselves, right? How can a developer's work get better? How can their day-to-day work get better? Kind of build momentum from there. If you're working from a top-down structure and you have the remit, you still want, uh, some quick wins, but those quick wins can look a little more global in scale, because you have the infrastructure or the backing to make, you know, different types of changes that aren't only local. So, you know, an example of a small local change could be just cleaning up your tests and your test suites, right? Any team could do that. Uh, more global scale might be changing an organization-wide process that is just overly cumbersome, or throwing some resourcing into, you know, cleaning up the provisioning environment.
- 45:15 – 46:15
The impact of forming developer experience teams
- LRLenny Rachitsky
What kind of impact have you seen from teams like this forming on the engineering teams at their companies?
- NFNicole Forsgren
I... I'll say I've seen a huge impact, right? Uh, for smaller companies, hundreds of thousands of dollars. For large companies, uh, in the billions. But we also need to learn how to communicate that, right? Like, what does the math look like? Many times we can look at saving time, we can look at saving costs. Um, we can look at a lot of different things. We can look at speed to value and speed to market. We can look at risk reduction. Um, but the gains really are there. I will mention that it tends to follow something like a J-curve, right? So, like, you'll have a couple quick wins and it'll look like a big, big win, and then you'll hit kind of a little divot where suddenly the really obvious projects, the low-hanging fruit, are handled. So now we need to do a little bit of work, right? We might need to build out a little bit more infrastructure. We might need to build out a little more telemetry so that we can capture the things we wanna capture. And then once we get that done, uh, then we start to see those benefits really
- 46:15 – 48:53
How to measure the impact of DevEx teams
- NFNicole Forsgren
compound.
- LRLenny Rachitsky
So going back to that measurement number, what do you recommend? How do people find these numbers? 'Cause I think that's so much of the power of this is like, "We saved a million dollars doing this." What do you look at to do to figure that out?
- NFNicole Forsgren
You know, I think there are a few different things to keep in mind, right? Who is our key audience? And we usually have a few key audiences, right? We really wanna be able to speak to developers because they're the ones that are gonna be using the systems. They'll be partnering with you on either building them or at least providing feedback about what you're doing. Um, and so for them, we often wanna frame this in terms of things they care about. So time savings, right? If something gets faster, they can save time. They can... You know, they don't spend time doing setup when they don't need to anymore. Related to that is reduced toil, right? So compliance and security are super important, but many times they require several, several manual steps that, I won't say are not value-add, but they are not value-add from an individual human perspective, right? If we can automate as much as possible, that's great. Um, and, you know, improved focus time. So that's from the developer point of view. Leadership often cares about... They care about those things, right? But they often (laughs) care more about other things. So we could talk about, uh, usually costs in dollars, right? So can we accelerate revenue? What does our, uh, time to value look like? Uh, what is our velocity? How quickly can we get feedback from customers? Um, and for folks and organizations that are in really competitive environments, that can be really compelling because it's all about speed. We could talk about saving money, right? So here we can look at maybe quantifying savings. So, you know, one example is, uh, test and build. If we can clean up a test and build suite, to a developer, they really wanna hear about, uh, time saved and more reliable systems, right? There's less toil because they don't have to keep rerunning tests or kind of go clean up test suites.
From the business perspective, cleaning up a test and build suite can mean, uh, cloud cost savings, because all of those tests are running somewhere on a cloud. And if they always fail, or if it's just kind of a waste of spend, that can be useful, right? Uh, recovering some capacity, right? Uh, we can always talk about time and productivity gains, right? So how much, uh, equivalent developer time are we losing on things that are not necessarily value-add, right? And then sometimes we can correlate to business outcomes, and correlate is usually the best we can do here. But there can be some pretty compelling correlations in terms of speeding up time to value and increased market share, for
- 48:53 – 55:16
Measuring the impact of AI tools on productivity
- NFNicole Forsgren
example.
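The savings math she describes, turning a test-and-build cleanup into cloud spend and recovered developer time, reduces to simple arithmetic once you pick your inputs. A back-of-the-envelope sketch, where every number and the function name are hypothetical placeholders, not figures from the conversation:

```python
# Back-of-the-envelope business case for cleaning up a flaky CI suite.
# All inputs below are hypothetical placeholders; plug in your own data.

def ci_cleanup_savings(
    runs_per_day: int,
    rerun_rate: float,            # fraction of runs retried due to flakiness
    cost_per_run_usd: float,      # cloud cost of one CI run
    minutes_lost_per_rerun: int,  # developer wait/context-switch time per retry
    loaded_dev_cost_per_hour: float,
    workdays_per_year: int = 250,
) -> dict:
    reruns_per_year = runs_per_day * rerun_rate * workdays_per_year
    cloud_savings = reruns_per_year * cost_per_run_usd
    dev_hours = reruns_per_year * minutes_lost_per_rerun / 60
    dev_savings = dev_hours * loaded_dev_cost_per_hour
    return {
        "cloud_savings_usd": round(cloud_savings),
        "developer_hours_recovered": round(dev_hours),
        "developer_cost_recovered_usd": round(dev_savings),
    }

# Example with made-up numbers: 400 runs/day, 10% flaky rerun rate.
print(ci_cleanup_savings(
    runs_per_day=400, rerun_rate=0.10, cost_per_run_usd=0.50,
    minutes_lost_per_rerun=15, loaded_dev_cost_per_hour=120,
))
```

The point is the framing, not the model: the same cleanup yields a developer-facing number (hours of toil recovered) and a leadership-facing number (cloud and headcount-equivalent dollars), which matches her advice to speak to each audience in its own terms.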
- LRLenny Rachitsky
So let me pull that thread and come back to this, what I think is the biggest question people have right now with AI and productivity, which... And I don't... I don't think anyone has the answer yet, but I'm curious to get your take of just, what should people do today? What's the best approach to understanding what impact AI tools are having on their productivity? Because they're spending all this money on them. Like, I don't know, what are we getting out of this? And I guess things are moving faster, but I don't know. So if someone had to just, like, okay, here's what I should probably try to do, what would be your best advice here for measuring the impact of AI tools on productivity?
- NFNicole Forsgren
I would say it depends. (laughs) Um, and in part it depends on what your leadership chain really cares about, right? Like, we're usually pretty good at, like, figuring out what matters to developers, and we can communicate that to them. But if we're trying to just identify two or three data points to really kind of focus on, 'cause when we're first starting with data, sometimes it can be challenging. What do they care about? Think about the messaging you've been hearing. Have they been talking about market share, right? Losing market share, or competitiveness in the marketplace? If that's it, focus on speed. Think about ways that you can capture metrics for speed, from, like, feature to production or feature to customer or feature to experiment, and what that feedback loop looks like. If they're talking about, uh, profit margin all the time, right? Now, we always talk about money, right? 'Cause this is business. But if that seems to be an overarching narrative, look for ways that you can save money and then translate that into, uh, recovered and recouped, uh, headcount cost, right? Or sometimes you'll kind of, like, reinvent or change a process and then you no longer need as many vendors, right? So reductions in vendor spend can also help there. Um, and I say it also depends because sometimes leadership will say something, right? And it kind of comes up as a theme. If you can solve a problem that they have, or it's, like, something that they're focused on, if you can slightly reframe it even, right? Like, if they're calling everything developer productivity, go ahead and call it productivity. If they're calling it velocity and velocity is what matters to them, think about how to frame this in terms of velocity. If they're talking about transformation or disruption, right? How does this help with the disruption? Because then it will resonate with them.
We don't want to make them work to understand what it is that we're doing and the value that we provide.
- LRLenny Rachitsky
That is such good advice. So just to reflect back, the advice here is: if your company's trying to figure out what sort of impact our AI tools are having on our company, first is just, like, what does the company care about most? What do leaders care about most? Could be market share, could be profit margin, could be velocity, we need higher velocity, or transformation. So your advice there is, like, figure that out based on words and phrases you're hearing. Then figure out ways to measure that. Ways to measure market share growing, uh, profit margin increasing. So it could be, uh, I love these examples, like time from feature idea to production or to experiment. So maybe start tracking that. If it's margin, it's like money saved by fewer tests failing or some vendor you don't have to pay for, things like that. And then velocity. And velocity, I imagine that's where things like DORA come in, of just, like, speed of engineering, shipping, or what would you think about there for velocity?
- NFNicole Forsgren
I would say it's actually, you know, one of those, uh... I would pick as broad a swath as you can. So if you can go from idea to customer or idea to experiment, how long does that take? How long does it typically take, and how long does it take now with, uh, improved use of AI tooling and reduction in friction, right? And that's where I will say, we talk about this a little bit in the book, uh, you know, how do we deal with attribution challenges? So, like, what was responsible for this? Was it the DevEx or was it the AI? Um, go ahead and disclose that, right? Say, "Yes, we rolled out AI tools. We also had this effort in DevEx. They partnered very closely together. Both of them probably contributed to this," right? Like, if we had AI tools without the DevEx improvements, we probably would've had some improvements, but not nearly as much, right?
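The "as broad a swath as you can" metric, idea-to-customer lead time, comes down to timestamp arithmetic once you log when work starts and when it reaches customers. A minimal sketch; the item names, dates, and two-event data shape are illustrative assumptions, not anything from the conversation:

```python
from datetime import datetime
from statistics import median

# Hypothetical event log: (item_id, idea_logged, shipped_to_customer).
items = [
    ("feat-1", datetime(2024, 3, 1), datetime(2024, 3, 12)),
    ("feat-2", datetime(2024, 3, 4), datetime(2024, 3, 8)),
    ("feat-3", datetime(2024, 3, 7), datetime(2024, 4, 2)),
]

# Lead time in days per item, measured over the broadest swath:
# from the idea being logged to the change reaching a customer.
lead_times = [(done - idea).days for _, idea, done in items]

# Median is more robust than the mean here; the max shows the tail
# that friction (or its removal) moves most visibly.
print("median lead time (days):", median(lead_times))
print("worst case (days):", max(lead_times))
```

Tracked before and after an AI-tooling or DevEx rollout, the same two numbers give the comparison she suggests, with the attribution caveat disclosed alongside them.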
- LRLenny Rachitsky
If people were starting to do this today, say they're just like, "I wanna start measuring developer experience," are there like a two or three metrics everybody basically needs that you should just start measuring ASAP?
- NFNicole Forsgren
If you're just starting today and if you have nothing at all, uh, talk to people, obviously. After that, I would do surveys. Uh, because surveys can give you a nice kind of overall view of the landscape quickly so that you know where the big kind of challenges are. And I say that because if you're just starting, you might not have instrumentation through your system or all the metrics. And if you do already, it might not be what you think you want, right? Metrics that were designed without purpose are questionable. Metrics that were designed for another purpose, they might work for what you want, but they might not. So we can't just assume we have them. So, that's one reason I like surveys, and, uh, we include an example in the book. You can just ask a few questions, right? How satisfied are you? Uh, what are the biggest barriers to your productivity, or what are the biggest challenges to getting work done? Um, and let them pick, you know, maybe from a set of tools or maybe, like, a set of, uh, processes, and then let them pick three. Just three. Of those three, how often does this affect you, right?
- LRLenny Rachitsky
Mm-hmm.
- NFNicole Forsgren
Is this hourly? Is this daily? Is this weekly? Is this quarterly, right? Because sometimes it hits you every single day and you're just mad about it. Sometimes it only hits you once a quarter 'cause it's end of quarter, but it's so onerous, right? And then kind of open text, right? Like, is there anything else we should know? Um, that can give you incredible signal, because by making folks prioritize the top three things... If you let them pick everything, it makes the data super, super messy, but three things and how often, you can just come up with a score or a weighted score if you want, and then go kind of dig into where that data should be, what data we need. But also then you've got at least some kind of baseline, right? It'll be a subjective baseline, but now you'll know what the biggest challenges are.
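The scoring she describes, top-three barriers weighted by how often each one bites, takes only a few lines. A sketch under stated assumptions: the frequency weights (roughly occurrences per quarter), barrier names, and responses below are all made up for illustration, not a scale from the book:

```python
from collections import defaultdict

# Assumed frequency weights: roughly how many times per quarter a
# barrier hits a developer. Tune these to your own survey scale.
WEIGHT = {"hourly": 90, "daily": 60, "weekly": 12, "quarterly": 1}

# Each response: one developer's top-3 barriers and how often each occurs.
responses = [
    {"flaky tests": "daily", "slow builds": "hourly", "provisioning": "quarterly"},
    {"flaky tests": "weekly", "slow builds": "daily", "code review wait": "daily"},
    {"slow builds": "daily", "provisioning": "weekly", "flaky tests": "daily"},
]

# Sum the frequency weight each time a barrier appears in someone's top 3.
scores: dict = defaultdict(int)
for top_three in responses:
    for barrier, freq in top_three.items():
        scores[barrier] += WEIGHT[freq]

# Rank barriers by weighted score: the subjective baseline to dig into.
for barrier, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(barrier, score)
```

Forcing the pick-three keeps the data clean, as she notes; the weighting then separates the daily annoyances from the once-a-quarter ones before you invest in instrumentation.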
- LRLenny Rachitsky
I love how all this just comes back to starting by talking to people, asking them these things, which is very similar to product management and just building great products is have you talked-
- NFNicole Forsgren
Yeah.
- LRLenny Rachitsky
... to your customers? And everyone thinks they're doing this, but most people are not doing this enough.
- 55:16 – 57:59
Survey design for developer experience
- NFNicole Forsgren
Yeah. Yeah, and I will say, like, one thing that's challenging when you start, when you think about getting data, right? So interviews are data, and that's important. Surveys are a little more quantified, right? 'Cause we can, we can turn it to counts. But that's where we also wanna be careful, right? A lot of folks go to write a survey question and they'll say something like, "Were the build and test systems slow or complicated in the last week?" You're asking four different questions there. If someone answers yes, was it the build, was it the test, was it slow, or was it, like, flaky or complicated or something, right? So it can be really difficult to untangle what the signal is you're actually getting there. And so it is worth time, uh, chatting with, with someone who's familiar with survey design, um, having a conversation with Claude or Gemini or ChatGPT around, uh, here are the survey questions or can you propose some? And then make sure you take a couple rounds. Is this a good survey question? What s- what questions can I answer f- from the data that I get? What problems could I solve? If you can't answer a question with data, don't get it.
- LRLenny Rachitsky
Hm. And you have example surveys in your book for folks-
- NFNicole Forsgren
Yes.
- LRLenny Rachitsky
... that wanna just copy and paste and not have to think about it as much.
- NFNicole Forsgren
Yeah. Example surveys, um, a lot of example questions. We even recommend, like, what the format, like how, what the flow should look like, how long it should be, how long it should not be.
- LRLenny Rachitsky
One thing that I was reading, uh, is that you don't love happiness surveys specifically asking engineers how happy they are. Is that true? If so, why is that?
- NFNicole Forsgren
I don't. Now, I will, I will say, I, I don't love a happiness survey, because there are too many things that contribute to happiness. Happiness is a lot, right? So happiness is work, happiness is family, happiness is hobbies, happiness is weekends, happiness... There are so many things that contribute to happiness. Now, that doesn't mean I don't care about happiness. I think happiness surveys are not particularly useful here. What can be helpful is satisfaction. And people are like, "Well, it's the same thing." It's not. Because you can ask, "Are you satisfied with this tool," right? And then ask some follow-up questions. Now, those two are related because the more satisfied you are with your job and your tools and the work and your team, it contributes to happiness. And I used to joke, remember the old commercials, like, "Happy cows make happy cheese?"
- LRLenny Rachitsky
Mm-hmm.
- NFNicole Forsgren
Out of California. That was the best. Um, happy devs make happy code. They, they write better programs, they do better work, um, they're, they're better team members and collaborators. But, but capturing and trying to directly influence happiness, like that's, that's, that's not what we're here for, right? And it's just, it's too challenging. It's too all-encompassing. Satisfaction can give us some
- 57:59 – 59:08
Popular AI tools for developers
- NFNicole Forsgren
signal.
- LRLenny Rachitsky
In a totally different direction, uh, in terms of just tools you see people using, are there any that just, like, oh, yeah, this one's really commonly, uh, great for people? This is just, like, a tool people are finding a lot of success with. Like, there's the common ones, Copilot, Cursor. Uh, I don't know. Is there anything that stands out that you, you wanna share of just like, "Hey, you should check this tool out. People seem to love it"?
- NFNicole Forsgren
I, I think the usual, right? Copilot, Cursor, Gemini.
- LRLenny Rachitsky
Claude Code.
- NFNicole Forsgren
Yep, Claude Code. I love Claude Code.
- LRLenny Rachitsky
I have an, I have a whole post coming on ways to use Claud- Claude Code for non-engineering use cases. It's so-
- NFNicole Forsgren
Ooh, nice.
- LRLenny Rachitsky
... it's so interesting. For example, Claude Code, uh, find ways to clean up storage on my laptop. And it just tells you-
- NFNicole Forsgren
Oh, that's great.
- LRLenny Rachitsky
... here's a bunch of files. Y- It's just like ChatGPT running on your computer. And it can do all kinds of crazy stuff on your computer for you like a little mini, mini god.
- NFNicole Forsgren
Well, I'm gonna do that now.
- LRLenny Rachitsky
And yeah.
- NFNicole Forsgren
This is great.
- LRLenny Rachitsky
It's so good. It's, yeah. That's why I'm writing this. Uh, I had a ... Dan Shipper's on the podcast and he said Claude Code is the most underrated AI tool out there 'cause people don't realize what it's capable of. It's not just for coding. And that's something I'm trying to explore more and more.
- 59:08 – 1:00:40
Bringing a product mindset to DevEx improvements
- LRLenny Rachitsky
Okay. Is there anything else that you think would be valuable for people to hear for, to help people improve their developer experience, help them adapt to this new world of AI, uh, and engineering that we haven't covered?
- NFNicole Forsgren
I think something that's important to think about in general is to bring a product mindset to any type of DevEx improvements that are happening. And also, the metrics that we kinda collect and capture. And by that, I mean we want to identify a problem, right? Make sure we're solving a problem for a set of users. We want to think about creating MVPs and experiments and get fast feedback, you know, some, do some, like, rapid iteration. We wanna have a strategy. We wanna know who our addressable market is. We wanna know what success is. We wanna basically have a go-to-market function, right? We, we need to have comms. We need to get continuous feedback from our customers. We wanna keep improving. Um, and at some point, we wanna think about, you know, sunsetting something, right? Is it in maintenance mode? Is it sunsetting? And I think that's important in general, but I think it's extra important now because when we have AI tools, we're using AI tools, we're embedding AI into our products. Things are changing so rapidly that it, it can be really important to take, like, half a beat and say, "Okay, what's the problem I'm trying to solve right here? Is this metric that we've had for the last 10 years still important or should this be sunset because it's, it's not really important anymore? It's not driving the types of decisions and actions that I need."
- 1:00:40 – 1:02:33
AI corner
- LRLenny Rachitsky
Before we get to our exciting lightning round, I wanna take us to AI Corner, which is a recurring segment on this podcast. Is there some way that you've found a, a use for an AI tool in your life, in your work that you think might be fun to share, that you think might be useful to other people?
- NFNicole Forsgren
So I, uh, have been kind of working on some home design and like, like-
- LRLenny Rachitsky
Mm-hmm.
- NFNicole Forsgren
... redecorating rooms and stuff. Um, I'm working with a designer because I know what I like, but I don't know how to get there. I'm not good at this. But I've really been loving...... ChatGPT, and Gemini especially, to render pictures for me, right? So I can give it the floor plan, I can give it one shot of the room that's, like, definitely not what it's supposed to look like, and then I can give it a co- pictures of a couple different things, and then I can just tell it, "Change the walls," or, "Change the furniture layout," or, "Change something." And it helps me, and it's relatively quick, um, it helps me kind of visualize the things. Again, I know what I like, but I don't know how to get there. So I know if I like it or not. Which is probably a very random use, but it's fun for now.
- LRLenny Rachitsky
My wife does exactly the same thing. She's sending me constantly, "Here's what this rug will look like in our living room. Here's this water feature."
- NFNicole Forsgren
(laughs)
- LRLenny Rachitsky
Uh, it's so good, and it keeps getting better. It's just like-
- NFNicole Forsgren
Yeah.
- LRLenny Rachitsky
... "Wow, that's exactly our house with this new rug."
- NFNicole Forsgren
Yeah.
- LRLenny Rachitsky
Uh, and all you do is just upload these two photos and just like, "Well, how would this look in-"
- NFNicole Forsgren
Yeah.
- LRLenny Rachitsky
"... in that room?"
- NFNicole Forsgren
I've been impressed a couple times. I mean, definitely the machines are listening to us. Um, it's given me a mock-up of, of a room or something, and then it throws in a dog bed, uh, 'cause I have, I have dogs. I'm like, "I did not tell you to do that, but yeah, that's probably the color, color and style of dog bed that I should have in this room." (laughs)
- LRLenny Rachitsky
And speaking of that, have you tried this use case? Ask ChatGPT, uh, generate an image of what you think my house looks like based on everything you know about me.
- NFNicole Forsgren
I haven't.
- LRLenny Rachitsky
'Cause it remem- it is memory, it re- remembers everything you've talked about, and it's hilarious. You gotta do it.
- NFNicole Forsgren
Okay. That's, that's on my to-do list.
- LRLenny Rachitsky
There we go. Uh, uh, bonus
- 1:02:33 – 1:07:47
Lightning round and final thoughts
- LRLenny Rachitsky
use case. Nicole, with that, we've reached our very exciting lightning round. I've got five questions for you. You ready?
- NFNicole Forsgren
Awesome. Let's go.
- LRLenny Rachitsky
What are two or three books that you find yourself recommending most to other people?
- NFNicole Forsgren
Outlive by, uh, Peter Attia is fantastic. Another one that's I guess maybe related, I hurt my back, um, so, like, it's not great. Back Mechanic by Stuart McGill is incredible. So, shout out to anyone who has a hurt lower back. Um, it's for a layperson to read through and, like, figure out how to fix lower back problems. It's kind of a random one. I will say, I love How Big Things Get Done. I can't pronounce the authors' names. I think one of them is Scandinavian. Um, but it's, it kind of dissects really large projects through recent-ish history, and where they failed and why. And I think it's really interesting for us to think about, especially now in this AI moment where basically all of our, at least software systems are gonna be changing. So how do we think about approaching what is essentially gonna be a very large project? Um, and then, sorry, I'm gonna throw in a bonus one, The Undoing Project by Michael Lewis. Uh, Matt Veloso recommended it to me, and it's so good.
- LRLenny Rachitsky
Yes. Uh, I read that.
- NFNicole Forsgren
I was aghast at the last sentence.
- LRLenny Rachitsky
Oh. Of the book?
- NFNicole Forsgren
I was not, I was not-
- LRLenny Rachitsky
Wow.
- NFNicole Forsgren
... yeah. I was not expecting it.
- LRLenny Rachitsky
I read that, and I, I do not remember that last sentence.
- NFNicole Forsgren
Oh, man.
- LRLenny Rachitsky
Oh, man. (laughs) Okay, cool. Uh, next question. Do you have a favorite movie or TV show you've recently watched and enjoyed?
- NFNicole Forsgren
I will say I watch Love is Blind. I, if I gotta, like, shut down at the end of the day, Love is Blind is fun.
- LRLenny Rachitsky
There's a new season out.
- NFNicole Forsgren
Yeah, I'm very excited. Um, and Shrinking. Shrinking. Have you seen Shrinking?
- LRLenny Rachitsky
No, I, I, I think I started it, the one about the therapist, and yeah, I gave it a shot.
- NFNicole Forsgren
Strong recommend.
- LRLenny Rachitsky
And didn't, didn't stick.
- NFNicole Forsgren
It's cute.
- LRLenny Rachitsky
Okay, okay. (laughs) Sweet. Is there a product you've recently discovered that you really love? Could be an app, could be some kitchen gadget, some clothing.
- NFNicole Forsgren
Uh, yeah, the Ninja Creami is-
- LRLenny Rachitsky
Did you say this last time?
- NFNicole Forsgren
... the new camera. I don't know. I mean, I don't think so.
- LRLenny Rachitsky
Somebody said this, and I still remember it. It's like a-
- NFNicole Forsgren
It's so good.
- LRLenny Rachitsky
You make ice cream and stuff with it, right?
- NFNicole Forsgren
Yeah, you can basically freeze a protein shake and then it turns it into ice cream-
- LRLenny Rachitsky
Oh, man.
- NFNicole Forsgren
... which is delicious. Um-
Episode duration: 1:07:47