Lex Fridman Podcast

Chris Lattner: Compilers, LLVM, Swift, TPU, and ML Accelerators | Lex Fridman Podcast #21

Lex Fridman and Chris Lattner on Compilers, Swift, TPUs, and ML's Future Infrastructure.

Lex Fridman (host) · Chris Lattner (guest)
May 13, 2019 · 1h 13m


  1. 0:00–15:00

    1. LF

      The following is a conversation with Chris Lattner. Currently, he's a senior director at Google working on several projects, including CPU, GPU, TPU accelerators for TensorFlow, Swift for TensorFlow, and all kinds of machine learning compiler magic going on behind the scenes. He's one of the top experts in the world on compiler technologies, which means he deeply understands the intricacies of how hardware and software come together to create efficient code. He created the LLVM Compiler Infrastructure Project and the Clang compiler. He led major engineering efforts at Apple, including the creation of the Swift programming language. He also briefly spent time at Tesla as vice president of Autopilot Software during the transition from Autopilot Hardware 1.0 to Hardware 2.0, when Tesla essentially started from scratch to build an in-house software infrastructure for Autopilot. I could have easily talked to Chris for many more hours. Compiling code down across the levels of abstraction is one of the most fundamental and fascinating aspects of what computers do, and he is one of the world experts in this process. It's rigorous science, and it's messy, beautiful art. This conversation is part of The Artificial Intelligence Podcast. If you enjoy it, subscribe on YouTube, iTunes, or simply connect with me on Twitter @lexfridman, spelled F-R-I-D. And now here's my conversation with Chris Lattner. What was the first program you've ever written?

    2. CL

      My first program-

    3. LF

      Hmm, back. And when was it?

    4. CL

      I think I started as a kid, and my parents got a BASIC programming book. And so when I started, it was typing out programs from a book, and seeing how they worked, and then typing them in wrong and trying to figure out why (laughs) they were not working right, that kinda stuff.

    5. LF

      So BASIC. What was the first language that you remember yourself maybe falling in love with, like really connecting with?

    6. CL

      Oh. I don't know, I mean, I feel like I've learned a lot along the way and, and each of them have a different special thing about them. So I started in BASIC and then went, like, GW-BASIC, which was the thing back in the DOS days, and then upgraded to QBasic and eventually QuickBASIC, which are all slightly more fancy versions of Microsoft BASIC. Um, made the jump to Pascal and started doing machine language programming in Assembly in Pascal, which was really cool. Turbo Pascal was amazing for its day. Um, eventually got into C, C++, and then kinda did lots of other weird things, so.

    7. LF

      I feel like you took the dark path, which is the, uh... You coulda, you coulda gone Lisp.

    8. CL

      Yeah, yeah.

    9. LF

      You coulda gone higher level-

    10. CL

      Yeah.

    11. LF

      ... sort of functional, philosophical, hippie route.

    12. CL

      Yeah.

    13. LF

      Instead, you went into, like, the dark arts-

    14. CL

      Yeah.

    15. LF

      ... of the C++.

    16. CL

      It was straight, straight into the machine, right?

    17. LF

      Straight to the machine.

    18. CL

      And so I s- so started with BASIC, Pascal, and then Assembly, and then wrote a lot of Assembly. And, um-

    19. LF

      Why? How?

    20. CL

      Eventually, eventually did Smalltalk and other things like that.

    21. LF

      Yeah. (laughs)

    22. CL

      But that was not the starting point.

    23. LF

      But, so what, uh, what is this journey into C? Is that in high school? Is that in college?

    24. CL

      That was in high school, yeah. So, and, and then that was, uh, it was really about trying to be able to do more powerful things than what Pascal could do and also to learn a different world. C was really confusing to me with the pointers and the syntax and everything, and it took a while. But Pascal's much more principled in various ways. C is more the, I mean, it has its historical roots, but it's, it's not as easy to learn.

    25. LF

      With pointers, there's this memory management thing that you have to become conscious of. Is that the first time you start to understand that there's resources that you're supposed to manage?

    26. CL

      Yeah. Uh, well, so you have that in, in Pascal as well, but in Pascal, they s- like the caret instead of the star and thing. There's some small differences like that, but it's not about, uh, pointer arithmetic. In p- in, in C, it, you end up thinking about how things get laid out in memory a lot more.

    27. LF

      Mm-hmm.

    28. CL

      And so in Pascal, you have allocating and deallocating and owning the, the memory, but just the programs are simpler and you don't have to... Well, for example, Pascal has a string type, and so you can think about a string instead of an array of characters which are consecutive in memory, yeah? So it's a little bit of a higher level abstraction.

    29. LF

      So, uh, let's get into it. Let's talk about-

    30. CL

      Yeah.

  2. 15:00–30:00

    1. CL

      do more things, and then setting new goals and reaching for them, and ... um, and with ... in the case of comp- in the case of, uh, LLVM when I started working on that, my research advisor that I was working for was a compiler guy, and so I spec- he and I specifically found each other because we were both interested in compilers, and so I started working with him and taking his class, and a lot of LLVM initially was it's fun implementing all the standard algorithms-

    2. LF

      Mm-hmm.

    3. CL

      ... and all the, all the things that people had been talking about and were well-known and they were in the, the curricula for, uh, advanced studies in compilers. And so, just being able to build that was really fun and I was learning a lot by instead of reading about it, just building. And so I, I enjoyed that.

    4. LF

      So you said compilers are these complicated systems. Can you even just, with language, try to describe, uh, you know, how you turn a C++ program-

    5. CL

      Yes.

    6. LF

      ... into code? Like, what are the hard parts?

    7. CL

      Yeah, yeah.

    8. LF

      Why is it so hard?

    9. CL

      So I'll, I'll give you examples of the hard parts along the way. So C++ is a very complicated programming language. It's something like 1,400 pages in the spec.

    10. LF

      Mm-hmm.

    11. CL

      So C++ by itself is crazy complicated.

    12. LF

      Can we- can we just, sorry, pause. What makes the language complicated in terms of, uh, what's syntactically ... like-

    13. CL

      Uh, s- so it's what they call syntax, so the actual how the characters are arranged, yes. It's also semantics, how it behaves. Um, it's also, in the case of C++, there's a huge amount of history. C++ built on top of C, you play that forward, and then a bunch of suboptimal, in some cases, decisions were made, and they compound, and then more and more and more things keep getting added to C++, and it will probably never stop. (laughs) But the, the language is very complicated from that perspective, and so the interactions between subsystems is very complicated. There's just a lot there, and when you talk about the front end, one of the major challenges which Clang, as a project, the C/C++ compiler that I built, I and many people built ... um, one of the challenges we took on was we, we looked at GCC. Okay, GCC at the time was, like, a really good industry standardized compiler that had really consolidated a lot of the other compilers in the world and was, was a standard, but it wasn't really great for research. Um, the design was very difficult to work with, and it was full of global variables and other, other things that made it very difficult to reuse in ways that it wasn't originally designed for. And so with Clang one of the things that we wanted to do is push forward on better user interface, so make error messages that are just better than GCC's-

    14. LF

      Mm-hmm.

    15. CL

      ... and that, that's actually hard because you have to do a lot of bookkeeping in an efficient way to be able to do that. We wanted to make compile time better, and so compile time is about making it efficient, which is also really hard when you're keeping track of extra information. We wanted to make new tools available, so refactoring tools and other analysis tools that, that GCC never supported, also leveraging the extra information we kept, um, but enabling those new classes of tools that then get built into IDEs. And so that's been one of the, one of the areas that Clang has really helped, uh, push the world forward in, is, uh, in the tooling for C and C++ and things like that. But C++ in the front end piece is complicated, and you have to build syntax trees, and you have to check every rule in the spec, and you have to turn that back into an error message to the human, that the human can understand when they do something wrong. But then you start doing the, uh, what's called lowering, so going from C++ in the way that it represents code down to the machine.

    16. LF

      Mm-hmm.

    17. CL

      And when you do that, there's many different phases you go through. Often, there are, I think LLVM has something like 150 different, what are called, passes in the compiler that the code pass- passes through. And these get organized in very complicated ways, which affect the generated code and the performance and compile time and many other things.
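A pass pipeline like the one described can be sketched in miniature. This is a toy illustration in Python, not LLVM's actual API: the IR here is just a list of (dest, op, a, b) tuples, and each pass is a function that rewrites that list; the instruction format and pass are invented for the example.

```python
# A miniature compiler pass pipeline (toy sketch, not LLVM's real API).

def constant_fold(instrs):
    """Fold add/mul of two known constants and substitute the result."""
    out, env = [], {}
    for dest, op, a, b in instrs:
        a, b = env.get(a, a), env.get(b, b)   # substitute already-folded values
        if isinstance(a, int) and isinstance(b, int) and op in ("add", "mul"):
            env[dest] = a + b if op == "add" else a * b
        else:
            out.append((dest, op, a, b))
    return out

def run_pipeline(instrs, passes):
    for p in passes:        # passes run in a fixed order, like a pass manager
        instrs = p(instrs)
    return instrs

ir = [("t1", "add", 2, 3),       # t1 = 2 + 3   -> folded to 5
      ("t2", "mul", "t1", "x")]  # t2 = t1 * x  -> becomes 5 * x
print(run_pipeline(ir, [constant_fold]))  # [('t2', 'mul', 5, 'x')]
```

LLVM's roughly 150 passes are each this shape in spirit: a transformation over the IR whose output feeds the next pass, which is why their ordering affects the generated code.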

    18. LF

      What are they passing through? So after you do the, the Clang, um, parsing, what's, what's the... is, is it a graph?

    19. CL

      What's the-

    20. LF

      What does it look like?

    21. CL

      Mm-hmm.

    22. LF

      What's the data structure here?

    23. CL

      Yeah, so in, in the parser, it's usually a tree, and it's called an abstract syntax tree. And so the idea is you, you have a node for the plus that the human wrote in their code, or the function call, you'll have a node for call with, uh, the function that they call and the arguments they pass, things like that. This then gets lowered into what's called an intermediate representation. And intermediate representations are like, LLVM has one. And there, it's a, um, it's what's called a control flow graph. And so you represent each operation in the program as a very simple, like, this is gonna add two numbers, this is gonna multiply two things, this maybe will do a call. But then they get put in what are called blocks. And so you get blocks of these straight-line operations, where instead of being nested like in a tree, it's s- straight-line operations. And so there's a sequence and an ordering to these operations. And then you have-

    24. LF

      Is that within the block or outside the block, uh?

    25. CL

      That, that's within the block.
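The abstract syntax tree just described can be sketched in Python: one node class per construct, nested the way the source is nested. A generic illustration; Clang's real AST nodes carry far more (types, source locations, and so on), and the class names here are invented.

```python
from dataclasses import dataclass

@dataclass
class Num:
    value: int

@dataclass
class Var:
    name: str

@dataclass
class Add:          # the node for the plus the human wrote
    left: object
    right: object

@dataclass
class Call:         # the node for a function call and its arguments
    func: str
    args: list

# The tree for the expression  f(x + 1, 2)
tree = Call("f", [Add(Var("x"), Num(1)), Num(2)])

def show(node):
    """Walk the tree and print it back as source-like text."""
    if isinstance(node, Num):
        return str(node.value)
    if isinstance(node, Var):
        return node.name
    if isinstance(node, Add):
        return f"({show(node.left)} + {show(node.right)})"
    return node.func + "(" + ", ".join(show(a) for a in node.args) + ")"

print(show(tree))  # f((x + 1), 2)
```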

    26. LF

      Okay, let me-

    27. CL

      And so it's a straight-line sequence of operations within the block. And then you have branches, like conditional branches between blocks. And so when you write a loop, for example, um, in a syntax tree, you would have a for node, like for a for statement in a C-like language, you'd have a for node, and you have a pointer to the expression for the initializer, a pointer to the expression for the increment, a pointer to the expression for the comparison, a pointer to the body, okay? And these are all nested underneath it. In a control flow graph, you get a block for th- the code that runs before the loop-

    28. LF

      Mm-hmm.

    29. CL

      ... so the initializer code. Then you have a block for the body of the loop. And so the, the body of the loop code goes in there, but also the increment and other things like that. And then you have a branch that goes back to the top, and a comparison and a branch that goes out. And so it's more of a assembly level kind of representation. But the nice thing about this level of representation is it's much more language independent. And so there's m- lots of different kinds of languages with different kinds of, um, you know, JavaScript has a lot of different ideas of what is false, for example. And all that can stay in the front end, but then that middle part can be shared across all those.
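The loop lowering just described can be sketched as a toy control-flow graph plus a tiny interpreter. This is an illustration in Python, not LLVM's real IR; the block layout (entry, header, body, exit) follows the structure described above, and the op format is invented for the sketch.

```python
# A toy control-flow graph for:  for (i = 0; i < n; i++) { sum = sum + i; }

cfg = {
    "entry":  {"ops": ["i = 0", "sum = 0"], "next": "header"},
    "header": {"ops": ["cond = i < n"], "branch": ("cond", "body", "exit")},
    "body":   {"ops": ["sum = sum + i", "i = i + 1"], "next": "header"},
    "exit":   {"ops": [], "next": None},
}

def interpret(cfg, n):
    """Walk the graph like a tiny abstract machine."""
    env, block = {"n": n}, "entry"
    while block is not None:
        b = cfg[block]
        for op in b["ops"]:                   # straight-line ops in the block
            dest, expr = op.split(" = ", 1)
            env[dest] = eval(expr, {}, env)   # toy evaluator, fine for a sketch
        if "branch" in b:                     # conditional branch between blocks
            cond, then_block, else_block = b["branch"]
            block = then_block if env[cond] else else_block
        else:
            block = b["next"]
    return env["sum"]

print(interpret(cfg, 5))  # 0+1+2+3+4 = 10
```

Note how the initializer, body, increment, and comparison from the nested syntax tree have been flattened into blocks with branches between them.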

    30. LF

      How close is that intermediate representation to, uh, to neural networks, for example? Is there, are they, uh... c- 'cause everything you described is a kinda-

  3. 30:00–45:00

    1. CL

      and transparency, or portability problems, I think it's been really good. Now, Java ultimately didn't win out on the desktop.

    2. LF

      Right.

    3. CL

      And, like, there are good reasons for that. But, um, it's been very successful on servers and in many places it's been a very successful thing over, over decades.

    4. LF

      So what, uh, have been, uh, LLVM's and, uh, Clang's, uh, improvements in optimization throughout its history? What, what are some moments where you had to sit back, really proud of what's been accomplished?

    5. CL

      Yeah, I think that the (sighs) interesting thing about LLVM is not the innovations in compiler research. It has very good implementations of various important algorithms, no doubt. Um, and a- and a lot of really smart people have worked on it. But I think that the thing that's most profound about LLVM is that through standardization, it- it made things possible that otherwise wouldn't have happened, okay? And so interesting things that have happened with LLVM, for example, Sony has picked up LLVM and used it to do all the graphics compilation in their movie production pipeline.

    6. LF

      Mm-hmm.

    7. CL

      And so now they're able to have better special effects because of LLVM. That's kinda cool. That's not what it was designed for (laughs) , right? But that's- that's the sign of good infrastructure when, uh, it can be used in ways it was never designed for because it has good layering and software engineering and it's composable and- and things like that.

    8. LF

      Which is where, as you said, it differs from GCC.

    9. CL

      Yes. GCC is also great in various ways, but it's not as good as a infrastructure technology. It's- it's, you know, it's really a C compiler or it's- or it's a Fortran compiler. It's not- it's not infrastructure in the same way.

    10. LF

      Uh, is it... Now, you can tell I don't know what I'm talking about because I'm s- I keep saying Clang. You can- you could always t- tell (laughs) when a person is clueless by the way they pronounce something. Um, I don't think... Have I ever used Clang?

    11. CL

      Uh, entirely possible. Have you... Well, so you've used code it's generated, probably. So Clang and LLVM are used to compile all the apps on the iPhone effectively-

    12. LF

      Mm-hmm.

    13. CL

      ... and the OS's. It compiles Google's production server applications. It's used to build, like, GameCube games and PlayStation 4 and things like that, up till-

    14. LF

      As a user I have. But, uh-

    15. CL

      Yeah.

    16. LF

      ... just everything I've done, uh, that I experienced through Linux has been, I believe, always GCC.

    17. CL

      Yeah, I think Linux still defaults to GCC and-

    18. LF

      And is there a reason for that? Or is it bec- I mean, is there a reason for that?

    19. CL

      It's- it's a combination of technical and social reasons. Um, many GC- or many Linux developers do use- do use Clang. Um, but the distributions for lots of reasons, uh, have used GCC historically, and they've not switched.

    20. LF

      Yeah. 'Cause I- and just anecdotally online, it seems that LLVM has either reached the level of GCC or superseded on different features or whatever, so-

    21. CL

      The way I would say it is that they're with- they're so close, it doesn't matter (laughs) .

    22. LF

      Yep, exactly.

    23. CL

      Like, they're slightly better in some ways, slightly worse in other ways. But n- it doesn't actually really matter anymore, um, at that level.

    24. LF

      So in terms of optimization breakthroughs, it's just been solid incremental work is what you're saying?

    25. CL

      Yeah, yeah.

    26. LF

      Yeah.

    27. CL

      Which- which is- which describes a lot of compilers. The hard- the hard thing about compilers, in my experience, is the engineering, the s- the software engineering, making it so that you can have hundreds of people collaborating on really detailed low level work and scaling that. And that's- that's really hard, and that's one of the things I think LLVM has done well. Uh, and that kinda goes back to the original design goals with it to be modular and things like that. And- and incidentally, I don't wanna take all the credit for this, right? I mean, some of the- the- the best parts about LLVM is that it was designed to be modular. And when I started, I would write, for example, a register allocator, and then somebody much smarter than me would come in and pull it out and replace it with something else that they would come up with. And because it's modular, they were able to do that. And that's one of the challenges with- with GCC, for example, is replacing subsystems is- is incredibly difficult. It- it- it can be done, but it wasn't designed for that. And that's one of the reasons that LLVM has been very successful in the research world as well.

    28. LF

      But in a- in a community sense, Guido van Rossum, right? From Python, uh, just retired from... what is it? Benevolent Dictator For Life, right? Uh, so in managing this community of brilliant compiler folks, is there- uh, d- did it, for a time at least, fall on you to- to approve things?

    29. CL

      Oh yeah. So I- I mean, I still have something like an order of magnitude more patches in LLVM than anybody else. Um, and, uh, m- many of those I wrote myself (laughs) . But-

    30. LF

      So you still write... I mean, you still- you still close to the- to the... I don't know what the expression is, to the metal. You still write code, you still, uh...

  4. 45:00–1:00:00

    1. CL

      a bit memory constrained, right? And so being able to compile the code and then ship it, and then have, having standalone code that is not JIT compiled was, is, is a very big deal, and it's very much part of the Apple, um, value system.

    2. LF

      Okay.

    3. CL

      Now, JavaScript's also a thing, right? I mean, it's not, it's not that this is exclusive, and technologies are good depending on how they're applied, right? Um, but in the design of Swift saying like, "How can we make Objective-C better," right? Objective-C was statically compiled, and that was, uh, the contiguous natural thing to do.

    4. LF

      Just to skip ahead a little bit. We'll, we'll come right back, just, just as a question. As you think about today in 2019-

    5. CL

      Yeah.

    6. LF

      ... uh, in your work at Google, TensorFlow and so on, is again compilations, static compilation still the-

    7. CL

      The right thing?

    8. LF

      ... the, still the right thing?

    9. CL

      Yeah, so the, the funny thing, after working on compilers for a really long time, is that, uh, and one of... this is one of the things that LLVM has helped with, is that I don't look at compilation as being static or dynamic or interpreted or not. This is a spectrum, okay? And one of the cool things about Swift is that Swift is not just statically compiled.

    10. LF

      Mm-hmm.

    11. CL

      It's actually dynamically compiled as well, and it can also be interpreted, though nobody's actually done that. Um, and so what, what ends up happening when you use Swift in a workbook, uh, for example, in Colab or in Jupyter, is it's actually dynamically compiling the statements as you execute them. And so this gets back to the I- the software engineering problems.

    12. LF

      Mm-hmm.

    13. CL

      Right? Where if you layer the stack properly, you can actually completely change how and when things get compiled because you have the right abstractions there. And so the way that a Colab workbook works with Swift is that, um, when we start typing into it, it creates a process, a Unix process, and then each line of code you type in, it compiles it through the Swift compiler, the, the front end part, and then sends it through the optimizer, JIT compiles machine code, and then injects it into that process. And so as you're typing new stuff, it's putting, s- it's like squirting in new code-

    14. LF

      Yeah.

    15. CL

      ... and overwriting and replacing and updating code in place. And the fact that it can do this is not an accident. Like, Swift was designed for this. Um, but it, i- it's an important part of how the language was set up and how it's layered and, and this is a non-obvious piece. And one of the things with Swift that was, uh, for me, a very strong design point, is to make it so that you can learn it very quickly. And so from a language design perspective, the thing that I always come back to is this UI principle of, um, progressive disclosure of complexity.
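The workbook flow described above can be mimicked with Python's own compile-and-execute machinery. This is only an analogy: Swift in Colab JIT-compiles real machine code and injects it into a Unix process, while this sketch just reuses one persistent namespace; the function names here are invented for the example.

```python
session = {}  # the long-lived "process state"

def run_cell(source):
    code = compile(source, "<cell>", "exec")  # per-cell compilation step
    exec(code, session)                       # inject into the live session

run_cell("def greet(): return 'hello'")
run_cell("x = greet()")
run_cell("def greet(): return 'hello again'")  # redefine code in place
run_cell("y = greet()")
print(session["x"], "|", session["y"])  # hello | hello again
```

As in the Swift workbook, later cells see and can overwrite what earlier cells defined, because every cell is compiled against the same live state.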

    16. LF

      (laughs) Yeah.

    17. CL

      And so in Swift, you can start by saying print quote hello world quote, right? And there's no slash n, just like Python, one line of code, no main, no...

    18. LF

      No header files.

    19. CL

      N- no, no header files, no public static class void, blah, blah, blah, string, like Java has, right? It's one line of code, right? And you can teach that and it works great. Um, then you can say, "Well, let's introduce variables." And so you can declare a variable with var, so var X equals four. What is a variable? You can use X, X plus one. This is what it means. Then you can say, "Well, how about control flow?" Well, this is what an if statement is. This is what a for statement is. This is what a while statement is. Um, then you can say, "Let's introduce functions," right? And, and many languages like Python have had this, this kind of notion of let's introduce small things and then you can add complexity, then you can introduce classes, and then you can add generics in the case of Swift, and then you can build in modules and build out in terms of the things that you're expressing. But, um, this is not very typical for compiled languages.
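Since Chris notes Python has long had this shape too, the teaching progression he lists can be rendered in Python terms: each stage adds one concept, and nothing earlier has to be explained away.

```python
print("hello world")        # stage 1: one line, no main, no headers

x = 4                       # stage 2: variables
x = x + 1

if x > 4:                   # stage 3: control flow
    for i in range(2):
        print(i)

def double(n):              # stage 4: functions
    return n * 2

class Counter:              # stage 5: classes (Swift adds generics after this)
    def __init__(self):
        self.count = 0
    def tick(self):
        self.count += 1
        return self.count

c = Counter()
print(double(x), c.tick())  # 10 1
```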

    20. LF

      Mm-hmm.

    21. CL

      And so this was a very strong design point and one of the reasons that, um, Swift in general is designed with this factoring of complexity in mind so that the language can express powerful things. You can write firmware in Swift (laughs) if you want to.

    22. LF

      Yeah.

    23. CL

      Um, but, uh, it has a very high level feel, which is really, uh, this perfect blend because often you have very advanced library writers that wanna be able to use the, the nitty-gritty details, but then other people just wanna use the libraries and work at a higher abstraction level.

    24. LF

      It's kinda cool that I saw that you can just, uh, interoperability... I don't think I pronounced that word right. But, uh, you can just drag in Python. It's just strange, you can import, like, I saw this in the demo-

    25. CL

      Yeah.

    26. LF

      ... importing NumPy. Like how do you make that happen?

    27. CL

      Yeah, well-

    28. LF

      What's, what's up with-

    29. CL

      Yeah, so-

    30. LF

      Is that, is, is that as easy as it looks? Or is it-

  5. 1:00:00–1:13:11

    1. CL

      the research side and people working on optimizing network transport of, uh, weights (laughs) across the network originally, and trying to find ways to compress that. But then it got burned into silicon, and it's a key part of what makes TPU performance so amazing and s- and, and great. Now, TPUs have many different aspects of it that are, that are important, but the, uh, the co-design between the low-level compiler bits and the software bits and the algorithms is all super important, and it's a, this amazing trifecta that only Google can do. (laughs)
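One simple form of the weight compression Chris mentions is linear 8-bit quantization. This is a generic sketch, not the specific scheme that got burned into the TPU: each float weight maps to an integer in [-127, 127] plus one shared scale, so a tensor travels at roughly a quarter the size of float32.

```python
def quantize(weights):
    """Map floats to small ints with one shared scale factor."""
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

def dequantize(quantized, scale):
    """Recover approximate floats from the compressed form."""
    return [q * scale for q in quantized]

w = [0.5, -1.27, 0.003, 1.0]
q, scale = quantize(w)
restored = dequantize(q, scale)
print(q)  # small integers, e.g. [50, -127, 0, 100]
print(max(abs(a - b) for a, b in zip(w, restored)))  # round-off below one scale step
```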

    2. LF

      (laughs) Yeah, that's super exciting. So can you tell me about MLIR project, previously the s- the secretive one?

    3. CL

      Yeah, so MLIR is a project that we announced at a compiler conference three weeks ago or something, at the Compilers for Machine Learning conference. Basically if you, again, if you look at TensorFlow as a compiler stack, it has a number of compiler algorithms within it. It also has a number of compilers that get embedded into it, and they're made by different vendors. For example, uh, Google has XLA, which is a great compiler s- system. Nvidia has TensorRT. Intel has nGraph. Um, there, there's a number of these different compiler systems, and, um, they're very hardware specific and they're trying to solve different parts of the problems, um, but they're all kind of...

    4. CL

      ... similar in a sense of they wanna integrate with TensorFlow. Now, TensorFlow has an optimizer, and it has these different code generation technologies built in. The idea of MLIR is to build a common infrastructure to support all these different subsystems. And initially, it's to be able to make it so that they all plug in together, and they can share m- a lot more code and can be reusable. But over time, we hope that the industry will, uh, start collaborating and sharing code. And instead of reinventing the same things over and over again, that we can actually foster some of that, that, uh, you know, working-together-to-solve-common-problems energy that has been useful in the compiler field before. Beyond that, MLIR is, uh ... Some people have joked that it's kind of LLVM 2.0. It learns a lot about what LLVM has done well and what LLVM has done wrong, and it's a chance to fix that. Um, and also there are challenges in the LLVM ecosystem as well, where LLVM is very good at the thing it was designed to do but, you know, 20 years later, the world has changed, and people are trying to solve higher level problems, and we need, we need some new tech- technology.
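The "common infrastructure" idea can be sketched as a plug-in registry: vendor backends (XLA, TensorRT, nGraph in the discussion) each lower one shared graph representation instead of reimplementing the whole stack. A toy analogy only; MLIR's real dialects and passes are far richer, and all names below are invented.

```python
backends = {}

def register_backend(name):
    """Let any vendor plug a lowering into the shared infrastructure."""
    def wrap(fn):
        backends[name] = fn
        return fn
    return wrap

# One shared graph format: a list of (op, inputs) pairs.
graph = [("matmul", ("a", "b")), ("relu", ("t0",))]

@register_backend("cpu")
def lower_cpu(graph):
    # generic op-by-op lowering
    return [f"cpu.{op}(" + ", ".join(inputs) + ")" for op, inputs in graph]

@register_backend("accelerator")
def lower_accel(graph):
    # a hardware-specific backend can fuse ops into custom instructions
    return ["accel.fused_matmul_relu(a, b)"]

for name, lower in backends.items():
    print(name, "->", lower(graph))
```

The point of the shared layer is that the graph format, and everything built on it, is written once; only the lowering at the bottom is vendor-specific.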

    5. LF

      And, uh, what's the future of open source in, in this context?

    6. CL

      Uh, very soon. So it is not yet open source, but it will be, hopefully, in the next couple of months.

    7. LF

      So you still believe in the value of open source and these kinds of-

    8. CL

      Oh yeah, absolutely. And I think that the TensorFlow community at large fully believes in open source.

    9. LF

      So, I mean, that's, uh ... There is a difference between Apple, where you were previously, and Google now-

    10. LF

      ... in spirit and culture. And I would say the open sourcing of TensorFlow was a seminal moment in the history of software 'cause-

    11. CL

      S-

    12. LF

      ... here's this large company releasing a very large code base-

    13. CL

      Yes.

    14. LF

      ... that's, uh, open sourcing. What's, what are your thoughts on that? Uh, h- how ha- happy or not were you to see that kind of degree of open sourcing?

    15. CL

      So between the two, I prefer the Google approach, if that's what you're saying. The Apple approach makes sense given the historical context that Apple came from, but that was 35 years ago.

    16. LF

      (laughs)

    17. CL

      And I think that Apple is definitely adapting. And the way I look at it is that there's different kinds of concerns in this space, right? It is very rational for a business to con- to care about making money. Like, that fundamentally is what a business is about, right?

    18. LF

      Yes.

    19. CL

      But I think it's also incredibly realistic to say, "It's not your string library that's the thing that's gonna make you money. It's gonna be the amazing UI product differentiating features and other things like that that you build on top of your string library." And so, um, keeping your string library proprietary and secret and things like that is n- m- maybe not the, the important thing anymore, right? Where before, platforms were, were different, right? And even 15 years ago, it w- things were a little bit different. But the world is changing. So Google strikes a very good balance, I think. And, um, I think that TensorFlow being open source really changed the entire machine learning field and re- it caused a revolution in its own right. And so I think it's amazing forw- amazingly forward-looking because, um, I could have imagined, and I wasn't at Google at the time, but I could imagine a different context and a different world where a company says, "Machine learning is critical to what we're doing. We're not gonna give it to other people," right? And so that decision is a profound, a profoundly brilliant insight that I think has really led to the world being better and better for Google as well.

    20. LF

      And, uh, has all kinds of ripple effects. I think it is really ... I mean, th- you can't understate Google deciding that. How profound that is for software is, is awesome.

    21. CL

      Well, and, and it's been ... And again, I can understand the, the, the concern about, "If we release our machine learning software, our, our competitors could go faster." But on the other hand, I think that open sourcing TensorFlow has been fantastic for Google.

    22. LF

      Yeah.

    23. CL

      And, um, it w- ... I'm sure that, that decision was very non-obvious at the time, but I think it's worked out very well.

    24. LF

      So let's try this real quick.

    25. CL

      Yeah.

    26. LF

      You were at Tesla for five months as the VP of Autopilot Software. You led the team during the transition from Hardware 1.0 to Hardware 2.0. I have a couple questions. So one, first of all, uh, to me, that's one of the bravest engineering decisions, uh, undertaken really ever in the automotive industry: software-wise, starting from scratch. It's a really brave engineering decision. So my one question is, what was that like? What was the challenge of that?

    27. CL

      Do you mean the c- the career decision of jumping from a comfortable good job into the unknown, or?

    28. LF

      Uh, that combined. So the ... At the individual level, you making that (laughs) decision. And then when you show up, you know, it's a really hard engineering problem.

    29. CL

      Yes.

    30. LF

      So you could have just stayed, maybe slowed down, stayed with Hardware 1.0, those kinds of decisions. Instead, taking it full on, "Let's, let's do this from scratch." What was that like?

Episode duration: 1:13:05


Transcript of episode yCd3CzGSte8
