The Joe Rogan Experience

Joe Rogan Experience #2044 - Sam Altman

Sam Altman is the CEO of OpenAI, an artificial intelligence research and development company. www.openai.com

Sam Altman (guest) · Joe Rogan (host)
Jun 27, 2024 · 2h 36m


  1. 0:00–15:00


    1. SA

      (drumming) Joe Rogan podcast. Check it out. The Joe Rogan Experience.

    2. JR

      Train by day, Joe Rogan podcast by night. All day. (heavenly music) Hello, Sam. What's happening?

    3. SA

      Not much. How are you?

    4. JR

      Thanks for coming in here. Appreciate it.

    5. SA

      Thanks for having me.

    6. JR

      So, what have you done? (laughs)

    7. SA

      Like, ever?

    8. JR

      No. I mean, what have you done with AI? I mean, it's, um ... one of the things, um, about this is, I mean, I think everyone's fascinated by it. I mean, everyone is, uh, absolutely blown away at the current capability and wondering what the potential for the future is and whether or not that's a good thing.

    9. SA

      I think it's gonna be a great thing, but I think it's not gonna be all a great thing. And that, that is where I think that's where all of the complexity comes in for people. It's not this, like, clean story of, "We're gonna do this, and it's all gonna be great."

    10. JR

      Right.

    11. SA

      "It's we're gonna do this." It's gonna be net great, but it's gonna be, uh, like a technological revolution. It's gonna be a societal revolution, and those always come with change. And even if it's like net wonderful, you know, there's things we're gonna lose along the way, some kinds of jobs, some kind, parts of our way of life, some parts of the way we live are gonna change or go away. And e- no matter how tremendous the upside is there, and I, and I believe it will be tremendously good, you know, there's a lot of stuff we gotta navigate through to make sure. Um, that's, that's a complicated thing for anyone to wrap their heads around, and there's, you know, deep and super understandable emotions around that.

    12. JR

      That's a very honest answer that it's not all gonna be good, but it seems inevitable at this point.

    13. SA

      I- i- it's, yeah, I mean, it's definitely inevitable. My, my view of the w- world, you know, when you're like a kid in school, you learn about this technological revolution, and then that one, and then that one. And my view of the world now, sort of looking backwards and forwards, is that this is like one long technological revolution f- and we had, sure, like, first we had to figure out agriculture, so that we had the resources and time to figure out how to build machines. Then we got this industrial revolution, and that made us learn about a lot of stuff and a lot of other scientific discovery too, let us do the computer revolution, and that's now letting us, as we scale up to these massive systems, do the AI revolution. But it really is just one long story of humans discovering science and technology and co-evolving with it, and I think it's the most exciting story of all time. I think it's how we get to this world of abundance, and although, you know, although we do have these things to navigate, then there, there will be these downsides. If, if you think about what it means for the world and for people's quality of lives, if we can get to a world, uh, where the, the cost of intelligence and the abundance that comes with that, uh, the cost dramatically falls, the abundance goes ways up, goes way up, I think we'll do the same thing with energy. And I think those are the two sort of key inputs to everything else we want. So if we can have abundant and cheap energy and intelligence, that will transform people's lives largely for the better. And I think it's gonna, in the same way that if we could go back now 500 years and look at someone's life, we'd say, "Well, there, there's some great things, but they didn't have this. They didn't have that. Can you believe they didn't have modern medicine?" That's what people are gonna look back at us like but in 50 years.

    14. JR

      When you think about the people that currently rely on jobs that AI will replace, when you think about whether it's truck drivers or automation workers, people that work in factory assembly lines, w- what, if anything, what strategies can be put to mitigate the negative downsides of those jobs being eliminated by AI?

    15. SA

      So, eh, I'll talk about some general thoughts, but I, I find making very specific predictions difficult because the way the technology goes has been so different than even my own intuitions, or certainly than my own intuitions.

    16. JR

      Could we, maybe we should stop there and back up a little. What we, what were your initial thoughts?

    17. SA

      If you had asked me 10 years ago, I would have said, "First, AI is gonna come for blue-collar labor basically. It's gonna drive trucks and do factory work, and, you know, it'll handle heavy machinery. Then maybe after that, it'll do, like, some kinds of cognitive labor, uh, kind of, you know, but not, it won't be off doing what I think of personally as the really hard stuff. It won't be off proving new mathematical theorems. It won't be off, you know, discovering new science, um, won't be off writing code. And then eventually, maybe, but maybe last of all, maybe never, because human creativity is this magic, special thing, last of all, it'll come for the creative jobs." That's what I would have said. Now, A, it looks to m- me, like, and for a while, AI is much better at doing tasks than doing jobs. It can do these little pieces super well, but sometimes it goes off the rails, uh, it can't keep, like, very long coherence. So people are instead just able to do their existing jobs way more productively, um, but you really still need the human there today. And then B, it's going exactly the other direction. Could do the creative work first, stuff like coding second, or they can do things like other kinds of cognitive labor third, and we're the furthest away from, like, humanoid robots.

    18. JR

      Hmm. So back to the initial question. W- if we do have something that completely eliminates, uh, factory workers, completely eliminates truck drivers, delivery drivers-

    19. SA

      Yeah.

    20. JR

      ... things along those lines, that creates this massive vacuum in our society.

    21. SA

      So, I think there's things that we're gonna do that are-... good to do, but not sufficient. So I think at some point, we will do something like a UBI or some other kind of, like, very long term unemployment insurance, something. But we'll have some way of giving people ... like, redistributing money in society e- as a cushion for people as people figure out the new jobs. But l- and I ... maybe I should touch on that. I- I'm not a believer at all that there won't be lots of new jobs. I, I think human creativity, desire for status, wanting different ways to compete, invent new things, feel part of a community, feel valued, uh, that's not gonna go anywhere. People have worried about that forever. What happens is we get better tools, and we just invent new things and more amazing things to do. And there's a big universe out there, and, and I think ... I mean that, like, literally, uh, in that there's, like, space is really big, but also there's just so much stuff we can all do if we do get to this world of abundant intelligence, where you can sort of just think of a new idea and then it gets created. But, but again, that doesn't ... To the point we started with, that, that, that doesn't provide, like, great solace to people who are losing their jobs today. So saying there's going to be this great indefinite stuff in the future, people are like, "What are we doing today?" So, you know, we'll ... I think we will, as a society, do things like UBI and other ways of redistribution, but I don't think that gets at the core of what people want. I think what people want is, like, agency, self-determination, the ability to play a role in architecting the future along with the rest of society, the ability to express themselves and create something meaningful to them. And also, I think a lot of people work jobs they hate, and I think there's ... We as a society are always a little bit confused about whether we wanna work more or work less. But, but somehow the ... 
We all get to do something meaningful, and we all get to f- play our role in driving the future forward. That's really important, and what I hope is as those truck driving, long-haul truck driving jobs go away, which, you know, people have been wrong about predicting how fast that's gonna happen, but it's gonna happen. Um, we figure out not just a way to solve the economic problem by, like, giving people the equivalent of money every month, but that there's a way that ... And we've had a lot of ideas about this. There's a way that we, like, share ownership and decision-making over the future. Um, a thing I say a lot about AGI is that everyone, everyone realizes we're gonna have to share the benefits of that, but we also have to share, like, the decision-making over it and access to the system itself. Like, I'd be more excited about a world where we say rather than give everybody on Earth, like, one eight-billionth of the AGI money, which we should do that too, we say, "You get, like, one eight-billionth of ... a, a one eight-billionth slice of the system." You can sell it to somebody else. You can sell it to a company. You can pool it with other people. You can use it for whatever creative pursuit you want. You can use it to figure out how to start some new business. Um, and with that, you get sort of, like, a, a voting right over how this is all gonna be used.

    22. JR

      Mm-hmm.

    23. SA

      And so the better the AGI gets, the more your little one eight-billionth ownership is, is worth to you.

    24. JR

      We were joking around the other day on the podcast where I was saying that what we need is an AI government. The, the ... We shou- we should have-

    25. SA

      What does that mean?

    26. JR

      ... an AI president and have AI run things.

    27. SA

      Just make all the decisions?

    28. JR

      Yeah, have something that's completely unbiased, absolutely rational, has the accumulated knowledge of the entire human history-

    29. SA

      Yeah.

    30. JR

      ... at its disposal, including all knowledge of psychology-

  2. 15:00–30:00


    1. SA

      make, the- the sort of safety systems and the ch- and the checks that the world puts in place, how we think about global regulation or rules of the road from a safety perspective for those projects, it's super important, 'cause you can imagine many things going horribly wrong. But I've been, I feel cheerful about the progress the world is making towards taking this seriously, and, uh, you know, it reminds me of what I've read about the conversations that the world had right around the development of nuclear weapons.

    2. JR

      Mm. It seems to me that this is, at least in terms of public consciousness, this has emerged very rapidly, where I don't think anyone was really aware, uh, people were aware of the concept of artificial intelligence-

    3. SA

      Yeah.

    4. JR

      ... but they didn't think that it was gonna be implemented so comprehensively, so quickly.

    5. SA

      Yeah.

    6. JR

      So, um, ChatGPT is on, what, 4.5 now?

    7. SA

      Four.

    8. JR

      Four. And with 4.5, there'll be some sort of an exponential increase in its abilities?

    9. SA

      It'll be somewhat better, uh, each step, uh, you know, from each, like, half step like that, y- you, uh, you kinda, humans have this ability to, like, get used to any new technology so quickly.

    10. JR

      Mm-hmm.

    11. SA

      The thing that I think was unusual about the launch of ChatGPT 3.5 and then 4 was that people hadn't really been paying attention, and that's part of the reason we deploy. We think it's very important that people and institutions have time to gradually understand this, react, co-design the society that we want with it, and if you just build AGI in secret in a lab and then drop it on the world all at once, I think that's a really bad idea. So, we- we had been trying to talk to the world about this for a while. People, if you don't give people something they can feel and use in their lives, they don't quite take it seriously, everybody's busy, and so there was this big overhang from where the technology was to where public consciousness was. Now that's caught up, we've deployed, I think people understand it. I don't expect the fu- the jump from, like, 4 to whenever we finish 4.5, which will be a little while, I don't expect that to be the crazy, I think the crazy switch, the crazy adjustment that people have had to go through has- has mostly happened. I think most people have gone from thinking that AGI was science fiction and very far off to something that is gonna happen, and that was, like, a one-time reframe. And now, uh, you know, every year, you get a new iPhone, over the 15 years or whatever since the launch, they've gotten dramatically better, but iPhone to iPhone, you're like, "Yeah, okay, it's a little better."

    12. JR

      Mm.

    13. SA

      But now if you go hold up the first iPhone to the 15 or whatever-

    14. JR

      Mm.

    15. SA

      ... that's a big difference. GPT-3.5 to AGI, that'll be a big difference. But along the way, it'll just get incrementally better.

    16. JR

      Do you think about the convergence of, uh, things like, uh, Neuralink and, uh, there's a, a few competing technologies where they're trying to implement some sort of, some sort of a connection between the human biological system and technology?

    17. SA

      Um, do you want one of those things in your head?

    18. JR

      I don't until everybody does.

    19. SA

      Right.

    20. JR

      And, you know, I would joke about it. But it's like, the, the idea is like, once it gets d- You have to, kind of.

    21. SA

      Yeah.

    22. JR

      Because everybody's gonna have it.

    23. SA

      So one of the hard questions about the mer- all of the related merge stuff is exactly what you just said. Like, as a society, are we gonna let some people merge with AGI-

    24. JR

      Right.

    25. SA

      ... and not others? And if we do, then ... and you choose not to. Like, what does that mean for you?

    26. JR

      Right. And will you be protected?

    27. SA

      H- h- how you get that moment right, uh, you know, if we like imagine like all the way out to the sci-fi future. I ... there've been a lot of sci-fi books written about how you get that moment right.

    28. JR

      Yeah.

    29. SA

      You know, who gets to do that first? What about people who don't want to? How do you make sure the people that do it first, like, actually help lift everybody up together?

    30. JR

      Yeah.

  3. 30:00–45:00


    1. JR

      overseeing it, this can get really fucked.

    2. SA

      Yeah. I'm, I'm not anti-regulation. Like, I think there's clearly a role for it, uh, and I also think FTX was like a sort of comically bad situation-

    3. JR

      Yeah.

    4. SA

      ... that we shouldn't learn too much from either.

    5. JR

      It's like worst case scenario.

    6. SA

      Yeah.

    7. JR

      (laughs) Yeah. But it's a fun one.

    8. SA

      Like, it's totally fun, and you to-

    9. JR

      I love that story.

    10. SA

      I mean, you clearly-

    11. JR

      (laughs) I really do. I love the fact that they were all doing drugs and having sex with each other. (laughs)

    12. SA

      The... Yeah. No, no. It had every part of the dramas of like a... It, it... I mean, it's a gripping story-

    13. JR

      Yeah.

    14. SA

      ... because it had everything there.

    15. JR

      They did their taxes with, uh, like what, what was the, the program that they used?

    16. NA

      QuickBooks.

    17. JR

      QuickBooks. (laughs) They're dealing with billions of dollars in cash.

    18. SA

      I don't know why I think the word polycule is so funny.

    19. JR

      Polycule?

    20. SA

      That was what they like... What you call a relationship, like a poly but closed mol... Like polyamorous molecule put together.

    21. JR

      Oh, I see. Okay.

    22. SA

      So, they were like, "This is our polycule."

    23. JR

      So, there's nine of them, and they're poly amongst that whatever.

    24. SA

      Or 10 of them or whatever.

    25. JR

      Yeah.

    26. SA

      And you call that a polycule.

    27. JR

      Oh, yeah.

    28. SA

      And I thought that was the funny like...

    29. JR

      (laughs)

    30. SA

      That became like a meme in Silicon Valley for a while that I thought was hilarious. Um, you clearly want enough regulation that that can't happen. But there, like-

  4. 45:00–1:00:00


    1. SA

      the whatever." And I think this is like not at all the most important piece of this topic, but it was just interesting to me sociologically that there was- there was only talk about it being about what- what caused it, not about it being an eff- an effect of some sort of change in society.

    2. JR

      Mm. But isn't what caused it... W- well, there's biological reasons why, like when we talk about the phthalates-

    3. SA

      Yeah.

    4. JR

      ... and microplastic, pesticides, environmental factors, those are real.

    5. SA

      Totally. And I don't like, again, I'm so far out of my depth and expertise here, this was j- it was just interesting to me that the only talk was about like biological factors-

    6. JR

      Yeah.

    7. SA

      ... and not that somehow society can have some sort of effect on-

    8. JR

      Well, society most certainly has an effect, but-

    9. SA

      Do you know what the answer to this? I- I- I-

    10. JR

      I don't. I mean, I- I've- I've had a podcast with Dr. Shanna Swan who wrote the book Countdown, and that is all about the introduction of petrochemical products and the correlating drop in testosterone, rise in miscarriages. The fact that these are ubiquitous endocrine disruptors, that when they do, um, blood tests on people, they find some insane number. It's like 90 plus percent of people have phthalates in their system.

    11. SA

      Yep.

    12. JR

      And you-

    13. SA

      I appreciate the metal cups. (laughs)

    14. JR

      (laughs) Yeah, we- we- we try to mitigate it as much as possible, but I mean, you're getting it. If you're microwaving food, you're f- you're fucking getting it. You're get- you're just getting it. You're get- if you eat processed food, you're getting it. You're getting a certain amount of microplastics in your diet and estimates have been that it's as high as a credit card of microplastic per week-

    15. SA

      Like floating around.

    16. JR

      ... in your body. You consume a credit card of that a week.

    17. SA

      Whoa.

    18. JR

      The real concern is with mammals because the introductions, when they've done studies with mammals, when they've introduced phthalates into their body, there's a correlating, um... The- one thing that happens is the- these animals, their taints shrink. Like the taint of-

    19. SA

      Yeah, I remember this.

    20. JR

      ... yeah, the mammal, when you look at males, it's 50% to 100% larger than the females. With the introduction of phthalates on the males, the taints start shrinking, the penises shrink, the testicles shrink, sperm count shrinks. So we know there's a dire- direct biological connection between the- the- this- these chemicals and how they interact with- with bodies. So that's- that's a real one. And it's also the amount of petrochemical products that we have, the amount of plastics that we use, it's- it is such an integral part of our culture and also our society, our civilization. It's everywhere. And I've wondered, if you think about how these territorial apes evolve into this new advanced species, wouldn't one of the very best ways be to get rid of one of the things that causes the most problems, which is testosterone? We need testosterone, we're- we need aggressive men and protectors, but why do we need them? We need them because there's other-

    21. SA

      Right.

    22. JR

      ... aggressive men that are evil, right? So we need protectors from ourselves. We need the good strong people to protect us from the bad strong people. But if we're in the process of integrating with technology, if technology is an inescapable part of our life, if it is everywhere, you're using it, you have the internet of everything that's in your microwave, in your f- television, your computers, everything you use...... as time goes on, that will be more and more a part of your life. And as these plastics are introduced into the human biological system, you're seeing ma- a feminization of the males of the species. You're seeing a downfall in birth rate. You're seeing all these correlating factors that would sort of lead us to become this more peaceful, less violent, less aggressive, less ego-driven thing.

    23. SA

      Which the world is definitely becoming-

    24. JR

      Yeah.

    25. SA

      ... over time. Um, and I'm all for less violence, obviously. But I don't ... Look, obviously, testosterone has many great things to say for it and some bad tendencies too. But I don't think a world, if we, if we leave that out of the equation and just say like a world that has a, a spirit that, you know, we're gonna defend ourselves. We're going to ex- we're going to find a way to like protect ourselves, uh, and our tribe and our society into this future, which you can get with lots of other ways. I think that's an important impulse. M- more than that, though, what I meant is a- a- about ... If we go back to the issue of like where are the young founders? Uh, why don't we have, why don't we have more of those? And I don't think it's just the tech startup industry. I think you could say that about like young scientists or, or many other categories. Those are maybe just the ones that I, I know the best. Um ... In a world with any amount of technology, I still think we, we've gotta ... It is our destiny in some sense to stay on this, on this curve. And we still need to go figure out what's next and after the next hill, and after the next hill. And it would be ... My perception is that there is some long-term societal change happening here, and I think it makes us less happy too.

    26. JR

      Right. It may make us less happy. But what I'm saying is if the human species does integrate with technology, wouldn't a great way to facilitate that to be to kind of feminize the, the primal apes and to sort of downplay the role that this-

    27. SA

      You mean like the tech- Like should the AGI-

    28. JR

      Yeah.

    29. SA

      ... phthalates into the world.

    30. JR

      Well, maybe ... I don't know if it's AGI. I mean, maybe it's just an inevitable, inevitable consequence of technology, because e- especially the type of technology that we use, which does have so much plastic in it. And then on top of that, the technology that's involved in few- food systems, preservatives, all these different things that we use to make sure that people don't starve to death. We've made incredible strides in that. There are very few people in this country that starve to death.

  5. 1:00:00–1:15:00


    1. JR

      thing. Well, how do you get rid of that? Well, one of the ways you get rid of that is to completely engineer out all the human reward systems that pertain to the acquisition of, of resources-

    2. SA

      So what's left at that point?

    3. JR

      Well, we're a new thing. I think we've become a new thing.

    4. SA

      And what does that thing do, want?

    5. JR

      I think that new thing would probably want to interact with other new things that are even more advanced than it.

    6. SA

      I do believe that scientific curiosity can drive quite ... Th- that, that can be a great frontier-

    7. JR

      Mm.

    8. SA

      ... for a long time.

    9. JR

      Yeah. I think it can be a great frontier for a long time as well. I just wonder if what we're seeing with the drop in testosterone, the, because of microplastics, which sort of just snuck up on us. We didn't even know that it was an issue until people started studying it.

    10. SA

      How, how certain is that at this point, that that's what's happening?

    11. JR

      (clicks tongue) I don't know.

    12. SA

      I'm gonna go study after this.

    13. JR

      I, it's a very good question. Dr. Shanna Swan believes that it's the-

    14. SA

      Okay.

    15. JR

      ... primary driving factor of the sort of, uh, drop in testosterone and all, miscarriage issues and-

    16. SA

      Mm.

    17. JR

      ... low birth weights. The, the, all those things seem to have a dir- there seems to be a direct factor environmentally. I'm sure there's other factors too. Um, the, the drop in testosterone, I mean, it's, it's been shown that you can increase male's testosterone through resistance training and through making ... There's certain things you can do, like one of the big ones they found through a study in Japan is cold water immersion before exercise.

    18. SA

      Yeah.

    19. JR

      It radically increases testosterone. So you c- cold water immersion and then exercise post that.

    20. SA

      I wonder why.

    21. JR

      Yeah, there, I don't know, maybe ... Let's see if you could find that. Um, but-... it's, uh, it's a fascinating field of study, but I think it has something to do with r- resilience and resistance, and the fact that your body has to combat this external factor-

    22. SA

      Yeah.

    23. JR

      ... that's v- very extreme-

    24. SA

      Yeah.

    25. JR

      ... that causes the body to go into this state of preservation and the, uh, the, uh, implementation of cold shock proteins and the reduction of inflammation, which also enhances the body's endocrine system. But then on top of that, this imperative that you have to become more resilient to survive this external factor that you've introduced into your life every single day.

    26. SA

      Hmm.

    27. JR

      Um, so there's ways, obviously, that you can make a human being more robust, you know? We know that we can do that through strength training, and that all that stuff actually does raise testosterone. Your diet can raise testosterone. And the, uh, a poor diet will lower it and will hinder your endocrine system. It'll hinder your ability to produce growth hormone, melatonin-

    28. SA

      For sure.

    29. JR

      ... all- all these different factors. That seems to be something that we can fix in terms, or at least mitigate, in term, with- with decisions and choices and effort. But the fact that these petrochemical pro- Like, there's a graph that Dr. Shanna Swan has in her book that shows during the 1950s when they start using petrochemical products in everything, microwave, plastic, Saran wrap, all this different stuff, there's a direct correlation between the implementation and the dip, and it all seems to line up. Like, that seems to be a primary factor.

    30. SA

      Uh, does it have an equivalent impact on, uh, like, estrogen-related hormones?

  6. 1:15:00–1:25:36


    1. JR

      on Twitter, I would say a lot of them are shitting on people and being angry about things.

    2. SA

      How many of the people that you know that use Twitter those eight or ten hours a day are just saying wonderful things about other people all day, versus the virulent?

    3. JR

      Very few.

    4. SA

      Yeah.

    5. JR

      Very few. I don't know any of them. I, I know ... (laughs) But then again, I wonder, with the implementation of some new technology that makes communication a very different thing than what we're currently ... Like, what we're doing now with communication is less immersive than communicating one-on-one. You and I are talking-

    6. SA

      Yeah.

    7. JR

      ... we're looking into each other's eyes. We're getting social cues. We're smiling- Yeah. ... at each other. We're laughing. It's, it's a very natural way to talk. I wonder if through the implementation of technology-

    8. SA

      Yeah.

    9. JR

      ... if it becomes even more immersive than a one-on-one conversation, even more interactive. And e- you will understand even more about the way a person feels about what you say-

    10. SA

      Yeah.

    11. JR

      ... about that person's memory, that person's life, that person's history, their education, how it comes out of their mind, how their mind interacts with your mind, and you see them. You really see them. I wonder if that, I wonder if what we're experiencing now is just like the first time people invented guns, they just started shooting at things. (laughs)

    12. SA

      You know?

    13. JR

      Yeah. I- if you can, like, feel what I feel when you say something mean to me or nice to me- Right. ... like, that's clearly gonna change what you decide to say. Yes. Yeah. Yeah, unless you're a psycho.

    14. SA

      Unless you're a psycho.

    15. JR

      And then what causes someone to be a psycho? And can that be engineered out? Imagine if what we're talking about, when we're dealing with the human mind, we're dealing with, uh, various diseases, bipolar-

    16. SA

      Yeah.

    17. JR

      ... schizophrenia. Imagine a world where we can find the root cause of those things, and through coding and some sort of an implemation- implementation of technology that elevates dopamine and serotonin and, and does some things to people that eliminates all of those problems, and allows people to communicate in a very pure way.

    18. SA

      It sounds great.

    19. JR

      It sounds great, but you're not gonna have any rock and roll. You'll, stand-up comedy will die. (laughs)

    20. SA

      (laughs) Um-

    21. JR

      You'll have no violent movies. You know, you, you're g- there's a lot of things that are gonna go out the window. But maybe that is also part of the process of our evolution to the next stage of existence.

    22. SA

      Maybe. I, I feel genuinely confused on this.

    23. JR

      Well, I think you should be. I mean, to be-

    24. SA

      We're gonna find out.

    25. JR

      Yeah. I mean, to be sure how it's going to-

    26. SA

      That being said, yeah, yeah.

    27. JR

      That's the same.

    28. SA

      But I don't even have, like, a-

    29. JR

      Hubris beyond belief.

    30. SA

      Right.

Episode duration: 2:36:43


Transcript of episode 7dCPytNTnjk
