Uncapped with Jack Altman

Marc Andreessen on The Future of Venture Capital | Ep. 12

(If you enjoyed this, please like and subscribe!)

Marc Andreessen is a cofounder and general partner at Andreessen Horowitz, a venture capital firm with $45 billion in assets under management. He is an innovator and creator, one of the few to pioneer a software category used by more than a billion people and to establish multiple billion-dollar companies. Marc co-created the highly influential Mosaic internet browser and co-founded Netscape, which later sold to AOL for $4.2 billion. He also co-founded Loudcloud, which, as Opsware, sold to Hewlett-Packard for $1.6 billion. He later served on the board of Hewlett-Packard from 2008 to 2018. Marc serves on the boards of the following Andreessen Horowitz portfolio companies: Applied Intuition, Carta, Coinbase, Dialpad, Flow, Golden, Honor, OpenGov, Samsara, Simple Things, and TipTop Labs. He is also on the board of Meta.

We covered:
- Evolution of the venture playbook
- Small vs large funds
- Current AI landscape
- Politics and Silicon Valley
- Tech and the media

A few highlights:
- Optimizing for the maximum amount of power
- Conflicts being the reason a16z isn’t even larger
- The middle is dead; you’re either Gucci or Walmart
- Only 8 companies in the S&P 500 are innovating
- We’ve lived in an era of intense preference falsification
- AI and machines making the ultimate decision

Timestamps:
(0:00) Intro
(0:27) Evolution of the venture playbook
(15:54) Small vs large funds
(29:10) Becoming a top tier firm
(35:33) Limiting factors to building big companies
(40:11) Investing in AI
(50:02) Developing investors
(59:06) AI going wrong
(1:09:20) Politics and Silicon Valley
(1:11:39) Tech and the media
(1:23:22) Preference falsification
(1:31:10) Career advice
(1:34:07) Huberman “beef”
(1:38:21) Question from X

More on Marc:
https://x.com/pmarca
https://pmarca.substack.com/
https://a16z.simplecast.com/

More on Uncapped:
https://linktr.ee/uncappedpod
https://x.com/jaltma
Email: friends@uncappedpod.com

Marc Andreessen (guest) · Jack Altman (host)
Jun 11, 2025 · 1h 39m

EVERY SPOKEN WORD

  1. 0:00–0:27

    Intro

    1. MA

      Here's what I encourage, I'm gonna break the fourth wall.

    2. JA

      Yes. [chuckles]

    3. MA

      [chuckles] Here's what I would encourage people to do. Here's the thought experiment to do: just write down two of these, in the middle of the night with nobody around, door's locked.

    4. JA

      Write it down on a piece of paper, and let's pull it out in ten years?

    5. MA

      Write down on a piece of paper two lists. What are the things that I believe that I can't say?

    6. JA

      Mm-hmm.

    7. MA

      And then what are the things that I don't believe that I must say?

    8. JA

      Hmm. [upbeat music] All right, I am so excited to be here with Marc Andreessen. Marc, thank you so much for doing this with me today.

    9. MA

      Jack, it, it's a pleasure.

  2. 0:27–15:54

    Evolution of the venture playbook

    1. JA

      So what I wanted to start with was the topic of small funds, big funds. We had Josh Kopelman on the podcast, and he made a point that resonated around fund size, the outcomes in venture, and sort of just, like, looking at the math of all of it. And I think as venture funds have grown, it sort of spoke to a lot of people about, like, kinda what the plan is a- and sort of how tech is gonna go. And so I guess to start, I'd be curious to hear your thoughts around that whole dynamic. Obviously, you know, s- you've got a big venture firm, and so I just want to hear kinda your perspective on this whole topic to start.

    2. MA

      So, so start by saying, like, Jo- Josh is a longtime friend, and I, I think is a, he's a hero of the industry. Uh, and I say that because, you know, he started First Round Ventures back in the very dark days. Uh, I forget the exact year, but, you know, back, back during the dark days of, uh... after the 2000 crash. Um, and in fact, you know, there was a period of time back there when, you know, the total number of angel investors or seed investors operating in tech was, you know, maybe eight total. [chuckles] And, and, you know, f- uh, actually, Ben and I were two of them. But, you know, this was sort of the heyday of, you know, Ron Conway and, and, um, you know, kind of a, uh, you know, Reid Hoffman and a, a very small group of people who were kind of brave enough to invest in new companies at a point in time when, you know, basically everybody believed the internet was over, like the whole thing was, was, was done. And so he, like, I just think, like, that was an incredibly heroic, brave act. It obviously worked really well. It, you know, turns out buy low [chuckles] sell high actually is a good strategy.

    3. JA

      It is very good.

    4. MA

      Um, it's, it's very nerve-wracking when you're trying to do it, but it does work, and he, he, he had brilliant timing for when he started, and, you know, the companies that he's supported have gone on to become incredibly successful, and we've, we've worked with him a lot. Um, so, you know, we're a big fan, uh, of his. And then second is, I would say I didn't actually... I heard, I heard there was a discussion.

    5. JA

      Yeah.

    6. MA

      I d- I, I never as a rule-

    7. JA

      Yeah

    8. MA

      ... I never, I never read or watch anything-

    9. JA

      Yeah

    10. MA

      ... I'm involved in.

    11. JA

      That's good.

    12. MA

      So I, I totally-

    13. JA

      Well, it wasn't about, you know, and he-

    14. MA

      I to- I totally missed it.

    15. JA

      And to summarize-

    16. MA

      Yeah

    17. JA

      ... basically, what he was saying is, he coined this, like, venture arrogance score idea. But basically, the idea is, you know, if you're gonna own 10% of a company at exit and you wanna have a three X fund, and you're probably gonna have a power law of outcomes, you basically need your big outcome to be, like, really big. And so, like, how's the math shake out? And basically, you know, the question he was sort of posing broadly is: Are the outcomes gonna be much bigger? Are you gonna own a lot more? Are you gonna hit a lot more winners? But it w- it was sort of like that math question.
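The "venture arrogance" point is essentially arithmetic, and it can be sketched in a few lines of Python. This is an illustrative back-of-the-envelope model, not math from the episode; the function name and every number below (fund size, winner's share of returns) are assumptions chosen to make the shape of the problem visible:

```python
def required_exit_value(fund_size, target_multiple, ownership_at_exit,
                        winner_share_of_returns):
    """Exit valuation the fund's single biggest winner must reach.

    fund_size: total committed capital (dollars)
    target_multiple: desired gross return multiple (e.g. 3 for a "3x fund")
    ownership_at_exit: the fund's stake in that winner at exit (e.g. 0.10)
    winner_share_of_returns: under a power law, the fraction of all fund
        returns supplied by the one big outcome (e.g. 0.6)
    """
    # Dollars the winner alone must return to the fund...
    returns_needed = fund_size * target_multiple * winner_share_of_returns
    # ...divided by the fund's ownership gives the exit valuation required.
    return returns_needed / ownership_at_exit

# Hypothetical example: a $1B fund targeting 3x, owning 10% of its winner,
# where that winner supplies 60% of all returns, needs roughly an $18B exit.
print(required_exit_value(1_000_000_000, 3, 0.10, 0.6))
```

Scaling the fund up while holding ownership and the target multiple fixed scales the required exit linearly, which is the math question being posed: do you believe in proportionally bigger outcomes, bigger ownership, or more winners?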

    18. MA

      So I'd say a couple things. So one is, look, th- venture is a c- is, is actually a customer service business [chuckles] in our, in our view, so start with this. So, uh, it's actually a customer service business. There are two customers. There are the LPs, and there are the founders. Um, and we, we think of them both, both customers. And so, you know, at the end of the day, the market's gonna figure this out, and the, the LP money's gonna flow to where, obviously, they think the opportunities are, and the, you know, like, the, the founders are cer- You know, the, as you know, the best founders definitely pick who their investors are. It's actually very unusual, right, asset class. It's the only asset class in which the, the recipient of the capital picks the-

    19. JA

      Yeah

    20. MA

      ... you know, picks the, the, uh, actually cares where the money comes from and picks, picks, picks it. So the market will figure this out. Um, I, I, I think the big thing, uh, uh, to responding to your general point, I think the big thing is the world has really changed. Um, and so, you know, modern venture capital, uh, in the form that we understand it, is basically, um... You know, there, there were examples of venture capital going back to, like, the 15th century or something, with like, you know, Queen, Queen Isabella and Christopher Columbus and whalers off the coast of Maine in the 1600s and so forth. But modern venture capital was basically a product of the '50s and '60s. Originally, this guy Jock Whitney from the Whitney family sort of created the model. George Doriot, who's a MIT professor, created a version of it. And then, you know, and then the great, you know, the great heyday of the 1960s VCs, Arthur Rock and those guys, um, and everybody that followed, Don Valentine and Pierre Lamond and Tom Perkins and, and so forth, Gene Kleiner. You know, all those guys. Ba- basically, it, uh, basically, from that period, it's about the 1960s, through, call it 2010, there, there was, like, there was just, there was a venture playbook, and it became a very well-established playbook. And it, it sort of consisted in two parts. One was a sense of what the companies were going to be like, right? And then the other was what the venture firm should be like. And so the, the, the playbook was, the companies are basically tool companies, right? Basically, all successful technology companies that were venture-funded in that 50-year stretch were, were basically tool companies, right? Pick-and-shovel companies. So, uh, mainframe computers, d- desktop computers, smartphones, laptops, um, internet access, software, SaaS, databases, routers, switches, um, you know, disk drives, [chuckles] all these things, word processors, uh, tools, right? 
And so, you know, you, you buy the tool, you, you, the customer buys the tool, they use the tool, but it's, it's a general-purpose technology sold, sold to lots of people. Basically, it, around 2010, I think the industry permanently changed, and, and, and the, the, the change was the, the big winners in tech, more and more, um, are companies that go directly into an incumbent industry.

    21. JA

      Hmm.

    22. MA

      Right, like insert directly. And, and, and I think the big turning point on this was, like, Uber and Airbnb, right? Where Uber could have been... Like, Uber in 2000 would have been special- specialist software for taxi dispatch-

    23. JA

      Hmm

    24. MA

      ... that you sell to taxi cab operators. Uber in 2010 was, "Screw it, we're doing the whole thing."

    25. JA

      Hmm.

    26. MA

      Airbnb in 2000 would have been booking software for bed and breakfasts.

    27. JA

      Yep.

    28. MA

      Right? Running on a Windows PC. Um, uh, right, a- and, and then Airbnb is just like, "S- screw it, we're doing, we're, we're doing the whole thing." And so, and, you know, Chris Dixon came up with this sort of term, the full stack startup-

    29. JA

      Mm-hmm

    30. MA

      ... uh, which he kind of meant. But the other way to think about that is just you're, you're, you're actually, the, the, the company is delivering the entire-

  3. 15:54–29:10

    Small vs large funds

    1. MA

      the, the change in the market. Now, having said that, I don't think that's an argument that it's just therefore big, big firms win everything. That's definitely not my, not, not my thesis, and by the way, that's also not how I'm deploying my own money, which, which I wanna talk about because I'm, I'm, I'm living what I'm about to say.

    2. JA

      Yeah.

    3. MA

      Um, which is, I think what happens is what Nassim Taleb calls the barbell. And the way to think about the barbell is basically, you, you, you, you basically draw... You basically, you have a continuum, and on the one side of the continuum, you have high scale, and on the other side, you have high specialisation. And what you see in industries that mature and develop in this way, including many industries in the last hundred years, basically what happens is, as they, as they mature and enter their kind of full state, as they kind of flower, what happens is-... is they often start with generalists that are neither subscale nor particularly specialized. Um, and then, uh, o- over the fullness of time, what happens is th- they get disintermediated, and then there are scale players on the one side, and there are specialist players on the other side. The most obvious example of this in everybody's lives is retail. Um, when I was a kid, there were these things called department stores. Pretty good selection and pretty good price, but not a great selection and not a great price, right? And then sitting here today, those are all out of business. Like, they're just gone.

    4. JA

      Then it gets crushed by Amazon on one end, and then, like, amazing retail on the other end.

    5. MA

      Exactly. Exactly, right? And so... And, and why do you go to Amazon or, or Wal- or Walmart or, you know, what, the, the big... And by the way, there were even these big box guys, you know, Toys "R" Us and so forth. And then-

    6. JA

      Yeah

    7. MA

      ... over time, like Amazon and, and, and Walmart even, even ate, they, ate, ate that. 'Cause when you go to Amazon or Walmart, what you get is just, like, an unbelievable selection of basically anything that's a commodity, right? Um, that, that you just buy a- at, at, like, super low prices, and it's basically impossible to compete with that if you're subscale-

    8. JA

      Yeah

    9. MA

      ... on the one hand. And then to your point, and then the specialist retail experience is, like, the Gucci store or the Apple store.

    10. JA

      Yeah.

    11. MA

      Y- you know, the, the $15 candle store.

    12. JA

      They give you some Perrier when you walk in.

    13. MA

      Oh, they love you.

    14. JA

      Yeah.

    15. MA

      Like, they're so happy to see you.

    16. JA

      Show you, yeah.

    17. MA

      Exactly, right. You know, they'll, they'll do private showings for you-

    18. JA

      Yeah

    19. MA

      ... and, you know, they'll pour-

    20. JA

      It's an experience, yeah

    21. MA

      ... pour the champagne, and it's like-

    22. JA

      Yeah

    23. MA

      ... it's like, it's like an entire experience.

    24. JA

      Yeah.

    25. MA

      And, and so what's happening is, and, and you just, again, you see this in, like, the return. You just look down this return standpoint, like, this is what's happened. This is where this is, th- this is how the value is. And then, then what happens is that just, like, gaps way out, and it never comes back together again.

    26. JA

      Yeah.

    27. MA

      And then what the consumer does is they build a portfolio of, of, of their experiences, and so they, they, they buy things at unbelievably cheap prices at Walmart and Amazon, and then that gives them more spending money to be able to spend on the boutique.

    28. JA

      So, so this middle, the, uh, the, the bar that's in the middle, that's kind of screwed.

    29. MA

      Yes.

    30. JA

      What is the mechanic by which they're in trouble? Is it because-

  4. 29:10–35:33

    Becoming a top tier firm

    1. MA

      funds that have gotten larger.

    2. JA

      Why is it so rare for somebody to break through and get... I mean, you did it, and that's one that happened in the last 15 years. Maybe there's a couple others, maybe, but why is it as rare as it is? It seems, like, almost more rare than a new big company-

    3. MA

      Yeah

    4. JA

      ... in a way.

    5. MA

      Yeah. That, that's true. In fact, our analysis, actually, when we started, was there actually hadn't been... I think there had been two firms, Andy Rachleff, actually-

    6. JA

      I mean, Thrive also.

    7. MA

      Well, so Thrive was act- uh, yeah-

    8. JA

      Yeah

    9. MA

      ... they were after us.

    10. JA

      Yeah, yeah, yeah.

    11. MA

      Uh, I mean, they've, they've been great.

    12. JA

      Yeah.

    13. MA

      But, um, but before, before us, in the, in the 30 years before us, we think that there were only two new VCs that actually punched through to become top tier. Um, in, in, in other words, VCs that were not either firms that were built in the '60s and '70s or firms that weren't derivations of those firms.

    14. JA

      Founders Fund?

    15. MA

      No, no, no, no, no, no.

    16. JA

      Oh.

    17. MA

      The Founders Fund started at actually, uh, around the same time we did.

    18. JA

      Okay.

    19. MA

      Um, the, the, they were a little bit earlier-

    20. JA

      Yeah

    21. MA

      ... but around the same time.

    22. JA

      Yeah.

    23. MA

      But I mean, over the preceding, like, 50 years.

    24. JA

      Okay.

    25. MA

      Sevin Rosen. You won't even re-

    26. JA

      No.

    27. MA

      This is sort of the thing, you won't even recognize these names, so-

    28. JA

      Geez, I need to read a book or something.

    29. MA

      So Sevin Rosen was the venture firm that famously funded Compaq Computer-

    30. JA

      Okay

  5. 35:33–40:11

    Limiting factors to building big companies

    1. MA

      are much higher than they used to be.

    2. JA

      Maybe one final question on this topic of fund size, and then I wanna go to, to AI. Um, what do you think... And I know you've thought about this a lot. What do you think is the limiting factor for the creation of a lot more really big companies?

    3. MA

      Yep.

    4. JA

      Do you think it's founders? Do you think it's capital? Do you think it's market maturity? Do you think it's underlying tech stuff? Like, if you had to pinpoint the one or two things that you think would allow for there to be way more big companies, like, what is it?

    5. MA

      So there's sort of the holy trinity of venture, uh, startups, which is, uh, you know, p- uh, people, market, and technology. Um, and I think the answer is sort of all three, and the way I would describe it is, there is some limiting issue with just market si- uh, just how many markets are there, how big are they, how, how ready is the market to take something new?

    6. JA

      Mm-hmm, mm-hmm.

    7. MA

      Then there's the technology question, which is, you know, when is the technology actually... The, the, like, from the venture perspective, technology moves in stairsteps, right? And so things become possible in the world of smartphones that just weren't possible. You know, you couldn't do Uber with- when everybody had a laptop, you had to wait until they had phones.

    8. JA

      Yeah.

    9. MA

      Right? Um, and so technology moves in a stairstep. You get these par- you know, paradigm shifts, platform shifts, and, and those just, they come when they come.

    10. JA

      Yeah.

    11. MA

      And until they come, you, you can't do it. And then, and then the people side, you know, and this is the one that, you know, I say, you know, vexes me the most, which is like, okay, like, how do you just get more great f- great founders?

    12. JA

      Yeah.

    13. MA

      Right? Um, and I think part of that is, you know, you... I, I think there is definitely a training thing that is real, and getting people into the right scene in the right way, and, like, the thing that, like, Y Combinator does, or the thing that Thiel Fellows do, like, th- those are real things. Um, and those help a lot. But also, I, you know, there is an inherent... You know, there are just certain- th- there's, there, there are not infinite number of people running around who have the-

    14. JA

      You probably figure there's a lot of people who could have built big companies who haven't, though, and hopefully is-

    15. MA

      Well, a lot-

    16. JA

      Yeah

    17. MA

      ... or a few.

    18. JA

      Yeah, I don't know.

    19. MA

      Some. [chuckles]

    20. JA

      Yeah, some, some, I don't know, some number. But there must be people who are just, like, in academia or government or educa-

    21. MA

      Yeah

    22. JA

      ... who are just doing something completely different, who, if they were attracted to startups, would have built a big company.

    23. MA

      So yes, but then the other question is, like, well, okay, then why didn't they? Why, why didn't they do the things required to get themselves in that position?

    24. JA

      Well, it could have been then, like, 2001, it was just like, too many people were too scared to do it or didn't know about it or whatever.

    25. MA

      But what does that tell you about the people who didn't do it?

    26. JA

      Yeah.

    27. MA

      Is they, they were heard... I can tell you who didn't listen to that, right? It was Mark Zuckerberg.

    28. JA

      Are there more good, um-

    29. MA

      But, but l- let's just press this point harder for a moment-

    30. JA

      Yeah

  6. 40:11–50:02

    Investing in AI

    1. JA

      That's a good segue into AI. Do you feel that we're now at the beginning of what is, like, the new next important-... you know, paradigm? Like, is this cloud, but on steroids?

    2. MA

      Oh, but yeah, much, uh, I think much, I think much larger-

    3. JA

      Yeah.

    4. MA

      And I'll, I'll-

    5. JA

      Yeah

    6. MA

      ... explain why. So, um, yeah, so, so I described, you know, I described earli- I described before, right, you know, the, the, the, the triangle people technology market. The, the technology is, ul- ultimately the driver is the technol- the tech- the technological... For venture, the technological step function changes drive, drive the industry, uh, and they always have, right? And so if you talk to the LPs, you can see this. It's like when, when there is a giant new technology platform, it's an opportunity to reinvent a huge number of companies and products that, you know, now have become obsolete and create a whole new generation of companies, often, and, you know, generally end up being bigger than the ones that they, they replaced. And so, so, so, and the venture returns map this, and so it, it get, it, they come in waves-

    7. JA

      Mm.

    8. MA

      -and the LPs will tell you, it's just like, yeah, there was the PC wave, the internet wave, the mobile wave, the cloud wave. Like, that was the thing. And then, by the way, when in venture, when you get stuck between waves, it's actually very hard, right? 'Cause u- you've seen this for the last, like, five years. Like, for the last five years, it's like, how many more SaaS companies are there to found?

    9. JA

      Yeah.

    10. MA

      Like, just we're j- we're just out of ideas-

    11. JA

      Yeah

    12. MA

      ... we're just out of, out of categories.

    13. JA

      Yeah, yeah, yeah.

    14. MA

      All been done.

    15. JA

      Yeah.

    16. MA

      Right? And so it's when you have a fundamental technology paradigm shift, that gives you an opportunity to kind of rethink the entire industry.

    17. JA

      It would've been very sad, by the way, if the AI breakthrough didn't happen. Like, uh, the state of venture would be sad, I think.

    18. MA

      Three years ago, this was... I mean, so when we were talking to our LPs three years ago, we're just, like, basically, like, you know, we're in, you know, we're... Uh, so, uh, Chris Dixon has this, uh, uh, framing he uses. He calls it your, in venture, you're either in, uh, uh, search mode or hill-climbing mode. And in search mode, you're looking for the hill.

    19. JA

      And it was search mode. [chuckles]

    20. MA

      Uh, uh, right. A- a- and, and three years ago, we were all in search mode, and that's how we described it to everybody, which is like, "We're in search mode, and there's all these candidates for what the things could be." And AI was one of the candidates, right? It was like a known thing, but it hadn't broken out yet-

    21. JA

      Yeah

    22. MA

      ... in, in the way that it has now. And so we were in search mode. Now, we're in hill-climbing mode. [chuckles]

    23. JA

      Thank goodness, yeah.

    24. MA

      Big time.

    25. JA

      Yeah.

    26. MA

      Yeah, and then, and then, you know, look, like I, I, as I say, on the technology breakthrough itself, I think a year ago, you could've made the argument that, like: I don't know if this is really gonna work, 'cause LLMs, re- you know, hallucinations. Can... You know, it's great that they can write Shakespearean poetry and hip-hop lyrics. Can they actually do math?

    27. JA

      Mm-hmm.

    28. MA

      You know, can they do, can they write code?

    29. JA

      And now, obviously, yes.

    30. MA

      And now, now they obviously can.

  7. 50:02–59:06

    Developing investors

    1. MA

      for.

    2. JA

      It seems like it's very hard to assemble lots of, you know, very good, productive GPs into the same firm. It's just objectively rare.

    3. MA

      Yeah, that's right.

    4. JA

      You've done it, but it's like, doesn't happen very often.

    5. MA

      Yeah.

    6. JA

      Do you... I guess my first question on this is, do you think of-... just finding greatness, and then you can't really teach it much? You know, so you're basically just gonna, like, hire people and see how it goes, or do you think that it's about creating the system and conditions in which people do great work, and you can actually create good investors?

    7. MA

      Yeah. So I think it only works if there's a point, like, if, if there's a reason why you would have a, a aggregation of GPs in the first place, and our, our answer to that is power, right? The, the, the, our, our pitch to GPs as to why they should join us as opposed to go to a smaller firm or start their own thing is, "If you come here, you just, like, plug into this engine that's just, like, massively powerful, and so everything that you do, the, the effects of it are gonna just be, like, blown-

    8. JA

      Yeah

    9. MA

      ... completely out. Y- therefore, you're gonna have a much higher win rate on the deals you, you want to do-

    10. JA

      Yeah

    11. MA

      ... which is much more satisfying, and you're gonna be able to actually help the companies a lot more."

    12. JA

      And you'll probably see more companies anyway.

    13. MA

      Yeah.

    14. JA

      So everything probably gets better.

    15. MA

      Yeah, that's right, that's right. And, and by the way, you, you know, some people wanna have colleagues. Some people don't [chuckles] want to have colleagues, but some people do wanna have colleagues, and you'll be working with people you like, and, you know, who care about the same things you do. So... But there, there has to be a p- there has to be a point to it, and of course, it's, you know, it's on us to keep proving that, right? 'Cause, uh, you know, the, the, the devil's in the details of whether they'll actually, you know, buy that, but so far, so far, a lot, a lot of really good, great people have. And then, yeah, and then the second part of the question is, like, okay, who, who do you, who do you put in those roles? Um, historically, we had a, h- his- our old model was basically we only hire GPs. Uh, we don't... We, we were not developing, and we could go through why that was the case. We changed that, like, eight years ago. We, we now develop our own GPs, um, that we've, we've evolved to where I think that's-

    16. JA

      Mm

    17. MA

      ... that's working quite well. Um, I think the answer to your question is, it's a two-part question, is there's some level of just objective, you know, are they, are they, are they, are they good, uh, are they good at doing the job?

    18. JA

      Yeah.

    19. MA

      Uh, uh, here's a big thing we focus on when we evaluate them, which is, um, y- you know, it's fine to invest in a category, like, five years early or, like, whatever, something goes wrong, like, that's fine. What's not fine is y- you invest in the wrong company, and you could have invested in the right company.

    20. JA

      Yeah.

    21. MA

      Like, at the moment you made the investment, you could... Y- you, you made the wrong decision in that moment of which one you should invest in, and you-

    22. JA

      Yeah

    23. MA

      ... you could have known. And so it's like, did you do the work to fully-

    24. JA

      What's-

    25. MA

      ... address the market?

    26. JA

      How do you handle the fact that, like, you don't know that until-

    27. MA

      Yeah

    28. JA

      ... like, six years later, and now you're going back, and you're like, "Hey, you made this mistake six years ago. This isn't gonna work out now?"

    29. MA

      So it's generally, uh, y- so c- that is a giant problem. Um, and, uh, uh, say the, uh, when we started, actually, when we talked to our, our friends in the business, what they said basically was, they said, "Number one, you, you don't know if somebody's a good GP for 10 years 'cause you don't have the return data." And then they said, "Number two, is nobody ever wants to admit that they made a mistake, and so they never actually fire anybody."

    30. JA

      Yeah.

  8. 59:06–1:09:20

    AI going wrong

    1. MA

      to finally tell a good joke.

    2. JA

      So on AI, I wanna talk about not just the startup side, but maybe, like, um, just some of your takes on, like, the broader lens of AI. I guess my first question is around AI going wrong, and I know this is, like, a very hard thing, but I'm just sort of, for fun, really curious what you think. You know, the downside case that people are very afraid of would be something like AI embodies humanoid robots, and now we have a Terminator situation on our hand. It gets agency, we have a big problem.

    3. MA

      Right.

    4. JA

      You know, that's one end of the spectrum. The happy path is that it's just, like, the sickest software that anybody's ever seen, and, like, it's a tool that humans use, and everything's great. Do you think about this? If so, do you have any opinion on it, or are you just like, "It's gonna be what it's gonna be?"

    5. MA

      Just to start by saying, it's a, it's an important new technology. Any important new technology is what they call dual use. Um, it can be used for good things, it can be used for bad things. Um, the shovel, [chuckles] it can dig a well and save your life. You can bash somebody over the head with it and kill them.

    6. JA

      Yeah.

    7. MA

      Fire, you know, the computer, um, airplane.

    8. JA

      Yeah.

    9. MA

      You know, the airplane can take you on a most marvelous vacation with your new spouse. It can also bomb, you know, Dresden. Um, right, and so it's just, a- atom, atom, I mean, atomic power was the big one, 'cause atomic power could be unlimited clean energy for the entire world, or it could be nuclear bombs, right? Um, as it turns out, [chuckles] there we just got the bombs, we didn't get the unlimited clean energy. And so, um, uh, like, that, that's just, like, generally true. These, these thing, these things are double-edged swords. The question is, like, all right, like, y- what are you gonna do about that? Um, and are you gonna, like, somehow put it back in the box, or are you gonna somehow, like, try to constrain it and control it? Um, the, the nuclear example is really interesting, um, because the, um, you know, there was a v- you know, very big concern around, obviously, nuclear weapons and then, and then nuclear... There was a kind of big moral panic that developed around nuclear power.

    10. JA

      I mean, we kind of messed up with that-

    11. MA

      Meltdowns. We very badly messed up with it. And, and what happened was the, the, the green movement in the '60s and '70s created something called the precautionary principle, which is now there, which, which the same kinds of people are now trying to apply to AI, which basically says, unless you can prove that any technology is definitely going to be harmless, you should not deploy it. And of course, that literally rules out everything, right? That's just, like, no fire, no shovels, no cars, no planes, no nothing, no electricity. And so, and, and that is what happened to civilian nuclear power, which is they just, they, they, they, they killed it. The story I tell on that is President Nixon in 1971, the year I was born, he declared... He saw the oil crisis coming in the Middle East. Uh, he declared something called Project Independence. He said the Ameri- America needs to build 1,000 nuclear power- civilian nuclear power plants by the year 2000 and go completely gr- clean, carbon, carbon zero, completely electric. Cut the entire, you know, car- cut-- you know, they had electric cars 100 years ago, so it's just obvious you just cut over to electric cars at some point. And, and, and, and basically, we need to do that, and then, and then we're, we're not entangled in the Middle East, and we don't need to go, you know, do, do all this stuff, uh, there. He then created the EPA and the Nuclear Regulatory Commission, which then prevented that from happening. [chuckles] It absolutely killed the nuclear industry in the US, right?

    12. JA

      Yeah.

    13. MA

      Um, and then the Germans are going through the new version of that with Ukraine, which is Europe, ex-France, keeps shutting down their nuclear plants, which just makes them more dependent on Russian oil, and so they end up funding the Russian war machine, which invades Ukraine, and now, you know, they're worried Russia's gonna invade them.

    14. JA

      Mm.

    15. MA

      And so the moral panic, and then the social engineering that comes out of it, the history of it has been quite bad, like, in terms of its thinking-

    16. JA

      Mm

    17. MA

      ... and then in terms of its practical results.

    18. JA

      Yeah.

    19. MA

      Um, I think it would be a very, very, very big mistake to do that-

    20. JA

      Yeah

    21. MA

      ... you know, in AI. Um, and then-

    22. JA

      To, like, regulate early.

    23. MA

      Yeah, yeah, absolutely, 100%. To try to offset the risks and, in the process, cut off the benefits. So let's start with that as number one. Number two, I'd just say, look, we're not alone in the world. We knew that before, but especially after DeepSeek, we really know that. And so there is a two-horse race. This is shaping up to be the equivalent of what the Cold War was against the Soviet Union-

    24. JA

      Yeah

    25. MA

      ... in the last century. It is shaping up to be like that. China does have ambitions to basically imprint the world with their ideas of how society should be organized and how the world should be run, and they obviously intend to fully proliferate their technology, which they're doing in many areas.

    26. JA

      Yeah.

    27. MA

      Um, and the world, 20 years from now, is gonna be running on Chinese AI or American AI. Like, those are your choices.

    28. JA

      You think that's how it'll basically play out?

    29. MA

      Yeah.

    30. JA

      Yeah.

  9. 1:09:20–1:11:39

    Politics and Silicon Valley

    1. MA

      question.

    2. JA

      I guess this actually feeds into the next topic, which to me is: tech has now gotten to a place with the government and politics where it's sort of undeniable. It used to kinda be an underdog, but now, for reasons like this and a bunch of others, it's just too important to not be in the mix on the national stage, which I think has really changed the dynamic even internally for Silicon Valley. 'Cause now people are looking at what people are doing not just in tech, but pretty broadly.

    3. MA

      Yeah, that's right. I would say I deeply agree with that, and I believe it is mostly our fault. [chuckles] Like, the current situation is mostly our fault in tech. There's an old Russian, an old Soviet joke: "You may not be interested in politics, but politics is interested in you." [chuckles]

    4. JA

      Yeah.

    5. MA

      And so I think we, and I would include myself in this, I think a lot of us got complacent between, like, 1960 and 2010. We basically just said, "We can sit out here. We can do our thing. We can talk about how important it all is, but these are never gonna be big social or cultural or political issues-

    6. JA

      Yeah

    7. MA

      ... and we can just kinda get away with not being engaged." And then, for all the reasons we've discussed-

    8. JA

      You're saying, and then once it was undeniable, we weren't prepared?

    9. MA

      And then we weren't prepared. We weren't even remotely prepared, and then, to use the metaphor, it's the dog that caught the bus, and the dog is being dragged behind the bus-

    10. JA

      Yeah

    11. MA

      ... the tailpipe in its mouth-

    12. JA

      Yeah

    13. MA

      ... it doesn't know what to do with the bus.

    14. JA

      Yeah.

    15. MA

      And look, you know, geography, I think, has a lot to do with this. We're 3,000 miles away. It's just hard to get there. They don't come here very often. And so, yeah, I guess I would say: it worked. We always wanted to build important things.

    16. JA

      Yeah.

    17. MA

      We actually are building important things. There are obvious political, cultural, social consequences to them. Um, if we don't engage, nobody's going to.

    18. JA

      Yeah.

    19. MA

      And then, by the way, the other thing I'll say is, you know, it's not like there's unanimity even in the industry on a lot of these issues, right? Um, and so there's, you know, I would say, two giant divisions right now-

    20. JA

      Mm

    21. MA

      ... uh, big companies versus small companies.

    22. JA

      Yeah.

    23. MA

      You know, they often do not have aligned incentives right now, or aligned agendas. And then the other is, just on AI, obviously, there's a big dispersion of views even within the industry.

    24. JA

      I guess this probably goes to why it's important for at least some VCs to have relationships with the government: big tech has the resources to do it themselves, and small tech can't. And so if this is the state of the world, we as an industry actually need somebody to be doing it on behalf of little tech.

    25. MA

      Yeah, that's exactly right. That's why we're doing what we're doing.

  10. 1:11:39–1:23:22

    Tech and the media

    1. JA

      Yeah. On media, in particular-

    2. MA

      Sure

    3. JA

      ... um, I thought it was really interesting. I can't remember how many years ago, but Balaji, many years ago, started talking about some fracturing, about how the relationship between tech and the media was going downhill. I think this was mostly about media inside tech, but probably also the major publications at a larger scale. From where I sit, I think this was right, and it seems like the relationship did kind of continue to degrade. What's interesting to me recently is I've seen a little bit of life in the sort of tech publication stuff, but it's actually been from the inside.

    4. MA

      Yeah.

    5. JA

      And so, like, Eric, who you just brought on as a GP, is awesome, and he's been really good at doing this. TBPN's really cool, and I don't think I've seen something like that pop up maybe ever inside tech. What's your read, I guess, within our bubble, of the sort of tech-media relationship and where it's been?

    6. MA

      So my background in this is, I have a weird kind of history because of what happened in the '90s. I started dealing with the national press and the tech press and business press in 1993, 1994. And I did an annual press tour to the East Coast, probably a week out of each year, usually in the spring. What that means is you go around and you meet with all the publishers, editors, and reporters, covering everything. And I would say the stretch from '94 to 2016 was generally... Like, I thought it was a quite healthy, normal, productive relationship. You know, they would do investigative reporting, and they would run stories I don't like-

    7. JA

      Mm

    8. MA

      ... but generally the major publications in each of those categories were trying to understand what was going on and trying to be honest brokers, trying to represent what was happening. And so to me it's, like, super interesting. They always wanted to learn. They always had tons of questions. They were super curious about everything that was happening. [chuckles] So that was great until 2016. It was the spring of 2017 that I went on the press tour, and it was like somebody had flipped a light switch. They were, across the board, like, unbelievably hostile-

    9. JA

      Mm.

    10. MA

      ... like, unbelievably, completely, and across the board, like, a 100% sweep.

    11. JA

      Do you know why?

    12. MA

      Absolute hostility. I think the obvious answer is Trump. Trump got nominated, and then he got elected, and then they blamed tech for both of those.

    13. JA

      Mm-hmm.

    14. MA

      And by the way, there are a bunch of other factors. There's actually a business side to it, which is: there was the fear in the '90s that the internet was gonna eat the news business. It actually didn't happen. And actually, 2015, I think, was the best year in history for, like, revenues to newspapers.

    15. JA

      Yeah.

    16. MA

      Um, and then it was really after 2015 that social networking went big, and then their businesses started to collapse. And, you know, they started having lots of layoffs-

    17. JA

      Mm

    18. MA

      ... and so that didn't help.

    19. JA

      Yeah.

    20. MA

      And then, you know, look, they would also say, "Hey, smart guy, that's also when you started doing all these things that actually matter more," right?

    21. JA

      Sure.

    22. MA

      Um, and so, you know, everything we've been discussing: the tech industry changed, and so you're gonna get a different level of scrutiny 'cause you deserve it-

    23. JA

      Yeah

    24. MA

      ... 'cause you're doing different things now. The political thing was just a giant swamping factor, and they... And, you know, this is a big-

    25. JA

      Yeah.

    26. MA

      You know, I don't wanna get into the politics per se, but this whole thing ran in parallel with everything that's, like, in Jake Tapper's book. So it's just like they got locked into a mode of interaction. Um-

    27. JA

      Yeah

    28. MA

      ... it just became very polarized.

    29. JA

      Yeah.

    30. MA

      Um, very polarized and very lockstep. And, you know, from the outside, you read it, and you're just like: wow, these people are all, like, really wrapping themselves around an axle.

  11. 1:23:22–1:31:10

    Preference falsification

    1. MA

      moment, so.

    2. JA

      Sort of related to this topic, a little bit adjacent: I saw you talking about preference falsification recently, and I think this is a super interesting topic in general, but particularly in the last, I don't know, call it five-ish years, where a lot of preference falsification was made apparent. So I'd be curious first to hear a little bit about what you think happened over the last some number of years where these changes happened. Maybe we can start there, and then I've got a follow-up on it.

    3. MA

      Yeah, so preference falsification, just to sketch an outline: there are actually two different elements of it. It's when people are required to say something in public that they don't actually believe, or when they are prohibited from saying something in public that they do believe.

    4. JA

      Mm-hmm.

    5. MA

      Right, so: commission, omission-

    6. JA

      Mm-hmm

    7. MA

      ... issues. And then the theory of it: there's this great book by Timur Kuran on it. The theory basically is, it's easy to think about what this means in the case of a single person: are you telling the truth? Are your public statements mirroring what you actually think or not? The thing that gets complicated is when this happens across a group or across a society, because if there's widespread preference falsification in a society, you not only have people lying about what they actually think, or hiding it, but everybody also loses the ability to know what the distribution of views actually is.

    8. JA

      Yeah.

    9. MA

      Right? And he says, basically, if you look at the history of political revolutions, a political revolution happens when a majority of the country realizes that a majority of the country actually agrees with them, when they didn't realize it before.

    10. JA

      Mm-hmm.

    11. MA

      Right? So whatever system they were in had convinced them that they were in a very small minority, and then at some point there's, you know, the boy who points out-

    12. JA

      It's like a catalyst

    13. MA

      ... there's a catalytic moment. And then basically there's what's called a preference cascade.

    14. JA

      Right.

    15. MA

      Um, and then, um, and then all of a sudden-

    16. JA

      It's like the correct box of the prisoner's dilemma to live in all of a sudden flips, and everybody realizes it at once.

    17. MA

      Yes, exactly. And he said you can see this in a crowd with a controversial speaker, where basically you'll have a controversial speaker, and then there'll be silence in the crowd, and then one brave person will start clapping.

    18. JA

      Uh-huh.

    19. MA

      And that person is, like, at severe peril, 'cause if they're the only asshole standing up clapping-

    20. JA

      Yeah

    21. MA

      ... that's it, they might get killed.

    22. JA

      Yeah.

    23. MA

      But then if it cascades, a second person starts clapping-

    24. JA

      Right

    25. MA

      ... and then a third, and a fourth, and a fifth, and then you get the snowballing effect, and then the entire auditorium is clapping. And that's everybody realizing that they actually are on the side of the majority, which they didn't realize-

    26. JA

      Yeah

    27. MA

      ... before. Uh, by the way, this is actually what comedy does so well.

    28. JA

      Hmm.

    29. MA

      'Cause people can't control the involuntary response of laughter.

    30. JA

      Right. Yeah, exactly.
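    The clapping dynamic described here can be sketched as a simple threshold model of collective behavior (in the spirit of Granovetter's classic model, which Kuran's preference-cascade theory builds on; the room size and thresholds below are illustrative, not from the episode):

    ```python
    def cascade(thresholds):
        """Each person claps once at least `threshold` others are already
        clapping. Returns the round-by-round clap counts until the crowd
        stabilizes."""
        clapping = 0
        history = [clapping]
        while True:
            # Everyone looks around and claps if enough others already are.
            new_clapping = sum(1 for t in thresholds if t <= clapping)
            if new_clapping == clapping:
                break  # no one else joins; the crowd has settled
            clapping = new_clapping
            history.append(clapping)
        return history

    # One "brave person" (threshold 0) among people who each need to see a
    # few others clapping first: the whole room cascades, one step at a time.
    print(cascade([0, 1, 2, 3, 4, 5, 6, 7, 8, 9]))
    # Remove the brave person and the identical crowd stays silent forever.
    print(cascade([1, 2, 3, 4, 5, 6, 7, 8, 9]))
    ```

    The point of the sketch is that the two outcomes differ by a single person's threshold: the distribution of private willingness is nearly the same, but whether anyone ever learns they are in the majority hinges on that first clapper.
    
    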

  12. 1:31:10–1:34:07

    Career advice

    1. JA

      point.

    2. MA

      Exactly.

    3. JA

      Okay, um, a few final topics I wanted to ask you about. One is: you're probably in a spot to be giving sort of life or career advice to young people a lot now, both in general, but also maybe specifically with AI and the current set of tech changes right now. What do you most often find yourself repeating to a really smart recent grad, if they get the chance to ask you, "What should I be doing with my career?"

    4. MA

      To start with, I, I never took any advice, so [laughing]

    5. JA

      Advice is... Yeah, there's something there.

    6. MA

      [chuckles]

    7. JA

      But a lot of people do.

    8. MA

      So may- maybe, maybe-

    9. JA

      Yeah, okay, fair enough.

    10. MA

      Like-

    11. JA

      That's like the Zuck, "You know, if you could have built Facebook" thing.

    12. MA

      Maybe, yeah, maybe-

    13. JA

      Or some of that

    14. MA

      ... maybe the best people probably shouldn't take any advice.

    15. JA

      Okay, well, then, the rest of us.

    16. MA

      But, um, I would just say, especially for young people, and again, I say this because people are very different, like, I believe that very deeply. You know, some people are very happy being in the middle of chaos.

    17. JA

      Mm-hmm.

    18. MA

      Some people are very unhappy [chuckles] being... Let's say, some people are very unhappy being in the middle of chaos, and they will actually, uh, get themselves out of a chaotic situation as fast as they can.

    19. JA

      Mm-hmm.

    20. MA

      Other people love chaos so much that if they don't have any, they will create it, right? And so, like, you, you have to, you know-

    21. JA

      That's true

    22. MA

      ... there's a level of understanding here, where, you know, not everybody should be in, like, a high-growth, high-risk tech company, 'cause it might just be too nuts.

    23. JA

      Yeah.

    24. MA

      So I don't think there's a one-size-fits-all kind of thing-

    25. JA

      Yeah

    26. MA

      ... at all. Having said that, let's narrow it. So, a young person who wants to kind of be in tech: I think a big part of it is, I would say, at least: run to the heat, like the seed thing we were talking about. Where are the interesting things happening? And that's a conceptual question, and it's also, like, a place question, and a community question, a network question.

    27. JA

      Yep.

    28. MA

      Um, and so, you know, run to that as fast as you can. And it doesn't mean, you know, running to the fads, but it means-

    29. JA

      Yeah

    30. MA

      ... trying to identify-

  13. 1:34:07–1:38:21

    Huberman “beef”

    1. JA

      Yeah. Yeah. Okay, next is, um, your Andrew Huberman thing that I see on Twitter. Like, what's... I, I actually can't completely parse what it is. What's going on with that?

    2. MA

      So we have a completely fake beef. We're good friends, very good friends. We're actually neighbors in Malibu, and I've been on his podcast. Like, we're very good friends. Um, but-

    3. JA

      But you don't follow his protocols.

    4. MA

      I don't do anything that he says. I don't do a single thing that he says. With one exception, which we'll talk about. But yeah, I don't do any of it. You know, he says maintain a regular sleep schedule. There's no way.

    5. JA

      You're all over the place on sleep?

    6. MA

      I'm all over the place.

    7. JA

      Mm.

    8. MA

      He says when you get up, see sunlight as soon as you can. I'm like, "No. [chuckles] The last thing I wanna do when I wake up is see sunlight." Don't drink caffeine for the first two hours of the day? It's like, NFW. It sounds like torture. It sounds like being in a North Korean concentration camp.

    9. JA

      That, that sounds bad.

    10. MA

      Like, I can't even imagine what-

    11. JA

      You drink a lot of coffee?

    12. MA

      A lot of coffee. Hot plunge, cold plunge thing, I'm not-

    13. JA

      The cold plunge is miserable.

    14. MA

      I'm not doing any of that shit.

    15. JA

      Yeah.

    16. MA

      Um-

    17. JA

      You think it's good for you, though, all this?

    18. MA

      Oh, I'm sure it's good for you. I'm just not gonna do any of it. It all sounds just completely miserable.

    19. JA

      [chuckles] It's good.

    20. MA

      Um, the one thing that, um, he says that I, I do is, uh, stop drinking alcohol.

    21. JA

      Mm-hmm.

    22. MA

      Um, and I would say I am physically much better off as a result, but I'm very bitter and resentful-

    23. JA

      It is poor-

    24. MA

      ... towards him specifically.

    25. JA

      Why'd you do, why'd you do that one?

    26. MA

      'Cause it's much better for you physically.

    27. JA

      Yeah.

    28. MA

      Like, it, it, it really is.

    29. JA

      Yeah.

    30. MA

      Like, it fixes sleep and energy problems.

  14. 1:38:21–1:39:33

    Question from X

    1. JA

      Okay, my last question: when I tweeted out a request for questions, I got almost ratioed by one question, so I'm gonna ask this one, like, nearly verbatim. It was by an anon, uh, named Signal. "If you were frozen for 100 years and you woke back up and you looked around, what would be the piece of data that you'd wanna know that would tell you whether or not your dominant worldview turned out to be correct in the fullness of time?"

    2. MA

      Yeah, so I will pick a very unfashionable answer to this, and I would say United States, uh, GDP, just, like, straight out US GDP.

    3. JA

      Okay.

    4. MA

      'Cause I would say embedded in that is the question of technological progress: if you have rapid technological progress, you'll have rapid productivity growth, which means you'll have very rapid GDP growth. If you don't, you won't have rapid GDP growth, and you'll see that in the GDP numbers immediately. Number two would be just, like: are markets a great way to organize?

    5. JA

      Yeah.

    6. MA

      Um, and the US is the best market, and so, you know, is that gonna keep working? And then third is: is the US gonna be a great country?

    7. JA

      And you are long all of this?

    8. MA

      I am very long all three of those.

    9. JA

      Yeah.

    10. MA

      I am very convicted on all three of those, but, you know-

    11. JA

      Yeah

    12. MA

      ... if I'm wrong about something big, it's, it's, it's gonna be something in there, and it will show up in that number.

    13. JA

      Marc, this was amazing. Thank you so much again.

    14. MA

      Good. Awesome. Thank you, Jack. [upbeat music]

Episode duration: 1:39:33


Transcript of episode 53FImKtf2i0
