EVERY SPOKEN WORD
85 min read · 16,783 words
- 0:00 – 0:42
Intro
- Marc Andreessen
People are becoming what we now refer to as AI vampires. They've got these huge bags under their eyes. They're completely exhausted, but they're like euphoric. They're thrilled. We're entering a golden age, which is AI is going to be a superpower that everybody on the planet's gonna have access to. It's like the most dramatic increase in programmer productivity in, like, ever.
- Erik Torenberg
Twitter proved it, right? Cutting 70%, and then it's running better or as good as it was before.
- Marc Andreessen
I generally don't wish I could go back in time and do things over again, but it would be really, really fun right now to be 18 or 20 or 22 and to have this capability and figure out what I could do with it. We are gonna see super producers the likes of which we've never seen in the world.
- Erik Torenberg
There's news about it. UFOs. What is clear is the government at certain times has hidden certain materials. W- why would they do that if there's nothing to really worry about?
- Marc Andreessen
Two things are pretty clear at this point. One is that-
- 0:42 – 2:49
The Anthropic Blackmail Incident & AI Doomer Literature
- Erik Torenberg
Marc, welcome to Monitoring the Situation.
- Marc Andreessen
Erik, it is great to be back.
- Erik Torenberg
So there's a lot to monitor today. Uh, I wanna f- start, first start with something that just happened, um, which is the, uh, Anthropic, uh, b- blackmailing incident. And I, I first wanna tell a, a brief story, which is, uh, my, uh, friend Joe Hudson has this concept called the Golden Algorithm, and the, the Golden Algorithm states that, um, whatever you're scared about, you bring it about in exactly the way you're scared about it. So if you're scared about getting abandoned, you'll be super insecure, and then you'll ... People will abandon you 'cause you're so insecure. Uh, this is a, an example of a literal golden al- algorithm where people have been so scared that AI is going to be evil and have written about all the ways in which it's evil, and in fact maybe it's informed, um, in, in f- formed something. W- what's happening there or, or what do we find, uh, interesting?
- Marc Andreessen
I, I, I haven't studied this one in detail. Um, uh, I, I've been monitoring other situations. But however, um, uh, I mean, just what I saw so far, I think Anthro- I just saw Anthropic's thread. I haven't-
- Erik Torenberg
Yeah
- Marc Andreessen
... I haven't, I haven't read the underlying material yet. But Anthropic's thread said they trace the, they trace some blackmail, uh, behavior to, literally to the, the, to the AI Doomer literature.
- Erik Torenberg
Yeah. [laughs]
- Marc Andreessen
That they, that they, that, that it was in the training data. So there, there are all these, there are all these, there are all these, you know, scenarios of like, you know, the Terminator, you know, all the ro- rogue AI gone wrong that the, that the AI Doomers have been writing about for 20 years. And, and literally Anthropic, of course, which is, and of course the company is like, you know, half Doomer. Uh, but apparently, you know, basically, y- you know, essentially said that their own, their own, their own movement's literature is the thing that's causing the behavior that they say they, they, they, they don't want. So it is a fairly, um, incredible... Yes. Yes, it is. Yes. Uh, well, I mean, like look, if you d- if you don't wanna build a killer AI, you know, step one would be don't build the AI.
- Erik Torenberg
[laughs] Sure.
- Marc Andreessen
[laughs] It's, it's like hmm. And then, you know, step two is like don't train it on all the data that says it's supposed to be, you know, the, the literature that your mo- mo- movement wrote that says it's supposed to be a killer AI. So I, you know, yeah. I don't know, yeah. It's like your, it's like your, your, your golden algorithm coupled, coupled with like the snake eating its tail, um, coupled with, you know, I don't even know. Like the whole thing is so bananas.
- Erik Torenberg
Yeah. Yeah. The um-
- Marc Andreessen
I mean, uh, the, you know, the, I can't resist, you know, if, if I could, if I can, if I could act out memes and so this gives, it's the Scream meme, right? Which is, you know, the call is coming from inside
- 2:49 – 16:33
Suicidal Empathy & the SPLC Indictment
- Marc Andreessen
the house.
- Erik Torenberg
Yeah. [laughs] Yeah. E- e- e- exactly. The um, un- s- speaking of other situations, ano- another thing you've been talking about recently is, um-
- Marc Andreessen
Yeah
- Erik Torenberg
... is the concept of suicidal empathy. And, and, and Matt Cramer had a good quote which is, "If the empathy you have doesn't make you more forgiving, more accepting of other people's spiritual sovereignty, or more understanding of people who don't want to think or live the same way you do, you don't have empathy. You have empathy™." W- why have you been thinking about th- th- this, this concept?
- Marc Andreessen
Yeah. So there, there, there's this really brilliant, there's this really brilliant, uh, guy, uh, Gad, G- Gad Saad. Uh, don't know exactly how to pronounce his name.
- Erik Torenberg
Yeah.
- Marc Andreessen
Gad Saad. Um, and, um, and, uh, you know, he- very brilliant guy, and has obviously lots of YouTube videos and books and so forth, and really brilliant guy. So he's got this new book coming out on, on so-called, what he calls suicidal empathy. And, you know, the, and look, it's a, you know, there's a, there's a sort of political loading to it which, you know, we don't need to spend a lot of time on. But, um, you know, it's sort of this idea that there are kind of these social justice, you know, kind of social reform movements, you know, kind of through time th- that have this characteristic of, you know, they, you know, they, they, they claim to be causing positive change, you know, in some direction. Then it turns out they have, you know, sort of severe, you know, sort of negative consequences. Um, the, the, the great Thomas Sowell, um, you know, has, you know, basically spent 50 years writing books about this. Um, and [laughs] and by the way, no- nobody, nobody listened. Um, and then in the last decade we've been through, you know, wave after wave of this kind of social activism that kind of, you know, results in like... I mean, it's, it's all the stuff, right? It's just, you know, all these like, you know, cr- cr- crime policy reform, defund the police things, and then it causes these massive crime waves. And then of course low income and minority people get hit hardest by that. And, you know, all the, all, all these other, all these other crazy things. Um, and so he says, you know, he says the characteristic of ce- kind of that kind of social reform movement is, is characterized by what he calls suicidal empathy. Um, and the, and the idea be- being basically it's, it's sort of driven by a, a pathological, you know, take it backwards. A pathological form of empathy on the one hand, um, which is, you know, a, a sort of a deep desire to be nice, um, uh, and empathetic. 
Um, uh, you know, but, but coupled with like basically a, a, you know, a sort of self-destructiveness. You know, e- either a willingness to really cause damage to the people you claim to be speaking for or, by the way, to cause damage to yourself kind of in that process. And, and it's the kind of thing where, you know, if you've, you know, if, if, you know, if you've lived through, you know, like everybody who, you know, you know, everybody in San Francisco has lived through this for the last decade and seen the consequences of these movements. I, I, you know, the San Francisco version of this is like the, the quote harm reduction movement, you know, that e- ended up basically handing out, you know, free drug p- you know, paraphernalia and, you know, in some cases actually just free drugs to, you know, people who were just literally dying in the street from drug addiction, right? So, so, so you, you just look at it and you're like, well yeah, that, that, you know, they, they claim to be activists, they claim to be reformers, they claim to care about these people, and yet they're, they're killing them or, or, um, and then killing the city, um, and causing innocent people to get, get harmed. It's like, okay, that, you know, may- you know, that, that, that, you know, they, they, they seem so actively like that they're doing it out of some sense of compassion that this must be suicidal empathy. Um, the, [laughs] the problem, the problem with it is, and I think the, the problem is the theory is sort of easily falsifiable or, or, or maybe lets, lets the reformers off the hook, which is they certainly don't show empathy to their enemies. [laughs] Right? And so if they're like em- if they're like all empathetic you would think that they would be less aggro, uh, when it comes to destroying their ideological opponents, you know, who they act- you know, they take great delight in trying to wreck, um, number one, on, on the one hand. 
And then, and then number two is they, they use the, you know, the classic reformer move is to use, use these movements to gain, to gain power and status and money for themselves. And, you know, again, San Francisco is a case study of this where you have all these, you know, nonprofits that, you know, re- wreak, wreak all this damage on the city and yet, you know, basically get like lavishly funded. You know, including by the way, by the city government and by the state government. Um, and so it, it's just like, okay, well, like, just, like, they're not, if you just, like, spend two seconds thinking about it, like, it, it, the, you know, they're neither empathetic nor, nor, nor are they suicidal, right? Right? Rather quite the opposite. They're hateful and they're, and they're, and they're, and they're greedy. You know, they're sort of self-aggrandizing, um, and, and gathering, uh, power and resources for themselves. Um, and so I just, I just think it, it, it, it, it lets, it lets the phenomenon off the hook. I, [laughs] you know, it's a little bit like, "Oh, Erik, what's your biggest flaw?" You know, "Oh, I'm too nice."
- Erik Torenberg
Yeah.
- Marc Andreessen
"Yeah, I care too much." Right. Exactly.
- Erik Torenberg
Yeah.
- Marc Andreessen
It's like, yeah, dumb bullshit. Like, I, I, by the way, Erik, I don't know, I don't know what your biggest flaw is.
- Erik Torenberg
[laughs]
- Marc Andreessen
Like, it's, it's definitely, like, it's definitely not that 'cause that's also definitely not my flaw.
- Erik Torenberg
[laughs] Yeah.
- Marc Andreessen
Like, I, I guarantee I have other things wrong with me that are, like, w- way more wrong than that.
- Erik Torenberg
Yeah.
- Marc Andreessen
Um, and so I, I, I just, I g- yeah, I can't. I, I, I hit my limit on, on, on, on, on, on that topic.
- Erik Torenberg
And it m- maybe a, maybe a crazy example of this, and I'm not sure if all the, all the facts are out yet, but, but it was a situation a, a week ago, but it hasn't been covered m- you know, that, that much. The SPLC incident. I- is it accurate that basically what was happening, or is it our understanding that basically the groups that they were, um, sort of fighting the most or, or thought were the biggest threats to, um, sort of, you know, what they care about, also the same groups that they were secretly funding unbeknownst to those, to, to those groups? Or h- how do we make sense of what was actually happening there, and is that, is that indicative of what's, of, of something bigger happening? And it's funny because that happened the day after we had a conversation about astroturfing, and I was, uh, and I was like, is it really happening to the degree that, that, uh, you know, people are... Or sort of conspiracies. Is it re- are things like that really happening? And it's just funny that more and more seem to get uncovered.
- Marc Andreessen
Yeah. And so I should start by saying the reason why this situation really matters, and I actually think matters a lot, is the, the SPLC specifically, um, and, and other groups like it as well, but the SPLC specifically played a dominant role in the debanking and censorship-
- Erik Torenberg
Yeah
- Marc Andreessen
... and cancellation programs over the last 15 years. Um, and I cannot tell you h- I, I was in so many meetings in so many contexts with so many companies, um, where the SPLC's word was, was gospel. Like, it was just like, "Oh, it's the SPLC." It, it was almost like they're the outsourced US department of, like, I don't know, racism detection or something. It's just like, if the SPLC says you're bad, you're bad. Um, and, and you're bad means you get kicked off of all the social media platforms. It means you get debanked. It means you can't get a job. It means, like, just, like, it, it's just, like, t- total absolute, like, you know, social and economic death. Um, and, and in my view, you know, I've been very vocal on the debanking and censorship topics. In my view, that includes, you know, very deeply un-American, and I think in many cases unconstitutional removal of s- of, uh, sp- uh, free speech rights and also, um, literally the ability to bank, to, to bank. Um, and in fact, you know, our, our partner Ben's father himself was, was, was specifically tagged and attacked by the SPLC for, for u- unfairly, very unfairly, uh, for being racist, um, and was himself debanked, um, and, you know, really, you know, like, directly threatened his livelihood, um, in a, in a really, uh, you know, egregious way. Um, and, you know, a- and then, and then by the way, the significance of this is, of course, it's not literally the US Department of Racism. It's actually arguably worse than that. It's not a government agency, and so it's not subject to, like, any level of government oversight. It's not su- you know, it's a completely... You know, so, you know, as I say, it's an NGO, right? Um, and so it's, it lives in this, like, twilight world. You know, it doesn't have the, you know, business responsibilities of a company. It doesn't have any of the, um, any of the legal, um, oversight, um, you know, that a government agency has. 
It lives in this kinda twilight world where it gets to do, you know, fundamentally gets to do whatever it wants. Um, and then by the way, on top of that, you know, it raises, raises money as a nonprofit. Um, so, you know, on top of that, everybody gets a tax break. And so it's this, it's this, you know, kinda shadowy thing. Like, if, if, if you, if you, if you didn't agree with its politics, you were just like, "Wow," like, "That, that... This is, like, a weird star chamber, like, shadowy thing. Like, what the hell?" Um, it, but, like, it had, like, really, really, really, really intense power, particularly, uh, in, in the business world, particularly in the financial sector, particularly in Silicon Valley. Like, it could basic- It was like a Death Star, uh, to be able to aim at, at, at obliterating people's reputations, um, uh, and rights. Um, and so, you know, this is a really big deal. Um, by the way, um, many of the, uh, uh, big corporations and, and including big tech companies funded it directly. Um, and so the, the, the money trail on this is not, not just major philanthropists and political activists, um, but also, um, actual, you know, actual companies. And then by the way, they also had, you know, a long history of actually, uh, cooperation with certain government agencies, including, I think for a long time, they, quote-unquote, "trained FBI agents," um, on basically, um, uh, you know, essentially catching, you know, r- you know, r- racist and therefore, you know, sort of presumptive, um, domestic terrorists or something. Um, and so just, like, a v- v- very powerful outfit. And, and then, you know, this, this, this, this thing that dropped is that they, they've been now criminally indicted by the US Justice Department. And I, and I should say the, the indictment is, like, reads like a novel.
- Erik Torenberg
[laughs]
- Marc Andreessen
Now, it's an indictment. The SPLC, in fairness, has not had a chance to, uh, present a defense. Um, and so, you know, presumably in court, we'll, you know, we'll, we'll get both sides of this, um, which w- I'm sure will be an absolutely spellbinding experience. Um, so I, I, you know, of course, I, I wanna say, you know, all, all, all, all of the things that are in the indictment are allegations, um, and innocent until proven guilty and, you know, so forth. Um, you know, however, the allegations are eye-watering, right?
- Erik Torenberg
Yeah.
- Marc Andreessen
And the allegations are that they, they, the SPLC w- using donor funds was directly funding, among other organizations, the Ku Klux Klan and the American Nazi Party.
- Erik Torenberg
[laughs]
- Marc Andreessen
[laughs] Let me just repeat that. The Ku Klux Klan and the American Nazi Party. Um, as well as an array of other, uh, sort of, uh, sort of extreme, you know, hate, you know, you know, literal, literal, literal ha- literal, literal hate groups. Um, and, you know, and funding them, and not just funding them, not just, like, funneling money in, but, like, funneling money to very senior members or leaders of the, of these organizations. And then the, the kicker is in, in, in the out- in, among the allegations is that they were dif- directly funding one of the leaders of the January 6th, um, uh, uh, uh-
- Erik Torenberg
[laughs]
- Marc Andreessen
... ri- ri- whatever you wanna call it riot, riot at the Capitol. Um, uh, uh, oh, sorry, no. Um, I'm, I'm gonna... Let me back up. We, we need to clip that. I'm sorry. We-
- Erik Torenberg
Sure. I'll take that out.
- 16:33 – 25:39
AI, Jobs & the Rise of the AI Vampire
- Erik Torenberg
You know, well, people respond, of course, with, "Hey, this time is different 'cause it's, it's, uh, it's all potential cognitive work." And so, uh, and then there's also this other statement of, "Hey, um, you know, humans will, you know, differentiate among taste and agency," but it seems like AI can, can, can do that too. Um, and, and then there's al- that juxtaposed, there's also the statement of, "Hey, uh, it won't replace a lot of the jobs 'cause a lot of jobs are make work anyways." Uh, you had this, you know, tweet the, the other day of, "Hey, I've, I've been saying companies have been, you know, two to four X bloated for a long time, and people have just been unwilling to, to, to, to deal with it or, or look that in the face." So, uh, and this presents kind of a golden opportunity for that. What, um... So why, why don't you, uh, address some of these, uh, some of these topics as it relates to AI and, and jobs of the future as it relates to tech?
- Marc Andreessen
Yeah. Yeah. So we'll come back to the bloat thing. I will say the, the funny thing on the bloat tweet was, uh, the responses have been along the line... The responses for the, for the most part have not been, "You're wrong." The responses have been, "Oh no, the company I used to work at is like eight X bloated."
- Erik Torenberg
[laughs] Yeah, too generous.
- Marc Andreessen
Or four X. Right. Or yeah, too generous, or, you know, the non, you know, or by the way, the nonprofit or the, you know, whatever the, you know, whatever kind of institution, uh, you know, agency I used to work for was similarly-
- Erik Torenberg
Well, well, Twitter proved it, right? Cutting 70% or 80% and then it's running, you know, b- better or as good as it was before at, at l- at least, and it's probably not the only... It's not the exception.
- Marc Andreessen
I, I, I mean, look, I, I don't even know, uh, and, you know, as I... If, uh, uh, if I knew I wouldn't say, but like, I, like, I think Twitter is w- way down [laughs] from, from the 80%. [laughs] Right? I think they're... I don't know if the number... For sure the number has a nine on it, um, if not, if not, uh, if not a high nines. Um, so yeah, no, he's really, he's, as usual with Elon, he's really demonstrated, he's really, um, forecasted the future through his own actions. Um... Yeah. So, um, yeah, so a couple of things. So one is, I mean, look, there's just this endless, you know, there's this endless... I mean, there's literally been a 300-year argument about, about, about, about mechanization, industrialization, technology, computers, software, uh, replacing human labor, causing unemployment, you know, lower wages and unemployment. You know, it's, it's been a 300-year argument. I, I, you know, quite frankly, I'm even wondering at this point whether it's even worth having that argument because people really, really deeply don't want to hear it.
- Speaker
Yeah.
- Marc Andreessen
Um, and what I find is I, I, I, you know, I go through it. Many other people, by the way, there's great books on this topic, and there have been for hundreds of years. Um, um, and, um, people have talked about this for a long time. You know, people really... This is one of those things where people really don't want to hear good news. Um, [chuckles] and so, you know, it's, it's actually even ha- hard to have a discussion about it because people actually won't... They're, they're so dug in on this that they actually won't even engage on the topic. They just keep repeating the same, kind of repeating the same fallacy over and over again. So, so, you know, we, we could go through that. I guess the, the more interesting thing to say, though, is just, like, we have data now, um, and, um, you know, because now we, now, now we have AI, and now we have data, so we can, like, look at what's actually happening. Um, and I would just make a couple of observations. Um, so one is there was actually jobs, jobs data just came out today, um, uh, as a situation to monitor, and, you know, it's sort of unexpectedly good. Um, and so, you know, you know, the... And, and by the way, the, the, the jobs data overall in the last couple of years has been, has been interesting because the federal government has actually shed, you know, has shed a lot of workers. I think the federal government is down, estimates are as much as 400,000, uh, workers in the federal government, um, since Trump took office the second time. So, so public sector, uh, employment is actually way down, and then private sector employment is way up, and then the net result, I think, for the last quarter was actually very positive.
Um, and so, like, so in other words, like, the, the reported jobs numbers are even more impressive than they look because the private sector growth actually has to make up for the public sector decline, which means the private sector job growth is actually much better, um, than, than people were expecting. And again, like, this is in the face of, like, actual AI, you know, staring us in the face and, you know, being, being rapidly adopted. And so, okay, like, the data is, you know, there, you know, there's more data. Um, and then the other thing that, you know, that's sort of macro data. And then there's the micro data, which is the world we live in. Um, and the micro data is, you know, the sort of obvious question is, you know, if you live in Silicon Valley or work in San Francisco, you undoubtedly have friends who are, you know, computer programmers. Those friends, you know, some percentage of those friends are early adopters of AI coding. Um, you know, you would, you would, you, you can just observe their behavior. And of course, the, if you believe in the, you know, kind of the, the, the Luddite, you know, kind of zero-sum argument, you would expect that they would be working less and less and, and then rapidly becoming, be- becoming un-, you know, and getting paid less and less, by the way, and be rapidly becoming unemployed. And in fact, the observed, the observed behavior of what's happening is very clear, which is the opposite, uh, which is those people are becoming, uh, what we now refer to as AI vampires, um, by which I mean, um, their, uh, their, the individual programmer's productivity... An, an individual programmer uses, you know, uh, uh, Codex or Claude Code or, or one of these, uh, AI coding systems. Just the thing that you just now see over and over again at, at, at, at, at, at, at, on the ground level is, um, they're working harder than ever. Um, they're, they're working just, like, more hours than ever.
And then the AI vampire thing is literally this thing where they stop sleeping. Um, and, and then when you, when you, when you, like, talk to them, it's actually really funny, um, because, um, they're like, they l- they... And I have a whole bunch of friends like this. They're bleary-eyed, like they've got these huge bags under their eyes. They're completely exhausted, um, and but they're, like, euphoric. Like, they're thrilled. Like, they- they're having the absolute time of their life. By the way, a fair number of people who we, we both know, you know, literally they're, they're former programmers who stopped coding at one point, um, and then all of a sudden, you know, have picked it back up again. Um, and then, you know, actually, we have partners.
- Speaker
Yeah.
- Marc Andreessen
Uh, you know, you and I have partners at the firm who actually have never coded, who are now, like, ripping out software like crazy. And, and again, they're, they've, they've turned into [chuckles] they've turned into AI vampires. I won't name names because I'll, I'll let him tell his own story, but we have one partner who's built an entire AI system for everything that he does at work, and he is absolutely excited about it, and it's, it works great, and he loves it. Um, and it's like his partner in all of his work now. Um, and, uh, I asked him, I said, uh, uh, I, I said, uh, "Have you, do, you know, have you looked at..." You know, he, he vibe coded the whole thing, and I said, "You know, have you looked at the, have you looked at the code?" And he's like, "Hell no." Um, uh, you know, "I've, uh, I, I, I've never done that." And I said, "You know, have, have you, have you ever looked at any software code?" And he's like, "Hell no." [chuckles] Right? He doesn't have a, you know, he's not, he's not a programmer by background. Um, and yet all of a sudden, he's, he's a, he's, he's hyper-productive. Um, and so y- you've got this, you've got, of course, the phenomenon, which is sort of exactly what classic economics would predict, which is if you increase marginal productivity of the worker, you don't have a diminishment of human work, you have an expansion of human work. Um, you, you make the worker more productive, therefore the worker works more, um, and, and, and gets paid more, and there are more, more jobs in the process. Um, and so it's, it's, it's the opposite of what, of what, of, of, of what all the doomers say. So, so we're seeing that at the level of these individuals. And then, and then, by the way, what you see kind of inside companies, um, uh, inside employers of these individuals is, of course, these people are now in even more demand than they were before. They, they are gen- they are garnering higher salaries than they were before. 
Um, and, um, and then by the way, their, their pro- and by the way, their productivity is just, is just starting to ramp up, right? Like, the, the, everything that I'm describing and, like, you know, like, at, at our leading-edge companies, estimates are the leading-edge programmers are, like, 20X more productive than they were a year ago. Like, it's like a, it's like the most dramatic increase in pro- programmer productivity in, like, ever. Um, and, and so, again, logically, people get paid according to their marginal productivity, and you're also seeing that track in the compensation data. Or I, I'm seeing that on the ground in the companies, which is the, the, the, the, the more hyper-productive a, a coder becomes all of a sudden, the more bargaining power that they have, um, you know, for, for their compensation. And we're, we're seeing, uh, comp, uh, for those people, uh, ra- ramp up quickly. Um, and so I, I just, it's, it's just kind of like, it's just kind of staring us in the face. And, and, and coding, of course, coding is, like, the first domain in which this has happened. Now y- people want to project forward and say this is gonna happen in every area of knowledge work. Um, uh, and then, um, you know, I th- I think you can predict a similar outcome. And then that gets us to the bloat topic, um, which is, of course, the, the, the other thing that's happening is, of course, companies announcing big layoffs. Um, and, and then, you know, of course, immediately it's like, you know, two, two plus two must equal four, and so if it's AI coding, it must therefore translate into, into layoffs, and, you know, Marc, you're wrong. You know, therefore, all of your ideas are wrong because that's evidence that the, you know, these companies are wiping out their, um, you know, they're, they're, they're reducing their, their workforces or, or, or really nuking them, uh, because of AI coding. 
And I, and I guess there, again, this is, like, maybe the inside, inside baseball take on it is, but I, I see it, I see it, uh, up close, which is just e- every, every major Silicon Valley company is overstaffed. Um, every major Silicon Valley company's been overstaffed basically forever. They all know it. Um, there's a whole variety of reasons why it's the case. Um, uh, by the way, I think this is true basically of just, like, you know, corporate America broadly, uh, you know, co- co- companies broadly. We, we, we, uh, we can talk in, in detail about why that's the case because it, it flies in the face of the idea that these companies optimize for profits, which they definitely do not. Um, [chuckles] like the, the one thing that is the least true claim in the world is that companies are optimized for profitability, which is 100% not true. Um, uh, and so, um, and so, you know, and then, you know, basically if you're gonna do a big cut, uh, uh, uh, like if, if you wanna do a big cut, if you wanna take out, you know, whether it's 15% or 40% or whatever, like obviously you wanna, you wanna scapegoat, right? You just, you, you wanna peg it on something. Um, and so, um, of course it's gonna get pegged on AI. A- and again, and I should say, like it's not like it, uh, it's not like it's just like a straight lie. Like it, it, it is simultaneously true that there's, there are these massive... You know, for the same amount of coding, you can now have fewer people using tools. Like, that is true, and so do you need as many aggregate number of programmers if you're generating the same amount of code? No, you don't, and so you can take out people, um, on the other side, and so there is truth to that. But what that misses is, is what, what happens on the other side of that, which is of course you're not just gonna be generating the same amount of code in the future. You're gonna be generating a lot more code.
You're gonna be building a lot more products a lot more quickly, um, and that, and that's gonna fuel, you know, enormous amounts of employment growth on the other side. Um, and so, so I, so I th- I think you're seeing both, uh, basically both, both, both phenomena play out. And, and you kinda have to read the announcements coming out of these companies in code, uh, because of the, the kind of those two, the way those two dynamics are crossing.
- 25:39 – 30:55
The Future of Tech Jobs: From Coder to Builder
- Erik Torenberg
Yeah. Um, that's, that's well said. There was an article that was going viral in our circles the other, uh, week about the jobs of the future, um, and, uh, Yoni Rekhmans said there's, there's, there's a possibility the only jobs in tech companies are gonna be, one, product engineer/vibe coder/slop cannon, two, uh, you know, infra security systems, three, ad- the adults in the room, you know, like legal and finance, and then four, hot people, uh, slash per- per- personality hires. Um, w- w- any, any, any truth in this or, w- you know, w- w- what do we make sense of this?
- MAMarc Andreessen
And what do the, what do the hot people do exactly?
- ETErik Torenberg
[chuckles] Uh, range, sales people, uh, you know, cu- customer support. Uh, there will always be an important place for those who present an easy UX to the world and are pleasant to be around. There are many ways to be hot.
- MAMarc Andreessen
O- otherwise known as the pharmaceutical sales rep.
- ETErik Torenberg
Yes. [chuckles]
- MAMarc Andreessen
Yes. That's, or-
- ETErik Torenberg
Yeah
- MAMarc Andreessen
... or the Oracle, or, or the Oracle sales rep. So, um, yeah, so, uh, yes, exactly. So, um, yeah, I mean, look, uh, th- this is gonna happen. Like the... Well, well, not literally that, but like the jobs are, the jobs are gonna change. I mean, th- this is sort of the obvious thing and this, this always happens. The jobs are gonna change. You know, by the way, so there's like a nascent concept that, that is actually playing out. I'm seeing it in a bunch of the early, uh, leading edge companies in the Valley, which is they're, they're kind of circling around a, a, a, a, a job title kind of c- could loosely call builder, um, or, or something like it. And, and basically the idea is that you had these separate jobs in the past of, uh, programmer, product manager, and designer. Um, and I, I've been describing what's happening in the Valley companies as sort of this three-way Mexican standoff, uh, where the programmers think that they can, uh, they don't need the, the product managers and the designers anymore 'cause they can have AI do that, and then e- each of the other two doesn't think they, they need the other two either. Um, and, and what I've been predicting is like they- they're all, they're all correct. Um, you know, the, the, the, the product manager can generate code and design now, and, you know, so each of them can do the job of all three. Um, and so the idea is, yeah, the jobs change, and so now, now the job is builder. Um, and you might come into the builder, you might get on the builder track by coming out of coding or product management, um, or, um, design or, or maybe even something else, customer service, um, or whatever. Um, and, um, uh, uh, uh, and, and, and but, but you th- you b- you then become responsible for building, you know, building complete products. 
Um, and, and, and again, you have this kinda, you know, you're s- you're super empowered by the, the, the, the AI that can help fill in, um, all the things that are not directly in your background. Um, and so, like I think, I think it's entirely possible that we're sitting here in 10 years, you know, in 20 years or whatever, and like the, you know, the, the, the job of coder is gone, but you have this just, you know, ex- extraordinary number of, number of builders running around. And again, by the way, th- this is the historical pattern, right? And so, um, I think our partner David, David George, uh, did a post, um, on this this week, but it's... I forget the exact numbers, but it's some, you know, some, some giant percentage of the jobs that existed in, you know, call it 1940, were like gone by 1970. Um, and they're like ancient history today. Um, right? Um, and, and, and I mean, the ultimate example of this is, you know, United States 200 years ago, like 99% of the people in the US were farming. Um, and today it's like 2%. And having, you know, grown up in farm country, I can tell you [chuckles] all these people who worry about, you know, job loss and job change would not like to go back and be farmers, I guarantee that. And particularly they would not like to go back and be farmers the way people were farming in 1800. Like, they definitely don't wanna do that. Um, and so the, the new jobs that have been created, of course, are far better jobs, um, and that isn't to, you know, understate the, the level of, you know, kind of stress in, in, in, in individual people's lives as, as the economy changes. But in, in aggregate, the result is evolution towards, um, you know, toward, towards higher income and, and, um, and, and sort of more, um, you know, d- jobs that people are, are, are happier to do. 
Um, by the way, you can also see all of this playing out in the American economy broadly, um, which is the, the American economy is, um, the, the, the, there again, there's this kinda, and there's this kind of doomer narrative or there has been for a long time that like the American middle class is, is, is, is, is falling apart. And the, the sort of presumption of that is that all the middle class people are kind of falling off the ledge and becoming, becoming lower class. But actually the... A- and by the way, there is some of that and, you know, there's, there are communities in which that's very clearly happening. But having said that, there, there is at least as much or more of the other phenomenon, which is people in the middle class climbing the ladder into the upper middle class, um, and, um, you know, rapidly gaining, um, in, in wealth and income, um, and, um, um, and, and it just, and again, just like quality of life, um, you know, for, for themselves and for their kids and their grandkids, um, you know, as, as time passes. And, and, and that, uh, uh, and that is a consequence of actual economic [chuckles] development, uh, technological change, uh, job transformation actually being allowed to happen, um, is, you know, 20 years later you're look, you look back and you're just like, "Oh, thank God." Like, th- this is just a much better world, uh, you know, for, for me and my family than it was before. Um, and so I, this is why, like I'm so opt- I, like I, I think, you know, God willing, like w- we're entering a golden age on, on this topic, which is a- AI is going to be a superpower that everybody, uh, in the country and everybody on the planet is gonna have access to. Everybody's gonna become far more capable at whatever it is that they, that they, that they wanna do. Um, they're gonna become far more productive in whatever line of work that they're in. 
Um, they're gonna get, you know, comp- they're gonna get compensated accor- the economy naturally, uh, uh, compensates according to, according to productivity. Uh, so they'll, they'll, they'll get, uh, they'll, they'll get compensated that way. The, you know, there will be a rapidly rising ladder of, of, of, of, of, of both income and number of jobs. And, and my, my prediction, again consistent with history, is the extent to which that's a positive phenomenon is a function of the degree to which it's actually allowed to happen. Um, the, and, and then of course [chuckles] Europe is run, is gonna run the opposite, you know, test, which is they're gonna try to prevent all this from happening. And, and again, I think the data's already in there, um, which is they're, you know, they're, they've been, they've been falling very badly behind economically and they're gonna continue to fall further and further behind, uh, the US. And it, and, and it's, it, it's, it's a tragedy 'cause it's 100% a self-inflicted, uh, self-inflicted wound.
- 30:55 – 38:48
AI Psychosis, AI Cope & Why the Models Are Actually Great Now
- ETErik Torenberg
Yeah. The, um- It's well said. You, you were also... We've, we've talked about, and you've written about, AI psychosis. There's an AI Psychosis Summit apparently happening. I'm not sure if that's real or a parody. Um, I haven't looked into it. But I'm, I'm curious how, how you make sense of this phenomenon. You've also written about... So you, you, you tweeted the other day sort of the opposite of AI psychosis is, is cope, AI cope. Uh-
- MAMarc Andreessen
Yeah, cope
- ETErik Torenberg
... so maybe you can talk about the, the both sides.
- MAMarc Andreessen
Yeah. I also identified ear- earlier identified a, the concept of AI psychosis psychosis.
- ETErik Torenberg
[laughs]
- MAMarc Andreessen
Uh, which, which, which, which, which we can, we should also talk about.
- ETErik Torenberg
Yes. Let's unpack it as well.
- MAMarc Andreessen
Um, yeah. So first of all, the AI Psychosis Summit did in fact happen.
- ETErik Torenberg
[laughs]
- MAMarc Andreessen
Um, I was not there, but I am assured that it did. Uh, some very, very smart and creative people put that on in New York, I think late last week. I think maybe about a week ago. Um, and, uh, it was an art, it was like an art, it was essentially an art project, and it was basically ar- ar- artists and creative people who were, um, who got together and, like, fully endu- indulged their, uh, their AI psychosis, um, in the, in the form of creating new art, uh, using, using AI. And, um, I, I, yeah, I would definitely recommend people should, should, should go, uh, should go, uh, go, go on, go on X and search on AI Psychosis Summit and take a look 'cause it's, it's, it's incredibly creative. And I, and I, I think it's fantastic 'cause it's, it's, you know, it's a, it's a, you know, it's a, it's a little bit tongue in cheek, but also it's, it's a, you know... There is a real split that's developed in the artistic community, the creative community in Hollywood. Um, and there are people who are staking out kind of very extreme positions on pro-AI, anti-AI. Um, and, uh, it's generating a lot, a, a lot of heat. Um, and, and so this, this was a n- I, I think this was a nice example of, like, oh, no, actually, like, in a, in a world of AI, like, creatives are gonna have... Again, creatives are gonna have all these superpowers. They're gonna be able to create all kinds of art, uh, that wasn't possible before. Um, and then of course, you know, the, the, this whole topic, you can create art about this topic. Um, uh, so I, I thought that, I, I think all the stuff that was there was very creative. Um, yeah. And then, uh, yeah, so, so, uh, yeah, so okay, so my concept. So, so AI psychosis. So AI psychosis is a, is a p- is a pejorative. Uh, so AI psychosis is the idea that, um, if, um, uh, if, if, um, it's the idea that basically people get whammied by the AI. So the, you know, the, the, the classic example is through what's called sycophancy. 
And so, so it's basically like you, you know, you tell Claude you've discovered a, you know, a new, um, you know, you know, you have a new idea for an anti-gravity machine. And Claude says, "Oh, that's amazing."
- ETErik Torenberg
[laughs]
- MAMarc Andreessen
Like, "That's amazing. Like, you've achieved a giant breakthrough in physics. Like, nobody else has ever thought of this before. You are an underappreciated genius."
- ETErik Torenberg
[laughs]
- MAMarc Andreessen
"And, you know, I... It's so unfair that you couldn't get admitted to the, you know, physics department at MIT. And, you know, you know, they're all gonna feel like completely stupid when they see this work that you've done." Um, and so, you know, kind of people go down this rabbit hole, and, and I sh- and again, I, in fairness, I should also say, like, if pe- if people are prone, if people are prone to delusion and, and an AI is overly syco- sycophantic, like, then it, it is going to feed delusions. And so there is a, there is kind of a s- a kind of serious element to that among people who are kind of predisposed to that kind of thing. Um, but, but, but again, it's like, okay, yes, there, there's so- be some number of those cases. But that causes kind of AI critics or AI doomers to basically say anybody therefore who reports a positive, productive experience with AI has fallen into AI psychosis, right? And so anybody who actually is like, "Wow, my productivity is way up," or, "Wow, I really have a thought partner for the first time in my work," or, "Wow, I really have been able to produce something that I never would've been able to produce before," you know, that, that's sort of all bucketed under, you know, they, they all have, they all have AI psychosis. Um, which I, which I... And, and then that led me to my, my, my, my coin- my coin, a- AI cope, right? Which is the other side of it, which is like AI cope is classifying anybody who has a positive experience with AI as being in AI psychosis. Um, and, and, and, and, and, you know, AI cope is this thing where in, you know, concentrated in certain places on the planet, um, where people are just, like, absolutely hell-bent on proving to themselves and everybody else that this whole thing is a complete, you know, fraud, fake, you know, is a stoca- there's a term, stochastic parrot.
- ETErik Torenberg
[laughs]
- MAMarc Andreessen
AI is, AI is fake. It doesn't work. Um, and if anybody's having a good experience, um, uh, you know, they, they must be full of it. Um, and so that, that's the AI cope. Um, and, and I would describe the AI cope as people who are basically dismissive. Um, uh, and then, um, and then AI psychosis psychosis is the people who get really mad.
- ETErik Torenberg
[laughs]
- MAMarc Andreessen
The people who froth at the mouth. Um, and so yeah, maybe it's, it's, it's AI cope, but with a, with a, uh, w- with a different loading.
- ETErik Torenberg
Yeah.
- MAMarc Andreessen
Um, and then look, all, all of this is gonna become just, like, so much more intense over the next several years, um, because, you know, look, the, the reality is that, that, you know, the large language models that we had between, call it 19... Yeah, or, uh, sorry, 20... Call it between, like, GPT-2 to GPT-4, something like that, uh, maybe four and a half, like, the, you know, they were, they were fu- they were fun. Um, you know, you... They could, you know, compose Shakespearean rap lyrics or whatever you want. Um, you know, you could have very interesting late-night conversations with them. Um, but you know, the hallucination rates were high, and you know, they weren't good at reasoning and so forth. Um, and they couldn't write code very well and couldn't, you know, do math very well and, you know, were too, were too prone to sycophancy and s- and, and so on. And so I, I think what happened is a lot of people, a lot of skeptics basically used the early models, um, and got a g- and got a, let's say, accurate, but, um, uh, but, um, uh, early and, and therefore lagging view, uh, of the actual quality of the technology. Um, and then, you know, you fast-forward to today, um, and you know what? May, May of '26, and we have, you know, just stellar, absolutely stellar, you know, models now. Like, you know, the, uh, the GPT 5.5 is just, you know, extraordinary. Um, and then we have reasoning models on top of that, and we have, uh, RL, um, you know, uh, reason, uh, you know, RL, um, um, post-training happening with, in all, in all these different domains, um, uh, to, you know, to get kind of deterministic, uh, high-quality work out of these things. And then we have, you know, now we have agents. 
Now we have long-lived agents, and now we have, just in the last week, uh, you know, GPT has this new thing, um, or the, the goal feature of, of Codex, um, that is, is letting people literally run, you know, run projects, have, have, have, have Codex go off and do projects for, you know, at, at, you know, 24 hours or longer, uh, without human intervention. And so the, the, the, the, the actual, you know, what, what, what we see in our job is, like, the actual utility of these things is, like, ramping incredibly quickly. Um, it, and, and by the way, it's really good today and ramping very fast, and we and every other, I think, serious company in the space expects the ramping capability to be very rapid, you know, at least for the next couple years. Uh, like we, we have, like, I think line of sight to it for sure is gonna ramp dramatically. Um, the capability's gonna ramp dramatically. And so, um, so I just, uh, the other thing here is just, like, a lot of people, uh, I don't know, either skeptics or people who just don't know what to think, like if they tried it two years ago, they don't understand what's happening today. If they tried it six months ago, they might not have a good, uh, idea of what's happening today. By the way, if they try the free version, they might not have a good idea of what's happening today. Um, or if they try the version that's bundled, you know, into their whatever, um, uh, you know, they might, they might, you know, that just is, like, a free add-on to something. They're not gonna have a, you know... To, to, to really... It's just like anything new. Like, to, to really understand this, you have to be directly in front of it. Um-
- ETErik Torenberg
Yeah
- MAMarc Andreessen
... the g- the good news is, like, that, that literally means you have to be willing to put out $200 to get the, to get whatever is the, the, you know, basically the premium package of any of these things. So it's, like, not that much money if you wanna get up close to these things. Um, but I, I would, yeah, if anybody... Yeah, I don't know. We have a selected audience of people who are probably believers. But, um, anybody who's a skeptic on these things, I would just say it's really important, um, to be face to face with the actual technology, um, and to, and to, to be face to face with it now and not have a lagging view.
- ETErik Torenberg
Right. Yeah, and state of the art. The... W- what do you say to people, uh, or what do you say to the idea of, like- You know, apparently, you know, the NPS of AI in, in this country was like 30% or something like that. It ca- came out re- recently as pretty low, and they're comparing it to, to China where, where it's much higher. I'm, I'm, I'm curious what you think is the source of, of why it's currently low, and what could a strategy to, to boost it look like? You know, some people have suggested economic incentives like, you know, some sort of like Trump accounts tied to AI companies, like a basket that people get access to, to feel economically aligned with it in a more direct way. Uh, even though, of course, you know, in- it, it will increase the, you know, GDP and economy in, in ways that they'll also benefit from. O- others say, "Hey, we actually just need to tell better stories around the impact that it's having on, you know, in people's lives and their health and their education, and just the, the, the, you know, people having a tutor or people having a lawyer or people having a doctor, you know, who, who couldn't afford one otherwise." Um, what, what are, what are your thoughts on the sort of AI sentiment
- 38:48 – 45:28
Why AI Sentiment Polls Are Misleading
- ETErik Torenberg
perspective?
- MAMarc Andreessen
Yeah. So I would separate two, two, two things. Um, so I would separate sentiment, and sentiment is, is as interpreted through polling. Um, and we'll talk about that. And then, and then you, but then I would bring it up to separate it out though, which... 'Cause you, you used the term, maybe inadvertently, NPS, uh, which is net promoter score, which is, which is more, um, their view of actual, actual product use- product usefulness, right? A- NPS, NPS for people who don't know is a, is a term of art called net promoter score, and it's like, it's basically the, the, the most high quality way to find out whether somebody really likes a product, which is you literally ask them, "Would, would you, would you recommend this to a friend?" That's, that's, that's called the NPS rate. And so, but I, but I bring, I bring that up, of course, because there's a big difference between those, right? Um, and so-
- ETErik Torenberg
Everyone's using it and benefiting from it and couldn't live without it, and yet [laughs]
- MAMarc Andreessen
Well, exactly. This is the thing. This is, this is the thing. So, so, and, and, and by the way, this is a very common thing in so- like in, in, in properly conducted social science. Like proper- uh, like every Social Science 101 textbook will tell you that you cannot just ask people what they think. Um, you, you, you will get back all kinds of crazy shit. And I'll, we'll talk about why that's the case. But this is like, this is very standard social science methodology, which is you never just ask people. What you do is you watch their behavior, right? And, and what you do is, and then you wanna... What, what you wanna do is you wanna look for the gaps between what they say, what they say they believe versus what, what they actually do. And, and this is true like universally for all forms of human behavior. Uh, for example, if you're studying, let's say, mating patterns, right? [laughs] Like, you know, who people date and marry. Like it's just been well established forever that the thing that they say that is their criteria li- I mean, you know, we all see this with our friends, right? You know, our, our friends all start out single with a certain criteria list, and then, and then they marry somebody like completely different. And so it's like, okay, you know, who, who do you believe? Me or your lying eyes, right? [laughs] Like who do you believe or what do you believe? What I told you I wanted or what I actually demonstrated that, you know, that, that, uh, that I wanted? And, and this is basically true for all areas of human behavior. But this is like fairly ar- you know, this is a fair... You know, this is, this is one of these sort of slightly counterintuitive ideas that you have to kind of have been trained up in and have seen examples to really understand. And so what, what happens is, of course, people don't, people don't know this or they forget this.
Um, and then what happens is there's, there's, there's like, there's literally just like a poll and somebody does a poll, and then the poll comes back with like results, and it looks, and it looks like, you know, in the poll, in the results, it looks like, oh, if people say that, then that must be the case. Um, but then you, you get into this thing which is like, okay, first of all, you're asking them what they think as opposed to watching their behavior, and there's this, there's potentially huge delta there. And then the other thing is everybody in the world of polling will tell you, like you can basically make a poll say whatever you want. And, and this is one of the reasons why you have to look at what people do, is because you can make a poll say whatever you want. In fact, there's a whole category of poll that's called a push poll. Uh, push poll, P-O-L-L, push poll, which is you, you word the questions in a way to generate the answers that you want, or you word the questions in a way to actually cause people to think differently than they did before the poll. You know, so the, the, the political example of a push poll is, you know, would you continue to support your favorite candidate if you knew that he, you know, was killing kittens in his spare time? Right? And so-
- ETErik Torenberg
I've seen that used
- MAMarc Andreessen
Right. And so number one is people are gonna say, "No, of course I would not support him." And then number two, people are gonna say, "Wait a minute, I didn't know he killed kittens in his spare time. You know, that's horrible." Right? Um, and so, so in polling, you, you can manipulate these things in all kinds of ways, up to and including what people actually think. So it's really, really dangerous. And then you overlay on top of that the media environment, and of course, the media environment is, you know, as we, you and I have discussed many times, like the, what, you know, what is the thing that the press hates the most in the entire world? Um, you know, is, is tech. And, and of course, what is the, you know, vanguard of tech right now? And one of the, one of the, one of the things is AI. And so the, of course, the press hates AI with the fury of a thou- of a thousand suns. And so the press is running this, you know, sustained, you know, kind of fear campaign on AI. And so if you just, if you like drown the, the audience with negative narratives, um, and then you ask, you know, basically these, the, these loaded polling questions, of course you're gonna generate... I, I mean, I, I, we can pick any topic. We can pick like fluffy bunnies running in the field, and we can produce the same thing. You know, don't you know how much they shit? Like i- i- I mean, you can just do all kind... You know, they chew up all the crops. Everybody's gonna die from hunger. Like you can manufacture a negative result on anything, uh, uh, by how you do this, which is the, the exercise that, that, that these people have been on. Um, and, and the reason I'm confident in saying that is because then you look at what they actually do. Um, and of course, what they're actually doing is they're using AI. They're using it a lot. They, they love it. The NPS scores are like super high. The usage levels are super high. Um, they're... 
And by the way, the usage le- the, the, the, the, the churn levels are shrinking. The, the, the, the recurring usage, uh, uh, patterns, consumptions are, are rising over time, um, you know, which is, which is really important. Um, and people love it. Um, and people love it in the same way that they love their cell phones, and in the same way they love their Netflix, in the same way that they love their, um, you know, the same way that they love their, um, social media, and it's the same way that they love their ice cream. And like pe- you know, people love it. Now, if you, if you poll somebody and you ask, you know, do y- you know, "Well, do you think ice cream is good for you?" They're gonna say no. But like, uh, you know, late at night, they're gonna be in there with, you know, their, their carton, 'cause like ice cream is delicious. Um, and so it, you know, it's the same thing with AI, um, which is, yeah, pe- people are using it. They love it. The, you know, usage numbers are speaking for themselves. The growth rates of these companies are speaking for themselves. You know, this, look, this is the fastest category of technology in the entire history of the world, right, in terms of growth rate of usage and, and revenue. Um, so it's, it's speaking for itself. And so, so basically what you have is a, you, you, you, you, you have this project fear campaign. Um, and I, I, I would say, you know, maybe two things added onto that, which is, uh, you know, number one is it's like the thing that is, I would say, not helpful is that the companies themselves have been running the fear campaign. Um, and so, you know, the fact that certain companies, um, have been, you know, sort of for a variety of reasons running a fear campaign is certainly not, not, not helping any. Um, and, and again, it's this weird paradox is they're running the fear campaign while they're actually building the thing that they tell everybody to be afraid of. 
And so, you know, there's again, a little bit of a watch what I, watch what I do, not what, not what I say. And then it's like, yeah, should the industry have, like, better narratives? Like, yes, almost certainly the industry should have better narratives and better spokespeople and so forth and so on. But just like, okay, like fine, yes, I'm sure that's true. But having said that, that... it's not like that would make the fear campaigns go away. It's not like that would make the press coverage go away. It's not like that would make the, the sort of fake polls go away. Um, I, I'll close on one final polling observation, which is David Shor, who's a, by the way, a very left-wing, very progressive pollster, but very well respected, uh, just did a, a, a different kind of, a different kind of poll, I think much more properly constructed, where he asked Americans to stat- stack rank the issues, uh, that they really care about. Um, and I, I believe it's... I'm pulling this out of the top of my head, but I, I believe AI ranked as number 29. Um, and so, and, and, and again, it's just sort of like once you, once you get out of the bubble of like, everybody must hate and fear this stuff, it's just like, of course AI ranks as number 29 because, like, it doesn't hit... It's having no tangible impact on anything relative to issues one through 28, right? Like just obviously Americans are dealing with, like, more important issues in their daily lives than AI. Like, obviously. Like, they're dealing with energy costs and they're dealing with crime and they're dealing, like, any number of drug addiction, like any number of other things, um, they're, they're more worried about. And like, and by the way, and like everybody knows this who, like, lives a normal life is just like, "This is not, like, the thing that I'm worried about. I'm worried about, like, I, you know, how am I gonna make my house payment?" Like much more fundamental things.
Um, uh, and so, or you know, what's, what's happening at my kids' school? Um, you know, my, my, you know, much, much, you know, what's, what's happening with my health? Like what... Much more central things. And so I, I think, I think if you, if you get, if you get to the smart polls and the smart pollsters, they, they also end up debunking this.
- 45:28 – 52:25
UFOs: What We Know and What the Government Has Hidden
- ETErik Torenberg
S- speaking of things that are not, uh, you know, urgent on people's day-to-day life and yet capture the imagination, uh, whenever they, they, there's news about it, UFOs. Uh [laughs] so there was, uh, some, some news that, that came out. Um, yeah, I don't... We haven't spent a ton of time talk- talking about this topic, so I'm curious for your general, uh, how have you kind of perceived, um, th- this topic when there's been news out about it o- over the, over the years? Um, I remember during, during COVID, you know, Mike Solana, w- our, our friend was, was coming out, uh, and, and really sort of, uh, getting excited about w- the, the news that was being reported then. What, what's been your vantage point and, and what do you, what do you think about it now?
- MAMarc Andreessen
Yeah. So I should start by saying I don't... So I don't know anything. So start, start, start by saying that. I, I know nothing, um, that everybody else doesn't know. Um, so I'd start by saying, like, number one, I want to believe. Um, like I... my, my usual thing on this is I, I want to live in the world in which this is a real possibility. Um, and by the way, I was, I was actually, I, I... Okay, AI psychosis. I was in AI psychosis the other night, um, and I was like, I was talking to one of the, one of the bots, and I was like, "All right. How many, uh, uh, galaxies are there in the universe again?" And, and I don't know if you've, like, looked that up recently, but, like, the number keeps growing, and I forget what the number is, but it's like a giant number. And then I'm like, "How many stars in, in, in each galaxy? And then how many planets? And then how many Earth-like planets?" And, and the number, I don't have it off the top of my head, but if you, if you get, do address it, like how many Earth-like planets are there in the world in which a human being could, like, step out of a spaceship and breathe and be fine? It's a staggering... It's a very, very, very large number. I mean, it's, it's, it's almost an uncountable number of Earth-like planets just in, in the statistics. And so it's like, all right, like, you know, it must be the case, um, you know, that there's, uh, uh, there's, there's, there's other, other stuff going on out there. Um, and so, you know, logic- logically, like that makes sense, and then I, I would love to live in a world in which they figure out a way to, you know, at some point get here, hopefully in a, in a, in a, in a peaceful way. Um, uh, you know, having said that, you know, the, you know, the, as you know, the pr- the problem with this space is, you know, generally, as you get close to the details, you know, the, the, the examples tend to fall apart. Um, and, you know, there's all these, like, the classic examples like UFO. 
Like what, what appears to be like, uh, you know, you, you'll have these things where like a US military aircraft or something will have camera imagery that looks like it's tracking a, a, a, you know, rapidly moving and weirdly maneuvering object, and it's, it's just like, you know, you get, you get close enough to that and look at the details and it's, you know, it's like the... There's like a parallax optical illusion, uh, thing that pop, uh, pops up. Um, and then there's artifacts, uh, instrument artifacts, uh, camera artifacts, imagery, digital imagery artifacts. Um, and then there's, um, you know, like literally, like weather balloons and ball lightning and all these other things. So like, yeah. So I, I haven't... I, I would, yes, I want to believe. I haven't seen the one yet, um, that, um, is, uh, ha- has tipped me over it. I, I, I, I would like to. I will. I, I will. It was a big, big release of new information today. Um, it, it is really fun, by the way, to have the official White House X account being t- uh, tweeting, uh, uh, transcripts of interviews with, uh, US, uh, intelligence officers apparently relaying accounts that they've had. So I will be up late reading tonight. But, um, uh, you know, fi- finger, fingers crossed.
- ETErik Torenberg
Yeah.
- MAMarc Andreessen
Um, yeah.
- ETErik Torenberg
Friends have said things to the effect of: hey, it's unclear what's actually happening, but what is clear is that the government, at certain times, has hid certain materials. Why would they do that if there's nothing to really worry about?
- MAMarc Andreessen
So I don't know how much of this has been fully validated, and I'm not really an expert in it, but I think two things are pretty clear at this point. One is that there have been classified programs. When stealth fighters and bombers were being developed, that whole program was incredibly highly classified, and if they were going to do test flights on something like that, they had to do anything they could to prevent people from realizing what was actually happening. So for sure there were lots of classified aerospace programs over the years that had various kinds of cover stories, or let's just say blankets of information suppression placed over them, because it's some of the most highly classified information in the government. And that would cause people to think there's information being hidden. Area 51 was, of course, the classic example of this for a long time, and the whole Area 51 thing was around classified test flights for new aircraft. And then, at least, there are suggestions, and I don't know if this has been validated, that at different points in time the government might have put out UFO stories as an actual cover story.
And so let's say you're a highly capable military intelligence officer whose job is to make sure the stealth flight doesn't become recognized for what it is, because that would be very bad for national security. Then you'd much rather have a UFO cult build up around it, where people get crazed and freaked out about UFOs, for two reasons. One is to give people a story to believe that's not "we have some new breakthrough military technology." The other, and this actually may be the serious observation, is that if you can build up a UFO cult around something, you make any investigation into that topic something people feel like they can't do, right? And my understanding is this was true for a long time even in the US military: if an Air Force pilot, or a commercial airline pilot, by the way, thought they had seen something weird, for a long time a lot of pilots didn't want to report what they had seen, because they didn't want to be viewed as UFO nuts. And of course, if there actually are UFOs out there, that is a very big problem. Or, by the way, if there are just other kinds of things out there: if the Chinese are testing some sort of new advanced high-speed drone, you want the pilots to be able to report that, even if they think it might get mischaracterized as a UFO.
So anyway, maybe the interesting thing we could say on this is that all of this played out in the old media environment, the world of broadcast TV: official programming on the one hand, and to the extent that there was unofficial media, it had to be in mimeograph newsletters or paperback books. When I was a kid, there were all these crazy UFO paperbacks. You could always tell: the books that said there were no UFOs were in hardback, and the books that said there were many UFOs were in paperback. So maybe the smart thing you could say is that in the new media environment, this is yet another example of these old walls just collapsing. The Overton window just disintegrates. Of course, the new media environment is extremely conducive to the spread of every UFO theory in the world. It's also extremely conducive to the spread of propaganda campaigns, if, like I said, you wanted to hide real information by spreading propaganda. And then the pressure builds, very much along the lines of the Epstein thing, right? The pressure builds and builds and builds until at some point you get somebody in the White House who's just like, "All right, screw it. We're going to rip the Band-Aid off and find out what's actually going on."
- ETErik Torenberg
Yeah.
- MAMarc Andreessen
You know? No, you know. Assuming that they're not still.
- ETErik Torenberg
[laughs]
- MAMarc Andreessen
But fuzzing, fuzzing, fuzzing the details. But we'll leave that to the next turn of the-
- ETErik Torenberg
Yeah
- MAMarc Andreessen
... of the situation.
- ETErik Torenberg
Exactly. Um, we'll stay monitoring. We'll, we'll close on a last couple questions from the chat.
- 52:25 – 1:06:27
Advice for Young People & the Generational Divide
- ETErik Torenberg
One is advice for young graduates. You were of course at the forefront of the internet revolution at the University of Illinois. If you were in college today, what would you be studying, or would you even be in college in 2026? What advice might you have for college students trying to make sense of how to prepare for what's to come?
- MAMarc Andreessen
Yeah. So it's basically: gain AI superpowers. I think it's actually very straightforward. You have the enormous stroke of luck that you have arrived at the moment in which this new capability for augmenting human ability on a thousand fronts at the same time has just dropped into our laps. And it's going to get much better from here. Enormous numbers of people who are supposedly older and wiser than you are going to dig in their heels; they're going to be mad about it, they're going to fight it, they're not going to want to do it. And you are going to have the opportunity to have this be something that is absolutely key to your skill set, and key to everything you can accomplish as a professional or as a creative, for the next 50 years. So I would just lean in incredibly hard on that. Walk into every job interview with, "Here's my portfolio, here's my resume, and here is how I use this technology. Here are the capabilities I'm bringing to the table." Some employers will fuzz out on that and not respect it, but other employers will be like, "Wow, this is exactly what we want." And [laughs] actually, this is funny. Douglas Adams, the great science fiction novelist, said there's a repeating pattern in how new technology is received by the different age cohorts in society. And this was pre-AI, by the way; he said this 30 years ago.
He said: when a new technology arrives, whatever it is, in this case AI, if you're below the age of 15, this is just how the world has always worked; it's obvious. If you're between the ages of 15 and 35, this is cool and nifty and you can probably get a career using it. And if you're above the age of 35, this is unholy and against everything society stands for and should absolutely be destroyed. So I think 15 to 35, and especially 15 to 25 right now... yeah, I am very jealous. I generally don't wish I could go back in time and do things over again, but it would be really, really fun right now to be 18 or 20 or 22 and to have this capability and figure out what I could do with it.
- ETErik Torenberg
It's funny: we at a16z are trying to hire more of these people, because they're AI native, and they're going to help us become more AI native.
- MAMarc Andreessen
By the way, this is the thing. Part of the doomer narrative right now is: oh, companies are never going to hire junior employees again. The new generation is screwed, because junior employees are the most easily replaced by AI, so companies are only ever going to have senior people. I believe the opposite is true, 100%. You want the AI-native kids. The AI-native kids are going to outperform their older luddite peers gigantically. Titanically. Now, their older peers who are not luddites are also going to do great. But an 18-year-old with AI, or a 24-year-old, or for that matter a 14-year-old, we are going to see super producers the likes of which we've never seen in the world. So yes. By the way, this is going to be another big point of stress on all the child labor laws.
- ETErik Torenberg
Yeah.
- MAMarc Andreessen
Um...
- ETErik Torenberg
[laughs] Yeah, exactly.
- MAMarc Andreessen
Let me just say, let me just say, the children yearn for the AI minds.
- ETErik Torenberg
Yeah. [laughs] Absolutely. Speaking of which, we talked about zoomers in previous episodes and why you like them: they have so much courage, and they're kind of fed up, because they grew up with COVID, school closures, and all these adjacent impositions. But one thing you quote-tweeted recently was Chris Arnade's post: people talk about the educational divide, but there's also a generational divide, with boomers being much more confident in their truth, and younger people being more post-truth, relativistic, more pluralist. I thought that was a really interesting epistemological divide. What did you find interesting about it, or how do you see that playing out?
- MAMarc Andreessen
Yeah, so there are really two parts to it, which is very interesting. Part number one: [laughs] somebody once said the definition of a baby boomer is somebody who believes what's on the TV set, who believes what the talking head on TV says. Anybody who's 20 knows you obviously don't do that; that would be stupid. But every 60-year-old or 80-year-old has been watching TV their entire lives, and when they grew up, it's the story we've all heard a million times: Walter Cronkite used to tell us what the truth was. Of course, that was always BS, but nevertheless that's what the boomers believed. They believed what the TV said. [laughs] They believed what The New York Times wrote. [laughs] Anybody below the age of 40 at this point has example after example after example of how that's obviously just not true. And anybody who's 20, who's been through the last 15 years in school, just obviously knows these people are fake, this is not real, and you can't take this stuff seriously. So part of it is that divide. By the way, there's this great YouTube account called Academic Agent, a British author named Nima Parvini who writes these really interesting books, and he has a two-hour video that's really worth watching, called Boomer Truth.
It's a two-hour documentary on this concept he calls boomer truth, which is basically whatever the TV says, and how it's falling apart. So there's the boomer truth thing. But then there's this other thing, which is that a key part of boomer truth is that there's no fixed morality, right?
- ETErik Torenberg
Sure.
- MAMarc Andreessen
So a key part of boomer truth is that you get to make up your own values: moral relativism. All cultures are the same. Western society is not superior. There are many, many different cultures and they're all wonderful; it's all great, it's all great. If anything, the West is the worst of the cultures and the other ones are better. Before there was woke, there was political correctness, and the political correctness of when I was in college was literally around what's called multiculturalism, or moral rel-
- ETErik Torenberg
Yeah. Peter Thiel and David Sacks wrote about it in their book, The Diversity Myth, in 1995.
- MAMarc Andreessen
The Diversity Myth. And there was actually a term for it at the time: multicultural.
- ETErik Torenberg
Yeah.
- MAMarc Andreessen
And there were these furious debates. There's a classic book of that era. Peter's book is great, but actually before that, there was a famous book that was huge headline news all through the country when it came out, called The Closing of the American Mind.
- ETErik Torenberg
Yeah.
- MAMarc Andreessen
And it was this right-wing academic at the University of Chicago who basically said these colleges are teaching kids that there is no morality, that morality is just choose-your-own-adventure. So there is this moral relativism at the heart of boomer truth, right? And it's this weird thing, because there is a fixed, received belief that there is no fixed morality. And then basically the entire media apparatus, the entire cultural program, the entire educational system got designed around this. All of the crazy stuff kids are getting in school now is downstream of this movement from 30, 40, 60 years ago. So if you're 20, you've come up in this weird environment in which, on the one hand, the boomers have no credibility at all, because you can't believe they still believe what's on TV. And number two, to the extent you do listen to anything they say, they keep telling you not to judge anybody and not judge anything, and that all moralities are equal and all cultures are equal. So of course zoomers are going to come out of that with just an incredible level of skepticism. And by the way, this is not an abstract exercise, because these are the kids who came up through COVID, right? These are the kids who came up through woke, and these are the kids who came up through all of the craziness of the last decade, 15 years.
And so I think these kids are coming out with a completely different viewpoint on how the world works. Not in every case, but in many cases. Completely different: almost simultaneously more open-minded and more critical, much more interested in ideas, much more skeptical of authority, much more skeptical of received wisdom, much more cynical about manipulation. By the way, much more sensitive to the media environment; they're much more aware of the idea that there actually is psychological warfare going on, and they have been on the receiving end of it. Their view of the authority figures they've seen in their lives, in many cases, is just complete contempt, and in many cases very well earned. So it's a starkly different worldview than the boomers had, for sure. Also very different from my generation, Gen X.
- ETErik Torenberg
Yeah.
- MAMarc Andreessen
Also very different than millennials. Like it, it's something new and, and I'm, I'm, I'm very excited. I, I think, I think they're fantastic.
- ETErik Torenberg
Yeah. S- sp- speaking of something new, w- is it, would, would it be fair to summarize retard maxing as stoicism meets you can just do things?
- MAMarc Andreessen
No, I think it's just you can just do things.
- ETErik Torenberg
Okay. [laughs]
- MAMarc Andreessen
I think it's even shedding that... I don't know. I can see what you're driving at, and you could probably explain it that way, but the way I'd put it is: the stoics put a lot of time and effort into trying to be stoic.
- ETErik Torenberg
[laughs]
- MAMarc Andreessen
Whereas the whole point of retard maxing is you're not supposed to put [laughs] that level of time and effort into being the way that you are. You're just supposed to do it. So I guess you could say our friend Ryan Holiday is a stoic and not a retard maxer.
- ETErik Torenberg
[laughs]
- MAMarc Andreessen
As he demonstrated this week. So maybe right there in that video you can see the difference.
- ETErik Torenberg
Yeah, that's well said. Last question from the chat and we'll get you out of here. How are you such a good monitor? What is your secret to monitoring so many situations? Any strategies? What is your approach?
- MAMarc Andreessen
Well, of course, being plugged into the MTS fire hose-
Episode duration: 1:06:36