EVERY SPOKEN WORD
150 min read · 30,004 words
- 0:00 – 15:00
- Joe Rogan
And here we go. All right, Nick. This is, uh, one of the things that scares people more than anything, is the idea that we're creating something, or someone's gonna create something, that's gonna be smarter than us, that's gonna replace us. Is that something we should really be concerned about?
- Nick Bostrom
I presume you're referring to babies.
- Joe Rogan
(laughs)
- Narrator
(laughs)
- Joe Rogan
I'm referring to artificial intelligence.
- Nick Bostrom
Ah, yes.
- Joe Rogan
Ugh.
- Nick Bostrom
Well, it's the, the big fear and the big hope, I think.
- Joe Rogan
Both?
- Nick Bostrom
At the same time, yeah.
- Joe Rogan
How is it the big hope?
- Nick Bostrom
Well, there are a lot of things wrong with the world as it is now.
- Joe Rogan
I'm trying to pull this up to your face, if you would.
- Nick Bostrom
Um, all, all the problems we have, uh, most of them could be solved if we were smarter or if we had somebody on our side who is a lot smarter with better technology and so forth. Um, also, I think if we wanna imagine some really grand future where humanity or our descendants one day go out and colonize the universe, I think that's likely to happen, if it's gonna happen at all, after we have superintelligence that then develops the technology to make that possible.
- Joe Rogan
The real question is whether or not we would be able to harness this intelligence, or whether it would dominate.
- Nick Bostrom
Yeah, that certainly is one question. Um, not the only. You could imagine that we harness it, but then use it for bad purposes as we have a lot of other technologies through history.
- Joe Rogan
Yeah.
- Nick Bostrom
So I think there are really two challenges we need to meet. One, one is to make sure we can align it with human values, and then make sure that we, together, do something better with it than fighting wars or oppressing one another.
- Joe Rogan
I think... Well, what I'm worried about more than anything is that human beings are gonna become obsolete, that we're going to invent something that's the next stage of evolution. I'm, I'm really concerned with that. I'm really concerned with if we look back on ancient hominids, uh, Australopithecus, just think of some primitive ancestor of man, we don't wanna go back to that. Like, that, that's a terrible way to live. I'm worried that what we're creating is the next thing.
- Nick Bostrom
I think we don't necessarily want, or at least I wouldn't be totally thrilled with, with a future where humanity as it is now was, was the last and final word, the pa- like, ultimate version beyond.
- Joe Rogan
Right.
- Nick Bostrom
I, I think there's a lot of room for improvement.
- Joe Rogan
Sure.
- Nick Bostrom
But not everything that is different is an improvement.
- Joe Rogan
Right.
- Nick Bostrom
So, so the key would be, I think, to find some path forward where the best in us, uh, can continue to exist and develop, uh, to even greater levels. And maybe at the end of that path, it looks nothing like we do now. Maybe it's not two-legged, two-armed creatures running around with three pounds of thinking matter, right? It might be something quite different. But as long as it... what, what we value is, is present there, and ideally in a much higher degree than in the current world, then that could count as a success.
- Joe Rogan
Yeah, the idea that we're in a state of evolution, that we are... just like we look at ancient hominids, that we are eventually going to become something more advanced or at least more complicated than we are now. But what I'm worried is that biological life itself has so many limitations. When we look at the evolution of technology, if you look at Moore's law or if you just look at new cellphones, like, they just released a new iPhone yesterday and they talked about all these incremental increases in the ability to take photographs and wide-angle lenses and night mode and a new chip that works even faster, these things, uh... there's not... the, the word evolution's incorrect, but the innovation of technology is so much more rapid than anything we could ever even imagine, biologically. Like, if we had a thing that we'd create, if we created, um... instead of artificial intelligence in terms of, like, some- something in a chip or a computer, if we created a life form, a biological life form, but this biological life form was improving radically every year, like, it didn't even exist, like a... the iPhone existed in 2007, that's when it was invented. If we had something that was 12 years old, but all of a sudden was infinitely faster and better and smarter and wiser than it was 12 years ago and the newest version of it, version X1, we would, we would start going, "Whoa, whoa, whoa! P- hit the brakes on this thing, man." How, how many more generations before this thing's way smarter than us? How many more generations before this thing thinks that human beings are obsolete?
- Nick Bostrom
Yeah, it's coming, coming at us fast it feels like.
- Joe Rogan
Yes.
- Nick Bostrom
I mean, uh... and, but some, some people think, "Oh, it's, uh, slowing down now." Um...
- 15:00 – 30:00
- Joe Rogan
One of the things that scares me the most is the idea that if we do create artificial intelligence, then it will improve upon our design and create far more sophisticated versions of itself, and that it'll continue to do that until it's unrecognizable, until it reaches literally a godlike potential.
- Nick Bostrom
Mm-hmm.
- Joe Rogan
That su- I mean, I forget what the real numbers were, maybe you could tell us, but someone had calculated, some reputable source had calculated the amount of improvement that sentient artificial intelligence would be able to create inside of a small window of time. Like, if it was allowed to innovate and then make better versions of itself, and those better versions of itself were allowed to innovate and make better versions of itself, you're talking about not an exponential increase of intelligence, but an explosion.
- Nick Bostrom
Well, well, we don't know. So it, it's hard to forecast the pace at which we will make advances in AI, b- because we just don't know how hard the problems are that we haven't yet solved.
- Joe Rogan
Right.
- Nick Bostrom
And, you know, once you get to human level or a little bit above, I mean, who, who knows? It could be that there is some level where to get further you would need, like, to put in a lot of thinking time to kind of get there. Now, what is easier to, to estimate is if, if you just look at the speed, 'cause that's just a function of the hardware that you're running it on, right? So, so there we know that there is a lot of room in principle. If, if you look at the physics of computation and you look at what would an optimally arranged physical system be that was optimized for computation, that would be like way many, many orders above what, what we can do now. Um, and that then you could have arbitrarily large systems like that. So, um, from, from that point of view, we, we know that that could be things that would be like a million times faster than the human brain and, and with a lot more memory and stuff like that.
- Joe Rogan
That... And then something... If it did have a million times more power than the human brain, it could create something with a million times more comput- (clears throat) computational power than itself.
- Nick Bostrom
Well-
- Joe Rogan
It could make better versions. It could continue to innovate. Like, if we-
- Nick Bostrom
Let me-
- Joe Rogan
... create something that we, and we say, "You are..." I mean, "It is sentient. It is artificial intelligence. Now, please go innovate. Please go follow the same directive and improve upon your design."
- Nick Bostrom
Yeah. Well, but we don't know how, how long that would take then-
- Joe Rogan
Right.
- Nick Bostrom
... to get to... So, I mean, we already have sort of millions of times more thinking capacity than a human has. I mean, we have millions of humans.
- Joe Rogan
Right.
- Nick Bostrom
Um, so if you, if you kind of break it down, you think there's like one milestone when you have maybe an AI that could do what one human can do, but then that might still be quite a lot of orders of magnitude, uh, you know, until it would be equivalent of the whole human species. Um, and maybe during that time other things happen. Maybe we upgrade, you know, our, our own abilities in some way. So there, there are some scenarios where it's so hard to get even to one human baseline that, that we kind of use this massive amount of resources just to barely create kind of, you know, a village idiot.
- Joe Rogan
Yes.
- Nick Bostrom
Uh, using billions of dollars of compute, right?
- Joe Rogan
(laughs)
- Nick Bostrom
So if, if that's the way we get there, then, I mean, it might take quite a while... because you can't easily scale something that you've already spent billions of dollars building.
- Joe Rogan
Yeah, some people think the whole thing is blown out of proportion, that we're so far away from creating artificial general intelligence that resembles human beings, that it's all just vaporware.
- Nick Bostrom
Mm-hmm.
- Joe Rogan
What do you say to those people?
- Nick Bostrom
Uh, well, I mean, uh, uh, well, one, one would be that I would wanna be more precise about just how far away does it have to be in order for us, uh, to be rational to ignore it.
- Joe Rogan
Right.
- Nick Bostrom
And it, it might be that if something is sufficiently important and high stakes, then even if it's not gonna happen in the next five, 10, 20, 30 years, it might still be wise for, you know, our pool of seven billion plus people to have some people actually thinking about this ahead of time. Um-
- Joe Rogan
Yeah, for sure.
- Nick Bostrom
So, so, so some of these disagreements, I guess this is my point, are, are more apparent than real. Like, there's some people say it's gonna happen soon, and some other people say, "No, it's not gonna happen for a long time." And then, (laughs) you know, one, one person means by soon five years, and another person means by a long time five years. And, uh, you know, it's more of different attitudes rather than different specific beliefs. So, so I would first want to make sure that there actually is a disagreement.
- Joe Rogan
Mm-hmm.
- 30:00 – 45:00
- Nick Bostrom
most desirable attributes.
- Joe Rogan
And so this could be a trend in terms of how human beings reproduce, that we instead of just randomly having sex, woman gets pregnant, gives birth to a child, we don't know what it's gonna be, what's, what's gonna happen, we just hope that it's a good kid. Instead of that, you start looking at the... all the various components that we can measure-
- Nick Bostrom
Yeah. Uh, and so, I mean, in, to some extent, we already do this. There are a lot of, um, testing done, um, um, for various chromosomal ab- abnormalities that you can already check for. But, but our ability to, uh, to look beyond clear, stark diseases that is one gene is wrong-
- Joe Rogan
Right.
- Nick Bostrom
... like the, like the, to look at more complex trait is, is, is, is increasing rapidly. Um, so obviously, there are a lot of ethical issues and-
- Joe Rogan
Yeah.
- Nick Bostrom
... different things that come into that.
- Joe Rogan
That's what I was gonna get to.
- Nick Bostrom
But if I, if we're just talking what is technologically feasible-
- Joe Rogan
Mm-hmm.
- Nick Bostrom
... I, I think that, that ... I mean, already you could do a very limited amount of that today, and maybe you'd get, you know, two or three IQ points in expectation more if you selected using current technology based on 10 embryos, let us say, so very small. But, but as genomics, uh, gets better at deciphering the genetic architecture of complex traits, like whether it's intelligence or, or personality attributes, then, then you, you would have more selection power and you could do more. A- and then there is a number of other technologies we don't yet have, but which if you did, would then kind of stack with that and, and enable much more powerful forms of, of enhancement. Um, so, so, so there, uh, yeah, I don't think there are any major technological hurdles really in, in the way, just some small amount of incremental further improvement.
- Joe Rogan
That's wh- when you talk about doing something with genetics and human beings and selecting, selecting for the superior versions, and then if everybody starts doing that, the ethical concerns when you start discussing that, people get very nervous-
- Nick Bostrom
Mm-hmm.
- Joe Rogan
... 'cause they start to look at their own genetic defects and they go, "Oh my God, what if I didn't make the cut?"
- Nick Bostrom
Yeah.
- Joe Rogan
Like, "I wouldn't be here."
- Nick Bostrom
Yeah.
- Joe Rogan
And then you start thinking about all the imperfect people that have actually contributed in some pretty spectacular ways-
- Nick Bostrom
Yeah.
- Joe Rogan
... to what our culture is. And like, w- what if everybody has perfect genes, would all these things even take place? Like, what are we doing really if we're bypassing nature and we're choosing to select for the traits and the attributes that we find to be the most positive and attractive? Like, what are ... like, that gets slippery.
- Nick Bostrom
I, uh, and, and you, you think what, what would happen if, say, some earlier age had had this ab- ability to kind of lock in their-
- Joe Rogan
Yes.
- Nick Bostrom
... you know, their, their, their prejudices or, um, if the Victorians had had this.
- Joe Rogan
Sure.
- Nick Bostrom
Maybe we would all be, uh, whatever, pious and patriotic now or something.
- Joe Rogan
Yeah. Who knows? The Nazis.
- Nick Bostrom
Uh, or any other, yeah. So, um, so, so in general, with all of these powerful technologies we, we are developing, there, there is ... I- I think the ideal course would be that we would first gain a bit more wisdom, and then we would get all of these powerful tools. Um, but it looks like we're getting the powerful tools before we have really a- achieved a very high level of wisdom.
- Joe Rogan
Yeah.
- Nick Bostrom
And so-
- Joe Rogan
But we haven't earned them. The people that are using them are sort of, uh, we're, we haven't ... Like, think about the th- the technology that all of us use. How many, how many pieces of technology do you use in a day and how much do you actually understand any of those? Most people have very little understanding of how any of the things they use work. They put no effort at all into creating those things, but yet they've inherited the responsibility of the power that those things possess.
- 45:00 – 1:00:00
- Nick Bostrom
Um, so I think we could have gotten to all the, the good uses of nu- nuclear technology that we have today w- without having to had, had kind of the nuclear bomb developed.
- Joe Rogan
Now, you pay attention to, like, Boston Dynamics and all these, uh, all these different robotic creations that they've made?
- Nick Bostrom
Well, they seem to have a penchant for doing really sinister-looking, um, bots.
- Joe Rogan
(laughs) I think all robots that are... uh, you know, anything that looks autonomous is kind of sinister-looking, if it could do backflips.
- Nick Bostrom
Well, I mean, you see the Japan... Yeah, I mean, th- like, the Japanese have these, like, big eyes, sort of rounded.
- Joe Rogan
Yeah.
- Nick Bostrom
So it's a different type.
- Joe Rogan
They're trying to trick us.
- Nick Bostrom
Boston Dynamics is-
- Joe Rogan
Yes.
- Nick Bostrom
... I guess, they want the Pentagon to, uh, give them funding or something.
- Joe Rogan
Right, DARPA. Yeah, they, they look like they're developing terminators.
- Nick Bostrom
Yeah.
- Joe Rogan
Yeah. But what, what I was thinking is, if we do eventually come to a time where those things are going to war for us instead of us, like, if we get involved in robot wars, our robots versus their robots-
- Nick Bostrom
Yeah.
- Joe Rogan
... and this becomes the next motivation for increased technological innovation to try to deal with superior robots by the Soviet Union or by China, like these, these are more things that could be threats that could push people to some crazy level of technological innovation.
- Nick Bostrom
Yeah, it, it, it could. I mean, I think there are other drivers for technological innovation as well, um, that, that seems, uh, plenty, um, strong, um-
- Joe Rogan
Sure.
- Nick Bostrom
... like com- commercial drivers, let us say, um, that we wouldn't have to rely on, on war or the, the threat of war to, to kind of stay innovative. Um, and I mean, there has been this effort to try to see if it would be possible to, uh, have some kind of ban on lethal autonomous weapons.
- Joe Rogan
Mm-hmm.
- Nick Bostrom
Um, just as... I mean, there are, there are a few-
- Joe Rogan
Drone.
- Nick Bostrom
There are a few te- technologies that we have, like there is, has been a relatively successful ban on, on chemical and biological weapons, um, which have, by and large, been, you know, uh, honored and upheld. Um, there, there are kind of treaties on, on nuclear weapons, which has limited proliferation. Yes, there are now maybe, I don't know, a, a dozen. I don't know the exact number. But it, it's certainly a lot better than 50 or 100 countries.
- Joe Rogan
Yes.
- Nick Bostrom
Um, and some other weapons as well, uh, uh, blinding lasers, um, landmines, cluster munitions.
- Joe Rogan
Mm-hmm.
- Nick Bostrom
And so, so, so some people think may- maybe we could do something like this with, um, lethal autonomous weapons, killer bots that-
- Joe Rogan
Yeah.
- Nick Bostrom
... you know, do we... is that really what humanity needs m- most now? Like, another arms race to develop, like, killer bots? It seems arguably the answer to that is no. Um, I've, I've, I've kind of... there's a lot of my friends who are, who are supportive. I, I kind of stood a little bit on the sidelines on that particular campaign, being a little unsure, um, exactly what it is that... Well, I mean, certainly, I think it'd be better if we refrain from having some arms race to develop these than not. But if, if you start to look in more detail, what, what precisely is the thing that you're hoping to ban? So if the idea is the autonomous bit, like, the robot should not be able to make its own firing decision, well-
- Joe Rogan
Right.
- 1:00:00 – 1:15:00
- Joe Rogan
beings. If you develop an autonomous robot that's really autonomous, it has no need for other people, that's where we get weirded out. Like it, it doesn't need us.
- Nick Bostrom
Right. Yeah. I mean, I, I think the same would hold even if it were not a robot but just a program in- inside a computer.
- Joe Rogan
Sure.
- Nick Bostrom
Um, but, but yeah. Yeah. And the, and the idea that you could have something that is strategic and deceptive and so forth.
- Joe Rogan
Yeah.
- Nick Bostrom
So that, that... I mean, but then other elements of the movie, of course, and in general, w- a reason why it's bad to get your kind of map of the future from, from Hollywood is if it... So if, if you think, so is this one guy, presumably some genius living out in the nowhere and kind of inventing this whole system. Like in reality-
- Joe Rogan
Yeah.
- Nick Bostrom
... um, it's like anything else. There are a lot, like hundreds of people programming away on their computers, writing on whiteboards, and sharing ideas with other people across the world. Uh, it doesn't look like a human. Um, s- s- and, and that would often be some economic reason for doing it in the first place. Like not just, "Oh, we have this Promethean attitude that we want to-"
- Joe Rogan
Yeah.
- Nick Bostrom
"... kind of bring..." And I like... So... Um, so all of those things don't make for such good plot lines, so they just get removed. But then-
- Joe Rogan
Mm-hmm.
- Nick Bostrom
... I wonder if people actually think th- of the future in terms of some kind of super villain and some hero and it's gonna come down to these two people and they're gonna wrestle and-
- Joe Rogan
Yeah.
- Nick Bostrom
... you know? Um, and it's gonna be very personalized and concrete and localized, whereas a lot of things that determine what happens in the world are very spread out and bureaucracies churning away and...
- Joe Rogan
Sure.
- Nick Bostrom
Um...
- Joe Rogan
Yeah, that was a big problem that a lot of people had with the movie, was the idea that this one man could innovate at such a high level and be so far beyond everyone else is ridiculous. That he's just doing it by himself on-
- Nick Bostrom
Yeah.
- Joe Rogan
... this weird compound somewhere.
- Nick Bostrom
Yeah.
- Joe Rogan
Come on.
- Nick Bostrom
Yeah.
- Joe Rogan
That's, that... But that makes a great movie, right?
- Nick Bostrom
Yeah.
- Joe Rogan
Fly in in the helicopter, drop you off in a remote location.
- Nick Bostrom
Yeah.
- Joe Rogan
This guy shows you something he's created that is gonna change the whole world.
- Nick Bostrom
And it looked beautiful. I mean, I-
- Joe Rogan
Yeah.
- Nick Bostrom
... can imagine doing some writer's-
- 1:15:00 – 1:18:12
- Joe Rogan
slaves.
- Nick Bostrom
Right.
- Joe Rogan
They used to think it was slaves, but now because of the bones or the food they were eating really well, and they think that, well ... And also, the, the level of sophistication involved, this is not something you just get kind of slaves to do. You, this seems to be that there was a population of structural engineers, that there was a, a population of skilled construction people, and that they tried to, you know, utilize all of these great minds that they had back then-
- Nick Bostrom
Mm.
- Joe Rogan
... to put this thing together. But it's still a mystery. I think that's the spot that I would go to because I think it would be amazing to see so many different innovative times. I mean, it would amaze ... It'd be amazing to, to be, uh, alive during the time of Genghis Khan or, you know, to be alive during some of the, some of the wars of a thousand, 2,000 years ago, just to see what it was like on the ... But the pyramids would be the big one. But I think if I was in the future, some weird dystopian future where artificial intelligence runs everything and, and human beings are, you know, linked to some sort of neurological implant that connects us all together and we long for the days of biological independence and we would like to see, what, what was it like when they first-
- Nick Bostrom
Mm-hmm.
- Joe Rogan
... started inventing phones? What was it like when the internet was first opened up for people? What was it like when people saw ... When, when, when someone had someone like you on a podcast and was talking about potential artificial intelligence and where it could lead us and what it could do?
- Nick Bostrom
It's the most interesting time.
- Joe Rogan
It is the most interesting time.
- Nick Bostrom
It is now.
- Joe Rogan
Yeah. That's what's cool about it to me is that we seem to be in this, this really Goldilocks period of great change, where we're still human but we're worried about privacy, we, we're concerned our phones are listening to us, we're concerned about surveillance states and, you know, p- people put little stickers over their laptop camera. W- we see it coming-
- Nick Bostrom
Yeah.
- Joe Rogan
... but it hasn't quite hit us yet. We're just seeing the problems that are associated with this increased level of technology in our lives.
- Nick Bostrom
Which is, yeah, that, that is a strange thing if you add up all these pieces. It does-
- Joe Rogan
Yeah.
- Nick Bostrom
... put us in this very weirdly special position.
- Joe Rogan
Yeah.
- Nick Bostrom
And you wonder, hmm, it's a little bit too much of a coincidence. I mean, it might be the case, but yeah, it, it does put some strain on it.
- Joe Rogan
When you say a little too much of a coincidence, how so?
- Nick Bostrom
Well, so, um, I mean, I guess the intuitive way of thinking about it, like what way, like what, what are the chances that-
- Joe Rogan
Right.
- Nick Bostrom
... just by chance you would happen to be, uh, living in the most interesting time in history-
- Joe Rogan
Yeah.
- Nick Bostrom
... being like a celebrity, like whatever, like what, well, like that's pretty low prior probability. Like most people-
- Joe Rogan
Like you mean like for me?
- Nick Bostrom
Well, for you. Or I mean, for, for ... But for all of us, really.
- Joe Rogan
For all of us.
- Nick Bostrom
Um, um, and so, that, that, that could just be. I mean, uh, uh, if there's a lottery, somebody's got to have the ticket, right?
- Joe Rogan
Yeah. But, um-
- Nick Bostrom
Or.
Episode duration: 2:32:58
Transcript of episode 5c4cv7rVlE8