EVERY SPOKEN WORD
150 min read · 30,132 words
- 0:00 – 15:00
- Narrator
(drumbeats) Joe Rogan podcast, check it out.
- Ray Kurzweil
The Joe Rogan Experience.
- Joe Rogan
Train by day, Joe Rogan podcast by night, all day. (instrumental music) Good to see you, sir.
- Ray Kurzweil
Great to see you.
- Joe Rogan
I was sta- telling you before, I'm admiring your suspenders, and you told me you have how many pairs of these things?
- Ray Kurzweil
30 of them, yeah.
- Joe Rogan
How did you-
- Ray Kurzweil
I wear them every day.
- Joe Rogan
Do you really?
- Ray Kurzweil
Yeah.
- Joe Rogan
Every day?
- Ray Kurzweil
Yeah.
- Joe Rogan
Why, why do you like suspenders?
- Ray Kurzweil
Um...
- Joe Rogan
Practicality thing?
- Ray Kurzweil
No, it's, uh... expresses my personality.
- Joe Rogan
Mm.
- Ray Kurzweil
And different ones have different, uh... different personalities that express how I feel that day, so.
- Joe Rogan
I see. So, it's just another style point.
- Ray Kurzweil
Yeah.
- Joe Rogan
See, the reason why I was asking-
- Ray Kurzweil
But, but you don't see any, uh, hand-painted suspenders. Have you ever seen one?
- Joe Rogan
Uh, I don't know and I would've not noticed. I only noticed-
- Ray Kurzweil
Hm.
- Joe Rogan
... 'cause you were here (laughs). I'm not really a suspender aficionado.
- Ray Kurzweil
Yeah, well-
- Joe Rogan
But the reason why I'm asking is 'cause you're, you know, basically a technologist. I mean, you know a lot about technology. And you would think that suspenders are kinda outdated tech. (laughs)
- Ray Kurzweil
Uh... Well, people like them.
- Joe Rogan
Clearly.
- Ray Kurzweil
Yeah. And I'm surprised they haven't caught on.
- 15:00 – 30:00
- Ray Kurzweil
to happen, like, five years ago.
- Joe Rogan
Right.
- Ray Kurzweil
And we had them two years ago, but they didn't work very well. So it began little less than two years ago that we could actually do large language models. Uh, and, and that was very much a surprise to everybody. Uh, so that, th- that's probably the primary example of exponential growth.
- Joe Rogan
We had Sam Altman on. One of the things that he and I were talking about was that AI figured out a way to lie, that they used AI to go through a CAPTCHA system, and the AI told the system that it was vision-impaired, which is not technically a lie, but it used it to bypass-
- Ray Kurzweil
Well-
- Joe Rogan
... are you a robot?
- Ray Kurzweil
What we don't know now is f- for large language models to say they don't know something. So you ask it a question, and if that, the answer to that question is not in the system, it still comes up with an answer. So it'll look at everything and give you its best answer. And if the, the best answer is not there, it still gives you an answer, but that's, uh, considered a h- hallucination. And we know-
- Joe Rogan
A hallucination?
- Ray Kurzweil
Yeah, that's what it's called.
- Joe Rogan
Really?
- Ray Kurzweil
So-
- Joe Rogan
A AI hallucination? So they cannot be wrong. They have to be able to answer things.
- Ray Kurzweil
So far, we're, we're actually working on being able to tell if it doesn't know something. So if you ask it something, it says, "Oh, I, I don't know that." Right now, it can't do that.
- Joe Rogan
Oh, wow. That's interesting.
- Ray Kurzweil
So it, it gives you some answer. Um, and if the answer's not there, it just, like, makes something up. It's the best answer, but the best answer isn't very good-
- Joe Rogan
Mm-hmm.
- Ray Kurzweil
... 'cause it doesn't know the answer. And the way to fix hallucinations is to actually give it more capabilities to memorize things and, uh, and give it more information so it knows the answer to it. If you, if you tell, uh, an answer to a question, it will remember that and give you that correct answer. Um, but these models are not... we don't know everything. And it, it has to... we have to be able to scan an answer to every single question, uh, which we can't quite do. And it'd be actually better if it could actually answer, "Well, gee, I don't know that."
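The failure mode described here — a model that always returns its best guess versus one that can abstain — can be sketched in a few lines of Python. The knowledge base, confidence scores, and threshold below are purely illustrative stand-ins, not how any real large language model works:

```python
# Illustrative knowledge base: question -> (answer, confidence score).
# Both the facts and the scores are made up for this sketch.
KNOWN_FACTS = {
    "capital of france": ("Paris", 0.98),
    "speed of light": ("299,792,458 m/s", 0.95),
}

def best_guess(question: str) -> str:
    """Always answers: returns the stored fact if present, otherwise
    fabricates a low-confidence placeholder (the 'hallucination' case)."""
    answer, _ = KNOWN_FACTS.get(question.lower(), ("<made-up answer>", 0.10))
    return answer

def answer_with_abstention(question: str, threshold: float = 0.5) -> str:
    """Same lookup, but abstains when confidence falls below the threshold."""
    answer, confidence = KNOWN_FACTS.get(question.lower(), ("<made-up answer>", 0.10))
    return answer if confidence >= threshold else "I don't know"

print(best_guess("capital of france"))                # Paris
print(best_guess("capital of atlantis"))              # <made-up answer>
print(answer_with_abstention("capital of atlantis"))  # I don't know
```

The first function mirrors Kurzweil's point that the model "still gives you an answer" even when the answer isn't there; the second is the behavior he says researchers are working toward.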
- Joe Rogan
Right. Like, uh, and particularly, like, say when it comes to, um, exploration of the universe, if there's a certain amount of, I mean, vast amount of the universe we have not explored. So if it has to answer questions about that, it would just come up with an answer.
- Ray Kurzweil
Right. And it, and it's, right, it'll just come up with an answer-
- Joe Rogan
Interesting.
- Ray Kurzweil
... which will likely be wrong.
- Joe Rogan
Hmm, that's interesting. But that, that would be a real problem if someone was counting on the AI to have a solution for something too soon, right?
- Ray Kurzweil
Right. They, they don't know everything. Uh, search engines actually know, are pretty well vetted. And if it actually answers something, it'll, i- it's usually correct. Um...
- Joe Rogan
Unless it's curated.
- Ray Kurzweil
But large language models don't have that capability. Uh, so it'd be good actually if they knew that, that they were wrong. That also tells... what we have to fix.
- Joe Rogan
What about the, the idea that A- AI models are influenced by ideology, that AI models have been programmed with certain ideologies?
- Ray Kurzweil
I mean, they do learn from people.
- Joe Rogan
Yeah.
- Ray Kurzweil
And people have ideologies.
- Joe Rogan
Right.
- 30:00 – 45:00
- Ray Kurzweil
that, that's something that's positive, really. Um, I mean, if, if we were like mice today, um, and we had the opportunity to become like humans, w- we wouldn't object to that. In fact, we are humans, and we don't object to that.
- Joe Rogan
We used to be shrews.
- Ray Kurzweil
(laughs) Um, and this is gonna basically make us smarter. Uh, eventually, we'll be much smarter than we are today. And, uh, and that's a positive thing. We'll be able to do things that are t- today that we find bothersome, uh, in a way that's much more palatable.
- Joe Rogan
The idea of us getting smarter sounds great. Great. It'd be great to be smarter. But-
- Ray Kurzweil
Right. But people object to that-
- Joe Rogan
... the concerns-
- Ray Kurzweil
... because it's, uh, it's like competition.
- Joe Rogan
Hmm? In what way?
- Ray Kurzweil
Well, I mean, Google has, I don't know, 60,000, 70,000 programmers? And how many programmers, uh, exist in the world? How, how much longer is that gonna be a viable career?
- Joe Rogan
Hmm.
- Ray Kurzweil
Uh, because, uh, large-
- Joe Rogan
The AI program.
- Ray Kurzweil
... language models already can code.
- Joe Rogan
Yeah.
- Ray Kurzweil
Not quite as good as, uh, a real expert coder, uh, but how, how long is that gonna be?
- Joe Rogan
Right.
- Ray Kurzweil
It's not g- it's not gonna be 100 years. It's gonna be a, a few years. Um, so people see it as competition. I have a slightly different view of that. I see these things, uh, as actually adding to our own intelligence, and we're merging with these kinds of computers and making ourselves smarter by merging with it, and eventually, it'll go inside our brain and be able to make us smarter i- instantly, uh, just like we had more connections inside our own brain.
- Joe Rogan
Well, I think people have reservations always when it comes to great change, and this is probably the greatest change. The, the greatest change we've ever experienced in our lifetimes, for sure, has been the internet. And this will make that look like nothing. It'll, it'll, it'll change everything, and it seems inevitable. Um, I understand that people are upset about it, but it just seems like what human beings were sort of designed to do.
- Ray Kurzweil
Right. We're the only animal that actually creates technology.
- Joe Rogan
Yeah.
- Ray Kurzweil
And it's a combination of our brain and something else, which is our thumb. So I c- I can imagine something. Oh, if I take that-
- Joe Rogan
You can manipulate it, yeah.
- Ray Kurzweil
... leaf from the tree, I could create a, uh, a tool with it. Uh, other animals have actually a bigger brain, like the whale.
- Joe Rogan
Dolphins.
- Ray Kurzweil
Uh, dolphins, um, elephants. They have a larger brain than we do, but they don't have something equivalent to the thumb.
- Joe Rogan
Right.
- Ray Kurzweil
Monkey has a thing that looks like the thumb, but it's actually an inch down, and it doesn't actually work very well. So they can actually create a tool, but they don't create a tool that's powerful enough to create the next tool.
- Joe Rogan
Hmm.
- Ray Kurzweil
So we're ac- actually able to cr- use our tools and create something that's that much more significant. Um, so we can create tools, and that's really part of who we are. Um, it, it makes us that much more intelligent, and that's a good thing. Uh, I mean, here's... So here's US personal income per capita. So this is the average amount that we make, uh, per person in constant dollars, and it's just-
- Joe Rogan
There it is right here. It's on the screen.
- 45:00 – 1:00:00
- Ray Kurzweil
Uh, I mean, we'll be able to create, I mean, the singularity is when we multiply our intelligence a million-fold, and that's 2045. So that's not that long from now. That's like 20 years from now.
- Joe Rogan
Right.
- Ray Kurzweil
Um, uh, and therefore most of your int- intelligence will be, uh, handled by the computer part of ourselves. Um, the only thing that won't be c- captured is what comes with our body originally. We'll ultimately be able to do that as well. It'll take a little longer, but we'll be able to actually capture what comes with our normal body, uh, and be able to re- recreate that. So, that also has to do with, uh, h- how long we live because if, if everything is backed up... I mean, right now, any time you put anything into a phone or any kind of electronics, it's backed up. So, I mean, I could loo- this has a lot of data. I could flip it a- and it ends up in, uh, a river and we can't capture it anymore. I can recreate it 'cause it's all backed up.
- Joe Rogan
Right. And you think that's gonna be the case with consciousness?
- Ray Kurzweil
Th- that's gonna be the case of our normal, uh, biological body as well.
- Joe Rogan
What's to stop someone like Donald Trump from just making 100,000 versions of himself? Like, if you can back someone up, could you duplicate it? Couldn't you have three or four of them? Couldn't you have a bunch of them? Couldn't you live multiple lives?
- Ray Kurzweil
Yes, um, uh-
- Joe Rogan
Would you be interacting with each other while you're living multiple lives, having consultations about, "What is St. Louis Ray doing? Oh, I don't know, let's talk to San Francisco Ray. San Francisco Ray is talking to Florida Ray."
- Ray Kurzweil
Uh, it, it's basically a matter of increasing our intelligence and being able to multiply Donald Trump, for example. That, that comes with that.
- Joe Rogan
Do you think there'll be regulations on that to stop people from making 100,000 versions of themselves that operate a city?
- Ray Kurzweil
Th- there'll be lots of regulations. There's lots of regulations we have already. You can't just, like, create a medication and sell it to people that it cures this disease.
- Joe Rogan
Right.
- Ray Kurzweil
We have tremendous nu- amount of regulation on that.
- Joe Rogan
Sure, but we don't, really, with phones.
- Ray Kurzweil
Yeah.
- Joe Rogan
Like, with your phone, you could, uh, essentially, if you had the money, you could make as many copies of that as you wanted.
- Ray Kurzweil
Yes. Uh, um, there are some regulations. We, we have, we regulate everything, but...
- Joe Rogan
Yeah.
- Ray Kurzweil
But you're right. Generally, electronics is, uh, doesn't have as much regulation as-
- Joe Rogan
Right. And when you get to a certain point, we will be electronics.
- Ray Kurzweil
Yes. Yes, I mean, certainly if we multiply our intelligence a million-fold, everything of that additional million-fold of yours is, is not regulated.
- Joe Rogan
Right. When you think about the, the concept of integration and technological integration, when do you think that will start taking place, and what will be the initial usage of it? Like, what will be the first versions, and, and what would, what would they provide that-
- Ray Kurzweil
Well, we, we have it now. Large language models are pretty impressive. And if you look at what they can do-
- Joe Rogan
But I mean, I mean, I'm talking about physical integration with the human body, like a Neuralink type thing.
- Ray Kurzweil
Right. Some people feel that we could actually understand what's going on in your brain and actually put things into your brain without actually going into the brain, uh, with something like Neuralink.
- Joe Rogan
So something that, like, sits on the outside of your head?
- Ray Kurzweil
Yeah. Uh, it's not clear to me tha- if that's feasible or not. I've, I've been assuming that you ac- have to actually go in. Now, Neuralink isn't exactly where we want because it's too slow, uh, and it actually will do what it's advertised to do, like if... I actually know some people like this who were active people and they completely lost the ability, uh, to speak and to understand language and so on, um, and so they can't actually say anything to you. Um, and we can use something like Neuralink to actually, uh, have them express something. They could think something and then have it be expressed to you.
- Joe Rogan
Right. And they're doing that, right? They had the first patient, the first patient that was-
- Ray Kurzweil
Yeah.
- 1:00:00 – 1:15:00
- Ray Kurzweil
two atomic weapons within a week, uh, 80 years ago, what, what's the likelihood that we're gonna go another 80 years, uh, and not, and not have that happen again? Everybody would say zero.
- Joe Rogan
Right. Right.
- Ray Kurzweil
But it actually has happened.
- Joe Rogan
Shockingly.
- Ray Kurzweil
Yeah.
- Joe Rogan
Yeah.
- Ray Kurzweil
Uh, and I think there's actually some message there. Um...
- Joe Rogan
Mutually assured destruction. But the thing is, would-
- Ray Kurzweil
But, but, but-
- Joe Rogan
... Artificial General Intelligence...
- Ray Kurzweil
But that, but that has not happened.
- Joe Rogan
Right. It has not happened yet. But would artificial general intelligence, in the control of the wrong people, negate that mutually assured destruction that keeps people from doing things? Obviously, we did drop bombs on Hiroshima and Nagasaki.
- Ray Kurzweil
Right.
- Joe Rogan
We did. We did indiscriminately kill, who knows how many hundreds of thousands of people with those weapons. We did it. And if human beings were capable of doing it because no one else had it, if artificial general intelligence reaches that sentient level and is in control of the wrong people, what's to stop them from doing... Th- th- there's no mutually assured destruction if you're the one who's got it. You're the only one who's got it, and you possibly... My concern is that whoever gets it could possibly stop it from being spread everywhere else, and, and control it completely, and then you're looking at a completely dystopian world.
- Ray Kurzweil
Right. So that's... If you ask me what I'm concerned about, it, it, it's along those lines.
- Joe Rogan
It's that. Along those lines.
- Ray Kurzweil
(laughs)
- Joe Rogan
Yeah. That's what... 'Cause that's what I always want to get out of you guys, because there's so many people that are, um, rightfully so, so high on this technology and the possibilities for enhancing our lives. But, uh, the concern that a lot of people have is that at what cost and what are we signing up for?
- Ray Kurzweil
Right. But, uh, I mean, if we wanna, for example, live indefinitely, this is what we need to do. We, we can't do... We can't-
- Joe Rogan
What if you're denying yourself heaven? Have you ever thought of that possibility? I know that's a ridiculous abstract concept, but if heaven is real, if the idea of the afterlife is real, and it's, uh, the next level of existence, and you're constantly going through these cycles of life, what if you're stepping in, artificially denying that?
- Ray Kurzweil
That's hard to imagine. I mean-
- Joe Rogan
It is hard to imagine, but so is life.
- Ray Kurzweil
I-
- Joe Rogan
So is the universe itself. So is the-
- Ray Kurzweil
Right.
- Joe Rogan
... Big Bang.
- Ray Kurzweil
My, my father-
- Joe Rogan
So is black holes.
- Ray Kurzweil
My father died when I was 22, uh, so it's more than 50, 60 years ago. Um, and, uh, it's hard f- And he was actually a great musician, and he great, created, uh, fantastic music, but he hasn't done that since he died. Um, and there's nothing that exists, uh, that is at all creative, um, based on him. We have his memories. Uh, I actually created a large language model that represented him. I can actually talk to him.
- Joe Rogan
You do that now?
- 1:15:00 – 1:30:00
- Ray Kurzweil
Can they actually... I mean, they can talk.
- Joe Rogan
They certainly do, but would you wanna be one?
- Ray Kurzweil
Uh, are we different than that, than that?
- Joe Rogan
Yeah, we're people. We shake hands. I give you a hug. You pet my dog. You listen to music. You have, you have-
- Ray Kurzweil
Yeah, we'll be able to do all of that as well.
- Joe Rogan
Right, but will you want to? Will you even care? The thing is, like, a lot of what gives us joy in life is biological motivations. There's human reward systems that are put in place that allow us to-
- Ray Kurzweil
Well, it's gonna be part of who we are.
- Joe Rogan
Right.
- Ray Kurzweil
It'll be just like-
- Joe Rogan
But that-
- Ray Kurzweil
... a person, and we'll also have our physical bodies as well, and that'll also be able to be backed up. And we'll be doing the things that we do now, except we'll be able to have them continue.
- Joe Rogan
So if you get hit by a car and you die, there's another Ray that just pops up. "Oh, we got the back-up Ray." And the back-up Ray will have no feelings at all about how it had died and come back to life?
- Ray Kurzweil
Well, that's a question. Uh-
- Joe Rogan
Yeah.
- Ray Kurzweil
... I mean, uh, I mean, why wouldn't it be just like Ray is now?
- Joe Rogan
It... Why wouldn't it? If it gets to a certain p- if we figure out that if, if biological life is essentially some... a kind of technology that the universe has created, and we can manipulate that to the point where we understand it, we get it, we, we've, uh, we've optimized it and then replicate it. Physically replicate it. Not just, not just replicate it in form of, you know, uh, in a, uh, computer, but an actual physical being.
- Ray Kurzweil
Right. Well, that's where we're headed.
- Joe Rogan
Do you anticipate that people will be happy with whatever they have? 'Cause, uh, if you decide, "I don't like being 5'6". I wish I was 6'6". I don't like being a woman. I like... I wanna be a man. I don't wanna be, uh, Asian. I wanna be," you know, whatever. "I wanna be a Black person. I wanna be..."
- Ray Kurzweil
Uh, we'll actually be able to do all of those things, uh, simultaneously and so on. We're not gonna be limited by those kinds of-
- Joe Rogan
Right.
- Ray Kurzweil
... happen- happenstance.
- Joe Rogan
Which is gonna be very strange. Like, what will human beings look like if you give people the ability to manipulate your physical form?
- Ray Kurzweil
Well, we do things now that, uh, were impossible even 10 years ago.
- Joe Rogan
We certainly do, but we don't change races, size, sex, gender, height. We don't, we don't do all those... And the, the radical increase in just your intelligence, like, what is that going to look like? What, what kind of an interaction is it gonna be between two human beings when you have a completely new form? You know, you're, you're much different physically than you ever were when you were alive. You're, you're taller, you're stronger, you're smarter, you're faster, you're-
- Ray Kurzweil
Well-
- Joe Rogan
... you're basically not really a human anymore. You're a new thing.
- Ray Kurzweil
Uh, I mean, we're expanding who we are. We already expanded who we are from, you know...
- Joe Rogan
Sure. Right.
- Ray Kurzweil
Uh...
- Joe Rogan
Over a course of hundreds of thousands of years-
- 1:30:00 – 1:31:40
- Joe Rogan
by mere human creations, creativity, all of these different things, why would it even have the ambition to do any sort of galaxy-wide engineering? Why would it want to?
- Ray Kurzweil
Um, because it's based on us, I mean.
- Joe Rogan
It is based on us until it decides it's not based on us anymore. That's my point. If it realized that like if we're based on a w- a very violent chimpanzee and we say, "You know what? There's a lot of what we are because of our genetics that really are a problem and this is what's causing all of our violence, all of our crime, all of our war," if we just step in and put a stop to all that, will we also-
- Ray Kurzweil
Well, I- I would-
- Joe Rogan
... put a stop to our ambition?
- Ray Kurzweil
I would maintain that we're actually moving away from that and the s- s-
- Joe Rogan
We are moving away from that.
- Ray Kurzweil
Yeah.
- Joe Rogan
But- but that's just natural, right? That's natural with our more- uh, our understanding and our... mitigations of these social problems.
- Ray Kurzweil
Right. So if we expand that even more, we'll be even more in that direction.
- Joe Rogan
As long as we're still we. But as soon as you become something different, why would it even have the desire to expand? If it was infinitely intelligent, why would it even wanna physically go anywhere? Why would it want to? What's the reason for our reason- uh, our, our, our motivation to expand? What is it? It's human. These are like, the same humans that were tribal creatures that roamed, the same humans that stole resources from neighboring villages. This is our genes, right? This is what made us, that got us to this point. If we create a sentient artificial intelligence that's far superior to us, and it can create its own version of artificial intelligence, the first thing it's gonna engineer out is all these stupid emotions that get us in trouble.
Episode duration: 2:03:02
Transcript of episode w4vrOUau2iY