Modern Wisdom
Why Is No One Talking About Existential Risk? | Mara Cortona | Modern Wisdom Podcast 229
EVERY SPOKEN WORD
120 min read · 24,139 words
- 0:00 – 0:46
Intro
- Mara Cortona
The COVID-19 pandemic, the level of uncertainty about it that seemed to be there shouldn't have been there. I mean, we know that pandemics happen. We know that pandemics at this scale happen. You know, it, it's, it's shocking to us 'cause we haven't experienced something like this in our lifetimes, but we all knew. It w- it was modeled, like, that something like this was going to happen. And we were completely unprepared globally. And not only were we unprepared globally, but the fact that so many people in the world are just caught up in these concerns about whether it's a hoax or these conspiracies. I mean, it, it's really very simple measures that need to be taken to control the spread of this pandemic that we all knew was coming. It's very clear that our communication and collaboration systems are very broken and things are heavily politicized.
- 0:46 – 1:31
Welcome
- Chris Williamson
I am joined by Mara Cortona. Mara, welcome to the show.
- Mara Cortona
Thank you.
- Chris Williamson
Pleasure to have you on. So, what are we, what are we gonna be talking-
- Mara Cortona
Thanks a lot.
- Chris Williamson
... about today?
- Mara Cortona
Um, I'd love to talk about existential risk, um, and the way that we relate to it as individuals and, and as societies.
- Chris Williamson
Positive conversation for us today, then. Everyone feeling good leaving this (laughs) leaving, leaving this in fear of the world stopping?
- Mara Cortona
Okay. It's something that really troubles me that people aren't concerned about more than they are, both existential risk and global catastrophic risks, um, to the point that it is something that I tend to bring up in casual conversation, which makes me very fun at parties, um, but it is a thing that I, I really think we should be talking about more. So,
- 1:31 – 7:57
Definitions
- Mara Cortona
here we are.
- Chris Williamson
What do we need to know about to start then? What's the, the glossary of words and key terms or whatever it is that we need to be aware of before we can begin?
- Mara Cortona
Sure. Um, well, I, I, initially, I think I'd like to draw a distinction between x-risk, or existential risk, which is, um, a risk to the entire species as we know it, and, uh, global catastrophic risks, which, um, perhaps wouldn't cause the en- entire extinction of our species, um, but would lead to mass die-offs and a really low quality of life. And those are, um, those types of risks are both more likely to happen and more likely to happen sooner than the major types of x-risks that are frequently modeled and talked about. So, those are important things to discuss. Um, some of the main, some of the most critical and pressing forms of x-risk, obviously climate change is the one on everyone's mind, um, though engineered pandemics, um, bio-weapons, and, um, nuclear war are, are right up there. So, it's really all, they're really all anthropo-, um, excuse me, anthropogenic, um, risks. And those are distinct from sort of this background rate of existential risk that's always there, um, from, like, asteroid collisions or, um, perhaps natural pandemics, or, um, super volcanoes or the like. There's always this background risk of those happening, which is fairly low, um, as we have been on this planet, in some form, for, um, you know, 2,000 centuries and we, we haven't come across anything like that yet. So, we're at the point where, um, those natural background risks are far outweighed by the anthropogenic risks that, um, are being precipitated and accelerated by our own activity. So, those are some of the main, um, the main terms that I use.
- Chris Williamson
The interesting thing that I learned upon reading Toby Ord's The Precipice, which is gonna contribute to a big chunk of my understanding (laughs) for what we're talking about today, one of the things that I thought was really interesting was that technology, the thing furthering civilization, wellbeing, further industrialization, et cetera, is causing many of the anthropogenic risks. However, because the natural risk is non-zero, i.e. if you ran this world, this planet for long enough, you would get hit by an asteroid, a super volcano would come and fuck you in the ass. If that's the case, you have to have sufficient technology to be able to avoid that from occurring. So, if you were a, a militant Luddite and said, "Well, all that we need to do, just stop all technology, stop all industrialization. We existed as hunter-gatherers for ages. Let's just go back to being farming people like 10,000 years ago. We'll be fine." Um, that's not even an option because from a, a civilization standpoint, eventually the natural x-risk is gonna catch up with you and you don't have any of the technology to save yourself.
- Mara Cortona
It's true. There, though there are schools of thought on that as well, um, there are many types of organisms that have been around in an almost unchanged state for millions and millions and millions of years, mostly marine animals, um, but cockroaches and horseshoe crabs are doing far, far better than we are in terms of longevity and stability. So, um, on the one hand, their rather simple way of organizing and maintaining themselves seems to be very successful, and, um, you're getting into, like, basically eternity; at some point, we will go extinct on this planet, um, but at the same time, you know, 580 million years, as some animals are currently (laughs), that's currently their record, like sponges, um, that's a great run of it. Um, and it's clearly far more than we have yet pulled off, and far more than it seems like we're likely to. So, on the one hand, um, there is the possibility that regressing to a, a more simple organizational structure could create the longest term payoff, um, and, but that's not really in our nature, is it? I think even if we wanted to go back to that sort of, um, you know, back to, back to nature-... type of organization structure for that purpose, I don't, we wouldn't be successful. It's not, it's not what we do. And, um, I'm not convinced it would be best anyway because there is so much suffering in the natural world, uh, and we have this constant drive to eliminate suffering. And so this kind of glorification or romanticization of the animal world, um, (smacks lips) feels a bit misplaced to me. I think it's really unpleasant to be almost every other animal besides a human, um, ultimately. It's, you know, very unpleasant to be a rabbit being eaten by a fox or, um... I mean, throughout most of human history, the vast majority of human history, and, um, in (laughs) the lives of most animals, it's just really, really hard to be alive, and there's a lot of suffering.
And so what you're pointing to with, um, these kind of parallel tracks where, yes, our technological advancement is really jeopardizing everything that we hold dear, um, at the same time, what it's making possible is something that, you know, a, a, a type of reality, a type of world without suffering that, um, has never, we've never even been able to conceive of. Um, I mean, just it's, it's a funny thing to talk about the massive amounts of risk that we're facing right now given the fact that our lives are so much more comfortable and pleasant and the possibility is so much greater for nearly everyone in the world than it has ever been by far. (smacks lips) So it's, um, I think the way that I relate to that sort of dichotomy, that dilemma is that suffering is inherently, like, from an ev psych perspective, more motivating than pleasure or, um, or happiness. And so it's, it's more, you know, we're, we're more motivated to wanna avoid a catastrophic outcome than we are motivated by realizing this type of utopic future that we're hoping for.
- 7:57 – 15:10
Pain Avoidance
- Chris Williamson
Do you think that scales when we're thinking about civilization-wide? I think people are quite capable of being pain-avoidant rather than pleasure-seeking, um, when it's themselves, but when you abstract that to even your town or at the very least, a civilization, like if, if ever there was a year to show us this as an example, it was 2020. Like the number of people that are saying, "You know, eh, th- the, the death rates are so low, it doesn't matter. Let's just get the economy started again." That's very short-termist thinking. I appreciate that not everyone has the utilitarian view that our goal is to reach our full potential as a space-faring, galaxy-colonizing civilization, which on the biggest, biggest sort of picture-thinking, that's what we're, we should be aiming for. We should be sacrificing everything that we can in our lives right now in order to ensure that the trillions that come after us are still able to, to be alive. Um, but people can't think with that much abstraction, not, not naturally, not without learning an awful lot.
- Mara Cortona
Yeah, absolutely. Gosh, it, it's so interesting how my thinking has evolved over the course of 2020 be- due to watching, um, responses to the pandemic. It's a really interesting example because the COVID-19 pandemic, um, it, it, it isn't quite... The level of uncertainty about it that seemed to be there shouldn't have been there. I mean, we know that pandemics happen. We know that pandemics at this scale happen. It's, um, you know, it's, it's shocking to us 'cause we haven't experienced something like this in our lifetimes, but we all knew. Like it, it was modeled, like, that something like this was going to happen, and we were completely unprepared globally. And not only were we unprepared globally, but the fact that so many people in the world are, um, I mean, just caught up in these concerns about whether it's a hoax or these conspiracies. I mean, it, it's really very simple measures that need to be taken to control the spread of this pandemic that we all knew was coming. And I think prior to this year, I had more faith in, um, the possibility of persuasion, mass persuasion and collaboration. Um, I, like, I would say that the number one (laughs) x-risk is really actually, like, communication. It kind of underlies our responses to all of the others, collaboration as a global whole. Um, and I think watching the res- global responses to COVID, it's very clear that our communication and collaboration systems are very broken, and things are heavily politicized. And the way to solve, I, I believe, climate issues and, um, pandemic issues, I mean, you can only imagine if this had been, like, an engineered pandemic with a high, far higher mortality rate. I mean, it would've been catastrophic, and that very well could happen in the next 100 years. It's, like, a one in 30, I wanna say. I think that was Ord's modeling, the Oxford guy-
- Chris Williamson
Mm-hmm.
- Mara Cortona
... of the chances of an engineered pandemic wiping us out within the century. Um, so I think ultimately, it's gonna come down to technological development. Uh, it's not gonna be something where we're gonna be able to sway the masses and get everybody on board with being very concerned about x-risk because, like you're, like you're talking about, um, the heuristics that we use to relate to these massive problems just don't work. So our biology and our psychology has really evolved in keeping with the Dunbar number, which is, um, it's, like, 150. It's the number of people that we are expecting to be able to form a relationship with, and, um, historically in a tribal setting would have known. And so that translates to our sphere of influence. Our sphere of influence is effectively, like, 150 people. Except now it's not. We had no, we, we never had any concept of force multipliers like we have now. And so our actions and our inactions not only affect, can affect people all over the world. Like, I can donate $5 to, um, you know, an organization providing bed nets for malaria, which I think has been, um, shown to be, like, the most, single most effective use of-... monetary donation for alleviating poverty and suffering. Um, and the impact that I can have is huge, but I'm still operating in terms of, like, expecting to see people, my community, and build a relationship with them and have a story and influence them on this one-to-one level. So like you were talking about, um, offline, it, we see a story about a little girl and we're so motivated, and we might waste massive amounts of resources in a way that's, you know, ultimately not very helpful when we could have done much more with that, with those resources. And so when you take it a step further and you talk about, like, the infinite set of lives that don't even exist, you know, like, I can relate to my children. 
It's a lot harder for me to start abstracting out relating to my grandchildren or my great-grandchildren, much less, like, the trillions, infinite possible lives that don't exist, and valuing those lives and giving them a spot in our policy discussions and giving them representation. It's a really, um, it's a really hard concept to get our heads around. And, uh, at this point, I, I don't think it's reasonable to expect that of the general discourse.
- Chris Williamson
I think you're right.
- Mara Cortona
And so, of course, then... Yeah. Um, yeah, I think the, the only other solution then is action by really the technological elite.
- Chris Williamson
What's that look like?
- Mara Cortona
Um, I mean, it would depend on, on the x-risk obviously, like in terms of climate change. Um, we talked about electric cars and the way that those have become mainstream, and it hasn't been by persuading a lot of people that this is the best thing to do in a moral sense. It's had to just come in through, um, from a, a technological standpoint and become a thing in the world that people wanna do for their own intrinsic motivations. It's gonna be the same for, like, um, animal cruelty or vegan foods and, um, you know, things like that that are big issues for people, but you're never, you're never going to convince the majority of people to be vegan, even if, you know, whether they should or not is another question from a health perspective. Um, but it's just not going to happen until we get to a point... Factory farming is not gonna be eliminated until we get to a point where, um, we have, we, the technology is there and we've provided a cheaper, easier, better, superior way of providing that value for people. Um, so I think with virtually every, um, every issue, every, especially every existential risk issue, it's, it's gonna come down to the actions of a few people in power and the way that they're able to reorganize. And so some of it... It feels a little gaslight-y with the climate change debate. There's, like, huge onus that's put on the individual consumer and the way it's, you know, the fault of the average person that this is happening. If, and if we drove less or we ate less meat, we personally could alleviate, um, some of-
- Chris Williamson
Have
- 15:10 – 17:33
Conflict of Interest
- Chris Williamson
you had a look at the... Have you had a look at, like, the stats on how much industrial units and large factories versus, uh, factory farming versus et cetera, et cetera contribute as opposed to whether I need to buy a light bulb which is three times the cost but uses half of the energy? Like, have you had, ever looked into the differentials on that?
- Mara Cortona
Mm-hmm. I have, um, a bit, and it's, it's, it's really staggering. Um, it's, it's, like, fundamentally, it's not going to happen. We're not going to sway the masses. And even if we did, that's not really where the power lies. Like, the average person around the w- the average person globally is starving. They are not worried about their contribution to climate change. And then even in a wealthier Western nation, um, the average person in America right now is struggling through a pandemic and trying to feed their family. And so we have this conflict of interest, um, which is always there. It's, it's part of, it's part of nature. Even, um, you know, like, the most symbiotic relationship that we know of, like a pregnant mother and a fetus, it's really not that harmonious. That's the reason that pregnancy and birth are so fraught with danger. It's like the fetus has an ultimate impetus of, like, completely draining the mother of all nutrients and resources so that it can be very healthy and robust. And then the mother organism has an ultimate end goal of, you know, giving the fetus enough to successfully birth it, but to maintain as much as she can so she can then go on and bear more children. So there's, at every single level, there's always this conflict between being, um, an individual actor in a system, a cell in an organism, and part of this macro organism. And so at every level, we, we see that.
- Chris Williamson
What's your... If you were to do a rundown, Mara's top three most likely existential risks to look out for over the next 100 years, what would they be? Starting at number three and then working to number one, which would be the, the biggest risk?
- Mara Cortona
Starting at number three. Well, some of this has been modeled, so... Of course there's so much uncertainty. That's the difficult thing with modeling it, is, um, it's always, it's always kind of a guess.
- 17:33 – 19:15
The Unknown Unknown
- Chris Williamson
No one's gonna hold you to it.
- Mara Cortona
So-
- Chris Williamson
If it, if it happens, they're all gonna be dead. So if you get it right, it doesn't matter.
- Mara Cortona
Right. (laughs)
- Chris Williamson
If you get it wrong, if you get it wrong, they might have a problem. But if you get it right, doesn't matter.
- Mara Cortona
(laughs) Well, the one that, um, I might put at number three would be the unknown unknowns. So 50 or 100 years ago, the major risks that we see today, we hadn't even conceived of. I mean, there, we had no language to discuss, um, some of the environmental issues we're facing as well as, you know, the idea of bio-weapons at this scale, or, you know, like, mass-deployed autonomous drone bots... that are, you know, nanotechnology. I mean, that type of risk was not, um, even in our parlance. So that risk of the unknown unknown I think is quite real within the next (laughs) 50 to 100 years. And, um, and that's something that is difficult to prepare for. And the only way I think to take it on head on is to, um, look at our mitigation methods and to invest as much in R&D as we can. Um, but then beyond that, it seems like... I, I wanna say, I, I actually don't think that I w- I don't know if I would put climate change in the top three. I would say bioweapons and, um, what's called, like, misaligned artificial intelligence might be the top three.
- Chris Williamson
So I think I put... The unknown unknowns is a really clever answer that I didn't think of, but that would be mine as well. I guess,
- 19:15 – 23:25
What Would You Do
- Chris Williamson
I don't know. I'm not, I'm not sufficiently familiar with nanotechnology and whether how that or a worry of that would be distinct from misaligned AGI. Um, like the gray goo concern that we have, like, where does the line get drawn between that and AGI? Does that make sense? Like, does nanotechnology fall underneath? Could it even realistically be deployed en masse at that level of sort of capability without some sort of artificial general intelligence over the top? But yeah, I think, I think that's not a bad, that's not a bad top three. Um, should we talk about what you would do if you were in charge of the world to make the public more aware of the impending existential risk?
- Mara Cortona
Hmm. Yeah. Um, it's interesting. I do think about this a lot. I'm, I'm curious. When I talk to people, it seems like there's some amount of awareness, but it's something that we don't wanna think about. And in my generation, I'm like older Gen Z, young Millennial. Um, there's, there seems to be quite a bit more. Like, as someone who was not old enough to remember 9/11, like, our whole, um, our whole coming of age has been really dominated by this talk of existential threats. So I, I, I do see in, like, the younger generations, it seems like there's more awareness. But at the same time, it's almost like... I mean, it, it's such an abstract problem that's so hard to wrap our heads around and it feels so helpless. It feels so fruitless. We feel so small. And so one of my main focuses over the last year, um, has shifted to, how do we relate to these risks on an individual level? Because I could... I have all sorts of prescriptions about ways we can mitigate, um, each of these individually and at a collective level. But, um, there's some amount of hubris in assuming that my policy recommendations would, um, ultimately be the right thing to do. And there's also the fact that I have very limited influence. I, I couldn't, I can't actually enact any of these things. But as (laughs) a member of the human species, I'm, like, deeply concerned for our future. So how do I relate to that? And how do I balance this need to live a fulfilling and fully developed, actualized life with the need to safeguard our future? So for some people, that's not even a given. It's not even a given for many, many people that I've come across that the future of human- humanity is worth safeguarding. Um, I have an inherent value on consciousness. Uh, I believe it is a gift back to the universe in a sense. Knowing the universe as it is, is a really beautiful thing. Um, and as far as we know, existence is better than non-existence, or at least it's more interesting.
So it's inherently, it's like an a priori good. Um, and at the same time, there are many people who do feel that that suffering, given that, you know, it, it's, it's more of an, um, it's more of an impetus. It can out- it outweighs that happiness, um, or that pleasure that we receive throughout the course of existence. And so is it even worth, um, safeguarding? And obviously, these people don't exist yet. So it's kind of like in religious communities where they, like, have to keep popping out more and more babies because inherently more life is better than less life. That's like the ultimate end of that line of thinking. Um, so there's some balance where, is it true that inherently more life is, is better than less life? Um, how do we put a value on the unknown amounts of suffering that might be persisting in the future? So if we can create this sort of society that seems to be on the horizon in terms of, um, greatly diminished suffering and greatly diminished poverty, um, the world over, and technological advances. I mean, like, so the poorest people in the world today are still considerably better off than, in many ways, than, um... Well, I don't know. I, I don't wanna say it quite like that. But they have... The poorest people in the world today still usually have access to things that the richest people in the world didn't have 100 years ago. So there's this, there's still an asymmetry, but the quality of life overall is so, um, is so fantastic
- 23:25 – 25:32
How Can We Drive This Home
- Mara Cortona
and-
- Chris Williamson
But how do we get-
- Mara Cortona
... is only gonna keep increasing.
- Chris Williamson
How do we get people to think about this? Like, how can we drive this home? Um, I'll give you my prescription after, after yours.
- Mara Cortona
Oh, okay. Um, I'm so curious to hear yours. Um, I think it comes back to examining heuristics and examining the way that we relate to all the information that's coming in. Um, so I have, I have all these motivations for the things that I do, and some of them are conscious and some of them are unconscious. Um, some of them are conditioned. Most of them are... Most of them haven't actually been given as much thought as I think that they have. Um, e- even when I believe I'm acting altruistically, I'm usually acting altruistically in a way that creates a positive sense of feedback.... for me. And that often is not the most effective way to, to be. So our idea of saintliness or goodness or morality are still really caught up in that Dunbar number (laughs) , um, type of society where I see a person, I make them smile, I get a feedback loop. My mirror neurons respond and I feel like a good person, and now I'm contributing to the world. And that's, that's... We need to like... I think my, my prescription would be to completely reexamine our idea of morality. So for instance, like Bill Gates ha- m- very well might be the most... might have done more good than anyone in history up to this point, just by the sheer... And that's not to say that he's a particularly saintly person, like, but just that he has had more reach and more influence and he's relatively, um, strategic about the way that he allocates funds. So, actually looking at outputs and efficacy and responding and, and assigning moral value based on real outputs and the way that they affect problems in the world is considerably more fruitful than, um, responding from any other place.
- 25:32 – 30:15
Human hubris
- Chris Williamson
It's hard though. We, we, we talk a lot on this show about living a consciously designed life, trying to get rid of the genetic predispositions and the ways that you've dealt with past trauma and the paths of least resistance and the... everything. Like, trying to deprogram all of the programming and be as conscious as possible with your actions and your thoughts and your words. But the problem is, I think we often do... The, the hubristic tendency of humans is to believe just how fucking smart we are, and we're not. We're very, very, very primitive. And if ever there was a time to see that, it's in the response to the pandemic. Like if you were... if everyone had the capacity to think on a civilization-wide level, everyone would have happily locked themselves in their house until every last drop of COVID left. But we don't. We've still got a lot of personal motivations that are these, like archaic hangovers from a time where we needed to be tribal, we needed to fight over mates and resources and whatever, you know, pick whatever it might be. Um, I, I don't think that the vast majority of people, myself and you included, are anywhere near actualized enough to properly, properly know exactly what it is that we're supposed to be doing in order to be able to do that. Um, my, my only... Y- your idea was much more abstract and fun than mine, um, mine is just to continue this conversation with guys like Toby Ord, with guys like Nick Bostrom and Sam Harris who are sufficiently charismatic. They're, they're at the, the, uh, correct intersection of charisma and understanding, and that they can... Like, give them a TED Talk, give them 10 TED Talks. Give them, like every TED Talk from now until people believe that existential risk is a big deal. Um, and that for me is, in 2020, it's how an idea pathogen really transmits. It's by finding someone who has some social equity with sufficient visibility and/or reach or clout and then just distribute it.
But at the same time, there was that clip from, I'm sure that you saw of Bill Gates, at the beginning of this year, where in like 2014, he was like, "Yeah, the nec- the next big sort of risk that everyone's going to come up against is global pandemics." And this video had been on like a documentary, a really big documentary with a production budget, and people were sharing it around going like, "Why didn't anyone know?" And it's like everyone knew. Like the only reason that you don't see that is because the- there wasn't sufficient reach on it. So just getting more charismatic people talking about it is like my solution. But I know that really individual actors, it's the same, the same as us talking about how to change climate change. Like you could have probably 90% of the population be concerned about existential risk, but the top 10%, they're the ones that actually influence policy and the direction of civilization. If they're, if they're not on board or can't be bothered or it doesn't make sense to them, the entire population below them trying to enact change is not gonna make any difference.
- Mara Cortona
Right. Yeah. I think what you're pointing at, there's like these two separate things that need to be in place for real social upheaval or cultural upheaval. And one of them is the sort of the public substrate, um, which can be influenced by getting more visibility in the ways that you're describing with these prominent figureheads. But, um, a great example would be like what we're seeing in the US right now with the BLM protesting, and that was precipitated by George Floyd's death. Um, and that particular death was not the most gruesome or the most offensive or the most anything. So why did it spark off this summer of protest, which is still going on, like in Denver right now? I mean, every single night, it's still happening months later. And, um, there was, like fire outside of the police station down here the other night. Um, but there, it... there's this... Uh, Micah White, the Occupy Wall Street guy, was talking about this recently. There's not, um... No matter how carefully you plan, like a rebellion or a revolution or an insurrection or a demonstration, like it's not... There has to be some sort of natural cataclysmic event that happens right around the same time that precipitates it. So, um, it, it can kind of feel like talking about these issues, um, whatever your pet cause is, like you're... you know, you're just grinding the wheels and you're not getting anywhere. And there, there's some amount of chance. There has to be something that really shakes everything up. And, and that's
- 30:15 – 32:16
Why is everyone not thinking about existential risk
- Mara Cortona
just luck. Is that gonna happen in time? So, you know
- Chris Williamson
But why... Like we're, we're in the middle of a pandemic.
- Mara Cortona
It could be.
- Chris Williamson
Why is everyone not thinking about x-risk now?
- Mara Cortona
Right. (laughs) Well, um, to some degree they are. I've seen quite a bit more- quite a bit of a rise in, um, concern, uh, the word apocalypse I've heard, like, 3,000 more times than I ever heard before (laughs) in my life up to this point. Um, I think people are getting more and more interested and concerned. Of course, eh, this is- this is one particular issue that I worry is, um, not representative of how some of the bigger risks that we're facing might take hold, um, but I- it does seem like it's a good time to be- to be talking about this. There's more receptivity for sure. It feels- it's palpable in the air. People feel like the world is on the edge of collapse. There's major natural events and there's, you know, um, threats of war and major political issues going on and the pandemic. It's kind of- kind of a- a confluence of a lot of risk factors, um, so. But I do wanna circle back around to the- that concept that there's existential risk and then there's global cata- catastrophic risk which is much more likely, which is not a- a situation, like with climate change, the likelihood of all the humans being killed in the near term is not really that high, but the likelihood that the majority of the world is gonna become uninhabitable, lots of people will die and the remainders will have, um, a really low quality of life is- is much higher. And that seems to be more motivating for people. It's like we can't abstract out and conceive of the lives of people that don't exist and might never exist, but if we can think about our children (laughs) suffering in a really unpleasant world and think about the likelihood of that, it's quite likely and it's- it's quite unpleasant, and, um, I think some people would probably prefer nonexistence to some of the types of dystopian futures that we're facing anyway.
- 32:16 – 34:05
David Attenborough on climate change
- MCMara Cortona
- CWChris Williamson
Well, if it's just everyone Mad Maxing around wearing leather with a lot of stuff with spikes on-
- MCMara Cortona
(laughs)
- CWChris Williamson
... like riding around in an old Ford- Ford F- Focus or something driving across the desert. Yeah, I am- I have an interesting sort of view. I sent you a video earlier on, David Attenborough, um, he was trending on Twitter today talking about how important-
- MCMara Cortona
Yeah.
- CWChris Williamson
... climate change is and how don't waste anything. Don't throw away food, don't throw away packaging, don't do... The- the planet is on the edge of- of collapse. How correct is David's science there?
- MCMara Cortona
Uh, I didn't watch the whole video. I didn't see his- the science that he cited. I did watch, um, I did see him talking about the urgency and his- the critical, um, impetus to get this out to the masses, and it was interesting how I related to it. I really appreciate Sir David Attenborough and his approach and, um, he is definitely trending with that. And at the same time, it- it again brought me back to that, you know, um, that question of how much impact is this really going to have and in what way? And in a really s- direct straightforward way, I doubt it's gonna have much impact, like the majority of people who were already aware of their environmental impact are going to continue to be and the ones who aren't whether because they are unconcerned, um, and unfortunately a lot of those people have a very big impact, um, or, you know, they're just in survival mode and it's not a priority for them. It's probably not gonna shift that much. It- it- there's, you know, I- so I- my question is about who he is- who he's influencing and- and what the intent is there.
- CWChris Williamson
Mm-hmm.
- MCMara Cortona
Um, because the people with the real capacity to make a change are- are in tech.
- 34:05 – 36:32
Who is David Attenborough influencing
- MCMara Cortona
- CWChris Williamson
How would they make a change?
- MCMara Cortona
Um, so on a- well, on the climate side actually, I would say more research. Like there are still so many unknowns about what the biggest risks are, um, and then in terms of switching to clean energy, that's really gonna come from- from the- from tech. It's not gonna come from the individual, the average individual consumer. So, I have curiosity around what the best way to affect and influence that is. Um, it does seem- it does seem at every level though to start with the individual which is- which is interesting. It's, like, fundamentally... And this is why so much of my focus is on personal, um, kind of self-development. I mean, it's- it's really the only thing we ultimately have that much control over, but there's this- a way of relating with self-development as, like, an internal growth thing and then there's a way of just completely moving past that idea of self and seeing ourselves as a part of a giant macroorganism like ants in a colony. We're like one being. And the more we can relate to that, um, the more that we can align all the choices in our lives around that. So the more I'm- I'm operating from, like... It's- it's like a marriage of- of two critical pieces that this is what I try to convey, um, when I talk about it mostly. It's, um, it's alignment with the macroorganism and then it's also, like, a rigorous use of reason which of course is not to get into hubris and to assume that we can predict all the outcomes of all of our actions, but it's a commitment to really charting what are my impacts in my career and the way that I choose to live, um, and how do they relate to the biggest risks that we're actually facing? So, I think there are people who are in much better positions to make major change in the world than others, and, um, the more that those people can be reached, I think the better. But I don't know that that sort of messaging, now is the time to recycle more, now is the time to consume less is really gonna make the biggest impact. 
It seems like getting that point of self-development and real, like, alignment and real rigorous examination of impact to the people who actually have a big impact in the world, CEOs and tech leaders and researchers and people working in policy, um...... those seem like they're really critical actions to take. How does that-
- CWChris Williamson
I-
- MCMara Cortona
How does that
- 36:32 – 42:32
The Improvement Imperative
- MCMara Cortona
land?
- CWChris Williamson
I think so. I came up with a, an idea a few months ago called the improvement imperative, which was that-
- MCMara Cortona
Hmm.
- CWChris Williamson
... it is your duty to be everything that you can. The reason being that you can impact the lives of the people around you and if you raise them up, then they raise the people that are around them up. Kind of like a, a positive sum effect, a positive pathogen, I suppose, that spreads. And for every one person that's able to do that, for instance, you know, Joe Rogan. How many people's lives, how many eyes has he opened? Say what you want about some of the stuff that he comes up with, like, "Oh, I don't like what he says about transgender athletes." Like right, okay, mate, he's done f- 4,000 hours of online programming which has reached billions and billions of sets of ears. Like how many people's eyes have been opened to a, a different way of viewing the world, to be more reasonable, to be more nuanced, to be more, you know, complex, to have their self-development improved? Like that is him contributing at around about as close to a highest cadence as I can think he could. And the opportunity for everyone to do that, whether it be a single mom who's r- raising two children, "Okay, I'm gonna raise these children to be as actualized and happy and independent and positive and da, da, da, da, da, da, da," you know, all the things that we know that produce a good human, "as possible," there you go. Like that's, that is, that is contributing. Um, but there's just so much (laughs) inherent tribalism and laziness and dis- all of this stuff that we've carried over from a time where we really needed it, but in- it's not fitness, it's no longer fitness enhancing, it's no longer adaptive. Um, and as the environment that we're in continues to change as quickly as it does, any adaptation that evolution did manage to luck out would be completely ba- past its sell-by date within 15 years in any case. 
Pointless adapting yourself to the w- to, let's say that you adapted a way to, um, not become a- addicted to a device that was in your hand and that you wouldn't be able to have your dopaminergic system, um, manipulated by that. 20 years' time, you're not gonna have a device in your hand, it's gonna be attached in your brain. And then in, you know, 150 years' time, the transhumanism movement will be with us and you won't even be walking around in any case and you'll just have electrodes plugged in and you'll just be sat, like that Bruce Willis movie where everyone was just like a weird at home, fat, weak g- guy playing-
- MCMara Cortona
Mm-hmm.
- CWChris Williamson
... a computer game with their model version of themselves floating around. Um, but one of the things that keeps on coming up here is climate change. What, what has made climate change such a high priority, highly visible, existential risk category, uh, i- issue for people to be bothered about? Why is it not that everyone is thinking about the concerns with biotechnology or nanotechnology or artificial general intelligence, which by pretty much everyone that I know's standards, AGI is the risk to be concerned about over the next 100 years? And Greta Thunberg isn't driving to go and see the DeepMind people and talk to them about, "Have you considered your alignment and control problem? Do we have machine extrapolated volition here so that we can actually try and wrangle this thing back?" Like Greta Thunberg shouting at adults about the fact that they're not, that they're wasting money on bottled water.
- MCMara Cortona
Right. Yeah, it is interesting. I've been so curious about that as well. We have this whole specter of, hmm, risk assessment and response that has had varying degrees of success. So to put those, to put them on like, on a spectrum, um, with climate change towards the center, on one hand there's like the response to asteroids. Um, that has been a very successful response to exi- existential risk. Granted, the risk of being wiped out by an asteroid is fairly low, um, but still, we had a great response to it. We didn't know until the last 30, 40 years, um, that, that asteroids were likely to have been what wiped out the dinosaurs. That was, that's like really new knowledge. And that quickly, I mean, I didn't realize it was so new, 'cause my entire life, it's been something that, you know, the government has responded to and now it's like, it's pretty neg- it, it was already fairly negligible and now it's like, you know, a microscopic risk because we have charted everything around us and we have a really good... we have really great situational awareness in that regard. Um, but that didn't become politicized. It was something that like, it hit at the right time, um, there was, there was a few things that happened all at the same time, it was like we gained more knowledge about the impact of an asteroid hitting the Earth. Um, there were a bunch of movies that came out around that time because it was, um, hmm, it was a hot topic and then there was the Shoemaker-Levy comet hitting Jupiter, so, um, and that was like, uh, created a really big impact for people. And so it wasn't politicized and everybody was able to get behind it and it was funded and, you know, this one country basically took it on and kind of solved that problem for the rest of us. And then on the other extreme end of the spectrum, there's, um, these concerns around, um, exponentially advancing tech that are pretty, fairly neglected. Um, there's not very much work being done. 
And then in the middle, there's the- there's this, I think, this perfect storm that's created all the fervor over the climate change debate, which is that, on, on both ends of the spectrum, it hasn't been heavily politicized. And so we're either able to like effectively respond or it's kind of left alone, but then when something gets politicized for whatever reason, it becomes just, um, there's this, there's this fervor over it and it, um, it becomes a tool and a weapon for people with completely, you know, unrelated aims and goals. And I think that-
- CWChris Williamson
And skillsets as well.
- MCMara Cortona
... it seems to be why climate change is.
- CWChris Williamson
Like
- 42:32 – 45:02
Climate Change
- CWChris Williamson
don't, don't forget that there's people talking about...... climate change who don't have the first idea about what the actual stats are behind it. You know, there's not many people who are discussing the control problem for AGI that don't actually know what's going on. But sadly, as you get a problem which has more, um, social signaling behind being associated to it, people jump on board without having done their research.
- MCMara Cortona
Right. Right. Virtually everyone has an opinion on climate change, so it's quite natural, I think, that it's, um, that it's exponentially compounded into this massive, um, debate. But it does seem to be... I'm much more concerned about the next pandemic, personally, than I am about climate change. And seeing our response to COVID-19, I'm even more so. So I, I'm curious about... And, and also, you know, tensions escalating as we're, we're coming up, um, against our re- our election in the US. I'm very curious what's going to happen. But there's, there are a lot of things that, um, are not really being addressed at a global level. And I think what would be really useful would be, um, to have more, more of a, more of a holistic, like, global agency focused on mitigating X-risk and observing it and, like, an independent agency really focused on that. So I work in space. I work in the space sector in, um, astropolitics, so a lot of what I observe is the way that different space agencies in the world communicate around, um-
- CWChris Williamson
Astro- astropolitics.
- MCMara Cortona
... their shared and disparate... Astropolitics, right. Um, which is what it sounds like, the politics of space, which is a fairly, um... It's kind of like the Wild West. There's a really small degree of, um, well thought out collaboration and foresight actually going on in space law. It's a commons. It's like, how do, how do we, how do all these different players, um, governments, and, um, organizations, and private companies that are trying to utilize this commons do so in a way that protects both our, our shared and our disparate goals and make sure that it's not weaponized and... It's, we've really done, it's kind of a shit show right now, quite honestly.
- CWChris Williamson
(laughs) I saw a-
- MCMara Cortona
Um, and so that has-
- CWChris Williamson
I, I saw, uh, an article-
- MCMara Cortona
What's that?
- CWChris Williamson
... that was
- 45:02 – 46:42
Colonizing Mars
- CWChris Williamson
decolonize Mars. Did you see this?
- MCMara Cortona
I didn't see it.
- CWChris Williamson
Oh, wow. This is so, so up your street. So, um, James Lindsay shared it. Basically the concern was that colonizing Mars is an echo of the colonization of the West by Europeans and the subsequent destruction of the native peoples that were therein. Um, we need to have a discussion about who is going to colonize Mars, and James Lindsay quote tweeted it saying, "We're having a discussion about decolonizing a planet that we haven't even colonized yet."
- MCMara Cortona
Right, right. (laughs) Oh, that's so funny. I have heard that, um, that debate quite a bit. (laughs) It is, it's interesting. I'd love for you to send that to me so I can, I can read it.
- CWChris Williamson
Yeah.
- MCMara Cortona
Um, yeah. (laughs) There's so, there's so much concern around our technological advancement as well as, like, moving out into the universe and colonizing other planets because of these concerns that we'll take human nature with us and that we'll just reenact these atrocities everywhere we go. Um, and I have, I do actually have more faith in human nature than that. It seems like the more that we, um, the more that we are able to collaborate towards a shared goal, I always say the ends are the means. It's like, as we collaborate towards these sort of massive goals that surpass, um, that we can't really achieve as these individual warring factions, the more that, um, things tend to smooth out all the way down. So.
- CWChris Williamson
I agree. I agree completely.
- MCMara Cortona
But
- NANarrator
That's the way it's gonna be there.
- CWChris Williamson
... I think as well that the people who are, the people
- 46:42 – 50:25
The Prisoners Dilemma
- CWChris Williamson
who are being selected for that as well are going to have had some pretty-
- MCMara Cortona
Right.
- CWChris Williamson
... rigorous psychometric evaluations. You're not just going to get someone who happens to become a neo-Nazi halfway to Mars. Like, you know, they've-
- MCMara Cortona
Right, right.
- CWChris Williamson
... they've been through it and they've dedicated decades of their life to, to this one purpose. And for all that Hollywood might decide to dramatize the, the guy who goes crazy in space, we're yet to see that. I don't think there's any examples of that. Now, again, we haven't done, you know, a, a trip to Saturn's moons, but we've been, spent a fair bit of time. There's people that have done, like, 100-plus-day-long stints up there and have held it together pretty well. Um, the signs are encouraging. But you, the thing which struck me the most, uh, upon, like, hearing, uh, chatting to you over the last few weeks in preparation for this and then also reading Toby's book, is that we are at the perfect junction between having enough power to be able to do something that could severely neuter our ability in future and having nowhere near enough wisdom to corral that power. It's, uh, Eric Weinstein, uh, "We are gods, but for the wisdom."
- MCMara Cortona
Mm. Yes, yeah. It's so interesting. It, it brings me back to the prisoner's dilemma really. Um, how do we... We are these individual actors and, um, we'll be much more successful the more we're a- we're able to collaborate towards a whole. But we, we don't currently live in a world where that's entirely possible. I struggle a lot to keep, you know, like, everything going in my life. And, um, and I have it significantly better than the vast majority of the world. And so, being able to have... It's a luxury to be able to focus on the overall wellbeing of the greater organism. Um, but the, the, oh, who, what was the name of the guy who came up with tit for tat, that, um-
- CWChris Williamson
Oh.
- MCMara Cortona
... way of resolving prisoner's dilemma?
- CWChris Williamson
I know the-
- MCMara Cortona
Rapoport.
- CWChris Williamson
I know the guy that you mean.
- MCMara Cortona
... yeah, Rapoport. I can't remember his first name, but, um, yeah. It's essentially, like start from a... The, the one algorithm that's so simple that is the most successful at removing the prisoner's dilemma is to start from a place of assuming the best intentions and then to mimic whatever you received from your, um, from the person that you're working with. Whatever their last move was, to mimic that. And that's, that's the best chance we have of being successful. Obviously we're living in a very complex world, so it's not quite so simple, but we are all responding. You know, am I in a world where I need to really fight to get ahead? Um, then that's what I'm gonna do, and that's what I feel like I need to do to stay alive. And so I can't consider the long-term health of the rest of, the rest of society. There's some amount of, um, antisocial behavior that can be tolerated in a society before it starts to collapse, and once it gets over that line, you know, there's massive collapse. And so that's why I think I keep coming back to, um, yes, massive personal responsibility in terms of aligning ourselves with our impact, exercising full agency over everywhere in our lives that we have impact. But also really holding accountable the people who have the power because it's highly concentrated and those are the people who set the tone for everyone else, and can help create this world. And I th- and we are moving towards it. We're moving towards it swiftly, um, this- moving towards this world where people can relax into greater possibility and begin acting in more pro-social ways. Um, it seems to be what works and so human- we will tend to default to what works.
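The tit-for-tat rule Mara describes — open by assuming good intentions, then mirror the other player's last move — is simple enough to sketch in a few lines. This is an editor's illustration, not anything from the episode; the payoff numbers are the standard textbook values for the prisoner's dilemma, and all names here are the sketch's own:

```python
# Minimal sketch of tit-for-tat in an iterated prisoner's dilemma.
# Payoff values are the standard textbook ones (an assumption; the
# conversation doesn't give numbers): mutual cooperation -> 3 each,
# mutual defection -> 1 each, lone defector -> 5, exploited cooperator -> 0.

PAYOFF = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def tit_for_tat(my_history, their_history):
    """Cooperate on the first move, then mirror the opponent's last move."""
    return "C" if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    """A purely exploitative opponent, for contrast."""
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    """Run an iterated game and return the total score for each player."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a += pa
        score_b += pb
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

# Tit-for-tat against itself settles into stable mutual cooperation...
print(play(tit_for_tat, tit_for_tat))    # (30, 30) over 10 rounds
# ...and against a pure defector it is exploited only once, then retaliates.
print(play(tit_for_tat, always_defect))  # (9, 14)
```

As in Anatol Rapoport's entry to Robert Axelrod's tournaments, the strategy's strength is that it is nice (never defects first), retaliatory, and forgiving — which is the "assume the best, then mimic" behavior described above.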
- 50:25 – 54:06
The Chance of Extinction
- MCMara Cortona
- CWChris Williamson
Toby Ord's, um, numerical values that he gives for society's chances in the future make for pretty stark reading. So he says that the chance that we go extinct or that we permanently mute our ability to reach our full potential as a civilization within the next century is one in three, and within, uh, forever is one in two. So the vast majority of our X-risk is front-loaded over the next century. One in three, so only a two in three chance that we decide to, that we ac- we actually manage to make it, and then a one in two in even like a time cost that we do after that. What's your opinion? What would you... Where would you put your numbers if you were doing the same equation?
- MCMara Cortona
Yeah. I don't know that I would differ significantly. It does seem to be heavily front-loaded. Um, it does seem to be an absolutely critical juncture and, and that's one of the questions that becomes difficult when modeling X-risk too, is it- just as it's impossible to account for the unknown unknowns, it's also impossible to account for the ways that are, you know, we are adapting. So for instance, a lot of, um... We're very vulnerable because of our dependence on the sun and, um, if we are able to devise new ways of nourishing ourselves without needing agriculture, um, you know, a lot of our major concerns will go down quite a bit. Like, life still might be very unpleasant in, in case of volcanic dust and ash, um, filling the atmosphere, or massive global warming, but we could survive it if we, if we're more able to develop these different sorts of adaptations. So I... That should be more of a focus, I think, than it is, um, but it also skews our ability to model, um, a bit. So it does seem like Ord is really on point as far as any modeling that I've seen, and there's always some element of uncertainty which is, that element of uncertainty is, is more of, um, a stressor for me than, than it is soothing. So yeah, it's interesting. I think that's where, um, I come back to also like chop wood, carry water. Ultimately we're at a very critical juncture and, you know, things are in play. There, there are thing... We can ha- keep having these types of conversations publicly and get as much focus on these issues as we can and hope that the right people end up receiving the right messages, um, but fundamentally we don't, we don't ultimately have that much control as individuals. And so there's some amount of getting in right relation with that and, um, accepting it that I think needs to happen in order to face these issues.
We have to be able to like eyes wide open accept what does it mean for the entirety of humanity, the entirety of consciousness as we know it to not exist. It's huge. Um, and that becomes really a metaphysical kind of almost spiritual practice, um, and that's kind of a whole other conversation coming back to like accepting that I am no, I am not really a self. And so when there's no self left, maybe it'll be okay, 'cause only when you, only when you really understand that can you start making really conscious decisions about how to relate to it.
- CWChris Williamson
Yeah. It is a, um, it's a blessing and a curse to be able to step into our own programming. I th- I've been thinking about this a lot recently 'cause I've been spending (laughs) too much time with dogs, which there is no such thing as spending too much time with dogs.
- MCMara Cortona
Too much time with dogs?
- CWChris Williamson
Yeah. It's impossible.
- MCMara Cortona
(laughs) .
- CWChris Williamson
Um, but everyone's looked at
- 54:06 – 1:09:06
Our Responsibility
- CWChris Williamson
a dog and thought, "Oh, it'd be so brilliant if I was that dog. You know, I'd just lie on the floor and he'd look so happy all the time and life would be simple and this, that and the other." But the fact that we're the only animals that we know of in the entire universe that can step into our own morality puts such a huge burden on us. Like, the fact that, in the words of mutual friend Daniel Schmachtenberger, that we're not just, uh, cargo on spaceship Earth, but we're crew.
- MCMara Cortona
Mm-hmm.
- CWChris Williamson
It is our job to try and move the direction of the only-... conscious beings in the entire universe that we know of that can step into their own morality and decide to guide their future in a direction. Like, the lions aren't making it to space. And as much as I'm terrified of cephalopods, and as clever as they are, they're not building rocket ships either. Like, f- for all the-
- MCMara Cortona
Right, right.
- CWChris Williamson
... for all that Adrian Tchaikovsky in Children of, uh, Children of Ruin might want that to be the case. He genetically modified some octopuses, octopi in that, and then they're flying around in big orbs of water. Wonderful book if anyone wants to read something that will make your brain explode. Um, but that's, that's not a concern. It's us. It's on us. It's just on us. We have Fermi paradox, h- got no answer to that yet. Don't really know where they are. If it's on the heads-
- MCMara Cortona
Right. It's-
- CWChris Williamson
... of, of seven billion previous apes recently found electricity within the last couple of 100 years, I don't know, like it feels ... A, a precipice isn't enough of a, a violent term for Toby to use. It should be like Planck length knife edge.
- MCMara Cortona
Right. Right. And w- it's so funny to relate to it that way too when you look at the way that, that we (laughs) , that we wield this responsibility. We think in terms of quarterly profits and election cycles, and these like utter blips in time, and then we plot, you know, our goals around, you know, the next four years or the next three months and maximizing returns in those time periods. Um, it, it really reminds me most of a toddler and the way that a toddler relates to the world, in this very short term thinking, unable to make decisions, or a teenager. Maybe that's, I think, I think Ord actually, um, makes that analogy as well that we're kind of like in our adolescence, and the way that we are, the way that we're, um, the way that we're wielding that power really shows that. You know, we're like smoking a cigarette and we don't ... You know, we have no concept of what the long term impact of that is gonna be, 'cause we're so, we're so focused on the here now, and we, suddenly we have all this power, and we have these lofty ideals and we're using it in these different ways but we're just, we're really lacking this, uh, this wisdom and this, this systems thinking ability. So, the question is how to quickly grow up (laughs) ? How to quickly up-level that as a cell? Like, not as the, the organism itself, 'cause each of us is only an individual cell. Um, and it's, it's, it's interesting and it's amusing and, and kind of sad.
- CWChris Williamson
It is melancholy, isn't it? Like when y- I, I find myself-
- MCMara Cortona
It is.
- CWChris Williamson
I find myself feeling sort of very melancholic after I read stuff like Superintelligence by Nick Bostrom or uh, The Precipice by Toby Ord. Um, I, I, it, it's weird. It, it makes me feel very grounded and very down to earth in the same way that looking at the night sky does. Like, it's kind of reassuring to know how li- how small you are, and how limited your impact can be, but then also gets me very agitated at seeing both my own and everyone around me's wasteful use of their consciousness. Like, you have this second, and this second, and this second, and this second, and how are you spending it? You're spending it thinking about how amazing it would be if that hot girl in work would come and ask you out or like ... And you just think, "Oh my G- is this really the best that I've (laughs) got?" Like, you know, it's, it's ... It is. It's, it's a, an interesting blend.
- MCMara Cortona
Mm-hmm. Right.
- CWChris Williamson
If you were to-
- MCMara Cortona
And, um-
- CWChris Williamson
Go on.
- MCMara Cortona
I was just gonna ... I'm not sure if you're familiar with, um, Integral Theory and Ken Wilber's work much, but the way that he relates to that is, um, to transcend and include so we're not int- we're not attempting to move beyond our base instincts. We're attempting ... We ... They're a part of us, and everything that's a part of us serves a function, so we wanna transcend it and include it and bring it up to that next level of consciousness, but not like reject it. So yeah, these, these really base sort of things that we deal with like, uh, like jealousy or, um, insecurity, uh, hubris, lust. Like, they seem so pointless, um, and yet they're a part of what, they're a part of what drives us. Like, a lot of ... I would, I would bet that a lot of people working, um, at the forefront of some of these fields and having a really big impact, um, at some level, you know, one of those motivations is they wanna get laid, and that's a thing, and that like motivates a lot of people to achieve really highly. Um, or they want, you know, they want money, and there's usually not just like one driver for why we do what we do. It's, it's kind of about transcending and including and channeling those drives toward, um, something that's aligned on every level. So, if we can just like bring it out of the shadow (laughs) , like, "Why am I doing this? Why do I want this?" Is it really aligned with what's ultimately gonna serve? Um, as long as we're ... Yeah. I think it's better to, not to like repress it and, um, just trying to force our way into being this like super enlightened being who only cares about the whole interiority of the human race-
- CWChris Williamson
Have you-
- MCMara Cortona
... is likely to be less successful.
- CWChris Williamson
Have you seen the Futurama episode where they create perfectly realistic human sex robots? And everyone-
- MCMara Cortona
Oh, yeah.
- CWChris Williamson
... everything on the planet grinds to a halt. All the scientists stop working. All the bankers stop working. All of the road cleaners, because like their subtext is any- everyone is just doing everything in an effort to get laid. It's like Futurama decided, like Matt Groening how to cra-
- MCMara Cortona
(laughs) .
- CWChris Williamson
... how to crack at that. (laughs) It's like-
- MCMara Cortona
Right.
- NANarrator
That's funny.
- CWChris Williamson
It's the same as every Rick and Morty episode, like the painful... The funny thing is the fact that it's true.
- MCMara Cortona
Mm-hmm. Mm-hmm. Right, exactly. Oh, that's hilarious. Yeah, yeah, I definitely, I, I think if any of us are honest about what's driving us, uh, we'll find th- there's a lot of things we won't find 'cause they're too, they're too unconscious, um, and then there's a lot that, that we'll find that maybe feels silly or feels not super, not super aligned with, um, our higher order selves, but they can be brought into alignment.
- CWChris Williamson
But the genes, our genes are bastards. Like, they're so s- like, if you were able to bottle whatever a gene has and then turn it into a spy, you would, it would just be the best in- information agent on the planet. Like, James Bond wouldn't have shit on your genes if you were able to manifest it into a human, because what you think you're doing and why you think you're doing it, you don't have the first idea. The more and more I read about self-deception, the more that I realize that that's true. Um-
- MCMara Cortona
Mm-hmm.
- CWChris Williamson
So I agree.
Episode duration: 1:09:07
Transcript of episode ImcFOF5EX9A