A History Of Existential Risk - Thomas Moynihan | Modern Wisdom Podcast 306

Modern Wisdom · Apr 10, 2021 · 1h 24m

Thomas Moynihan (guest), Chris Williamson (host), Narrator

- The intellectual history of extinction and existential risk
- Difference between apocalypse (religious end-times) and scientific extinction
- Philosophical arguments for the moral importance of safeguarding the future (Parfit, Bostrom, Ord)
- Technology as both existential threat and necessary protection
- Psychological biases: scope neglect, wishful thinking, denial of death
- Space colonization, astronomical waste, and the opportunity cost of delay
- Modern cultural attitudes toward humanity, progress, and the Enlightenment project

How Humanity Awoke To Its Own Extinction — And What’s Next

Thomas Moynihan and Chris Williamson explore the intellectual history of existential risk: how humans gradually came to understand that our species could irreversibly disappear.

Moynihan contrasts ancient apocalyptic thinking—where endings fulfill a divine moral order—with the modern notion of extinction as the permanent loss of all future value and potential.

Drawing on thinkers like Parfit, Bostrom, and Ord, they argue that recognizing existential risk is a hard‑won, recent achievement that should make us both cautious with technology and hopeful about our capacity for moral progress.

They discuss natural and technological threats, space colonization, psychological barriers to caring about long‑term risk, and why this era’s work on x‑risk ethics may be a pivotal inflection point in human history.

Key Takeaways

Recognizing existential risk is a recent, major intellectual breakthrough.

For most of history, people assumed species, values, or civilizations would always return or exist elsewhere; only in the last few centuries did we grasp that humanity and its values could be irreversibly lost.

Extinction is morally far worse than even near‑total catastrophe.

Following Derek Parfit, the conversation emphasizes that the key difference is not between peace and 95% mortality, but between 95% and 100%—because total extinction forecloses all future generations and all unrealized value.

Technology is simultaneously the main risk source and the only long‑term safeguard.

Advanced technologies can generate “black ball” risks (e.g. …).

Our moral and philosophical tools lag behind our technological power.

Applied ethics and secular moral philosophy are comparatively young fields; developing better ethical frameworks (e.g. …).

Ancient apocalyptic visions are not the same as modern extinction risk.

Religious apocalypses depict a morally meaningful consummation (judgment, completion), whereas scientific extinction describes the senseless, permanent ending of meaning and moral progress within an indifferent universe.

Humanity’s unique capacity for error correction is a core source of hope.

Humans are the only known beings that revise their beliefs and ethics based on argument and evidence; that same capacity which revealed existential risk can also be used to mitigate it and improve the future.

Historical perspective counters despair and anthropophobic pessimism.

Seeing how wrong even great thinkers were about extinction and how much conceptual progress we’ve made helps balance current focus on injustices with appreciation of real, hard‑won advances in knowledge and ethics.

Notable Quotes

Extinction is the rule. Survival’s the exception.

Thomas Moynihan

The ability to grasp the prospect of our own extinction is a significant intellectual achievement.

Chris Williamson (paraphrasing Moynihan’s point)

Apocalypse supplies a sense of an ending, whereas extinction anticipates the ending of sense.

Thomas Moynihan

We’re the only animal that’s ever corrected itself.

Thomas Moynihan

If I’m hurtling towards a cliff edge, I want to know where that cliff edge is, rather than just wishfully thinking, ‘I’ll be fine.’

Thomas Moynihan

Questions Answered in This Episode

If accepting our vulnerability to extinction is such a recent achievement, what other major blind spots might our current worldview still contain?

How should policymakers practically balance the need to slow dangerous technologies with the need to develop protective ones and avoid astronomical opportunity costs?

What cultural or educational changes would most effectively overcome scope neglect and make people care about safeguarding the far future?

Could an intense focus on existential risk inadvertently justify harmful short‑term trade‑offs, and how can we guard against that “fanaticism” risk?

If we discovered advanced extraterrestrial civilizations tomorrow, how would that change the moral calculus around human extinction and long‑termism?

Transcript Preview

Thomas Moynihan

99.9% of all species that have ever existed are now extinct. So it's, extinction is the rule. Survival's the exception. That's an important thing to know is that, you know, potentially we have come close before. This isn't something that is completely, uh, unprecedented.

Chris Williamson

If we're worried about existential risks annihilating our future, why spend any time studying the past?

Thomas Moynihan

(laughs) That's a good question. Uh, so I, I, I hope as we talk through this, uh, that the true, uh, significance of what I'm about to say will, um, be elaborated further. But, um, I think that it's so easy to focus on, uh, the risks coming towards us, coming down the track. Um, and it's slightly harder to take stock and look backwards and see just how far we've come. One of the things I mean by that is that, uh, the very ability for us to even be able to see those risks ahead, uh, the risks on the horizon, uh, that's a massive achievement, uh, for, for humanity, um, for, for our knowledge of the world, for our knowledge of what is best to do within the world. Um, that's a massive achievement. And again, I hope that, you know, as we speak through this, uh, you know, um, the truth of this might, uh, hopefully kind of, uh, unfurl. Um, but, you know, some of our biggest achievements are almost in- invisible to us. Um, some of, you know, uh, some of the most profound, um, uh, breakthroughs of human knowledge are often invisible to us. Um, so, you know, I often point towards the fact, um, of, uh, well, take slavery, for example. For most, for the majority of human history, uh, people presumed that it was just part of the natural order of things. Uh, you know, um, it wasn't questioned. Uh, all of us these days kind of take it for granted that, uh, that's, you know, inherently wrong. Um, a- another example I like to use, uh, is, um, perspective, right? So, you know, think back to being a kid in school. Um, you'd learn to draw your first cube or your first, uh, uh, prism or, you know, triangle. Uh, pyramid, sorry. Um, it's so easy. It comes to you so naturally. Uh, rewind, you know, uh, six centuries, seven centuries, uh, it wouldn't have come naturally at all. Uh, you know, I was drawing cubes, I don't know what age, but pretty young. And that's not because I'm a genius or a prodigy or some da Vinci tier, uh, you know, mega genius. 
It's because of, uh, cultural osmosis, because the ideas that we just take for granted, we inherit. But someone had to come up with them and... Well, often lots of people had to come up with them. And it takes, uh, you know, centuries, decades of effort, of hard work and error correction, uh, of finding out the ways in which we are so severely wrong about the world. Um, and yeah, so to tie up my point, um, thinking, being able to even notice these risks, uh, the risks facing humanity or just the fact of how bad, uh, human extinction would be, um, those are really, uh, huge achievements. And they're quite modern ones as well. Uh, so yeah, I, I would say, I would put it like this. It's, it's a, it's a, uh, it's a cure for despondency, um, because like I said, it's easy to see the risks ahead, uh, harder to see how far we've come. So it's easy to be despondent. It's easy to despair. Um, but it's deceptively easy, because we have that kind of bias where it's easier to look straight ahead rather than look to the past.
