Modern Wisdom

How To Avoid Destroying Humanity - Rob Reid | Modern Wisdom Podcast 346

Rob Reid is an entrepreneur, podcaster and an author. The last 15 months have been a terrifying taster of just what a global crisis is like, except it wasn't lethal enough to be a threat to our long-term survival - but just because this one wasn't, doesn't mean that more deadly existential risks aren't out there. Expect to learn how synthetic biology might be the biggest risk to our survival, what we should have learned from 2020, whether Artificial General Intelligence is an immediate threat, Rob's opinion on my solution for saving civilisation, whether we should totally stop all technological development, if synbio is preventable, how we can avoid civilisation's destruction through nuclear bombs and much more...

Sponsors:
Get 20% discount on all pillows at https://thehybridpillow.com (use code: MW20)
Get perfect teeth 70% cheaper than other invisible aligners from DW Aligners at http://dwaligners.co.uk/modernwisdom

Extra Stuff:
Check out Rob's Podcast - https://after-on.com
Follow Rob on Twitter - https://twitter.com/rob_reid?lang=en
Get my free Ultimate Life Hacks List to 10x your daily productivity → https://chriswillx.com/lifehacks/
To support me on Patreon (thank you): https://www.patreon.com/modernwisdom

#existentialrisk #syntheticbiology #pandemic

00:00 Intro
01:19 The Thrill of Existential Risk
07:00 Why is Climate Change our Focus?
10:19 Humanity's Close Calls
20:31 Democratising the Apocalypse
30:46 The Threat of Covid-19 and Pandemics
54:00 Is the Research Worth the Risk?
1:02:00 Would Moon Labs Reduce Risk?
1:08:18 Helpful Lessons from Covid-19
1:16:45 What if China Leaked Covid-19?
1:22:15 How to Prevent Destroying Humanity
1:37:47 Creating Silo Communities
1:46:54 Lesser-known Existential Risks
1:51:17 Making Existential Risk Sexier
2:01:50 How Can Individuals Help?
2:08:07 What's Next for Rob?

Listen to all episodes online.
Search "Modern Wisdom" on any Podcast App or click here:
Apple Podcasts: https://apple.co/2MNqIgw
Spotify: https://spoti.fi/2LSimPn
Stitcher: https://www.stitcher.com/podcast/modern-wisdom

Get in touch in the comments below or head to...
Instagram: https://www.instagram.com/chriswillx
Twitter: https://www.twitter.com/chriswillx
Email: https://chriswillx.com/contact/

Rob Reid (guest) · Chris Williamson (host)
Jul 14, 2021 · 2h 11m

At a glance

WHAT IT’S REALLY ABOUT

Rob Reid Warns: Democratized Technology Is Making Doomsday Alarmingly Easy

  1. Rob Reid and Chris Williamson explore existential risks, focusing on how modern technologies like nuclear weapons, synthetic biology, and AI have radically increased humanity’s ability to self-destruct in just the last several decades.
  2. Reid argues that risk is shifting from a few tightly controlled state actors to thousands of private individuals and labs, a trend he calls the “privatization” or “democratization” of the apocalypse.
  3. They review historical near-misses (nuclear close calls, anthrax mailings, BSL-4 lab leaks) and use COVID-19 as both a warning shot and a test case for how poorly prepared the world is for engineered pandemics.
  4. The conversation ends with practical ideas: banning gain-of-function research, building global DNA-screening standards into lab tools, investing heavily in universal vaccines, reshaping culture through stories, and even creating an isolated backup human community.

IDEAS WORTH REMEMBERING

5 ideas

Ban gain-of-function research that makes pathogens more dangerous.

Reid argues the benefits of gain-of-function on lethal viruses (e.g., H5N1 made airborne in ferrets) are marginal and speculative, while the downside risk of a leak is civilization-toppling; a global treaty to prohibit such work would cheaply remove a significant chunk of bio-risk.

Mandatory DNA screening infrastructure must be built into all synthesis providers and benchtop printers.

Today, an industry group (IGSC) voluntarily screens orders for dangerous sequences, but coverage is incomplete and non-binding; making this screening universal, legally required, and embedded in future desktop DNA printers would sharply reduce the number of actors able to access doomsday genomes.

Invest aggressively in pan-familial (universal) vaccines against major virus families.

Researchers believe that for relatively small sums (on the order of billions globally over a decade) we could likely develop “universal” vaccines for influenza, coronaviruses, and other lethal families, massively reducing both routine disease burden and catastrophic pandemic risk, yet governments have not moved at scale even post-COVID.

Treat doomsday tech in private hands as a structural incentive problem, not just a morality problem.

Whereas nuclear risk was centralized in cautious states with no economic upside for pushing the edge, private AI labs and synbio groups face huge financial incentives to take "tiny" global risks for large personal gains, mirroring the 2008 crisis pattern of privatized gains and socialized losses.

Use storytelling and popular culture to normalize existential risk thinking.

Fiction like 1984, WarGames, and The Terminator shaped public attitudes toward totalitarianism and nuclear or AI risk; Reid stresses that novels, films, and series about synbio and AI gone wrong can shift the Overton window far more effectively than technical papers alone.

WORDS WORTH SAVING

5 quotes

We spent trillions of dollars preventing two people from hitting the flashing red button. Soon we’ll be relying on thousands of people not to screw up.

Rob Reid

Bringing a virus into existence that doesn't currently exist, in an effort to inoculate us from the chance that it might come into existence, is stark raving mad.

Chris Williamson

All labs leak. All labs at every biosafety level can absolutely leak, and particularly if we get malicious actors in there.

Rob Reid

COVID is a very, very difficult warning shot to miss. The whole world has been traumatized by this… The question is, will it be adequate attention, will it be sustained attention, and will it be intelligent attention?

Rob Reid

We weren’t trained on the savanna to think, ‘If I screw up, all humans die.’

Rob Reid

- The recent emergence and psychology of existential risk awareness
- Historical nuclear near-misses and Cold War deterrence as a "public good"
- Synthetic biology risks, gain-of-function research, and lab leaks
- Privatization/democratization of doomsday technologies and misaligned incentives
- Pandemics, COVID-19 as a warning shot, and universal vaccine strategies
- Superintelligent AI risk and private-sector economic pressures
- Cultural, political, and educational levers for building global resilience (including backup communities and narrative/fictional influence)

High quality AI-generated summary created from speaker-labeled transcript.
