The Diary of a CEO

Yuval Noah Harari: An Urgent Warning They Hope You Ignore. More War Is Coming!

If you enjoy hearing about the potential impact of AI on humanity, I recommend you check out my conversation with ex-Google officer Mo Gawdat, which you can find here: https://www.youtube.com/watch?v=bk-nQ7HF6k4

Timestamps:
0:00 Intro
2:15 What Is Your Mission, and What Is Your Warning to the World?
07:58 Is This the End of Humanity as We Know It?
12:29 Connecting Computers to Human Brains
15:11 What Are Your Concerns About AI?
27:35 The Dangers of AI to the Financial System and Governments
37:56 Do Humans Have Free Will, and Will AI Take It from Us?
45:41 The Problems of AI Forging Relationships with Humans
52:42 Are We Happy?
55:42 Fighting Immortality and Its Consequences
01:00:00 Will Bioengineering Create Different Social Classes and Types of People?
01:06:30 Will AI Take Over Our Jobs?
01:12:06 What Should We Teach Our Children to Be Prepared for the Future?
01:14:29 We're Entering a Scary New Era
01:19:31 What Should We Do to Stop/Change the Trajectory We're Heading Towards?
01:22:05 The Importance of Disconnecting from Information
01:27:10 What Media Corporations Want from You
01:30:16 We Need More Boredom in the World and Politics
01:36:49 Is There Hope for Humanity?
01:40:07 The Importance of History for Our Future
01:43:34 The Last Guest Question

You can pre-order the 10th anniversary edition of 'Sapiens' here: https://bit.ly/48JVQ6c

Follow Yuval:
Twitter: https://bit.ly/3HdUxR7
Instagram: https://bit.ly/41WLbCT
YouTube: https://bit.ly/3vyAwm0

Flight Fund: https://flightfund.com/
Get tickets to The Business & Life Speaking Tour: https://stevenbartlett.com/tour/

Follow Steven:
Instagram: https://www.instagram.com/steven/
Twitter: https://x.com/StevenBartlett?s=20
LinkedIn: https://www.linkedin.com/in/steven-bartlett-56986834/

Sponsors:
Huel: https://try.huel.com/steven-bartlett
Eight Sleep: https://www.eightsleep.com/uk/steven/ (code: STEVEN)

Yuval Noah Harari (guest) · Steven Bartlett (host)
Jan 10, 2024 · 1h 46m · Watch on YouTube ↗

At a glance

WHAT IT’S REALLY ABOUT

Yuval Noah Harari Warns: AI, War, And Humanity’s Final Transformation

Yuval Noah Harari argues that humanity has entered a new era marked by escalating wars, runaway AI development, and powerful bioengineering technologies that could end Homo sapiens as we know it. Our unique power—collective belief in shared fictions—now risks making us powerless, as algorithms begin to make crucial economic, political, and personal decisions. He warns that AI is fundamentally different from past technologies because it can independently make decisions and generate ideas, creating systemic risks that span finance, warfare, and even intimate relationships. Yet he insists catastrophe is not inevitable: humans still have agency, and we must rebuild the global order, slow dangerous bioengineering, regulate AI, and cultivate inner mental resilience and critical thinking.

IDEAS WORTH REMEMBERING

5 ideas

Learn to distinguish human-made fictions from underlying reality.

Harari stresses that money, corporations, nations, and even many conflicts are built on shared stories rather than physical scarcity. Our superpower—creating and believing in fictions—enables large-scale cooperation but also drives wars when groups cannot agree on a common story (e.g., Israel–Palestine). Practically, this means training yourself to ask: “Is this a physical constraint, or just a story we’re all following?” That question is central to avoiding manipulation by ideologies, media narratives, and economic myths.

Treat AI as an independent decision-maker, not just a neutral tool.

Unlike the printing press or steam engines, AI can decide what to show, what to prioritize, and can generate original content (music, manifestos, financial products) without direct human instructions. Algorithms already shape loan approvals, hiring, sentencing recommendations, dating, and social media feeds. Individuals and institutions should demand transparency, retain meaningful human oversight, and resist delegating critical decisions (finance, warfare, justice) to systems whose reasoning no human fully understands.

Beware of AI-driven financial complexity that no human can understand.

Harari outlines a plausible near future where AI designs financial instruments and runs markets at such speed and complexity that not a single human truly grasps the system. This would leave elected leaders and regulators dependent on algorithms for advice in crises, undermining democratic control. Policymakers should proactively cap complexity, mandate human-comprehensible models, and start treating ‘human intelligibility’ of core financial infrastructure as a safety requirement, much like physical safety in nuclear plants.

Slow down bioengineering ‘upgrades’ that could deepen inequality and backfire.

Genetic enhancements and brain–computer interfaces may first be accessible only to a wealthy elite, turning economic inequality into biological inequality and potentially splitting humanity into ‘superhumans’ and everyone else. Harari also warns that what’s sold as an upgrade (e.g., more intelligence) might come with unseen tradeoffs (less compassion, less spiritual depth). He argues societies should deliberately move slowly, regulate enhancements, and prioritize safety and ethics over speed—unlike an AI arms race where strategic pressure is much harder to resist.

Recognize how hackable you are—and drop the naïve belief in inviolable ‘free will.’

Harari calls humans ‘hackable animals’: our choices are strongly shaped by cultural, biological, and psychological factors that can be modeled and exploited. People who cling to mystical notions of free will (“no one can manipulate me”) are easiest to manipulate. Actionably, this means engaging in self-inquiry (therapy, meditation, critical reflection) to understand your own triggers and patterns, and being suspicious of systems—social media, political campaigns, personalized feeds—that profit from steering your behavior.

WORDS WORTH SAVING

5 quotes

We humans should get used to the idea that we are no longer mysterious souls. We are now hackable animals.

Yuval Noah Harari

AI is nothing like print. It's nothing like the Industrial Revolution of the 19th century. It's far, far bigger.

Yuval Noah Harari

If we get a C-minus again in the 21st century, that's the end of us.

Yuval Noah Harari

We are now in a new era of wars, and unless we re-establish order fast, then we are doomed.

Yuval Noah Harari

An organism that is excited all the time dies.

Yuval Noah Harari

Topics:
Human fictions, cooperation, and the roots of conflict and war
AI as a decision-making agent and systemic global risk
Bioengineering, cyborgs, and the possible end/transformation of Homo sapiens
Immortality, happiness, and the manipulation of the inner world
Economic inequality, financial systems, and algorithmic control
Work, education, and employment in the age of AI
Global order, rising wars, and the need for renewed international cooperation

High quality AI-generated summary created from speaker-labeled transcript.
