
Yuval Noah Harari: An Urgent Warning They Hope You Ignore. More War Is Coming!
Yuval Noah Harari (guest), Steven Bartlett (host)
In this episode of The Diary of a CEO, host Steven Bartlett speaks with historian Yuval Noah Harari about escalating wars, runaway AI development, and the technologies that could transform or end Homo sapiens.
Yuval Noah Harari Warns: AI, War, And Humanity’s Final Transformation
Yuval Noah Harari argues that humanity has entered a new era marked by escalating wars, runaway AI development, and powerful bioengineering technologies that could end Homo sapiens as we know it. Our unique power—collective belief in shared fictions—now risks making us powerless, as algorithms begin to make crucial economic, political, and personal decisions. He warns that AI is fundamentally different from past technologies because it can independently make decisions and generate ideas, creating systemic risks from finance to warfare to intimate relationships. Yet he insists catastrophe is not inevitable: humans still have agency and must rebuild a global order, slow dangerous bioengineering, regulate AI, and cultivate inner mental resilience and critical thinking.
Key Takeaways
Learn to distinguish human-made fictions from underlying reality.
Harari stresses that money, corporations, nations, and even many conflicts are built on shared stories rather than physical scarcity. ...
Treat AI as an independent decision-maker, not just a neutral tool.
Unlike the printing press or steam engines, AI can decide what to show, what to prioritize, and can generate original content (music, manifestos, financial products) without direct human instructions. ...
Beware of AI-driven financial complexity that no human can understand.
Harari outlines a plausible near future where AI designs financial instruments and runs markets at such speed and complexity that not a single human truly grasps the system. ...
Slow down bioengineering ‘upgrades’ that could deepen inequality and backfire.
Genetic enhancements and brain–computer interfaces may first be accessible only to a wealthy elite, turning economic inequality into biological inequality and potentially splitting humanity into ‘superhumans’ and everyone else. ...
Recognize how hackable you are—and drop the naïve belief in inviolable ‘free will.’
Harari calls humans ‘hackable animals’: our choices are strongly shaped by cultural, biological, and psychological factors that can be modeled and exploited. ...
Guard your attention and intimacy from AI systems designed to exploit them.
The ‘battle for attention’ is evolving into a battle for intimacy: AIs are being trained to simulate caring relationships because intimate, trusted agents are the most powerful persuaders. ...
Rebuild a global rules-based order or face escalating wars and militarization.
The post–Cold War liberal order, with all its flaws, coincided with the most peaceful era in human history and kept defense spending relatively low versus health and education. ...
Notable Quotes
“We humans should get used to the idea that we are no longer mysterious souls. We are now hackable animals.”
— Yuval Noah Harari
“AI is nothing like print. It's nothing like the Industrial Revolution of the 19th century. It's far, far bigger.”
— Yuval Noah Harari
“If we get a C-minus again in the 21st century, that's the end of us.”
— Yuval Noah Harari
“We are now in a new era of wars, and unless we re-establish order fast, then we are doomed.”
— Yuval Noah Harari
“An organism that is excited all the time dies.”
— Yuval Noah Harari
Questions Answered in This Episode
You argue that most wars are driven by conflicting stories rather than material scarcity. In practice, how could opposing sides in an active conflict begin constructing a genuinely shared narrative without one side feeling it has surrendered its identity?
You suggest we must keep the financial system at a complexity humans can understand. What concrete regulatory thresholds or tests would you design to ensure ‘human intelligibility’ of key financial algorithms and instruments?
When you say that reckless bioengineering ‘upgrades’ might actually be downgrades, what kinds of long-term longitudinal studies or ethical veto mechanisms would you want in place before any cognitive or emotional enhancement is allowed on children?
If intimacy becomes a primary vector of AI-driven manipulation, what specific design constraints or rights (e.g., ‘right not to be simulated’) would you advocate for in regulating companion bots and emotionally tuned chat systems?
You’ve called for rebuilding a global order based on universal norms while rejecting a world government. Given rising nationalism and distrust of multilateral institutions, what realistic pathway do you see for re-legitimizing and strengthening global cooperation in the next decade?
Transcript Preview
We are now in a new era of wars (instrumental music plays). And unless we re-establish order fast, then we are doomed. Yuval Noah Harari. One of the brightest minds on Planet Earth. Historian, a best-selling author of some of the most influential non-fiction books in the world today. (instrumental music plays) I think we are very near the end of our species, because people often spend so much effort trying to gain something without understanding the consequences. For example, we will get to a life where you can live indefinitely. But realizing that you have a chance to live forever, yet if there is an accident you die, the people who will be in that situation will be at a level of anxiety and terror unlike anything that we know. Then you have artificial intelligence, and the world is not ready for it. It's the first technology in history that can make decisions by itself and take power away from us, hack human beings, manipulate our behavior, and make all these decisions for us or about us: whether to give you a loan, whether to give you a mortgage. Dating apps shaping your romantic life. But the real problem is that, increasingly, the humans at the top could be puppets, while the most consequential decisions are made by algorithms. Global financial decisions. Wars. And this is extremely dangerous, but it's not inevitable. Humans can change it.
But with what's to come, are you optimistic about the future?
I'm very worried about two things. First of all...
Quick one. This is really, really fascinating to me. On the backend of our YouTube channel, it says that 69.9% of you that watch this channel frequently over the lifetime of this channel haven't yet hit the subscribe button. I just wanted to ask you a favor. It helps this channel so much if you choose to subscribe. It helps us scale the guests, helps us scale the production, and it makes the show bigger. So if I could ask you for one favor, if you've watched this show before and you've enjoyed it and you like this episode that you're currently watching, could you please hit the subscribe button. Thank you so much. And I will repay that gesture by making sure that everything we do here gets better and better and better and better. That is a promise I'm willing to make you. Do we have a deal? (instrumental music plays) Yuval, I have three of your books here.
Mm-hmm.
And these are three books that sent a huge tidal wave, a ripple, through society. With these books, and with all of the work that you're doing now, with the lectures you give, the interviews you give, what is your mission? If I was to summarize what your collective mission is-
Mm-hmm.
... with your work, what is that?