Joe Rogan Experience #1736 - Tristan Harris & Daniel Schmachtenberger

The Joe Rogan Experience · Jun 27, 2024 · 3h 1m

Joe Rogan (host), Tristan Harris (guest), Daniel Schmachtenberger (guest)

Topics

Engagement-driven algorithms, persuasive technology, and the “race to the bottom of the brainstem”

Social media’s role in polarization, misinformation, and democratic breakdown

Foreign influence, troll farms, and social media as a national security vulnerability

Exponential technologies: AI, GPT-3, deepfakes, CRISPR, drones, and cyberweapons

Authoritarian vs democratic tech models: China, the CCP, and Taiwan’s digital democracy

Possible regulatory and design interventions (identity, virality limits, humane tech)

Human nature, meaning, addiction, and the need for cultural/psychological transformation

Ex-Google ethicist warns: social media is destabilizing civilization itself

Joe Rogan speaks with Tristan Harris (Center for Humane Technology, ex-Google design ethicist) and systems thinker Daniel Schmachtenberger about how social media and exponential technologies are driving polarization, fragility, and potential civilizational collapse.

They argue that engagement-driven algorithms exploit human cognitive weaknesses, radicalize users, and systematically erode trust in institutions, making democratic governance increasingly unworkable while empowering adversarial states and troll farms.

The conversation widens to AI, deepfakes, CRISPR, drones, and cyberwarfare, framing them as “godlike powers” in the hands of Paleolithic brains and medieval institutions, creating twin attractors of decentralized catastrophe or centralized dystopia.

They call for a “third attractor”: using technology to strengthen democracy, social cohesion, and human development (with examples like Taiwan’s digital democracy and Apple’s privacy moves), and for a cultural shift toward digital literacy, wiser design, and collective problem‑solving.

Key Takeaways

Engagement-based algorithms inherently favor outrage, not truth or wellbeing.

Platforms optimize for clicks, shares, and watch time, so they systematically push incendiary, polarizing content. ...

Social media is structurally incompatible with healthy democracy in its current form.

By personalizing realities and amplifying extreme voices, platforms polarize voters, which polarizes elected officials, leading to gridlock and institutional paralysis just as societies face complex crises (pandemics, climate, geopolitics) that require coordinated action.

Foreign and domestic actors are weaponizing platforms at scale.

Troll farms and state adversaries exploit the same engagement dynamics as advertisers: for example, top Christian and minority Facebook pages being run from Eastern Europe, or targeted radicalization of veterans. ...

Exponential tech creates a choice between decentralized catastrophe and centralized dystopia.

AI, automated code generation, CRISPR, cheap drones, and cyberweapons put “godlike powers” in ever more hands. ...

Design changes can reduce harm without banning speech.

Harris highlights simple but powerful interventions—like limiting frictionless reshare chains, adding identity layers, or reorienting recommendation systems away from virality—showing we can curb irresponsible reach while preserving free expression, if platforms and regulators prioritize it.

There are concrete models for “technology that strengthens democracy.”

Taiwan’s use of digital tools (e. ...

Individual behavior matters but systemic redesign is indispensable.

Rogan’s personal strategy (no comments, limited engagement) improves his own mental health, but Harris and Schmachtenberger argue that without changes to business models, incentives, and governance, most people can’t realistically resist highly optimized attention-extraction systems.

Notable Quotes

We have Paleolithic emotions, medieval institutions, and godlike technology.

Tristan Harris (quoting E.O. Wilson and extending the idea)

Facebook isn’t trying to polarize the population. It’s an externality of optimizing ad revenue.

Daniel Schmachtenberger

If I don’t do the unethical thing to capture attention, my competitor will.

Tristan Harris

If you’ve got an F-35, I don’t need one. I’ve got Facebook.

Tristan Harris

Either we centralize the power and get dystopias, or it’s decentralized and we get catastrophes. We need a third attractor.

Daniel Schmachtenberger

Questions Answered in This Episode

How could social media recommendation systems be redesigned to prioritize consensus-building and nuance over outrage and virality without collapsing their business models?

What practical governance structures or institutions could embody the “third attractor” between decentralized catastrophe and centralized tech-enabled authoritarianism?

How might widespread digital literacy and “attention hygiene” realistically be cultivated in a population that is already deeply addicted and economically stressed?

What should democratic societies learn—and not learn—from China’s aggressive interventions in youth screen time, gaming, and TikTok-equivalent content?

Given the rapid advances in AI-generated text, code, and media, how can we preserve any shared sense of reality when we can no longer reliably know if content—or even other users—is human and authentic?

Transcript Preview

Narrator

(drumbeats) Joe Rogan podcast, check it out.

Narrator

The Joe Rogan Experience.

Joe Rogan

Train by day, Joe Rogan podcast by night, all day. (instrumental music plays) Gentlemen, thank you for being here. Why don't you let's in- let's, I, I keep doing these podcasts where I just talk to people, so please introduce yourself and tell people what you do.

Tristan Harris

(smacks lips) Uh, I am Tristan Harris and, uh, came on this show about a year ago after The Social Dilemma came out. That's probably where most people know me. Um, and, uh, used to be a design ethicist at Google studying how do you ethically influence people's attention and thoughts and behaviors. Uh, and, uh, really enjoyed the conversation last year. The reason that today I'm here with, um, Daniel Schmachtenberger, who's really, uh, a person I've learned so much from the last few years and why I thought it'd be a good through line, um, is that the issues of social media, which I know we're gonna talk about today, are connected to a number of other issues that are going wrong in society that are all kind of interconnected. And I've learned a tremendous amount from Daniel, and I thought I'd help really clarify some of these, these issues for everyone.

Joe Rogan

Well, thank you, Daniel. Thanks for coming aboard.

Daniel Schmachtenberger

Thanks for having me here.

Joe Rogan

W- what a daunting task, how to ethically influence people.

Tristan Harris

Hmm.

Joe Rogan

And what a weird thing that this industry that didn't exist 20 years ago has such a... I mean, think about life on Earth.

Tristan Harris

Mm-hmm.

Joe Rogan

And then 20 years ago, all of a sudden, this social media thing sort of e- evolves, and now you have to wonder how much of an effect it has on our just day-to-day lives and how to ethically influence people.

Tristan Harris

Yeah.

Joe Rogan

What does that... what the fuck does that even mean?

Tristan Harris

(laughs) Well, first of all, I should say-

Joe Rogan

How does that... how do those thoughts even get, you know... how, how does that get worked out?

Tristan Harris

Actually, I should first say that there wasn't, at Google, a department that said, "How do we ethically influence people?" I actually sort of, um, as was shown in that... in the film, The Social Dilemma, wrote this presentation worried about how technology was influencing-

Joe Rogan

Right.

Tristan Harris

... people's thoughts, concerns, behaviors, et cetera. And I studied persuasive technology at Stanford, which is a whole discipline and field, the idea that technology can influence people. And it was out of my own personal concern that when that presentation went viral at Google, um, I, uh, kind of worked my way into this position that never existed before, which was, h- how could we create a framework for what it means to ethically influence other people? And a lot of that has to do with asymmetries of power. I mean, when I was a kid, I was a magician. We talked about this before. Um, magic is about an asymmetric relationship. The magician knows something about your mind that you don't know about your own mind. That's what makes the trick work. And actually, across some of these things we're going to talk about today are ways that there is an asymmetric relationship between what technology knows about us and what we don't know about ourselves. That's-
