
Joe Rogan Experience #1558 - Tristan Harris
Joe Rogan (host), Tristan Harris (guest), Narrator
Ex-Google ethicist warns: social media is rewiring democracy’s mind
Joe Rogan and Tristan Harris (of *The Social Dilemma* and Center for Humane Technology) unpack how ad-driven platforms like Facebook, YouTube, Twitter, and TikTok compete for attention by exploiting psychological vulnerabilities rather than serving users’ interests.
Harris argues that recommendation algorithms and engagement metrics have effectively taken control of what billions of people see and believe, amplifying outrage, conspiracy theories, polarization, teen mental health issues, and even offline violence worldwide.
They discuss the asymmetry of power between ancient human brains and supercomputer-optimized feeds, the global security risks of information warfare via social media, and parallels to environmental regulation and the abolition of lead and slavery.
While Harris sees the problem as systemic and on the scale of climate change, he believes a global cultural and regulatory shift—plus alternative, humane tech models—can still redirect the trajectory if enough people recognize the manipulation and demand change.
Key Takeaways
Recognize that your feed is engineered, not neutral.
Platforms don’t simply show posts in time order; they rank and recommend content predicted to maximize your engagement, which systematically favors outrage, sensationalism, and rabbit holes over accuracy or nuance.
Assume an asymmetry of power every time you open an app.
You bring an ancient prefrontal cortex and limited willpower, while a supercomputer model of ‘you’ is testing millions of content options to keep you hooked; treating this as a fair contest of self-control is a mistake.
Treat recommendation systems—not just individual posts—as the core harm.
Whether it’s YouTube’s ‘Up Next’ or Facebook’s group suggestions, the engine that auto-selects what you see tends to steer people toward more extreme, engaging material ...
Limit kids’ exposure and move them off ad-driven social platforms in groups.
Because teen social life is locked into Instagram, TikTok, etc. ...
Push for structural changes to the business model, not just more fact-checking.
Even perfect privacy and better moderation won’t fix a system where revenue scales with addiction, polarization, and junk attention; laws and platform rules must change incentives so companies profit more from helping than from hijacking users.
Understand social media as a national and global security risk.
Foreign actors don’t need to invent fake stories; they can cheaply amplify divisive real content and extremist fringes, turning open engagement algorithms into tools for destabilizing elections, fueling violence, and undermining trust.
Practice ‘digital hygiene’ and periodic disconnection.
Simple steps—moving addictive apps off your home screen, using phone time limits, or taking a weekly ‘digital sabbath’—won’t solve the systemic problem, but they reclaim slices of attention and make you less reactive to engineered outrage.
Notable Quotes
“They’re not competing for your data or your money; they’re competing to keep you using the product.”
— Tristan Harris
“Social media is more like this manipulative environment that is tapping into our weaknesses.”
— Tristan Harris
“If you’re not paying for the product, you are the product—but the product is your predictable behavior.”
— Tristan Harris
“We have landed in a world where the things that we are paying attention to are not necessarily the agenda of topics that we would say, in a reflective world, are the most important.”
— Tristan Harris
“In short, Orwell feared that what we fear will ruin us. Huxley feared that what we desire will ruin us.”
— Neil Postman (quoted by Tristan Harris)
Questions Answered in This Episode
If recommendation algorithms are the real driver of harm, what concrete regulatory or design constraints on recommendations would meaningfully reduce polarization without turning into heavy-handed censorship?
How can we realistically shift billions of people off ‘free’ ad-supported platforms when alternatives either cost money or lack the network effects that make current platforms so sticky?
What specific metrics should platforms and regulators adopt to measure ‘societal IQ’ or problem-solving capacity, similar to how lead’s impact was quantified, so that harms can be priced into policy?
Given that foreign actors can simply amplify existing domestic voices, is there any principled way to distinguish between ‘legitimate’ virality and weaponized amplification in an open society?
What might a truly ‘humane’ social platform look like day-to-day for a teenager or an average adult, and how different would their information diet and mental state be compared to today?
Transcript Preview
(dramatic music plays) Joe Rogan podcast, check it out. The Joe Rogan Experience. Train by day, Joe Rogan podcast by night. All day. (rock music plays) Tristan, how are you?
Good. Good to be here.
Good to have you here, man. Um, you were just telling me before we went on air the numbers of The Social Dilemma.
Yeah.
And they're bonkers. So what, say that real quick.
Yeah. Uh, The Social Dilemma was seen by, uh, 38 million households in the first 28 days on Netflix, which I think has broken records. And if you assume, you know, a lot of people are seeing it with their family because parents seeing it with their kids, uh, the issues that are around teen mental health. Uh, so y- if you assume one out of ten families saw it with a few family members, we're in the 40 to 50 million people range, which has just broken records, I think, for Netflix. I think it was the second most popular documentary throughout the month of September.
I think-
Or, of course, of films throughout the month of September.
... it was a really well done documentary. But I think it's one of those documentaries that affirmed a lot of people's worst suspicions about the dangers of social media, and then on top of that, it sort of a-alerted them to what they were already experiencing in their own personal life and, like, highlighted it.
Yeah, I think that's right. I mean, most people were aware. I think it's a thing everyone's been feeling that the feeling you have when you use social media isn't that this thing is just a tool or it's on my side, it is an environment based on manipulation, as we say in the film. And that's really what's changed, that, you know, w- (sighs) I, I remember, you know, I was gonna be working on these issues for something like eight or, eight years or something now.
Could you please tell people who didn't see the documentary-
What it is, yeah.
... what, what your background is-
Yeah.
... and what you, how you got into it?
Yeah, so I, uh, the, you know, the, the film goes back as a, a set of technology insiders. My, my background was as a design ethicist at Google. So I first had a startup company, uh, that we sold to Google, and I landed there through a talent acquisition. And then, um, uh, started, uh, work, about a year into being at Google, uh, made a presentation that was about how essentially technology was holding the human collective psyche in its hands, that we were really controlling the world's psychology. Uh, because every single time people look at their phone, they are basically experiencing thoughts and scrolling through feeds and believing things about the world, this has become the primary meaning-making machine for the world. And that we as Google had a moral responsibility to, uh, you know, hold the collective psyche in a thoughtful, ethical way and not create this sort of race to the bottom of the brain stem attention economy that we now have. Uh, so my background was as a, as a kid I was a magician. We can get into that. Uh, I, uh, studied at a lab at Stanford, uh, called, or studied in a class called the Stanford Persuasive Technology, uh, Class that taught a lot of the engineers at, in Silicon Valley kind of how the mind works, and the co-founders of Instagram were there. And, uh, then later studied behavioral economics and how the mind is sort of influenced. I went into cults and started studying how cults work, and then arrived at Google through this lens of, you know, technology isn't really just this thing that's in our hands. It's more like this manipulative environment that is tapping into our weaknesses, everything from the slot machine rewards to, you know, the way you get tagged in a photo and it sort of manipulates your social validation and approval, these kinds of things.