The Joe Rogan Experience

Joe Rogan Experience #1558 - Tristan Harris

Called "the closest thing Silicon Valley has to a conscience" by The Atlantic, Tristan Harris spent three years as a Google Design Ethicist developing a framework for how technology should "ethically" steer the thoughts and actions of billions of people from screens. He is now co-founder and president of the Center for Humane Technology, whose mission is to reverse "human downgrading" and realign technology with humanity. He also co-hosts the Center for Humane Technology's Your Undivided Attention podcast with co-founder Aza Raskin.

Host: Joe Rogan · Guest: Tristan Harris
Oct 29, 2020 · 2h 21m

At a glance

WHAT IT’S REALLY ABOUT

Ex-Google ethicist warns: social media is rewiring democracy’s mind

  1. Joe Rogan and Tristan Harris (of *The Social Dilemma* and Center for Humane Technology) unpack how ad-driven platforms like Facebook, YouTube, Twitter, and TikTok compete for attention by exploiting psychological vulnerabilities rather than serving users’ interests.
  2. Harris argues that recommendation algorithms and engagement metrics have effectively taken control of what billions of people see and believe, amplifying outrage, conspiracy theories, polarization, teen mental health issues, and even offline violence worldwide.
  3. They discuss the asymmetry of power between ancient human brains and supercomputer-optimized feeds, the global security risks of information warfare via social media, and parallels to environmental regulation and the abolition of lead and slavery.
  4. While Harris sees the problem as systemic and on the scale of climate change, he believes a global cultural and regulatory shift—plus alternative, humane tech models—can still redirect the trajectory if enough people recognize the manipulation and demand change.

IDEAS WORTH REMEMBERING

5 ideas

Recognize that your feed is engineered, not neutral.

Platforms don’t simply show posts in time order; they rank and recommend content predicted to maximize your engagement, which systematically favors outrage, sensationalism, and rabbit holes over accuracy or nuance.

Assume an asymmetry of power every time you open an app.

You bring an ancient prefrontal cortex and limited willpower, while a supercomputer model of ‘you’ is testing millions of content options to keep you hooked; treating this as a fair contest of self-control is a mistake.

Treat recommendation systems—not just individual posts—as the core harm.

Whether it’s YouTube’s ‘Up Next’ or Facebook’s group suggestions, the engine that auto-selects what you see tends to steer people toward more extreme, engaging material (e.g., diet to anorexia, news to conspiracies), regardless of truth or wellbeing.

Limit kids’ exposure and move them off ad-driven social platforms in groups.

Because teen social life is locked into Instagram, TikTok, etc., individual deletions are socially costly; coordinated ‘group migrations’ (classes, teams, families) to tools like Signal, iMessage, or non-addictive apps are far more effective.

Push for structural changes to the business model, not just more fact-checking.

Even perfect privacy and better moderation won’t fix a system where revenue scales with addiction, polarization, and junk attention; laws and platform rules must change incentives so companies profit more from helping than from hijacking users.

WORDS WORTH SAVING

5 quotes

They’re not competing for your data or your money; they’re competing to keep you using the product.

Tristan Harris

Social media is more like this manipulative environment that is tapping into our weaknesses.

Tristan Harris

If you’re not paying for the product, you are the product—but the product is your predictable behavior.

Tristan Harris

We have landed in a world where the things that we are paying attention to are not necessarily the agenda of topics that we would say, in a reflective world, are the most important.

Tristan Harris

In short, Orwell feared that what we fear will ruin us. Huxley feared that what we desire will ruin us.

Neil Postman (quoted by Tristan Harris)

The Social Dilemma's impact and Tristan Harris's background in persuasive tech and ethics
Attention economy: how social platforms compete for and monetize human attention
Algorithmic recommendations, engagement feeds, and the amplification of extremism and conspiracies
Psychological manipulation: voodoo-doll user models, negativity bias, social validation, addiction
Global consequences: polarization, teen mental health, genocide, and foreign information warfare
Regulation, incentives, and analogies to environmental protection and lead abolition
Possible solutions: humane technology, Apple's role, alternative platforms, and digital norms

High quality AI-generated summary created from speaker-labeled transcript.
