The Joe Rogan Experience #1792 - Daryl Davis & Bill Ottman
At a glance
WHAT IT’S REALLY ABOUT
Can Open Social Media and Dialogue Really De-Radicalize Extremists Online?
- Joe Rogan speaks with Daryl Davis and Minds.com CEO Bill Ottman about free speech, online censorship, and how to reduce extremism without banning people.
- Ottman argues that open-source, privacy-focused, and decentralized platforms like Minds can be healthier alternatives to big tech’s opaque algorithms and aggressive moderation.
- Daryl Davis explains his method of patiently engaging with Ku Klux Klan members and other extremists, showing how respectful dialogue and alternative perspectives can lead hundreds to abandon hate groups.
- They also explore the unintended radicalizing effects of deplatforming, the role of big tech ideology, digital privacy concerns, and how future tools could crowdsource moderation and credibility.
IDEAS WORTH REMEMBERING
Deplatforming often hardens beliefs and can increase radicalization.
Ottman and Davis reference empirical research, including their paper “The Censorship Effect,” showing that banning users tends to increase their certainty in extreme beliefs and drive them into more insular, radical echo chambers.
Open-source code and transparent algorithms are essential for trustworthy platforms.
Ottman argues that any "alternative" social network that doesn't publish its source code and algorithms can't be trusted on its free speech claims, because users have no way to verify whether shadow banning, spyware, or hidden manipulation is occurring.
Free speech plus more *good* information works better than suppression.
Daryl Davis maintains that bad ideas should be countered with better ideas and accurate information, not silence; suppressing people only sends them to more extreme spaces where they’re never challenged productively.
Respectful, long-term dialogue can genuinely de-radicalize extremists.
Davis explains that he’s helped over 200 KKK members and neo-Nazis leave extremism by listening first, not attacking their ‘reality,’ and then offering better alternative perceptions that allow them to change their own minds.
Big tech’s moderation is shaped by internal ideology and virtue signaling.
They discuss how top executives feel pressure to appear ‘responsible’ and ‘woke,’ leading to overbroad definitions of harm that can include dissenting views on politics, pharma, or COVID, while ignoring data showing censorship backfires.
WORDS WORTH SAVING
A missed opportunity for dialogue is a missed opportunity for conflict resolution.
— Daryl Davis
You cannot change someone’s mind if you do not platform them.
— Joe Rogan
Any app, if they’re claiming to be an alternative and they’re not open source, they should not be taken seriously.
— Bill Ottman
What can be learned can be unlearned.
— Daryl Davis
They think their lawyers are better at drafting healthy conversation than the First Amendment, and that’s just not true.
— Bill Ottman
High-quality AI-generated summary created from a speaker-labeled transcript.