Joe Rogan Experience #1792 - Daryl Davis & Bill Ottman

The Joe Rogan Experience · Jun 27, 2024 · 1h 54m

Speakers: Joe Rogan (host), Daryl Davis (guest), Bill Ottman (guest)

Topics covered:

Big tech censorship, moderation policies, and ideological bias

Minds.com’s open-source, decentralized, and free-speech-focused model

Daryl Davis’s approach to de-radicalizing KKK members and extremists

The “censorship effect” and research on deplatforming and radicalization

Digital privacy, surveillance capitalism, and open-source phones/OS

Algorithm design, filter bubbles, and tools for exposure to opposing views

Critical race theory, historical education, and the dangers of suppressing history

Can Open Social Media and Dialogue Really De-Radicalize Extremists Online?

Joe Rogan speaks with Daryl Davis and Minds.com CEO Bill Ottman about free speech, online censorship, and how to reduce extremism without banning people.

Ottman argues that open-source, privacy-focused, and decentralized platforms like Minds can be healthier alternatives to big tech’s opaque algorithms and aggressive moderation.

Daryl Davis explains his method of patiently engaging with Ku Klux Klan members and other extremists, showing how respectful dialogue and alternative perspectives can lead hundreds to abandon hate groups.

They also explore the unintended radicalizing effects of deplatforming, the role of big tech ideology, digital privacy concerns, and how future tools could crowdsource moderation and credibility.

Key Takeaways

Deplatforming often hardens beliefs and can increase radicalization.

Ottman and Davis reference empirical research, including their paper “The Censorship Effect,” showing that banning users tends to increase their certainty in extreme beliefs and drive them into more insular, radical echo chambers.

Open-source code and transparent algorithms are essential for trustworthy platforms.

Ottman argues that any ‘alternative’ social network that doesn’t publish its source code and algorithms can’t be trusted on free speech claims because users can’t verify whether shadow banning, spyware, or hidden manipulation is occurring.

Free speech plus more *good* information works better than suppression.

Daryl Davis maintains that bad ideas should be countered with better ideas and accurate information, not silence; suppressing people only sends them to more extreme spaces where they’re never challenged productively.

Respectful, long-term dialogue can genuinely de-radicalize extremists.

Davis explains that he’s helped over 200 KKK members and neo-Nazis leave extremism by listening first, not attacking their ‘reality,’ and then offering better alternative perceptions that allow them to change their own minds.

Big tech’s moderation is shaped by internal ideology and virtue signaling.

They discuss how top executives feel pressure to appear ‘responsible’ and ‘woke,’ leading to overbroad definitions of harm that can include dissenting views on politics, pharma, or COVID, while ignoring data showing censorship backfires.

Users need tools to control what they see, not blanket paternalism.

Minds lets users filter content (e.g. …)

Digital rights require both speech freedom and tech privacy reform.

Beyond speech, they highlight how mainstream apps aggressively harvest contacts, location, and behavior; open-source phones, de-Googled Android, and decentralized messengers (e.g. …)

Notable Quotes

A missed opportunity for dialogue is a missed opportunity for conflict resolution.

Daryl Davis

You cannot change someone’s mind if you do not platform them.

Joe Rogan

Any app, if they’re claiming to be an alternative and they’re not open source, they should not be taken seriously.

Bill Ottman

What can be learned can be unlearned.

Daryl Davis

They think their lawyers are better at drafting healthy conversation than the First Amendment, and that’s just not true.

Bill Ottman

Questions Answered in This Episode

If deplatforming tends to increase radicalization, what would an evidence-based, large-scale alternative moderation model actually look like on Twitter or Facebook?

How can platforms practically distinguish between harmful disinformation campaigns and merely unpopular or dissenting opinions without sliding into ideological censorship?

What are the real-world limits of Daryl Davis’s dialogue approach—are there groups or individuals for whom it simply doesn’t work, and why?

How much personal convenience are we collectively willing to sacrifice for genuine digital privacy and open, non-manipulative platforms?

Could a decentralized, reputation-based identity system truly help reduce trolling and bad-faith behavior without creating new risks of surveillance and social credit scoring?
