The Joe Rogan Experience

Joe Rogan Experience #1970 - Bill Ottman

Joe Rogan and Bill Ottman on where decentralized social media, surveillance, AI, and UFO secrecy collide.

Host: Joe Rogan · Guest: Bill Ottman
Jun 27, 2024 · 2h 52m · Watch on YouTube ↗
Decentralized social media, protocols (Nostr), and user-owned identity
Government surveillance, encryption, Pegasus, and backdoors into devices
Free speech, censorship laws (RESTRICT Act, AB 587), and platform moderation
Big Tech algorithms, Twitter under Elon Musk, and content discoverability
AI, data scraping, OpenAI vs open-source approaches, and creator compensation
Alternative platforms (Minds, Rumble, Substack) and business models
UFOs/UAPs, secrecy, whistleblowers, and the societal impact of disclosure

In this episode of The Joe Rogan Experience, Joe Rogan and Bill Ottman, founder of Minds.com, discuss decentralized social media, free speech, and how government and corporate power intersect with online platforms and surveillance. They explore encryption, Pegasus spyware, and the risks of mandated backdoors, as well as proposed laws like the RESTRICT Act and California's AB 587 that Ottman argues are effectively censorship frameworks. The conversation widens into AI, data ownership, OpenAI vs open-source models, and how social networks monetize attention while shaping public discourse. They close by speculating on UFO disclosure, classified information, and how much hidden knowledge, governmental and corporate, could fundamentally reshape society if revealed.

At a glance

WHAT IT’S REALLY ABOUT

Where decentralized social media, surveillance, AI, and UFO secrecy collide.


IDEAS WORTH REMEMBERING

7 ideas

Decentralized identity can give users leverage over platforms.

By using protocols like Nostr and cryptographic key pairs, users can own their social graph (followers, posts, identity) and port it between apps, which limits a platform’s power to deplatform or lock them in.
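As a rough illustration of the portability idea (not from the episode): in Nostr's NIP-01 specification, every post is an "event" whose ID is a SHA-256 hash over the author's public key and content, so any client can verify a user's posts independently of the app that published them. The key and event values below are hypothetical placeholders:

```python
import hashlib
import json

def nostr_event_id(pubkey_hex: str, created_at: int, kind: int,
                   tags: list, content: str) -> str:
    """Compute a NIP-01-style event ID: the SHA-256 of the canonical
    JSON serialization [0, pubkey, created_at, kind, tags, content]."""
    payload = json.dumps(
        [0, pubkey_hex, created_at, kind, tags, content],
        separators=(",", ":"), ensure_ascii=False,
    )
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

# Hypothetical 32-byte public key (hex); in the real protocol this is
# the user's secp256k1 key, which IS their portable identity.
pubkey = "ab" * 32
event_id = nostr_event_id(pubkey, 1687900000, 1, [], "hello from any client")
print(event_id)  # every app serializing the same event gets the same ID
```

Because the ID (and, in the full protocol, a Schnorr signature over it) depends only on the user's key and content rather than on any platform's database, followers and posts can move between apps that speak the same protocol.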

Backdoors into encryption make everyone less safe, including governments.

Ottman argues that while agencies want access to private communications, any systemic backdoor weakens security for officials, citizens, and infrastructure alike, because adversaries can exploit the same vulnerabilities.

Censorship framed as fighting ‘hate’ or ‘misinformation’ is easily politicized.

Laws like California’s AB 587 and the RESTRICT Act use vague terms (hate, extremism, radicalization) that can be selectively enforced, pressuring platforms to police speech in ways that align with prevailing political narratives.

Algorithmic opacity lets platforms quietly shape economic and political outcomes.

From suppressing external links to deranking certain topics, non‑transparent recommendation systems can throttle competition, punish dissenting views, and make or break creators’ livelihoods without clear accountability.

AI systems are built on everyone’s data, raising questions of ownership and reward.

Large models like ChatGPT scrape the public internet—including creators’ work—without direct permission or compensation, suggesting future models may need revenue‑sharing or data‑rights frameworks if they’re monetized at scale.

Open-source and transparent infrastructure can counter Big Tech’s surveillance incentives.

Ottman advocates open-source code, non‑surveillance analytics, and user revenue‑sharing as ways to build trust and redistribute value, contrasting this with closed, data‑extractive systems from major platforms and cloud providers.

If major UFO claims are true, current secrecy could destabilize trust more than disclosure.

They note that leaks about Nord Stream, Twitter Files, and UAP programs already erode public confidence; a controlled, law‑backed path to greater transparency on classified programs might be less disruptive than ongoing leaks and denials.

WORDS WORTH SAVING

5 quotes

We’re decentralizing as fast as possible, getting it out of our hands so that we need to protect ourselves from ourselves.

Bill Ottman

Banning hate does not stop hate.

Bill Ottman

If it wasn’t for social media, that act would have slipped right through, like the Patriot Act did.

Joe Rogan

The US government should be stockpiling Bitcoin right now. It is a national security risk to not do that.

Bill Ottman

I don’t want to die without knowing… and having these people that I don’t know who they are, and why do they get to know?

Bill Ottman

QUESTIONS ANSWERED IN THIS EPISODE

5 questions

How realistic is it for decentralized identity protocols like Nostr to gain mainstream adoption without sacrificing usability?


Where should the legal line be drawn between legitimate national security concerns and censorship of online speech?

Should companies developing large AI models be obligated to compensate the creators whose data they trained on, and if so, how?

Can open-source, privacy‑respecting platforms ever economically compete with highly optimized surveillance‑capitalism giants?

If definitive evidence of non‑human technology were made public, what concrete changes would you expect in geopolitics, religion, and the global economy?
