Joe Rogan Experience #1263 - Renée DiResta

The Joe Rogan Experience · Mar 13, 2019 · 2h 7m

Joe Rogan (host), Renée DiResta (guest), Narrator

Origins and evolution of Renée DiResta’s research into online manipulation
Structure and tactics of Russia’s Internet Research Agency (IRA)
Tribalization strategy: building identity‑based communities and then politicizing them
Use of bots, sock‑puppets, memes, and cross‑platform amplification
Real‑world activation: organizing protests, funding activists, and staging confrontations
Impact questions: polarization, tone of discourse, and unknown election effects
Platform responsibility, moderation vs. free speech, and future threats (deepfakes, AI personas)

Inside Russia’s Social Media War: Troll Farms, Tribes, And Tension

Joe Rogan interviews disinformation researcher Renée DiResta about how Russia’s Internet Research Agency (IRA) systematically weaponized U.S. social media from roughly 2014–2017.

DiResta explains how Russian operators built seemingly authentic communities (LGBT, Black activism, right‑wing patriot groups, etc.), nurtured them with non‑political content, then gradually injected divisive, highly targeted political narratives.

They combined meme warfare, sock‑puppet accounts, ad buys, and real‑world event organizing to polarize Americans, depress turnout among key Democratic constituencies, and boost Donald Trump while undermining establishment Republicans and Hillary Clinton.

The conversation also explores platform responsibility, free‑speech vs. moderation dilemmas, and how emerging technologies like deepfakes and AI‑generated personas will make future information manipulation even harder to detect.

Key Takeaways

Russian operators built trust first, then injected politics into identity communities.

Pages targeting Black Americans, LGBT people, Texas secessionists, and others began with relatable cultural content and in‑group affirmation, then slowly introduced messages like “as Black people we don’t vote” or “as LGBT people we hate Mike Pence,” leveraging tribal identity to steer political behavior.

Most “trolls” were human sock‑puppets, not simple bots, often using semi‑automation.

The IRA employed young, internet‑savvy staff who ran multiple personas (“cyborg” accounts) that mixed automated posting with real‑time engagement, harassment, and relationship‑building, making them far harder to detect than purely automated bots.

They repurposed and A/B‑tested pages like a professional marketing firm.

When themes flopped, the IRA would rename and rebrand pages, e.g. ...

Russian campaigns amplified real American content and grievances, not just fabrications.

To avoid linguistic and cultural giveaways, operators often copied local news and memes from U.S. ...

Online operations frequently spilled into offline action, including protests and trainings.

The IRA organized Facebook events for dueling rallies, e.g. ...

The measurable footprint was huge, but actual persuasion impact remains largely unknowable.

Datasets show millions of posts, hundreds of pages, and hundreds of millions of engagements, yet researchers lack comment threads and downstream behavior data, so they cannot reliably say whether this “swayed the election,” only that it likely shifted tone and norms in targeted communities.

Future disinformation will be supercharged by AI‑generated faces, voices, and videos.

Technologies like GAN‑generated profile photos (thispersondoesnotexist.com) ...

Notable Quotes

You can believe two things simultaneously: that Trump did not collude and that this operation still happened.

Renée DiResta

They were building tribes. ‘As Black people we don’t vote,’ ‘As LGBT people we hate Mike Pence’—that’s how they pushed people politically.

Renée DiResta

It seemed like we had moved into another level of hostility that I’d never experienced before.

Joe Rogan

People assume they’re too smart to fall for it. It’s just those liberals or those conservatives. No, it targets everybody.

Renée DiResta

I’m in one sense horrified and in another sense deeply impressed.

Joe Rogan

Questions Answered in This Episode

How can ordinary users realistically distinguish between authentic grassroots communities and long‑game influence operations that look and feel ‘real’?

What types of regulatory or oversight frameworks could pressure platforms to address algorithmic amplification without becoming de facto censors of political speech?

Given that Russian campaigns mainly amplified existing divisions, what responsibility do domestic media and political elites bear for the underlying polarization?

How should democracies prepare their citizens—through education or media literacy—for a world where AI‑generated humans and deepfake events are commonplace?

At what point does social‑media manipulation cross the line from propaganda into an act of aggression that should trigger a formal state‑level response?

Transcript Preview

Joe Rogan

... people though, they really are. It's just, it's fucking hard business, especially when you didn't see it coming. Two, one. (hands clap) Hello, Renee.

Renée DiResta

Hello.

Joe Rogan

Thanks for doing this. I really appreciate it.

Renée DiResta

Thanks for having me.

Joe Rogan

Uh, I listened to you on Sam Harris's podcast and I was utterly stunned. I had to listen to it twice 'cause I just couldn't bel- let's get into, let's get into this from the beginning. How did this start out? How did you start researching these, uh, online Russian trolls and bots and, and all this jazz?

Renée DiResta

Yeah, so a couple years back, in around 2015, um, I, I had had my, my first baby in 2013 and I was getting on these preschool lists, and what I decided to do was I started looking at, um, anti-vaccine activity in California because I had a kid and I wanted to, uh, you know, put him on a preschool list where I was gonna fit with the parents basically, um, as someone who vaccinates. And I started looking at the way that small groups were able to kind of disproportionately amplify messages on social channels. And some of this was through very legitimate activity and then some of it was through really kind of coordinated deliberate attempts to kind of game, um, ways that algorithms were amplifying content, amplifying particular types of, uh, narratives. And I thought it was interesting and I started writing about it, and I, um, I wound up writing about ways in which, um, hashtag gaming, um, ways in which people were kind of using automation to just be in a hashtag all the time, so it was kind of a way to really gain control of share of voice and what that meant when very small groups of people could achieve this kind of phenomenal amplification and what the pros and cons of that were. And then this was, um, 2015, so the way that, that this sort of, um, awareness of social media challenges came, came about was actually when I was working on this, other people were looking at it from the same, um, looking at the same tactics but how they were being used by ISIS, by the terrorist organization. And there also you had this very small group of people that managed to use bots and amplification to really kind of own a narrative, really push this, this brand, this, this digital caliphate to kind of build it on all social platforms almost simultaneously, and the ways in which information was hopping from one platform to another, um, through kind of deliberate coordination and then also just ways in which, uh, information flows kind of contagion style. 
Um, and I wound up working on, uh, thinking about how the government was going to respond to the challenge of terrorist organizations using American social platforms to spread propaganda. Uh, so what we came to realize was that there was just this information ecosystem and it had evolved in a certain way over a period of about eight years or so, and the kind of unintended consequences of that. And the way that Russia kind of, uh, came into the conversation was around October 2015 when we were thinking about what, what to do about ISIS, what to do about terrorism, uh, and, and terrorist, uh, you know, kind of proliferation on social platforms. This was right around when Adrian Chen had written the article The Agency for The New York Times, and that was one of the first big exposés of the Internet Research Agency, the first time an American journalist had gone over there and actually met the trolls, been in St. Petersburg, and began to write about what was happening over there, and the ways that they had pages that were targeting certain facets of American culture. So while we were in DC talking about what to do about terrorists using these platforms to spread propaganda, there were beginning to be rumblings that Russian intelligence and, you know, Russian entities were doing the same thing. And so the question became, can we think about ways in which the internet is vulnerable to this type of manipulation by anyone and then, um, a- and then come up with ways to stop it? So that was how the, the Russia investigation began, was actually around 2015, a handful of people started looking for evidence of Russian bots and trolls, uh, on social platforms.
