The Joe Rogan Experience

Joe Rogan Experience #1263 - Renée DiResta

Renée DiResta is the Director of Research at New Knowledge and a Mozilla Fellow in Media, Misinformation, and Trust.

Joe Rogan (host), Renée DiResta (guest)
Mar 13, 2019 · 2h 7m


  1. 0:00–15:00

    1. JR

      ... people though, they really are. It's just, it's fucking hard business, especially when you didn't see it coming. Two, one. (hands clap) Hello, Renee.

    2. RD

      Hello.

    3. JR

      Thanks for doing this. I really appreciate it.

    4. RD

      Thanks for having me.

    5. JR

      Uh, I listened to you on Sam Harris's podcast and I was utterly stunned. I had to listen to it twice 'cause I just couldn't bel- let's get into, let's get into this from the beginning. How did this start out? How did you start researching these, uh, online Russian trolls and bots and, and all this jazz?

    6. RD

      Yeah, so a couple years back, in around 2015, um, I, I had had my, my first baby in 2013 and I was getting on these preschool lists, and what I decided to do was I started looking at, um, anti-vaccine activity in California because I had a kid and I wanted to, uh, you know, put him on a preschool list where I was gonna fit with the parents basically, um, as someone who vaccinates. And I started looking at the way that small groups were able to kind of disproportionately amplify messages on social channels. And some of this was through very legitimate activity and then some of it was through really kind of coordinated deliberate attempts to kind of game, um, ways that algorithms were amplifying content, amplifying particular types of, uh, narratives. And I thought it was interesting and I started writing about it, and I, um, I wound up writing about ways in which, um, hashtag gaming, um, ways in which people were kind of using automation to just be in a hashtag all the time, so it was kind of a way to really gain control of share of voice and what that meant when very small groups of people could achieve this kind of phenomenal amplification and what the pros and cons of that were. And then this was, um, 2015, so the way that, that this sort of, um, awareness of social media challenges came, came about was actually when I was working on this, other people were looking at it from the same, um, looking at the same tactics but how they were being used by ISIS, by the terrorist organization. And there also you had this very small group of people that managed to use bots and amplification to really kind of own a narrative, really push this, this brand, this, this digital caliphate to kind of build it on all social platforms almost simultaneously, and the ways in which information was hopping from one platform to another, um, through kind of deliberate coordination and then also just ways in which, uh, information flows kind of contagion style. 
Um, and I wound up working on, uh, thinking about how the government was going to respond to the challenge of terrorist organizations using American social platforms to spread propaganda. Uh, so what we came to realize was that there was just this information ecosystem and it had evolved in a certain way over a period of about eight years or so, and the kind of unintended consequences of that. And the way that Russia kind of, uh, came into the conversation was around October 2015 when we were thinking about what, what to do about ISIS, what to do about terrorism, uh, and, and terrorist, uh, you know, kind of proliferation on social platforms. This was right around when Adrian Chen had written the article The Agency for The New York Times, and that was one of the first big exposés of the Internet Research Agency, the first time an American journalist had gone over there and actually met the trolls, been in St. Petersburg, and began to write about what was happening over there, and the ways that they had pages that were targeting certain facets of American culture. So while we were in DC talking about what to do about terrorists using these platforms to spread propaganda, there were beginning to be rumblings that Russian intelligence and, you know, Russian entities were doing the same thing. And so the question became, can we think about ways in which the internet is vulnerable to this type of manipulation by anyone and then, um, a- and then come up with ways to stop it? So that was how the, the Russia investigation began, was actually around 2015, a handful of people started looking for evidence of Russian bots and trolls, uh, on social platforms.

    7. JR

      So 2015, if we think about social media and the birth of social media, essentially it had only been alive for, I mean, what was Twitter 2007 I believe?

    8. RD

      I think so, yeah.

    9. JR

      Something like that. So eight years, like eight years of social media, and then all of a sudden they figured out how to game this system and then they figured out how to use this to make people argue against each other.

    10. RD

      Yeah, so I think, so there was this, uh, if you go back to like, um, remember like Geocities and they-

    11. JR

      Yes, sure.

    12. RD

      (laughs) Okay, AOL used that.

    13. JR

      Yeah, of course.

    14. RD

      Um, so we're probably about the same age. Um, so there have always been, you know, kind of, uh, the, the thing, the thing that was great about the internet, like internet 1.0 we can call it, right-

    15. JR

      Mm-hmm.

    16. RD

      ... was this idea that, uh, everybody was given a platform and you could use your platform, you could put up your blog, you could say whatever you wanted. Um, you didn't necessarily get attention, but you could say whatever you wanted. And so there was this kind of consolidation as, as social platforms kind of came into existence, content creators were really excited about the fact that now they not only had this, um, this access to write their own stuff, but they also had access to this audience because as the network effects got more and more pronounced, more and more people came to be on social platforms. And it originally wasn't even Facebook, if you remember, it was like, you know, there was like Friendster and MySpace, and social networks kind of evolved. When I was in college, Facebook was still limited to like, you know, a handful of like Ivy League schools, and so I wasn't even eligible. (laughs) And, um, as you watch this consolidation happen, you, you start to have this information ecosystem really dominated by a handful of companies that grow very large because they're providing a service that people really want, um, but there's a kind of mass consolidation of audiences onto this handful of platforms. So this becomes really interesting for regular people who just want to find their friends, reach people, spread their message, grow an audience. It also becomes really interesting for propagandists and trolls, and in this case, terrorist organizations and state intelligence services, 'cause instead of reaching the entire internet, they really just kind of have to concentrate their efforts on a handful of platforms. So that consolidation is one of the things that kind of kicks off some of the, um, one o- one of the reasons that we have these problems today.

    17. JR

Right, so the, the fact that there's only a Facebook, a Twitter, an Instagram, and a couple other ... minor platforms other than YouTube. I mean, anything that, uh, uh, y- you can tell it's an actual person. Like YouTube is a problem, right, because you could see it's an actual person. If you're, if you're narrating something y- you, you know, if you're in front of the camera and explaining things, people are gonna know that you're an actual human being.

    18. RD

      Mm-hmm.

    19. JR

      Whereas there's so many of these accounts that I'll go to, like I'll, I'll watch people get involved in these little online beefs with each other, and then I'll go to some of these accounts, I'm like, "This doesn't seem like a real person." And I'll go and it's like #MAGA, there's a, uh, American eagle in front of a flag, and then you, you read their stuff and you're like, "Wow, this is, this is probably a Russian troll account." And it's strange, like you feel like you're not supposed to be seeing this, like you're seeing the wiring under the board or something, and then you'll go through the timeline and all they're doing is engaging people and arguing, you know, for Trump and against, you know, whatever the fuck they're angry about, whatever, whatever it is that's being discussed, and they're, they're basically just like some weird little argument mechanism.

    20. RD

Yeah, so in 2016, um, there was a lot of that during the presidential campaign, right? And there were, um, there was so much that was written. You know, we can go back to the free speech thing we were kinda chatting about before. There was so much that was written about, um, harassment and trolling and negativity and these kind of hordes of accounts that would brigade people and harass them.

    21. JR

      Yeah.

    22. RD

      Of course, a lot of that is just real Americans, right? There are plenty of people who are just assholes on the internet.

    23. JR

      Sure.

    24. RD

Um, but there were actually a fair number of these as we began to do the investigation into the Russian operation in, uh, it, it started on Twitter in about, um, 2014 actually. So 2013, 2014, the Internet Research Agency is targeting Russian people, so they're tweeting in Russian, at Russian and Ukrainian, uh, folks, people in their sphere of influence, so they're already on there, they're already trying this out. And what they're doing is they're creating these, uh, these, these accounts, it's kinda wrong to call them bots 'cause they are real people, they're just not what they appear to be. So I think the unfortunate term for it has become like cyborg, like semi-automated, you know, sometimes-

    25. JR

      Hmm.

    26. RD

      ... it's automated, sometimes it's a real person, but a sock puppet is the other way that we can-

    27. JR

      Hmm.

    28. RD

... refer to it, a person pretending to be somebody else. So you have these sock puppets and they're out there and they're tweeting in 2014 about the Russian annexation of, uh, Crimea, or about MH17, that plane that went down, which Russia, you know, of course had no idea what happened and it wasn't their fault at all. And gradually, as they, um, begin to experience what I imagine they thought of as success, that's when you see some of these accounts pivot to targeting Americans. And so in 20- late 2014, early 2015, you start to see the, um, the strategy that for a long time had been very inwardly focused, making their own people think a certain way or feel a certain way or have a certain experience on the internet, uh, it begins to, to spread out, it begins to, uh, to look outwards. And so you start to see these accounts communicating with Americans. And as we were going through the datasets, which the Twitter dataset is public, anyone can go and look at it at this point, um, you do see some of the accounts that are kind of, um, you know, that were, that were somewhat notorious for being really virulent, nasty trolls, um, antisemitic trolls going after journalists, you know, some of these accounts, um, being revealed as actually being, uh, Russian trolls. Now it doesn't kind of, um, exculpate the actual American trolls that were very much real and active and part of this and expressing their opinion, uh, but you do see that they're mimicking this, they're using that same style of tactic, that harassment to, to get at real people.

    29. JR

      And if they do get banned, if their account gets banned, they just simply make another account. They use, uh, some sort of a, you know, um, what is it? A virtual, virtual server. What is that called?

    30. RD

      You mean VPNs or...

  2. 15:00–30:00

    1. RD

      it wasn't like Renee deciding this was IRA, it was, uh, the platforms giving it to, uh, to our government. And the information in there, um, what it showed was that across all platforms, across Twitter, across Facebook, Instagram, YouTube, they were building up tribes. So they were really working to create distinct communities of distinct types of Americans and that would be, for example, there's an LGBT page that is very much about LGBT pride, there's, um...

    2. JR

      And they created it.

    3. RD

      And they created it.

    4. JR

      And they f- they curate it and they...

    5. RD

      Create it, curate it, um, it has a, you know, their, it's like a persona, a lot of the posts on the LGBT page were written by what sounded kind of like a millennial lesbian was the voice, um, so it was a lot of, um, you know, memes of LGBT actresses and they would brand it with a specific brand mark, it was a rainbow heart. Um, uh, LGBT United was the name of the page, it had a matching Instagram account, which you would also expect to see from a media property, right? You would expect them to see in both places, and this, um, you know...

    6. JR

      What were they pushing?

    7. RD

      It read like a, it read like a, a young woman talking about, um, crushes on actresses and things, actually. You know, it was, it was, it was really, besides the sometimes wonky English, virtually indistinguishable from what you would read on any kind of like young millennial focused, um, social page. It, it wasn't, uh, none of it was radical or divisive, it wasn't like, um ... the way that they got the division across was they built these tribes where they're reinforcing in-group dynamics. So you have the LGBT page, you have, uh, numerous pages targeting the Black community, that was where they spent most of their energy, a lot of pages targeting, um, far right. So, uh, both old far right, meaning, um, people who are very concerned about what does the future of America look like, and then young far right, which was much more angry, much more like trolling culture. So they recognize that there's a divide there, that the kinds of memes you're gonna use to target younger right-wing audiences are not the same kinds of memes you're gonna use to target older right-wing audiences.

    8. JR

      Mm.

    9. RD

      So there's a tribe for older right wing, younger right wing, in the Black community there's a Baptist tribe, there's a Black liberation tribe, there's a, uh, Black women tribe, there's one for people who have incarcerated spouses, there's a, uh-

    10. JR

      Wow.

    11. RD

      ... Brown, um, Brown Power I believe was the name of it, page, that was very much about, um, Mexican and, uh, Chicano culture. There was Native Americans United. Uh-

    12. JR

      And all of these are fake.

    13. RD

      All these are fake.

    14. JR

      All these are fake. And what are they trying to do with all these?

    15. RD

So you build up this in-group dynamic and over, and they did this over years, so this was not a, a short-term thing. They started these pages in 2014, 2015 timeframe, most of them. Um, they started some other ones that were much more political later, and we can talk about the election if you want to, but with this tribal thing, um, you're building up tribes so you're saying like, "As Black women in America, this is, um, here's posts about things that we care about. Here's posts about Black hair, here's posts about child rearing, here's posts about, um, fashion and culture." And then every now and then, there would be a post that would reinforce, like, "As Black people, we don't do this and so..." or "As LGBT people, we don't like this." And so you're building this rapport, so like me and you, we're having a conversation, we're developing a relationship on this page over time. And then I say like, "As this kind of person, we don't believe this." So it's a way to subtly influence by appealing to an in-group dynamic or appealing to like, as members of this tribe, as LGBT people, of course we hate Mike Pence.

    16. JR

      Hmm.

    17. RD

      As Black people, of course we're not going to vote because, you know, we hate Hillary Clinton because we hate her husband. As, um, as people who are concerned about the future of America, as Texas secessionists, you know. So, so everything is presented as members of this tribe, we think this. As members of this tribe, we don't think this.

    18. JR

      But a lot of the posts-

    19. RD

      And that's where you see the ... Go ahead.

    20. JR

      Sorry, but a lot of the posts were not even political, they were just sort of affirming the standards of the tribe.

    21. RD

      Yes.

    22. JR

      So they were kind of setting up this whole long game.

    23. RD

      Yep.

    24. JR

      And then once they got everybody on board, how many followers are these, do these pages have?

    25. RD

      So the, there was kind of a long tail. There were, um, I think 88 pages on Facebook and 133 Instagram accounts, and I would say maybe 30 of the Facebook pages had over 1,000 followers, which is not very many, and then, um, maybe the top ten had upwards of 500,000 followers.

    26. JR

      Hmm.

    27. RD

      So there's, you know, same way you're on any social campaign, sometimes you have hits, sometimes you have flops.

    28. JR

      Right.

    29. RD

      And what was interesting with the flops is you would see them repurpose them.

    30. JR

      Hmm.

  3. 30:00–45:00

    1. RD

narrative of Jill Stein. So you have the left-leaning pages, totally anti-Clinton, and then you have the right-leaning pages, staunchly pro-Trump and also strongly anti-Cruz, anti-Rubio, uh, anti-Lindsey Graham, basically anti every, uh, now what's called "establishment" Republican. Um, and there's this, uh, kind of pushing of people to opposite, opposite ends of the political spectrum. This is where you get at the conversation around facilitating polarization. So not just... Um, it, it wasn't enough to just support Donald Trump, it was also necessary to strongly disparage, um, the kind of traditional conservative, moderate center right in the course of amplifying the Trump candidacy. Does that make sense?

    2. JR

      Yes, it does.

    3. RD

      I know it's a lot of stuff.

    4. JR

      Yeah, it is a lot of stuff but, i- it does make sense and one of the things that was really bizarre to me watching the election, a- and I was trying to figure out, is this because Trump is so bombastic and he's so outrageous and he's just a different person, that the- the way I was describing it on stage was that like finally the assholes have a king.

    5. RD

      (laughs)

    6. JR

      Because they never had a king before. Like everyone who was running for president was at least mostly dignified. I mean, b- basically, it's really difficult to go back in time and find someone who isn't.

    7. RD

      Yeah.

    8. JR

      Find someone who d- there's no one who insults people like he does. I mean, he insults people's appearances, he calls them losers, he called Stormy Daniels horse face. I mean, he- he says some outrageous shit. So part of it was me thinking like, "Wow, maybe he's just ignited an emboldened..." I actually had this conversation with my wife today. She was like, uh, "I've- it feels like racism is more prevalent." Like it's more, it's more accepted. People feel more emboldened, because they're- in their mind, they think he is a racist, "I can get away with more things, Trump is president." Like there's actually videos of people saying racist shit and saying, "Hey, Trump's president now. We can do this." So, I was thinking that, well maybe that's what it was, it's just sort of like some rare flower that only blooms under the right conditions, poof, it's back, right? But when you think about the influence that these pages have had in establishing communities and this long game that they're playing, like the LGBT pages, even though they're shitting on Trump, they really wanna support Jill Stein because they know that'll actually help Trump because it'll take votes away from Hillary Clinton. That- they m- it seems different. Like political discourse, discussions online and social media, the way social media reacted, I mean there was a lot of people that were anti-Obama, uh, before, you know, m- either- e- either of his elections that he won. But it seemed different. It seemed different to me than this one. This one seemed like, l- like we had moved into another level of hostility that I'd never experienced before, and another level of division between the right and the left that I'd never experienced before. And, uh, a- uh, b- like a willingness to engage with really harsh, nasty comments, and just to- to dive into it. You would see it all day. 
I mean there's- there were certain Twitter followers that I think they're pretty much human beings, but I would follow them and they would just be engaged with people all day long. Just shitting on people and criticizing this and insulting that, and i- it seemed like, it seemed dangerous. It seemed like things had moved into a much more aggressive, uh, much more hostile and confrontational sort of chapter in American history. If this was all done at the same time as it's happening, w- how much of an influence do you think this IRA agency had on all this stuff?

    9. RD

      That's the, uh, that's the question that we would all like the answer to and I unfortunately can't give it.

    10. JR

      Yeah.

    11. RD

      And so, let me-

    12. JR

      In your mind though.

    13. RD

      Yeah, let me, let me, let me kinda caveat that. Um, the thing that we don't have, that nobody who looks at this on the outside has, is we can't see what people said in response to this stuff.

    14. JR

      Mm-hmm.

    15. RD

      So I've looked at now almost t- 200,000 of these posts. Um, that's what I spent most of last year doing, was- was this- was this research. And we can see that they have thousands of engagements, thousands of comments, thousands of shares. We have no idea what happened afterwards, and that's the problem. So when- once the stuff comes down, it's really hard to go back and piece it together. So I can see that there are some of, per your point, the- the really, really just fucking horrible troll accounts that they ran. They didn't necessarily have a lot of followers but you see them in there, like, @ing people. So they're, you know, @ and then the name of a reporter-

    16. JR

      Mm-hmm.

    17. RD

... @ the name of a prominent person, and so they're in there, kinda, like, drafting on the popularity of, you know, famous people basically.

    18. JR

      Mm-hmm.

    19. RD

Um, and they're just saying, like, horrible shit. (laughs) And- and it's- the tone is so spot on. And one thing that was interesting with a couple of them is like if you go and you look at their profile information, which was also made public, they would have like, um, they would have a b- uh, like a Gab account in their- in their profile that was like, so they would, um... So it was a remarkable piece of- of- of kind of the culture in which you see that like, they're actually sitting on Gab too, right? And- and so they can also go and they can draw on, they're in Reddit, there's you know, 900 or something, uh, troll accounts were found on Reddit, they're on Tumblr. And so they're just picking the most divisive content and they're pushing it out, uh, into communities and at the same time we can see that they're doing it, but we can't see what people do in return. We can't see did they just block? Did they have to fight back? Did-

    20. JR

      Right.

    21. RD

      Was there a huge, you know, when this happens on a Facebook page, um, and they're doing something like telling black people not to vote, um, "As black people we shouldn't vote," um, what do people say in response? And that's the piece that we don't have. So when we talk about impact, a lot of the impact conversation is really, uh, focused on did this swing the election? We don't have, nothing that I've seen has the answer to that question. Um, the other thing is, yeah but- but the second question, the thing when I think about impact, I think from- from, I think you and I agree on this, um, it also matters how does this change how people relate to each other? And we have no real evidence of, that we have no information on that either. This is the kind of thing that lives in some, you know, Facebook has it, the rest of us haven't seen it.

    22. JR

      Now are most of these people, is this mostly Facebook? Is it mostly Twitter? Uh where- what does- how does it break down?

    23. RD

      Yeah, so there were, um... Here are my like little stats here 'cause I didn't want to give you the wrong data. There were 10.5 million tweets of which about six million were original content created by about 3,800 accounts. Um-There were about 133, let me just read it, 133, um, Instagram accounts with about 116,000 posts, and then 81 Facebook pages and 17 YouTube channels with about 1,100, um, videos. And so they got about 200 million engagements on Instagram and about another 75 million or so on Facebook. Um, engagements are like, likes, shares, comments, reactions, you know. Um, so it's hard to contextualize, but what we think happened, you know, you can go and you can try to look at, um, how well did this content perform relative to other real authentic media targeting these communities? And what you see with the Black community in particular is their Instagram game was really good. Um, their... so their... on their Instagram accounts, the, you know, the top five, three of them targeted the Black community and got, you know, um, 10s to 100 millions of engagements.

    24. JR

      Wow.

    25. RD

      So I would have to pull up the exact number-

    26. JR

      Is it mostly memes?

    27. RD

      I don't know it off the top of my head. Yeah, it's um, it's... on Instagram it's all memes. And then, you know, so we have the memes and then we have the text. On Instagram you can't really share, so it's amazing that they got the kind of engagement that they did even without the sharing function.

    28. JR

      Mm-hmm.

    29. RD

      But one of the things you can do is if you know the names of the accounts and they're... a lot of them are out there publicly now, um, you can actually see them in regram apps. So, uh, people were regramming the content. So Facebook says about 20 million people, excuse me, engaged with the Instagram content. But what isn't included in that is all of the regrams of the content that were shared by other accounts. So the spread and the dispersion of this, it's an interesting, uh, an interesting thing to try to quantify because we have engagement data, but we don't know did it change hearts and minds? We don't know if it-

    30. JR

      Right.

  4. 45:00–1:00:00

    1. JR

      underneath the memes. And I go to it, I'm like, "What, what exactly are they doing here?" Like, "What exactly are they trying to do with these?" Because they just, they're, they're very weird.

    2. RD

      There was one that I came across. Um, I was looking at the, uh, the, the conversation around GMOs, and because we have seen ... One of the things that Russia does besides the social bots and, you know, the Amer- you know, screwing with like Americans directly, um, is the House ... So this was a Republican House committee, the House, uh, Science and Technology Committee, about a year ago said that they were seeing evidence of, um, both kind of overt propaganda and then ways of, um, disseminating the propaganda. So there's always the dissemination and then the accounts and then the content. So it's like you look at three different things to try to get a handle on whether or not this is real or fake. Um, so when we talk about the accounts, we're looking at are they real people, are they, you know, automated, are they not automated? When you're looking at the content you're usually looking at the domains. Um, and that's kind of the last piece, because you don't want to have any kind of bias get in there, but you're just trying to see is it being pushed through like overt, uh, Russian propaganda domains, like their think tanks and things? And then the third is the dissemination pattern. Is it being pushed out through automated accounts? Is it spreading in ways that, um, look anomalous versus how normal information would spread? So one of the things that the House committee looked at was using that kind of rubric, um, Russian, uh, you know, these dubious, um, pieces of content and narratives around American strategic industries. So the energy industry, oil and fracking for example. Or the, um, uh ... You see a lot of stuff with, um, GMOs and agriculture, you know, this very, this narrative of, um, you know, Putin and Russia being the land of organic plenty and the United States, uh, serving its people toxic, poisoned vegetables, this, this sort of stuff. 
And meanwhile at the same time there's competition for who's going to get the, you know, large contract to provide rice to some part of the world. So, so there's like an economic motivation underlying this, this kind of narrative. And I was looking at one of these accounts and it was tweeting an article about Hillary Clinton, a vote for Hillary is a vote for Monsanto. But it was tweeting this in (laughs) you know, just like three months ago or something. It was like mid-2018 or late 2018 when I was looking at this. I'm like, "Well, that ship sailed a long time ago, guys." (laughs)

    3. JR

      (laughs) Yeah.

    4. RD

      Why are we, why are we tweeting about, you know, Hillary's votes? It's because they just, um, they're just there to ... It was written by a Russian think tank, um, and so they're, they're just of these automated accounts retweeting, repurposing this content from forever ago. And it, it doesn't even make sense. It's just out there to amplify a particular point of view or bump up, uh, mentions of a site.

    5. JR

      What were they trying to do with the anti-vaccine posts?

    6. RD

Yeah. That was, uh, that was an interesting thing. So I would say not much, to be honest. Um, the, there were 900, maybe 800 I think, um, tweets about vaccines in the content. And so Facebook and Twitter, you have this ... Oh, sorry, Facebook and Instagram, you have this building up of tribes. Twitter, you have instead they're just talking about whatever is popular, right? They're ta- they're shitposting, they're talking about whatever is current and new, whatever scandal has just broken anywhere in, you know. Um, so Twitter is less about establishing relationships and more about joining the conversation and nudging it. And so most of the vaccine-related posts, it was not a big theme for them. It wasn't something that was on like Facebook and Instagram where that's where they're really leaning in, like, "This is what we want Americans to think about." So no mention of vaccines on those platforms, none on YouTube. On Twitter you see it in 2015, funny enough during the Disneyland measles outbreak. Um, much like there's a whole lot of conversation around vaccines right now because of the outbreaks in Washington and New York, uh, back in 2015 you saw the same thing. Lots of conversations about measles because of the Disneyland thing that happened down here. And so they're in there and they're saying, um ... vaccinate your kids, don't vaccinate your kids. They had a couple of conspiracy theorist accounts. Um, I am trying to remember the name. It was a, uh, it looked like a blonde woman. I think its name was Amy. Um, and, uh, Amy was a, was a conspiracy theorist, so (laughs) -

    7. JR

      And Amy was a fake person.

    8. RD

      Amy was a fake person, yeah. And I, I-

    9. JR

      Wow.

    10. RD

      ... wish I could remember.

    11. JR

      Was it a Twitter page?

    12. RD

      Yeah, it was a Twitter account. Um, God, what the hell was her name? She w- Amy Black? Um, there, there are certain av- certain of their personas actually got a lot of lift. There was one called Woke Louisa that was a Black woman. There was, uh-

    13. JR

      (laughs)

    14. RD

      Yeah, I mean, they nail it, right? The (laughs) -

    15. JR

      (laughs)

    16. RD

      ... they're, they're not dumb. Um, there was TN GOP, the fake Tennessee GOP page and Twitter account-

    17. JR

      How much autonomy do you think these people have that are creating these things? I mean, are they-

    18. RD

      I think they-

    19. JR

      ... are they creative? Are they ... It sounds like some of them are actually pretty funny.

    20. RD

      Yeah, they are funny. They are funny. That's why they work. That's the-

    21. JR

      Yeah.

    22. RD

      Everybody thinks it's just, like, you know, incompetent shit; it's not. It's actually really good. That's where ... That, that's, I think, the thing that ... You know, even with, um, whatever, you know, political proclivities you may have, I think you can at least recognize humor, even if it's laughing at your side.

    23. JR

      Mm-hmm.

    24. RD

      And I will say that some of the stuff, especially targeting the, the right wing, you know, the right wing, like, youth kind of pages were ... they were funny. They really were.

    25. JR

      Yeah.

    26. RD

      And it was ... I think that is ... People assume that, like, they're too smart to fall for it. Um, it's just those liberals or it's just those conservatives or, you know. Uh, it's, it's really, it, it targets everybody. And they understand the psychology, the motivation, the narratives, and the culture, and they produce the content accordingly. And, you know, I imagine that they had a grand old time doing it 'cause there was some stuff in, um, two narratives that came out in 2017. Um, the first was when Facebook started to moderate their pages, they started to scream about how Facebook was censoring them. So the exact same narrative (laughs) that you see today about how any moderation is censorship, it's, you know, a picture of, like, um, Zuckerberg and it's like, "Nice page you've got. Be a shame if anything happened to it." You know, and that's the meme that they're putting out there when they're complaining that their fake page got taken down.

    27. JR

      Mm-hmm.

    28. RD

      Um, there was, uh, tons and tons of, uh, these memes also about, um, the Russians did it, mocking the idea that the Russians did it. Uh, so this is ... As the story is beginning to come out, before we've had the tech hearings, before we've had the Mueller indictments, before we've had the investigation, you see these memes, um, where it's like, "Oh, my speedometer was broken. It must have been the Russians," or picture of Hillary Clinton and it's, like, uh, in a, like, Little Golden Book kind of thing, and it's like the whiny child's guide to blaming Russia for your failures. You know, and it's, again, it's funny. Like, the stuff is funny. And they're, like, meta-trolling. You imagine them sitting there, like, you know, they have a picture of, like, some, like, you know, buff guy carrying a gun and they're like, "I'm not a Russian troll, man. I'm an American." (laughs) And I'm like-

    29. JR

      Wow.

    30. RD

      ... "Okay." So you're looking at this and you're like, "It's just so spot on." And again, I can't see what the people commented under it, if they were like, "Right on," or if they were like, "Ah, this is bullshit." Um, but so that's where you get at the ... They ... It, it ... You know, people think, like, "Oh, I'm too smart to fall for it," or, "Oh, this is targeting those other people." No, it isn't. That's the problem. It's just, um, it's gonna target you with the thing that you're most likely to be receptive to just because of psychological bias and, and tribal affiliation. And you're not sitting there thinking, "How is this person who is purportedly just like me screwing with me?" And that's why, uh, that's why it does manage to attract a following and get retweeted, get reshared. Um, it's good.

  5. 1:00:00–1:15:00

    (laughs) …

    1. RD

      we are."

    2. JR

      (laughs)

    3. RD

      "Look at over there. It's the motherfucking enemy."

    4. JR

      (laughs)

    5. RD

      (laughs) I think a couple times there were comments, um, on some of the, like, archived pages and things where you could see the screenshots of people being like, "Dude, you had us all come out there and, like, nobody showed up."

    6. JR

      Right. (laughs)

    7. RD

      (laughs)

    8. JR

      Well ...

    9. RD

      Who is in charge? You know? (coughs)

    10. JR

      They're s- they're probably throwing a lot of things against the wall hoping that they stick.

    11. RD

      Hoping something sticks. Well, you see this on the, um ... There was a page called Black Matters, and Black Matters was interesting 'cause they went and made a whole website. So they made a website, blackmattersus.com, which I think is still active. It's dormant, they're not updating it, but I believe you can go and, and read it. Um, and it was designed to call attention to police brutality type things. And so they have this Black Matters US page, and then there's the Black Matters Facebook page, the Twitter account, the Instagram page, the YouTube channel, the SoundCloud podcast. (laughs)

    12. JR

      Wow.

    13. RD

      The Tumblr, the s- Facebook stickers. They had Facebook stickers that looked like little black panthers, like little, um-

    14. JR

      Really?

    15. RD

      ... little cats. Yeah, little black cats. They were actually really cute, very well done. Um, so you have this entire fake media ecosystem that they've just created out of whole cloth, all theirs. And then what they start to do is they start to put up job ads. And so it's, "Come be a designer for us. Come be a, uh, come write for us. Come photograph our protests. Come ..." Um, you know, they, they had like a kind of like, um, Black guy dressed in a cool outfit, like hipster, you know, holding a sign like, "Join Black Matters." Um, you see them go through a couple different logos the same way you would if you were starting a, you know, media brand. Um, they start posting ads for, "Do you want to be a calendar girl? Send us your photos. Uh, do you want to be on a Black reality TV show? Send us video clips of you." Um, do you ... Like, they begin to do real work to ingratiate themselves with the community. They had a physical fitness thing, it was called Black Fist, and the idea was that it was, um, kind of vaguely militant-esque, um, in that it was supposed to teach, uh, Black people how to handle themselves at protests should there be police violence, how to fight back. And they actually went and found a guy, a physical fitness, you know, a, a martial arts, um, guy, and they were paying him via PayPal. So (laughs) so, so he was running classes for the Black community under this Black Fist, um, brand, and they would, like, text him or call him. He, he played some of the voicemails on TV, actually, I heard them. Um, (coughs) after my report came out, I think they tracked him down and, um, and he just talks about how they, yeah, they just PayPaled him, you know, a couple hundred bucks every time he ran a fitness class.

    16. JR

      What were the voicemails like?

    17. RD

      It was, um-

    18. JR

      "Hello. We are fellow Black men-"

    19. RD

      (laughs)

    20. JR

      "... concerned about police."

    21. RD

      They, you'd, you'd be (laughs) you'd be surprised. They actually had a YouTube channel with, um, two Black men named Williams and Calvin and there was this channel, Williams and Calvin, and-

    22. JR

      They were actual Black men?

    23. RD

      ... it's actual, yeah, yeah, actual Black men.

    24. JR

      So they hired these gentlemen?

    25. RD

      They hired these guys, um, to be a fake YouTube channel, um, and it was a, it was called A Word of Truth, I think was the name of it. And so Williams and Calvin, these two guys, would, um, would give their word of truth. And their word of truth was usually about, you know, how fucked up America is, which, I mean, there, there are very real grievances underlying all of this and that's the problem, right?

    26. JR

      Right.

    27. RD

      They have things to exploit. Um, and-

    28. JR

      Were they writing these things for these gentlemen?

    29. RD

      I imagine. I mean, I imagine they were.

    30. JR

      But they were definitely paying them and they organized the channel?

  6. 1:15:00–1:18:28

    Section 6

    1. RD

      wrote about it. Um, a guy by the name of Jonathan Albright, uh, found a CrowdTangle data cache, and with that we got the names of a bunch more pages, a bunch more posts, and we had some really interesting stuff to work with. Originally, the platforms were very resistant to the idea that this had happened. (coughs) And so as a result of that, they were, um, in, you know ... There was a, the first thing that Zuck said in 2016 when, you know, Trump gets elected, Twitter goes crazy that night with people who work at Twitter saying, "Oh my God. Were we responsible for this?" Which is (laughs) a very Silicon Valley thing to say. Um, but what I think they meant by that was, their platform had been implicated as hosting Russian bots and fake news and harassment mobs and a number of other things, and there was always the sense that it didn't have an impact and it didn't matter. And so this was the first time that they started to ask the question, "Did it matter?" (coughs) And then Zuck made that statement, um, fake news is a very small percentage of, you know, whatever, on Facebook, the amount of information on Facebook, and the idea that it could have swung an election was ludicrous. So you have the platforms kind of, uh, the leaders of the platforms digging in and saying it's inconceivable that, that, that this, you know, could have happened. And as the research and the discovery begins to take place over the next nine months or so, you st- you, you get to when the tech hearings happen. So I worked with, um, a guy by the name of Tristan Harris. He's the one who introduced me to Sam, um, and he, he and I started, uh, going to DC with a third fellow, Roger McNamee, and saying, "Hey, there's so much ... There's this body of evidence that's coming out here and we need to have a hearing. We need to have Congress ask the tech companies to account for what happened, to tell the American people what happened."
Because what we're seeing here as outside researchers, when investigative journalists are writing, the things that we're finding just don't line up with the statements that, that nothing happened and th- this was all no big deal. And so we start asking (coughs) for these hearings and actually, um, myself and a couple of others then begin asking them in the course of these hearings, "Can you get them to give you the data?" Because the platforms hadn't given the data. So it was that lobbying by concerned citizens and journalists and researchers saying, "We have to have some accountability here. We have to have the platforms account for what happened. They have to tell people." Um, because this had become such a politically divisive issue. Did it even happen? And we felt like having them actually sit there in front of Congress and account for it would be the first step towards, um, towards moving forward in a way, but, but also towards, um, changing the minds of the public and making them realize that what happened on social platforms matters. And it was, it was really interesting to, to, to be part of that as it, as it played out, um, because one of the things that Senator Blumenthal, one of the, one of the senators did was actually said, um, "Facebook and Twitter have to notify people who engage with this content." And so the- there was this idea that, um, if you are engaging with propagandist content, you should have the right to know.

Episode duration: 2:07:54
