The Joe Rogan Experience

Joe Rogan Experience #1558 - Tristan Harris

Called the “closest thing Silicon Valley has to a conscience” by The Atlantic magazine, Tristan Harris spent three years as a Google Design Ethicist, developing a framework for how technology should “ethically” steer the thoughts and actions of billions of people from screens. He is now co-founder & president of the Center for Humane Technology, whose mission is to reverse ‘human downgrading’ and re-align technology with humanity. Additionally, he is co-host of the Center for Humane Technology’s Your Undivided Attention podcast with co-founder Aza Raskin.

Joe Rogan (host) · Tristan Harris (guest)
Oct 30, 2020 · 2h 21m · Watch on YouTube ↗

EVERY SPOKEN WORD

  1. 0:00–15:00


    1. JR

      (dramatic music plays) Joe Rogan podcast, check it out. The Joe Rogan Experience. Train by day, Joe Rogan podcast by night. All day. (rock music plays) Tristan, how are you?

    2. TH

      Good. Good to be here.

    3. JR

      Good to have you here, man. Um, you were just telling me before we went on air the numbers of The Social Dilemma.

    4. TH

      Yeah.

    5. JR

      And they're bonkers. So what, say that real quick.

    6. TH

      Yeah. Uh, The Social Dilemma was seen by, uh, 38 million households in the first 28 days on Netflix, which I think has broken records. And if you assume, you know, a lot of people are seeing it with their family, because parents are seeing it with their kids, uh, given the issues around teen mental health. Uh, so y- if you assume one out of ten families saw it with a few family members, we're in the 40 to 50 million people range, which has just broken records, I think, for Netflix. I think it was the second most popular documentary throughout the month of September.

    7. JR

      I think-

    8. TH

      Or, of course, or film, throughout the month of September.

    9. JR

      ... it was a really well done documentary. But I think it's one of those documentaries that affirmed a lot of people's worst suspicions about the dangers of social media, and then on top of that, it sort of a-alerted them to what they were already experiencing in their own personal life and, like, highlighted it.

    10. TH

      Yeah, I think that's right. I mean, most people were aware. I think it's a thing everyone's been feeling, that the feeling you have when you use social media isn't that this thing is just a tool or it's on my side; it is an environment based on manipulation, as we say in the film. And that's really what's changed. You know, w- (sighs) I, I remember... you know, I've been working on these issues for something like eight, eight years or something now.

    11. JR

      Could you please tell people who didn't see the documentary-

    12. TH

      What it is, yeah.

    13. JR

      ... what, what your background is-

    14. TH

      Yeah.

    15. JR

      ... and what you, how you got into it?

    16. TH

      Yeah, so I, uh, the, you know, the, the film brings together a, a set of technology insiders. My, my background was as a design ethicist at Google. So I first had a startup company, uh, that we sold to Google, and I landed there through a talent acquisition. And then, um, uh, about a year into being at Google, uh, I made a presentation that was about how essentially technology was holding the human collective psyche in its hands, that we were really controlling the world's psychology. Uh, because every single time people look at their phone, they are basically experiencing thoughts and scrolling through feeds and believing things about the world, this has become the primary meaning-making machine for the world. And that we as Google had a moral responsibility to, uh, you know, hold the collective psyche in a thoughtful, ethical way and not create this sort of race to the bottom of the brain stem attention economy that we now have. Uh, so my background was, as a kid I was a magician. We can get into that. Uh, I, uh, studied at a lab at Stanford, uh, called, or studied in a class called the Stanford Persuasive Technology class, that taught a lot of the engineers in Silicon Valley kind of how the mind works, and the co-founders of Instagram were there. And, uh, then later studied behavioral economics and how the mind is sort of influenced. I went into cults and started studying how cults work, and then arrived at Google through this lens of, you know, technology isn't really just this thing that's in our hands. It's more like this manipulative environment that is tapping into our weaknesses, everything from the slot machine rewards to, you know, the way you get tagged in a photo and it sort of manipulates your social validation and approval, these kinds of things.

    17. JR

      When you were at Google, did they still have the "Don't be evil" sign up?

    18. TH

      Uh, I don't know if there was actually a physical sign. Was there one?

    19. JR

      There was never a physical sign? I thought there was something that they actually had.

    20. TH

      I think it was, there was this guy, was it Paul T- not Paul. What was his last name? He was the inventor, one of the inventors of Gmail, and they had a meeting and they came up with this mantra, 'cause they realized the power that they had and they realized that there was gonna be a conflict of interest between advertising on the search results and regular search results. And so we know that, they knew that they could abuse that power, and they came up with this mantra, I think, in that meeting in the early days: "Don't be, don't be evil."

    21. JR

      There was a time where they took that mantra down, and I remember reading about it online and, and I-

    22. TH

      They took it off their page, I think.

    23. JR

      That's what it was.

    24. TH

      Yeah.

    25. JR

      And, uh, when I read that, I was like, "That should be big news." Like, there's no reason to take that down. Why would you take that down? (laughs)

    26. TH

      Yeah.

    27. JR

      Why, why would you, why would you say, "Well, maybe it can be a little evil. Let's not get crazy."

    28. TH

      I- it's a good question. I mean, I wonder what logic would have you remove a statement like that. (laughs)

    29. JR

      That seems like a standard state- Like, it's a great statement. Okay, here it is. "Google removes 'Don't be evil' clause from its code of conduct."

    30. TH

      In 2018?

  2. 15:00–30:00


    1. TH

      of something where you could see all the things your friends were listening to. So just like making a news feed like we do with Facebook and Twitter, right? Um, and then what he said was, "Well, why would we do that? If something is important enough, your friend will actually just send you a link and say, 'You should listen to this.'"

    2. JR

      Mm-hmm.

    3. TH

      Like, why would we automatically just promote random things that your friends are listening to? And again, this is kind of how you get back to social media. How is social media so successful? Because it's so... It's much more addictive-

    4. JR

      Right.

    5. TH

      ... to see what your friends are doing in a feed, but it doesn't reward what's true or what's meaningful. And this, uh, and this is the thing that people need to get about social media: it's, it's really just rewarding the things that tend to keep people coming back addictively. The business model is addiction, in this race to the bottom of the brain stem for attention.

    6. JR

      Well, it seems like if we, in hindsight, if hindsight is 20/20, what, what should have been done, or what could have been done had we known where this would play out, is that they, they could have said, "You can't do that. You can't manipulate these algorithms to make sure that people pay more attention and manipulate them to ensure that people become deeply addicted-"

    7. TH

      Mm-hmm.

    8. JR

      "... to these platforms." What you can do is just let them openly communicate-

    9. TH

      Right.

    10. JR

      ... but it has to be organic.

    11. TH

      And then the problem is... So if you... This is the thing I was gonna say about Twitter is, um, when one company does the, call it the engagement feed, meaning showing you the things that the most people are clicking on and-

    12. JR

      Trending, yeah.

    13. TH

      ... retweeting, trending, things like that. Ver-... Let's imagine there's two feeds. So there's the feed that's called the reverse chronological feed, meaning showing in order in time, you know, "Joe Rogan posted this two hours ago," but that's, you know, after that you have the thing that people posted an hour and a half ago all the way up to 10 seconds ago. That's the reverse chronological. Um, they have a mode like that on Twitter. If you click the sparkle icon, I don't know if you know this, it'll show you just in time, here's what people said, you know, sorted by recency.

    14. JR

      Mm-hmm.

    15. TH

      But then they have this other feed, called what people click on, retweet, et cetera the most, from the people you follow. And it sorts it by what it thinks you'll click on and want the most. Which one of those is more successful at getting your attention, the sort of recency, what they posted recently, versus what they know people are clicking on and retweeting the most?
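
      [A minimal sketch, in Python, of the two orderings being contrasted here; the names (Post, predicted_engagement) are illustrative, not any platform's real API:]

      ```python
      from dataclasses import dataclass

      @dataclass
      class Post:
          author: str
          posted_at: float             # Unix timestamp of when it was posted
          predicted_engagement: float  # a model's guess at the clicks/retweets it will earn

      def reverse_chronological(feed: list[Post]) -> list[Post]:
          """The 'sparkle icon' mode: newest first, no ranking model involved."""
          return sorted(feed, key=lambda p: p.posted_at, reverse=True)

      def engagement_ranked(feed: list[Post]) -> list[Post]:
          """The default mode: whatever the model predicts you'll engage with most."""
          return sorted(feed, key=lambda p: p.predicted_engagement, reverse=True)
      ```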

    16. JR

      Certainly what they know people are clicking on and retweeting the most.

    17. TH

      Correct. And so once Twitter does that, let's say Facebook was sitting there with the recency feed, like, just showing you here's the people who posted in this time order sequence.

    18. JR

      Mm-hmm.

    19. TH

      They have to also switch to who is... Like, the most relevant stuff, right? The most, uh, clicked, retweeted the most. So this is part of this race for attention that once one actor does something like that and they algorithmically, you know, figure out what peop- what's most popular, the other companies have to follow, because otherwise they won't get the attention. So it's the same thing if, you know, Netflix adds the autoplay five, four, three, two, one countdown to get people to watch the next episode. That, if that works at, say, increasing Netflix's watch time by 5%, YouTube sits there, says, "We just shrunk (laughs) how much time people were watching YouTube 'cause now they're watching more Netflix, so we're gonna add five, four, three, two, one autoplay countdown." And it becomes, again, this game theoretic race of who's gonna do more. Now if you open up TikTok... TikTok doesn't even wait... I don't know if you know or if your kids use TikTok, but when you open up the app, it doesn't even, um, wait for you to click on something. It just actually plays the first video the second you open it, which none of the other apps do, right? And the, the point of that is that causes you to enter into this engagement stream even faster. So this is this... again, this race for attention produces things that are not good for society. And even if you took the Whac-A-Mole stick or you took the antitrust case and you whacked Facebook and you got rid of Facebook, or you whack Google or you whack YouTube, you're just gonna have more actors flooding in doing the same thing. And one other example of this is, um, uh, the time it takes to reach, let's say, 10 million followers. So if you remember back in the Ashton... Wasn't it Ashton Kutcher-

    20. JR

      Mm-hmm.

    21. TH

      ... who raced for the first million followers?

    22. JR

      Raced with CNN.

    23. TH

      Raced with CNN, right?

    24. JR

      Yeah.

    25. TH

      So now, if you think of it, the companies are competing for our attention, if they find out that, um, each of us becoming a celebrity and having a million people we get to reach, if that's the currency of the thing that gets us to come back to get more attention, then they're competing at who can give us that bigger fame lottery hit faster. So let's say 2009 or 2010 when Ashton Kutcher did that, it took him... I don't know how long it took. Months to, for him to get a million?

    26. JR

      I don't, I don't remember.

    27. TH

      It was a, it was a little bit though, right? Um, and then TikTok comes along and says, "Hey, we want to give kids the ability to hit the fame lottery and make it big, hit the jackpot even faster. We want you to be able to go from zero to a million followers in 10 days," right? And so they're competing to make that shorter and shorter and shorter. And I, I know about this 'cause, you know, speaking from a Silicon Valley perspective, uh, venture capitalists fund these new social platforms based on how fast they can get to, like, 100 million users. There was this famous line that, like, I forgot what it was, but I think Facebook took, like, 10 years to get to 100 million users. Instagram took, you know, I don't know, four years, three years or something like that. TikTok can get there even faster. And so it's shortening, shortening, shortening. And that's what people are, are... that's what we're competing for. It's like who can win the fame lottery faster. But in a world where everyone broadcasts to millions of people without the responsibilities of publishers, journalists, et cetera, does that produce an information environment that's, that's healthy? And obviously the, the film, The Social Dilemma, is really about how it makes the worst of us rise to the top, right? So our, our hate, our outrage, our polarization, um, what we disagree about, black and white thinking, more conspiracy-oriented views of the world, QAnon, you know, Facebook groups, things like that. And I, I can, we can definitely go into... there's a lot of legitimate conspiracy theories, so I want to make sure I'm not categorically dismissing stuff. Um, but that's really the point, is that we have landed in a world where the things that we are paying attention to are not necessarily the agenda of topics that we would say, in a reflective world, is the most, most important.
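
      [The Netflix/YouTube autoplay story above is a game-theoretic race to the bottom. A toy payoff matrix, with purely illustrative numbers, shows why adding the engagement feature is the dominant move for either platform no matter what the other does:]

      ```python
      # Attention share for platforms A and B, by whether each adds autoplay.
      # The numbers are made up; only their ordering matters.
      attention_share = {
          # (a_adds, b_adds): (a_share, b_share)
          (False, False): (0.50, 0.50),
          (True,  False): (0.60, 0.40),
          (False, True):  (0.40, 0.60),
          (True,  True):  (0.50, 0.50),  # parity again, but with more total hours extracted
      }

      def a_should_add(b_adds: bool) -> bool:
          """A's best response to B: does adding autoplay raise A's share?"""
          return attention_share[(True, b_adds)][0] > attention_share[(False, b_adds)][0]

      # Adding dominates either way, so both platforms end up adding it.
      print(a_should_add(False), a_should_add(True))  # True True
      ```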

    28. JR

      So there's a lot of... there's a lot of conversation about free will and about letting people choose whatever they ch- wha- wha- whatever they enjoy viewing and watching and paying attention to. But when you're talking about these incredibly potent algorithms and the incredibly potent, uh, addictions that people... that, that people develop to these, these things, and we're, we're pretending that people should have the ability to just ignore it and put it away.

    29. TH

      Right.

    30. JR

      And-

  3. 30:00–45:00


    1. TH

      Cronkite or you could start in Crazy Town, but if I'm YouTube and I want you to watch more, am I gonna steer you towards the calm stuff or am I gonna steer you more towards Crazy Town?

    2. JR

      Crazy Town.

    3. TH

      ... always more towards Crazy Town. So then you imagine just tilting the floor of humanity, just by like three degrees, right? And then you just w- step back and you let society run its course. As Jaron Lanier says in the film, "If you just tilt society by one degree, two degrees, that's the whole world." That's the, that's what everyone is thinking and believing. And so if you look at the, at the degree to which people are deep into rabbit hole conspiracy thinking right now ... And again, I wanna acknowledge COINTELPRO, Operation Mockingbird. We ... Like, there's a, a lot of real stuff, right?

    4. JR

      Right.

    5. TH

      So I'm not categorically dismissing it, but we're asking, what is the basis upon which we're believing the things we are about the world?

    6. JR

      (sighs)

    7. TH

      And increasingly, that's, that's based on technology. And we can get into, you know, what's going on in Portland. Well, the only way I know that is I'm looking at my social media feed, and according to that, it looks like the entire city's on fire and it's a war zone. But if you ... I called a friend there the other day, and he said, "It's a beautiful day. There's, there's actually no violence anywhere near where I am. It's just like these two blocks or something like that."

    8. JR

      Right.

    9. TH

      And, and this is the thing, it's warping our view of reality. And, and I think that's what really, for me, The Social Dilemma was really trying to accomplish as a film, and, uh, you know, what the director, Jeff Orlowski, was trying to accomplish, is, is: how did this society get, go crazy everywhere all at once, you know, seemingly? You know, it was... This didn't happen by accident. It happened by design of this business model.

    10. JR

      When did the business model get implemented? Like when did they start using these algorithms to recommend things? 'Cause initially YouTube was just a series of videos and it didn't have that recommended-

    11. TH

      Correct.

    12. JR

      ... section. When was that?

    13. TH

      You know, it's a good question. I mean, I, um ... You know, they ... Originally YouTube was just post a video and you can get people to, you know-

    14. JR

      Right.

    15. TH

      ... uh, go to that URL and send it around. Uh, they needed to figure out... O- once the competition for attention got more intense, they needed to, to figure out, how am I gonna keep you there? And so recommending those videos on the right-hand side, I think that was there pretty early if I remember, actually. Um, because that's, that was some of the innovation, is like keeping people within this YouTube wormhole. And once people were in the YouTube wormhole constantly seeing videos, that was what they could, they could offer, the promise to a new video uploader: "Hey, if you post it here, you're gonna get way more views than if you post it on Vimeo." Right? And that's, that's the thing. If I open up TikTok right now on my phone-

    16. JR

      Do you have TikTok on your phone?

    17. TH

      Um, (laughs) well, I'm not supposed to, obviously, but more for research purposes. Um-

    18. JR

      Ah, research.

    19. TH

      (laughs)

    20. JR

      Do you-

    21. TH

      Um-

    22. JR

      ... know how to TikTok at all?

    23. TH

      No, I don't. I-

    24. JR

      Okay. My 12-year-old's obsessed.

    25. TH

      Oh, really?

    26. JR

      Oh, yeah. She can't even sit around. If she's standing still for f- five minutes, she just starts like ...

    27. TH

      (laughs)

    28. JR

      She starts TikTok-ing.

    29. TH

      And that's the thing. I mean they have-

    30. JR

      2012.

  4. 45:00–1:00:00


    1. TH

      We've got melting glaciers. We've got, uh, ocean acidification. We've got the coral reefs, you know, uh, getting... dying." These can feel like disconnected things until you have a unified model of how emissions change all those different phenomena. Right? In the social fabric, we have shortening of attention spans. We have more outrage-driven news media. We have more polarization. Um, we have more breakdown of truth. We have more conspiracy-minded thinking. These seem like separate events, uh, and separate phenomena, but they're actually all part of this attention extraction paradigm: that the companies' growth, as you said, depends on extracting more of our attention, which means more polarization, more extreme material, more conspiracy thinking, and shortening attention spans. 'Cause it, we, we also say, like, you know, "If we want to double the size of the attention economy, I want your attention, Joe, to be split into two separate streams. Like, I want you watching the TV, uh, the tablet, and the phone at the same time, because now I've tripled the size of the amount of extractable attention that I can get for advertisers, which means that by fracking for attention and splitting you into more junk," (laughs) you know, "attention that's, like, thinner, we can sell that as if it's real attention." It's like the financial crisis, where you're selling thinner and thinner financial assets as if they're real, but it's really just a junk asset.

    2. JR

      Oh, wow. (laughs)

    3. TH

      And that's kind of where we are now, where it's sort of the junk attention economy-

    4. JR

      Right.

    5. TH

      ... because we, we can shorten attention spans and we're debasing the substrate that, that makes up our society. 'Cause everything in a democracy depends on individual sense-making and meaningful choice, meaningful free will, meaningful independent views. But if that's all basically sold to the highest bidder, that debases the soil from which independent views grow, because all of us are jacked into this sort of matrix of social media manipulation. That's, that's ruining and degrading our democracy. And that's really, uh, there's many other things that are ruining and degrading our democracy, but that's, that's a sort of invisible force that's upstream that affects every other thing downstream, because if we can't agree on what's true, for example, you can't solve any problem. I think that's what you talked about in your ten-minute thing on The Social Dilemma that I saw on YouTube.

    6. JR

      Yeah. Um, do, your organization highlights all these issues in, you know, in an amazing way, and it's very important. But do you have any solutions?

    7. TH

      (sighs) It, it's hard. Right?

    8. JR

      Yeah.

    9. TH

      So I just wanna say that this is as, as complex a problem as climate change, um, in the sense that you need to change the business mod-... I think of it like we're on the fossil fuel economy and we have to switch to some kind of beyond that thing. Right? Because so long as the business models of these companies depend on extracting attention, can you expect them to do something different? Like-

    10. JR

      You can't, but how could you, is it, I mean, there's so much money involved.

    11. TH

      Correct.

    12. JR

      And now, they've accumulated so much wealth that they have an amazing amount of influence.

    13. TH

      Yeah.

    14. JR

      You know? And-

    15. TH

      And the asymmetric influence can buy lobbyists, can influence-

    16. JR

      Yes. Yeah.

    17. TH

      ... Congress and prevent things from happening. So this is why it's kind of the last moment-

    18. JR

      And they're protecting themselves.

    19. TH

      That's right. But, you know, I think we're seeing signs of real change. We had the antitrust case that was just filed against Google, um, in Congress. We're seeing more hearings that are gonna happen than ever-

    20. JR

      What was the basis of that case?

    21. TH

      You know, to be honest, I, uh, was actually (laughs) in the middle of, uh, The Social Dilemma launch when I think that happened and our f- our, my home burned down in the recent fires in Santa Rosa-

    22. JR

      Oh, no.

    23. TH

      So I actually missed, uh, that happening.

    24. JR

      Sorry to hear that.

    25. TH

      Yeah, sorry, that was a big thing to drop. But yeah, no-

    26. JR

      Yeah.

    27. TH

      ... it's, it's awful. Uh, there's so much that's been happening in the last six weeks that I've-

    28. JR

      I've been, uh, I was evacuated three times where I lived in California.

    29. TH

      Oh, really?

    30. JR

      Yeah. So w- we... It cut real close to our house. "Justice Department sues monopolist Google for violating antitrust laws. Department files complaint against Google to restore competition in search and search advertising markets." Okay, so it's all about search.

  5. 1:00:00–1:10:00


    1. JR

      2002 or '03? Like what is-

    2. TH

      2004.

    3. JR

      2004. S- wh- this is such a short timeline to be having these massive worldwide implications from the use of these things. W- when you look at the future, do you look at this like a runaway train that's headed towards a cliff?

    4. TH

      Yeah, I mean, I think right now this thing is a Frankenstein that... It, it's not like... Even if Facebook was aware of all these problems, they don't have the staff unless they hired like hundreds of, you know, tens- hundreds of thousands of people definitely, minimum, to try to address all these problems. But the paradox we're in is that the very premise of these services is to rely on automation. Like, it used to be, we had editors and journalists, or at least editors or, you know, people who edited even what went on television, saying, "What is credible? What is true?" Like, you know, you sat here with, you know, with Alex Jones even yesterday and you're trying to check him on everything he's saying, right? You're researching and trying to look that stuff up, you're trying to be doing some more responsible, uh, communication. The, the premise of these systems is that you don't do that. Like the reason venture capitalists find social media so, um, uh, profitable and such a good investment is because we generate the content for free. We are the useful idiots, right?

    5. JR

      Hmm.

    6. TH

      Instead of paying a journalist $70,000 a year to write something credible, we can each be convinced to share our political views and we'll do it knowingly for free. Actually, we don't really know that we're the useful idiots, that's the, kind of the point. And then instead of paying an editor $100,000 a year to figure out which of those things is true that we want to promote and give exponential reach to, you have an algorithm that says, "Hey, what do people click on the most? What do people like the most?" And then you realize the quality of the signals that are going into the information environment that we're all sharing is a totally different process. We went from a high quality, gated process that cost a lot of money, to this, um, really crappy process that costs no money, which makes the companies so profitable. And then we fight back for territory, for, for values, when we raise our hands and say, "Hey, there's a thinspiration video problem for teenagers and anorexia. Hey, there's a mass conspiracy sort of echo chamber problem over here. Hey, there's, um, you know, flat earth sort of issues." And again, the- these get into tricky topics because we want to... You know, I, I know we both believe in free speech, and we have this feeling that, um, the solution to bad speech is better, you know, more speech that counters the things that are said. But in a finite attention economy, we don't have the capacity for everyone who gets bad speech to just have a counter response.

    7. JR

      Right.

    8. TH

      In fact, what happens right now is that that bad speech rabbit holes into, not only what I call worse and worse speech, but more extreme versions of that view that confirm it. Because once Facebook knows that that flat earth rabbit hole is good for you at getting your attention back, it wants to give you just more and more of that.
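
      [A toy feedback loop sketching the dynamic described here: a recommender that optimizes only for watch time keeps serving more of whichever topic held your attention, and nothing in its objective rewards surfacing the 20 people who disagree. All names and numbers are illustrative, not any platform's real code:]

      ```python
      import random
      from collections import defaultdict

      watch_time = defaultdict(float)  # topic -> total seconds the user has watched

      def recommend(topics: list[str]) -> str:
          # Explore occasionally (and at the start); otherwise exploit the topic
          # with the highest observed watch time -- nothing rewards counterpoints.
          if random.random() < 0.1 or not any(watch_time.values()):
              return random.choice(topics)
          return max(topics, key=lambda t: watch_time[t])

      def simulate(user_bias: dict[str, float], steps: int = 200) -> None:
          topics = list(user_bias)
          for _ in range(steps):
              topic = recommend(topics)
              watch_time[topic] += user_bias[topic]  # user lingers on what they're drawn to
          print(dict(watch_time))  # the early leader ends up dominating recommendations

      # Whichever topic happens to capture attention early gets locked in: the
      # objective has a term for watch time, and none for accuracy or balance.
      simulate({"flat_earth": 12.0, "debunking": 8.0})
      ```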

    9. JR

      Mm-hmm.

    10. TH

      It doesn't want to say, "Here's 20 people who disagree with that thing."

    11. JR

      Right.

    12. TH

      Right? So I think if you were to imagine a, a different system, we would ask, "Who are the thinkers that are most open-minded and synthesis-oriented, where they can actually steel man the other side?" Where they can actually say, you know, for this speech, here is the opposite counterargument. They can show that they understand that. And imagine those people get lifted up. But notice that none of those people that you and I know, I mean we're both friends with Eric Weinstein, and, you know, I think he's one of these guys who's really good at sort of offering the steel manning, "Here's the other side of this."

    13. JR

      Yeah.

    14. TH

      "Here's the other side of that." But the people who generally do that aren't the ones who get the tens of millions of followers on these surfaces.

    15. JR

      Right.

    16. TH

      It's the black and white, extreme, outrage-oriented thinkers and, and speakers that get rewarded in this attention economy. And so if you look at how, if I zoom way out and say, "How is the entire system behaving?" Just like if I zoom out and say climate, you know, the climate system, like, how is the entire overall system behaving? It's not producing the kind of information environment on which democracy can survive.

    17. JR

      Jesus. The, the thing that troubles me the most is that I c- clearly see your thinking and I agree with you. Like I don't see any holes in what you're saying.

    18. TH

      Mm-hmm.

    19. JR

      Like I, I don't know how this plays out, but it doesn't look good. And I don't see a solution. It's like if there are 1000 bison running full steam towards a cliff and they don't realize the cliff is there, I don't see how you pull them back.

    20. TH

      So, I think of it like we're trapped in a body, um, that's eating itself. So like it's kind of a cannibalism economy, because our, our economic growth right now with these tech companies is based on eating our own organs. So we're eating our own mental health organs, we're eating the health of our children, we're eating the... Sorry for being so gnarly about it, but it's, it's a cannibalistic system. In a system that's hurting itself or eating itself or punching itself, if one of the neurons wakes up in the body, it's not enough to change that. It's going to keep punching itself. But if enough of the neurons wake up and say, "This is stupid. Why would we build our system this way?" And the reason I'm so excited about the film is that if you have 40 to 50 million people who now recognize that we're living in this sort of cannibalist system, in which the economic incentive is to debase the life support systems of your democracy, we can all wake up and say, "That's stupid. Let's do something differently. Let's actually change the system. Let's use different platforms, let's fund different platforms, let's regulate and tame the existing Frankensteins." And I don't mean regulating speech, I mean really thoughtfully, how do we change the incentives?

    21. JR

      Yeah.

    22. TH

      So it doesn't go to the same race to the bottom. And we have to all recognize that we're now 10 years into this hypnosis experiment of warping of the mind. And, like, you know, the question for this movement is, it's like, how do we snap our fingers and get people to say, "That, that's artificial"? There's an inflated level of polarization and hatred right now, and especially going into this election, I think we all need to be much more cautious about what's running in our brains right now.

    23. JR

      Yeah, I don't think most people are generally aware of what's causing this polarization. I think they think it's the climate of society, because of the president and because of, uh, Black Lives Matter and the, the George Floyd protests and all this jazz. But I don't think they understand that that's exacerbated f- in a fantastic way by social media and the last 10 years of our addictions to social media and these echo chambers that we all exist in.

    24. TH

      Yeah. So I want to make sure that we're both clear, and I know you agree with this, that, um, these things were already in society to some degree, right?

    25. JR

      Yes.

    26. TH

      So we want to make sure we're not saying social media is to blame for all of it. Absolutely not.

    27. JR

      No. No.

    28. TH

      Not, no, no, no. In fact-

    29. JR

      It's gasoline.

    30. TH

      It's gasoline, right? Exactly. It's, it's lighter fluid for sparks of polarization.

Episode duration: 2:21:32
