The Joe Rogan Experience

Joe Rogan Experience #1736 - Tristan Harris & Daniel Schmachtenberger

Tristan Harris is a former Google design ethicist, co-founder and president of the Center for Humane Technology, and co-host of the Center for Humane Technology’s "Your Undivided Attention" podcast with Aza Raskin. Daniel Schmachtenberger is a founding member of The Consilience Project, aimed at improving public sensemaking and dialogue.

Joe Rogan (host) · Tristan Harris (guest) · Daniel Schmachtenberger (guest)
Jun 27, 2024 · 3h 1m · Watch on YouTube ↗

EVERY SPOKEN WORD

  1. 0:00–1:12

    Intro

    1. NA

      (drumbeats) Joe Rogan podcast, check it out.

    2. The Joe Rogan Experience.

    3. JR

      Train by day, Joe Rogan podcast by night, all day. (instrumental music plays) Gentlemen, thank you for being here. Why don't you let's in- let's, I, I keep doing these podcasts where I just talk to people, so please introduce yourself and tell people what you do.

    4. TH

      (smacks lips) Uh, I am Tristan Harris and, uh, came on this show about a year ago after The Social Dilemma came out. That's probably where most people know me. Um, and, uh, used to be a design ethicist at Google studying how do you ethically influence people's attention and thoughts and behaviors. Uh, and, uh, really enjoyed the conversation last year. The reason that today I'm here with, um, Daniel Schmachtenberger who's really, uh, a person I've learned so much from the last few years and why I thought it'd be a good through line, um, is that the issues of social media, which I know we're gonna talk about today, are connected to a number of other issues that are going wrong in society that are all kind of interconnected. And I've learned a tremendous amount from Daniel, and I thought I'd help really clarify some of these, these issues for everyone.

    5. JR

      Well, thank you, Daniel. Thanks for coming aboard.

    6. DS

      Thanks for having me here.

    7. JR

      W- what a daunting task, how to ethically influence people.

    8. TH

      Hmm.

    9. JR

      And what a

  2. 1:12–2:44

    How to ethically influence people

    1. JR

      weird thing that this industry that didn't exist 20 years ago has such a... I mean, think about life on Earth.

    2. TH

      Mm-hmm.

    3. JR

      And then 20 years ago, all of a sudden, this social media thing sort of e- evolves, and now you have to wonder how much of an effect it has on our just day-to-day lives and how to ethically influence people.

    4. TH

      Yeah.

    5. JR

      What does that... what the fuck does that even mean?

    6. TH

      (laughs) Well, first of all, I should say-

    7. JR

      How does that... how do those thoughts even get, you know... how, how does that get worked out?

    8. TH

      Actually, I should first say that there wasn't, at Google, a department that said, "How do we ethically influence people?" I actually sort of, um, as was shown in that... in the film, The Social Dilemma, wrote this presentation worried about how technology was influencing-

    9. JR

      Right.

    10. TH

      ... people's thoughts, concerns, behaviors, et cetera. And I studied persuasive technology at Stanford, which is a whole discipline and field, the idea that technology can influence people. And it was out of my own personal concern that when that presentation went viral at Google, um, I, uh, kind of worked my way into this position that never existed before, which was, h- how could we create a framework for what it means to ethically influence other people? And a lot of that has to do with asymmetries of power. I mean, when I was a kid, I was a magician. We talked about this before. Um, magic is about an asymmetric relationship. The magician knows something about your mind that you don't know about your own mind. That's what makes the trick work. And actually, across some of these things we're going to talk about today are ways that there is an asymmetric relationship between what technology knows about us and what we don't know about ourselves.

  3. 2:44–3:54

    The arms race

    1. TH

      That's-

    2. JR

      When, when you were studying at Stanford-

    3. TH

      Yeah.

    4. JR

      ... what year was this?

    5. TH

      This was, um, 2002 to 2006. I was an undergrad. And then 2006, I got involved with, uh, Professor B.J. Fogg, uh, who, again, actually studied ways that persuasive technology could be used for positive purpose. Like, how do you help people be healthier? How do you help people floss? How do you help people work out more often? Things like that. It could be used in a positive way.

    6. JR

      Right.

    7. TH

      But I got concerned, uh, because it was all of this increasing arms race to use persuasive tools to harvest and capture people's attention, now known as the race to the bottom of the brainstem, to go down the brainstem into more social validation, more social narcissism, all of that. And that's one of the arms races we see everywhere, which is like in every single thing. If one oil company doesn't drill for that oil well, the other one will. If one, uh, attention company doesn't add the beautification filter, the other one will.

    8. JR

      Hmm.

    9. TH

      If one company doesn't do narcissism, social validation hacking, and li- and likes and variable rewards, the other one will. And it's true across so many of the other issues that we're facing, whether it's like, if I don't build the drone for everyone, then someone else is gonna build the drone for everyone. If I don't... So that's how I think

  4. 3:54–5:12

    Social media before the iPhone

    1. TH

      these things are connected.

    2. JR

      Did you realize it back then? I mean, in 2002 to 2006, we're talking about a completely different world in terms of social media influence.

    3. TH

      Totally. It's before the iPhone, actually.

    4. JR

      Yeah.

    5. TH

      Yeah.

    6. JR

      2007, right?

    7. TH

      iPhone came out in 2007. We were studying persuasive technology, and I was, as I've said in the past, uh, partners with the co-founder of Instagram in the persuasive technology class. So we were actually studying how would you apply persuasive technology to people before the iPhone even existed. And, um, you know, for... You know, what bothered me is that I think when people think about how do you ethically persuade people, you just get into a whole bunch of ethical copouts. Like, "Well, we're just giving people what they want." You know? Or, "If they don't want this, they'll, they'll use something else." There's these very simple ways that the minds of people in technology, the tech industry, I think defend what they're doing. And what concerned me was that the ethical framework wasn't really there. Not that I had one at the time, by the way. I studied at Google-

    8. JR

      Right.

    9. TH

      ... for three years to try to develop, like, what does it mean to ethically influence three billion people who are jacked into the system? And this is before Cambridge Analytica, before the Facebook files and Frances Haugen talking about the, you know... We now have the receipts for all these things, so we talked about all these things in The Social Dilemma, but now there's the evidence with Frances Haugen's whistleblowing that, um, you know, Instagram makes, uh, body image issues worse for one in three teenage girls. Uh, I know I'm going fast, but that's the, the broad

  5. 5:12–6:53

    The tinfoil hat theory

    1. TH

      strokes.

    2. JR

      Do you know the conspiracy theory about her?

    3. TH

      (smacks lips) Tell me.

    4. JR

      The, the conspiracy theory amongst, uh, the tinfoil hat folk is, uh, first of all, she was, uh... she started a, uh, Twitter account, like, right before she went there and was immediately verified.

    5. TH

      Right.

    6. JR

      And then instantaneously was on all these major, uh, media outlets, major network television shows, and being interviewed. And p- and she was saying something that a lot of people felt like was a call to authoritarian intervention into social media, that it was government censorship, uh, was the solution and regulation was the solution to dealing with this problem, and that it seemed like she was a sanctioned whistleblower.

    7. TH

      Mm-hmm.

    8. JR

      Like, they... she was saying all the things that they wanted to hear, and that's why they put her in the position to make, uh, a big loud noise.

    9. TH

      What, what did you think about that when it came up? Just curious.

    10. JR

      I always have-... you know, when something like that happens, like, "Hmm... Maybe. Maybe." 'Cause you know the government would do that.

    11. TH

      Mm-hmm.

    12. JR

      Like, most certainly, they would love to have control over social media. They would love to be able to censor things like the Hunter Biden laptop story. They would love to be able to, you know, hide Joe Biden's medical records or, you know, Kamala Harris's, uh, uh, time as a prosecuting attorney. Like, there's a lot of stuff they would like to do.

    13. TH

      Yeah.

    14. JR

      Or district attorney, rather. There, there's a lot of stuff they would like to do, uh, with, uh, access to information. I mean, you're seeing it right now in terms of... One of the things that's been fascinating about COVID is, uh, during this pandemic and during this, uh, terrible time of, uh, you know, paranoia

  6. 6:53–8:01

    Social media censorship

    1. JR

      and dealing with this disease, and, and fear and anxiety, you're seeing, uh, this narrative from social media networks that absolutely walk step in step with the government, where if the government wants certain information censored, it's being censored across major social media platforms.

    2. TH

      Yep.

    3. JR

      That has to be coordinated. There's no way it's not. And there's no way they're incentivized to not have people discuss certain things, because we've said before, uh, you know, it's one of the major, uh, points of, um, The Social Dilemma is that things that are controversial, whether they're true or not, are the things that are the most clicked on, the most shared, the most... And that's where-

    4. TH

      Go viral, yeah.

    5. JR

      That's where the money is.

    6. TH

      Yep.

    7. JR

      So there's got to be some sort of incentive for them to not do what they do with every other subject, whether it's immigration or gun control or abortion or anything. Yet the algorithms favor-

    8. TH

      Do they censor on imm- immigration? Or you're saying that as an example if something goes viral?

    9. JR

      As an example if something goes viral, yes.

    10. TH

      Yeah, yeah, yeah. Not the censorship. Yeah, yeah, yeah.

    11. JR

      Right, they don't censor on

  7. 8:01–9:09

    The border crisis

    1. JR

      immigration.

    2. TH

      No.

    3. JR

      I mean, the, the, I mean, the border crisis is a great example of that.

    4. TH

      Right.

    5. JR

      Like the government probably likes us to not see all those Haitian immigrants storming across the border, but my God, those were shared like crazy.

    6. TH

      Totally.

    7. JR

      You know? So why was COVID information shared? Well, because there was a narrative they could say, "Well, we're... This is dangerous misinformation and we could protect people," even though some of it turned out to actually be accurate-

    8. TH

      Right.

    9. JR

      ... like the lab leak hypothesis. It's-

    10. TH

      Or at least that, at least that it's a hypothesis

    11. NA

      (laughs)

    12. JR

      ... that should be considered, yeah.

    13. TH

      Yeah, yeah. Totally.

    14. JR

      It's a hypothesis, it's... Yeah, that, that at least is being considered by virologists.

    15. TH

      Right.

    16. JR

      But the point is that who the fuck are they to decide what can and can't be discussed? And when they're doing something step in step with the government, I get concerned.

    17. TH

      Totally.

    18. JR

      So when someone comes along and this person who's a whistleblower says, "Something needs to be done, you know, we're endangering young girls' lives. We're doing this, we're doing that, we need some sort of government infr- intervention." I mean, this is, uh, essentially calling for censorship and calling for government control of social media, which freaks people out.

    19. TH

      So,

  8. 9:09–11:05

    Is it going to go viral

    1. TH

      so she's pretty clear that she's not calling for censorship. But it's... So the reason I asked you was I was curious what... How it came across your radar, 'cause I, I happen to, to know and hear a little bit about this from her. We interviewed her on our, on our podcast. Um, and, um, the story that goes viral about her saying that she's a PSYOP-

    2. JR

      Yeah.

    3. TH

      ... or that she's a plant, that's an incendiary, inflammatory, controversial story. So when that gets suggested, is it gonna go... Is it just gonna fizzle out or is it gonna go viral?

    4. JR

      How ironic.

    5. TH

      It's gonna, it's gonna go viral, exactly. And in fact, the... When you kind of realize like everything that... I mean, there's, there's some things that are real conspiracy theories and there's some things that are real PSYOPs and there's, there's... That's a real thing. But there... Notice how many things we think of as PSYOPs, conspiracies, et cetera now, and it's because anything that has that incendiary quality go- goes, goes viral.

    6. JR

      Right.

    7. TH

      And I happen to know, for example, I think one of the things, the claims in there is that she's funded by this billionaire, Pierre Omidyar. But I happen to know from talking to her that that happened at the very, very end of what she was doing, and it was a tiny grant of like $150,000. For us in the nonprofit world, that's like a tiny amount of money.

    8. JR

      Right.

    9. TH

      Basically just to support her flight costs. And I happened to also sort of hear from her how like, uh, how much of the media was constructed at the very last minute. Like, she was working this one, um, newspaper, the Wall Street Journal, to do this sort of procedural rollout of, of specific stuff that she thought was concerning. I guess what I'll just say is like what if she's just a good faith person-

    10. JR

      Right.

    11. TH

      ... who saw that virality was driving people crazy and that it was, it was harmful to teenage girls? And it's true that the, um, the government would see some of that and say, "Hey, we could use that for, for something else. We could use that. She could be a, a tool for us to do something else." But I, I guess what... You know, and in the aim of complexity and nuance and not jumping to conclusions in this sort of thing, um, m- my perception from, from talking to her now extensively, she's a very good faith actor who was concerned that this was gonna drive the world apart.

  9. 11:05–12:14

    Algorithms

    2. JR

      I should be really clear-

    3. TH

      Yeah.

    4. JR

      ... that this is not my position-

    5. TH

      Yeah, yeah.

    6. JR

      ... that she's... This is just the conspiracy theory-

    7. TH

      Totally.

    8. JR

      I, I literally don't have an opinion on her.

    9. TH

      Yeah.

    10. JR

      Uh, I do have an opinion on algorithms and I do have an opinion on what it does do to young girls' self-esteem, and I, I, but-

    11. TH

      Totally. You have teenage daughters, right? Yeah.

    12. JR

      Yes. I, I just think... It... And, and young girls are a point of focus b- uh, for... Why they're a point of focus more than young boys, I'm not entirely sure. I guess it has to do with their emotional makeup and, and, uh, there's higher risk of self-harm due to social media. And Jonathan Haidt talked about that in his book, The Coddling of the American Mind. Um, it's, it's very clear that it's very damaging.

    13. TH

      Yeah.

    14. JR

      And I... My kids, uh, you know, uh, my 13-year-old does have like interactions with her friends and sh- I do see how they bully each other and talk shit about each other and it's, it's-

    15. TH

      Yeah.

    16. JR

      They get so angry and mad at each other. Um, it is a factor, but it's an al- it's an algorithm issue, right?

    17. TH

      It... It... There's multiple things here. So the first thing is, um, just to kind of set the stage a little bit, uh, I always use E.O. Wilson, the sociobiologist to... Who, who, uh-... sort of defined what, what the problem statement for

  10. 12:14–14:37

    The Fundamental Problem

    1. TH

      humanity is. He said, "The fundamental problem with humanity is we have Paleolithic emotions and brains," like easy brains that are hackable for magicians. "We have medieval institutions," you know, government that's not really good at seeing the latest tech, whether it was railroads or now social media, or AI, or deep fakes, or whatever's coming next. And then we have godlike technology. So we have Paleolithic emotions, medieval institutions, godlike technology.

    2. JR

      Mm.

    3. TH

      You combine that fact, that's the fundamental problem statement. How do we wield the power of gods without the love, prudence, and wisdom of gods? Which is actually something that Daniel taught me. And, um, then you add to that the race to the bottom of the brainstem for attention. What is their business model? Just to review the basics, everybody knows this now, but it's, it's engagement, it's like, "How do I get that attention at all costs?"

    4. JR

      Yeah.

    5. TH

      So, algorithms is one piece of that, meaning, um, when you're on a news feed, like, I don't want to just show you any news. I want to show you the most viral, engaging, like, longest argumentative comment threads news, right? So that's like pointing a trillion dollar market cap AI at your brain, saying, "I'm going to show you the next perfect boogeyman for your nervous system." The thing that's going to make you upset, angry, whether it's masks, vaccines, Francis Haugen, (laughs) whatever the thing is, it will just drive that over and over again, and then repeat that thing. And that's one of the, the tools in the arsenal to get attention, is that the algorithms. Another one is technology making design decisions, like how do we inflate people's sense of, uh, beautification filters? In fact, just recently, since we talked last time, um, I think it's a MIT tech review article showing that, um, they're all com- they're all competing, first of all, to, like, inflate your sense of beauty, so they're doing the, the, I think you did this thing.
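
To make the objective concrete: a minimal sketch, in Python, of the kind of engagement-driven ranking being described here. The feature names and weights are invented for illustration; no platform's actual ranking code is being quoted.

```python
# Minimal sketch of an engagement-optimized feed ranker, per the discussion
# above. Feature names and weights are invented assumptions.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    p_click: float    # model-predicted probability of a click
    p_comment: float  # long argumentative threads score high here
    p_share: float    # virality proxy

def engagement_score(post: Post) -> float:
    # Rank purely by expected engagement; nothing in this objective asks
    # whether a post is true, calming, or good for the viewer.
    return 1.0 * post.p_click + 3.0 * post.p_comment + 2.0 * post.p_share

def rank_feed(posts: list[Post]) -> list[Post]:
    # The feed is just posts sorted by predicted engagement, descending.
    return sorted(posts, key=engagement_score, reverse=True)
```

Under an objective like this, incendiary posts float to the top not because anyone writes "show outrage," but because whatever maximizes the predicted engagement terms wins the sort.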

    6. JR

      The filters.

    7. TH

      The filters, right?

    8. JR

      Yeah.

    9. TH

      People know this stuff, it's very obvious, but they're competing for who can give you a nicer filter, right? And then now, instead of waiting for you to actually add one, TikTok was actually found to actually do like a 2%, like, just bare beautification filter on the no filter mode. Because the thing is once they do that, the other guys have to do it too.

    10. JR

      Oh.

    11. TH

      So, I just want to name that all of this is taking place in this race to capture human attention, because if I don't do it, the other guy will. And then, it's happening with design decisions, like the beautification filters, and, like, the follow you, and if you follow me, I'll follow you back, and the like button, and check pull to refresh, the dopamine stuff.

    12. JR

      Mm-hmm.

    13. TH

      That's all design. Then there's the algorithms, which is I'm pointing a, a thing at your brain to figure out w- what... How can I show you an infinite feed that just maximally enrages you? And we should talk about that, because that thing drives polarization, which breaks democracy, but that's a, that's a... We can get into

  11. 14:37–17:23

    How Did You Meet

    1. TH

      that.

    2. JR

      Oh, Br- Daniel, let's bring you in here. So, how did you guys, uh, meet and how did this, uh, sort of dynamic duo come about? (laughs)

    3. DS

      Yeah, I was working on studying kind of catastrophic risks writ large. You've had people on the show talking about risks associated with AI and with CRISPR, and genetic engineering, and with climate change, and environmental issues and-

    4. JR

      Pull up to the microphone there.

    5. DS

      And-

    6. JR

      There, there you go.

    7. DS

      ... escalation pathways to war, and all these kinds of things, and so I was-

    8. TH

      Basically, how can shit hit the fan?

    9. JR

      Right.

    10. DS

      And I think it's a pretty common question of, like, "How long do we have on which of these, and are we doing a good job of tending to them, so that we get to solve the rest of them?" And then for me, it was there were so many of them, what was in common driving them? Are there any kind of, like, societal generator functions of all of the catastrophic risks that we can address with, to make a more, uh, resilient civilization writ large? Tristan was working on the social media issues, and, um, when you had Eric on, he talked about the twin nuclei problem of atomic energy and kind of genetic engineering, and basically saying these are extremely powerful technologies that we don't have the wisdom to steward that power well. Well, in addition to that is all things computation does, right? There's a few other major categories. And computation has the ability to, as, as you mentioned with Facebook, get to billions of people in a very, very short period of time compared to how quickly the railroads expanded or any other type of tech-

    11. TH

      Like, how fast can TikTok get to a billion people, a billion users-

    12. JR

      Yeah.

    13. TH

      ... which they did in, like, a few years versus before that it took software companies like Microsoft even longer than that, and before that it took railroads even longer than that. So the power of this tech is you can compress the timeline, so you're getting, you know, a scale of a billion people's... Uh, you're impacting a billion people in deeper ways much faster, which means that if you're blind to something, if you don't know what you might be doing, the consequences show up faster than you can actually remediate them. It's-

    14. DS

      When we say exponential tech, we mean a number of things. We mean tech that makes more powerful versions of itself, so I can use computer chips to model how to make better computer chips, and then those better computer chips can recursively do that. We also mean exponential speed of impact, exponential scale of impact, exponentially more capital returns, exponentially, uh, smaller numbers of people capable of achieving a scale of impact. And so when he's mentioning godlike powers and kind of medieval institutions, the speed at which our tech is having influences in the world, and not just first order influences, the obvious stuff, but the second and third order ones. Facebook isn't trying to polarize the population. It's a externality. It's a side effect of the thing they're trying to do, which is to optimize ad revenue. But the speed at which new technologies are having effects on the world and the total amount of consequence is way faster than regulation can keep up with, and regulation-
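
Daniel's recursion point can be made concrete with arithmetic. A toy compounding model, where the 40% improvement per generation is an invented figure chosen only to show the shape of the curve:

```python
# Toy model of recursive self-improvement: each chip generation helps design
# the next, so capability compounds. The rate is invented for illustration.
capability = 1.0
rate = 0.4  # assumed fractional improvement per generation

for generation in range(1, 11):
    capability *= 1 + rate
    print(f"generation {generation:2d}: capability x{capability:.1f}")

# After 10 generations, capability is ~28.9x the starting point, while an
# institution reviewing one generation at a time advances only linearly.
```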

    15. TH

      So, and just by that, and just by that alone, we should be skeptical of any government's ability to regulate something that's moving faster than it, faster than it can appraise of what the hell is even happening in the first place, so we are-

    16. JR

      Well, not only

  12. 17:23–21:18

    Government Regulation

    1. JR

      that, you, you need someone who really understands the technology, and you're not going to get that from elected officials. You're going to need someone who's working on it and has a comprehensive understanding of how the stuff works, how it's engineered, w- where it goes. You're... I mean, I'm, I'm skeptical of the government being able to regulate almost everything.

    2. TH

      Right. Well, and so there's maybe a few things to say about that. So one is the complexity of all issues. Like climate change is really complex.

    3. JR

      Yes.

    4. TH

      Like, the way or the nuclear pathways of escalation or the way a satellite or GPS could get knocked out, triggers a nuke somewhere-

    5. JR

      Yeah.

    6. TH

      That's also really complex. Social media is really complex. CRISPR, you know, bio stuff is complex.

    7. JR

      Right.

    8. TH

      So in general, like one of the ways to summarize the kind of problem from our friend Zach Stein's kind of, uh, work is that the complexity of humanity's problems is going up like this, but the capacity to meet them is, like, not really meeting it. And then you add in social media and you polarize people and divide them into, like, they don't even know what's true because everyone's got their own personalized version of reality.

    9. JR

      Right.

    10. TH

      And instead of even trying to try to meet that-... it goes down. And in fact, social media also rewards the most cynical take on anything. So anytime a government institution has ever said something dumb, like when the guy asked Zuckerberg, "Uh, how do you make money?" And he says, "Uh, Senator, we sell ads." That thing (laughs) goes viral.

    11. JR

      Right.

    12. TH

      And when that goes viral, everybody saw that, and they didn't see the, you know, the five senators who I talked to who actually do really get these things pretty decently. Now, I'm not gonna say like, "Let's just, like, regulate it," but just to notice, right? So the cynical take about every time an institution makes a mistake, that thing goes viral, which means we lose trust in so many things, because, no matter what, what the issue is. Do you wanna-

    13. DS

      You notice that you were bringing up the conspiracy theory of might the government have a incentive to make a plant like Frances? And so it's plausible, but plausible doesn't automatically mean is. One of the challenges is when someone has a confirmation bias, they hear something that's plausible, and they just assume that it is without doing the due diligence of saying-

    14. JR

      Right.

    15. DS

      ... "What would I need to know?" And you do a good job of checking that. We could also say would Facebook have an incentive to say that she's a plant and try to hire a bunch of PRs to do that?

    16. JR

      Sure. Sure.

    17. TH

      And they were, and they were helping to spread that story, by the way, by the way.

    18. JR

      Oh, that makes sense.

    19. TH

      I'm not saying they're responsible for it, actually.

    20. JR

      I understand. I understand.

    21. TH

      I actually think that what happens is organically, again, the cynical take goes viral, and then if you're Russia or China, or you're Facebook in this case-

    22. JR

      Yeah.

    23. TH

      ... you can be like, "Hmm, that's a really helpful cynical take from my perspective."

    24. JR

      Mm-hmm.

    25. TH

      In fact, one of the things that Facebook does try to do is turn the social media debate into a censorship or free speech debate because they know (laughs) that divides the, the, the political class-

    26. JR

      Hmm.

    27. TH

      ... because they know that the right doesn't want censorship, obviously, and so they say... The more they can spin whatever Frances is doing as she's claiming censorship, the more they can divide, um, uh, any, any possibility for actual action. In fact, I'll tell you just a quick story, really quick, is during the three-hour testimony that Frances, uh, gave, if you watch the full three hours, she had both people on the left and the right... And I've been working on this for eight years. I have never seen someone create a bipartisan consensus the way that she did. She actually did, if you watch the, the video. And there was a senator there on the right who typically had been very skeptical of these issues, and the next day, I talked to her, she was gonna meet with that senator. Um, and he later said, "I can't meet with you." Why? Because the story went viral saying that she was a Democratic operative, and he said, "My base will hate me if I meet with you." So the very thing-

    28. JR

      Wow.

    29. TH

      ... we're talking about, which is the ability to regulate anything-

    30. JR

      Wow.

  13. 21:18–25:27

    Censorship

    2. JR

      Y- it's so funny. You say the right doesn't want censorship. Isn't that a crazy statement? Like, uh, we, like, shifted the polar, you know, the polar-

    3. TH

      What do you mean? Say it, say more.

    4. JR

      The... It used to be the left didn't want censorship. The ACLU used to defend Nazis.

    5. TH

      Right. Right.

    6. JR

      I mean, what the fuck has happened? Like, our poles have shifted. Like, north is south and south is north. It's, it's... It just shows you that so much of what ideology is, is tribal. It's like you-

    7. TH

      Right.

    8. JR

      ... you find a group that agrees to a certain, um, uh, gr- uh, s- a certain pattern of behavior and thought, and you subscribe to that. And now I am a right-wing conservative, I am a left-wing progressive, and then you just follow the playbook, and it makes-

    9. TH

      Totally.

    10. JR

      ... it so much easier than having your own individual nuanced thoughts on complex and difficult issues like this.

    11. TH

      Right.

    12. JR

      But the fact that he couldn't talk to her because his base would somehow or another think that she actually is a Democratic operative, and she does work for the government, and is trying... uh, is some sort of an attempt at censorship. And I'm sure not only is Facebook amplifying that, but all of the different Russian troll pages on Facebook are amplifying that-

    13. TH

      Totally.

    14. JR

      ... which confuses the water.

    15. TH

      Totally. Well, also if I'm Russia or China, Facebook is, like, the best weapon I've ever had against the United States.

    16. JR

      Yes.

    17. TH

      Oh my god. You've got an F-35? I don't need F-35.

    18. JR

      Right.

    19. TH

      I've got Facebook. I can destroy your entire coherence as a society.

    20. JR

      And they have.

    21. TH

      And you won't get anything done, and all of your energy will be spent on waste, infighting, and heat.

    22. JR

      We talked about this recently, but there's... I'm sure you saw the story. There was 20, top 20 Christian sites on Facebook. 19 of them were-

    23. TH

      Yes.

    24. JR

      ... run by a Russian troll farm. (laughs)

    25. TH

      I'm, I'm glad you actually mentioned that 'cause-

    26. JR

      Or excuse me, it was a Eastern European troll farm.

    27. TH

      Yeah, the-

    28. JR

      Macedonia, I think it was.

    29. TH

      To- totally. Uh, 140... This is an important stat actually. I'm glad you brought it up. Um, a hundred and f-... This is as recent as October 2019. 140 million Americans per month were reached by essentially troll farms, actively. Um, there's three categories of pages in which they're... So for Christian pages, the top 15 out of 15 Christian pages were all run by troll farms.

    30. JR

      (laughs)

  14. 25:27–28:19

    Regulation

    2. JR

      It seems absolutely insane that they could, through one page, inviting people-

    3. TH

      Yeah.

    4. JR

      ... instantaneously start to distribute all of their information on the, on those people that they invited. That seems-

    5. TH

      So you're saying, so why-

    6. JR

      That should kind of be illegal.

    7. TH

      So why would Facebook even allow that? Like, you, you would think... So if I'm designing Facebook, you would probably say-

    8. JR

      Wait, wait, you just said-

    9. TH

      Go ahead.

    10. JR

      ... the government should regulate social media. (laughs)

    11. TH

      (laughs)

    12. JR

      It should be illegal is what I said.

    13. TH

      It should be illegal. Yeah.

    14. JR

      Yeah. Yeah. I, well, this is... I, I don't think the government should regulate, but I do think there should be rules in terms of, like, if you're a regular person that, say, uh, has a specific group of interests, like say you only like, um, motor cars.

    15. TH

      Yeah.

    16. JR

      You, you like vehicles.

    17. TH

      Yeah.

    18. JR

      You like, you like, uh, hot rods or whatever, and that's what you're interested in. You, you know, you use Facebook when you're off duty at work and you just wanna just check some stuff out, and all of a sudden you get QAnon shit.

    19. TH

      Mm-hmm.

    20. JR

      Because they invited you into this QAnon group, and you started getting all this information. You start getting radicalized. It, it seems like (sighs) ... And again, I don't know what we should do in terms of regulation. I, I, but I don't think that social media groups should be able to just distribute information to people based on this concept of universal growth.

    21. TH

      Yeah. Well, I mean, think about it. If, if we were just designing-

    22. JR

      Or unlimited growth.

    23. TH

      Yeah. E- exactly. I mean, if, if we were designing Facebook with a feature called Groups, and Groups had a feature called Invitations, and you could invite people-

    24. JR

      Yeah.

    25. TH

      ... wouldn't you design it so that people have to accept-

    26. JR

      Yes.

    27. TH

      ... the invitation for the group before it shows up in your feed? Why would Facebook not do it that way?

    28. JR

      Right.

    29. TH

      Because what happened is, starting in, I think it was like 2018, people stopped posting as much on Facebook. So you and I, and maybe we used to post a lot more in 2016, 2017. If we stop posting as much, "Oh, shit, we can't harvest all that attention from people."

    30. JR

      What was the cause?

  15. 28:19–33:48

    Text

    1. JR

      "What in the fuck?" I could only look at it for a couple moments before I started freaking out. But the idea that, uh, you know, that's f- that's, it's not far off. Like, this, uh, this ability that, uh, deepfake AI has to recreate, es- espe- especially in text.

    2. TH

      Yes, exactly. That, that's specifically what GPT-3 is. It's a text model that trains on trillions of parameters and, and basically the entire corpus of the internet. So you're basically ingesting everything everyone has ever said online ever, including stuff in your voice or in my voice. And then you could say, "GPT-3, write me an argument about why social media is great, written by Tristan Harris, (laughs) using his words and phrases," and it'll do that. It'll actually be able to take my style of, of speech, and it'll generate text there. You could also say... You wanna do the vaccine one?

    3. DS

      Hm. M- The ability to say, uh, make arguments for vaccines or against vaccines, and say, "Only use real data," and then be able to show the financial vested interest of anyone arguing on the other side, and just have it be able to create m- more data than people can parse in any reasonable amount of time.

    4. TH

      And create like an academic-looking paper that's 10 pages long saying why the vaccine is not safe, with citing real charts, real graphs, real statistics, and the real vested interests of people who are, say, positively pointing out that the vaccine is safe, who maybe they have some connection to Pfizer or something like that.

    5. JR

      Right.

    6. TH

      And it'll generate that full 10-page or 20-page document, and it'll take a team of statisticians, you know, a while to decode that thing, and you can flood the internet with that kind of text.
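
For a concrete sense of what this looks like, here is a sketch against the GPT-3-era `openai` Python client. The `Completion.create` call matches that generation of the API (newer clients differ); the key and model choice are placeholders, and the prompt is the example from the conversation.

```python
# Sketch of style-conditioned text generation with the GPT-3-era openai
# client. Key and model choice are placeholders/assumptions.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

response = openai.Completion.create(
    engine="davinci",  # a GPT-3 base model of that era
    prompt=(
        "Write an argument about why social media is great, "
        "written by Tristan Harris, using his words and phrases:\n\n"
    ),
    max_tokens=200,
    temperature=0.7,
)
print(response["choices"][0]["text"])
```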

    7. DS

      And it's already, we already have, through OpenAI and the GPT-3 algorithm, the ability to pass the Turing test in many areas, meaning that you-

    8. TH

      We should explain what the Turing test is.

    9. DS

      Meaning that if you're reading the text, you can't tell that it wasn't produced by a human.

    10. JR

      Right. Turing test is the idea that if you... That's how you find out if someone... It's a very good robot. (laughs)

    11. DS

      So you've already got an AI-

    12. JR

      Ex Machina, right?

    13. DS

      Right.

    14. JR

      So this is, this is the, um, Reddit thread. So this is-

    15. TH

      So basically-

    16. JR

      These are all "Why do human babies cry?" These are all robots. This is all bots arguing with each other.

    17. TH

      "This is what happens when you give birth to a human baby."

    18. JR

      "Oh, my bad. I thought you were just trying to answer the question. No worries." "No, I'm trying to answer the question of how babies cry."... "YTA," I don't know what that means, a- and you are disgusting. "I can't even fathom the level of toxicity in this post." These are all bots.

    19. TH

      Yeah.

    20. JR

      "I am disgusted that you are making fun of others. Don't you know that people in this sub are supposed to be empathetic of others' feelings?" I'm sorry, but you're being a cunt. These are all robots.

    21. TH

      Yeah. Talking to each other.

    22. JR

      This is wild. Because if you just read this and you didn't know, "I don't really care if you disagree with my opinion as long as you don't call me a pedophile. If you were a real man, you would be with a young girl and take care of her, and you would be a sex offender." Like, this is wild shit.

    23. TH

      Yeah. It, one of the things people don't know, it actually was just developed over the summer. They announced it at OpenAI, um, just to track since we came and talked about some of these things last time. In August 2020, OpenAI released a video of using this same technology of machines generating stuff to actually write programming code. So, you tell the GPT-3, "I want a asteroid video game." And it's like (computer sound effect) , and it writes all the code, and then it puts a little graphic of a starship thing in the middle, and then there's rocks that are flying.

    24. JR

      Whoa.

    25. TH

      And you say, "I want the rocks to move faster," and then the rocks move faster through the asteroid game. Only requiring natural language input, but no programming.

    26. JR

      Yes.

    27. TH

      So, you're just saying it, you're typing in natural text, "I want an asteroid video game that when I move left, it moves left. I want this. I want the asteroids to move faster. Actually make the, the, the starship bigger." And then it just changes, and it does it all for you. Now, it's not perfect, but this is AGI.

    28. JR

      Well, this is typing it in.

    29. TH

      You're typing.

    30. JR

      You're just typing it in text.

  16. 33:48–38:31

    Drones, CRISPR

    2. JR

      (sighs) So, AI can create fake stories, and the, the fake stories can be boosted up by these troll farms and-

    3. TH

      Which themselves could be run by fake, you know, accounts and fake logic of, say...

    4. JR

      Oh, my God. And-

    5. TH

      But wait, it goes one step further, is, so that's just distributed AI, right? But we also have drones making... Continuously better drones with continuously better ability to swarm and weaponize them that also becomes easily accessible. We also have CRISPR-making biotech capability, something that you don't have to be a state actor to have, small actors can have. So, there's this question of how do we make it through having decentralized exponential tech, which means decentralized catastrophic capability?

    6. JR

      Godlike powers. Decentralized Godlike powers.

    7. TH

      Decentralized Godlike powers... Nukes are really...

    8. JR

      ... in terms of biology as well as in terms of technology.

    9. TH

      That's right. So, social media's an instance of one case.

    10. JR

      So, we don't think we should just, like, gloss over the CRISPR thing. For people who don't understand what CRISPR is, CRISPR is a gene-editing tool. I think it's on the second iteration now, or is it on the third?

    11. TH

      Something like that.

    12. JR

      They're, they're getting better and better at it.

    13. TH

      Yeah.

    14. JR

      The, the, the idea is eventually it's going to get to the point where it's like a home computer, like where you are going to be able to edit genes.

    15. TH

      Yeah.

    16. JR

      So, how, I mean, how do you stop that? Or what do you do about that? And who has, who, who, if you wanted to have any kind of regulation about something like that, what is the regulation? Is the regulation that you have to have some specific level of clearance before you have access to it? And, but if that's the case, then you put it in ch- in control of the government, and then also...

    17. TH

      People don't trust the government, yeah.

    18. JR

      Bad actors and other governments are going to just distribute it wildly.

    19. TH

      And how do you control that someone would, uh, have to have some kind of access to get it if one of the "its" is something that you just need internet access for, like OpenAI or the ability for cyber weapons, right?

    20. JR

      Yeah.

    21. TH

      Cyber weapons hitting infrastructure targets, it's like now the only way to regulate that is universal surveillance on everyone's use of their home computer.

    22. JR

      And we don't want that future.

    23. TH

      So, so in general, like...

    24. JR

      Yeah.

    25. TH

      Um, because this might sound like just a disaster porn, which I want to be really clear that, I mean, I think our goal in coming on... There is a way through this. Yeah, our goal in coming on was to be able to talk about the framing the problem so we know what we're trying to solve.

    26. JR

      Right.

    27. TH

      We're not trying to say, "Hey, we've just got this social media problem."

    28. JR

      Let's frame it really clearly.

    29. TH

      Yeah.

    30. JR

      Let's frame it really clearly.

  17. 38:31–41:49

    Stochastic Terrorism

    1. JR

      know where this comes from? Are these things w- from troll farms?

    2. TH

      I don't know.

    3. JR

      But they could be.

    4. TH

      There's a concept-

    5. JR

      Because some of them probably are, right?

    6. TH

      Yeah.

    7. DS

      There's a concept called stochastic terrorism. There's a good article on it on Edge, which basically is the idea, let's say there was a foreign state actor that wanted to mess things up in the US population. Trying to control a specific person to do a specific thing is hard, but trying to get an already kind of disenfranchised group more radicalized that makes it more likely that some of them do some harmful stuff is easy.

    8. TH

      Think about, you know, you last texted me, Joe, on January 6th. I think we had a quick text exchange-

    9. JR

      Mm-hmm.

    10. TH

      ... 'cause-

    11. JR

      Mm-hmm.

    12. TH

      ... like, I think that's an example of ... I, I don't, I mean, and I'm not gonna claim that everyone ... It's just that, that's an example I think I would say of I can basically go into a group of the Boogaloo Boys or, you know, Stop the Steal groups or something like that, and I can just seed stuff that's like, "Hey, let's get our guns out. Let's do this." And I just, just, just hinting-

    13. JR

      Right.

    14. TH

      ... at that idea.

    15. JR

      Right.

    16. TH

      I, I'm not telling one person to go do something. I'm not controlling anyone. I'm just hinting, and there's a wide enough group there that people can take action. So, that's one of the, the other decentralized power tools. But I just wanted to close the thought of on the, on the bowling alley. We've got, um, the bowling alley, one gutter is like let's lock it down with surveillance. Let's lock it down with Mark Zuckerberg controls everything. Let's lock it down with the government tells us what we can and can't do on computers. And the other gutter, which is the decentralized power for everyone, which without people having the wisdom to wield that godlike power.

    17. JR

      Right.

    18. TH

      Or like not, at least not evidenced as in, in people's own usage of it right now.

    19. JR

      Also, we've incentivized people to do destructive things-

    20. TH

      Also, we've incentivized.

    21. JR

      ... just for likes.

    22. TH

      Right.

    23. JR

      Yeah.

    24. TH

      So, in, in certain places there is an incentive to, for those things to happen. It's not just by accident. It's like by design and incentivized.

    25. JR

      Yeah.

    26. TH

      But the-

    27. DS

      Wait, what you just said is super important. It's a population that is getting continuously more radicalized on all sides, that simultaneously has continuously more powerful tools available to them, in a world that's increasingly fragile.

    28. TH

      Yeah.

    29. DS

      And so if you have an increasingly fragile world, meaning more interconnected global supply chains that have, where a, a collapse somewhere leads to collapse everywhere, more sensitive infrastructure, you know, things like that. If you have an increasingly fragile world, you have more and more radicalized people, and you have those radicalized people having access to more and more powerful tech. That's a just fragility across lots of different dynamics, and this is why the social media thing is so central is, it's a major part of the radicalization process.

    30. TH

      It's both a major part of the radicalization process, and is itself an example of the centralized control censorship, which we don't want, and the decentralized viral memes for everyone, which radicalize and enrage people and polarize democracies into not working. The thing is, in those two gutters, the gutters are getting bigger every day, like on each side. You've got more potential for centralized control. You've got China basically doing full control over its internet, you know, doing a bunch of stuff to, to k- top down control, and the other side you have more and more decentralized power in more hands, and that, that gutter's growing. So, the, the question is how do you basically ... We have to bowl a strike down the center of that alley, but it's getting thinner and thinner every day. And the goal is how do we actually sort of, um ... It's almost like a test, right? We are, we are given these godlike powers, but we have to have the wisdom, love and prudence of gods to match that-

  18. 41:49–43:19

    Social Credit Scores

    1. TH

      Yeah. Have you been following this?

    2. JR

      Yeah. That's what terrifies me is that we have to become like China in order to deal with what they're doing. Um, I, I, I, I just, I feel like one step moving in that general direction is a social credit score system.

    3. TH

      Mm-hmm.

    4. JR

      And I'm terrified of that.

    5. TH

      Right.

    6. JR

      And I think that that is where vaccine passports lead to.

    7. TH

      Mm-hmm.

    8. JR

      I really do, and I, I think this idea that they're slowly working their way into our everyday lives and in this sort of inexorable way, where you have to have some sort of paperwork or some sort of a Q-code or something on your phone, or QR code.

    9. JR

      That s- scares the shit out of me.

    10. TH

      Right.

    11. JR

      ’Cause that, that’s a se- you’re never gonna get that back.

    12. TH

      Right.

    13. JR

      Once the government has that kind of power and control, they’re gonna be able to exercise it whenever they want with all sorts of reasons to, to institute it.

    14. TH

      I, I’m worried about that too, but I will say also, just to also notice, that, um, everywhere there’s a way in which a, a small move in a direction can be shown to lead to another big boogeyman, and that boogeyman makes us angry, social media is up-regulating the meaning of everything to be its worst possible conclusion. So, like, a small move by the government to do X might be seen as, "This is the first step in this total thing." I’m not saying that they’re not gonna go do that, I’m worried about that too, but to also just notice the way that social media amplifies that d- the degree to which we all get kind of-

    15. JR

      ... reactive and triggered by that.

    16. TH

      Right. The thing I think is worth mentioning is what China is doing regarding its internet, because it’s seeing real problems. And we might not like their solution, we might want to implement a solution that has more civil

  19. 43:19–47:49

    Social Media Reform

    1. TH

      liberties than we should.

    2. JR

      Well, let’s explain what they’re doing.

    3. TH

      Yeah, yeah. So, I’ll, I’ll do it quickly. So, um, it’s, it’s almost, it’s quite literally as if Xi, Xi Jinping saw The Social Dilemma, because they’ve, they’ve, in the last two months, rolled out a bunch of sweeping reforms that include things like if you’re under the age of 14 and you use Douyin, which is their version of TikTok, when you swipe the videos, instead of getting, like, the influencer dancing videos and soft pornography, you get science experiments you can do at home, museum exhibits, and patriotism videos.

    4. JR

      Wow.

    5. TH

      So you’re scrolling and you’re getting stuff that’s educating, ’cause they want their, their kids to grow up and want to be astronauts and scientists.

    6. JR

      Yeah.

    7. TH

      They don’t want them to grow up and be influencers. Uh, and I’m not, when I say this by the way, I’m not, just to be clear, I’m not praising that model, just noticing-

    8. JR

      Right.

    9. TH

      ... all the things that they’re doing.

    10. JR

      Well, I’ll praise it. (laughs)

    11. TH

      (laughs) And I know that-

    12. JR

      If you’re gonna influence people, that’s a great way to do it.

    13. TH

      They also limit it to, um, three hours, uh, sorry, 40 minutes a day on TikTok.

    14. JR

      Mm.

    15. TH

      For gaming, um, let me actually do the TikTok example. So they, they do 40 minutes a day for TikTok. They also, when you scroll a few times, they actually do a mandatory five-second delay saying, "Hey, do you want to get up and do something else?" Like, because when people sit there infinitely scroll, even Tim Cook recently said mindless scrolling, which was actually invented by my co-founder of the Center for Humane Technology, Aza Raskin, he invented, he was in The Social Dilemma, he’s the one who invented that infinite scroll thing.

    16. JR

      Mm.

    17. TH

      Um, China said, "Hey, we don’t want people mindlessly scrolling," so after you scroll a few videos, it does a f- a mandatory five-second, like, interlude. They also have opening hours and closing hours. So from 10:00 PM until 6:00 in the morning, if you’re under 14, it’s like, it’s closed, meaning, um, one of the problems with social media for teenagers is if I’m not on at 1:00 in the morning, but all my friends are on and they’re still commenting-

    18. JR

      Mm.

    19. TH

      ... on my stuff, I feel the social pressure. I’m gonna be ostracized if I don’t participate.
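
The Douyin rules listed here reduce to a handful of policy checks. A minimal sketch, with thresholds taken from the conversation (under-14 curfew from 10:00 PM to 6:00 AM, a 40-minute daily cap, a mandatory pause every few swipes); the function names, structure, and exact swipe count are invented.

```python
# Sketch of the Douyin-style usage limits described above. Thresholds come
# from the conversation; everything else is invented for illustration.
from datetime import time

CURFEW_START, CURFEW_END = time(22, 0), time(6, 0)  # 10:00 PM to 6:00 AM
DAILY_CAP_MINUTES = 40
SWIPES_BETWEEN_PAUSES = 5  # "a few videos"; the exact count is an assumption

def may_scroll(age: int, now: time, minutes_today: int) -> bool:
    # Under-14 accounts are blocked during curfew hours and past the daily cap.
    if age < 14:
        in_curfew = now >= CURFEW_START or now < CURFEW_END
        if in_curfew or minutes_today >= DAILY_CAP_MINUTES:
            return False
    return True

def needs_pause(swipes: int) -> bool:
    # Mandatory five-second interlude after every few swipes.
    return swipes > 0 and swipes % SWIPES_BETWEEN_PAUSES == 0
```

Because the curfew closes the app for everyone in a time zone at once, it also removes the "my friends are still on at 1:00 in the morning" pressure described above.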

    20. JR

      Right, and if your notifications are on-

    21. TH

      Yeah.

    22. JR

      ... your phone keeps buzzing.

    23. TH

      Totally. And even if they’re not on, it’s like, "Oh, but I want to see if they-

    24. JR

      Yeah.

    25. TH

      ... said something about my thing." And so, um, it’s a, it’s a, it’s a, we call a multipolar trap. If I don’t participate, but the other guys are, I’m gonna lose out. And Facebook and these companies, they know that, by the way. Even Netflix said, "Our biggest competitor is sleep." So-

    26. JR

      True.

    27. TH

      ... one of the, um, because they’re all competing for attention. So when you do this mandatory thing where you say, "We’re gonna close from 10:00 PM to 6:00 in the morning," suddenly everyone, if y- you’re in the same time zone, it’s another important side effect, uh, can’t use it at the same time. So these are some examples. For their military, by the way, when you, if you’re a member of the Chinese, uh, uh, PLA, uh, Army, um, you, you get a locked-down smartphone. It’s like a light phone. It’s like hyper-locked down, you can’t do anything.

    28. JR

      Mm.

    29. TH

      By contrast, we know that Russia and China go into our, um, uh, veterans' groups on Facebook, and they actually try to sow disinformation, try to radicalize veterans. "Hey, Afghanistan happened, aren't you really pissed? Let me show you 10 videos." Right?

    30. JR

      Yeah.

  20. 47:49–51:19

    Democracy and Social Media

    1. TH

      was just at that we actually need to double-click on, which is that democracies are more affected by what’s happening with social media than authoritarian-

    2. JR

      Of course.

    3. TH

      ... nations are. And for a number of reasons, but do you wanna? Well, and we sort of hinted at it earlier, but when social media’s business model is showing each tribe their boogeyman, their extreme reality, it forces a more polarized political base, which means to get elected, you have to say something that’s going to appeal to a base that’s more divided. And in the Facebook files that Frances Haugen put out, um, they showed that when Facebook changed the way its ranking system worked, uh, in 2018, to something called meaningful social interactions, I won’t go into the details, they talked to political parties in Europe. So here we are, it’s 2018, they do an interview with political parties in Poland, in Hungary, in Taiwan, in India, and f- and these political parties say, "Facebook, we know you changed your ranking system." And Facebook, like, smugly responds, "Yeah, everyone has a conspiracy theory about how we change (laughs) our ranking system," because those stories go viral. And they're like, "No, no, no. We know that you changed how your ranking system works because we used to be able to publish, here's a white paper on our agriculture policy to deal with, like, soil degradation. And now when we publish the white paper, we get crickets. We don't get any response. And we tested it, and the only thing that we get a traffic and attention on is when we say negative things about the other political parties." So and they say, "We know that's bad. We don't want to do that. We don't want to, like, run our campaign that's about saying negative things about the other party. But when you change the algorithm, that's the only thing we can do to get attention."

      It shows how central the algorithm is to everything else. If I'm Tucker Carlson or Rachel Maddow or anybody who's a political personality, are they really, um, saying things just for their TV audience? Or are they also appealing to the algorithm because most, more and more of their attention is going to happen downstream in these little clips that get filtered around? So they, they also need to appeal to how the algorithm is rewarding saying negative things about the other party.

      So what that does is it means you elect a more political, like, representative class that's based on disagreeing with the other side and being divided about the other side, which means that it throws a gear into the rent- a wrench into the gears of democracy and means that democracy stops delivering results in a time where we have more crises. We have more supply chain stuff and inflation and all these other things to respond to, and instead of responding effectively, it's just division all the way down.

    4. JR

      But it's been division from the jump, even long before there was social media. So all social media is doing-

    5. TH

      Is putting gasoline on it.

    6. JR

Yeah, it's... they're taking advantage of a trend that already existed. It's not like they say, my opponent is reasonable-

    7. TH

      Right.

    8. JR

      ... but I feel like I'm just a better choice.

    9. TH

      Tot-

    10. JR

      And you could disagree because he's a great guy, but this is how I feel. No one's doing that.

    11. TH

      Totally, totally.

    12. JR

      Right?

    13. TH

But again, notice though in this 2018 example how specific the change was. Those political parties, before 2018, they could get elected in those countries because they hadn't gone as, as partisan maybe as we had yet.

    14. JR

      Yes.

    15. TH

They could have gotten elected and gotten attention by saying, "Here's a white paper about our agriculture policy." But after 2018, the algorithm has the master say. Everyone has to appeal to the algorithm. If I'm a small business, I have to appeal to the algorithm. If I'm a newspaper, do I just, like, write the articles I want to write, or the investigative stories, the fourth estate that we need for democracy to work? No, I have to write the clickbait title that's going to get attention. So I have to exaggerate and say, "Joe, Joe Rogan just takes horse dewormer," because that's going to get more attention than saying he took ivermectin. So-

    16. JR

      Particularly in this world where no one's buying paper anymore.

    17. TH

      Correct, which-

    18. JR

Everyone's buying everything by clicking online. So you really... And very few people are even subscribing, so you have to give them these articles and then have these ads in the articles.

    19. TH

      And those publishers... And that's also driven by the business models of these, these central

  21. 51:19–1:10:11

    Runaway feedback loops

    1. TH

      tech companies-

    2. JR

      Yeah.

    3. TH

      ... especially Facebook, Twitter and Google.

    4. DS

There's two feedback loops that he just mentioned. Politically, if you have Facebook and other platforms like this polarizing the population, then the population supports a more polarized representative class, but the representatives, to be elected, are doing political ads, and so the political ads then further polarize the population. And so now you have this feedback loop. And then the same is also true with, with media. The media, meaning newspapers, television, still has to do well on the Facebook algorithm, because more and more there's a monopoly of attention happening there, and it's someone seeing a clip there that has them decide to subscribe to that paper or keep subscribing to it or whatever it is. So you end up having the algorithm radicalizing what people want to pay attention to, where then the, uh, sources of broadcast have to appeal to that, which then in turn further radicalizes the population. So these are runaway feedback loops.
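The "runaway feedback loop" Daniel describes can be written down as a pair of mutually reinforcing variables. The sketch below is a toy simulation, not a model of any real platform; every constant is an arbitrary illustration value.

```python
# Two coupled quantities, each amplifying the other: how divisive
# published content is, and how polarized the audience is.
# All parameters are invented for illustration.

def simulate(amplification=0.5, steps=8):
    polarization = 0.10   # audience division (0..1)
    divisiveness = 0.10   # negativity of published content (0..1)
    for t in range(steps):
        # Publishers drift toward whatever the ranking rewards, which
        # scales with how polarized the audience already is.
        divisiveness += amplification * polarization * (1 - divisiveness)
        # A more divisive media diet further polarizes the audience.
        polarization += amplification * divisiveness * (1 - polarization)
        print(f"t={t}: divisiveness={divisiveness:.2f}, polarization={polarization:.2f}")

simulate()
# Both values ratchet monotonically toward 1.0: neither side of the
# loop can back off unilaterally, which is what makes it "runaway".
```

The structure, two variables where each one's growth is proportional to the other, is the textbook signature of a positive feedback loop, and it is the same shape in both loops named above (voters and political ads, audiences and media outlets).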

    5. JR

      And what's the solution?

    6. TH

      Um, well, actually, you, you asked me this last time, and I dodged the question.

    7. JR

      (laughs)

    8. TH

And part of it is because, um, it's, it's connected to a set of broader issues that, that I think Daniel's language gets at really deeply, which is actually the reason I wanted us, uh, to do this together this time. Um, there's obviously many steps to this, right? So once you've kind of let this cancer sort of spread, even if you take out the, the thing that was causing the cancer, we, we've now already pre-polarized everyone's beliefs. Like, when you say, "What's the solution to all this?", all of our minds are running malware, like we're all running bad code, we're all running confirmation bias.

    9. DS

      Except no one thinks that they are-

    10. JR

      Right.

    11. TH

      Everyone thinks the other ones are, but-

    12. JR

      Right.

    13. TH

... not, not me. But the point is that all of us, like, it doesn't matter, like, people on all sides of the political aisle and all tribes, we've all been shown our version of the boogeyman, our version of the inflated thing that got our attention and then made us focus on that and then made us double down and, and go into those habits of those topics being the most important. And so we have to realize that. I almost think we need a shared moment for that. I wish The Social Dilemma was a little bit more of a... It was a shared moment, but I think there's almost, like, a truth and reconciliation, uh, moment that we need to unwind, you know, our minds from the cult factory, because it's a cult factory that found each of the little tribes and then just sucked them in together and put them in a self-reinforcing chamber.

    14. DS

So let's, let's say we take any issue that some people care about and think is central, whether we take social justice or climate change or US-China relations. If half the population has a solution they want to implement, carbon taxes or whatever, the other half of the population is polarized to think that that is bad and terrible and going to mess everything up. So that other half are still political actors, and they're going to escalate how they counter that. How do you get enough cooperation to get anything done, especially where there are real issues, and not just have all the energy become waste heat? In an autocracy, let's take China as an example, where you don't have so much internal dissent, you don't have that issue, so you can actually do long-term planning. Um, so one of the things that we see is we have decreasing ability to make shared sense of the world. And in any kind of democratic society, if you can't make shared sense of the world, you can't act effectively on issues. But the types of tech that are decreasing our ability to make shared sense of the world are also increasing the speed at which tech is changing the world... and the total consequentiality of it. And that's one way to start to think about, like, this bowling alley example: we're having faster and faster, more and more profound consequential effects, and less and less ability to make sense of it or do anything about it. So underneath the AI issue, the CRISPR issue, the US-China issue, the how-do-we-regulate-markets issue, the how-do-we-fix-the-financial-crisis issue, is-

    15. JR

      Quality.

    16. DS

... can we make sense of anything, collectively, adequately, to be able to make choices effectively in the, in the environment we're in? And that's underlying it. Tristan was laying out that you've got these two gutters, right? You've got decentralized catastrophe weapons for everyone if we don't try to regulate the tech in some ways, and that world breaks. Or, say we don't want decentralized catastrophe weapons for everyone, then maybe we do something like the China model, where you have ubiquitous surveillance, and that's a dystopia of some kind. And so either you centralize the power and you get dystopias, or it's decentralized and you get catastrophes. And right now, the future looks like one of those two attractor states most likely, catastrophes or dystopias. We want a third attractor. How do you have a world that has exponential tech that doesn't go catastrophic, where the control mechanisms to keep it from going catastrophic aren't dystopic?

    17. TH

      And by the way, we're not here saying like, "Go buy our thing," or, "We've got a new platform." This is not about... This is just about describing what is that, that center of that bowling alley that's not the gutters-

    18. JR

      Got it.

    19. TH

... that we can skate down? The closest manifesting example of this so far, um, although we need to do one more construction, I think, which is this, um, but is, is Taiwan, because Taiwan, actually I think I talked about it last time we were, we were here, is a, um, uh... They've got this digital minister, Audrey Tang, who has been, um, saying, "How do you take a democracy and then use technology to make a stronger democracy?" So you can look right now at the, the landscape. You say, "Okay, we can notice that China and countries like China, autocratic countries, are employing the full suite of tech to make a stronger authoritarian, autocratic society." They're adding surveillance. They're doing, you know, cameras everywhere. They're doing Sesame Credit scores. They're, uh, using TikTok to educate their people instead of turning them into influencers. They're using the full suite of tech to create the kind of autocracy that they want to see. By contrast, open societies, democracies, western democracies, are not consciously saying, "Hey, how do we take all of this tech and make a stronger democracy?"

    20. JR

      Right.

    21. TH

      "How do we have tech plus democracy equals stronger democracy?" One of the other reasons I wanted to talk to you is, so far, I think the tech reform conversation is like, how do we make social media like 20% less toxic and then call it a day? Or like take a mallet and break it up and then call it a day. That's not enough when you understand the full situation assessment that we're kind of laying out here of the skating down the middle of the bowling alley. The, the thing that we need that competes with that thing, because we can't just also allow... That thing's going to outperform to the, the China autocratic model is going to out-compete a, you know, democracy plus social media (laughs) that like is 20% less toxic, isn't going to out-compete that thing.

    22. JR

Well, ultimately, in the long run, it's going to, but the, the... what's fascinating is they're willing to forgo any sort of profits that they would have from these children from 10:00 PM to 6:00 AM-

    23. TH

      Right.

    24. JR

      ... in order to make a more potent society of more influentia- not influencer, but influential, more educated, more positive people that are going to contribute to society.

    25. TH

      Yeah.

    26. JR

This is something that I think you can only do if you have this inextricable connection between the government and business. And that's something that they have with corporations and with the CCP over there.

    27. TH

      Right.

    28. JR

      They have this ability because they're, they're completely connected, like the-

    29. TH

Well, what did the senator tell us about China's response? Oh, yeah. This is, uh, a great point. We, we were talking with a, a sitting senator who was, uh, at some national security conference, um, talking to a foreign minister of a major EU country, and said, "Who do you think, um, the CCP, the Chinese Communist Party, considers to be the greatest rival to its power?" You would say the United States, right?

    30. JR

      Right.
