The Joe Rogan Experience

Joe Rogan Experience #1572 - Moxie Marlinspike

Computer security researcher Moxie Marlinspike is the creator of the encrypted messenger service Signal, and co-founder of the Signal Foundation: a nonprofit dedicated to global freedom of speech through the development of open-source privacy technology.

Moxie Marlinspike (guest) · Joe Rogan (host) · Guest 2 (guest)
Jun 27, 2024 · 3h 2m

EVERY SPOKEN WORD

  1. 0:00–15:00


    1. MM

      (drumbeats) Joe Rogan podcast, check it out. The Joe Rogan Experience.

    2. JR

      Train by day, Joe Rogan podcast by night. All day. (instrumental music plays)

    3. MM

      Uh, so like, we're gonna just sit here and talk for a long time, huh?

    4. JR

      Yeah. We're started right now. We're started.

    5. MM

      Th- It has begun.

    6. JR

      Yes (laughs) .

    7. MM

      (laughs) .

    8. JR

      W- What was your question, though?

    9. MM

      I was gonna ask, you know, like, what if something comes up, you know, like-

    10. JR

      Like what?

    11. MM

      ... like, uh, you know, you need to, like, pee or something.

    12. JR

      Oh, you can totally do that. Yeah. We'll just pause-

    13. MM

      Okay.

    14. JR

      ... and just run out and pee. That happens. D- Don't sweat it.

    15. MM

      All right, all right.

    16. JR

      I want you to be comfortable.

    17. MM

      All right.

    18. JR

      Have you ever done a podcast before?

    19. MM

      First time.

    20. JR

      Really?

    21. MM

      First time.

    22. JR

      Um, so tell me how, where Signal came from. What, what was the impetus? What was, how did it get started?

    23. MM

      It's a long story.

    24. JR

      It's okay, we got time. We got plenty of time.

    25. MM

      Yeah, yeah, yeah. We got time.

    26. JR

      (laughs) .

    27. MM

      Uh, okay, well, you know, I think ultimately what we're trying to do with Signal is, um, stop mass surveillance, to bring some normality to the internet, and to, uh, explore a different way of developing technology that might ultimately serve all of us better.

    28. JR

      We should tell people, maybe people just tuning in, Signal is an app that is, uh... Explain how it works and what, what it does. I use it. It's a, it's a messaging app, go ahead.

    29. MM

      It's a messaging app, yeah. Yeah.

    30. JR

      But-

  2. 15:00–30:00


    1. MM

      like, you know, what Snowden revealed was PRISM, which was, um, the, the cooperation between the government and these places where data was naturally accumulating, like Facebook, Google, et cetera, you know, and the phone company. And, uh, Cambridge Analytica I think was the moment that people were like, "Oh, there's, like, also sort of like a private version of PRISM," you know? That's, like, not just governments, but, like, the data is out there, and other people who are motivated are using that against us, you know? Um, and I, so I think, you know, in the beginning, it was sort of like, "Oh, this could be scary." And then it was like, um, "Oh, but, you know, we're just using these services." And then people were like, "Oh, wait, the government is, you know, using the data that we're, uh, you know, sending to these services." And then people were like, "Oh, wait, like, anybody can use the data against us." And then we were like, "Oh, shit." You know, it's like, I think things went from like, um, "Ah, I don't really have anything to hide," to like, "Wait a second. These people can predict and influence how I'm going to vote based on what kind of jeans I buy?" You know. Uh, and, you know, and then, you know, sort of where we are today, where it's like, I think people are also beginning to realize that the companies themselves that are doing this kind of data collection are, are also not necessarily acting in our best interests.

    2. JR

      Yeah, for sure. There, there's just two... There's, there's also this weird thing that's happening with these companies that are gathering the data, whether it's Facebook or Google or... I don't think they ever set out to be what they are. They started out, like Facebook, for example, we were talking about it before. It was really just sort of like a social networking thing, and this was in the early days. It was a business. I don't think anybody ever thought it was going to be something that influences world elections in a staggering way. Like, especially in other parts of the world, you know, where Facebook becomes the sort of de facto messaging app on your phone when you get it.

    3. MM

      Yeah.

    4. JR

      I mean, it has had massive impact on, on politics, on shaping culture, on, on, I mean, even genocide has been connected to Facebook in, in certain, certain countries. You know, it's, it's weird that this thing that is in, I don't know how many different language Facebook ex- how many different languages does Facebook, uh, operate under?

    5. MM

      All of them. Whatever they-

    6. JR

      All of them?

    7. MM

      Yeah, yeah.

    8. JR

      I mean, th- this was just a s- a social app. It was from Harvard, right? They were just connecting students together? Wasn't that initially what the, the first iteration of it was?

    9. MM

      Yeah. Oh, okay, I mean, I think you can say, like, no one anticipated that these things would be this significant, um, but I also think that there's... You know, I think ultimately, like, what we end up seeing again and again is that, like, bad business models produce bad technology, you know? That, like, the point, you know, what we were talking about before, like, the point, you know... Mark Zuckerberg did not create Facebook because of his deep love of, like, social interactions. Like, he did not have some, like, deep sense of, like, wanting to connect people and connect the world. That's not his passion, you know? Uh, Jeff Bezos did not start Amazon because of his deep love of books. Uh, you know, uh, these companies are oriented around, um, profit, you know? They're, they're trying to make money, and, and they, uh, you know, they're, they're subject to external demands as a result. They have to, like, grow infinitely, you know? Um, which is insane, but that's the expectation. And, you know, so what we end up seeing is that, uh, the technology is not necessarily, uh, in our best interest, because that's not what it was designed for to begin with.

    10. JR

      That is insane that companies are expected to grow infinitely.

    11. G2

      Yeah.

    12. JR

      I mean, they're literally exp- what, what is your expectation? To take over everything, to have all the money-

    13. G2

      And then-

    14. JR

      ... one day.

    15. G2

      ... and then more, you know.

    16. JR

      If, yeah, if we extrapolate, we, we anticipate we will have all the money. There will be no other money.

    17. G2

      (laughs) Yeah.

    18. JR

      I mean, if you keep going, that's what has to happen. How can you just grow infinitely? That's bizarre.

    19. G2

      Yeah, and that's why, I mean, you know, the, I think, you know, the Silicon Valley obsession with China is-

    20. JR

      Yeah.

    21. G2

      ... uh, you know, a big part of it, where people, they're just like, "Well, that's a lot of people there."

    22. JR

      Yes.

    23. G2

      You know? That's, that's another-

    24. JR

      That's, that's a lot of people there.

    25. G2

      ... thing people could just keep growing.

    26. JR

      Yeah, there was a, a fantastic, uh, um, thing that I was reading this morning. God, I wish I could remember what the source of it was. But it was, uh, they were, they were essentially talking about how strange it is that there are so many people that are so anti-human trafficking, they're so pro-human rights, they're so anti-slavery, they're so... All the, the, the powerful values that we ascribe, uh, that we, that we, we think of when we think of Western civilization, we, we think of all these beautiful values, but that almost all of them rely on some form of slavery to get their electronics.

    27. G2

      Oh, yeah.

    28. JR

      Yeah.

    29. G2

      And it was just-

    30. JR

      You have the, uh, eight grams of cobalt in your pocket over there-

  3. 30:00–45:00


    1. MM

      speeches, um, you know, if you look at the fascist leaders, um, you know, they would give a speech and when there was a- a moment of applause, um, they would just sort of stand there and accept the applause, because in their ideology, uh, they were responsible for the thing that people were applauding, you know? And if you watch the old communist leaders, you know, like when Stalin would give a speech, um, and he would say something and there would be a moment of applause, he would also applaud. Uh, because in their ideology of historical materialism, they were just agents of history. They were just the tools of the inevitable. Uh, it wasn't them, you know, they had just sort of been chosen as the agents of this thing that was an inevitable process, and so they were applauding history, you know? And sometimes when I see, like, the CEOs of tech companies give speeches, uh, and people applaud, I kind of feel like they should also be applauding.

    2. JR

      Hmm.

    3. MM

      You know, that it's not...... them, you know, that-

    4. JR

      Right.

    5. MM

      ... uh, technology has its own agency, its own force that they're the tools of, in a way.

    6. JR

      That's a very interesting way of looking at it. The, uh, yeah, they are the tools of it and at this point, if we look at where we are in 2020, t- in this- it seems inevitable. It seems like there's just this unstoppable amount of momentum behind innovation and behind the, just the process of creating newer, better technology and constantly putting it out, and then dealing with the demand for that newer, better technology, and then competing with all the other people that are also putting out newer, better technology. And this- and then (claps) -

    7. MM

      (laughs) Yeah, yeah. Hey.

    8. JR

      Look what we're doing.

    9. MM

      Yeah.

    10. JR

      We're- we are helping the demise of human beings-

    11. MM

      (laughs)

    12. JR

      ... because I feel, and I've- I've said this multiple times and I'm gonna say it again, I think that we are the electronic caterpillar that will give way to the butterfly. We are- we don't know what we're doing. We are putting together something that's going to take over. We are putting together some ultimate being, some symbiotic connection between humans and- and technology, or literally an artificial version of life. Not even artificial, a version of life constructed with silicon and- and wires and- and things that we're making. We're- we're- if we keep going the way we're going, we're gonna come up with a technology that is gonna be ex machina. It's gonna- it's gonna pass the Turing test and it's going to literally be something that's better than what we are, a better version of a human being.

    13. MM

      I think we're a ways away. Uh-

    14. JR

      Yeah, we're a ways away, but what- how many ways? 50 years?

    15. MM

      Uh, the moment that I can put my hand under the, like, automatic sink thing-

    16. JR

      Mm-hmm.

    17. MM

      ... and have the soap come out without, like-

    18. JR

      (laughs)

    19. MM

      ... waving arou- you know, like, then I'll be worried, you know? But like, you know-

    20. JR

      Oh, that's simplistic, sir.

    21. MM

      (laughs)

    22. JR

      How dare you. Well, here's a good example. The- you know the Turing test, is- the Turing test is, uh, if, uh, someone sat down with, like, an ex machina. Remember that- that- the je-

    23. MM

      Sure.

    24. JR

      It's one of my all-time favorite movies, where the coder is brought in to talk to the-

    25. MM

      Mm-hmm.

    26. JR

      ... the woman. He f- falls in love with the robot lady and h- she passes the Turing test 'cause he's- he's in love with her. I mean, he really- he really can't differentiate. In his mind, that is a woman, that's not a robot. Who- was it Alan Turing? What was the- the gentleman's name?

    27. MM

      Alan Turing.

    28. JR

      Alan Turing, that came up with the Turing test. You know, he- he was a gay man in England in the 1950s when it was illegal to be gay, and they-

    29. MM

      Killed himself.

    30. JR

      ... chemically castrated him-

  4. 45:00–1:00:00


    1. JR

      keeps a lot of fucking people on Apple.

    2. MM

      You think? All right.

    3. JR

      Oh, it's the best. You can make a video, like a long video, like a couple minutes long, and you can just AirDrop it to me. Whereas if you text it to me, especially if I have an Android phone-

    4. MM

      Yeah.

    5. JR

      ... ugh, it becomes this disgusting version.

    6. MM

      Yeah, it'll down-sample it and it's-

    7. JR

      Ugh.

    8. MM

      Yeah, yeah, yeah.

    9. JR

      It looks terrible.

    10. MM

      Yeah. No, that's true. That's true.

    11. JR

      Yeah. Photographs are not too bad. I think that it, it does a down-sample of photographs as well?

    12. MM

      Sure. Yeah, yeah.

    13. JR

      But not too bad.

    14. MM

      Yeah.

    15. JR

      It's like, you could look at it. It looks like a good photograph.

    16. MM

      Yeah, yeah, yeah.

    17. JR

      But video's just god-awful. Y- It's embarrassing when someone sends you a video and you have it on an Android phone and you're like, "What the fuck did you just send me?"

    18. MM

      (laughs)

    19. JR

      "This is terrible." (laughs) Like, "What did you take this with? A flip phone from the '90s?"

    20. MM

      Yeah. Yeah, yeah.

    21. JR

      "It's so bad."

    22. MM

      But, I mean, a lot of that is, like, uh... I, I think the reason that, that... the, the reason why it is that way is, is kind of interesting to me, which is, you know, it's like these are, um, protocols. You know, it's like when you're just using a normal SMS message on Android, you know? Uh, that was, like, this agreement that phone carriers made with each other in, like, you know, whenever-

    23. JR

      2002?

    24. MM

      No, before that. Way before.

    25. JR

      Really?

    26. MM

      You know?

    27. JR

      '96?

    28. MM

      Yeah, yeah. Exactly, you know? And then have, they've been unable to change the way that it works since then because, um, you have to get everyone to agree.
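The frozen-by-agreement protocol Moxie is describing is the original GSM SMS standard, whose per-message limits are still enforced today: a standalone message carries 160 GSM-7 characters (70 in UCS-2), and concatenated messages lose header bytes per part, leaving 153 (or 67) characters per segment. A minimal sketch of that arithmetic, assuming those spec limits; the helper name is ours, not from any carrier API:

```python
import math

# Per-part capacities from the GSM SMS spec:
# a standalone message carries 160 GSM-7 characters (70 in UCS-2);
# concatenated messages spend header bytes on each part, leaving
# 153 GSM-7 characters (67 in UCS-2) per segment.
LIMITS = {"gsm7": (160, 153), "ucs2": (70, 67)}

def sms_segments(n_chars: int, encoding: str = "gsm7") -> int:
    """How many SMS parts a message of n_chars characters needs.

    Hypothetical helper for illustration; carrier equipment performs
    the same arithmetic on every message it relays.
    """
    single, multi = LIMITS[encoding]
    if n_chars <= single:
        return 1
    return math.ceil(n_chars / multi)
```

A 161-character message therefore costs two parts, not one. This is the kind of rule no single carrier can change unilaterally, because every other carrier's equipment assumes it, which is the "you have to get everyone to agree" problem above.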

    29. JR

      Right. And is Apple holding back some sort of a universal standard? Because if they did have a universal standard, then everyone would have this option to use. You could use a Samsung phone or a Google phone. You could use anything and everybody would be able to message you clearly without a problem. Like, one of the things that holds people back is if you switch from an iPhone to an Android phone-

    30. MM

      Yeah.

  5. 1:00:00–1:12:15


    1. MM

      It's not like it used to be."

    2. JR

      Missed it. (laughs)

    3. MM

      What do you think? It's like... And now people are like, "Have you been?" I was like, "I went once in 2000." They're like, "Wow, wow, that's when it was like the real deal." You know, like, eh, I don't think so. Uh, it's one of those things where it's like, you know, there's like day one and then on day two they're like, "Ah, it's now like day one."

    4. JR

      Right.

    5. MM

      You know?

    6. JR

      Of course, of course.

    7. MM

      Like, and it just gets worse and worse. Uh, but, uh, yeah, I don't know. Those things, those spaces were important to me and, like, an important part of my life. And as more of our life started to be taken over by technology, um, you know, me and my friends felt like those spaces were missing online, you know. And so we wanted to demonstrate that it was possible to create spaces like that. And, um, there had been a history of people thinking about, um, cryptography in particular and, uh, (laughs) and which is kind of funny in hindsight, right? Uh, so in the, like, '80s... So the history of cryptography is actually not long, like, uh, at least in, outside of the military, you know. Uh, and, you know, it really in- starts in the '70s, um, and, uh, there were some really important things that happened then. And in the '80s there was this person who was just sort of this lone maniac who was, like, writing a bunch of papers about cryptography during a time when it wasn't, wasn't actually that relevant because there was no internet, the-... you know, the, the, the applications for these things were harder to imagine. Um, and then in the late '80s, there was, um, this, uh, guy who wrote a... who was a, a retired engineer who discovered the papers that this maniac, David Chaum, had been writing and was really fascinated-

    8. JR

      Was he doing this in isolation or was he a part of a project or anything?

    9. MM

      No, I think David Chaum was, um... I think he's an academic. I'm, I'm actually, I'm actually... I'm embarrassed that I don't know. But, uh, he, um, he did a lot of the, the notable work, um, on, uh, using the, the primitives that had, had already been developed. And, um, he had a lot of interesting ideas and there's this guy, uh, who was a retired engineer, his name was Tim May, uh, who was, uh, kind of a weird character, and he found these paper, papers by David Chaum, was really enchanted by, um, what they could represent for a future. And he wanted to write, like, a sci-fi novel about... that was sort of predicated on a world where cryptography existed and there was a, a future where the internet was developed. And so he wrote some notes about this novel, uh, and he, he titled the notes The Cryptoanarchy Manifesto. And, uh, he published the notes online and people got really into the notes. Um, and then, uh, he started a mailing list in the early '90s called The Cypherpunks Mailing List and all these people started, you know, joined the mailing list and they started communicating about, you know, what the future was gonna be like and how, you know, they needed to develop tr- cryptography to live their, you know, cryptoanarchy future. Uh, and, um, at the time, uh... it's strange to think about now, but cryptography was, uh, somewhat illegal. It was, it was regulated as a munition.

    10. JR

      Really?

    11. MM

      Yeah. So if you wrote a b- a little bit of crypto code and you sent it to your friend in Canada, that was the same as, like, shipping Stinger missiles across the border to Canada.

    12. JR

      Wow.

    13. MM

      So yeah.

    14. JR

      So did, did people actually go to jail for cryptography?

    15. MM

      Uh, there were, like, uh, some high profile legal cases. Um, nobody... I don't know of, uh, any situations where people were, like, tracked down as, like, munitions dealers or whatever, but it really, um, hampered what people were capable of doing. Uh, so people got really creative. There's some people who wrote some crypto, uh, software called Pretty Good Privacy, PGP, and they, uh, they printed it in a book, like a MIT press book, in a machine readable font. Uh, and then they're like, "This is speech." You know, "This is a book." You know, it's like, "I have my First Amendment right to, like, print this book and to distribute it." And then they, like, shipped the books to, like, Canada and other countries and stuff, and then people in those places scanned it back in, uh, to computers and they were able to make the case that they were, uh, legally allowed to do this because of, you know, their First Amendment rights. Uh, and-

    16. JR

      Hmm.

    17. MM

      ... um, people, uh, other people moved to Anguilla and started, like, writing code in Anguilla and, like, shipping it around the world. Uh, there were a lot of people who were fervently interested-

    18. JR

      Why Anguilla?

    19. MM

      Uh, 'cause it's close to the United States and, uh, there were no laws there about producing-

    20. JR

      Hmm.

    21. MM

      ... cryptography, so, uh, I think... I don't know, it's something people (inaudible)

    22. JR

      They have, like, three cases of COVID there, ever.

    23. MM

      Oh, really?

    24. JR

      Yeah. It's a really interesting place.

    25. MM

      Yeah. I, I used to work down there. It's, it's-

    26. JR

      Really? Uni- Okay. International Traffic in Arms Regulations. "It's a United States regulatory regime to restrict and control the export of defense and military-related technologies to safeguard US national security and further US foreign policy objectives."

    27. MM

      ITAR.

    28. JR

      Yeah, they were closed. Anguilla was closed till, like, uh, November. They wouldn't let anybody in.

    29. MM

      Oh, really?

    30. JR

      And, uh, yeah, if you wanna go there, they have, like... I was reading they had all these crazy restrictions. You have to get COVID tested and you have to apply and, and, and then when you get there, they test you when you get there.

Episode duration: 3:02:24


Transcript of episode WDe-mUxR7P0
