The Joe Rogan Experience

Joe Rogan Experience #1768 - Dr. Robert Epstein

Dr. Robert Epstein is an author, professor, and Senior Research Psychologist at the American Institute for Behavioral Research and Technology, a non-profit, non-partisan organization that publishes data on the power of Google and other Big Tech companies to censor dissenting opinions online and sway the outcome of elections.

Joe Rogan (host) · Dr. Robert Epstein (guest)
Jun 27, 2024 · 2h 41m


  1. 0:00–15:00

    1. JR

      (drumming) Joe Rogan podcast, check it out.

    2. RE

      The Joe Rogan Experience.

    3. JR

      Train by day, Joe Rogan podcast by night, all day. (rock music) Uh, first of all, thank you for coming. I really appreciate it. This is, uh, uh, a very interesting subject, because I think, um, search engine results, uh, have, uh, always been a thing that people kind of take for granted, that the search engine results is gonna give you the most significant results at the top. And they don't really think about the fact that this is kind of curated. And, uh, you know, we found that many times, because we use two different search engines. We'll use Google, and then we'll say, "Well, if we can't find it on Google, use DuckDuckGo." And oftentimes, when you're looking for something very specific, you'll find that you can't find it on Google. Like, they're, uh, if it's in there, it's deep, deep, deep, you know, many pages in. Whereas, DuckDuckGo will give you the relati- you know, the relevant search results very quickly. So something's going on with search engines, and from your research, what you have found is that it can significantly affect the results of elections.

    4. RE

      Well, not, not just that. It can affect (clears throat) how people think. It can affect, um, your opinions, attitudes, the purchases that you make, uh, pretty much... It's, it's, it's a mind control machine. It's, it's the most powerful mind control machine that's ever been invented. Uh, and by the way, you should never use the Google search engine. Never.

    5. JR

      Never?

    6. RE

      Never.

    7. JR

      Why is that?

    8. RE

      Because it, this is what I call, um... And this isn't, this is an S&M platform, and I'm, I'm not sure what S&M means to you. I don't wanna pry into your personal life, but, uh, point is that, uh, but I... Well, what I mean by S&M is it, this is a surveillance and manipulation, um, platform. Uh, on the surface, there are always two, two levels to everything with Google. On the surface, it's a, it's like a free public library kind of thing, right?

    9. JR

      Yes.

    10. RE

      That's always on the surface. Beneath the surface, it's something different. From a business perspective, it's an S&M platform. It exists for two purposes only, and that is to trick people into giving up lots and lots of personal information. Notice your public librarian doesn't do that. Did you notice that? They don't actually do that (laughs) , you know?

    11. JR

      Right.

    12. RE

      And then it's also used for manipulation, because they discovered quite a few years ago that if they control the ranking of the search results, they can control people's opinions, purchases, votes. Now, they can't control everyone's opinions, because a lot of people already have strong opinions, so the people they're going after are the people who are undecided, the people who are vulnerable, and they know exactly who those people are. And they literally... Your, your mind is putty in their hands. Uh, so you should never, ever use Google or any, any other S&M product. Like, Amazon Alexa is an S&M product. Um, or the Google Home device or Android phones. Uh-

    13. JR

      Android phones are bad?

    14. RE

      An Android phone is an S&M device. It's, it's always listening. It's always recording.

    15. JR

      Android phones are always recording you?

    16. RE

      (laughs) Are you serious?

    17. JR

      Yeah, I mean, uh, just, I'm, I'm questioning this. I mean, I believe you, but uh-

    18. RE

      Mm-hmm.

    19. JR

      ... I just want you to elaborate.

    20. RE

      Oh, yeah. There have been court cases in which the recordings have been subpoenaed, uh, from, uh, whoever's controlling that per- you know, that so-called personal assistant or that, that device, and courts have recovered recordings and transcripts, uh, when people are not even aware that they're, that they're being monitored.

    21. JR

      I know that's the case with Alexa, right?

    22. RE

      Yes.

    23. JR

      But that's the case with Android phones as well?

    24. RE

      Yes. In fact, And- Android phones, uh, uh, uh, the equipment to, to, to prove this, which I didn't bring but... Um, it's so cheap now that literally anyone can confirm this. Android phones, even if they're disconnected from your, uh, your mobile service provider, even if you pull out the SIM card, okay, as long as the power is on, it is recording, tracking every single thing that you do. So if you use it to read things, if you use it to listen to music, you use it to shop, whatever it is, it's, it... And of course, your location is always tracked. Then when you go back online, the moment you're, you're reconnected, it, it uploads all that information. So, some people wonder why their batteries run down sometimes even when you're not really doing anything with your phone. That's because, uh, with Android phones, it's, I think it's 50 to 60 times per hour it's uploading. Uh, it's uploading about 12 megabytes of data per hour, so that's a lot of, a lot of energy. That, that, that requires energy. Uh, so it... I mean, the kind of phone I have is completely different. It doesn't do anything like that.
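The upload figures Epstein cites here are easy to sanity-check with back-of-envelope arithmetic. Taking the claimed numbers at face value (roughly 12 MB uploaded per hour, in 50–60 bursts per hour; these are his claims, not verified measurements):

```python
# Back-of-envelope check of the quoted upload figures (claimed, not verified).
MB_PER_HOUR = 12
UPLOADS_PER_HOUR = 55  # midpoint of the claimed 50-60 range

kb_per_upload = MB_PER_HOUR * 1024 / UPLOADS_PER_HOUR
mb_per_day = MB_PER_HOUR * 24

print(f"~{kb_per_upload:.0f} KB per upload burst")    # ~223 KB
print(f"~{mb_per_day} MB per day if always powered")  # ~288 MB
```

On those assumptions each burst is only a couple hundred kilobytes; the battery cost he describes would come from the radio waking 50–60 times an hour rather than from the data volume itself.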

    25. JR

      What do you have, like a no agenda type phone? Do you know that show No Agenda?

    26. RE

      No.

    27. JR

      It's, uh, uh, my friend Adam Curry who's the original Podfather, he's the guy who invented podcasting, and he, uh, his company develops these de-Googled phones where they take out all the tracking stuff, all, everything, and it's, uh, it's, it's basically running on a different operating system.

    28. RE

      Right. So I have a phone that runs on a differ- different operating system. It's completely de-Googled, and, um, and-

    29. JR

      What do you got? Can you show it to me?

    30. RE

      Yeah, I can show it to you (laughs) .

  2. 15:00–30:00

    2. RE

      So they, they did it to show that they could do it and have their fun, but they didn't wanna get attention. And if they'd, if they'd interfered with, with financial transactions, they would have gotten a lot of attention.

    3. JR

      So no one was ever caught?

    4. RE

      No one was ever caught, but they never denied that this happened either.

    5. JR

      So this was done through Google, for sure. They know this how?

    6. RE

      They, they... It's, it's... It's reported in the news reports and the, and Q- Google was queried and Google said, "Yeah, yeah, that did happen. Yeah, we fixed it."

    7. JR

      So how does Google have the ability to even do something like that? How can that even be done?

    8. RE

      Well, that, that's what I explain in that article, is it's, uh, they, they have blacklists. Uh, let me, let me jump ahead and then I'll...

    9. JR

      Okay.

    10. RE

      Okay. But let me just jump ahead for a second 'cause I, you, you gotta, you gotta see really how sinister this whole thing is. It's just... Seriously, if you knew, (laughs) if you knew a half of what I know about all this dark tech stuff, you would just say, "The hell with it," and just give up. You'd say, "I don't wanna, I don't wanna bring up kids in this kinda world. This is too crazy." But anyway, blacklist. So, I'm telling-

    11. JR

      I feel, I feel like we need to stop you there and make you elaborate. Like, what are you saying?

    12. RE

      Well, there's... Well, what, what I ended up doing, which I think we should get to later in some detail if you're-

    13. JR

      Okay.

    14. RE

      ... if you're still interested-

    15. JR

      Yes.

    16. RE

      ... what I ended up doing was I started doing, uh, randomized controlled experiments to see what kind of shenanigans are out there, to see what kind of power these companies have, especially Google. And I am still, almost month by month, making more discoveries, running more experiments, getting very disturbing data. I mean, so disturbing. We, we just figured out, uh, I think within the last month, that a single question and answer interaction on Alexa... So you ask Alexa a question, and let's say it's about, uh, I don't know, some political issue or political candidate, something you're undecided about. So you ask Alexa and Alexa gives you back an answer. And the answer, let's say, has a bias. In other words, it favors, you know, one candidate, favors one party, favors one cause, right? A single question and answer interaction on Alexa in a group of, let's say, 100 undecided people, uh, can shift opinions by 40% or more. One interaction. If there are multiple questions asked on that topic over time, you can get shifts of 65% or more with no one having the slightest idea that they have been manipulated.

    17. JR

      But are they doing it to manipulate you or is it just the fact that they distribute this information based on their algorithm? It's manipulating you just by default because the higher or more likely you find this information from the search engine...... like, that's what you're gonna... that's what's gonna influence your opinion. But are they doing it to influence your opinion or is that just the best answer? Like, if you have a question-

    18. RE

      (laughs)

    19. JR

      ... who, who is Dr. Robert Epstein?

    20. RE

      Yes. Who, who is he?

    21. JR

      Who is he?

    22. RE

      Yes, exactly.

    23. JR

      That's you. So if I ask that to Alexa, and then it pulls up these results, it's gonna pull up supposedly the most relevant result. Now, are they... they have to have some... if you're g- if you've got something like Alexa, where you're asking a question and it's just reading it back to you, there has to be, like, some sort of curation of that information, right?

    24. RE

      Confession.

    25. JR

      Okay.

    26. RE

      Okay. I, uh, have, have not followed Joe Rogan over the years. Okay? I have five kids. My two eldest sons are like the biggest... your biggest fans in the universe. My eldest son is technically a bigger son, a bigger fan than the other son because he's recently gained 60 pounds because of COVID, so he's definitely the bigger of the two fans. This is Julia-

    27. JR

      I get it.

    28. RE

      ... Julian and Justin. Yeah, you get it. So, anyway, uh, but I, I'm not... I don't follow Joe Rogan, right?

    29. JR

      Okay.

    30. RE

      So now I've had to bone up and actually had to, had to listen. I, I was forced. I had to listen to some of your shows and, you know, I'm thinking, "Wow, this is interesting. This guy is genuinely curious about things." You, you really are genuinely curious. It's crazy.

  3. 30:00–45:00

    1. RE

      ... uh, uh, uh, g- Google's, Google's influence on the internet is ... it's, it, it's beyond monopoly. They're, they're, they're really in charge. Outside of China and North Korea, they're in charge of pretty much everything that happens on the internet. Um-... uh, Yahoo. Here, let's take Yahoo. Yahoo used to be one of the big search engines.

    2. JR

      Yeah.

    3. RE

      And some people still use it, except Yahoo stopped crawling the internet, uh, about five years ago or more. W- they don't crawl the internet anymore. They get their, their content from Google.

    4. JR

      Really?

    5. RE

      Yeah.

    6. JR

      So Yahoo isn't really a search engine, it just searches Google?

    7. RE

      And your second-favorite, uh, DuckDuckGo is also not a search engine.

    8. JR

      God dammit. What is it?

    9. RE

      It, it, they, they have a crawler, they do have a crawler, so, so, but they, and they do a little crawling, but actually what DuckDuckGo does is it, it's a database aggregator. They're checking databases.

    10. JR

      And what is the difference there?

    11. RE

      Oh, night and day. In other words, Google is literally looking at, you know, billions of websites every day, and it's looking for updates and changes in new websites and this and that. So it's crawling and it's extracting information, especially looking for links, 'cause that's how it gets you good information. It looks for what's linking to what. Okay? Uh, but DuckDuckGo doesn't do that. DuckDuckGo is looking at databases and information and it's trying to answer your question based on information that is in databases, lots of different databases. That's not what Google does. Google's really looking at the whole internet.
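The distinction Epstein is drawing — a crawler that discovers pages by following links versus an aggregator that answers from pre-built databases — can be sketched in toy form. The page graph and "databases" below are invented for illustration and bear no relation to either company's real systems:

```python
# Toy contrast: link-following crawler vs. database aggregator.
from collections import deque

PAGES = {  # page -> (topic, outgoing links) — invented example graph
    "a.com": ("home", ["b.com", "c.com"]),
    "b.com": ("news", ["c.com"]),
    "c.com": ("blog", []),
}

def crawl(seed):
    """Breadth-first crawl: discover pages by following links and
    count in-links — a crude version of 'what's linking to what'."""
    inlinks = {}
    seen, queue = {seed}, deque([seed])
    while queue:
        page = queue.popleft()
        for link in PAGES[page][1]:
            inlinks[link] = inlinks.get(link, 0) + 1
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return inlinks

DATABASES = [  # an aggregator just queries existing sources
    {"query": "news", "answer": "b.com"},
    {"query": "blog", "answer": "c.com"},
]

def aggregate(query):
    """Aggregator: no discovery at all — look the query up in known databases."""
    return [db["answer"] for db in DATABASES if db["query"] == query]

print(crawl("a.com"))     # {'b.com': 1, 'c.com': 2}
print(aggregate("news"))  # ['b.com']
```

The crawler can find pages nobody told it about and rank them by link structure; the aggregator can only return what its databases already contain — which is the "night and day" difference described above.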

    12. JR

      And the Brave search engine, what does it do?

    13. RE

      Um, the Brave search engine is crawling, so it is crawling. Uh, it can't do it at the same level that Google can, but obviously, b- but this guy, you know, Brendan Eich is very ambitious, so he's, you know, he wants to do it at that level. So no, no, they're, they're doing ... Brave is trying to do what, what Google does, except, uh, preserving privacy and suppressing ads.

    14. JR

      And it seems like what happened with Google before anyone even understood that the data is so valuable, before anyone even under- it was too late. It was already, uh, an inexorable part of day-to-day life that people were using that and that people were using Gmail and using all these services and just giving up their data.

    15. RE

      Yeah. (laughs)

    16. JR

      Yeah.

    17. RE

      Well, that's-

    18. JR

      So there's no, there's no considerate ... Like, there's no regulation?

    19. RE

      No, there's no regulation. There are no laws. Uh, and in fact, uh, the courts have ruled over and over again when someone has gone after Google, uh, that Google can do whatever they want. So I'll give you an example, a case I was following very closely, and I was kind of working with these people to some extent, Florida company called EVentures. S- again, someone at Google, it might've been that same guy that I mentioned earlier, I think his name was Matt Cutts or something like that, they, they all of a sudden shut down hundreds of URLs that were, that w- that were just kind of ... that EVentures was using for its business, saying they were not good quality.

    20. JR

      Mm.

    21. RE

      Okay? That, that ... I mean, for you to get that much information out of Google is like pulling teeth, because normally they, they just don't tell you anything. But anyway, so they shut them down, nearly shut down the company. The company decided to sue. So Google, of course, kept them hung up in court for like a couple years 'cause they wouldn't, uh, they wouldn't provide any, any information through discovery. (laughs) And that's ... Google always does that. They just, they won't ... They just stonewall you just even on discovery, which is like preliminary stuff before a lawsuit. Anyway, so EVentures keeps pushing, pushing, pushing, pushing, finally goes to court, and EVentures loses, and they're slaughtered. Literally, the decision of the judge in the case was Google is a private company. It can do what it wants. It can demote you in search eng- in, uh, search rankings. It can, it can delete you. It can block access to your websites. It can do anything it wants. That, that ... Literally, that was the decision.

    22. JR

      So let's say if Donald Trump runs again in 2024 ...

    23. RE

      (laughs)

    24. JR

      ... and they have a Trump campaign website. Google can decide that that website is of poor quality and deny people access to it so that when people go to google "Donald Trump," they will never see his website.

    25. RE

      Correct.

    26. JR

      That's wild.

    27. RE

      Well, they, they block access every day to several million websites, so it's not ... It's, this is not a rare thing that they do.

    28. JR

      And they block access based on their own decisions. Like, they, internal, they don't have to justify them.

    29. RE

      Correct.

    30. JR

      They don't have to have a criteria that they can establish that they're doing the right thing. They t- just do it.

  4. 45:00–1:00:00

    1. RE

      power, ever.

    2. JR

      Mm-hmm.

    3. RE

      So value is a second, and they real- and this is serious. One of the leaks from Google was an eight-minute video, which y- you should definitely watch. It's so creepy, and it's called The Selfish Ledger, and it's eight minutes, and it was, it was put together by their advanced, their super secret advanced products division. It was never meant to leak out of that company. And I, I have a transcript of it too, which I've published, so I can get you all that stuff. But point is, what is this about? This is about the ability that Google has to re-engineer humanity, uh, according to company values.

    4. JR

      Re-engineer humanity according to company values?

    5. RE

      Yes.

    6. JR

      And, and this is a directive, like this is something they're doing purposely or aware of?

    7. RE

      Well, in the videos, uh, th- in the video, they're, they're presenting this as, as an ability that we have.

    8. JR

      Jesus Christ.

    9. RE

      This is an ability that we have. (laughs) Uh, so that's the second area. You, you nailed it. Third one you didn't mention. The third one is, uh, intelligence, because they, uh, they had some support, um, Page and Brin, right in the very beginning at Stanford. They had some support and, and had to be in regular touch with, uh, representatives from the NSA, the CIA, and another intelligence agency. The intelligence agencies, uh, were doing their job. Okay? They, they realized that the internet was growing. This is l- 1990s. So they realized that the internet is growing, and they were thinking, "Hey, the, these are people building indexes, indices to the content. So sooner rather than later, we're gonna be able to find threats to national security by looking at what people are looking up. If someone is going, uh, online, they're using a search engine to, to find out instructions for building bombs, for example, okay, that's a potential threat to national security. We wanna know who those people are." So right from the outset, okay, and this is totally unlike Brave, okay? Brave doesn't do this, but right from the very, very beginning, the, the Google search engine was set up to track and preserve search history. So in other words, to, to, to keep track of who's doing the search and where did they search? That is very, very important to this day for intelligence agencies. So Google, to this day, works very closely with intelligence agencies, not just in the US, but other agencies around the world. So those are the three areas. Money, values, intelligence, and the intelligence stuff, uh, is legit. I mean, it's legit, you know. I- it is an obvious place. If you're, if you're in law enforcement, uh, that's an obvious place to go to find bad guys and girls.

    10. JR

      Yeah. So Google has this ability that they've proclaimed that they can cor- sort of shift culture and direct the, the opinion of things and direct p- public consciousness.... what percent, like, how much of a percentage do you, do you think they have in shifting? Do they have, like, a 30% swing?

    11. RE

      (laughs)

    12. JR

      Like, what?

    13. RE

      Well, see, this is what I do. Now you're getting, now you're, now you're getting close to what I actually do, what I've been doing for, for now for over nine years. I quantify, this is exactly what I do every single day, that's what I do. My, that's my, my team, my staff, that's what we do, and it's, and it, and it's cool.

    14. JR

      Mm-hmm.

    15. RE

      And talk about cool. We're, we're w- we're doing the cool stuff now, okay? Google is not. We're doing the cool stuff, because we are l- we have discovered a number of different tools that Google, and to a lesser extent other companies, use to shift thinking and behavior. And what we do in randomized controlled experiments, which are also counterbalanced and double-blind and all that stuff, we measure the ability that these tools have to shift thinking and behavior, and we, we pin it down to numbers, percentages, proportions. Uh, uh, we, we can make predictions in a, in an election about how many votes can be shifted if they're using this technique or these three techniques or ... And, uh, and so we, yeah, that's what we do. So we've, we started with the search engine, and, uh, and we, it, it took years and years of work, but we, we, we really, I think at this point, uh, have a good understanding of what the search engine can do. Uh, but then along the way, we discovered other tools that they have and which they are definitely using, and how do we know they're using these tools? Well, we can get to that, but-
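The kind of randomized controlled experiment Epstein describes — random assignment to a biased or unbiased condition, then measuring the shift in preference between groups — can be sketched minimally. Everything here is invented for illustration (group sizes, the effect size, the 50/50 undecided baseline); it shows only how a "shift" percentage would be computed, not his team's actual protocol:

```python
import random

random.seed(0)  # reproducible toy run

def run_experiment(n=1000, bias_effect=0.4):
    """Toy randomized controlled experiment (all numbers invented):
    undecided participants are randomly assigned to see biased or
    unbiased content; we then compare preference rates across groups."""
    results = {"biased": [], "control": []}
    for _ in range(n):
        group = random.choice(["biased", "control"])
        # undecided baseline: 50/50 chance of favoring the candidate
        p = 0.5 + (bias_effect if group == "biased" else 0.0)
        results[group].append(random.random() < p)
    rates = {g: sum(v) / len(v) for g, v in results.items()}
    return rates, rates["biased"] - rates["control"]

rates, shift = run_experiment()
print(rates, f"shift = {shift:.0%}")
```

With an assumed 0.4 effect, the biased group lands near 90% and the control near 50% — the same shape as the 50/50-to-90/10 shift he reports later for manipulated search suggestions. Real studies would add counterbalancing and blinding, which this sketch omits.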

    16. JR

      What are the tools?

    17. RE

Well, the first one we called SEME, the search engine manipulation effect, and that means they're either allowing, uh, you know, one candidate or one party to rise to the top, you know, in search rankings, or they're making it happen, and you don't know for sure whether, you know, which is, which is occurring unless there's a whistleblower or there's a leak. Okay, but the fact that it's occurring at all, that's important.

    18. JR

      Right.

    19. RE

I mean, we don't, in a way, we don't care, because if it's just the algorithm that's doing it, well, that's horrible. That means that, that means literally a computer program is deciding who's gonna be the next president, who's gonna be the next senator. Do, do we want that (laughs) decision made by an algorithm? So anyway, we, we, we spent a lot of time on that. We're still studying SEME. Uh, then we went, we, we learned about SSE, which is search suggestion effect. When you start to type ... Oh, in fact, if you have your phone handy, this will be fun. If you, if you start to type, uh, a search term into the box-

    20. JR

      Mm-hmm.

    21. RE

      ... the search box, you're, you're, there are, uh, suggestions flashed at you.

    22. JR

      Mm-hmm.

    23. RE

      As fast as you're typing, that's how fast those suggestions come.

    24. JR

      Right.

    25. RE

      Well, guess what? We learned in controlled experiments that (clears throat) by manipulating the suggestions that are being flashed at people, we could turn a 50/50 split in a group of undecided voters into nearly a 90/10 split.

    26. JR

      Wow.

    27. RE

      Without anyone having the slightest idea that they're being manipulated.

    28. JR

      Whoo.

    29. RE

      Just, just by manipulating search suggestions.

    30. JR

      Just by suggesting.

  5. 1:00:00–1:15:00

    1. RE

      Google. Ever.

    2. JR

      Really?

    3. RE

      Well, wa- watch.

    4. JR

      Okay.

    5. RE

      Okay. So, you got goo- Google up there, right?

    6. JR

      Yes.

    7. RE

      In s- you're in the search box.

    8. JR

      Yes.

    9. RE

      Type A. What's it suggesting?

    10. JR

      Amazon.

    11. RE

      (laughs) Yeah. Well, it's doing more than one suggestion. What are the suggestions?

    12. JR

      Amazon, uh, Academy Sports and Outdoors, Amazon Prime, Houston Astros, uh, then a bunch of other people.

    13. RE

      Mm-hmm.

    14. JR

      Alamo Drafthouse, American Airlines.

    15. RE

      So, your first and third suggestions, (clears throat) notably the first position is the most important-

    16. JR

      Mm.

    17. RE

      ... are Amazon.

    18. JR

      Yes.

    19. RE

      Well, it turns out everywhere in the world where Amazon does business, if you try to search for anything beginning with a letter A and you type A-... Google suggests Amazon. Why is that? Well, it turns out Amazon is Google's largest advertiser, and Google is Amazon's largest single source of traffic. It's a business relationship. Get it?

    20. JR

      Hmm.

    21. RE

      If you type T, you're gonna get Target, and so on. But, what's interesting is when you type G.

    22. JR

      Okay.

    23. RE

      Just type G.

    24. JR

      All right. (clicks keyboard) What do you think I'm gonna get?

    25. RE

      Well, tell us, tell us what you got.

    26. JR

      Grand Seiko.

    27. RE

      Nothing interesting on there at all?

    28. JR

      No. Gastronomical, and then number four is Google Translate. Number five is Gmail. Number six is Google.

    29. RE

(laughs) Okay. Oh, I'm starting to see a pattern here.

    30. JR

      Yeah, but I mean, like, the first ones are all, like, something that I would look up.

  6. 1:15:00–1:30:00

    1. RE

      we have control over what the Up Next algorithm suggests. And guess what we can do with our Up Next algorithm?

    2. JR

      What?

    3. RE

      Well, it should be obvious. (laughs)

    4. JR

      You can manipulate people.

    5. RE

      Yeah, we manipulate people.

    6. JR

      Yeah.

    7. RE

      We randomly assign them to this group or that group, and we just push people any old way we wanna push them.

    8. JR

      And when you're doing these tests and studies, like, how are you doing this? Like, are you doing ... How many people are involved in this? Are they students? Like, what ... How are you, how are you doing this?

    9. RE

      Okay, we- we- we never do the, you know, subject pool at the university where you get, you know, 50 students from your- your college to take, you know, to be your research.

    10. JR

      Mm-hmm.

    11. RE

      We never do that. So we're- we're always reaching out, uh, to the community or we're doing things online. So we do big studies online, and we are getting very deg- diverse groups of people. We're getting, uh ... Literally, we're getting people from, uh, lists of registered voters, so we're getting people, you know, who- who- who look like the American population, and we are ... We- we can mess with them. Can I say "we can fuck with them"?

    12. JR

      You just did.

    13. RE

      Oh. I guess I just did.

    14. JR

      (laughs)

    15. RE

      Oh, this is definitely not Fox. This is not Fox.

    16. JR

      No.

    17. RE

      No. No.

    18. JR

      This is ... We're on the internet.

    19. RE

      This is not Fox News. Uh, yeah, but the internet ... See, the internet, though, because there are no regulations and rules, it does allow for some pretty evil things to take place. And the fact is, in our experiments, we- we- we do these ... Usually our experiments have hundreds of people in them. Sometimes they have thousands of people, and we can fuck with people and they have absolutely no idea. Let me j- ... I'll tell you about something new, okay?

    20. JR

      Okay.

    21. RE

      Something new, brand new. Okay, and this is f- ... Uh, uh, thank God I'm not talking about Google this time. Just talking about something else that's happening. There are websites that will help you make up your mind about something. So for example, there's a whole bunch of them right now that'll help you decide whether you're really a Democrat or you're really a Republican. And the way they do that is they give you a quiz, and based on your answers to how you feel about abortion and immigration and this and that, at the end of the quiz they say, "Oh, you are definitely a Republican. Sign up here if you wanna join the Republican Party." And this is called opinion matching, and the research we do on this is called OME, the Opinion Matching Effect. And there are hundreds of websites like this, and when you get near an election, a lot more of them turn up because the Washington Post will give you a quiz and help you decide who to vote for, and Tinder ... Tinder-
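The "opinion matching" mechanism described here — answer a quiz, get told which party you match — reduces to a simple tally, and the sketch below shows where the manipulation lever sits. The questions, answer-to-label mapping, and labels are all invented for illustration:

```python
# Toy "opinion matching" quiz. Whoever writes the questions and the
# answer->label mapping controls what "match" every taker receives.
QUIZ = [
    ("Should taxes on high earners rise?",      {"yes": "D", "no": "R"}),
    ("Should immigration limits be tightened?", {"yes": "R", "no": "D"}),
    ("Should abortion access be protected?",    {"yes": "D", "no": "R"}),
]

def match_party(answers):
    """Tally the answers and declare whichever label scores higher."""
    score = {"D": 0, "R": 0}
    for (_, mapping), answer in zip(QUIZ, answers):
        score[mapping[answer]] += 1
    return max(score, key=score.get), score

party, score = match_party(["yes", "no", "yes"])
print(party, score)  # D {'D': 3, 'R': 0}
```

Nothing in the tally is visible to the quiz-taker, which is the point Epstein is making: the "match" is only as neutral as the hidden mapping behind it.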

    22. JR

      Hmm.

    23. RE

      ... okay, which is used for sexual hookups, uh-

    24. JR

      How about romantic, sir?

    25. RE

      Oh.

    26. JR

      Not just sex. Sorry.

    27. RE

      Uh, my mistake.

    28. JR

      (laughs)

    29. RE

      Uh, so Tinder actually set up a swipe the vote option on Tinder during the 2016 election. You swipe left if you think this, you swipe right if you think that, and then at the end of it, they say, "Oh, you should be voting for Hillary Clinton."

    30. JR

      Mm.

  7. 1:30:00–1:33:19

    1. JR

      Really?

    2. RE

      Yeah.

    3. JR

      So, when this person said that to you, what, what does this person do? What-

    4. RE

      He's an attorney general of a state.

    5. JR

      And why did he say that to you?

    6. RE

      Because he was concerned. He thought I was pissing people off who had a lot of power and that, um, they wouldn't like that.

    7. JR

      And how did your wife die in that accident? What were the circumstances?

    8. RE

      Um, she (clears throat) lost control of her little pickup truck that I had bought her, and, uh, got broadsided by a, uh, a massive, um, truck that was towing two loads of cement. Uh, but her pickup truck was never examined forensically, and, um, it disappeared. Uh, I was told that it, that it had been sold to someone in Mexico, and it just disappeared from-

    9. JR

      Sold to someone in Mexico. O- Obviously, it was totaled?

    10. RE

      It was totaled, and the, the wreck, uh, which I suppose was technically my property, uh, disappeared. Was never examined and disappeared and went to Mexico.

    11. JR

      Now, was this a older truck? Was it a newer truck?

    12. RE

      Uh, it was an older truck, but you know-

    13. JR

      Oh, older as in like how, how old?

    14. RE

      Uh...... like, 2002 but we kept in very good shape, had low mileage, uh, new tires.

    15. JR

      The reason why I ask is, like, what kind of, uh, computer systems were involved in cars-

    16. RE

      Oh.

    17. JR

      ... from 2002 as opposed to... Do you remember the, um, (smacks lips) the story of the journalist who, uh- Michael Hastings, who, uh, wrote a story about, uh, a general in, um, uh... during th- during the time of, um, (smacks lips) Obama's administration, there was, uh, a volcano that erupted in Iceland and, uh, he was stuck overseas. I believe it was Af- Afghanistan or Iraq? I think it was Afghanistan. So, he was over there writing a story for Rolling Stone and because he was over there for so long, because he was trapped, because no flights were going, because of the, uh, air cover was so bad because of this volcano, they got real comfortable with him. And these soldiers started saying things, not even thinking this guy is like... you know, he's not one of them. He is a journalist and he's gonna write all these things about... So, he wrote this very damning article. Uh, the general in question got fired, and then this guy, Michael Hast- Hastings, started talking about how he was in- fearing for his own life. And, uh, cut to sometime in the future, he sped up, there's actually a video of it, sped up on Sunset Boulevard, uh, towards the west side and slammed into a tree going, like, 120 miles an hour. There was an explosion. The- the car's engine was, you know, m- many yards from the- the car itself and there was a lot of speculation that not only did the government have the ability to manipulate, that intelligence agencies had the ability to manipulate people's cars, but it's something they've actively done.

Episode duration: 2:41:56


Transcript of episode TIRtBfUBMmk
