Modern Wisdom

How Pornhub Became The Internet’s Biggest Crime Scene - Laila Mickelwait

Laila Mickelwait is an anti-sex-trafficking activist, founder, and author. One of the most visited websites on the planet is more than just a site; it's a crime scene. As Pornhub rose to global dominance, a lack of regulation allowed thousands to be exploited against their will. Now, Laila Mickelwait is leading the charge to expose the truth, demand justice, and bring real accountability to an industry built on harm. Expect to learn why Pornhub isn't just a porn site but a crime scene, the story of Pornhub across the years and where it all went wrong, the major significance of the #traffickinghub hashtag, the most uncomfortable truth people ignore about online sexual abuse, why regulators decided not to act despite obvious red flags from the site, the fallout of trying to get Pornhub shut down, what changes need to occur in tech regulation to stop abuse, and much more…

Timestamps:
00:00 Pornhub Is Not A Porn Site, It's A Crime Scene
08:01 The Movement That Held PornHub Accountable
22:15 The Investigation of Pornhub
32:13 What Does A Healthy Porn Moderation Process Look Like?
49:28 How Pornhub Tried To Discredit Traffickinghub
56:22 The Dangers Of Underage Exposure To Pornsites
1:06:03 Keeping Children Safe Online By Using Aura
1:12:51 Learn More About Laila

Get access to every episode 10 hours before YouTube by subscribing for free on Spotify - https://spoti.fi/2LSimPn or Apple Podcasts - https://apple.co/2MNqIgw
Get my free Reading List of 100 life-changing books here - https://chriswillx.com/books/
Try my productivity energy drink Neutonic here - https://neutonic.com/modernwisdom

Get in touch in the comments below or head to...
Instagram: https://www.instagram.com/chriswillx
Twitter: https://www.twitter.com/chriswillx
Email: https://chriswillx.com/contact/

Chris Williamson (host) · Laila Mickelwait (guest)
Jun 19, 2025 · 1h 14m · Watch on YouTube ↗

EVERY SPOKEN WORD

  1. 0:00 – 8:01

    Pornhub Is Not A Porn Site, It’s A Crime Scene

    1. CW

      Pornhub is not a porn site, it's a crime scene. What's that mean?

    2. LM

      It means exactly what you just said. So, what I discovered, uh, about five years ago was what millions of people already knew, in that all it took to upload to the world's YouTube of porn, so this is user-generated porn, uh, the biggest porn site in the world at the time. Actually, it was the fifth most trafficked website in the world at the time. I made this discovery that all it took to upload to Pornhub was an email address that anybody, in under 10 minutes, could upload to the site, and they were not verifying ID to make sure that these were not children, and they were not verifying consent to make sure that these are not rape or trafficking victims. And because of that, the site had actually become infested with videos of real sexual crime. So, we're talking about child abuse, child sexual abuse material, we call it, this is child rape. It's also self-generated child sexual abuse material where children would be, you know, filming themselves and sharing it, and then that would get uploaded to Pornhub, which is completely illegal to be viewing and distributing that content. Um, to adult rape, you know, unconscious women, completely drunk, non-consenting, to, all the way to what we used to call revenge porn. So this would be image-based sexual abuse. So all kinds of non-consensual content, and even copyright, uh, you know, violations, where this is illegal content because it was stolen material that was being uploaded to the site. So, that was the state of Pornhub, and, you know, they had, at the time, they had 6.9 million videos that were uploaded, uh, in 2019, and this was, you know ... My fight to hold them accountable for these crimes started in 2020. And at that time, they had 56 million pieces of content uploaded to the site, and they had actually, uh, 170 million visitors per day, 62 billion visitors per year, and enough content being uploaded that it would take 169 years to watch if you put those videos back to back. 
So that's how much content was being uploaded. And mind you, this is now anybody with an iPhone. So anybody, anywhere in the world that had a camera could film a sex act and with no checks whatsoever, using a VPN even, to be even more anonymous. They could upload this content to Pornhub and it was infested with videos of crime.

    3. CW

      Why was this your job to find? This seems like ... 62 billion visits per year, one of those 62 billion visits could have sprung somebody else into action. W- why was this you?

    4. LM

      I mean, this is one of the things that really kind of amazes me even now, is that this was something that was h- hiding in plain sight. So, this was under everybody's noses. Like, really anybody could have sounded the alarm on this, and it's, it's amazing that it took from 2007 to 2020 for it to get any attention, that this was actually going on. And, you know, I am just, I am actually honored to have the opportunity to shine a light on this and to, you know, be helping the many, many, many countless victims now who've had their trauma immortalized on this site. Uh, and so I don't know why it took this long t- for it to come to light. But there's this saying that I love, and it's, i- it's, "It's an idea whose time had come." And I think that that's actually true. It was an idea. Trafficking Hub, which is the movement that I started to hold Pornhub accountable, it started with a hashtag on social media and grew and went viral. But I think Trafficking Hub was an idea whose time had finally come. And enough was enough, and it was time to, you know, expose what was going on.

    5. CW

      What's the story of Pornhub across the years? Huge company owned by an even bigger parent company. What's the arc of how they ended up where they are?

    6. LM

Sure. Yeah. It's so interesting because you think about Pornhub as a, a solo site often, but really what people don't know is that Pornhub is owned by a parent company. You know, most people in the world have probably heard the name Pornhub. Uh, you know, they had spent millions and millions of dollars to become a household brand name for porn. They had done things like massive PR campaigns to save the bees and save the whales and, uh, you know, cl- clean the oceans, and even donate to breast cancer awareness, and they had this whole arm of Pornhub called Pornhub Cares, which was like this phil- philanthropic arm. They're walking New York Fashion Week. They, you know, have faux commercials on Saturday Night Live. So, everybody pretty much understood the name Pornhub. Now, if you talk about the company that owned Pornhub, well, that's a different story. So most people had never heard of the parent company of Pornhub, which was called MindGeek. Now, MindGeek had essentially rolled up the global porn industry under one huge international multi-billion dollar corporation, because with a $362 million loan, they had, uh, actually bought up most of the world's most popular porn sites and brands. So there was a hedge fund called Colbeck Capital that had 125 secret investors that included JPMorgan Chase and Cornell University, and they had loaned these hundreds of millions of dollars to, um, which was Manwin at the time, so I'll, I'll tell you the, the history of that, but, um, to the company to buy up the world's most popular porn sites and brands. So they actually owned everything from Playboy Digital to Pornhub and all of its sister tube sites. So Pornhub and its sister sites, so that would include RedTube, YouPorn, GayTube, ExtremeTube, XTube, PornMD. I mean, I could go on and on, massive amounts of tube sites that operated all the same way.
But MindGeek used to be a company called Manwin, and before Manwin, it was a company called Mansef. And there was, you know, these m- men in Montreal that started Mansef in 2007. They had purchased the website Pornhub.com for about $2,500 at a Playboy Mansion party, and they launched Pornhub. But it was a man named Fabian Thylmann, so he's a German entrepreneur, that actually put Pornhub on the map. So he, uh, you know, was a, a very shrewd businessman, and he had this idea that he wanted the world to be able to access free porn. And so he kind of took Pornhub from, you know, a somewhat popular site to the brand name that it is today, and then he actually got in trouble for tax evasion. So the company was sold because originally, uh, the owners at Mansef were in trouble for mon- money laundering, so they had to suddenly sell the site, and they sold it to Fabian. And then Fabian got in trouble for tax evasion, and then he suddenly had to sell the site, and so he sold it to, uh, VPs at his company, and now they're in trouble again, so criminally charged by the US government. And now they're trying to sell the site again, uh, to a hastily concocted private equity firm. And so that's kind of the story of Pornhub, where you have this history where they get in trouble with the law, they rebrand. Each time they name the company something else and sell it and sell it again, and this time, they were criminally charged, uh, for intentionally profiting from the sex trafficking of over 100 women in California. So now they sold MindGeek, and now they call it Aylo, so it's a new... It's the same company. It's the same website. And many of the same owners and executives have been involved from the very beginning that are still there in Montreal today running the site.

    7. CW

      It's kind of like a dirty penny that keeps on getting passed around.

    8. LM

      Yeah. (laughs) I- that's a good follow-up on it, yes.

    9. CW

      Or like a, a cursed, a cursed penny or something that, you know, the, the holder ends up ins- getting in hot water in some

  2. 8:01 – 22:15

    The Movement That Held PornHub Accountable

    1. CW

      way. And, uh, what's the, what's the story of how you found yourself embroiled in this? I- I- I don't understand how somebody just sort of stumbles upon creating a movement that causes the biggest porn site in the world to end up being basically shut down.

    2. LM

      Yeah, so I had been in the fight for many, many years before the fight to hold Pornhub accountable began, the fight against sex trafficking. So I've been involved, now it's almost 20 years that I've been involved in the fight against sex trafficking and child sexual exploitation. So it was in the context of that work that I ha- you know, happened to test the upload system for Pornhub I wa- as I was investigating the site, because I was really concerned by some video... sorry, some s- news stories that I had heard at the end of 2019. So I was paying attention to the headlines because this is my work, right? And so there was a very concerning story that I credit to the launch of the Trafficking Hub movement, and it was about a 15-year-old girl from Broward County, Florida who was missing for an entire year, and she was finally found when her distraught mother was tipped off by a Pornhub user that he recognized her daughter on the site. And she was found in 58 videos being raped and trafficked under an account named Daddy Slut, and he had impregnated the teen girl, and she was finally rescued from his apartment when, uh, surveillance footage from a 7-Eleven was matched up with the perpetrator's face in the Pornhub videos, and they actually found the girl and rescued her. And then at the same time, the London Sunday Times had done an investigation into Pornhub and they found dozens of illegal videos on the site within minutes, even children as young as three years old. And at the time, they had called out Heinz and Unilever for advertising on Pornhub, shamed them for doing that. They actually apologized. They took their ads off the site. At that time, PayPal cut ties with Pornhub. And it was just this really, uh, you know, kind of shocking moment for anybody in the anti-trafficking and anti-child exploitation space to finally say, "Hey, what's actually going on here?" And we're hearing stories of children being abused on Pornhub, very young children, in fact. 
And that is when, you know, I had- just couldn't get those stories out of my mind. I kept thinking about them and thinking about them. And, you know, at the time, you see Pornhub in the headlines all the time, you know, just getting so much press. And I said, "How i- just, how in the world is this happening?" And that's when, you know, late one night I was putting my own, you know, fussy baby back to sleep in the middle of the night on February 1st of 2020, and I was thinking about the story of that 15-year-old girl, and that's when I said, "Look, I'm just gonna see what it takes to upload to Pornhub." And I just took a video of the rug and the dark room and the laptop keyboard and tried it and- for myself and realized it only took 10 minutes. It took a few clicks, no ID, no consent form. And that's- and then I started really paying attention. So that's when I launched the Trafficking Hub hashtag on social media. I mean, I only had, you know, a few thousand followers at the time from all of my advocacy work that I was doing, and I shared the Trafficking Hub hashtag. And the reason why Trafficking Hub is what came to my mind is because anytime you monetize, uh, an act, a sex act that involves a minor, so anyone under the age of 18 who's involved in a commercial sex act or anybody that has been induced into a sex act by force, fraud, or coercion, so this is non-consensual, uh, is a victim of trafficking when it's monetized. Now on Pornhub, it's free porn, but it's not free. I mean, these are heavily monetized porn videos, and they're monetized mostly with ads. So they were selling 4.6 billion ad impressions on Pornhub every single day, and that's how they were monetizing these millions and millions of videos, including...... child sexual abuse, rape, and all forms of non-consensual content. So that's why I said, "Hey, this is a trafficking hub. We have to hold this company accountable." 
And the hashtag just started to catch on, and then I said, "Okay, this has to go bigger than my tiny social media following. I'm gonna write an op-ed about it." So I wrote an op-ed. I sent it to a few different outlets and The Washington Examiner is who actually decided to publish it, and then that kind of started to get a bit v- a bit of virality. And so people were reading it, they were horrified as to what was going on, and then one of my followers just said, "Hey, you need to start a petition, and if you don't start one, I'll start it." So I said, "Okay, I'll do it." And so I started the Trafficking Hub Petition to shut down Pornhub and hold its executives accountable for enabling trafficking, and that started to go viral, and today we have 2.3 million signatures on the petition from every country in the world. Uh, we've had 600 organizations involved. Hundreds of survivors have come forward. At one point, survivors were coming forward to me on a daily basis saying, "I was exploited on Pornhub. My videos are still on Pornhub. I was a child. I can't get the videos down. Please help me." "I'm unconscious in this video." You know, all of these things, um, and we're able to connect them with lawyers and then, you know, since then, thousands of media articles have been written on this exposing the criminality of Pornhub, and one of the most important things that we did throughout this campaign to hold them accountable was go after the credit card companies. So we knew that the Achilles heel of Pornhub was the credit card companies because, you know, without credit card companies, you don't have a very profitable online business, uh, and actually, the former owner of Pornhub reached out to me in the midst of, you know, this viral, uh, campaign, and he said, "Listen, if you really wanna get Pornhub, you have to go after the credit card companies." And that's what we did. 
And eventually, you know, enough pressure through litigation, through lawsuits, through public pressure, through the articles that were being written about this, especially The Children of Pornhub by The New York Times, the credit card companies finally cut off Pornhub and they were forced to delete 91% of the entire website.

    3. CW

      Why did credit card companies cutting them off result in them deleting content? Did that mean that credit card companies would reactivate payments?

    4. LM

      So they were hoping that would be the case. So they understood that their site was completely infested with crime. They didn't know what was consensual and what was not consensual. They didn't know who was 16 and who was 18. They were just guessing. So, you know, one of the things that we understood was that their moderation system was a joke. So we had moderators who came forward and they exposed the inner workings of how they were ta- you know, trying to vet the content and basically they were just-

    5. CW

      What- what's the... What was the... What was the process?

    6. LM

      So it was 10 people in Cyprus, so imagine this. So 10 people who were in charge of millions of videos... Now, it wasn't just Pornhub. Remember I described MindGeek, right? They were in charge of vetting all of the porn tube sites. So again, YouPorn, RedTube, Tube8, xTube, all of them. 10 people per shift and they were just clicking through. They were expected... Like, they were actually reprimanded if they didn't click through at least 700 videos per eight-hour shift, but some of the more experienced moderators were clicking through, you know, 2,000 videos per eight-hour shift with the sound off and, you know, they were told, like... Essentially, the moderators said, "Our job was not to make sure that illegal content wasn't getting on the site." It was, "Our job was to just make sure that as much content could go through as possible." So think about that. And so, so that's how the site was just... They- they actually had no idea how many of these videos were rough sex and how many of them were rape. So the only-

    7. CW

      Yeah. So it's- it's- it-

    8. LM

      ... thing they could do-

    9. CW

      Am I- am I right in saying that the two main, uh, issues here, one is consent and the other is age? Are those sort of the two big buckets?

    10. LM

It's... Yes, it's both. Yeah. It's- it's because age... Again, like, a pediatrician can't even guess on a consistent basis who's 16 and who's 18, right? I mean, they had very young children on the site. One of the stories that just... It's probably one of the worst that I've heard is of a 12-year-old boy from Alabama who was drugged and overpowered and raped by a man named Rocky Shay Franklin, and Franklin filmed 23 videos of the assaults and he uploaded those videos to Pornhub and the police went after, uh, you know, the- the site to take those videos down when they found out, and they were ignored multiple times. For seven months, those videos stayed up even though police were demanding that they come down, getting hundreds of thousands of views. Monetized views, mind you, making money for the owners of Pornhub. But- So there's 12-year-olds on this site, three to six-year-olds on this site, but most of the victims that came forward were underaged teens. So they were young teens and teens who were under 18, and again, that's because, you know, like, they were just obviously not vetting the videos at all, but even if they were looking at a video, there's no way they could tell who was 15 and who was 18. So yeah. So you're right in saying it's underage issues and then it's consent issues, right? 'Cause also, they... How in the world could they tell if a video was non-consensually uploaded but consensually recorded? There's literally no way they could tell. So the only choice they had at that point when the credit card companies cut them off was in an- in an- an attempt to woo back the credit card companies, they said, "We have to delete all of the unverified content off our site." And so today, they have actually taken down 91% of Pornhub. So they took down over 50 million images and videos in what Financial Times has called probably the biggest take down of content in internet history, and they still have to delete more.
So they're gonna delete more-...next month. So by June 30, they're gonna have to take even more content off the site, and that's because the content that they left on the site, they left verified uploaders. Okay? So listen, the verified uploader doesn't take, take care of the problem. So they had some videos on the site that they had actually verified who the uploader was. And Rocky Shay Franklin, who I just told you about, he was a verified uploader, but that didn't mean he wasn't uploading victims in his videos. So they're going to have to take down a lot of the remaining content in the next month, um, so we'll see how much that ends up being.

    11. CW

      How much do you think will be left?

    12. LM

      I mean, it's, it's hard to say. Uh, you know, as of September of 2024, they've been forced to start verifying the age and consent of people who are in the video, so the individuals in the videos, for the new content being uploaded, and that's because they've been sued. So they've been sued now by nearly 300 victims in 27 lawsuits, and that includes class actions on behalf of tens of thousands of child victims. These are certified class action lawsuits. They have one in Alabama, one in California, and they're just getting... I mean, they could have potentially billions of dollars in damages for what's happened to these victims. And, you know, to the damages, sometimes people think of it and they kind of minimize it as, "Oh, this is just online. These are just..." You kind of think of it as pixels on the screen, and the, the actual victim is not humanized in the way that they really should be. But-

    13. CW

      Hmm.

    14. LM

      ...I think one of the things that we have to think about is the trauma that they face when these videos are uploaded online, because it's one thing to be raped or abused as a child, but then when that's recorded and then it's distributed to the world and it's distributed with a download button. So they had a download button on every single video on that site, so anybody could then download onto their device the worst moment of that victim's life and then re-upload it again and again and again forever, that they just have to then engage in this sadistic game of whack-a-mole-

    15. CW

      Hmm.

    16. LM

      ...where they're constantly in fear of, "Who's gonna upload their video to the internet now?" And they call it the immortalization of their trauma. They say... You know, one victim said, "My abuser put me in a mental prison, but Pornhub gave me a life sentence." And so the severity of this, when you think about the lawsuits and this going to trial and the facts being put before a jury, I mean, this could be massive, massive damages.

    17. CW

      It's, I mean, horrifying, but we... There's sort of two big buckets, again, of crimes that are happening, one being the actual incident, presuming that somebody isn't of age, didn't consent during the act, didn't consent to the recording, d- didn't consent to the distribution, uh, and then you've got the actual distribution on Pornhub's side. I get the sense that a lot of the ire and sort of hatred and vitriol and stuff that's directed at Pornhub is also... It's like Pornhub are a, a conduit for who did the, uh, the crime too, and a lot of the times we can't... "Who is this person? How do we find them? Where are they? Uh, investigation," so on and so forth. Uh, very difficult to do. I know sometimes people wear masks or purposefully blur faces or, you know, do things that mean that you can't see who the potential perpetrator is. Uh, so yeah, Pornhub, uh, uh, definitely gonna feel an awful lot of wrath from everyone.

    18. LM

      Yeah, and, I mean, to your point, th- there is, there's multiple levels of perpetration in this issue and what's happening. And for sure, you know, the, the person who actually did that abuse, who filmed it, they have to be held accountable, 100%. I mean, we wanna see accountability across the

  3. 22:15 – 32:13

    The Investigation of Pornhub

    1. LM

board. When it comes to Pornhub, I mean, the facts that have been uncovered in legal discovery, I mean, Nick Kristof of The New York Times, I mean, he wrote a scathing expose in 2020 called The Children of Pornhub that featured the story of one particular victim. Her name was Serena. I'll just share the, her story because it's an important story. Um, she was a young teen, so she was 13 years old, and she was from Bakersfield, California, an innocent teen. I mean, she'd never even kissed a boy, boy before. She was a straight A student. She had a crush on a boy older than her, and he coerced her and convinced her to send him some nude images and videos of herself, which she did, and she shared those, um, with him, and then he shared them with classmates, and they got uploaded to Pornhub where they got millions and millions of views. And she would beg for those videos to come down and she would be ignored because they only had one person. So we uncovered through the legal discovery process that out of the 1,800 employees working for Pornhub and MindGeek, they employed one person to be reviewing videos flagged by users as containing, uh, rape, uh, child abuse, or other f- terms of service violations. So they had one person. They had a backlog of 706,000 flagged videos. So, uh, they also had a policy where they wouldn't even put a video in line for review unless it had over 15 flags. So a victim could actually flag their video 15 times and it would never even have been put in line for review. So Serena would beg for them to come down. If she would get a hold of anybody, they would hassle her and say, "Prove that you're a victim. Prove that you were underage in this video." And if she eventually got it down, again, it would just get uploaded again. So this sent her on a spiral of despair. She ended up dropping out of school because she was being bullied. She got addicted to drugs to try to numb the pain.
She ended up trying to kill herself multiple times. Uh, this is very common among e- victims of image-based sexual abuse. So the suicide ideation rate for these victims is about 50%.... uh, and then she ended up homeless, living out of a car.

    2. CW

      50%.

    3. LM

Yes. 50% have suicidal ideation. So they think about it, so they think that ending their life might be better than enduring the pain of constantly having their trauma, um, on the internet. And so that was, you know, the trajectory of Serena. But if you think about the intentionality, so going back to, like, who's responsible in- in this- s- this situation, right? We have the- the individual perpetrator, but then the executives making the decisions, the intentional policy decisions, and we know they're intentional because we've uncovered email exchanges and messages and, you know, all of the communications and policies that they had put in place, uh, to enable this abuse to happen. So I mean, even all the way from having a VPN, where they offer a VPN to people, so not only were they not checking ID and consent, they were allowing you to anonymously upload, but then you could also access the- the site with a VPN. So law enforcement need an IP address in order to locate a perpetrator. Like, that's how you actually locate a device. So if you use a VPN, well then you're masking your location. But not only that, like, they were not reporting child sexual abuse that they were aware of to authorities for 13 years, until we finally held their feet to the fire and exposed them. So it's actually mandatory in Canada, where they have headquarters, to report. When you know about child sexual abuse, you have to report it to authorities. And they were not reporting. They were not reporting for 13 years, even though they were aware of children who were being abused on the site. And so then you think about that, it's like, how many perpetrators could have been apprehended and how many children could have been saved from years of abuse if they were actually reporting the videos to authorities like they should have been, but they were hiding it from the public?

    4. CW

      How damning are the internal documents? How did you get a ho- how does anyone get a hold of the emails of a company? What's the- th- the story of getting behind the scenes?

    5. LM

      Yeah. So one of the amazing tools through civil litigation is being able to get behind your opponent's, uh, communications, exchanges, emails, text messages, uh, internal- all kinds of internal policy documents. So as a litigation, as a civil litigation progresses, they have this period of what they call discovery, and basically they can compel the company, and they have compelled the company to release documents. And tr- it's hard for lawyers, like they- they do not give this stuff easily. They put up a fight. But these are amazing attorneys that are representing these victims, and they've been able to get this information, messages. Now, an amazing thing happened a few weeks ago, and this was the basis of Nick Kristof's recent article. So I told you about The Children of Pornhub, but actually he just released a follow-up, and it was because the court in Alabama, um, for the- the child trafficking class action lawsuit accidentally, so the court accidentally released thousands of pages of internal documents and communications and messages and emails that were supposed to be sealed. So they had actually accidentally unsealed all of this information. So now we have, I mean, it's an amazing amount of information, depositions, where they actually depose under oath, it's a crime if they actually tell a lie in these depositions, the managers, the employees, the executives, the owners, and we have all of, like, you know, and one deposition is, like, a 500-page deposition. And all of this put together, the question becomes, how in the world are Pornhub's executives not in prison? And I honestly think after this release of this evidence, they will go to prison. I- I feel confident that we will see this company properly criminally prosecuted.

    6. CW

Why aren't they in prison? Sounds to me like relatively open-and-shut case. There's already been investigations. This weird lily-padding thing where something goes wrong and then it's, we- we're rebranded over here and then we rebrand a little bit more and there's a tiny exec change, but most of the people th- behind the scenes all stay the same and it kind of doesn't really matter who's been switched in and switched out. Ooh, is it just taking a long time? I guess it's only been five years to do this. It's a big investigation. Is it, is it just the kind of slow, lumbering behemoth that is legislation happening? What's going on?

    7. LM

      I mean, there is a saying that s- you know, the wheels of justice turn slowly, right? And I think that's true. I think the wheels of justice turn slowly, but they turn, and I think especially if we keep the pressure on, they turn. I think that when we focus on something, when we give it attention, and when there's a public outcry about something, then, like, the squeaky, what is it? The squeaky wheel gets the oil or whatever those, the saying is.

    8. CW

      Mm-hmm.

    9. LM

      But yeah, if you, if we c- can continue to put pressure on those in power to do their job, uh, then I think that we will see it happen. And, uh, and so I think that's a matter of time. There was a company called Backpage and the fight to hold them accountable for child trafficking on that website, I think that was a 10-year fight. So we're at five years now from really starting to shine a light on this, and I really believe that if we can keep it up, that's public pressure coupled with civil litigation to continue to hit these companies where it hurts, in their bank accounts, that we will see the outcome of justice really being served. And why is that important? I think, you know, why is it important to hold Pornhub and its parent company accountable? You know, s- people might say, "Well, this is just one of so many different sites. This is just one website." That's true. But there is something that is real and it's called deterrence. And one of the most important things that we can do to prevent abuse...... is to deter future abusers because at the end of the day, this is a risk-benefit calculation for what I call corporate traffickers, and this is about money for them, and they're just saying, you know, "Is what's going to happen to me the cost of doing business? Or is it worse than that? Like, will I face real and serious consequences?"

    10. CW

      Hmm.

    11. LM

      And when they understand that they will face real and serious consequences, they'll make different decisions. They don't have to distribute illegal content on that site. They can, although it's expensive and it's not easy, they can make the decisions to prevent that, to, to put in those safety policies, and they have to be forced to do it. And so what we're seeing right now is the power of deterrence. Like right now, it- Pornhub's biggest competitors are proactively seeing what's happening on Pornhub, and they're actually taking down illegal content from their sites. They're changing the way that the upload process works.

    12. CW

      For f- fear of being hit with the same kind of litigation.

    13. LM

      Exactly.

    14. CW

      Right? Okay. So-

    15. LM

      Exactly.

    16. CW

... I, I think, yeah, the obvious question is Pornhub and even Aylo, formerly MindGeek, aren't the only adult websites in the world, so you shut this thing down, and it goes elsewhere, and I know that you're pushing for Pornhub itself to be shut down entirely as opposed to just meeting the standards of moderation that you would be happy with. Is this because taking down Pornhub would be a very loud shot across the bow for everybody else? And then presumably moving forward, you want what kind of moderation? Uh, why shut down, not moderation exclusively on Pornhub? And then what does a

  4. 32:13–49:28

    What Does A Healthy Porn Moderation Process Look Like?

    1. CW

      (laughs) what does a healthy porn moderation process look like?

    2. LM

Yeah. Those are great questions. From the beginning, the call to action has been to shut down Pornhub, and I absolutely mean that. I didn't say it lightly when we started, and that is because the level of harm that has been done by this company to so many victims since 2007 with impunity, with intentionality, on purpose, for profit, is absolutely unacceptable, and the only just outcome is for the site to be shut down, for reparations to be paid to all victims, significant reparations, and for there to be criminal prosecution, and that's what justice served looks like. And justice is important because that's how victims can heal, when they see that what happened to them was recognized and it was paid for, and so that's, that's important. It's also important, like I said, to be a deterrent to future abusers so they understand that there will be consequences if they act in the same way, and in that way, we're going to help other websites not act in the same way that Pornhub has. But it... Again, it's not enough to hold one company accountable, and don't forget, holding Pornhub accountable is also holding probably most of the world's most popular tube sites accountable because they're all owned by the same company. Um, but going forward, we need policy to make sure that this doesn't happen in the future, and that's why I am a strong advocate for age and consent verification policies, because the crux of the problem here was unfettered, un- uh, unmonitored uploading on user-generated sites, right? And so the solution is pretty simple. It's verifying the age, ID, and consent, documented consent, of every person in every video on every website that per terms of service allows user-generated porn. And this can be done at scale, so we have the technology to be able to do this at scale. It's very doable.

    3. CW

      How do you, how do you do it? What's, what's the technology do?

    4. LM

Yeah. So there's, there's n- numerous companies that do this. One of them that Pornhub is currently being forced to use is called Yoti, and what they do is a biometric scan coupled with verification of government-issued ID, uh, in order to verify the person in the video, and it includes a liveness scan. Um, and so-

    5. CW

      A what?

    6. LM

      ... there's... A liveness scan, so it's like-

    7. CW

      A, a liveness.

    8. LM

      Yeah. So you, you move when you're doing the, the-

    9. CW

      Oh.

    10. LM

      ... scan of your face.

    11. CW

      Okay. Yep, yep.

    12. LM

      So you can make sure that you're not just putting somebody's picture up there. So I mean, there's different ways that, that this can be done, but the technology is there, and they can do this quickly, efficiently. Now it costs money, right? So the one who's gonna pay for this is the porn companies who have to implement these third-party checks, and I think third party is so essential because I would never ever want anybody to give their ID to Pornhub. I mean, they're actually facing a class action lawsuit for the exploitation of user data. So-

    13. CW

      What have they done? What's the story behind the user data?

    14. LM

      Yeah. So apparently what was happening was that they were, uh, without consent, uh, obtaining and selling the user data of millions and millions of people who are visiting their sites to third parties without consent, and so they're facing a class action lawsuit for that.

    15. CW

      They are fully fucked, aren't they?

    16. LM

      (laughs)

    17. CW

Like they are so fucked, dude. Holy shit. Like how many different ways... I don't know. Maybe, maybe it's the case that we will look back on Pornhub and think that they were kind of the f- first through the door, Wild West, frontier-style porn company that just made all of the errors, right? It was... Look, th- this was before we had the, the, the, uh, Laila Mickelwait Act of fucking 2028 or whatever. You know what I mean? Um, it was before we had the correct barriers in place. Technology had enabled this kind of user-generated porn uploading, and it had done it at such a pace, and no one had any idea what was going on, and everyone was making money, and lots of people were enjoying free access to porn on the internet from mobile devices and their laptops, and then, and then... uh, we realize just how sort of rotten the core of this was, uh, and maybe we'll look back and go, "Wow, Pornhub and, uh, Aylo are a shining example of all of the different ways that you can get this stuff wrong online." But it is kind of impressive, it, it, it actually genuinely is impressive, to have one company that has accumulated... They're like the neutron star of making errors with this stuff. Like, how many... They're the, they're the LeBron James of getting... (laughs) You know? They're like the GOAT of fucking up. Um-

    18. LM

      But, I mean, it's funny that you say that though, because if they had just been left alone, I mean, they were so popular. I mean, people were wearing their apparel proudly in public.

    19. CW

      Culturally. I mean, it's, look, that's the power of brand, right? That's, that just shows... Obviously you need a product that backs it up. But if you are... They're the Apple of porn, right? They're the first mover advantage. You think, you think mobile phone, you think Apple, you think porn, you think Pornhub.

    20. LM

      Exactly. That's absolutely true. And h- the thing of it is this, is that I think a lot of people today still have no idea that this ever happened, that they're facing all of these consequences for the horrific actions that they have deliberately done. And, you know, we were talking about, like, the, you know, the Wild West of the internet and this and that, and, you know? The, the thing that really, I think is important for people to understand about Pornhub, based on all of the evidence that we've uncovered, like I said, is the knowing intentionality. It's that, you know, that these were decisions that were made where i- it's not like they were completely oblivious to the children that were being exploited on the site or the rape victims. I mean, you know, there's pages and pages of, in these recently released documents, uh, a- accidentally released, where there's just years and years where they had people filling out their contact form and saying, "Please take these videos down. I was unconscious in this video. I was raped in this video. I was a child in this video." Or, like, "I, this is my friend. She's 15 in that video. She doesn't know that this was uploaded." You know, "Take it down." And these were, this was for years that they knew about that. So-

    21. CW

      Is there a, uh, y- you know, if you knowingly distribute w- whatever, underage sexual material... Look at me trying to sound like I know in terms of l- legislation and stuff.

    22. LM

      No, you're doing-

    23. CW

I've heard this, I've heard this sentence before, right? Not at me. (laughs) Fucking hell. I, I've seen other people have this sentence lodged at them before. Um, if you knowingly share underage something, you get in trouble, right? Like, it's, you're really-

    24. LM

      Yeah.

    25. CW

      ... really fucked.

    26. LM

      Cr- it's actually a crime, yeah.

    27. CW

      Is there ... is there a particular different type of carve-out, or was there a particular different type of carve-out in the same way as whatever that article was that said, "We are not a, a curation site. We are a pipeline utility." That was a thing that all of the social media-

    28. LM

      Right. You're talking about Section 230. Yeah. So, in the US-

    29. CW

      Ah-ha. Yes.

    30. LM

      Yes. So-

  5. 49:28–56:22

    How Pornhub Tried To Discredit Traffickinghub

    2. CW

      Hmm. Have you ever got to sit down with the people behind PornHub?

    3. LM

      No.

    4. CW

      Have you ever been in a room face-to-face with them?

    5. LM

      I have not, no. No, and when this-

    6. CW

      How do you think that would... How do you, how do you imagine that would go?

    7. LM

Well, I imagine it would never happen because one of the things that they did when I started this campaign to hold Pornhub accountable, one of the things they did was engage in attacks, smears, like, they just tried to discredit the work that we were doing. They tried to discredit the Traffickinghub movement, um, whatever ways that they could, they were doing that. I mean, they've done... They've, they've... what we call, uh, dirty tricks. Like, they've engaged in dirty tricks to try to silence it. Instead of addressing it, they wanted to silence it because they knew it would be expensive if they actually made the changes that were necessary to stop the illegal content from being uploaded. So, no, they didn't wanna engage. They wanted to silence, and they've done some pretty horrible things, not only to me, but, you know, victims have faced some real hardships as well for speaking out.

    8. CW

      What has... Has there ever been direct PornHub response to your work? Have you... Ha- has there ever been... Have they in- interacted with that stuff directly?

    9. LM

      I mean, one of the things that they... Are you talking about, like, their responses in the media or-

    10. CW

      Everything. I mean, have they-

    11. LM

      Yeah.

    12. CW

      ... reached... Have you... Have they tailed you with private investigators? Have they tried to counter-sue for you accessing stuff?

    13. LM

Well, they've never been... They've never, uh, tried to counter-sue because here's the thing, if they were subject to legal discovery, I mean, they, they know that they're going to be in, in just hot water. And the problem with, you know, if they were to engage in a defamation lawsuit, right? From the very beginning, you know, they could have done that, but the problem is when you're telling the truth, I mean, that's the ultimate defense. And so, I mean, and absolutely 100% everything that I've been saying... and it's not just me, I mean, really, this has been a movement of so many people, hundreds of organizations, hundreds of survivors, attorneys and lawmakers and, uh, law enforcement and lawyers and businessmen and so many people coming together, um, that, no, they have not done that, but, uh, yes, I have faced, you know... And some of this is not... cannot be tied directly to the company. Some of it can and some of it has been. But yeah, there's been a lot of backlash from, you know, doxing, hacking, you know, online smear campaigns, media smear campaigns, uh, letters being sent to my house with my children's names, middle names, saying, "We're watching you. You're gonna get somebody killed." Um, you know, even things like getting reported for child sexual abuse material distribution myself. So they, you know, people who we know are directly tied to Pornhub, um, put in fake police reports about me to actually get me investigated, but it didn't go well for them because obviously, you know, when they looked, they didn't find anything. But besides that, they did, they heard a lot about Pornhub, and so the police that were investigating me ended up becoming allies and saying, "Hey, we're on the same page, we're on the same team, and how can we help you?" Um, so that didn't go well for them.

    14. CW

I have to imagine that the valuation arc of Pornhub looks like the saddest "could have sold at the top" investing opportunity of all time.

    15. LM

Well, we know some numbers now. So, it was a multi-billion dollar corporation. And just a few weeks ago, some information was released from court documents, uh, where we understand that the site was "sold," and I put sold in quotation marks 'cause it wasn't actually sold, it was just sold on paper, um, but it wasn't paid for, but the sale price was $400 million. So, it has lost a significant amount of value as a company, for sure.

    16. CW

Okay. The... I guess the question... Pornhub is the biggest, largest mole that needs to be whacked. It's a shot across the bow of people who are maybe going to do something similar. It hopefully will be a massive deterrent. Presumably, there needs to be some changes in tech and/or regulation to m- make this more scalable, so they're like-

    17. LM

      Yeah.

    18. CW

      ... scalable protection. I'm aware that what, you know, the purest approach would be, this is on the tube sites, they just need to be very strict with their moderation and so on and so forth, but we need to be realistic and kind of enable that, uh, enable moderation to be made as easy as possible from a tech side and then-

    19. LM

      Yeah.

    20. CW

      ... increase the level of deterrence from a regulation side.

    21. LM

      Yeah.

    22. CW

      It seems like kind of those are, are two important routes to go down. So what's the... What does the future look like with that?

    23. LM

Yeah. Yeah. So that is such an important, uh, question, and we have to think that way because, at the end of the day, we need to make the internet a safer place. And yes, we need to hold these porn sites accountable, and like you said, the justice and the deterrence, but how do we, at scale, help prevent this across the many different user-generated porn websites and other sites that, you know, may not be porn websites, but per terms of service, they allow and they distribute user-generated porn? Um, and that is mandatory third-party age and consent verification for every person in every video. But the scale solution, the at-scale solution, isn't just that governments implement this policy, because th- these are international corporations, right? Every website is pretty much operating in every place, every country in the world. So if we have that policy in the US, well, we have to implement it in Canada, and then we'll have to implement it everywhere, uh, which we should do. But I think that the at-scale solution is to have the financial institutions, so Visa, Mastercard, Discover, PayPal, say, "We don't do business with user-generated porn sites that don't verify the age and consent of every individual in every video." And just like they have anti-money laundering policy, they can have anti-online exploitation policy. And we know that these websites are highly motivated by credit card company demands. I mean, that's exactly why Pornhub took down 91% of the entire website, was because of the credit card companies. So we know the power that they have, and when they enact that policy, it's instant and it's global, and I think that's going to be the most effective way to get all of these websites into compliance to start verifying age and consent.

  6. 56:22–1:06:03

    The Dangers Of Underage Exposure To Pornsites

    2. CW

      Mm. Yeah, in Texas, where I am at the moment, uh, there's all manner of age verification stuff being debated right now. It seems like that's even in the- the news sort of at the moment. There's stuff bouncing back and forth. What is, uh, you know- I have to assume restricting access as well. Like, we haven't even talked about that. Like, we literally haven't talked yet about- And what about exposing porn to people who are underage? That's like an entire other world, too.

    3. LM

      That's a- the other world, and that's the debate right now in Texas that's going on. And there's this movement across the United States and in other countries, so we are seeing it in Europe, we're seeing, you know, in the UK and Canada, where countries are understanding, legislators and the population is understanding the harm that is being done to children through unfettered free access to these tube sites, to these porn sites, and just to porn online. And so yeah, there's age verification laws that have been enacted in, right now, in Texas. So they enacted mandatory age verification for users. So that's, you know, people who go to that site, they're going to have to verify they're an adult to get onto the site. And Pornhub obviously does not want this to happen, so they're shutting themselves down in c- in states across the US that are implementing age verification for users in protest of this policy.

    4. CW

      Mm.

    5. LM

And why? It's because it's expensive for them. They have to pay to get every user verified. But that's the cost of running a porn site and making sure that children aren't all over your site, um, both behind the screen and in front of the screen, because this is a form of secondhand sexual abuse for a child to have to access and witness what's happening on these sites. And like I said, so much of it is illegal, some of it illegal right on the homepages of these sites, and some of it is legal, but it's pretend-illegal. So like, there was a study published in the British Journal of Criminology in 2021, and they looked at 150,000, I think was the number, of videos on the homepages of the most popular porn tube sites, Xvideos, xHamster, Pornhub, these free porn sites, and they were analyzing what's showing up to just anybody who may accidentally or imp- intentionally land on the homepage. And they were finding that one in eight of the videos was depicting sexual violence. So, you know, some of this may be pretend, like, you know-

    6. CW

      Mm-hmm. Mm-hmm.

    7. LM

      ... what I said, there's no way to tell what's rough sex and what's rape. There's no way to tell who's 15 and who's 18. But so much of this was also the teen content that children are witnessing as their sex education from as young as eight, 10 years old. I mean, I get messages all the time from especially men who say, "I was addicted to the free porn tube sites when I was..." Or even, you know, even six years old, eight, 10, and they've been addicted ever since. And it's shaping their se- sexual template, right? This is where they're saying, what's normal? What is sex supposed to look like? What is it supposed to be like? And they're seeing so much of, you know, things that I wouldn't wo- wish on my worst enemy to have to witness, that I've seen on these sites.

    8. CW

I have to imagine that this is going to get even more complex as AI, uh, renderings of- of porn, whether it's, um, stuff that's so lifelike that you can't tell. Like, is it, is it ethical to put out non-consensual AI porn, because there hasn't been anybody's consent that has been crossed, but there is, like, an ethical essence of, well, this potentially increases real-world harm by, uh, changing people's expectations. Uh, and there's something about just the virtue of a person being represented in this way that kind of should be protected. Uh, the ability to take photos of people and then recreate videos that aren't them, but are like them. Do you own your own likeness when it comes to this sort of protection? So I mean, it is like the real front lines of- of this at the moment.

    9. LM

      Absolutely. And there was just a law that was passed in the US called the Take It Down Act, where now it, it's federally a crime to upload even AI-generated non-consensual content, so this would be like deepfakes, where people could have their face on, um, superimposed onto porn and it looks realistic and is being distributed. So, you know, that's, uh, illegal. Um, but also, you know, what, what's important too that parents might not even realize when it comes to AI-generated content like this is that now there's the ability for predators, abusers, anybody to take an image even of a child. So if you have an open social media account and you're posting pictures of your children, they could take that image of the child's face and s- and put that, you know, into an AI-generated child sexual abuse material video and make it look like it's actual abuse of that child. Um, and, and that's, that's actually happening. Um, now in certain countries, even the depiction of child sexual abuse is illegal. So in Australia, in the UK, in Canada, even if it is somebody over the age of 18 that is being used in a video and they look like they're a child, that's illegal. Um, in the US, you know, that actual... and we're not talking about AI right now, but the depiction of a child by a person that is over the age of 18 is not illegal. That was made legal in a, in a case called, um, Ash- Ashcroft versus the Free Speech Coalition, uh, unfortunately. Uh, so but in other countries that's illegal. But yeah, there's this whole frontier of what's going to happen with all of this AI-generated content, and I actually think that at least on websites that distribute user-generated content, that by having age and consent verification policies in place, you can actually prevent the, even the AI-generated content from being distributed.

    10. CW

      Because that would have to cross the same...

    11. LM

      Yeah. So if you have somebody's face-

    12. CW

      filter.

    13. LM

      ... superimposed on a deepfake, you're, you're- how are you gonna get the ID and actually-

    14. CW

      Mm-hmm. Mm-hmm.

    15. LM

      ... have a verification of that person to, uh, verify their government issued ID and consent? It would stop that from being uploaded, so, uh...

    16. CW

      That's an interesting, uh, single solution to multiple problems.

    17. LM

      Yeah.

    18. CW

W- how do you think that, uh, sort of user-monetized platforms, OnlyFans, AdmireMe.VIP, stuff like that, how do you think of that as contributing to the sort of ecosystem at the moment? There's a lot of moral panic around the normalization of regular people becoming sex workers online and, and, you know, all of the objectification that comes along with that. Uh, do you think about c- k- have you got concerns around that? How, how do you think about that sort of working into this world?

    19. LM

Yeah. Well, I know that we have seen some really concerning reports from the BBC, multiple, uh, reports from the BBC and from other news outlets that have been investigating the subscription sites, and so some of those are, like, the OnlyFans model, where they've actually, you know, had children and victims who have been abused in even that content, you know, behind the paywall on the subscription, uh, sites. And even victims that I have had come forward to me that have been abused on Pornhub or the tube sites, some of them have also been abused on OnlyFans. Uh, I think they've tightened up a lot of their regulation now. Again, the power of deterrence, right? Um, starting in 2020, when they saw what was happening with Pornhub, I definitely know that there was a change in policies in the way that they were checking who's, uh, in those videos. But it's, it's not by any means... it doesn't seem like it's perfect, and I think, uh, mo- minors are being abused on the subscription sites, for sure. I know that that's true. Um, and so, uh, I mean, it's just a real concern that a lot of this, again, is self-generated, where it's not that a child is out there getting raped and having an abuser post their content. It's become very normalized for children to be sharing nude images, not realizing the harm that that could do to them, the way that the internet is forever. And there was a study done by Thorn. So Thorn is a, a big child protection organization in the United States. They focus on CSAM online, child sexual abuse material, and they surveyed over 1,000 children, and they found that one in seven 9-to-12-year-olds said that they had shared a nude image or video of themselves-

    20. CW

      One i-

    21. LM

      ... with somebody else.

    22. CW

      One in seven?

    23. LM

One in seven 9-to-12-year-olds had shared a nude image or video of themselves with somebody else.

    24. CW

      Holy fuck. I am so glad that I had a Nokia 3410 when I was 14 years old.

    25. LM

      (laughs) Yeah.

    26. CW

      Like j- you know what I mean? I've, I've just...

    27. LM

      Yeah. I think the same thing. I do. I mean, it's so hard to be a child these days. I mean...

    28. CW

      Perilous minefield of bullshit. That's

  7. 1:06:03–1:12:51

    Keeping Children Safe Online By Using Aura

    1. CW

      a question. I had... uh, do you know who Jeffrey Katzenberg is? He's the guy-

    2. LM

      No.

    3. CW

      ... that founded Dreamworks with Steven Spielberg. He did Aladdin, he did The Lion King. He is now pushing this thing called Aura, A-U-R-A, and it is... I mean, it's... uh, to be honest, it's pretty mind-blowing what it can do. It's a, a security app, I guess, uh, but it allows parents... you just install it on your child's phone and it uses sentiment analysis to work out whether kids are, uh, accessing or messaging stuff that-... isn't good. If they receive adult images or send adult images or take photos of adult images, it sort of pings the parent immediately. So it doesn't restrict the use of the phone all that much-

    4. LM

      Mm-hmm.

    5. CW

      ... and it's not a overbearing level of supervision, but it keeps, it allows the parent to sort of be notified about what's going on. It can do stuff. It can even work out the mood of the kids-

    6. LM

      Wow.

    7. CW

      ... based on the geolocation of where they've been. So it'll say, "When you go to football for an hour, you type less hard, you hit the screen less aggressively." And less aggressive screen hitting has been associated with lower cortisol, which means that you're typically in a better mood. On the nights when your child doesn't use their phone for half an hour before they go to bed, they stay in bed, i.e. they don't use their phone for a bigger window, which we can, uh, correlate with better quality sleep. It is the most... It turns a phone-

    8. LM

      How?

    9. CW

      ... into kind of like a wearable, like a biometrically-

    10. LM

      Yeah.

    11. CW

      ... informed wearable device. The whole thing, I think pretty much everything's done locally, so it's not like they're sending this up to the cloud, security. So I just think, you know, these kinds of... I know there's, there's always this arms race of tech versus tech. Um, but-

    12. LM

      Mm-hmm. Yeah. Yeah.

    13. CW

      ... that was the first time that I sat down and, uh, uh, he had, uh, Hari, his, his, uh, co-founder at this company, and they just kept on telling me more and more of this, "Oh, yeah, we can work out, you know, how hard they hit the screen is a, an indication of their level of, uh, autonomic arousal and, and whether they're stressed." N- so there are some cool-

    14. LM

      Amazing. Yeah.

    15. CW

      Yeah. It's, it's sick.

    16. LM

      I mean, technology is, is... The capability of technology to help solve the problems that technology creates is amazing. Um, and I mean, there are even some, uh, apps and different programs now that for, for children's devices where they can prevent even the filming. So the camera itself could detect whether it's filming a nude image or video and actually stop it from ever being filmed in the first place if it's a child's phone.

    17. CW

      Hmm.

    18. LM

That, that kind of thing. I mean, that level of, of prevention at that level. Um, but yeah, I mean the, the ways that we can implement technology to help is, is amazing. But the harm that children face, the danger that children face online, and young people and, you know, even adults, right, is at inc- incredible levels. And I think, you know, the more that we can talk about it and at least have young people understand the consequences of distributing content online... And yeah, it's nude images and videos, but it's also things that we say online that we take so lightly, uh, sometimes, things we say or things we distribute online, not realizing the internet can be forever. And that maybe you'll regret saying that 10 years from now when, you know, your employer goes looking and there's a screenshot of you saying whatever. Um, but just to have more of a sense of, I don't know, uh, I guess a, a gravity about the, th- the way that things do get immortalized online and the consequence of that, especially when it's a nude image or video. 'Cause sometimes there's so many people that think, "You know, it's fun right now, and I'm going to do this," but they don't realize that maybe they won't want those nude images and videos online 10 years from now. Uh, but then, you know, it's forever. So informed consent, I think, is a really important thing as well-

    19. CW

      Mm-hmm.

    20. LM

      ... for people to not just consent, but understand exactly what they're consenting to when they're sharing those kinds of images online.

    21. CW

      There is a weird type of... It, it, I think in the history of ideas, it's called conceptual inertia. Uh, so you can imagine there is a time when it's proposed that maybe the entire solar system and universe doesn't orbit around the Earth. Perhaps the Earth orbits around the Sun, and this is a total heresy and we can't believe that this is the truth. And then, you know, evidence continues to come forward, and you could say somebody proposes a thing that most people aren't sure whether they agree with. And then slowly, maybe, uh, evidence or data or science catches up with this, and they go, "Okay, this person wasn't talking bullshit. This is actually legit. This is the way it is." But there is still this huge lag. And it even happened with that revolution, this huge lag for just most people to use the right language. And, you know, given that we're talking about the internet being around for two decades, uh, sort of widespread porn being around for one, one and a half, something like that, uh, and you think, "Okay, is it any surprise that cultural norms and expectations and understandings of behavior and the way that parents communicate with their kids, that these things are taking time to catch up?" And, you know, it's people like you that are, uh, applying a nitro boost, like turbocharge thing to, "Hey, these are all of the areas. These are all of the different bits of weakness and, and vectors where shit can go awry. And don't fall down that fissure over there, and we need to be worried about this thing." And, um, yeah, I mean, you're a, a trooper. You're a real hero for putting this stuff together, I think. You know, like god knows what would have happened if it hadn't been for you. And it certainly seems like the, the hashtag and the movement that you put behind this has definitely expedited this process. So, uh, yeah, I- I-

    22. LM

Well, thank you. I always, you know, wanna pass that on, because I know... Shout out to the survivors who have spoken up. Without their voices, none of this would've been possible — their bravery to speak up and to share their stories, their powerful, powerful stories.

    23. CW

      Hmm.

    24. LM

      And so many of them have done that at risk. It's hard to talk about your own exploitation, um, but they have done it because they don't want this to happen to others. And so I just, yeah, thank you for that, and I would love to pass on.

    25. CW

Well, they've got a, they've got a powerful ally in you. And, uh, my intention is to spend the rest of my life not being the subject of an investigation that you do. Uh, I do not want to be on the (laughs) on the other side.

    26. LM

      I'm sure you will not.

    27. CW

      No, that's-

    28. LM

      (laughs) I guarantee that.

    29. CW

      Yeah, yeah,

  8. 1:12:511:14:40

    Learn More About Laila

    1. CW

      yeah, yeah. Uh, look, tell people where they can check out your stuff online-

    2. LM

      Yeah.

    3. CW

      ... support you, do all of the things.

    4. LM

Of course, yeah. So many people are still signing the Traffickinghub Petition, and it is still a powerful awareness tool, and it, you know, it is a way that so many people are getting this message. So you can go to traffickinghubpetition.com and sign that, uh, and join others. You can also ... So I wrote a book about this story that was released last summer called Takedown: Inside the Fight to Shut Down Pornhub. And, um, you can buy that book, and all proceeds, 100% of author proceeds from the sale of the book, go to the cause, go to the Justice Defense Fund, an organization that I founded. And in the book, like, you will go through this journey with me. It's written ... A lot of people are calling it a true crime thriller, where, you know, it's first person present tense, and from that moment on February 1st at night when I tested the upload system, you go on the journey with me all the way through, and you will understand this issue not only in your head, but you'll understand it in your heart. You will experience it with me. And so hopefully you'll be inspired by that. So you can do that at takedownbook.com, and you can join what we created called Team Takedown. So this is a team of dedicated activists that are saying, "Look, yes, we're gonna take down Pornhub, but we're gonna work to take down illegal content across the internet to make the internet actually a safer place for our children for generations to come." So you can do that, and, uh, my organization is called the Justice Defense Fund, and you can go to justicedefensefund.org. So ...

    5. CW

      You are doing God's work. I appreciate you very much.

    6. LM

      Thank you.

    7. CW

      Thank you for today.

    8. LM

      I appreciate you too. Thank you. Thank you.

    9. CW

      (music) Congratulations. You made it to the end of the episode, and if you want more, well, why don't you press right here. Come on.

Episode duration: 1:14:40


Transcript of episode f-yM5ejNlz0
