All right.
So we are here at the ARC conference, and I've been talking to all sorts of political leaders, thinkers, and philosophers about the civilizational struggle that we're going through. You come at it from a slightly different place, really focusing on human trafficking and pornography, but that fits within this civilizational struggle. So I guess first, how did you get involved in all that?
In the context of now 18 years in the fight against sex trafficking, about five years ago I was investigating the intersection between what I call the big porn industry and sex trafficking and child sexual abuse. I was paying attention to the headlines and heard some really horrific stories of child abuse that was taking place on Pornhub, being monetized and globally distributed. Specifically, there was the case of a 15-year-old girl from Florida who was missing for an entire year. She was finally found when her distraught mother was tipped off by a Pornhub user who recognized her daughter on the site. And she was found in 58 videos being raped for profit on Pornhub.
And so, in the context of my work, I asked a question: how in the world was this happening? And it wasn't just that victim. It was numerous victims that we were hearing about. And it was a question of how this happened.
And I began to look at the big porn tube sites. These are the YouTubes of porn. I tested the upload system by recording the rug in my room and the keyboard, uploaded it, and found out that they were not verifying age or consent. So anybody, anywhere in the world, could upload a sex video to Pornhub in under 10 minutes. Only an email address was required.

So literally no controls. So a 15-year-old kid themselves could be uploading this.
Absolutely. And they were.
Now, I assume that's against whatever their terms of service might be, but that doesn't stop anyone from doing it.

It was just for show, right? It was the fine print: terms of service, check a box here. But it was a free-for-all, and that's how the site actually became infested with videos of real sexual crime.
And I felt like we needed to sound the alarm on this, because it wasn't just Pornhub; it was also its sister sites, owned by the same parent company. Some of the most popular porn tube sites in the world are owned by this company: RedTube, GayTube, YouPorn, XTube, ExtremeTube, on and on. And these sites are infested with real sexual crime. And so we had to really sound the alarm on what was going on.
So as you guys uncovered some of this stuff, what was the response from Pornhub and the parent company? Putting aside what someone might feel about porn in general, which I would like to discuss with you, as it relates to child porn, I mean, I don't know any sane person who thinks that is okay.
Right. Well, they were kind of deflecting. So as all this was coming to light, there was a lot of traction around calling them out for what was happening.
I was able to start a petition that went viral. We got 2.3 million signatures from every country in the world. And as this was happening, victims were coming forward on a regular basis, and whistleblowers from inside the company were coming forward to reveal the inner workings of how this all happened. The media started to pay attention; hundreds of articles began to be written about this. Their response, instead of recognizing that there was a serious problem, was to try to deflect, to do crazy PR stunts, to try to take the attention away from what was actually going on. But we kept at it, and I'm happy to say that we've made a lot of progress in holding them accountable for the mass distribution of sexual crime.
So, you know, just to give you a sense of what's happened: this all began in 2020. In the last five years, hundreds of victims have come forward. Now there are lawsuits on behalf of nearly 300 victims, and 25 lawsuits suing Pornhub, including class actions on behalf of tens of thousands of child victims. They've been criminally charged by the U.S. government. They've lost all credit card processing; Visa, MasterCard, and Discover cut them off. They actually had to take down 91% of the entire website, in what the Financial Times called probably the biggest takedown of content in internet history.
Wow. So 91% of the content was underage?
It was unverified. So they had no idea whether it was a 16-year-old or an 18-year-old, whether it was rough sex or rape, whether it was consensually recorded and non-consensually uploaded. And what we understood was that the site was actually infested.

Right.

And so that is why they had to take down that much content. They went from 56 million pieces of content in 2020 to 5.2 million today.
Wow.

And we're not done yet, because there's still unverified content on the site.
So that's going after the distributor, in effect. But what about the people? Now, I assume some of it is just, in essence, kids uploading it themselves, and then there are obviously some legal issues there; I don't know if you go after parents or something like that. And then what about the people who are actually involved in trafficking?
Yes. So we believe in accountability. We need justice to be served in order to deter future abusers, and that includes those who are actually uploading the content. But in this case, this is a free porn tube site, and the one actually profiting from it and doing the most damage with regard to distribution is the site itself. Because, you know, at the time, in 2020, they had 170 million visits per day. They had 56 billion visits that year. They had enough content uploaded every year that it would take 169 years to watch if you put those videos back to back.
And so what victims say is: it's one thing to be raped, right? But then that's filmed, and then it's uploaded to a mainstream site on the surface web, like Pornhub, where millions of people had an opportunity to download that content (they had a download button on every single video) and to re-upload it again and again and again, so that victims understood the worst moment of their life would live online in perpetuity.
So, this is going to sound naive or something, but who are they being raped by? Like, these are traffickers who are getting basically young kids, in essence. Don't they think they're going to be caught? I mean...
Well, they could anonymously upload. They could even use a VPN to upload, so at the time it could be impossible to actually locate them. There were some who were uploading who even verified themselves as uploaders to monetize the content on Pornhub.
So I'll give you an example. There was a man named Rocky Shay Franklin in Alabama. He had drugged and overpowered a 12-year-old boy, and he filmed the assaults of this little boy. And he uploaded 23 of those videos to Pornhub with titles that actually indicated that this was abuse, like, you know, "Uncle Secret" and "Young Ass is Best." I mean, titles that would clearly show that this was abuse. And he verified himself, so he actually showed his ID, and they apprehended him, and he's in prison for 40 years for what he did. But Pornhub globally distributed it. Police actually went after Pornhub and demanded that they take those videos down, and they were ignored for seven months while that boy's rape videos were downloaded and re-uploaded and got hundreds of thousands of views.
So how is it that Pornhub is still up at this point, if they've done so much clearly illegal stuff? I mean, it doesn't sound like they were hiding it.
Well, the wheels of justice turn slowly. But they are turning. Thankfully, 91% of the site has been taken down, and they've faced serious repercussions. I mean, the CEO and the COO were forced to resign. The secret majority shareholder was exposed. They're personally being sued; that little boy I just mentioned is now 18, and he's suing the company along with many other victims. And the company was actually sold as a distressed asset because of the repercussions of what's happened. But we're not done yet, because we want to see justice fully served. And what that means is, you know, justice to the full extent of the law, both criminally and civilly. And that's so important to be a deterrent to future abusers.
Right.

Because at the end of the day, for these corporate traffickers, as I call them, it's a risk-benefit calculation. And we have to increase the risk and then eliminate the profitability. And when we do that, we're going to see a transformation of this industry that makes the internet a safer place.
Do you see any of the controls that they're putting in as working? I'm in Florida, where now, I think, you have to have an ID to go on some of these sites. That at least provides some protection, I suppose, on the user end. That's not doing anything on the end that you're talking about, but I assume you find some value in some of that kind of stuff.
Absolutely. I mean, children are being exploited in front of the screen, and they're being exploited behind the screen. And victims are being exploited on both sides for profit, because the business model of free user-generated porn is to sell ad impressions. So it's all about traffic. What they care about most is content and traffic.
Have you guys gone after the advertisers too?

They have no mainstream, legitimate advertisers anymore. Nobody will advertise; even K-Y Jelly will not. I mean, at one point they had Kraft Heinz and Unilever advertising. They were called out; they stopped. So they don't have legitimate mainstream advertisers anymore, thank goodness. That's another repercussion of what's happened: they lost mainstream business partners. But the business model relies on selling 4.6 billion ad impressions every single day on Pornhub, and that's why they need that traffic.
But I think these controls are important, because children have free access to the site too, right? Their sex education, as young as 8, 9, 10, 11 years old, is not just violent, hardcore, legal pornography; it's actually crime scenes that they're witnessing from such a young age, shaping their sexual template for life, which is really harmful. So the solution at scale for this issue is age and consent verification. We need age verification for those who are accessing the site, just like you said Florida has done, and Texas has done. Currently, the Supreme Court is hearing a case and going to decide whether it's okay for states to implement age verification to protect children. But we also need age and consent verification for those who are in those user-generated porn videos as well. And when that happens, then you're going to see this prevented on these sites.
What do you think of the argument, when the Florida thing happened? I read an article making what I thought was an interesting philosophical point: that the verification should be done at the user end, meaning it should be done by the parents on the device, rather than putting it on the site, because the site can only control that site. Kids are smart and can always get around things; I fully accept that. But if the parents were fully involved in paying attention to what's on the kid's phone, or what access was there, that probably is really the only way to deal with it.
I don't agree that that's the only way. I say both/and. When we think about safety, like when we're driving a car, right? We have a seatbelt. We have a roll bar. We have airbags. We have many different ways that we try to protect ourselves in that situation. I think the same thing applies to protecting kids from porn exposure. Parental controls are one line of defense, but they don't work all the time. And there are a lot of children who don't have attentive parents who are able to watch them 24/7, or who could even take the time to understand the technology on the device. So I think we need that, but we also need age verification at the site level, third-party age verification, because I would never want anybody to hand over an ID to a porn site like Pornhub.
Right. So the article that I read was basically arguing that: okay, congratulations, you've just handed them your ID.

Well, we have third parties. So I think that fear is based on a misconception about how this could actually happen and what technology we have available to verify age on a site like that while respecting privacy. We have third-party companies, and one of them is called Yoti. In one second, they can do a biometric facial scan that doesn't even store the actual photo of a person, but uses pixels and numbers. That technology is trained on thousands, if not millions, of other faces, and it can determine in one second, with over 99% accuracy, the age of a person who would be using the site. They can store that as a token, and that token can be used on multiple adult sites, giving the ability to age-verify. And it's not a huge inconvenience for an adult to go through that in order to protect countless children from exposure to this kind of content.
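[Editor's note: the token flow described above can be sketched roughly as follows. This is a minimal, hypothetical illustration, not Yoti's actual API: a third-party verifier estimates age from a facial scan, then issues a signed token asserting only an over/under-18 result, which a participating site can check without ever seeing an ID or a photo. All function and key names here are invented for the sketch.]

```python
# Hypothetical sketch of a privacy-preserving age-token flow.
# Names are illustrative; real providers (e.g. Yoti) have their own APIs.
import base64
import hashlib
import hmac
import json
import time

SECRET = b"verifier-signing-key"  # held by the third-party verifier

def issue_age_token(is_over_18: bool) -> str:
    """Verifier side: after a facial age estimate, issue a signed token
    asserting only an over/under-18 result -- no photo, no identity."""
    claim = {"over_18": is_over_18, "issued": int(time.time())}
    payload = base64.urlsafe_b64encode(json.dumps(claim).encode())
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return payload.decode() + "." + sig

def site_accepts(token: str) -> bool:
    """Site side: trust the verifier's signature; never handle an ID.
    The same token can be presented to any participating site."""
    try:
        payload, sig = token.split(".")
        expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(sig, expected):
            return False  # forged or corrupted token
        claim = json.loads(base64.urlsafe_b64decode(payload))
        return claim["over_18"] is True
    except (ValueError, KeyError):
        return False  # malformed token
```

In a real deployment the site would hold only the verifier's public key (an asymmetric signature scheme), so sites could check tokens but never mint them; a shared HMAC key is used here only to keep the sketch short.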
So, in your vision of a future that would make sense through the lens you're describing, you'd have processes like you just described on both ends, meaning the user end and the corporate end, or whatever you want to call that. And then, basically, you're okay with pornography existing?

Yeah, I mean, look, when it's legal and it's consensual, really, truly consensual, for me, that's none of my business. My work is to protect children from access, and to protect victims and children from being exploited in that content. In the United States, pornography, if it's done that way, is legal, right? And it's constitutionally protected. So, for me, that's none of my business; to each his own in that regard. But we need to, for sure, draw the red line, ensuring the best safeguards we possibly can so that children are not being exposed to this content.
Is the biggest hurdle you have to face that nobody wants to talk about this, because it's kind of embarrassing and weird, and sex is uncomfortable to talk about, and everything else?

It's one of them. I think that does allow this to continue, kind of hidden in the shadows. I mean, that's why Pornhub, which was purchased for $2,750 and launched online in 2007, was by 2020 the fifth most visited website in the world. And this entire time, victims were being exploited for profit. It was hidden in plain sight. And it wasn't until 2020 that this began to get attention and traction, so that we could actually hold those responsible to account and help stop this in the future. And I think it's partly because of what you just said, that people are scared to talk about it.
I mean, the interesting thing is that it may feel like it's this quiet corner of the internet, but it's such a huge swath of it, right? And most people have been exposed to it, or are consumers of it, but are afraid to talk about it.

A lot of people even struggle with compulsive porn use that they don't want to talk about, because they're embarrassed or there's shame involved with that as well. And that's a huge problem, especially for young people who are growing up on exposure to pornography.

Yeah, I think it's a huge issue in society, but it's definitely not talked about enough. And we need to kind of break those barriers and just be able to discuss this, because it's really affecting so many lives.
Let me ask you one other thing, and then we'll finish this up on the main stage, so I don't want to ask you everything right now. Do you see a difference in attitudes, as it relates to some of these controls, between America and Europe?

Interesting. I mean, there's a movement. I feel like there's a movement now, not just in the United States; we've seen it in Canada and the UK, and I've seen it across Europe, specifically for protecting kids from accessing this content. Unfortunately, I haven't seen the same passion for implementing the controls we need for those who are in the videos. I don't think people quite understand the dangers of user-generated pornographic content, where these sites are getting infested with illegal content. But I think we'll get there, and at least it's a start. Seventeen states in the United States have implemented age verification for users. We're seeing that push in the UK; we're seeing it in Germany, in France, in Canada. So I think there's this kind of global movement to understand the harms and try to protect against them.