Laila Mickelwait details her 2020 investigation exposing Pornhub's hosting of unverified uploads, including child sexual abuse material, citing a Florida case where a missing 15-year-old appeared in 58 videos. Her findings triggered hundreds of victim testimonies, 25 lawsuits, U.S. criminal charges, and the withdrawal of Visa, MasterCard, and Discover processing. Consequently, the site removed 91% of its content, dropping from 56 million to 5.2 million pieces, while the CEO and COO resigned amid the scandal. Mickelwait argues that age and consent verification, both for users accessing the site and for those appearing in user-generated videos, is essential to protect children, noting that major advertisers like Kraft Heinz and Unilever have already ceased advertising there. [Automatically generated summary]
So we are here at the ARC conference, and I've been talking to all sorts of political leaders and thinkers and philosophers about the civilizational struggle we're going through. You come at it from a slightly different place, really focusing on human trafficking and pornography, but that fits within this civilizational struggle.
So I guess first, how did you get involved in all that?
In the context of now 18 years in the fight against sex trafficking, about five years ago I was investigating the intersection between what I call the big porn industry and sex trafficking and child sexual abuse and was paying attention to the headlines and heard some really horrific stories of child abuse.
That was taking place on Pornhub.
It was being monetized and globally distributed, specifically in the case of a 15-year-old girl from Florida who was missing for an entire year. She was finally found when her distraught mother was tipped off by a Pornhub user who recognized her daughter on the site.
And she was found in 58 videos being raped for profit on Pornhub.
And so, in the context of my work, I asked a question.
How in the world was this happening?
And it wasn't just that victim.
It was numerous victims that we were hearing about.
And it was a question of how did this happen?
And I began to look at the big porn tube sites.
These are the YouTubes of porn.
I tested the upload system by recording the rug in my room and my keyboard, uploading the videos, and found out that they were not verifying age or consent.
So anybody anywhere in the world could upload a sex video to Pornhub in under 10 minutes.
So as you guys uncover some of this stuff, what is the response from Pornhub and the parent company? Putting aside what someone might feel about porn in general, which I would like to discuss with you, as it relates to child sexual abuse material, I don't know any sane person who thinks that is okay.
There was a lot of traction around calling them out for what was happening.
I was able to start a petition that started to go viral.
We got 2.3 million signatures from every country in the world.
And as this was happening, victims were coming forward on a regular basis.
And whistleblowers from inside the company were coming forward to reveal the inner workings of how this all happened.
The media started to pay attention.
Hundreds of articles began to be written about this.
Their response, instead of recognizing that there was a serious problem, was to try to deflect, to try crazy PR stunts to take the attention away from what was actually going on.
But we kept at it and happy to say that we've made a lot of progress in holding them accountable for the mass distribution of sexual crime.
So, you know, just to give you a sense for what's happened, this all began in 2020. In the last five years, hundreds of victims have come forward.
Now there's lawsuits on behalf of nearly 300 victims and 25 lawsuits suing Pornhub, including class actions on behalf of tens of thousands of child victims.
They've been criminally charged by the U.S. government.
They've lost all credit card processing.
Visa, MasterCard, and Discover cut them off.
They actually had to take down 91% of the entire website, in what the Financial Times called probably the biggest takedown of content in internet history.
So they had no idea whether this was a 16-year-old or an 18-year-old, whether it was rough sex or rape, whether it was consensually recorded and non-consensually uploaded.
And what we understood was the site was actually infested.
Right.
And so that is why they had to take down that much content.
They went from 56 million pieces of content in 2020 to 5.2 million today.
We need justice to be served in order to deter future abusers.
And that includes those who are actually uploading the content.
In this case, this is a free porn tube site.
The one actually profiting from it, and doing the most damage in this case with regard to distribution, is the site itself.
Because, you know, at the time in 2020, they had 170 million visits per day.
They had 56 billion visits that year.
They had so much content uploaded every year that it would take 169 years to watch if you put those videos back to back.
And so what victims say, it's one thing to be raped, right?
But then that's filmed, and then it's uploaded to a mainstream site on the surface web, like Pornhub, where millions of people had an opportunity to download that content (they had a download button on every single video) and to re-upload it again and again and again, so victims understood the worst moment of their life would live in perpetuity online forever.
So at the time, it could be impossible to actually locate the uploaders.
There were some who were uploading, and they were even verifying themselves as uploaders to monetize the content on Pornhub.
So I'll give you an example.
There was a man named Rocky Shea Franklin in Alabama.
And he had drugged and overpowered a 12-year-old boy.
And he filmed the assaults of this little boy.
And he uploaded 23 of those videos to Pornhub with titles that actually indicated that this was abuse.
Like, you know, Uncle Secret and Young Ass is Best.
I mean, titles that would clearly show that this was abuse.
And he verified himself, so he actually showed his ID, and they actually apprehended him, and he's in prison for 40 years for what he did.
But Pornhub globally distributed that content. Police actually went after Pornhub and demanded that they take those videos down, and they were ignored for seven months while that boy's rape videos were downloaded and re-uploaded and got hundreds of thousands of views.
They have no mainstream legitimate advertisers anymore.
Nobody will.
Even K.Y. Jelly will not advertise.
I mean, at one point they even had Kraft Heinz and Unilever advertising.
They were called out.
They stopped.
So they don't have legitimate mainstream advertisers anymore, thank goodness. That's another repercussion of what's happened: they lost mainstream business partners.
But the business model relies on selling 4.6 billion ad impressions every single day on Pornhub, and that's why they need that traffic.
But I think these controls are important because children have free access to the site too, right?
Where their sex education, as young as 8, 9, 10, 11 years old, is not just violent, hardcore, legal pornography, but actual crime scenes that they're witnessing from such a young age, shaping their sexual template for life, which is really harmful.
So the solution at scale for this issue is age and consent verification.
So we need age verification for those who are accessing the site, just like you said Florida has done, Texas has done.
Currently, the Supreme Court is hearing arguments and going to decide whether it's okay for states to implement age verification to protect children.
But we also need age and consent verification for those who are in those user-generated porn videos as well.
And when that happens, then you're going to see this prevented on these sites.
What do you think of the argument when the Florida thing happened?
I read an article making what I thought was an interesting philosophical point: that the verification should be done at the user end, by the parents on the device, rather than putting it on the site, because the site can only control that site.
But if the parents were fully involved...
And kids are smart and they can always get around things.
I fully accept that.
But if the parents were fully involved in paying attention to what's on the kid's phone or what access was there, that's probably really the only way to deal with it.
When we think about safety, like when we're driving a car, right?
We have a seatbelt.
We have a roll bar.
We have airbags.
We have many different ways that we try to protect ourselves in that situation.
I think the same thing could apply to protecting kids from porn exposure.
I mean, that's one line of defense, but it doesn't work all the time.
And I mean, there are a lot of children who don't have attentive parents able to watch them 24/7, or who could even take the time to understand the device-level technology.
So I think we need that, but we also need age verification at the site level, third-party age verification, because I would never want anybody to hand over an ID to a porn site like Pornhub.
So I think that fear is based on a misconception about how this could actually happen and what technology we have available to safely, with respect to privacy, be able to verify age on a site like that.
So we have third-party companies.
And one of them is called Yoti.
And in one second, they can do a biometric facial scan that doesn't even store the actual photo of a person, but uses pixels and numbers.
And that technology is trained on thousands, if not millions, of other faces.
And they can determine in one second, with over 99% accuracy, the age of a person that would be using the site.
They can store that as a token, and that token could be used on multiple adult sites, so there's the ability to age-verify like that.
And it's not a huge inconvenience for an adult to go through that in order to protect countless children from exposure to this kind of content.
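The token flow described here, where a third party attests to age once and adult sites check the attestation without ever seeing an ID or photo, could be sketched roughly like this. All names (`issue_token`, `site_accepts`) and the signing scheme are illustrative assumptions, not Yoti's actual API; a real deployment would use asymmetric signatures so sites can verify tokens without holding the issuer's secret key.

```python
# Minimal sketch of a privacy-preserving age-verification token flow.
# Hypothetical names throughout; HMAC stands in for a real signature scheme.
import base64
import hashlib
import hmac
import json
import secrets
import time

ISSUER_KEY = secrets.token_bytes(32)  # held by the third-party verifier


def issue_token(estimated_age: int) -> str:
    """Issued after the (hypothetical) facial age-estimation step.

    The token carries only an over-18 claim and a timestamp,
    never the photo or the person's identity.
    """
    claim = {"over_18": estimated_age >= 18, "issued_at": int(time.time())}
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return base64.b64encode(payload).decode() + "." + sig


def site_accepts(token: str) -> bool:
    """Run by an adult site: check the issuer's signature, then the claim.

    The same token can be presented to multiple sites.
    """
    payload_b64, sig = token.rsplit(".", 1)
    payload = base64.b64decode(payload_b64)
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # tampered or forged token
    return json.loads(payload)["over_18"]
```

The design point is that the site only learns a signed boolean, which matches the "pixels and numbers, not the photo" framing above.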
So in your vision of a future that would, let's say, make sense through the lens that you're talking about this, you'd have the processes like you just described on both ends, meaning the user end and the corporate end or whatever you want to call that.
And then basically, you're okay with pornography existing.
Yeah, I mean, look, when it's legal and it's consensual, really, truly consensual, for me, that's none of my business.
My work is to protect children from access and to protect victims and children from being exploited in that content.
In the United States, pornography, if it's done that way, is legal, right?
And it's constitutionally protected.
So, for me, that's none of my business.
To each his own in that regard.
But we need to draw a firm red line, ensuring the best safeguards we possibly can so that children are not exposed to this content.
Is the biggest hurdle you have to face that nobody wants to talk about this because it's kind of embarrassing and weird and sex is uncomfortable to talk about and everything else?
I feel like there's a movement now that it's not just in the United States, but we've seen it in Canada and the UK and I've seen it across Europe for protecting, specifically protecting kids from accessing this content.
Unfortunately, I haven't seen the same passion for implementing the controls we need for those who are in the videos.
I don't think people quite understand the dangers of user-generated pornographic content where these sites are getting infested with illegal content.
But I think we'll get there.
But at least it's a start.
I see huge momentum.
17 states in the United States have implemented age verification for users.
We're seeing that push in the UK. We're seeing it in Germany, in France, in Canada.
So I think there's this kind of global movement to understand the harms and try to protect against them.