Nov. 29, 2017 - Freedomain Radio - Stefan Molyneux
11:53
3913 YouTube Child Sex Scandal: Aftermath

After months and months of working extensively to demonetize videos controversial to advertisers, YouTube finds itself in the middle of an absolutely revolting child sex scandal. Since the scandal broke, YouTube has "terminated more than 270 accounts and removed over 150,000 videos" and the company has "turned off comments on over 625,000 videos targeted by child predators."

Archive: https://web.archive.org/save/https://news.vice.com/story/youtube-kills-ads-on-50000-channels-as-advertisers-flee-over-disturbing-child-content

Your support is essential to Freedomain Radio, which is 100% funded by viewers like you. Please support the show by making a one-time donation or signing up for a monthly recurring donation at: http://www.freedomainradio.com/donate


So, a quick update here on a video I did a couple of days ago.
I pointed out, using screenshots of people's typing on Twitter, that if you went to the YouTube search bar, not logged in, no history, and you typed in "how to ha," it auto-filled at the top, or near the top, with "how to have sex with your children," along with a bunch of other creepy stuff. And I've been sent a bunch of other search terms that produce hideous, creepy results, and this was symptomatic of the large amount of quasi-pedophiliac content, or pedophile-orbited channels, that were available on YouTube.
And I also got many communications from people who had said that years and years and years they had spent trying to get YouTube to pay attention to this, to deal with this.
And I did talk about in the video that I had some sympathy for the difficulty of automating this stuff anyway.
So it turns out that within a day or two of this happening, they fixed it overnight. They got rid of that search result, and I guess they've taken some action behind the scenes.
So the title here is, YouTube kills ads on 50,000 channels as advertisers flee over disturbing child content.
Oh man. So that's not 50,000 videos, that's 50,000 channels, and this is just from November 27th of 2017.
The article says, for the second time in less than a year, major advertisers are fleeing YouTube after finding their ads were paired with offensive content, this time directed at children.
And the number of disturbing videos targeted at child audiences is much larger than previously known, YouTube's response reveals.
Over the past week, YouTube says it has, quote, terminated more than 270 accounts and removed over 150,000 videos from our platform in the last week.
The company also, quote, turned off comments on over 625,000 videos targeted by child predators.
To continue, YouTube says, quote, Finally, over the past week, we removed ads from nearly 2 million videos and over 50,000 channels masquerading as family-friendly content, the Google-owned company said in a statement to Vice News.
Quote, Content that endangers children is abhorrent and unacceptable to us.
Now, that's a little baffling to me.
A little baffling to me.
So, YouTube has shown that it is more than willing to demonetize political content.
Mostly on the right, some of it on the left as well.
So, they're fine with policing stuff.
They're fine with stuff that is legally protected free speech.
They're not just age-restricting it.
They're demonetizing it for fear of advertiser blowback.
But through this mechanism, my sympathy for how difficult it might be to find and police these kinds of videos has kind of evaporated in that within a couple of days, they were able to terminate comments on millions of videos, monetization on millions of videos, 50,000 channels. Boom!
Within, what, a day or two?
So that means that this was something that they were capable of doing.
I should mention when I first got on YouTube. I was, like, guy number seven on YouTube, back in 2006.
Oh, smooth old 240p.
I'm getting older and the resolution is better, not a great combo.
But if they have automated tools that are able to identify, flag, and remove all of this stuff within a day or two, it's pretty hard for me to avoid looking back through the tunnel of time and saying, why now?
Why now? The article goes on to say: Adidas, Mars, Hewlett-Packard, and a host of other big brands have all paused advertising on YouTube in the wake of reports revealing their ads were showing up alongside sexually explicit comments under videos of children.
The tools used to screen such comments, volunteer moderators told the BBC, haven't been working properly for over a year, allowing between 50,000 and 100,000, quote, predatory, end quote, accounts to remain on YouTube.
So what do they mean by that? So they say that there's a glitch in YouTube's tool for tracking obscene comments.
This is from the BBC. Part of YouTube's system for reporting sexualized comments left on children's videos has not been functioning correctly for more than a year, say volunteer moderators.
So users can use this online form to report potentially predatory accounts.
They are then asked to include links to relevant videos and comments.
The reports then go to moderators, YouTube employees who review the material and have the power to delete it.
However, sources told Trending that after members of the public submitted information on the form, the associated links might be missing from the reports.
I am not running YouTube, but I can tell you, that seems pretty important.
The article says YouTube employees could see that a particular account had been reported but had no way of knowing which specific comments were being flagged.
With the help of a small group of trusted flaggers, Trending identified 28 comments directed at children that were clearly against the site's guidelines.
The comments? Shocking.
Some of them are extremely sexually explicit.
Others include the phone numbers of adults and requests for videos to fulfill sexual fetishes of children.
They were left on YouTube videos posted by young children and they are exactly the kind of material that should be immediately removed under YouTube's own rules and in many cases reported to the authorities.
No kidding. The children in the videos appeared to be younger than 13 years old, the minimum age for registering an account on YouTube.
The videos themselves did not have sexual themes but showed children emulating their Favourite YouTube stars by, for instance, reviewing toys or showing their outfit of the day.
The explicit comments on these videos were passed on to the company using its form to report child endangerment, the same form that is available to general users.
Over a period of several weeks, five of the comments were deleted, but no action was taken against the remaining 23 until Trending contacted the company and provided a full list.
All of the predatory accounts were then deleted within 24 hours.
So here we have a situation where horrible, horrifying, nasty, vicious, and I would argue possibly illegal content is posted under children's videos.
The tool is used to report this to YouTube, and the majority of the comments stay up until there is a threat of exposure.
So to return to the original article, the reports on the explicit comments come amid a slew of related stories concerning kids and offensive YouTube content.
Those revelations include how highly trafficked YouTube channels have been tricking kids into watching disturbing videos and how YouTube search results have been auto-populating with pedophiliac queries.
Again, such as how to have sex with your kids.
If all this sounds familiar, that's because it was only this past March when blue-chip advertisers suspended their ad buys on YouTube after journalists found their ads were running next to hate speech and other offensive material.
Offensive material. See, now this is material aimed at adults with challenging arguments or ugly epithets or whatever.
So challenging material. Apparently, it's much more challenging for adults to be exposed to arguments they don't like than for children to be exposed to this stuff.
See, you're a kid, you're posting videos and you probably get auto-notified of the comments and you go and see the comments and it is like a raw cheese grater of human sexual sickness right there going up and down on your face.
So within a few months, the article goes on, within a few months and after some major promises from YouTube, a number of big brands returned to the platform.
In August, YouTube unveiled a revamped policy to better suppress or in some cases censor hate speech content.
Hate speech still. And, of course, in America, hate speech is protected under the First Amendment.
So, the article goes on.
On November 22nd, after the first stories about disturbing content aimed at children started dropping, the web video network announced a new set of policies to deal with content on YouTube that attempts to pass as family-friendly, but is clearly not.
So, this is videos that masquerade as children's characters and so on, but end up with really disturbing, horrible situations.
And this is parents who are cheap and lazy.
Sorry to say it. It's just the reality.
Number one, you're cheap, because if you buy a video service, like, I don't know, Netflix or Amazon Prime or a bunch of other stuff, you're not going to get that same freaky, disturbing content.
You're going to get children's shows that are pre-vetted, that come from major studios.
They may have some leftist programming, but they won't have your child's favorite cartoon character having their teeth ripped out with pliers while screaming and struggling.
So just pay for something.
Get something official and you'll be able to bypass this.
And just do not... have YouTube as your babysitter.
Do not have YouTube as your babysitter.
Do not have a tablet as your babysitter.
Sit down and do stuff with your kids.
So the article goes on to say Toy Freaks, for example, was until this month one of the 100 most viewed YouTube channels with more than 8.5 million subscribers to its videos, some of which featured strange content that included children in pain and throwing up.
YouTube terminated the Toy Freaks channel in mid-November after a Medium essay highlighting its content went viral.
Come on. Top 100?
I can understand there's some obscure video with four views that a guy uploaded once. A little tough to find, but come on.
Come on. Top 100 most viewed channels?
8.5 million subscribers?
That is not something that goes unnoticed. And so it's horrifying to see these kinds of priorities, that they went after political speech before they went after pedophiles.
They went after political speech: challenging arguments, ugly sentiments, directed at adults, made by adults. They went after that before they went after pedophiliac content.
That is a horrifying set of priorities.
And the fact that they were able to scrub this stuff so quickly, and the fact that they were willing earlier to scrub stuff that was largely political in nature, tells me a lot about the supposed technological challenges of identifying and getting rid of this stuff.
I think that the company needs to be open about what went wrong.
It needs to be open about the decision matrix, the decision flowchart that ended up with this kind of horrifying situation.
Who knows the number of children out there who were traumatized, who may have been lured in by comments, who may have called pedophiles, who may have ended up exposed directly to child predators?
Who knows? Because the kind of scrubbing that happened over the last day or two did not seem to have happened over the last 10 years or so.
So, I mean, if you're a shareholder, if you're, like... for heaven's sakes, this information needs to come out.
Because we've got a situation where...
God forbid, on your YouTube video, you use a 10-second clip from a news station.
You might get dinged for copyright.
Boy, what if there's a song playing in the background while you're doing a video?
Boom! They're all over you.
So... It's just astonishing and horrifying.
And the information needs to be known.
It needs to come out. How on earth did this happen?
And what steps are in place to prevent it from happening again?
But until then, keep your kids off YouTube, please.