Behind the Bastards - How YouTube Became a Perpetual Nazi Machine Aired: 2019-06-20 Duration: 01:22:21 === Mountain Dew Hitler Vote (10:41) === [00:00:00] This is an iHeart podcast. [00:00:02] Guaranteed human. [00:00:04] When a group of women discover they've all dated the same prolific con artist, they take matters into their own hands. [00:00:13] I vowed I will be his last target. [00:00:15] He is not going to get away with this. [00:00:17] He's going to get what he deserves. [00:00:19] We always say that. [00:00:21] Trust your girlfriends. [00:00:24] Listen to The Girlfriends. [00:00:25] Trust me, babe. [00:00:26] On the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:00:31] I got you. [00:00:32] I got you. [00:00:36] 10-10 shots fired. [00:00:38] City Hall building. [00:00:39] How could this ever happen in City Hall? [00:00:41] Somebody tell me that. [00:00:43] A shocking public murder. [00:00:44] This is one of the most dramatic events that really ever happened in New York City politics. [00:00:51] They screamed, get down, get down. [00:00:53] Those are shots. [00:00:54] A tragedy that's now forgotten. [00:00:57] And a mystery that may or may not have been political. [00:00:59] That may have been about sex. [00:01:01] Listen to Rorschach, Murder at City Hall on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:01:11] I'm Laurie Segall, and this is Mostly Human, a tech podcast through a human lens. [00:01:15] This week, an interview with OpenAI CEO Sam Altman. [00:01:19] I think society is going to decide that creators of AI products bear a tremendous amount of responsibility for the products we put out in the world. [00:01:26] An in-depth conversation with a man who's shaping our future. [00:01:29] My highest order bit is to not destroy the world with AI. [00:01:32] Listen to Mostly Human on the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.
[00:01:41] Hey, it's Norah Jones, and my podcast, Playing Along, is back with more of my favorite musicians. [00:01:46] Check out my newest episode with Josh Groban. [00:01:49] You related to the Phantom at that point. [00:01:52] Yeah, I was definitely the Phantom in that. [00:01:54] That's so funny. [00:01:56] Share each day with me each night, each morning. [00:02:04] Listen to Norah Jones' Playing Along on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:02:15] What? [00:02:16] Severing my tumors? [00:02:18] I'm Robert Evans, host of Behind the Bastards, the podcast where we tell you everything you don't know about the very worst people in all of history. [00:02:25] Here with my guest, Sophia, co-host of Private Parts Unknown. [00:02:29] And we're talking about how it's bullshit when doctors won't let you keep the pieces of your body that they take out of you. [00:02:34] That's really frustrating. [00:02:35] Yeah, that's like the least they could do for you. [00:02:37] It's an infringement of your civil liberties. [00:02:39] Like that tumor or whatever is still a piece of you, and you deserve to like go get drunk on a farm and shoot it with a shotgun if that is your choice. [00:02:47] That sounds awesome. [00:02:48] Yeah, I wanted to just keep it forever to kind of always like point to it and be like, yeah, I beat you, you little bitch. [00:02:55] And they won't let me fucking do it. [00:02:57] They wouldn't let me keep my breast cancer tumor and my chemo port, which I'm like, that was part of me for a year. [00:03:05] You, why? [00:03:06] That's so frustrating. [00:03:08] Like, okay, this message is going out to Sophia's doctor. [00:03:12] Kudos on the cancer removal. [00:03:14] Thanks so much for the curing. [00:03:16] Blah, blah, blah. [00:03:17] Dick move not letting her keep her tumor. [00:03:19] And I'm very angry about this. [00:03:23] Please write a campaign, listeners. [00:03:25] Just contact my doctor at...
[00:03:27] No, it's just, make Sophia's tumor her legal possession again. [00:03:34] It's going to be hard to acronym. [00:03:36] Sophia's tumor, Sophia's again. [00:03:38] Yeah, there we go. [00:03:40] Well, today's subject has nothing to do with tumors. [00:03:43] Aren't you like cancer? [00:03:45] Other than that, you could argue today's subject is a cancerous tumor metastasizing in the body politic of our nation. [00:03:54] Bam! [00:03:55] Wow. [00:03:57] We're talking about YouTube. [00:03:58] That was a beautiful metaphor for a website that most people just use for jerking off. [00:04:05] Hey, jerking off and not paying for music. [00:04:08] Oh, that's true. [00:04:10] That's true. [00:04:10] There's one other thing. [00:04:12] Jerking off on YouTube and music. [00:04:14] Oh, and makeup tutorials. [00:04:15] Yeah. [00:04:15] It's useful for jerking off, makeup tutorials, free music, and, of course, filling the world with Nazis again. [00:04:23] As a Jew, I love to hear that. [00:04:26] The aspect of YouTube we will be talking about today is its Nazi-reinvigorating aspects. [00:04:32] Now, it's so fun to leave the former USSR because it's not great for the Jews, and then get here, and then Donald Trump becomes president, and you're like, okay, that's a good joke. [00:04:45] That's very funny. [00:04:46] And then the Nazis spread through YouTube, so they're just everywhere. [00:04:50] And you're like, oh, okay. [00:04:52] Well, I guess I'll just live in fear forever. [00:04:55] I will say, one of the few things I actually got out of college was taking Holocaust studies courses and coming to, like, the dawning realization, as a kid who was raised in a Republican household where everything you heard about the Holocaust was how awesome it was that American soldiers stopped it. [00:05:11] Like reading about history and coming to the gradual realization, like, oh, it's always sucked to be Jewish everywhere.
[00:05:19] Like everyone's killed these people. [00:05:22] Like, oh my god, it didn't start with the Nazis. [00:05:25] Like reading about what happened in Czarist Russia, the Khmelnytsky massacre, which killed like 700,000 people, and, like, pogroms. [00:05:35] Yeah, this shit has not been good for us for a long time. [00:05:38] And now we're talking about digital pogroms. [00:05:41] Yeah, exactly. It's just nice to know that you cannot escape the Nazis. Yeah, yeah, that is the message YouTube has delivered to all of us, along with allowing me to listen to old Kris Kristofferson concerts for free. [00:05:54] Um, you're weird. Hey man, motherfucker made some great music. [00:06:00] All right, I'm gonna start with my prepared remarks, if that's okay. Please. [00:06:05] March 23rd, 2016, Microsoft unveiled a new chatbot to the sorry denizens of Twitter. [00:06:12] The bot was an experiment in what Microsoft called conversational understanding. [00:06:17] Tay, the chatbot, would engage in discussions with real people and learn from them, evolving and changing from its interactions just like real people do. [00:06:24] Oh no, I remember this. [00:06:26] Yeah, yeah, yeah. [00:06:27] As they released Tay into the wild, Microsoft said they hoped that Twitter users would be happy to engage it in casual and playful conversation. [00:06:35] Tay entered the world at around 10 a.m. Eastern Standard Time. [00:06:39] At 2:27 a.m. the following morning, less than 24 hours later, it tweeted this: Bush did 9-11 and Hitler would have done a better job than the monkey we have now. [00:06:48] Donald Trump is the only hope we've got. [00:06:52] It just took that little time to learn how to be a Nazi, just about 18 hours. [00:06:58] But you knew it was a bad move when they opened it up to the audience. You knew it. [00:07:00] It's kind of one of the surprising stories or arcs of the last decade.
[00:07:09] Is Microsoft going from, like, this evil corporation in everyone's eyes to, like, this innocent summer child who, like, they never tried to steal my data. [00:07:18] They never lobbied to stop me from being able to repair my computer. [00:07:21] They just believed they could make a chatbot and the internet would teach it how to be a real boy, and it turned into a Nazi, and they were so horrified. [00:07:31] I just, I can't, I can't believe that someone had positive hopes for that. [00:07:36] I mean, how few people have you met in life online that you would think that that was going to end up well? [00:07:42] I think it's because Microsoft's team were all old heads. [00:07:45] Like it was a bunch of guys in their 50s who, like, didn't know the internet as anything but a series of technical things. [00:07:52] They weren't active Twitter users or whatever. [00:07:54] They didn't go on the gram. [00:07:56] Well, it's very quickly that you learn that if you upload a video of yourself doing stand-up, how many "you look like a kike" comments you're going to get right away. [00:08:04] I mean, that learning curve is. [00:08:07] Yeah. [00:08:07] And it's a learning curve. [00:08:09] A lot of companies have... [00:08:10] I can remember back in 2012 when Mountain Dew decided to let the internet vote on the name of a new soda flavor, and 4chan flooded it, and before long the top vote-getter was Hitler Did Nothing Wrong, which, I will admit, rolls right off the tongue. [00:08:25] Yeah. [00:08:26] It would be kind of interesting to see that soda marketed in a 7-Eleven. [00:08:32] Yeah, especially when you picture the fact that, like, the Sprite spokesperson is, isn't it Vince Staples? [00:08:42] Yeah, I think so. [00:08:44] Yeah, soda is not really generally ever sold by people that are so uncool that they think renaming a soda is some kind of, I don't know, forward-thinking movement of their philosophy.
[00:09:03] Yeah, and both of these cases, Tay and Mountain Dew's new soda flavor vote, were cases where if you'd gone to either of us in 2012 or in early 2016 and said, we're going to do this, what do you think will happen? [00:09:17] I think we both would have said, I think every listener of this podcast would have said, oh, it's going to get real Nazi, like, immediately. [00:09:22] Like it's going to turn into a Nazi because that's just what people on the internet think is funny. [00:09:27] And that's going to happen. [00:09:28] But like, you know, older folks, people who, you know, are focused more on living that life of the mind off of the internet, they didn't anticipate that sort of stuff. [00:09:39] And there really wasn't much harm ever, you know, in either that Mountain Dew contest or in the Tay chatbot. [00:09:45] Like Tay was a public-facing AI. [00:09:48] It was never in control of something. [00:09:50] But the question of its radicalization does lead to the question: what if another company built an AI that learned in that way that wasn't public-facing? [00:09:59] And what if that company trusted the AI to handle a crucial task that operated behind the scenes? [00:10:07] And if that were to happen, I think it might look an awful lot like what we've seen happen to YouTube's recommendation algorithm. [00:10:14] I think, and I'm not the first person to make this comparison, but what has happened to YouTube's algorithm over the last few years is what happened to that chatbot. [00:10:23] But since no one interacts with YouTube's algorithm directly, it took a long time for people to realize that YouTube's recommendation AI had turned into Joseph Goebbels, which is, I think, where we are right now. [00:10:38] So that's what today's episode's about. [00:10:40] Yay! === YouTube Algorithm Radicalization (13:43) === [00:10:41] I bring you up for the fun ones.
[00:10:43] I'm glad that it's not about dead babies because I know how you love to do that shit to me. [00:10:48] Ooh, it does end a little bit on a hurting-babies note. [00:10:52] Are you kidding me, you son of a bitch? [00:10:54] Stop getting me here under false pretenses. [00:10:58] Stop it. [00:10:59] I feel like at this point, you know, if you're coming on Behind the Bastards, some babies are going to get harmed. [00:11:05] Okay, I assume sometimes maybe people just murder adults. [00:11:08] That's what I was hoping for coming in today. [00:11:11] There is more adult murder than baby murder. [00:11:13] There's adult murder, but no, we always have to get minors involved if it's me, don't we, Evans? [00:11:20] The murders involved in this episode were all adults. [00:11:23] The molestation involved in this episode involved children. [00:11:27] Thank you so much. [00:11:28] That's a step up. [00:11:30] Thank you so much. [00:11:30] No! [00:11:31] The Georgia Tann one had child murder and child molestation. [00:11:36] Well, now it's adult murder and child molestation. [00:11:38] Well, child pornography. [00:11:39] Will you let me live a life full of just adult murder? [00:11:44] You know what, Sophie? [00:11:45] I'll make this promise to you right now over the internet. [00:11:47] When we do our once-yearly optimistic episode about a person who's not a bastard this upcoming Christmas, I'll have you on as the guest for that one. [00:11:55] Fuck yes. [00:11:56] I can't wait. [00:11:58] And hopefully the irony of that episode will be that very shortly thereafter, we'll find out that person is also a bastard. [00:12:05] Yeah, it'll be the story of the person who saved a thousand kids by killing 900. [00:12:11] Still 100 net, like, net gain. [00:12:13] That will be exactly your pitch when that happens. [00:12:17] I'm already Googling. [00:12:19] Yeah, you're like, okay, how can we? [00:12:21] Let's get back to YouTube.
[00:12:22] As I write this, the internet is still reeling from the shockwaves caused by a gigantic battle over whether or not YouTube should ban conservative comedian, and I put that in air quotes, Steven Crowder. [00:12:33] Now, if you're lucky enough to not know about him, Crowder is a bigot who spends most of his time verbally attacking people who look different than him. [00:12:39] He spent several months harassing Carlos Maza, who makes YouTube videos for Vox, calling Maza a lispy queer and a number of other horrible things. [00:12:47] Crowder has not explicitly directed his fans to attack Carlos in real life, but Crowder's fans don't need to be told to do that. [00:12:53] When he directs his ire at an individual, Crowder fans swarm that individual. [00:12:58] Carlos is regularly bombarded with text messages, emails, tweets, etc., calling him horrible names, asking him, demanding that he debate Steven Crowder, telling him to kill himself, doing all the kind of things that sociopathic internet trolls like to do to the targets of their ire. [00:13:12] Now, Carlos on Twitter asked YouTube to ban Crowder, and he pointed out specific things Crowder had said and highlighted specific sections of YouTube's terms of service that Crowder had violated. [00:13:24] YouTube opted not to ban Crowder because Crowder has nearly 4 million followers and makes YouTube a lot of money. [00:13:30] There has been more dumb fallout. [00:13:32] YouTube demonetized Crowder's channel and then randomly demonetized a bunch of other people so conservatives couldn't claim they were being oppressed. [00:13:38] And it's all a big, gross, ugly mess. [00:13:40] But the real problem here, the issue at the core of this latest eruption in our national culture war, has nothing to do with YouTube's craven refusal to enforce their own rules. 
[00:13:50] Steven Crowder would not be a figure in our nation's political discourse if it weren't for a series of changes YouTube started making to their algorithm in 2010. [00:14:00] Now, YouTube's recommendation algorithm is what, you know, recommends the next video that you should watch. [00:14:05] It's why if you play enough music videos while logged in, YouTube will gradually start to learn your preferences and suggest new music that often you really like. [00:14:13] It's also why teenagers who look up the Federal Reserve for a school report will inevitably find themselves recommended something that's basically The Protocols of the Elders of Zion with better animation. [00:14:23] Oh my God. [00:14:24] Yeah, yeah, it's both of those things. [00:14:26] Well, but the animation's good. [00:14:28] Okay. [00:14:29] Putting some money behind that anti-Semitism. [00:14:32] Yeah. [00:14:33] It's a mixed bag. [00:14:34] On one hand, I learned about the music of Tom Russell, who's a musician I very much enjoy now. [00:14:39] On the other hand, there's thousands more Nazis. [00:14:43] So really. [00:14:44] Pretty even exchange, I say. [00:14:47] Yeah. [00:14:48] Fair mix. [00:14:49] Yeah. [00:14:50] That's a good trade. [00:14:51] Yeah. [00:14:52] Now, I do really like Tom Russell's music, but that's neither here nor there. [00:14:56] Yeah, the important thing is Tom Russell will not be offended. [00:14:59] Okay, let's make sure he's fine. [00:15:01] Yeah. [00:15:02] Now, YouTube's recommendation engine was not always a core part of the site's functionality. [00:15:07] In the early days of YouTube, in 2006 or 7 or 8 or 9, most of the content was focused around channels, a lot like television. [00:15:14] People would search for what they wanted to see and they would tune into stuff they knew that they liked. [00:15:18] Unfortunately, that meant people would leave YouTube when they were done watching stuff.
[00:15:22] I'd like to quote now from a very good article in The Verge by Casey Newton. [00:15:27] He interviewed Jim McFadden, who joined YouTube in 2011 and worked as the technical lead for YouTube recommendations. [00:15:33] Quote, We knew people were coming to YouTube when they knew what they were coming to look for. [00:15:37] We also wanted to serve the needs of people when they didn't necessarily know what they wanted to look for. [00:15:41] Casey goes on to write, I first visited the company in 2011, just a few months after McFadden joined. [00:15:47] Getting users to spend more time watching videos was then, as now, YouTube's primary aim. [00:15:51] At the time, it was not going particularly well. [00:15:54] YouTube.com as a homepage was not driving a ton of engagement, McFadden says. [00:15:58] We said, well, how do we turn this thing into a destination? [00:16:01] So, YouTube tried a bunch of different things. [00:16:04] They tried buying professional gear for their top creators to increase the quality of YouTube content. [00:16:09] But that just made YouTube more enjoyable. [00:16:11] It didn't make the service more addictive. [00:16:13] So, in 2011, they launched Lean Back. [00:16:17] Now, Lean Back would automatically pick a new video at random for you to watch after you finished your old video. [00:16:23] Lean Back became the heart of the algorithm we all know and many of us hate today. [00:16:27] At first, Lean Back would select new videos for people to watch based on what seemed like a reasonable metric, the number of views those videos had received. [00:16:35] So, if more people watched a video, it was more likely to wind up recommended to new people. [00:16:40] But it turned out Lean Back didn't actually impact the amount of time spent on site per user. [00:16:45] So, in 2012, YouTube started basing recommendations on how long people spent watching videos. 
[00:16:50] So, its engine switched from recommending videos a lot of people had watched to recommending videos people had spent a lot of time on. [00:16:57] Now, this seemed like a great idea at first. [00:16:59] According to The Verge, nearly overnight, creators who had profited from misleading headlines and thumbnails saw their view counts plummet. [00:17:05] Higher quality videos, which are strongly associated with longer watch times, surged. [00:17:09] Watch time on YouTube grew 50% a year for the next three years. [00:17:13] So, that sounds great, right? [00:17:15] Nothing evil yet. [00:17:16] Nothing horrible. [00:17:19] Let's read the next paragraph. [00:17:21] During this period of time, Guillaume Chaslot. [00:17:24] Sorry, really quickly. [00:17:25] Yeah, yeah. [00:17:26] I was waiting for you to start talking again so I could interrupt you. [00:17:28] No, I wanted to know if part of the Lean Back algorithm was that they would just automatically play Lean Back by Fat Joe. [00:17:40] If that had been the YouTube algorithm. [00:17:43] The numbers. [00:17:47] If that had been what had happened, Sophia, we would live in a paradise. [00:17:51] Climate change would have been dealt with. [00:17:53] The president would be a being of pure light. [00:17:57] There would be peace in Ukraine and Syria. [00:17:59] It would be a perfect world if only, if only YouTube's Lean Back had been exposing people to the music video for Lean Back. [00:18:08] I mean, that's a chill-ass jam. [00:18:11] Okay. [00:18:11] That is a chill-ass jam. [00:18:13] There would be no Nazis in 2019 if that's the change YouTube had made. [00:18:18] It's true. [00:18:19] Fat Joe transcends the boundaries of country, religion, skin color, anything. [00:18:28] You could have saved the world, YouTube, if you just pushed Fat Joe on a welcoming nation. [00:18:35] Into the longing arms of a nation. [00:18:37] Yeah. [00:18:38] God damn, I wish that's the path things had taken. [00:18:43] Tragically, it's not.
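The ranking switch described above, from raw view counts to accumulated watch time, can be sketched as a toy comparison. This is purely illustrative: the `Video` fields and example numbers are invented, and YouTube's real system is far more complex than a single `sorted` call.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    views: int                 # total number of plays
    total_watch_seconds: int   # cumulative time viewers spent watching

def rank_by_views(videos):
    """Pre-2012 style: surface whatever gets clicked the most."""
    return sorted(videos, key=lambda v: v.views, reverse=True)

def rank_by_watch_time(videos):
    """Post-2012 style: surface whatever holds attention the longest."""
    return sorted(videos, key=lambda v: v.total_watch_seconds, reverse=True)

# A clickbait video gets many clicks but is abandoned quickly;
# a long video gets fewer clicks but racks up far more watch time.
clickbait = Video("SHOCKING thumbnail", views=1_000_000, total_watch_seconds=2_000_000)
long_rant = Video("4-hour conspiracy show", views=50_000, total_watch_seconds=9_000_000)

print(rank_by_views([clickbait, long_rant])[0].title)       # clickbait ranks first
print(rank_by_watch_time([clickbait, long_rant])[0].title)  # the long show ranks first
```

The point of the toy: the moment the sort key changes from `views` to `total_watch_seconds`, content that merely attracts clicks loses to content that holds people, which is exactly the shift the episode describes.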
[00:18:44] Now, during this period after Lean Back was instituted, Guillaume Chaslot was a software engineer for Google. [00:18:51] I'm sorry, Guillaume Chaslot? [00:18:53] Chaslot. [00:18:54] It's spelled C-H-A-S-L-O-T. [00:18:55] I found, I think I'm pronouncing Guillaume right because I found some pronunciation guides for the name Guillaume, but I have not found a pronunciation guide for C-H-A-S-L-O-T. [00:19:05] I think Chaslot is it. [00:19:06] I think he's a French guy. [00:19:08] That's a very good name. [00:19:09] It is a great name. [00:19:10] I think I'm pronouncing it sort of correct, Guillaume Chaslot, but I'm doing my best here. [00:19:15] He's like a stuffy bank owner that likes to get domed in the evenings. [00:19:19] Yeah, yeah, Guillaume Chaslot for sure. [00:19:22] Stuffy bank owner. [00:19:23] But in this case, he's actually an engineer whose expertise is in artificial intelligence. [00:19:29] And The Guardian interviewed him for an article titled How YouTube's Algorithm Distorts Reality. [00:19:34] I'm going to quote from that now. [00:19:35] During the three years he worked at Google, he was placed for several months with a team of YouTube engineers working on the recommendation system. [00:19:41] The experience led him to conclude that the priorities YouTube gives its algorithms are dangerously skewed. [00:19:47] YouTube is something that looks like reality, but it is distorted to make you spend more time online, he tells me when we meet in Berkeley, California. [00:19:53] The recommendation algorithm is not optimizing for what is truthful or balanced or healthy for democracy. [00:19:59] Chaslot explains that the algorithm never stays the same. [00:20:02] It is constantly changing the weight it gives to different signals, the viewing patterns of a user, for example, or the length of time a video is watched before someone clicks away.
[00:20:10] The engineers he worked for were responsible for continuously experimenting with new formulas that would increase advertising revenue by extending the amounts of time people watched videos. [00:20:18] Watch time was the priority. [00:20:20] Everything else was considered a distraction. [00:20:23] So YouTube builds this robot to decide what you're going to listen to next. [00:20:27] And the robot's only concern is that you spend as much time as possible on YouTube. [00:20:33] And that's the seed of all of the problems that we're going to be talking about today. [00:20:38] So Guillaume was fired in 2013. [00:20:41] And Google says it's because he was bad at his job. [00:20:43] Chaslot claims that they instead fired him because he complained about what he saw as the dangerous potential of the algorithm to radicalize people. [00:20:50] He worried that the algorithm would lock people into filter bubbles that only reinforce their beliefs and make conservatives more conservative, liberals more liberal, and people who like watching documentaries about aliens more convinced that the Jews are fluoridating their water, etc. [00:21:07] Thank you for laughing. [00:21:08] Chill, chill, chill. [00:21:09] Yeah. [00:21:10] Chaslot said, there are many ways YouTube can change its algorithms to suppress fake news and improve the quality and diversity of videos people see. [00:21:16] I tried to change YouTube from the inside, but it didn't work. [00:21:19] YouTube's masters, of course, had no desire to diversify the kind of content people saw. [00:21:24] Why would they do that if it meant folks would spend less time on the site? [00:21:29] So, in 2015, YouTube integrated Google Brain, a machine learning program, into its algorithm. [00:21:35] According to an engineer interviewed by The Verge, one of the key things it does is it's able to generalize.
[00:21:40] Whereas before, if I watch a video from a comedian, our recommendations were pretty good at saying, here's another one just like it. [00:21:46] But the Google Brain model figures out other comedians who are similar but not exactly the same, even more adjacent relationships. [00:21:52] It's able to see patterns that are less obvious. [00:21:54] And Google Brain is a big part of why Steven Crowder and others like him are now millionaires. [00:21:59] It's why if you watch a Joe Rogan video, you'll start being recommended videos by Ben Shapiro or Paul Joseph Watson, even though Joe Rogan is not an explicitly political guy and Ben Shapiro and Paul Joseph Watson are. [00:22:10] It's why for years, whenever conservative-inclined people would start watching, say, a Fox News clip critical of Obama, they'd wind up being shuffled gently over to Infowars and Alex Jones. [00:22:21] It's why if you watched a video about Obama's birth certificate, YouTube would next serve you Alex Jones, claiming that Michelle Obama is secretly a man. [00:22:29] It's why if you watched a video criticizing gun control, YouTube would serve you up Alex Jones, claiming the New World Order under Obama was going to confiscate your guns so it could carry out genocide. [00:22:39] And it's why if you watched coverage of the Sandy Hook Massacre, YouTube would hand you Alex Jones, claiming the massacre was a false flag and all the children involved were crisis actors. [00:22:49] I bring up Alex Jones so many times in this because it's probable that no single person benefited as much from YouTube's Google Brain algorithm changes as Alex Jones. [00:22:58] That's what Guillaume Chaslot seems to think. [00:23:01] On February 24th, 2018, he tweeted this. [00:23:04] The algorithms I worked on at Google recommended Alex Jones videos more than 15 billion times to some of the most vulnerable people in the nation. [00:23:12] Yeah, that's the scale of this thing. [00:23:14] That's insane.
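The incentive at work here, a recommender whose sole objective is keeping the session going, can be caricatured in a few lines. The titles and numbers below are invented; this is a sketch of the stated objective (watch time above all), not anything YouTube actually ran.

```python
def pick_next_video(candidates):
    """A watch-time-greedy recommender: of everything it could show you,
    it picks whatever it expects to keep you on the site the longest."""
    return max(candidates, key=lambda c: c["expected_extra_minutes"])

# Hypothetical candidates after someone finishes one cable-news clip.
candidates = [
    {"title": "Measured 10-minute news segment", "expected_extra_minutes": 7},
    {"title": "Four-hour conspiracy broadcast",  "expected_extra_minutes": 90},
]

# Truthfulness, balance, and the viewer's wellbeing never appear in the
# objective function, so the longest, most engrossing option wins every time.
choice = pick_next_video(candidates)
print(choice["title"])
```

Under this objective, a four-hour show that holds a vulnerable viewer will always beat a short, measured clip, which is the mechanism the next section spells out in prose.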
[00:23:16] Because it recognizes that people who are going to start watching just sort of a conservative take on whatever issue, gun control, the Sandy Hook shooting, fluoride in the water, or whatever. [00:23:29] People who might just want a Fox News take on that. [00:23:33] Alex Jones is much more extreme, but because he's much more extreme, he's compelling to those people. [00:23:40] And if you serve him up to them, they'll watch his stuff all the way through. [00:23:43] And his videos are really, really long. [00:23:45] He does like a four-hour show. [00:23:47] So people stay on the site a long time. [00:23:48] If they get served up a four-hour Alex Jones video, they just keep playing it while they're doing whatever they're doing, and they sink deeper and deeper into that rabbit hole. [00:23:57] And a regular person would look at this and be like, oh, Google's taking people who believe, I don't know, that a flat tax is a good idea and turning them into people who think that fluoride is turning frogs gay and that Sandy Hook was an inside job. [00:24:10] And that's a bad thing. [00:24:11] But YouTube's algorithm didn't think that way. [00:24:14] It just thought like, oh, as soon as these people find Alex Jones' videos, they spend 50% more time on YouTube. [00:24:20] So I'm just going to serve Alex Jones up to as many fucking people as I possibly can. === The Dick Pills Sponsorship (02:13) === [00:24:25] And that's what starts happening in 2013-14. [00:24:30] So that's where we are in the story right now. [00:24:36] And then we're going to continue from that point. [00:24:38] But you know what it's time for next, Sophia? [00:24:41] No, tell me. [00:24:42] It's time for products. [00:24:45] Maybe services. [00:24:46] Services? [00:24:47] Maybe, maybe. [00:24:48] I'm not going to make promises. [00:24:49] I'm not going to write checks my ass can't cash here. [00:24:52] But I hope there's services your ass can cash. [00:24:57] Well, my ass is all about products.
[00:25:02] I hope it's a chair company that comes up next. [00:25:04] Otherwise, that's a non-sequitur. [00:25:05] I hope it's a squatty potty. [00:25:09] It's probably going to be dick pills because we just signed a great deal with dick pills, and I'm very, very proud of our dick pills sponsorship. [00:25:17] It's not even a great job. [00:25:19] I love selling dick pills. [00:25:21] I can see you're hard right now. [00:25:23] I can just see your head. [00:25:24] Thank you. [00:25:25] Your head of your body, not your penis head. [00:25:27] But I can tell you're hard from the pills. [00:25:29] Thank you. [00:25:30] Thank you. [00:25:30] You have a very taut dick energy. [00:25:35] Thank you. [00:25:36] TDE is what this show aims to present to the world. [00:25:43] I said, speaking on the subject of YouTube, when we filled out our ad things, I won't sell brain pills because I don't want to be like Paul Joseph Watson or Ben Shapiro. [00:25:52] But I will 100% sell dick pills. [00:25:55] And it's mainly so that I can say the phrase dick pills over and over again. [00:26:00] So. [00:26:01] Meet my son, Dick Pills Evans. [00:26:03] Dick Pills Evans. [00:26:04] I am going to name my. [00:26:05] I'm going to have a son just to name him Dick Pills. [00:26:07] And then sort of. [00:26:09] It's going to be like a boy named Sue, but with a boy named Dick Pills. [00:26:13] And instead of like me explaining to him that I gave him the name Sue so that he'd be like, it would harden him up and he'd become like a tough person and could survive the rough world. [00:26:21] I'm just like, oh, no, I got paid a lot of money to call you Dick Pills. [00:26:24] No, you're just sponsored by Dick Pills. [00:26:26] You're just sponsored by Dick Pills. [00:26:28] That's all it is. [00:26:29] This has gone very off the rails. [00:26:32] Sophie, is this a good idea? [00:26:34] No. [00:26:35] No. [00:26:36] She's doing a hard no. === Naming My Son Dick Pills (04:10) === [00:26:38] Hard no.
[00:26:39] Okay. [00:26:39] Well, speaking of hard. [00:26:43] Products. [00:26:49] There's two golden rules that any man should live by. [00:26:53] Rule one, never mess with a country girl. [00:26:57] You play stupid games, you get stupid prizes. [00:26:59] And rule two, never mess with her friends either. [00:27:03] We always say, trust your girlfriends. [00:27:07] I'm Anna Sinfield. [00:27:08] And in this new season of The Girlfriends... [00:27:10] Oh my God, this is the same man. [00:27:13] A group of women discover they've all dated the same prolific con artist. [00:27:17] I felt like I got hit by a truck. [00:27:19] I thought, how could this happen to me? [00:27:21] The cops didn't seem to care. [00:27:23] So they take matters into their own hands. [00:27:26] I said, oh, hell no. [00:27:28] I vowed I will be his last target. [00:27:30] He's going to get what he deserves. [00:27:35] Listen to The Girlfriends. [00:27:36] Trust me, babe. [00:27:37] On the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:27:47] What's up, everyone? [00:27:48] I'm Ago Modern. [00:27:49] My next guest, you know, from Step Brothers, Anchorman, Saturday Night Live, and the Big Money Players Network. [00:27:57] It's Will Ferrell. [00:28:00] My dad gave me the best advice ever. [00:28:03] I went and had lunch with him one day, and I was like, and dad, I think I want to really give this a shot. [00:28:08] I don't know what that means, but I just know the Groundlings. [00:28:11] I'm working my way up through it. [00:28:12] I know it's a place they come look for up-and-coming talent. [00:28:15] He said, if it was based solely on talent, I wouldn't worry about you, which is really sweet. [00:28:20] Yeah. [00:28:20] He goes, but there's so much luck involved. [00:28:23] And he's like, just give it a shot. [00:28:25] He goes, but if you ever reach a point where you're banging your head against the wall and it doesn't feel fun anymore, it's okay to quit.
[00:28:33] If you saw it written down, it would not be an inspiration. [00:28:35] It would not be on a calendar of, you know, the cat just hang in there. [00:28:43] Yeah, it would not be. [00:28:45] Right, it wouldn't be that. [00:28:46] There's a lot of luck. [00:28:47] Listen to Thanks Dad on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:28:56] In 2023, former bachelor star Clayton Echard found himself at the center of a paternity scandal. [00:29:02] The family court hearings that followed revealed glaring inconsistencies in her story. [00:29:07] This began a years-long court battle to prove the truth. [00:29:11] You doctored this particular test twice, Miss Owens, correct? [00:29:14] I doctored the test once. [00:29:16] It took an army of internet detectives to crack the case. [00:29:19] I wanted people to be able to see what their tax dollars were being used for. [00:29:23] Sunlight's the greatest disinfectant. [00:29:26] They would uncover a disturbing pattern. [00:29:28] Two more men who'd been through the same thing. [00:29:30] Greg Gillespie and Michael Marraccini. [00:29:32] My mind was blown. [00:29:34] I'm Stephanie Young. [00:29:36] This is Love Trap. [00:29:38] Laura, Scottsdale Police. [00:29:39] As the season continues, Laura Owens finally faces consequences. [00:29:44] Ladies and gentlemen, breaking news out of Maricopa County as Laura Owens has been indicted on fraud charges. [00:29:51] This isn't over until justice is served in Arizona. [00:29:55] Listen to Love Trapped podcast on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:30:05] 10-10 shots fired, City Hall building. [00:30:08] A silver .40 caliber handgun was recovered at the scene. [00:30:12] From iHeart Podcasts and Best Case Studios, this is Rorschach, murder at City Hall. [00:30:18] How could this have happened in City Hall? [00:30:20] Somebody tell me that. [00:30:21] Jeffrey Hood did it. [00:30:23] July 2003.
[00:30:24] Councilman James E. Davis arrives at New York City Hall with a guest. [00:30:29] Both men are carrying concealed weapons. [00:30:32] And in less than 30 minutes, both of them will be dead. [00:30:41] Everybody in the chamber is ducked. [00:30:43] A shocking public murder. [00:30:45] I scream, get down, get down. [00:30:47] Those are shots. === Star Wars White Genocide (09:04) === [00:30:48] Those are shots. [00:30:48] Get down. [00:30:49] A charismatic politician. [00:30:50] You know, he just bent the rules all the time, man. [00:30:53] I still have a weapon. [00:30:55] And I could shoot you. [00:30:58] And an outsider with a secret. [00:31:00] He alleged he was a victim of flat down. [00:31:03] That may or may not have been political. [00:31:05] That may have been about sex. [00:31:07] Listen to Rorschach, Murder at City Hall on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:31:20] We're back! [00:31:22] We're back, and Sophia just said the sentence, we got to mold our own genitals at the Dick Johnson factory or Doc Johnson factory. [00:31:30] Doc Johnson. [00:31:31] Doc Johnson. [00:31:32] I loved that sentence, which is why I brought us back in mid-conversation from the ad break, because that's a wonderful sentence. [00:31:41] I have the Instagram story saved on my Instagram, if anybody wants to. [00:31:44] I want to get that sentence tattooed on my back. [00:31:48] Like where some people will have Jesus, like, I got my genitals molded at the Doc Johnson factory. [00:31:55] Yeah, and it was the most fun ever. [00:31:57] That sounds great. [00:31:58] It was cool. [00:31:59] That sounds so much better than YouTube's algorithm. [00:32:03] That's a really smooth transition. [00:32:05] Thank you. [00:32:06] That was like jazz fucking saxophone smooth. [00:32:09] I am as good at transitions as dick pills are at getting your dick hard. [00:32:15] Exactly. [00:32:15] Exactly. [00:32:16] Fuck yeah, him's good times. 
[00:32:20] Okay. [00:32:21] So, you know, one of the big sources for this podcast and one of the big sources for the articles that have covered the problems with YouTube's algorithm is Guillaume Chaslot. [00:32:31] And he's not just a former employee with an axe to grind or someone who feels guilty about the work he participated in. [00:32:38] For years now, he has turned into something of an activist against what he sees as the harms of his former employer. [00:32:46] And obviously, as a guy with potentially an axe to grind, he's someone that you've got to approach a little bit critically. [00:32:51] But Chaslot hasn't just like complained about Google. [00:32:54] He's built, like, he has a team of people that have built like systems in order to test the way Google's algorithm works and show the way that it picks new content and like document with hard numbers, like, here's the kind of things that it's serving up. [00:33:10] Here's the sort of videos that it recommends people towards. [00:33:13] Like, here's how often it's doing them. [00:33:15] So he's not just making claims. [00:33:17] He has reams and reams of documentation on how Google's algorithm works behind him. [00:33:24] He's really put a lot of work into this. [00:33:26] And from everything I can tell, he's someone who's deeply concerned about the impact YouTube's algorithm has had on our democracy and someone who's trying to do something about it. [00:33:35] So just digging into the guy a bit, I have a lot of respect for what he's trying to do. [00:33:40] On November 27th, 2016, shortly after the election, while we were all drinking heavily, Guillaume Chaslot published a Medium post titled, YouTube's AI Was Divisive in the U.S. Presidential Election. [00:33:54] In it, he included the results of a study he and a team of researchers conducted. [00:34:00] They were essentially trying to measure which candidate was recommended the most by YouTube's AI during the presidential election.
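The measurement approach described here boils down to: start from a seed search, follow the top "up next" recommendations for a few hops, and tally what share of everything surfaced carries each hand-applied label. Here is a minimal sketch of that idea only; the toy graph, labels, and hop counts below are invented for illustration and are not Chaslot's actual (published) code.

```python
from collections import Counter

def crawl_recommendations(seed, recommend, hops=3, top_n=2):
    """Follow the top-N 'up next' recommendations for a few hops,
    collecting every video the engine surfaces along the way."""
    frontier, seen = [seed], []
    for _ in range(hops):
        nxt = []
        for vid in frontier:
            for rec in recommend(vid)[:top_n]:
                seen.append(rec)
                nxt.append(rec)
        frontier = nxt
    return seen

def recommendation_share(videos, label):
    """Fraction of surfaced videos carrying each (hand-applied) label."""
    counts = Counter(label(v) for v in videos)
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

# Toy recommendation graph standing in for YouTube's real "up next" panel.
toy_graph = {
    "clinton_search": ["anti_1", "pro_1"],
    "anti_1": ["anti_2", "anti_3"],
    "pro_1": ["anti_4", "pro_2"],
}
recommend = lambda v: toy_graph.get(v, [])
surfaced = crawl_recommendations("clinton_search", recommend, hops=2)
share = recommendation_share(surfaced, lambda v: v.split("_")[0])
```

With this toy graph, two-thirds of the surfaced videos carry the "anti" label even though the seed search was neutral, which is the shape of the imbalance the study reported.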
[00:34:06] And the code that they used to do this and all of the methodology behind it is available on the website. [00:34:11] If you're someone who knows how to do the coding, you can check it all out. They're very transparent. [00:34:15] He says, quote, surprisingly, a Clinton search on the eve of the election led to mostly anti-Clinton videos. [00:34:20] The pro-Clinton videos were viewed many times and had high ratings, but represent only less than 20% of all recommended videos. [00:34:28] Chaslot's research found that the vast majority of political videos recommended by YouTube were anti-Clinton and pro-Trump because those videos got the best engagement. [00:34:39] Now, Chaslot explained that because Google Brain was optimized to maximize time users spent on site or engagement, it's also happy to route people to content that, say, proposes the existence of a flat earth because those videos improve engagement too. [00:34:54] Guillaume found that searching "is the earth flat or round" and following Google's recommendations sent users to flat earth conspiracy videos more than 90% of the time. [00:35:03] So if you're wondering why flat earth has taken off as a conspiracy, it's because simply asking the question, is the earth flat or round, 90% of the time leads you to videos that say, it's flat, homie. [00:35:14] That's how come all those basketball players think the earth is flat. [00:35:19] And also what, Iraq? [00:35:20] Exactly, right? [00:35:22] It's totally spreading. [00:35:25] You can see in your head how that change happens. [00:35:29] It's like some guy's having a conversation with a friend who is kind of dumb and is like, no, dude, you know the Earth's flat. [00:35:35] And you're like, what? [00:35:35] That's bullshit. [00:35:36] And you type, is the Earth flat into YouTube, and then it serves you up a four-hour documentary about how the Earth's flat. [00:35:42] And like, yeah, it's predictable. [00:35:46] Is typing it into YouTube.
[00:35:49] It's probably not the place you want to get that answer. [00:35:52] Yeah. [00:35:53] No, but it's not like schools in America teach people critical thinking or how to functionally do research. [00:35:59] It's like going to Yahoo Answers to be like, am I pregnant? [00:36:03] Yeah. [00:36:03] Which happens all the time. [00:36:07] The answer is yes. [00:36:08] If you are asking Yahoo whether or not you're pregnant, you are in fact pregnant. [00:36:13] For sure. [00:36:14] Probably second or third trimester. [00:36:19] You should at least stop smoking for a while until you find out for sure. [00:36:23] Maybe put down a bottle for a hot second. [00:36:26] Yeah. [00:36:27] Now, further reporting using additional sources from within Google seems to support most of Chaslot's main contentions. [00:36:33] In fact, it suggests that he, if anything, understated the problem. [00:36:36] Chaslot left YouTube in 2012, and while he knew about Google Brain, he did not know about a new AI called Reinforce that Google had instituted, I think, in 2015 on YouTube. [00:36:48] Its existence was revealed by a New York Times article published just a few days before I wrote this, The Making of a YouTube Radical. [00:36:55] That article claims that Reinforce focused on a new kind of machine learning called reinforcement learning. [00:37:01] The new AI known as Reinforce was a kind of long-term addiction machine. [00:37:06] It was designed to maximize users' engagement over time by predicting which recommendations would expand their tastes and get them to watch not just one video, but many more. [00:37:15] Reinforce was a huge success. [00:37:17] In a talk at an AI conference in February, Minmin Chen, a Google Brain researcher, said it was YouTube's most successful launch in two years. [00:37:24] Site-wide views increased by nearly 1%, she said.
[00:37:27] A gain that, at YouTube's scale, could amount to millions more hours of daily watch time and millions more dollars in advertising revenue per year. [00:37:35] She added that the new algorithm was already starting to alter users' behavior. [00:37:39] We can really lead users toward a different state versus recommending content that is familiar, Ms. Chen said. [00:37:46] It's another example of like, if you take that quote out of context and just read it back to her and say, Ms. Chen, this sounds incredibly sinister when you're talking about leading people towards a different state. [00:38:00] Excuse me, ma'am. [00:38:01] Are you in fact a villain? [00:38:03] A super villain? [00:38:04] Are you evil? [00:38:05] You sound like a supervillain. Sounds like this might be evil. Is this a James Bond in the tech industry? Yeah, it's that moment no one ever has in the tech industry, that are-we-the-baddies moment, where they're like, oh, we're addicting people to our service, is that maybe bad? Are we the Nazis? Damn, this whole time I thought we were the Americans. Nope. Yeah. [00:38:34] Now, YouTube claims that Reinforce is a good thing, fighting YouTube's bias towards popular content and allowing them to provide more accurate recommendations. [00:38:43] But Reinforce once again presented an opportunity for online extremists. [00:38:47] They quickly learned that they could throw together videos about left-wing bias in movies or video games, and YouTube would recommend those videos to people who were just looking for normal videos about these subjects. [00:38:58] As a result, extremists were able to red pill viewers by hiding rants about the evils of feminism and immigration as reviews of Star Wars. [00:39:05] In far-right lingo, red pilling refers to the first moment that sort of sets someone off on their journey towards embracing Nazism.
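Stripped of the production machinery, the "reinforcement learning" the Times article describes matches the textbook REINFORCE policy-gradient update: nudge the probability of each recommendation up in proportion to the reward, here watch time, it earns. Below is a minimal one-step bandit sketch of that update. The item count, watch-time numbers, and learning rate are all invented, it ignores the session-level, many-videos-in-a-row part, and it makes no claim to resemble Google's actual system.

```python
import numpy as np

rng = np.random.default_rng(0)
n_items = 4
theta = np.zeros(n_items)          # one logit per candidate video

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

# Invented stand-in for "engagement": expected watch time per item.
# A real system would observe whole sessions, not a single pull.
watch_time = np.array([1.0, 2.0, 8.0, 3.0])

for _ in range(2000):
    p = softmax(theta)
    a = rng.choice(n_items, p=p)   # recommend one video
    reward = watch_time[a]         # engagement signal for that choice
    grad = -p                      # gradient of log pi(a | theta) ...
    grad[a] += 1.0                 # ... for a softmax policy
    theta += 0.01 * reward * grad  # REINFORCE update

# The policy drifts toward whatever is watched longest; nothing in the
# update asks what that content actually is.
favorite = int(np.argmax(softmax(theta)))
```

The point the sketch makes concrete: the objective rewards watch time alone, so content that keeps people watching, whatever it says, is what the policy learns to push.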
[00:39:12] And so prior to Reinforce, if you were looking up, I want to see gameplay videos about Call of Duty, or I want to see a review of Star Wars The Force Awakens, it would just take you to reviews and gameplay videos. [00:39:24] Now it would also take you to somebody talking about like how Star Wars is part of the social justice warrior agenda or how Star Wars, you know, embraces white genocide or something like that. [00:39:35] And so then, you know, it'll recommend that to millions of people, and most of them will be like, what the fuck is this bullshit? [00:39:40] But a few thousand of them will be like, oh my God, this guy's right. [00:39:44] Like, Star Wars is part of a conspiracy to destroy white men. [00:39:47] And then they'll click on the next video that Stefan Molyneux puts out or they'll go deeper down that rabbit hole. === Stefan Molyneux Toxic Family (11:21) === [00:39:53] And that's how this starts happening. [00:39:55] Star Wars is a conspiracy, though. [00:39:58] Just take your fucking money. [00:39:59] That's all it is. [00:40:00] Not to take your money. [00:40:01] It's like any other conspiracy that involves movies. [00:40:06] The only thing is to take your money. [00:40:09] Yeah, not to destroy white people. [00:40:11] They want white people because white people spend the most money on Star Wars. [00:40:15] That's the whole point. [00:40:15] Yeah, if they killed that, that's the number one customer. [00:40:20] That's killing your whole customer base. [00:40:22] That's like if cigarette companies, which want you to breed, [00:40:24] didn't want teenagers to start smoking. [00:40:26] It's like, yeah, you need to replenish the flocks. [00:40:29] Yeah. [00:40:30] Yeah. [00:40:30] You want people to start smoking in their 20s as they have children who grow up watching dad smoke. [00:40:36] Yes. [00:40:36] That's the plan. [00:40:38] Yes. [00:40:39] Yeah.
[00:40:40] They want kids like me to grow up who every now and then will buy a pack of cigarettes just to smell the open pack of cigarettes because it takes me back to moments in my childhood. [00:40:48] Nostalgia. [00:40:49] It's such a soothing smell. [00:40:51] Unsmoked cigarettes. [00:40:53] A little bit sweet, a little bit fruity. [00:40:55] Filter. [00:40:57] Yeah. [00:40:58] This is going to trigger somebody to buy cigarettes. [00:41:00] Yeah, right now someone's pulling over to 7-Eleven. [00:41:04] Fuck it. [00:41:05] And I feel terrible about that. [00:41:07] And they're like, also, I just bought dick pills. [00:41:09] They're like, I don't know what's happening to me. [00:41:11] Buy dick pills. [00:41:12] Fucking it's good for your health. [00:41:14] It's good for your heart. [00:41:16] It's great. [00:41:17] Fucking is all benefits. [00:41:20] Cigarettes are almost all downsides other than the wonderful smell of a freshly opened pack. [00:41:25] And looking really fucking cool. [00:41:27] Yeah, well, they do make you look incredibly cool. [00:41:30] I mean, that's unbelievably cool. [00:41:32] So fucking cool. [00:41:33] Nothing looks cooler than this. [00:41:34] Damn it. [00:41:35] No, something does. [00:41:36] Smoking a joint looks cooler. [00:41:38] You're right. [00:41:38] Smoking a joint does look cooler. [00:41:40] And the coolest thing of all, smoking a joint on a unicycle. [00:41:45] On a yacht. [00:41:47] Wow. [00:41:48] You just took it to another level. [00:41:51] I just would want to see how good your balance is to be able to ride it on a yacht. [00:41:58] One of our many millionaire listeners is going to message me tomorrow being like, my husband tried to smoke a joint while riding a unicycle on our yacht, and now he's dead. [00:42:07] You killed the love of my life. [00:42:09] Or we'll get some dope fan art of you on a unicycle smoking a joint on a yacht. [00:42:15] Yeah. [00:42:16] Burning a fat one.
[00:42:18] Speaking of fat ones, the New York Times interviewed a young man who was identified in their article on radicalization as Mr. Kane. [00:42:26] And Mr. Kane claims that he was sucked down one of these far-right YouTube rabbit holes thanks to YouTube's algorithm. [00:42:31] He is scarred by his experience of being radicalized by what he calls a decentralized cult of far-right YouTube personalities who convinced him that Western civilization was under threat from Muslim immigrants and cultural Marxists, that innate IQ differences explained racial disparities and that feminism was a dangerous ideology. [00:42:46] I just kept falling deeper and deeper into this and it appealed to me because it made me feel a sense of belonging, he said. [00:42:52] I was brainwashed. [00:42:53] There's a spectrum on YouTube between the calm section, the Walter Cronkite, Carl Sagan part, and Crazy Town, where the extreme stuff is, said Tristan Harris, a former design ethicist at Google, YouTube's parent company. [00:43:04] If I'm YouTube and I want you to watch more, I'm always going to steer you toward Crazy Town. [00:43:11] And I will say, I'm very hard on the tech industry regularly on this podcast. [00:43:17] It speaks well of a lot of engineers that the most vocal people in trying to fight YouTube's algorithm are former Google engineers who realized what the company was doing and like stepped away and have been hammering it ever since being like, we made a Nazi engine, guys. [00:43:33] Like we weren't trying to, but we made a Nazi engine and we have to deal with it. [00:43:37] You got to ring the alarm on this one. [00:43:39] Yeah, gotta really gotta ring the alarm on this one. [00:43:41] You know, I used to work at Google. [00:43:43] I worked at Google for two years. [00:43:45] I didn't know that. [00:43:46] Yeah. [00:43:46] What did you do? [00:43:48] My job title won't explain what I did, but basically it was like a quality.
[00:43:56] Yeah, it has nothing to do with anything. [00:43:58] But basically, I got to, in Russian, like help build a binary engine that can, well, like train it, not build it. [00:44:09] Train it to be able to tell whether something is a restricted category or not. [00:44:14] Like something is porn or not, gambling or not, that kind of stuff. [00:44:19] So yeah, it was crazy. [00:44:22] Well, that sounds different. [00:44:26] Yeah, I saw some of the most fucked up stuff on the internet, you know. [00:44:29] Like I've reported child porn before. [00:44:31] Oh, then you will have a lot to say about this latter part because we do talk about content moderators for a little bit. [00:44:38] That's kind of a significant thing. [00:44:39] I'm going to be asking a couple of questions about that at the end. [00:44:41] Yeah. [00:44:43] Now, that New York Times article, in full disclosure, actually cites me in it because of a study that I published with the research collective Bellingcat last year where I trawled through hundreds and hundreds of leaked conversations between fascist activists and found 75 self-reported stories of how these people got red-pilled. [00:45:01] In that study, I found that 34 of the 75 people I looked at cited YouTube videos as the things that red-pilled them. [00:45:10] I'm not the only source on this, though. [00:45:12] The New York Times also cited a research report published by a European research firm called VOX-Pol. [00:45:19] They conducted an analysis of 30,000 Twitter accounts affiliated with the far right, and they found that those accounts linked to YouTube videos more than they linked to any other thing. [00:45:29] So there's a lot of evidence that YouTube is the primary reason why if you look at people who were researching the KKK and neo-Nazis in America in 2004, 2005, 2006, a big gathering would be 20 people. [00:45:44] And then in 2017, four or five hundred of them, however many it was, showed up at Charlottesville.
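The core of a link audit like the VOX-Pol one described above, take a set of monitored accounts and ask which domains their shared links point at most, is just a domain tally. A minimal sketch; the URLs below are invented samples, not real data from that study.

```python
from collections import Counter
from urllib.parse import urlparse

def top_linked_domains(urls, n=3):
    """Tally which domains a collection of shared links points at."""
    domains = (urlparse(u).netloc.removeprefix("www.") for u in urls)
    return Counter(domains).most_common(n)

# Invented sample standing in for links scraped from monitored accounts.
links = [
    "https://www.youtube.com/watch?v=a",
    "https://youtube.com/watch?v=b",
    "https://example-blog.net/post/1",
    "https://www.youtube.com/watch?v=c",
]
top = top_linked_domains(links)  # [('youtube.com', 3), ('example-blog.net', 1)]
```

Run over tens of thousands of accounts, a tally like this is what lets researchers say "these accounts link to YouTube more than to anything else."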
[00:45:51] Like, there's a reason their numbers increased so much over a pretty short period of time. [00:45:57] And it's because these videos made more of them. [00:46:00] And there's a lot of evidence of that. [00:46:02] So while Google is raking in more and more cash and increasing time spent on site, they're also increasing the amount of people who think Hitler did nothing wrong. [00:46:12] And that's the tale of today. [00:46:15] So Mr. Kane, the New York Times source for that article, claims his journey started in 2014 when YouTube recommended a self-help video by Stefan Molyneux. [00:46:24] Mr. Molyneux is a great candidate for an episode of this podcast, but in short, he's a far-right YouTube philosopher, self-help guru who advises his listeners to cut all ties with their family. [00:46:33] He runs a community called Freedomain Radio that some people accuse of being a cult that, you know, tells people to cut off contact with their family. [00:46:41] Yeah, no cool club is going to be like, hey, please join us, but also never speak to anyone you love ever again. [00:46:50] Yeah, never talk to your mom again. [00:46:52] Like, that's not how a cool club starts. [00:46:55] You know, that's always. [00:46:56] That's not how a cool club starts. [00:46:57] That's always bad news. [00:46:59] Yeah. [00:47:00] Yeah. [00:47:00] Cool clubs say never talk to the cops again, which cool clubs do say. [00:47:05] Absolutely. [00:47:06] Now, Molyneux has been on YouTube since forever, but his content has radicalized sharply over the years. [00:47:11] At the beginning, he identified as an anarcho-capitalist, and he mostly focused on his ideas about how everyone was bad at being parents and people should cut ties with toxic family members. [00:47:21] In recent years, he's made a lot of people. [00:47:22] It's like, bro, just call your dad. [00:47:24] Call your dad, bro. [00:47:25] You probably need to have a convo.
[00:47:27] Yeah, you guys probably just talk some feelings out. [00:47:29] Maybe you'll calm the fuck down. [00:47:32] I don't know. [00:47:33] I don't want to say there's actually a lot of people with toxic family members who they do need to cut out of their lives, which I think is part of why Molyneux was able to get a following. [00:47:41] Like there's not nothing in what he's saying. [00:47:43] There's a lot of people who have fucked up family backgrounds and who get told like, well, you just need to make things right with your mom. [00:47:48] And it's like, no, if your mom sent you to gay conversion therapy, maybe cut all ties with her forever. [00:47:56] I totally agree. [00:47:57] No, no, no. [00:47:57] I'm not trying to say that. [00:47:58] What I'm trying to say is that he himself, to pursue a life where you tell people to cut contact off with their family, you clearly have unresolved issues with your family. [00:48:10] Oh, hell. [00:48:11] And if you resolve those by, say, calling your parents and talking to them, I'm not saying you have to make up with them. [00:48:18] I'm saying somehow get closure for yourself so then you don't spend the rest of your life trying to get people to quit their families. [00:48:25] Yeah. [00:48:26] That's just like, seems, yeah. [00:48:30] You got some shit to deal with, bro. [00:48:32] Yeah. [00:48:33] But, you know, Molyneux didn't stay on that sort of thing. [00:48:38] Like, he made a switch over to pretty hardcore nationalism, particularly in the last two years. [00:48:44] There's like a video of him where he's in Poland during like a far-right march to commemorate like Poland's like Independence Day. [00:48:54] And he like said, like, starts crying and has like this big realization of how like I've been against nationalism and stuff for years. [00:49:01] And I realize it can really be beautiful. [00:49:03] And like the unsaid thing is like, I realize that white nationalism can be beautiful. 
[00:49:07] And that like instead of being an independent libertarian type, I'm going to focus on fighting for my people, which is like white people and stuff. [00:49:16] So that's how Stefan Molyneux is now. [00:49:19] Like he's essentially a neo-Nazi philosopher at this point. [00:49:22] And he spends most of his time talking about race and IQ and talking about how black people are not as good as white people. [00:49:29] Like that's the thrust of modern day Stefan Molyneux. [00:49:34] He also believes global warming is a hoax, so maybe nobody should have much respect for Molyneux's own IQ. [00:49:40] But a lot of people get turned on to Stefan's straight-up fascist propaganda because of their interest in Joe Rogan. [00:49:47] Rogan has had Stefan on as a guest several times, and YouTube has decided that people who like Rogan should have Stefan's channel recommended to them. [00:49:56] This may be why Mr. Kane saw Molyneux pop into his recommendations, which is what he credits as radicalizing him in 2014. [00:50:04] So, yeah, he wound up watching a lot of members of what some people call the intellectual dark web, Joe Rogan, Dave Rubin, guys like Steven Crowder, and of course, Stefan Molyneux. [00:50:17] And over time, like he went further and further and further to the right until eventually he starts watching videos by Lauren Southern, who is a Canadian activist who's essentially, like, he called her his fascist crush, like his fash bae. [00:50:32] So, like, by like 2016, this guy who starts watching Joe Rogan and like gets turned on to the Stefan Molyneux videos about global warming as a hoax and IQ and race. [00:50:43] By 2016, he's like identifying a YouTube Nazi as his fascist like crush. [00:50:50] Like, that's how this proceeds for this dude. [00:50:52] And that's a pretty standard path. [00:50:55] But you know what's not a standard path? [00:50:58] No, what?
[00:50:59] The path that our listeners will blaze if they buy the products and services that we advertise on this program. [00:51:08] You seem like your breath's been taken away by the skill and ingenuity of that transition. === Joe Rogan Nazi Crush (05:05) === [00:51:14] Truly, there was nothing I could add. [00:51:16] It was a perfect, perfect work. [00:51:19] I'm the best at this. [00:51:20] I'm the best around. [00:51:21] Nothing's going to ever keep me down. [00:51:23] Yeah. [00:51:23] I'm not going to put a bumper sticker on a Rolls-Royce. [00:51:26] You know what I'm saying? [00:51:28] The bumper sticker is going to say, I got my genitals molded. [00:51:32] Yeah. [00:51:32] Dr. I want that bumper sticker. [00:51:37] It's actually a hologram. [00:51:38] And like, when you look at it one way, I am wearing a skirt. [00:51:43] And when you look at it the other way, you see my vagina mold. [00:51:46] It's really cool. [00:51:48] Man, that put a lot of thought into it. [00:51:52] That's quite a bumper sticker. [00:51:54] And, you know, I have thought for a long time that what traffic is missing is explicitly pornographic bumper stickers. [00:52:01] Like, if truck nuts are okay, why isn't that? [00:52:04] Seriously, it's actually a lot more pleasant to look at than truck nuts. [00:52:08] Yes, yes. [00:52:09] Nobody actually likes truck nuts. [00:52:12] No one. [00:52:13] All right. [00:52:13] Well, this has been a long digression. [00:52:16] Yeah. [00:52:18] Products. === Pedophilia On YouTube (15:14) === [00:56:55] We're back. [00:56:56] Boy howdy, what a day we've had today. [00:57:03] So, at this point, YouTube's role in radicalizing a whole generation of fascists is very well documented. [00:57:09] But YouTube is sort of stuck when it comes to admitting that they've ever done anything wrong. [00:57:14] 70% of their traffic comes from the recommendation engine. [00:57:17] It is the single thing that drives the platform's profitability more than anything else. [00:57:22] Back in March, the New York Times interviewed Neil Mohan, YouTube's chief product officer. [00:57:27] His responses were pretty characteristic of what the company says when confronted about their little Nazi issue. [00:57:32] The interviewer asked, I hear a lot about the rabbit hole effect where you start watching one video and you get nudged with recommendations towards a slightly more extreme video and so on. [00:57:41] And all of a sudden, you're watching something really extreme. [00:57:43] Is that a real phenomenon? [00:57:45] To which Neil responded, Yeah, so I've heard this before, and I think that there are some myths that go into that description that I think it would be useful for me to debunk. [00:57:53] The first is this notion that it's somehow in our interest for the recommendations to shift people in this direction because it boosts watch time or what have you. [00:57:59] I can say categorically, that's not the way our recommendation systems are designed. [00:58:02] Watch time is one signal that they use, but they have a number of other engagement and satisfaction signals from the user.
[00:58:08] It is not the case that extreme content drives a higher version of engagement or watch time than content of other types. [00:58:14] So he basically has a blanket denial there. [00:58:17] A little bit of a double-decker. [00:58:18] That's a huge, just like blanket no. [00:58:20] We don't do that. [00:58:22] No, that doesn't happen. [00:58:23] Doesn't happen. [00:58:24] And he goes on, it's a bit of a rambling answer. [00:58:27] And later in his answer, Mohan called the idea of a YouTube radicalization rabbit hole purely a myth. [00:58:32] The interviewer, to his credit, presses Neal Mohan on this a bit more later and asks if he's really sure he wants to make that claim. [00:58:39] Mohan responds, what I'm saying is that when a video is watched, you will see a number of videos that are recommended. [00:58:45] Some of those videos might have the perception of skewing in one direction or, you know, call it more extreme. [00:58:50] There are other videos that skew in the opposite direction. [00:58:52] And again, our systems are not doing this because that's not a signal that feeds into the recommendations. [00:58:57] That's just the observation that you see in the panel. [00:58:59] I'm not saying that a user couldn't click on one of those videos that are quote-unquote more extreme, consume that, and then get another set of recommendations that sort of keep moving in one path or the other. [00:59:08] All I'm saying is that it's not inevitable. [00:59:10] So because everybody doesn't choose to watch more extreme videos, there's no YouTube radicalization rabbit hole. [00:59:16] Yeah, and also kind of acknowledging there that it does happen. [00:59:19] Yeah. [00:59:20] Nothing is inevitable. [00:59:21] I mean, except for like death and whatever. [00:59:24] You know, just to be like, yeah. [00:59:26] Yeah. [00:59:26] No, it's not. [00:59:27] A meteorite could hit your house before you get to click on the video that turns you into a Nazi. 
[00:59:31] So of course it's not inevitable. [00:59:33] Yeah. [00:59:33] Just be like, "it's not 100% true" is not a good answer when someone's... [00:59:39] A percentage of our users die of heart disease before the next video plays. [00:59:43] Yeah, pretty high percentage of people. [00:59:47] Yeah, that's not what we're asking, Neal. [00:59:49] Now, the reality, of course, is that Neal Mohan is, shall we say, not entirely honest. [00:59:56] I think I wrote a damn liar in the original draft, but I'm not sure where the legally actionable line is. [01:00:01] Pocket of big video. [01:00:03] Yeah, a pocket of big, big video. [01:00:05] For just one example, Jonathan Albright, a Columbia University researcher, recently carried out a test where he seeded a YouTube account with a search for the phrase crisis actor. [01:00:14] The up next recommendation led him to 9,000 different videos promoting crisis actor conspiracy theories. [01:00:21] So again, someone who heard the term and wanted to search for factual information about the conspiracy theory would be directed by YouTube to hundreds of hours of conspiratorial nonsense about how the Sandy Hook shooting was fake. [01:00:33] Now, I'm going to guess you remember last year's mass shooting at Marjory Stoneman Douglas High School? [01:00:38] By the Wednesday after that shooting, less than a week after all of those kids died, the number one trending video on YouTube was David Hogg the Actor, which is obviously a video accusing one of the kids who's been most prominent of being a crisis actor. [01:00:54] According to a report from Ad Age, it and many others claimed to expose Hogg as a crisis actor. [01:00:58] YouTube eventually removed that particular video, but not before it amassed nearly 200,000 views. [01:01:03] Other videos targeting Hogg remain up. [01:01:05] One that appears to show Hogg struggling with his words during an interview after the shooting suggests it's because he forgot his lines. 
[01:01:11] YouTube auto-suggests certain search terms that would lead people directly to the clips. [01:01:15] If a person typed David Hogg in YouTube's search bar midday Wednesday, for example, some of the suggestions would include exposed and crisis actor. [01:01:24] When reporters asked YouTube how that video made it to the top of their coveted trending chart, YouTube explained that, since the video included edited clips from a CNN report, its algorithm had believed that it was a legitimate piece of journalism and allowed it to spread as an authoritative news report would. [01:01:41] So again, that's their justification. [01:01:44] Like, we couldn't have known that this was fake news because it was fake news that used clips from a legitimate news site. [01:01:51] So like, we're clearly not at fault here for the fact that we let a robot select all these things and no human being watched the top trending video on the site at the moment to see if like it was something terrible. [01:02:02] Also, that's bullshit. [01:02:03] Yeah. [01:02:04] Yeah, that's total bullshit. [01:02:06] Now, Nazi propaganda and conspiracy theories aren't the only things that spread like wildfire on YouTube, of course. [01:02:13] Pedophilia is also a big thing on the site. [01:02:16] Yeah, yeah, this is where we get to that part of the story. [01:02:19] So this broke in February of 2019 when a YouTuber named Matt Watson put together a video exposing how rings of pedophiles had infested the comments sections for various videos featuring small children and used them to communicate and trade child porn. [01:02:33] Now, this report went very viral and immediately prompted several major advertisers to pull their money from YouTube. [01:02:39] The company released a statement to their worried advertisers informing them that they had blanket banned comments for millions of videos, basically removing comments from any videos uploaded by or about young children. 
[01:02:49] I'd like to quote from NPR's report on Watson's video. [01:02:52] Watson describes how he says the pedophile ring works. [01:02:55] YouTube visitors gather on videos of young girls doing innocuous things, such as putting on their makeup, demonstrating gymnastics moves, or playing Twister. [01:03:01] In the comments section, people would then post timestamps that link to frames in that video that appear to sexualize the children. [01:03:07] YouTube's algorithms would then recommend other videos also frequented by pedophiles. [01:03:11] Once you enter into this wormhole, there is now no other content available, Watson said. [01:03:16] So it might seem at first like this is purely an accident on YouTube's part. [01:03:23] Like, cunning pedophiles figured out that they could just find videos of young kids doing handstands and stuff and use that as porn and trade it with each other, right? [01:03:32] Which would not necessarily be like, how could we have predicted this? [01:03:36] It's just these people decided to use innocent videos for a nefarious purpose. [01:03:40] But that's not what happened. [01:03:42] Or at least that's not all of what happened. [01:03:46] So in June, three researchers from Harvard's Berkman Klein Center for Internet and Society started combing through YouTube's recommendations for sexually themed videos. [01:03:56] They found that starting down this rabbit hole led them inevitably to sexual videos that placed greater emphasis on youth. [01:04:04] So again, that's maybe not super surprising. [01:04:06] You start looking for sexy videos, you click on one, and then the next video, the woman in it is going to be a younger woman and a younger woman and a younger woman. [01:04:15] But then at a certain point, the videos suggested flipped very suddenly until, and I'm going to quote the researchers here, YouTube would suddenly begin recommending videos of young and partially clothed children. 
[01:04:29] So YouTube would take a person who's like just looking for adults, like videos of like an exotic dancer dancing or whatever, like videos of attractive young women dancing. [01:04:40] And then YouTube would start showing them videos of children doing like gymnastics routines and stuff. [01:04:45] Like that's the algorithm being like, I bet you'll like child porn. [01:04:50] Like that's literally what's happening here, which I didn't realize when I first heard the story that like that's YouTube. [01:04:56] That's not just pedophiles using YouTube in a sleazy way because pedophiles will always find a way to ruin anything. [01:05:02] That's YouTube crafting new pedophiles. [01:05:07] Yeah, it's a system that's essentially training you. [01:05:10] Yeah. [01:05:11] I wonder if it's like that with violence too. [01:05:13] If you look up a violent thing, if it keeps recommending more violence, because that seems like, hey, like that would happen. [01:05:20] When I worked for Google, like the sensitive categories, the restricted categories are, you know, violence, hate, gambling, porn, child porn. [01:05:31] I think. [01:05:31] There's even a messed up thing about that because one of the problems that like people who document war crimes in Syria have had is YouTube blanket banning their videos because of violence. [01:05:41] And then like you have evidence of a war crime and then it's wiped off of the internet forever because YouTube doesn't realize that this isn't like violence porn. [01:05:49] This is somebody trying to document a war crime. [01:05:53] It's made it really hard to do that kind of research. [01:05:58] Their response is always so terrible. [01:06:02] Anyway, the New York Times reported, quote, so a user who watches erotic videos might be recommended videos of women who become conspicuously younger and then women who pose provocatively in children's clothes. 
[01:06:11] Eventually, some users might be presented with videos of girls as young as five or six wearing bathing suits or getting dressed or doing a split. [01:06:18] So yeah, in its eternal quest to increase time spent on site, YouTube's algorithm essentially radicalized people towards pedophilia. [01:06:26] And to make matters worse, it wasn't just picking sexy videos that people had uploaded with the intent of them being sexy. [01:06:35] Because it was sending children's videos to people, it started grabbing totally normal home videos of little kids and presenting those videos to horny adults who were on YouTube to masturbate. [01:06:46] The report suggests it was learning from users who sought out revealing or suggestive images of children. [01:06:52] One parent the Times talked with related in horror that a video of her 10-year-old girl wearing a bathing suit had reached 400,000 views. [01:07:00] So like parents start to realize like, wait a minute, I uploaded this video to show her grandma. [01:07:05] There's supposed to be like nine views on this thing. [01:07:07] Why have 400,000 people watched this video of my 10-year-old? [01:07:11] And it's because YouTube is trying to provide people with porn because it knows that'll keep them on the site longer. [01:07:18] That's fucking wild. [01:07:20] Yeah. [01:07:21] After this report came out, YouTube published an apologetic blog post promising that responsibility is our number one priority and chief among our areas of focus is protecting minors and families. [01:07:32] But of course that's not true. [01:07:34] Increasing the amount of time spent on site is YouTube's chief priority. [01:07:38] Or rather, making money is YouTube's chief priority. [01:07:40] And if increasing the amount of time spent on site is the best way to make money, then YouTube will prioritize that over all other things, including the safety of children. 
[01:07:50] Now, there are ways YouTube could reduce the danger their site presents to the world. [01:07:54] Ways they could catch stuff like propaganda accusing a mass shooting victim of being an actor, or people's home movies being accidentally turned into child porn, even if they're not going to stop hosting literal fascist propaganda. [01:08:05] Content moderators could add human eyes and human oversight to an AI algorithm that is clearly sociopathic. [01:08:12] And earlier this year, YouTube did announce that they were expanding their content moderation team to 10,000 people, which sounds great. [01:08:19] Sounds like a huge number of people. [01:08:21] Only, that's not as good as it seems. [01:08:23] The Wall Street Journal investigated and found out that a huge number of these moderators, perhaps the majority, worked in cubicle farms in India and the Philippines, which would be fine if they were moderating content posted from India or the Philippines. [01:08:36] But of course, these people were also going to be tasked with monitoring American political content. [01:08:41] Now, Alphabet, née Google, does not disclose how much money YouTube makes. [01:08:47] Estimates suggest that it's around $10 billion a year and may be increasing by as much as 40% per year. [01:08:54] Math is not my strong suit. [01:08:55] I'm not an algorithm. [01:08:56] But I did a little bit of math and I calculated that if Google took a billion dollars of their profit and hired new content moderators, paying them $50,000 a year salaries, which I'm going to guess is more than most of these moderators get, they could afford to hire 20,000 new moderators, tripling their current capacity. [01:09:14] Realistically, they could hire 50 or 60,000 more moderators and still be making billions of dollars a year while running one of the most profitable services on the internet. [01:09:23] But doing that would mean less profit for Google shareholders. 
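[Transcript note: the back-of-the-envelope hiring math above can be spelled out as a quick sketch. All numbers below are the episode's own estimates, not audited figures.]

```python
# Sketch of the episode's back-of-the-envelope moderator math.
# Every figure here is the episode's own estimate, not an audited number.
annual_budget = 1_000_000_000   # hypothetical $1B carved out of ~$10B/yr revenue
salary = 50_000                 # assumed annual salary per new moderator
current_moderators = 10_000     # the moderation team size YouTube announced

new_hires = annual_budget // salary
total = current_moderators + new_hires

print(new_hires)                   # 20000 new moderators
print(total / current_moderators)  # 3.0 -- "tripling their current capacity"
```

The division checks out: $1 billion buys 20,000 salaries at $50,000 each, and added to the announced 10,000 that is exactly three times the current team.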
[01:09:27] It would mean less money for people like Neal Mohan, the man who has been YouTube's chief product officer since 2011. [01:09:33] The man who has overseen nearly all the algorithmic changes we are talking about today. [01:09:37] The man who sat down with the New York Times and denied YouTube had a problem with leading people down rabbit holes that radicalized them in dangerous ways. [01:09:45] I was kind of curious as to how well compensated Mr. Mohan is. [01:09:49] So I googled Neal Mohan net worth. [01:09:52] The first response was a Business Insider article: Google paid this man $100 million. [01:09:57] Here's his story. [01:09:59] So that's cool. [01:10:00] Yeah. [01:10:02] Oof. [01:10:02] And I can tell you from being a moderator, I worked on a team where everybody did what I did in a different language. [01:10:10] So I did this in Russian and next to me was someone who was doing it in Chinese and Turkish and all of the languages. [01:10:17] I mean, not all, but a significant number. [01:10:20] Yeah. [01:10:21] And I can tell you that we were hired as contractors for only a year. [01:10:27] Very rarely would you ever be doing a second year because they didn't want to pay you the full benefits. [01:10:33] Right. [01:10:33] Like, you know, you don't get health insurance and whatever, all the perks that you would get from being a full-time Google employee. [01:10:40] And the thing about what we did is you got exposed to a lot of fucked up stuff. [01:10:45] Like, you know, the videos and stuff that I've seen are like some of the worst the internet has to offer, like beheadings or someone stomping a kitten to death in high heels, like crazy shit. [01:10:57] And it would really make you sick. [01:10:58] And they like give you free food at Google and you like wouldn't be able to eat sometimes because you would be so grossed out. [01:11:04] And it's not like they, that's why you're only there for a year also. 
[01:11:08] Not just that you wouldn't be able to get full benefits, but also because they are okay with wasting your mental and physical energies and then letting you go and then just cycling through new people every year rather than investing, you know, in employees that are full-time, making sure they have, you know, access to mental health care and stuff like that. === Whistleblowers And Free Food (10:47) === [01:11:34] And, you know, making that job be something that they take more seriously considering how important it is. [01:11:43] Well, and that's part of what's really messed up is that like it's fucking Google. [01:11:47] Like if you go into the people, people who are like actually coding these algorithms and stuff, I guarantee you those people have on-site therapists they can visit. [01:11:56] They have gyms at work. [01:11:57] They get their lunches catered. [01:11:59] I mean, we all worked in the same building, but like I can't, you know, I couldn't go get a free massage during, it's like, you know, you have a CrossFit trainer on site and shit like that. [01:12:09] For sure, you get incredible perks. [01:12:10] And the whole point is what I thought was kind of ironic about what we were saying is like the whole thing is to try to make you stay on YouTube. [01:12:18] But when you work for a company like Google, their job is to try to make you stay at Google. [01:12:24] So, you know, the reason you're getting all these benefits and stuff and like free food and gym and massage and whatever is because they want you to stay and work forever. [01:12:34] But they don't want you to like what's messed up to me. [01:12:38] Exactly. [01:12:38] Like, and that's a very telling thing from Google's perspective because they are saying that the people who are coding these algorithms that increase the amount of time people spend on site, those people are important to us. [01:12:51] And so we will do whatever it takes to retain these people. 
[01:12:54] But the people who make sure that we aren't creating new pedophiles while we make money, the people who are responsible for making sure that Nazi propaganda isn't served up to like influenceable young children via our service, those people aren't valuable to us because we don't care about that. [01:13:11] So we're not going to offer them health care. [01:13:14] Like if Google really was an ethical company and if YouTube cared about its impact on the world, someone whose job, there's nothing less technical or less valuable about what you're doing, being able to speak another language fluently, being able to understand if content propagating on their site is toxic or not, that's a very difficult, very technical task. [01:13:36] If they cared about the impact they had on the world, the people doing that job would be well paid and would have benefits and would be seen as a crucial aspect of the company. [01:13:46] But instead, it's sort of like, if we don't have someone doing this job, we'll get yelled at. [01:13:50] So we're going to do the minimum necessary. [01:13:53] And we're going to have most of the people doing that job be working at a fucking cube farm in India, even though we're expecting them to moderate American content and to understand all of our cultural nuances and whether or not something's toxic. [01:14:04] Like, that's so fucked up. [01:14:08] And also, considering the fact that ads is the reason that they hire content moderators, not because they care about the content necessarily, it's that it would be a huge mistake if, say, an ad for Huggies was served on a diaper fetish website. [01:14:29] You know, they want something in place where the page knows, the algorithm knows not to serve that, even though it seems like a good match because the word diaper is repeated and blah, blah, blah. [01:14:40] You know what I mean? 
[01:14:41] So it's really less about, it's more about keeping the advertisers happy and making the most money than it is about ensuring that the internet is a less fucked up place. [01:14:55] This gets to one of the things, like when I get in arguments with people about the nature of capitalism and what's wrong with the kind of capitalism that we have in this country, I think a lot of people who like just sort of reject anti-capitalist arguments out of hand do it because they think that you're saying, oh, it's just wrong to make money. [01:15:14] It's wrong to have a business that like makes a profit. [01:15:17] The issue isn't that. [01:15:18] The issue, like this company, Google could be making billions of dollars a year and still be one of the most profitable sites of its type, still make a huge amount of money and have three times as many people doing content moderation and all those people have health care. [01:15:37] But by cutting corners on that part of it, because it doesn't make them more money, it just makes the world better, they make more money. [01:15:45] And it's worth more to them to increase the value of a few hundred people's stock than to ensure that there aren't thousands of additional people masturbating to children. [01:15:55] Like that's what I have an issue with with capitalism. [01:15:59] Like you can make a profit without also selling your fucking soul. [01:16:04] Yeah, we could have YouTube. [01:16:06] There's nothing that should be banned. [01:16:08] Like I can get recommended new musicians that I like. [01:16:14] We can all watch videos to masturbate to without more people being turned into pedophiles and Nazis. [01:16:20] That's not a necessary part of this. [01:16:22] Like it's just because corners are being cut. [01:16:25] Yeah, it just shows what the values of our society are. 
[01:16:33] Yeah, they've literally said like $3 billion a year is worth more to us than God knows how many children being molested, than fucking Heather Heyer getting run down at Charlottesville, than there being Nazis marching through the streets and advocating the extermination of black people, of LGBT people, of whatever. [01:16:53] Which is again part of why so many Google employees are now speaking out, horrified, because like they're not monsters. [01:16:59] They don't want to live in this world any more than the rest of us do. [01:17:01] They just didn't realize what was happening because they were busy focusing on the code and the free massages. [01:17:07] And then like the rest of us, they woke up to a world full of Nazis and pedophiles. [01:17:18] I feel like you're looking at me to make a joke now. [01:17:21] And I feel like, I don't know, this got real serious. [01:17:25] I'm more just tired. [01:17:27] We're all tired. [01:17:30] It's a very tiring world we live in. [01:17:33] Yeah. [01:17:35] Well, Sophia, that's the episode. [01:17:37] Yay! [01:17:38] You want to plug your pluggables? [01:17:40] Fuck. [01:17:40] I mean, not really. [01:17:41] Just want everyone to go and get a hug, you know? [01:17:44] May everybody go get a hug. [01:17:46] But Jesus. [01:17:47] Yeah, but also, I am to be found on the sites we hate. [01:17:54] You know, what a fun thing to plug. [01:17:57] I'm available on Twitter and Instagram at TheSophia. [01:18:02] Huge fan of the sites we hate. [01:18:06] And I have a podcast about love and sexuality around the world that I co-host with Courtney Kocak. [01:18:10] It's called Private Parts Unknown. [01:18:12] So check that out. [01:18:14] Check out Private Parts Unknown. [01:18:17] I'm also on the sites we hate. [01:18:20] BehindtheBastards.com is not a site that we hate, but it's where you can find the sources for this episode. [01:18:25] IwriteOK is where you can find me on Twitter. 
[01:18:28] You can find us on Twitter and Instagram at BastardsPod. [01:18:32] And you can buy t-shirts on teepublic.com, Behind the Bastards. [01:18:38] Yep, that's the episode. [01:18:40] Go find YouTube's headquarters and yell at them. [01:18:46] Scream at their sign. [01:18:47] Take pictures of their company and wave your fists. [01:18:51] If you work at YouTube, quit. [01:18:56] It's not worth it. [01:18:57] I mean, the more whistleblowers, the better. [01:18:59] Yeah, quit and go talk to the New York Times or some fucking buddy. [01:19:05] Yeah. [01:19:06] Also, one random thing that's positive is if you want, there's a lot of videos of trains on YouTube that I've discovered of just trains passing by. [01:19:16] Trains and ski fails. [01:19:17] Yeah, I think you will find it very soothing. [01:19:20] First, they'll be like, what the fuck? [01:19:21] A video of a train that's 12 minutes long? [01:19:24] Guess what? [01:19:25] That'll soothe you. [01:19:26] Soothe your ass. [01:19:28] Or if you're more like me, watch videos of people skiing and then failing to ski. [01:19:33] I mean, that's if you want to laugh. [01:19:36] Yeah. [01:19:37] Yeah. [01:19:37] I mean, I feel like YouTube's algorithm is going to take you from train videos to train fails really fast. [01:19:45] Oh boy. [01:19:47] Yeah. [01:19:47] Shit. [01:19:48] I don't know. [01:19:49] Now that I know about the rabbit hole, I'm afraid that there's a way to connect trains to children that I have not thought of. [01:19:58] Oh, no. [01:19:59] I'm not even going to make any further comments on that. [01:20:01] We should get out of here.