The Megyn Kelly Show - 20240105_dems-dark-brandon-scare-tactics-and-reality-of-ai- Aired: 2024-01-05 Duration: 01:37:29 === Lawmakers Crossed The Aisle (02:43) === [00:00:20] Welcome to the Megyn Kelly Show, live on SiriusXM Channel 111 every weekday at noon. [00:00:47] And I can feel their relief. [00:00:48] I feel second-hand relief. [00:00:49] Although I got to tell you, when I was off for two weeks, I was missing you guys. [00:00:53] I was missing you. [00:00:54] I was missing my team and I was missing the news and I was missing you and doing the show. [00:00:58] So, you know, vacation's good in the right amount. [00:01:01] Meantime, we begin the show with sanity prevailing in New Hampshire. [00:01:06] God bless the lawmakers in New Hampshire. [00:01:10] This actually gave me hope for the future of our country. [00:01:13] 12 Democratic lawmakers crossed the aisle to side with Republicans, two of whom voted the wrong way. [00:01:20] The Republicans were not unanimous. [00:01:22] Shame on you two. [00:01:24] Last night to pass a bill banning so-called gender-affirming surgeries for minors. [00:01:30] That's not what they are. [00:01:31] They're not gender-affirming. [00:01:33] They cut off children's body parts, their genitals, before they can even legally smoke a cigarette, because they suffer from some gender confusion. [00:01:44] What sane society allows this? [00:01:49] And yet, they passed a ban on it in the New Hampshire House. [00:01:53] Still has to go to the Senate. [00:01:54] We believe they have the votes. [00:01:55] And then we'll see what Sununu does in New Hampshire. [00:01:58] This is Chris Christie's big state. [00:02:00] This is where he's polling between 10 and 11%. [00:02:03] He's against these bans. [00:02:05] Pay attention, New Hampshire. [00:02:07] Your favorite at 10% is against you on this issue. [00:02:13] If New Hampshire passes this ban, it'll be the 21st state in the union to do so. [00:02:18] And these bans should be unanimous. 
[00:02:21] They should be in all 50 states. [00:02:23] Unfortunately, they didn't go so far as to include a ban on puberty blockers and cross-sex hormones, which sterilize children. [00:02:32] I mean, do they know that? [00:02:34] Are they paying attention? [00:02:35] Because in what world is it okay to sterilize an 11-year-old? [00:02:39] Okay, one step at a time. [00:02:43] I have nothing but praise today for those Dems who crossed the aisle to make this happen. [00:02:47] Wait until you hear what happened to one progressive lawmaker when he dared to stand firm against the mob. [00:02:54] Plus, Vivek Ramaswamy, once again, puts on a masterclass on how to deal with the media. === Joining Up In The Northeast (02:53) === [00:03:04] Fiken is a super-simple accounting program for businesses. [00:03:06] But did you know that you can also start your own business with Fiken? [00:03:10] Do as thousands of others have and register a limited company (AS) or sole proprietorship safely and easily by filling out a form at Fiken.no. [00:03:17] We help you all the way to a fully registered business. [00:03:19] You don't need to be a Fiken customer already, and you choose for yourself whether to use our accounting program afterward. [00:03:25] The service costs nothing extra. [00:03:28] Fiken: start your own business, super simple. [00:03:38] And I'm Right over on First TV. [00:03:40] Jesse, welcome back to the show. [00:03:41] Great to have you. [00:03:43] It is great to be here, Megyn. [00:03:44] Although I'm going to push back on you on something there. [00:03:47] You said it's great that the kids are almost out of school on Friday. [00:03:50] I like when my kids are in school, Megyn. [00:03:52] My house is so quiet. [00:03:53] My wife and I were talking this morning, and there was nobody interrupting. [00:03:56] There was no screaming. [00:03:57] There were no messes anywhere. [00:03:59] It was just like a real conversation. [00:04:01] I love when my kids are in school. 
[00:04:04] You know, you have a good point. [00:04:05] I was just talking about this with friends. [00:04:06] In the summer, you know, when they're off full time, let's just say it's hard to find private adult time. [00:04:12] And when you have, like, my youngest is 10, and you get this on the door, like, what's going on in there? [00:04:19] What are you doing? [00:04:20] There's a drop lock. [00:04:26] So I take your point. [00:04:28] Our kids have, our kids have been scolded enough to know if the door is closed, just walk away or else dad's going to yell. [00:04:37] Walk away. [00:04:38] Okay, so maybe we might have another school day on Monday, by the way, because snow day, because we're expecting snow here in the Northeast. [00:04:44] Remember snow? [00:04:45] Remember how we used to get snow? [00:04:48] I am going to join you in the Northeast actually next week. [00:04:51] I'll not join you, but I'll be in the Northeast next week, Megyn. [00:04:53] I'm used to living in Houston now. [00:04:55] So basically, I'm Charmin soft when it comes to the cold. [00:04:58] I used to be, I grew up in Montana. [00:05:00] I used to be all about that life. [00:05:01] And now the second it hits 40, I'm throwing on winter clothes and hats. [00:05:06] It's, it's my, my wife and me bundled up. [00:05:09] So I'll be dying up there. [00:05:10] I have to say, like, I really miss snow. [00:05:13] I, as a kid who spent my first 10 years in Syracuse and then moved to Albany, then went back to Syracuse for college. [00:05:20] I miss snow. [00:05:21] I used to have snows, like winters where I couldn't go outside because the snow was over my head. [00:05:25] You know, my parents had to be there holding my hand so that I could breathe. [00:05:30] Now it's like every snowstorm gets reduced from eight inches to three inches and then comes out at a half an inch or passes you by altogether. [00:05:39] It just feels so lame now. 
[00:05:41] Megyn, you know why you miss snow and I don't miss snow? [00:05:45] Because you were a daughter, and daughters get spoiled by their parents. [00:05:48] Not that I'm saying you were spoiled, but daughters don't get woken up at 5 a.m. to go shovel the driveway like my dad did with me in Montana before school, 5 a.m. === Strategy For Black Women Voters (15:18) === [00:05:57] He'd walk in and he'd laugh. [00:05:58] He'd hand me a snow shovel and say, better bundle up. [00:06:01] It's cold out there. [00:06:02] Hours just shoveling the snow. [00:06:04] That didn't happen to young Megyn Kelly. [00:06:06] That's why you miss it. [00:06:06] And I don't. [00:06:08] You're right. [00:06:08] Never, never once. [00:06:10] I've never had to do it. [00:06:11] And now it's fun because I have a little, like, a walkway to get to my studio. [00:06:16] And we even had it heated because Doug was like, we'll make the kids. [00:06:20] The kids will have to shovel the snow off of that. [00:06:23] I was like, they're never going to do it. [00:06:26] It's going to be preschool. [00:06:27] They're not going to go out there. [00:06:29] Like, I am not mean enough to them. [00:06:31] So I'm like, we're putting the heaters underneath the tiles so I can get out there. [00:06:35] Because you're right. [00:06:35] We're raising them soft these days. [00:06:37] That's right. [00:06:39] All right. [00:06:40] Well, anyway, we may be having snow this weekend, and my fingers are crossed that it actually happens. [00:06:43] Something else that's happening today going into the weekend is not snow, but the opposite: darkness. [00:06:49] Not the light of the beautiful snowflakes, but the darkness of Brandon, who returns to the stump today as Joe Biden officially starts doing 2024 campaign events and is reportedly going to deliver the message that democracy depends on him, on his reelection. [00:07:08] And now we see the strategy unfolding. 
[00:07:11] The New York Times did a long piece on it this week, and we're going to hear more of it today. [00:07:15] The Biden strategy of running on Bidenomics and his record is failing. [00:07:19] You've seen Trump is leading him in all seven of the swing states among likely voters, among registered voters. [00:07:26] He's just crushing. [00:07:27] So they've got to do something different. [00:07:29] And so they're going to go back, Jesse, to a strategy that, let's face it, has worked for the Democrats in '20 and '22, and even to some extent in the '23 special elections, which is make it all about Trump, Trump, democracy dies in darkness, that kind of thing. [00:07:45] What do you make of it? [00:07:47] I think, as much as I despise it and despise him, it is brilliant politics. [00:07:51] It's very effective. [00:07:53] It was earlier this week, Megyn: Brandon Johnson, that idiot mayor of Chicago, that commie piece of trash who's just wrecking the city. [00:08:00] He gave a speech, and he's very unpopular there, including in the areas that elected him, because the city's full of illegals, it's full of crime now. [00:08:07] Everyone's looking around. [00:08:08] Oh my gosh, how could this happen? [00:08:09] Anyway, so he gets up and he gives a speech, Megyn, and he starts talking about the Confederacy. [00:08:14] Well, this is Jefferson Davis and the Confederacy. [00:08:17] And everyone kind of rolled their eyes and mocked him. [00:08:19] But it was brilliant. [00:08:20] Yeah, you hate me, you hate me, you hate me. [00:08:22] But look over there. [00:08:23] They're even worse. [00:08:24] There's the demons over there. [00:08:26] It's really all Joe Biden has. [00:08:28] He's going to do the exact same thing with Trump. [00:08:30] There's no record to run on. [00:08:32] Joe Biden is not popular at all, but they understand, because of media poisoning and messaging, which they've been very effective at, [00:08:39] Donald Trump is not popular either. 
[00:08:41] And so they're going to make sure the election is about Trump and not Joe Biden. [00:08:45] And I guess we'll see what happens. [00:08:48] Yeah. [00:08:48] I mean, whenever they talk about Trump, Biden's numbers do go up. [00:08:52] And I don't know that Trump's exactly go down, but Democrats and independents get reminded of the drama that's around Trump that they didn't really enjoy. [00:08:59] Maybe they enjoyed his policies, but they didn't enjoy his behavior around J6. [00:09:04] And just when he was in office, there was a lot of drama, where you kind of want to be thinking about your own life, not the life of the guy sitting in the Oval, right? [00:09:13] Like, that's kind of the ideal government. That's one benefit of Joe Biden: you don't hear from him that often. [00:09:18] He's the shrinking executive because he has to be. [00:09:21] Sadly, he's actually running a lot of crazy-ass left-wing policies out of that Oval, thanks to the people around him. [00:09:27] But in any event, I do, I agree with you. [00:09:29] I think it's a smart strategy; whether it's going to overcome the deficit that already exists, [00:09:35] I don't know. [00:09:35] We've got, what, 11 months now? [00:09:37] And those are going to include four criminal trials for Trump and so on and so forth. [00:09:40] So, I mean, how do you see it now? [00:09:42] A lot of Republicans I talked to out in Montana over the weekend, over the vacation, they're feeling bullish. [00:09:47] They're feeling like Trump's going to get it and Trump's going to win. [00:09:51] Yeah, I don't think I echo that. [00:09:54] I'm not saying he's not, but I definitely don't echo the optimism, because this is what I see out there right now, Megyn. [00:10:01] There's so many parts of this evil system we have in this country that are going to go all in to try to make sure Donald Trump can never be elected president again. 
[00:10:10] People talk about the ballot, you know, the removing of Trump from the ballot in Colorado. [00:10:14] They're removing him from the ballot in Maine. [00:10:16] We have the indictments. [00:10:16] We have this and that. [00:10:17] And people ask me all the time, hey, Jesse, what are they planning? [00:10:20] What are they planning? [00:10:21] What are they planning? [00:10:22] The answer is everything. [00:10:23] Everything. [00:10:24] Every different commie in this system, from secretaries of state to the commie street trash to the senators. [00:10:30] Doesn't matter which one they are. [00:10:32] They're all going to go all in with whatever it takes to try to smear Trump, stop Trump, everything else. [00:10:40] Are they going to be successful? [00:10:41] I don't have any idea. [00:10:42] I don't have a crystal ball, but they were successful last time, Megyn, when Donald Trump was president of the United States of America. [00:10:50] That's before he was convicted of any felonies, which he will be, as unjust and stupid as that is. [00:10:55] He's going to be convicted of felonies. [00:10:57] The media is going to make him out to be a disastrous insurrectionist felon. [00:11:02] And is that too much for the Norms and Normas of this country to vote for? [00:11:07] I don't have the answer to that question, but I don't feel near as confident as everyone else right now. [00:11:12] Honestly, I roll my eyes, Megyn, when I see people on the right talk about the poll numbers. [00:11:17] Look at the polls. [00:11:18] The polls look great. [00:11:19] The right does this thing where the left will make us eat 10 pounds of cow crap. [00:11:25] And the second they hand us a mint to wash the taste out of our mouths, we celebrate it like it's some kind of a victory. [00:11:30] I remember when Trump got arrested in Atlanta and they put his mugshot up there. [00:11:34] They arrested the former president over ridiculous charges, and they're going to send him to prison for that. 
[00:11:39] He's going to go to state prison in Georgia if he gets convicted of that. [00:11:42] No ifs, ands, or buts, because of the appeals process there. [00:11:45] And the right spent an entire day celebrating how cool the mugshot was. [00:11:49] We're not even playing the same game here. [00:11:51] They're all in on their game. [00:11:54] They're all in. [00:11:55] They are going to do everything they can to destroy him, destroy his people and everything else. [00:11:59] And we still, we still pretend like it's a game on our side. [00:12:03] Wow, did you see the new AT poll? [00:12:05] I find it to be childlike and ridiculous, to be honest. [00:12:08] Well, and one bad thing about, you know, those polls is, you know, the polls looked pretty good before those 2022 midterms too. [00:12:15] And we all know how that worked out, without the last-minute conviction of any of those candidates and so on. [00:12:22] So you're right. [00:12:23] It's fraught with peril. [00:12:24] People ask me all the time, same thing, like, what's going to happen? [00:12:27] And I think it's, like, obvious at this point that Trump's going to get the nomination. [00:12:31] Something absolutely catastrophic would have to happen for him not to get the nomination. [00:12:35] But am I that confident he's actually going to win? [00:12:37] I'm not, because the Democrats are very good at what they do and they're very disciplined. [00:12:43] And, you know, I was talking to somebody today, like, friends with a diehard Democrat. [00:12:48] And that Democrat was saying, what do you mean? [00:12:50] He's not too old. [00:12:51] What do you mean? [00:12:51] He's totally competent. [00:12:52] What do you mean, what's happening at the border? [00:12:54] So there are a lot of these Democrats who are maybe well-educated but low-information voters, who don't have the problems with Joe Biden that you might or I might, and are fully prepared to rush to the polls and to get the caravans to go vote for him. 
[00:13:09] And God knows what else will happen on election day. [00:13:12] I don't know. [00:13:13] It's dark. [00:13:14] Okay. [00:13:14] There are other candidates in the race. [00:13:16] He's not. [00:13:17] Yeah, go ahead. [00:13:18] No, no, sorry. [00:13:19] It's just that reminded me of a story. [00:13:20] Speaking of Chicago, sorry to interrupt, but what you just said reminded me of a story. [00:13:24] I was on vacation myself a couple of weeks ago and ended up at a big table, and there were a couple of liberal white women there. [00:13:29] You know, the most evil creatures on the planet. [00:13:31] And they were from Chicago, and they were bragging. [00:13:34] These are rich white women, and they were bragging about how they voted for Brandon Johnson. [00:13:37] And one of them called him BJ. [00:13:38] Like they were buddies. [00:13:39] That's all. [00:13:40] I love BJ. [00:13:40] We love BJ. [00:13:41] And other people at the table, it wasn't really a political talk, but they started talking about the crime situation in Chicago, how bad it was, people were getting shot and robbed. [00:13:49] And verbatim, Megyn, on my life, on my life, cross my heart and hope to die. [00:13:52] This is what she said. [00:13:53] Well, yeah, you might get robbed, but you won't get targeted. [00:13:59] What? [00:14:00] That was honestly what she said, Megyn. [00:14:02] I know. [00:14:03] Well, you might get robbed, Megyn. [00:14:04] If you go to Chicago, they might stick a gun in your face and take your wallet, but they're not going to seek you out to murder you. [00:14:10] So what's the big deal? [00:14:11] What are you complaining about? [00:14:13] That's how far gone these people are. [00:14:15] Some of these people, Megyn. [00:14:17] Some of these people are so far gone. [00:14:18] Their entire worldview has been built up by this. [00:14:22] You can't pull it out, like a game of Jenga, or their entire world comes crumbling down. 
[00:14:27] That woman could get mugged tonight, Megyn. [00:14:30] Tonight, on her way home. [00:14:31] That woman could get assaulted and mugged, and she would wake up tomorrow and vote Democrat. [00:14:35] I don't know how you fix that. [00:14:37] Well, this brings me to my second topic, which is the downfall of Claudine Gay. [00:14:42] And I've heard many different takes on what it means if you zoom out. [00:14:48] Our friends over on the Ruthless program, they were saying this is great, great news because it's a win. [00:14:54] It's finally, like, a win for the right, which never has its shit together and never bands together to get anything done. [00:15:01] And this is one instance in which, you know, you had the Free Beacon, you had Chris Rufo, you had all these other commentators online. [00:15:09] And then you had the help of some centrist and left-of-center, very well-known folks like Bill Ackman pushing for it. [00:15:16] And it was a win to get this charlatan removed from what was, at one point, a prestigious position; whether it still is today, serious doubts. [00:15:26] Okay, so I can see that. [00:15:28] Then you've got the leftist woke crowd absolutely melting down: that this was racist to remove her, that this is just part of white people's rage at seeing black women elevated to the positions of power that we deserve. [00:15:47] We whites are very angry about black women like Claudine getting elevated. [00:15:52] And that's what this is really about, like, our anger at her ascent to a position of power. [00:15:57] Obviously, you and I don't agree with that shit, but you know that black women are still with Joe Biden big time. [00:16:03] And I do wonder whether they're more in that second camp. [00:16:07] Like, yeah, you know, they tend to be more woke. [00:16:10] They tend to be more Democrat. [00:16:12] So how does it all shake out? [00:16:14] The right feels energized. 
[00:16:15] The center left is migrating to us because they see wokeism has corrupted our nation. Or the core woke left is totally activated because, as Rufo put it, a scalp was taken from one of their beloved token black women atop the positions of power in America. [00:16:36] First of all, it's pretty emblematic that our first scalp is one that doesn't have hair. [00:16:42] That's one. [00:16:43] It wasn't our scalp. [00:16:44] It wasn't our scalp. [00:16:46] Yes, we helped. [00:16:46] Christopher Rufo and the Free Beacon did great work. [00:16:48] They did great work on the plagiarism and things like that. [00:16:51] Claudine Gay is gone because some of the center left, as you pointed out, came for her, and the donors came for her. [00:16:58] This reminds me of when Andrew Cuomo got sacked in New York. [00:17:02] Everyone celebrated on the right because Andrew Cuomo is a piece of trash. [00:17:05] The right didn't take out Andrew Cuomo. [00:17:07] Letitia James and the Democrat machine there knifed Andrew Cuomo in the ribs. [00:17:12] The right had nothing to do with that whatsoever. [00:17:14] They were unable to take him out. [00:17:16] We celebrate when the commies kill each other, but that's what commies have always done. [00:17:20] So look, I'm not saying it's a bad thing. [00:17:23] Frankly, Claudine Gay being fired at Harvard is just a good start. [00:17:26] You'd be better off if you fired everybody on the campus, bulldozed the buildings to the ground, and made it into an orphanage or something like that. [00:17:33] That would actually be better for the country. [00:17:35] As far as what it means for Democrats, you're going to see a ton of something this next year. [00:17:41] And Joe Biden actually gave the game away with his opening ad. [00:17:44] He ran some opening ad. [00:17:45] This is his opening ad for 2024. [00:17:47] And one of the main issues, if not the only issue, if I remember right, he cited in there was voting rights. [00:17:53] Voting rights? 
[00:17:55] What is it? [00:17:55] 1950? [00:17:56] What's he talking about? [00:17:57] Voting rights. [00:17:58] Look, what they're going to do is an endless amount of black outreach in 2024, because one of the things that you know, and most people do not, is that in Democrat circles they must have 92, 93% of the black vote to win elections. [00:18:13] If that number even drops down to 80, Democrats cannot win national elections. [00:18:17] Their party is based on getting virtually every black vote in the country. [00:18:21] They are losing some of them right now because of the brilliant GOP stunt of shipping illegals into places like Chicago and New York. [00:18:30] They're shipping these people into the poor black communities. [00:18:32] Poor black communities already had crappy schools. [00:18:34] Now they're overrun with a bunch of kids who don't even speak English, and they're mad about the whole thing. [00:18:39] But what I'm saying is, Democrats are going to have to spend an unusual amount of time and money doing black outreach right now. [00:18:46] And things like this Claudine Gay stuff hurt that cause. [00:18:50] That's why Obama was behind the scenes pushing to keep her. [00:18:54] They have to maintain that base, that black woman base, the black voting base, or they're not going to be able to keep power. [00:19:00] And you're going to see a lot of that in the 2024 election season. [00:19:04] It's amazing to me. [00:19:05] It's like they want to say, oh, she was fired because she's a black woman. [00:19:10] And meanwhile, the truth is, the only reason she had the job is because she was a black woman. [00:19:14] That's what people are objecting to. [00:19:16] If she had been qualified, if she had done her job well, if she'd been a true scholar, even if she'd been a leftist, that would have been fine. [00:19:22] I mean, look at, and I'm not just picking conservatives, but look at Ayaan Hirsi Ali. [00:19:26] She's a black woman. 
[00:19:27] She's totally brilliant. [00:19:28] I guarantee you, Ayaan Hirsi Ali has never plagiarized anybody in her life. [00:19:32] I'd be thrilled if she got elevated to president of Harvard. Or, since she's at the Hoover Institution, [00:19:36] let's make it Stanford tomorrow, as would most people who are objecting to Claudine Gay's behavior. [00:19:44] It's not about her race or her gender for people who want her gone. [00:19:49] It's the fact that she's a fraud. [00:19:51] It's the fact that she doesn't belong in the position. [00:19:54] And we all know it. [00:19:55] And even the people at Harvard have known it for a long time. [00:19:58] Yeah, but this is what the communist does, Megyn. [00:20:01] They know that too. [00:20:02] The people who are using that, they use these shields forever because they've always been effective on the right, because the right has some sort of a moral founding. Even though the right can get crazy too, they have some sort of a moral fabric. [00:20:15] And the communist does not. [00:20:17] He has no moral fabric. [00:20:18] So he uses your values against you. [00:20:21] If someone actually thought you were a racist, Megyn, if they genuinely thought that, that would bother you, because you're not, because you're a human. [00:20:27] It would bother you if somebody thought that. [00:20:29] The communist knows that. [00:20:31] They know you're not, but they know that. [00:20:33] So they understand a great way to maybe attack Megyn Kelly or destroy her arguments is simply to call her a racist. [00:20:39] They do this all the time. [00:20:40] You're a misogynist. [00:20:41] You're a racist. [00:20:42] You're a Nazi. [00:20:42] You're a white supremacist. [00:20:44] And what it does is it gets you off the topic at hand. [00:20:47] I see the right play this game all the time. [00:20:49] It gets you off the topic at hand. [00:20:51] And now you're talking about things they want to talk about. [00:20:53] Racist? I've got black friends! 
[00:20:56] And now you're not even talking about the issue at hand. [00:20:58] I watched it happen during the final couple of years of Trump's presidency, when he would give an interview. [00:21:03] And every time, the reporter would have Trump denouncing white supremacy 18 times every single hour. [00:21:09] Well, yes, I denounce white supremacy. [00:21:10] Do you denounce it? [00:21:11] Yes, I denounce it. [00:21:12] Playing their game on their field with their refs enforcing their rules. === Systemic Racism And Power (15:53) === [00:21:16] And we wonder why we lose the messaging battle. [00:21:18] The race, misogyny, all that crap, the anti-gay stuff, all that stuff's just a shield they use to shut you up. [00:21:25] We have to stop letting it work. [00:21:27] You are so right. [00:21:30] Oh my God, what you said is so right. [00:21:32] And I'll give you an example of it today in our current presidential politics. [00:21:39] Nikki Haley was asked over the Christmas break, by, yes, obviously a Democrat plant, [00:21:45] what started the Civil War, which she didn't answer well. [00:21:48] She did not mention slavery. [00:21:50] But who the fuck, sorry, is talking in a 2024 presidential race about, gee, what led to the Civil War? [00:21:57] Is that an issue? [00:21:58] We're worried about the border. [00:21:59] We're worried about the economy. [00:22:01] No one's worried about what started the Civil War right now. [00:22:05] So this is the Democrats launching a bomb into the campaign of someone they perceive as a threat, because the polls show she would beat Joe Biden by 11 points. [00:22:16] Trump is beating Joe Biden by some four to six points. [00:22:20] She would beat him by 11. [00:22:21] So they're terrified that she would get it. [00:22:23] She's not looking like she's going to get it, let's be honest, but it'd be very helpful to have her kneecapped. [00:22:29] So she gets asked this question. [00:22:31] She doesn't answer it well. 
[00:22:33] And now here's the follow-up, on a CNN town hall that happened last night. [00:22:38] And the CNN moderator throws the bomb back in her face again. [00:22:43] And in answering it, she makes yet another misstep, which Van Jones and Abby Phillip on CNN then freak out on Nikki Haley about again when the town hall ends. [00:22:56] Listen to SOT 10. [00:22:59] I should have said slavery right off the bat. [00:23:02] But if you grow up in South Carolina, literally in second and third grade, you learn about slavery. [00:23:09] You grow up and you have, you know, I had black friends growing up. [00:23:13] It is a very talked-about thing. [00:23:16] I was over, I was thinking past slavery and talking about the lesson that we would learn going forward. [00:23:22] I shouldn't have done that. [00:23:25] Okay, wait, before I get you to respond. [00:23:27] I love that you're laughing. [00:23:29] Wait, here's Van Jones after the fact. [00:23:32] She was cleaning it up with a dirty rag. [00:23:34] I mean, it wasn't a cleanup at all. [00:23:36] It's painful. [00:23:37] I don't get it. [00:23:41] I think it says something about her. [00:23:43] I think it says something about the Republican base. [00:23:46] It's literally what you just said. [00:23:50] On my life, Megyn, I had not seen that clip before I said it, but they all do it. [00:23:54] I had black friends. [00:23:56] See, I want everyone to understand, because normal people will run into this. [00:24:01] It's not just Nikki Haley; it's your liberal aunt Peggy when she shows up at the Christmas party screaming about her 15th abortion. [00:24:07] You're going to run into this with her as well. [00:24:09] When they sit down with you, when they start to talk about the Civil War, what about the Confederacy? [00:24:13] What about Nazis? [00:24:14] They're trying to associate a term with you. [00:24:17] They're trying to marry that term to you. [00:24:19] They do this all the time, masterfully. 
[00:24:21] And offense, you want to play offense against these people. [00:24:24] All the GOP does is play defense. [00:24:26] What? [00:24:27] I have black friends. [00:24:28] Offense is: are you a Nazi? [00:24:30] Yeah. [00:24:31] Are you a pedophile? [00:24:32] Now, that's a horrible thing to say, right? [00:24:34] That's a horrible thing to say to somebody. [00:24:36] Is it not? [00:24:36] Well, is it not horrible to associate me with Nazism? [00:24:39] If we're going to associate words that have nothing to do with me and you're going to try to attach them to me, then I'm going to attach them back. [00:24:46] I'm going to attach horrible words back to you. [00:24:48] Jesse, do you denounce white supremacy? [00:24:51] Do you denounce pedophilia? [00:24:52] Are you pro-pedophilia? [00:24:53] Prove that you're against pedophilia right now. [00:24:56] That's actual offense, and changing the conversation. [00:24:59] But the GOP is so scared of its own shadow, so scared of the media, so scared of how they're going to be framed. [00:25:06] They actually get themselves talking about the Civil War at all, Megyn. [00:25:10] As if Van Jones is writing this. [00:25:12] Yeah, of course. [00:25:13] As if Van Jones or any of those boobs on CNN actually have a single bit of emotional attachment to the Civil War. [00:25:20] I'm a history freak. [00:25:22] I love the Civil War. [00:25:23] I'm not emotional about the whole thing. [00:25:25] There's nothing you could say that would offend me about the Civil War, because it was, like, 170 years ago or whatever it was. [00:25:30] I don't do math very well. [00:25:32] None of these people are emotional about it. [00:25:34] They're trying to attach something ugly to you. [00:25:36] And all the GOP knows how to do is meekly back up. [00:25:39] It's, I'm not racist, I've got black friends. [00:25:42] It makes me want to vomit, honestly. [00:25:44] I know. 
[00:25:45] I distinctly remember, it was after, I don't know which controversy it was, but it was one of the ridiculous controversies that the Media Matters crew was making up about me and my allegedly being a racist. [00:25:56] And a couple of my friends who happened to be black were like, should we go out and do a photo op together? [00:26:01] And I was like, it's a hard no. We're good. [00:26:06] No. [00:26:08] But Nikki Haley just fell into the trap. [00:26:10] And, you know, maybe she's too green. [00:26:11] Maybe it hasn't been done to her enough. [00:26:13] I have to give credit. [00:26:14] You and I have both ripped on Vivek when he's deserved it in the past, but he nailed it this week. [00:26:19] It was a very good week for him when Dasha Burns of NBC got after him. [00:26:26] First the Washington Post came for him, which I'll get to in a second, but this just happened, I think, yesterday, and it's gone viral today. [00:26:31] Dasha Burns of NBC tried to turn him into a racist, or like a racist-adjacent, because of his positions on various things. [00:26:42] And truly, it was a masterclass in how to handle this nonsense. [00:26:45] Here it is in part. [00:26:47] Do you believe punctuality is a vestige of white supremacy, Dasha? [00:26:51] Look, because if you don't, then you have a disagreement with many of the people who are defining those terms, or the written word, or the use, or the nuclear family. [00:26:58] These aren't my words. [00:26:59] These are the words of intellectual proponents from Ibram Kendi to the INA Presleys to BLM that have said these are vestiges of white supremacy. [00:27:05] So you have to have an honest discussion. [00:27:09] Strawman arguments. [00:27:11] You brought up Jussie Smollett as the best example. [00:27:14] Jussie Smollett was the hottest thing in news on the back of a fake racial attack on him that we have to contend with. [00:27:20] And yet, and yet you have examples like the Buffalo shooter in New York just in 2022. 
[00:27:25] You have other examples. [00:27:26] But you are also cherry-picking when you bring up. [00:27:28] So let's look at all the statistics. [00:27:30] More black-on-black crime. [00:27:31] If you really care about actual crime against black Americans, let's get to the root causes of it in the inner cities of this country. [00:27:37] The Anti-Defamation League tracked a 28% increase in white supremacist propaganda last year. [00:27:43] The Anti-Defamation League. [00:27:44] Yeah, the ADL, I don't think, is a particularly credible source. [00:27:47] I think the media did not hold the police accountable. [00:27:50] You're starting to get around, gain traction with the black community, with Latino voters. [00:27:54] Do you not worry that your rhetoric is pushing them away? [00:27:58] There are folks who are. [00:27:58] To the contrary, I think we're going to bring black people in. [00:28:00] But people are concerned about your rhetoric. [00:28:02] Well, you know what? [00:28:03] I'm concerned about their corruption. [00:28:05] Sebastian is a debate that is being had. [00:28:07] If I may just finish this, if I may finish my point. [00:28:10] I think I will be better. [00:28:13] I've never denied that racism is a problem. [00:28:15] We're getting close to the promised land that Martin Luther King envisioned. [00:28:19] We're as darn close to it as we ever have been. [00:28:21] And so what bothers the heck out of me is it's right when we're close to that promised land. [00:28:25] Martin Luther King said that I may not get there with you, and he didn't get there with us. [00:28:29] But I think it desecrates the legacy of our civil rights movement, desecrates the legacy of Martin Luther King. [00:28:33] That right when we get closest to the point of having racial equality and gender equality and even opportunities for people of minorities of many types, are we perfect? [00:28:42] No, but are we as close as we've ever been? [00:28:44] Yes, we are.
[00:28:45] To then obsess over systemic racism, to then obsess over white guilt and otherwise. [00:28:50] We're creating new waves of racism, Dasha, that we otherwise would have avoided right when we're closest to having achieved what even the proponents of the civil rights movement would have dreamed of. [00:29:04] Boom. [00:29:04] That's how it's done right there. [00:29:06] A plus plus. [00:29:08] Pretty well done. [00:29:09] And like I've said, I don't trust Vivek at all. [00:29:13] I find him to be extremely untrustworthy and snake-oily, but I want to make sure I give him all the credit in the world. [00:29:19] He's become a chaos agent, which we need on the right, someone who's smart enough and charismatic enough to change the conversation and make these people look and feel stupid. [00:29:28] And he gets all the credit in the world for that. [00:29:31] I will push back on just one thing he said there, but it's a very minor criticism. [00:29:35] We're not as close as we've ever been to some kind of racial harmony. [00:29:39] We were as close as we've ever been to some kind of racial harmony probably in the 80s in this country, maybe even the 90s. [00:29:45] And then the communists in this country decided they could use the civil rights thing to really blow the society up. [00:29:51] That's what all this is about, Megyn. [00:29:52] That's what Dasha Burns really wants. [00:29:54] She's not a journalist. [00:29:55] She's an apparatchik. [00:29:56] All this race stuff, the gay stuff, all this stuff. [00:29:59] This is all about just blowing the country up. [00:30:00] It's all about destruction. [00:30:02] If tomorrow, every single position of power was occupied by a black person in this country, they wouldn't slow down or back off for even a second, because it has nothing to do with black people or gay people or women or whatever it is. [00:30:14] Everything's about destruction. [00:30:15] When you understand, it's all just about destruction.
[00:30:17] That's why they want to destroy the nuclear family. [00:30:19] That's why they want to cut your kid's penis off. [00:30:21] That's why they want the border wide open. [00:30:23] It's not accidental. [00:30:24] They're not misguided. [00:30:26] They're not liberal. [00:30:27] They're not slightly left. [00:30:28] They're not progressives. [00:30:30] These are evil, dirty, demonic communists who are out there to destroy everything. [00:30:34] And they're being very successful at it at this point in time. [00:30:37] What happened in that clip was just, I mean, he saw her coming from a mile away, and he's obviously way smarter than she is. [00:30:43] It's like, Dasha Burns, pick somebody else. [00:30:46] Try someone dumber, because he's, I mean, he's literally written the book on wokeness and what they're trying to do on the left, similar to what you were just saying. [00:30:54] And one of the things that struck me was, here she is, clearly trying to perform for her leftist base over on NBC. [00:31:01] And you can see, like, the plaintive whining. [00:31:04] What about this? [00:31:05] You raised Jussie Smollett. [00:31:07] What about, like, what about white supremacy and the Anti-Defamation League? [00:31:12] And it was, it's so nice to hear a politician who's done his homework, who knows that the ADL is a joke of an institution that only ever criticizes people on the right. [00:31:24] Go Google what they've said about Tucker Carlson. [00:31:27] I mean, they, and by the way, they've completely strayed from the mission where they originally began. [00:31:32] They've started to sound a little bit more like a policing organization for anti-Semitic comments against Jews in the wake of the Israel attack. [00:31:40] But really, their favorite cause for the past 10 years has just been anything a conservative says, anything a conservative says that's not woke.
[00:31:46] So good for Vivek for knowing that there's absolutely no stock to be put into this group and shoving it back in her face, her whiny little unprofessional face. [00:31:56] Dasha, you embarrassed yourself. [00:31:58] I think you got shamed after your John Fetterman interview because you told the truth about what a mess he was in that particular sit-down. [00:32:06] And ever since, you've been trying to make it up to your leftist base to prove you're one of them, okay. [00:32:12] Good luck in your future journalism career. [00:32:16] Let me squeeze in a quick break and then we'll come back. [00:32:18] And we have so much more fun to do, Jesse Kelly. [00:32:20] It's wonderful having you here. [00:32:21] Don't go away. [00:32:22] More with my fellow Kelly. [00:32:23] Coming up. [00:32:27] What a funny and bright spot in the whole mess. [00:32:29] Love to see Claudine Gay go. [00:32:31] She did not deserve the position. [00:32:32] She was an intellectual thief. [00:32:33] So I have no empathy for her and her firing. [00:32:36] She was, she resigned. [00:32:38] Sorry, let me correct it. [00:32:39] She resigned. [00:32:39] Sure. [00:32:40] Sure, Jan. [00:32:40] Oh, I don't need to say it. [00:32:42] Sure, Jan. [00:32:48] Okay, anyway. [00:32:50] This is Al Sharpton, Jesse. [00:32:53] Al Sharpton, I mean, is there a bigger race hustler in America? He decided to punish Bill Ackman, the billionaire investor who's been pushing to oust these three women whose abominable testimony on Capitol Hill got them in trouble to begin with. [00:33:08] He's outside of Bill Ackman's office with, like, the people he met on the subway that day. [00:33:14] He got, like, no one. [00:33:16] He's like, we are going to storm Bill Ackman's building. [00:33:20] Look at this. [00:33:21] I could fall asleep in the middle of this tapioca pudding fest. [00:33:26] He's lost it. [00:33:27] It's done. [00:33:28] People like this guy don't have the power they used to.
[00:33:31] No, but that's all he knows, Megyn. [00:33:33] And Al Sharpton, I've always thought he was an odd character, a very odd character. [00:33:37] One, he looked better when he was fat. [00:33:39] You never say that, but he did. [00:33:40] He looked a lot better when he was fat. [00:33:42] He lost way too much weight. [00:33:43] Now he looks like a lollipop, and it weirds me out every time I see the guy. [00:33:47] And he's kind of lost the look. [00:33:50] He looks all shrunken in, dude. [00:33:52] Go get a donut or something like that. [00:33:53] He looks terrible. [00:33:54] That's one. [00:33:55] Two, he's clinging to something that has worked for him, [00:33:59] only now [00:34:00] it's kind of old and pathetic a little bit. [00:34:02] Have you ever, I'm 42 now, so I'm getting older, Megyn. [00:34:05] You ever seen that guy get up and play pickup basketball? [00:34:08] And you can tell he used to be okay when he was younger, and now he's out of breath after one time up the court. [00:34:14] He just can't really do it anymore. [00:34:15] It's kind of embarrassing. [00:34:16] That's Al Sharpton when he shows up to all these civil rights protests now. [00:34:20] Like, I get it, Al. [00:34:21] That's all you've known, but go to the Caribbean with more tax money you didn't pay. [00:34:27] Yeah, yeah, exactly. [00:34:28] So, but he can't get it. [00:34:29] I mean, truly, he's, he's out of magic acorns and you can see it, but he's still playing the same game. [00:34:34] Now you have Elie Mystal. [00:34:36] He's the White House correspondent for The Nation. [00:34:39] Listen. [00:34:40] Listen to this racist stuff. [00:34:42] I mean, this guy, he's constantly on MSNBC. [00:34:45] All of this is happening, Claudine Gay, because racist white folks had to chew with their mouths closed for two months after George Floyd was murdered. [00:34:54] And now they're on their revenge arc.
[00:34:58] They'll keep roasting any black people they can get their hands on until they satiate their bloodlust while people from apartheid states call for colorblind societies. [00:35:09] There's never a solution for this. [00:35:10] Racist whites just do this whenever they feel their positions of power are threatened. [00:35:16] The best advice I can give to any black person is hard to follow. [00:35:19] So hard, I don't always do it myself. [00:35:21] But the trick is to not ever rely on white folks for anything. [00:35:26] Because if you do, then that means they can take it from you the moment they get in their feelings, whatever that means. [00:35:32] They can take it from you. [00:35:34] Can you imagine if a white person tweeted anything like this about a person of color? [00:35:41] Yeah, look, we joke a lot, Megyn, you and I, and I'm glad we do, but this is one of the things that we really should talk about more. [00:35:50] There's a dangerous situation happening in this country, and history books say it's a dangerous situation. [00:35:56] Whenever you take any group of people, whether it be a religion or skin color or whatever, and you other them, they're the problem, they're the problem, they're the problem, they're the problem. [00:36:04] And othering them becomes sanctioned at the highest levels. [00:36:08] It's not, you know, one dirtball on the street corner saying, [00:36:10] I hate white people. [00:36:11] When you hear that kind of rhetoric from the president, from media figures, from academics, Harvard, all the others, when it is universal across the board, white people suck, white people suck, white people are evil. [00:36:22] Kids learn about this in elementary school. [00:36:25] They learn about white colonizers and all these things all over the country. [00:36:28] What you're doing is you're creating a very dangerous situation for white people in this country.
[00:36:33] And I know it's very hard to see this now because we live in the United States of America, but all it would take would be a nudge. [00:36:40] And this can manifest itself in some really, really ugly, really violent ways. [00:36:45] And it would be nice if one political party in this country had the balls to actually step up and start talking about it in honest ways. [00:36:53] Not that I'm holding out for that at all. [00:36:55] They're all going to still try to play the commie game, but it would be really nice if we could talk about the systemic racism that is taking place in this country. [00:37:03] Again, racism's always bad, but if one guy on the street corner hates me for the color of my skin, okay, that sucks. [00:37:08] I just walk away. === Social Shame On The Left (15:00) === [00:37:10] If it's sanctioned by the DOJ, by the president, by academics, by everything else, sanctioned racism by the institutions of a nation is what ends up killing people. [00:37:21] And there's an anti-white racism in this country that's despicable and should be talked about a lot more. [00:37:26] Yes, this guy, Elie Mystal, has it in droves. [00:37:30] He's a racist. [00:37:32] He hates white people. [00:37:34] Can you imagine sending out a tweet about black folks just pissed off they had to chew with their mouths closed for two months, and talking about black bloodlust? [00:37:46] The blacks will keep roasting any white people they can get their hands on. [00:37:50] That's me just reversing the races in what he put out, what he put in writing and posted. And he said, the trick is, can you imagine it reversed again, to not ever rely on black folks for anything, because they can take it from you. My God, the hatred this guy has. [00:38:08] And as I point out, he's White House correspondent for a major publication. [00:38:11] He's all over MSNBC. [00:38:13] He's on Joy Reid every night. [00:38:14] How does this guy still have a job? [00:38:17] It's unbelievable to me.
[00:38:18] He's on MSNBC every other night. [00:38:21] I left NBC because I talked about Halloween costumes. [00:38:26] It's like, it's insane. [00:38:27] And then he's like, they're out for the blacks. [00:38:30] That's the problem. [00:38:31] Like, what are you saying? [00:38:33] It's, it's completely backwards. [00:38:36] The one comfort I have, Jesse, is that some people are talking about it now. [00:38:41] Today, at the beginning of 2024, unlike five years ago even, when, like, nobody was saying anything about this racism, you know, by people like Elie Mystal. [00:38:52] Now it's more understandable. [00:38:54] Now they've been stewing in it for five-plus years. [00:38:58] And you see it at every corner. [00:38:59] You see it everywhere. [00:39:00] I think people have had it. [00:39:02] And you're right. [00:39:02] The question is, how much have they had it? [00:39:04] And how much more are we going to allow this to go on? [00:39:07] Because it's stirring up terrible racial tensions. [00:39:11] It is. [00:39:11] And people don't know how to push back on it, Megyn, because most people are not racist. [00:39:18] Certainly not most people on the right. [00:39:20] They're not racist. [00:39:21] I mean, are there some? [00:39:21] Of course, there's some everywhere. [00:39:22] That's human nature, but most people are not. [00:39:25] And so when it happens to them, they're really tempted to just kind of shrivel up or ignore it, or they don't want to talk about it, because our social shame system is so upside down in this country. [00:39:36] You're not allowed to push back on that. [00:39:38] But people have to start getting a lot more vocal about naming this and attacking these people because, again, I can't stress this enough. [00:39:45] Like you just pointed out, this is not some person. [00:39:48] This is not some random dude sitting in his mom's apartment putting out something stupid on social media.
[00:39:54] This stuff has been institutionalized at the highest levels. [00:39:58] The head of the DHS, CIA, FBI, presidents, Harvard presidents, and everyone in between, they openly talk about this stuff now. [00:40:09] White people suck, white this, white that. [00:40:11] And that is dangerous. [00:40:13] And I wish, I really wish there was a much bigger movement pushing back against it. [00:40:17] But that's what communism does, Megyn. [00:40:19] You and I have talked about this before. [00:40:21] There's an analogy I talk about: Montana. [00:40:23] We used to go hiking in Montana, and eventually you'd see these huge boulders, bigger than a car, and you'd go hiking and you'd find one that had been split in two or split in three places. [00:40:33] And you're thinking to yourself, was it God himself that came down with an axe? [00:40:36] What could possibly have split this? [00:40:38] But what split it was, over time, rocks, like societies, develop cracks. [00:40:42] Those cracks eventually get water in them. [00:40:45] And then, it's Montana, [00:40:46] the freeze comes, the water expands, and the boulder, boom, splits in two. [00:40:50] Communism is the water. [00:40:51] That's what it is in any society. [00:40:53] And that's exactly what these people are. [00:40:55] They're just destroyers. [00:40:56] We were on the way to having a relatively harmonious society a few decades ago. [00:41:01] And now everyone is more racist than ever. [00:41:03] The black people are, the white people are, the Mexicans are, everyone else, because of this. [00:41:07] That's what they do. [00:41:08] They dig in and they split us all apart from each other. [00:41:11] It's really gross, but people are scared to discuss it because no one wants to be called racist. [00:41:16] It's unbelievable. [00:41:18] To correct myself, he's the justice correspondent for The Nation. [00:41:21] This is him in 2021, same guy, 2021. [00:41:25] He wrote a piece in The Nation.
[00:41:27] I am not ready to re-enter white society after the pandemic. [00:41:30] Couple highlights. [00:41:32] As the pandemic wanes and I have to leave the safety of my whiteness-free castle, I know that racism is going to come roaring back into my daily life. [00:41:40] Over the past year, I have, of course, still had to interact with white people on Zoom or watch them on television or worry about whether they would succeed in re-electing a white supremacist president. [00:41:50] But white people aren't in my face all the time. [00:41:52] I can more or less only deal with whiteness when I want to. [00:41:55] White people haven't improved. [00:41:57] I've just been able to limit my exposure to them. [00:42:00] This man is gainfully employed and appearing on MSNBC every other night. [00:42:06] I don't even know what to say. [00:42:08] Our media is disgusting. [00:42:09] The double standard on racism is disgusting. [00:42:13] And while we may not be able to defeat it, we can certainly call it out. [00:42:17] Okay, before I move on from the Claudine Gay thing, I do think it's interesting, speaking of the disgusting media. [00:42:24] Now they're doing hit pieces on Bill Ackman's wife. [00:42:27] I mentioned Bill Ackman, the billionaire who's been fighting back against anti-Semitism and led the charge to get rid of Liz Magill and now Claudine Gay. [00:42:35] And now he was looking at the MIT lady. [00:42:37] Business Insider comes out with a hit piece on his wife, who used to be at MIT and I think did her PhD at MIT, and they went back and dissected her dissertation and found some paragraphs that they say should have had quotation marks around them. [00:42:55] Like, she did cite the author, you know, but she didn't put the quotes on. Right before, like, she said, you know, X, Y, and Z, period. [00:43:04] See, Jesse Kelly, am I right? [00:43:05] But she didn't actually put the quotation marks in there. [00:43:08] So this is the game they play, right? [00:43:10] Like, take that, Bill Ackman.
[00:43:12] We'll humiliate your wife if you stay on this tear. [00:43:17] Yeah, they understand. [00:43:18] I'll give the communists credit for this. [00:43:20] They understand how powerful social shame is. [00:43:23] I call it the social shame system, but they understand. They create these organizations, the ADL, like you were talking about earlier. [00:43:31] They'll work for this publication. [00:43:32] They'll take over this publication. [00:43:34] And what they do is they whip up mobs that intimidate good people from coming out. [00:43:39] That's really what they want. [00:43:40] They want you on their side, but if you're not on their side, they at least want you to shut up and be afraid. [00:43:45] And this is why they do the things they do. [00:43:47] You got this dude, Ackman, he speaks up. [00:43:49] They're going after his wife. [00:43:50] Why do they do that? [00:43:51] One, to shut him up, but two, so the next billionaire doesn't get quite so outspoken, because he doesn't want a new hit piece in the New York Times or whatever, the ADL, whatever it may be. [00:44:02] They're very good at social shame. [00:44:05] They're very good at making you feel like the heat of a thousand suns is on you. [00:44:10] So you'll shut up and go away. [00:44:12] And if you let them, they will win. [00:44:14] And this is not abnormal. [00:44:15] This is what the commies have always done everywhere. [00:44:19] They used to stand in front of your business in Mao's China and scream at anybody who came inside of your shop because you were one of the bad people. [00:44:27] It's a social shame system. [00:44:29] You didn't want to be marked as the person who was walking in to buy wontons from Jim because you were one of the bad ones. [00:44:34] And so eventually people stopped going, and you had to leave the country or you went out of business. [00:44:39] They do the exact same thing in this country with the various little lefty organizations.
[00:44:43] And as you know, Megyn, lots of these organizations are nonprofits. [00:44:47] Our nonprofit industry is flat-out criminal in this country. [00:44:50] So many of these nonprofits are funded by these big commie billionaires, because you're not allowed to know who the donors are. [00:44:56] And they do blatantly political things: a very nonpartisan report on why Megyn Kelly is an evil misogynist and a racist. [00:45:03] And they'll put these things out there. [00:45:04] And then the other parts of society will cite the nonprofit as if it's somehow legitimate. [00:45:10] Well, you see, Joe Biden got up and said the ADL said Megyn Kelly's a racist. [00:45:14] You see, that's an official organization. [00:45:16] That's how they work. [00:45:17] And it's very effective, to be honest with you. [00:45:19] Yeah, they've been working very hard to do this to Tucker Carlson for quite some time, like completely painting him as this raging racist misogynist, putting him on the front page of the New York Times, all in an effort to discredit him. [00:45:29] That's why the ADL got involved in Tucker Carlson's alleged misogyny and racism. [00:45:33] It was supposed to be an anti-defamation group. In any event. [00:45:35] Okay. [00:45:36] I got to end with this. [00:45:37] Good news. [00:45:38] Good news out of New Hampshire, which I have to say makes me feel very happy because, you know, it's not exactly a deep red state. [00:45:49] And here the New Hampshire House has voted for sanity, saying we approve a ban on these gender reassignment procedures for minors. [00:46:01] How did they do it? [00:46:02] They got 12 Democrats, 12, that's not a small number, to cross the aisle. [00:46:07] Two Republicans abandoned them, but they got 12 Dems to vote in favor of the ban. [00:46:12] Here is one of the heroes. [00:46:13] It didn't go as far as I would have liked. [00:46:15] I would have liked puberty blockers and cross-sex hormones banned, because it sterilizes kids.
[00:46:20] But I'll take what I can get for now. [00:46:22] Take a listen to Representative Jonah Wheeler, Democrat, on why he did it. [00:46:28] I rise today, despite the uncomfortability of this vote, because for me, it comes down to whether or not kids should be able to get these surgeries. [00:46:40] And despite the fact that I am a liberal, despite the fact that I believe in non-discrimination for trans people, for gay people, for queer people, and that I will fight until my very last day until they are recognized as human beings. [00:46:56] The question before us is whether or not children under the age of 18 should be able to get these surgeries. [00:47:05] And they should not. [00:47:07] These are irreversible surgeries. [00:47:11] God bless him. [00:47:12] I honestly feel like divine intervention went into New Hampshire last night and made this happen. [00:47:17] This can't keep happening to children. [00:47:20] No, it is a step in the right direction, Megyn, and so I don't want to be a cynic here. [00:47:24] It is a step in the right direction, and I applaud them for doing it. I applaud those Democrats, because that takes guts. [00:47:29] They're going to take a lot of heat now. [00:47:31] At the same time, [00:47:32] I would just say, just to close out with this: we are in a lot of trouble as a society because these bans are even something that has to happen. [00:47:41] I'm all about the bans, right, that's great, but you shouldn't have to ban doctors from cutting off a 13-year-old girl's breasts. [00:47:48] That's not a thing that should ever have to come up before the law, because it should never occur to a doctor to do that. [00:47:53] And even if it did occur to one of them to do that, he should be so afraid of society that he wouldn't do it. [00:47:59] So I'm glad we have these bans. [00:48:00] I hope we keep banning them, but it goes to show we don't have a politician problem.
[00:48:04] We have a people problem, like I talk about on my show all the time, by the way, the Jesse Kelly Show podcast. [00:48:10] Go subscribe to it, everybody. [00:48:11] Yes, all Kelly shows are very entertaining. [00:48:14] I think you'll really enjoy my brother Jesse's program. [00:48:17] He's just a brother in ideology and sense, not blood brother. [00:48:21] But maybe, somehow, you know, you never know, if we went back and traced our roots. [00:48:23] Maybe one day we'll do it. [00:48:24] Yes, I do feel like sanity prevailed there, but you're absolutely right. [00:48:28] The fact that it's a problem to begin with, and I'll just say, in the couple seconds we have left, the problem is already starting to percolate up, I'm sorry to say. [00:48:38] And when it comes to pedophilia and the attempt to normalize, quote, "minor-attracted" men, it's always men. [00:48:48] Yeah, it's happening in some corners of the left. [00:48:50] There was a publication on VICE yesterday that was raising it. [00:48:56] Oh, they're, they're pushing, you know, like, the pedophilia, the crazy right, [00:49:00] in response to Jeffrey Epstein. Okay, no, he actually was accused of wanting underage girls. [00:49:07] So there's, there should be no normalization of it. [00:49:09] There should only be people who call it out, and anybody who deigns to actually do it should be ostracized and, in that case, locked up. [00:49:16] Jesse Kelly, you're the best. [00:49:17] Thanks for coming on. [00:49:19] Thanks, Megyn, appreciate you all. [00:49:21] All right, next up: crazy spying-on-you glasses. [00:49:24] Don't go away. [00:49:30] Are you ready for something scary? [00:49:31] I've been wanting to talk to this woman for a long time. [00:49:34] We are hearing a lot these days about artificial intelligence, of course, or AI, but here's one way [00:49:40] it's already changing our lives, even if you don't know it.
[00:49:43] There's one good piece of it, and there's one potentially very disturbing piece of it, at least. [00:49:47] Joining me now to explain what I'm talking about is New York Times journalist Kashmir Hill. [00:49:52] She specializes in, quote, Looming Tech Dystopia, how about that for an area of expertise, and is the author of the national bestseller Your Face Belongs to Us: A Secretive Startup's Quest to End Privacy as We Know It. [00:50:08] It is the riveting story of a small AI company that advanced facial recognition technology and, in the process, may have ended privacy, yours and mine, as we know it. [00:50:21] Kashmir, welcome to the show, great to have you. [00:50:24] Hi, Megyn, thanks for having me. [00:50:26] Okay, so, first of all, your name is based on a Led Zeppelin song. [00:50:29] That's amazing. [00:50:31] It is. My parents named me after "Kashmir." [00:50:34] Yes, it's unique. [00:50:36] Good for them. [00:50:37] I talked to Krystal Ball when she first became a public person, and I was like, what's the, [00:50:41] what's the deal with your name? [00:50:43] And, uh, her dad was, like, some sort of, he was a nuclear physicist or an astrologer, I can't remember, something like that. [00:50:48] He was just obsessed with the skies, and that's where her name came from. [00:50:53] Okay, so this is a great book, because it's got something for everyone, Kashmir. [00:50:59] It's like, not to reduce everything to politics, but I think the left is generally concerned about AI and, like, where it's going. [00:51:06] And I know the right is very concerned about giving government more power to spy on us. [00:51:12] And not just government, but even third-party agencies or anybody.
[00:51:16] And I, for me, it's interesting for both of those reasons, but it's also interesting because I really do care about, like, women who are the victims of domestic violence or stalkers, which has happened to me, and, like, the number of things you have to go through in order to protect yourself. [00:51:33] And look, let's face it, I've got some money, so I can do that with relative ease these days. [00:51:39] But most women who are subjected to domestic violence or stalkers have no money. [00:51:44] And just the hoops that they have to jump through to try to protect themselves are already too great. [00:51:48] And this technology that you wrote about doesn't work to their advantage at all. [00:51:54] Okay, so that's the setup. [00:51:56] So tell us just how you sort of got started on this. [00:51:59] 'Cause I know, I think you were in my friend Meryl Gordon's journalism class, right? [00:52:05] Yes, I was. [00:52:08] How far back do you want to get started here? === Constant Background Data Collection (05:38) === [00:52:10] Well, no, because, I mean, like, I'm just curious, what made this your beat once you got into journalism? [00:52:16] Yeah. [00:52:17] So journalism for me was kind of a second career. [00:52:20] I had worked at a law firm as a paralegal. [00:52:22] I'd worked at a nonprofit. [00:52:24] And I was in my late 20s when I started on the journalism journey and was in Meryl Gordon's class at NYU, and was thinking about, you know, what should my beat be? [00:52:34] What do I want to do in journalism? [00:52:37] And at the same time, I was thinking about how invasive the practice of journalism is, that you're writing about people who sometimes don't want to be written about. [00:52:46] You're determining a reputation. [00:52:48] It was around 2008, the iPhone had just come out. [00:52:52] Everybody was getting onto Facebook. [00:52:54] And I just was thinking a lot about what privacy was in the modern age with all this new technology.
[00:53:00] And so at NYU, I pitched a beat called The Not-So-Private Parts, about this kind of intersection or collision of tech and privacy. [00:53:09] And it was supposed to be a year-long project, but it's what I've been writing about in the decade-plus since. [00:53:15] Your most recent piece I saw was about how our cars are spying on us and are being used in some circumstances when people get a divorce. [00:53:26] If one spouse is the registered owner, he or she can spy on the spouse who may get the car in the separation agreement. [00:53:36] And there's very little you can do about it. [00:53:39] Yeah, I mean, the world that we live in now is just so difficult in so many ways because all these objects are internet-connected, things in your home, you know, your TV, your coffee pot maybe. [00:53:52] And cars now are collecting a lot of data. [00:53:57] It's concerning because most people don't understand how much information is being collected, where it goes, how it's being used. [00:54:03] And this particular issue I was writing about in this story is that many modern cars have apps that you can use to see where they are, to unlock them, to make the horn honk. [00:54:14] They're convenient features when you park somewhere in the parking lot and you can't remember where. [00:54:20] But I was talking to domestic violence experts who say that these convenient features are being weaponized in kind of abusive relationships. [00:54:28] And women, it was only women I talked to, were separating from husbands and finding that their husbands were tracking where they were going by firing up the car app and looking at where the car was, even harassing them, you know, by making the horn honk, making the lights turn on, making the car start in the middle of the night in their garage. [00:54:52] And they would contact the car manufacturer and say, hey, like, stop giving my husband, my ex-husband, access to the car.
[00:55:01] And the car manufacturers just were not able to help them, they said, because, you know, the car was also in the husband's name, or maybe only in the husband's name, even though the woman had a protective order or had been awarded the car during divorce proceedings. [00:55:18] It's amazing when you look around. [00:55:20] And I'm going to get to the book and what you revealed about this company, what they're making, their product. [00:55:24] But it is amazing when you look around you and realize how much of your privacy you've already sacrificed to live in the modern world. [00:55:33] You know, we know that Facebook and the social media companies are tracking everything about us. [00:55:38] And even now, like when you try to opt out of cookies or anything like that, it's so hard. [00:55:42] They make you jump through so many things. [00:55:44] And your email address gets sold to so many companies. [00:55:47] And every day you get a new email from a new company you didn't ask for, and then to unsubscribe, you know, like, they want you to enter your email to unsubscribe. [00:55:54] You're like, wait a minute, is this a dummy account? [00:55:57] Who am I doing a relationship with now? [00:55:59] I complained on the show a couple of months ago about how I was trying to buy a coat in Chicago and the woman was like, what's your email address? [00:56:07] I'm like, why do you need to know that? [00:56:08] Just, here's my credit card. [00:56:10] It works. [00:56:11] Give me that and give me the receipt. [00:56:13] Nope, need your email address. [00:56:15] We had a whole argument. You know, it's just at every turn. [00:56:19] Even we have Life360 on our phones, right? [00:56:22] So, like, you can see your kids, now that two of my kids have phones. [00:56:26] Well, Doug and I went on it. [00:56:28] Okay, fine. [00:56:29] They can see where I am on Life360.
[00:56:30] Did you know if you press something on Life360, you can go back and see every single spot you've visited over the last 30 days at least. [00:56:38] It's all right there. [00:56:40] Like your entire life. [00:56:41] It's very disconcerting the amount of privacy we've already sacrificed. [00:56:47] Yeah, I mean, I think there's a lot of benefits, right, that have come from the way that we live today. [00:56:52] The fact that with our, you know, smartphones, you can land anywhere in the world and you can call an Uber. [00:56:58] You know, you can figure out which restaurants to eat at. [00:57:02] Technology has benefited us in many ways, but increasingly there's this kind of constant, you know, background data collection and it's not always being used in ways that benefit us. [00:57:13] You know, there's these apps on your phones. [00:57:15] They have third-party, you know, ad networks that are keeping track of where you're going and creating that same kind of list of places you've been that you've seen created by Life360 in an app that you have chosen to use. [00:57:29] And so that's what I kind of try to track in my journalism is, you know, what is happening? [00:57:35] You know, how is the data being collected? [00:57:39] Who is using it? [00:57:40] And when is it being used in ways that really harm you? [00:57:44] Because that's what I get concerned about is, you know, what's the harm here? === Face Print Technology Risks (15:11) === [00:57:48] How is this coming back to haunt people? [00:57:50] And how can we prevent those kinds of uses of the technology? [00:57:56] So that's the perfect setup for Clearview AI, this company that you found out about and wrote an article about, wrote a book about. [00:58:06] And really, they've given you a lot of access. [00:58:08] So they tried to prevent it at first, but ultimately they submitted because they realized it's not great to not cooperate with the New York Times when they're doing an in-depth piece on you. 
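[Editor's note] The kind of passive location history described above, whether compiled by an app like Life360 or by third-party ad networks, boils down to turning a log of timestamped coordinate pings into a list of "places visited." A minimal sketch, with made-up data; the grid-rounding approach here is an illustration, not how Life360 or any particular app actually works:

```python
from collections import Counter

# Hypothetical location pings: (timestamp, latitude, longitude).
# All coordinates and times below are invented for illustration.
pings = [
    ("2024-01-02T08:05", 40.7411, -73.9897),  # near "home"
    ("2024-01-02T08:55", 40.7412, -73.9899),  # still near "home"
    ("2024-01-02T12:30", 40.7527, -73.9772),  # "office"
    ("2024-01-03T12:41", 40.7528, -73.9771),  # "office" again
    ("2024-01-04T19:10", 40.7587, -73.9787),  # a third place
]

def place_key(lat, lon, precision=3):
    """Round coordinates to a coarse grid so nearby pings collapse into one place."""
    return (round(lat, precision), round(lon, precision))

def visit_history(pings):
    """Count how often each coarse location appears in the ping log."""
    return Counter(place_key(lat, lon) for _, lat, lon in pings)

for place, count in visit_history(pings).most_common():
    print(place, count)
```

Even this toy version shows why a 30-day ping log is so revealing: repeated visits to the same grid cell immediately surface a person's home, workplace, and routines.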
[00:58:18] And this company is emblematic of everything we just discussed. [00:58:23] They're doing some stuff that is great, that most people would say, right on, go get them. [00:58:29] We need a lot more just like this. [00:58:31] But this has the potential to, and is most likely going to, veer into a lane that many of us would find very disturbing. [00:58:41] So let's start with your initial encounter with this company, what kind of turned you on to them, and their initial stiff-arming of you. [00:58:51] Yeah. [00:58:51] So it started for me in the fall of 2019. [00:58:54] I had just become a reporter at the New York Times and I got a tip from a source, somebody I knew from the privacy and security world who had been doing public records requests to police departments about what facial recognition tools they were using. [00:59:09] And he had gotten this 26-page PDF back from the Atlanta Police Department. [00:59:15] And it included a legal memo written by Paul Clement, a very high-profile lawyer, now in private practice, but who used to be Solicitor General under George W. Bush. [00:59:25] Exactly. [00:59:27] And he was describing this tool called Clearview, how it had, I think, billions of photos at that point that had been scraped from the internet, you know, without anyone's consent, to build this facial recognition tool where you could take a photo of somebody, a stranger, upload it to the app, and it would return all the other places on the internet where their photo appeared, revealing, you know, their name, [00:59:54] their social media profiles, maybe details about their life, maybe photos they didn't even know were on the internet. [01:00:00] He said he had tested it with, you know, lawyers at his firm. [01:00:04] It returned very fast and accurate results. [01:00:07] And, you know, it had scraped Facebook, Instagram, Venmo, LinkedIn, you know, basically all your favorite social media sites, as well as kind of the wider web.
[01:00:16] And he had written this legal memo for police departments who might be interested in using it, to reassure them that they could use the app without violating any state or federal privacy laws. [01:00:29] And I am reading this and I am just astounded. [01:00:32] I mean, I had been covering privacy at that point for more than 10 years and I had never heard of that kind of technology that could do this. [01:00:41] And it was being offered by this company I had never heard of before called Clearview AI. [01:00:46] And the more I started looking into them, the stranger it got. [01:00:50] So, just as an overarching theme, they're using what you refer to as one's face print. [01:00:59] People are familiar with fingerprints and generally would be reluctant to place their fingerprints online as a record associated with them. [01:01:08] You know, you wouldn't want that out there. [01:01:10] They're familiar with an iris scan, which also seems very intrusive. [01:01:15] But every person has a face print, too. [01:01:20] And while, yes, your face can be seen in various photos, your face print would be much more widely and easily detected by this technology. [01:01:30] And it will collect photos of you that you didn't even know existed, half photos, three-quarter photos. [01:01:37] You're in the background on something, not even posing. [01:01:40] It's extremely sophisticated and good at what it does. [01:01:44] And you make the point in the book that the people who put this company together are not some geniuses; this had been considered and rejected by basically all the big tech companies who are already out there collecting our data, but it was a bridge too far for all of them. [01:02:02] Yeah. [01:02:03] I mean, when I first started looking into Clearview AI, there was very little out there about them.
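[Editor's note] The search described in this exchange, reducing each photo of a face to a numeric "face print" and then ranking stored prints by similarity to a query, can be sketched conceptually. This is a toy illustration with made-up vectors and hypothetical URLs, not Clearview's actual code; real systems derive the vectors with neural networks and search billions of entries with specialized indexes.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face-print vectors (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy "database" of scraped photos: URL -> invented 4-dimensional face print.
# Real face prints have hundreds of dimensions.
database = {
    "https://example.com/photo1.jpg": np.array([0.9, 0.1, 0.3, 0.5]),
    "https://example.com/photo2.jpg": np.array([0.1, 0.8, 0.7, 0.2]),
    "https://example.com/photo3.jpg": np.array([0.88, 0.12, 0.31, 0.49]),
}

def search(query_vector, threshold=0.99):
    """Return every photo whose face print is similar enough to the query."""
    hits = []
    for url, vector in database.items():
        score = cosine_similarity(query_vector, vector)
        if score >= threshold:
            hits.append((url, score))
    return sorted(hits, key=lambda h: h[1], reverse=True)

# A query face print close to photos 1 and 3 (the "same person").
query = np.array([0.89, 0.11, 0.3, 0.5])
for url, score in search(query):
    print(url, round(score, 3))
```

The key point the interview makes follows directly from this structure: once the database exists, every stored face, yours included, is compared against every query, whether or not you ever consented to being in it.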
[01:02:07] They had kind of taken pains to hide who was behind the company, which was ironic given that they were putting all this information out there about all of us. [01:02:16] I reached out to them. [01:02:18] I reached out to Paul Clement. [01:02:19] I reached out to anyone I could find kind of attached to the company, and no one would respond to me. [01:02:24] So I thought, well, maybe I can find them in person. They actually had an address on their website, and it was only a couple blocks away from the New York Times office in Manhattan. [01:02:33] And I walked over there and discovered that there was no such address. [01:02:37] I was kind of looking for it. [01:02:38] I compare it in the book to Harry Potter because it's like looking for a platform that wasn't there. [01:02:43] They were very secretive when I first discovered them. [01:02:45] And so I went to police officers who I thought had used the app, based on it kind of showing up in budgets or because they were appearing in these public records requests. [01:02:57] And I ended up talking to a financial crimes detective in Gainesville, Florida named Nick Ferrara. [01:03:04] And he was telling me, wow, I love Clearview AI. [01:03:08] I would be their spokesman if I could. [01:03:10] The tool works so well. [01:03:11] It's way more powerful than anything I've ever used before. [01:03:16] He said that he had a pile of unsolved cases on his desk. [01:03:21] He'd run them through the state facial recognition system in Florida and not gotten any matches. [01:03:27] And so he ran them through Clearview AI. [01:03:29] And he said, I got match after match after match. [01:03:32] He said it was really incredible. [01:03:33] And so I said, well, this sounds great. [01:03:35] I'd love to see how it works. [01:03:36] And he said, well, send me your photo and then I'll screenshot the results for you and send them your way.
[01:03:42] So I sent him my photo, and then he ghosted me and stopped responding to any of my messages. [01:03:48] This basically happened again with another officer, though before he stopped talking to me, he did run my photo and he said I didn't have any results, which we both thought was very strange because I have a lot of photos on the internet. [01:04:00] Eventually I'd find out that Clearview AI had put an alert on my face so that they were being notified when I was being searched for. [01:04:09] They were reaching out to these officers and saying, you know, don't talk to her. [01:04:14] You're violating the terms of our app by running her picture. [01:04:17] And so this was alarming to me because it showed me that this company could, you know, see who law enforcement was searching for and could control whether they could be found. [01:04:27] And it just shows the power of facial recognition technology, this idea that you can be searched for. [01:04:32] There can be an alert on your face, and people might be reacting to you, and you might not even realize it. [01:04:40] And we kind of have seen that play out at Madison Square Garden, where James Dolan, the owner of Madison Square Garden, decided he wanted to ban certain people from coming into his venues: lawyers who worked at firms that had sued Madison Square Garden or any of its other companies. [01:05:02] And so they went to the law firms' websites and scraped the lawyers' photos, you know, from their own biographies on those websites, and created this big ban list. [01:05:11] And when those lawyers tried to go to Madison Square Garden to see a Knicks game, or the Rockettes at Radio City Music Hall, or a Mariah Carey concert, they would be stopped at the door and turned away and told, you're not welcome here until your firm drops this suit or settles this suit or the suit comes to an end.
[01:05:30] And so it just shows you how powerful this technology could be in the hands of corporations, for example, companies who want to know who you are the minute you're walking through the door. [01:05:41] Yes, it's like, I mean, you could go either way with it, but, you know, think about after January 6th, all the proposed bans on anybody associated, not just with the protesters, the rioters, but with Team Trump. [01:05:56] Like, anybody with the last name Trump, anybody who's on the Trump team or in the administration, all banned. [01:06:02] Publishers were saying, no books by you. [01:06:04] Imagine that expanded to: you can't come in here for coffee. [01:06:09] You can't come in here to watch a Knicks game. [01:06:11] Like, it could go so far beyond that, and at least in the example I just raised, it would require a name, and the person would have had to have asked for something and submitted a record. [01:06:25] This is who I am and this is where I'm at. [01:06:26] It's not just, like, walking in for a cup of coffee, doing the things we do to exist. [01:06:32] And just your mere face, your face print, tells them so much about you. [01:06:38] It's like making yourself instantly famous, basically. [01:06:41] And you don't want to be famous. [01:06:43] You want to be a private civilian. [01:06:45] I do always feel funny talking about this topic with somebody like you, for whom, you know, the ship has sailed. [01:06:52] In most places where you go around, they know that you're Megyn Kelly. [01:06:56] But the rest of us still have a certain degree of anonymity as we move through the world. [01:07:01] And my big fear with facial recognition technology is that it brings an end to that, and that it takes all the information that, over the last couple of decades that we've been online, we've been putting out there about ourselves, that has been collected without our knowing it.
[01:07:18] There's all these dossiers now that basically exist for all of us, and that could just be attached to us in the real world, so that our face becomes the token to access all of this information about you all of the time. [01:07:33] And it can be used to judge you in all kinds of ways. [01:07:35] Yeah. [01:07:35] I mean, whether you're liberal or you're conservative, or you're, you know, rich or you're poor, or who you work for. [01:07:46] These guys who started this firm, they're more right-leaning. [01:07:50] I guess if you had to put money on who they would want to use it against if they really went nefarious, it would be against liberals, because, like, one of the big investors is Peter Thiel, who's a conservative investor. [01:08:01] And the other guys, as I understand it, they met at the Trump convention in 2016, the Republican National Convention. [01:08:08] Like, these are more right-leaners. [01:08:09] Yeah, they met before the Trump convention, but apparently the kind of idea for Clearview first came about when they were at the Trump convention. [01:08:18] They were thinking, wow, there's all these strangers here. [01:08:21] You don't know who anyone is. [01:08:23] Wouldn't it be nice if you had some kind of tool, an app on your phone, where you just kind of pointed it at somebody and it told you, gave you an indication, you know, who this is. [01:08:32] Are they a friend? [01:08:33] Are they a foe? [01:08:33] Are they somebody I should get to know? [01:08:35] So even in the beginning of this company, they were thinking about this kind of use in this kind of more polarized world, you know: who's on my side? [01:08:45] Who's not on my side? [01:08:47] Yeah. [01:08:48] I mean, it's very scary. [01:08:50] It could be used against everyone, depending on whose hands it would fall into. [01:08:55] And let's not kid ourselves, that's the concern.
[01:08:57] It's that this Clearview AI will not be the only company using it. [01:09:01] It's going to be widespread. [01:09:03] And there's a real question about whether it can be stopped at all. [01:09:05] This really could be our future. [01:09:07] In 10 years, everyone might have it. [01:09:09] And truly, privacy may be a thing of the past. [01:09:11] I think one of the executives said that to you, like, it's over, Kashmir. [01:09:17] There is no privacy. [01:09:18] Those days are over. [01:09:20] Yeah. [01:09:20] I mean, that was one of the investors in Clearview AI. [01:09:24] When I first started tracking down the company, I went to his door, and he ended up letting me in, in part because I was quite pregnant at the time and I said, oh, you know, I've come all this way. [01:09:33] He offered me water. [01:09:35] We sat down. [01:09:36] And he was talking about how excited he was about Clearview AI as an investment, and that he hoped that in the future, the same way you Google someone's name, you would Clearview someone's face. [01:09:49] And he said, you know, right now, Clearview is just selling it to police departments, but his hope was that they would start selling it to everybody, that it would be an app on everybody's smartphone. [01:09:59] And I said, you know, that seems kind of alarming to me, this idea that we wouldn't have the right to be anonymous anymore. [01:10:06] And he said, yeah, I realize it's dystopian, but I just think that's the nature of technology, that it's, you know, eroding privacy and there's nothing we can do about it. [01:10:15] And, you know, I think tech companies selling these kinds of tools, that's what they want people to believe. [01:10:20] You know, give up. [01:10:21] There's no hope for your privacy. [01:10:23] Just accept this. [01:10:24] But I think that we still can protect it. [01:10:26] And there are examples in the past of times that we've done that.
[01:10:30] So I remain optimistic that we might still preserve a bit of our anonymity. [01:10:34] Well, and you've got the one state of Illinois, which is like the one state that's done something to protect its citizens from this kind of technology. [01:10:41] We can talk about that in a minute. [01:10:43] But let's just spend a minute on this. [01:10:44] So finally they did. [01:10:45] You were pregnant. [01:10:47] You did what a good reporter will do, which is somehow got yourself in the door. [01:10:51] And eventually they started talking to you. [01:10:53] And there's a very interesting guy behind the technology. [01:10:58] What's his name? Hoan? [01:11:00] Hoan Ton-That. [01:11:02] Hoan Ton-That. [01:11:03] Okay. [01:11:03] Who is Hoan Ton-That? [01:11:06] Hoan Ton-That is the kind of technical mastermind behind Clearview AI. [01:11:11] He grew up in Australia and was always really interested in computers, technology. [01:11:16] At 19 years old, he drops out of college in Canberra, Australia, and he moves halfway around the world to San Francisco, kind of chasing the tech dream. [01:11:28] And at first he was creating Facebook quizzes and iPhone games and not really having a lot of success. [01:11:37] And eventually he ends up moving to New York, kind of falling in with this more right-leaning crowd. [01:11:42] And then he goes and creates this incredibly powerful technology, Clearview AI. [01:11:50] And, you know, at first the company did not want to talk to me. [01:11:53] Eventually they came around, and I've actually spent a lot of time with Hoan Ton-That, a really interesting character. [01:11:59] And I asked him, you know, how did you go from building iPhone games to this incredibly powerful, you know, potentially world-changing technology? [01:12:09] And he said, I was standing on the shoulders of giants. [01:12:13] He said at the beginning, he just went onto Twitter and followed machine learning experts.
[01:12:17] He went onto GitHub, this kind of place where computer scientists share code, and he looked up facial recognition. [01:12:25] And we both started laughing when he was telling me this. [01:12:27] He said, I realize it sounds like I Googled a flying car and then I built one. [01:12:32] But Clearview's journey, Hoan Ton-That's journey, really reveals what has happened in technology, which is that there has been a lot of sharing and open-sourcing of these AI tools. [01:12:46] And it allowed him and kind of a ragtag band of people to create a really, really powerful technology. [01:12:53] And at first I thought that Clearview had had this kind of technological breakthrough to create this tool. === Facial Recognition Search Errors (09:20) === [01:13:00] But in my reporting for the book, I discovered that, as you were saying before, Facebook and Google had both created technology like this internally and decided not to release it because they thought it was too dangerous. [01:13:13] And so what Clearview had done was more of an ethical breakthrough than a technological one. [01:13:18] Ethical breakthrough, if that's a nice way of putting it. [01:13:21] A disregard, perhaps. [01:13:23] But again, I'm not against the company. [01:13:25] I love what they're doing in law enforcement. [01:13:27] The scary thing is when it goes beyond that. [01:13:29] So let's spend a minute on what they are doing for law enforcement, because when I get to the stories about them nailing pedophiles, I'm cheering on my feet. [01:13:37] And they are using this and have found pedophiles through some incredible detections of, as I was kind of saying, not even people who are front and center in various photographs. [01:13:51] Yeah, I talked to a Department of Homeland Security agent, and this was actually the first time the Department of Homeland Security had used the tool.
[01:14:00] He had this case where they had come across an image of abuse in the Yahoo account of somebody who was based outside of the country. [01:14:12] And he had the photos and he's thinking, what do I do? [01:14:17] How do I solve this case? [01:14:20] And so they did a screenshot of the abuser's face, and he sent it to other child crime investigators. [01:14:30] And he said, hey, does anyone else recognize this person? [01:14:33] Have you seen them in other photos? [01:14:35] They knew it was somewhere in the U.S. because of an electrical outlet. [01:14:39] They could see it was a U.S. outlet. [01:14:41] So they knew it was somewhere in the United States. [01:14:43] And there was another agent who had access to Clearview at that time. [01:14:46] And so she ran the photo and sent him one of the results, which was an Instagram photo. [01:14:52] And when he first looked at it, he didn't see the abuser's face. [01:14:56] And he said, oh, he's not here. [01:14:58] And the other agent told him, look in the background. [01:15:01] And in the background of this Instagram photo is this guy standing at a booth. [01:15:06] And that wound up being the person. [01:15:09] And so this investigator followed these breadcrumbs, figured out where he worked, figured out who he was, figured out where he lived in Las Vegas. [01:15:17] And they wound up arresting him. [01:15:18] And he's in jail now, and, you know, they got that child out of his reach. [01:15:25] And the Department of Homeland Security, ICE, it was the unit he was part of, they said, we have to have this tool. [01:15:31] And they wound up signing up for it. [01:15:33] And so the Department of Homeland Security now does have this contract with Clearview that they've had for a few years now. [01:15:39] And he said, you know, we just never would have found that guy without this technology. [01:15:43] So you can see why this is so appealing to law enforcement.
[01:15:46] You know, when you only have a face to go on, this is something that could help you solve that case. [01:15:52] So this is, like, a guy who wasn't necessarily even posing for, not the illegal photo, but the other photo. [01:15:59] He just happened to be caught in the background of someone's photo that's available on the internet, or on Facebook, or in some place where Clearview searches. [01:16:07] And he made the mistake of having a picture of him committing this disgusting sin and crime in a picture that was on his email, or however law enforcement got its hands on it to begin with. [01:16:17] And they matched it just from him being in the background of a, I mean, that's amazing. [01:16:24] What you point out in the book, though, and it kind of got me thinking about it, is that on the privacy front there, even with law enforcement, they're doing something that is, you know, a little disconcerting, which is that you and I are potentially in that photo lineup, right? [01:16:43] Like, our face prints, maybe not us specifically because we're female, but just in general, maybe all American men were in that photo lineup without really consenting to be in the photo lineup with their, quote, face print. [01:16:59] Right. [01:16:59] I mean, Clearview AI says that they now have 40 billion photos in their databank. [01:17:06] And so I know for a fact that you and I are in that, you know, in that database. [01:17:11] And so every time somebody does run a Clearview search, they are searching all of those photos. [01:17:17] They are searching through all of our faces, essentially, for a match. [01:17:21] And so that worries some constitutional experts. [01:17:24] They say, hey, you know, I think if the United States government built this, you know, we would probably be fighting back. [01:17:31] This seems almost unconstitutional, you know, that we're all part of every search that's being run through this tool. [01:17:37] And it can go wrong.
[01:17:40] You know, there have been a handful of cases now where people have been arrested for the crime of looking like someone else, because we're not all unique snowflakes. [01:17:49] Some of us look similar to other people. [01:17:51] And so there is this concern that if you act on the facial recognition search alone, you might end up arresting the wrong person. [01:17:59] Beyond that, though: [01:18:00] I would imagine they require more than that, right? [01:18:03] It can't just be only Clearview recognition. [01:18:08] So, hopefully, ideally, if the police are doing it right, it won't just be Clearview identification. [01:18:14] But I have written about cases where they have arrested people based on not much more evidence than that. [01:18:20] A guy was arrested in Atlanta for, essentially, shoplifting purses in and around New Orleans. [01:18:30] And he was arrested. [01:18:32] He's like, why am I being arrested? [01:18:34] He got pulled over. [01:18:35] Why am I being arrested? [01:18:37] And they said, oh, for larceny in Jefferson Parish. [01:18:41] He said, where's Jefferson Parish? [01:18:42] They said, it's in Louisiana. [01:18:43] He said, I've never been to Louisiana. [01:18:46] But he ended up getting arrested and spending a week in jail before they realized they had the wrong person. [01:18:54] He did look a lot like the offender, and they had done a Clearview search. [01:18:58] He'd come up as a match. [01:19:00] And when the police looked at his Facebook profile, they saw that he had a lot of friends in New Orleans. [01:19:05] So, based on very little information, I mean, not enough. [01:19:09] No one should be arrested based on that. [01:19:10] But he had been arrested. [01:19:11] And so, sometimes these things do go wrong because of confirmation bias, automation bias, where the police just rely too heavily on this high-tech tool, which does seem so amazing and often is. [01:19:25] Like, how accurate is it then?
[01:19:28] Because, you know, I've told the audience before, you know, of all the weird conspiracy theories that are out there, and there are plenty of weird ones. [01:19:35] One of the weird ones that's out there is that I am Nicole Brown Simpson, like, either reincarnated or she never died. [01:19:44] I'm not exactly sure how it works, but somehow we're the same person. [01:19:48] So, truly, like, what about two people who look very similar in certain of their features? [01:19:54] Is Clearview generally very good at distinguishing between two similar-looking people, or not so good? [01:20:03] Well, there is a federal lab called the National Institute of Standards and Technology, or NIST, that tests all the facial recognition algorithms, or runs these tests periodically. [01:20:14] And a lot of these algorithms now are incredibly accurate, like, they're 99% accurate. [01:20:20] But it does depend on, you know, the quality of the image that you run. [01:20:25] Is it a grainy, you know, still from a surveillance tape? [01:20:29] It might not work as well under those circumstances. [01:20:33] Anecdotally, you know, I have seen it work quite well when Hoan Ton-That has run Clearview searches on my own face. [01:20:41] There are no doppelgangers who come back; it is me, it's photos that I've put out there, it's me at concerts in 2005 in the background of other people's photos. [01:20:54] Really? [01:20:55] Yes, there was a photo of me in the background of someone else's. [01:20:59] It was, well, there was a woman walking by in the background of someone's photo. [01:21:03] And at first, I didn't think it was me until I recognized my coat that I had bought at a vintage store in Tokyo. [01:21:08] It was so unique, it had to be me. [01:21:10] I mean, Hoan Ton-That said, it's a time machine I invented, and it really is.
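[Editor's note] The doppelganger and image-quality concerns raised in this exchange come down to thresholds: a face search always ranks whoever scores closest, and with a noisy query, a different person's face print can clear a loose cutoff, sometimes by a higher margin than the right person's. A toy sketch with made-up three-number "face prints" (real ones have hundreds of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face-print vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Invented vectors for illustration only.
suspect      = [0.90, 0.10, 0.30]   # face print from a sharp photo
lookalike    = [0.85, 0.15, 0.33]   # a different person who looks similar
grainy_query = [0.80, 0.20, 0.35]   # noisy print from surveillance footage

THRESHOLD = 0.98  # a loose "possible match" cutoff

for name, face_print in [("suspect", suspect), ("lookalike", lookalike)]:
    score = cosine_similarity(grainy_query, face_print)
    print(name, round(score, 3), "match" if score >= THRESHOLD else "no match")
```

With these numbers, both people clear the threshold, and the lookalike actually scores higher than the real suspect, which is exactly how a wrong-person arrest like the Jefferson Parish case can start when a match is treated as proof of identity.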
[01:21:16] I mean, it was incredible that it was able to connect my face to me, you know, in profile, in 2003 or 2004. [01:21:23] I mean, it's kind of astounding how well it can work under the right conditions. [01:21:28] Oh my gosh. [01:21:28] All right. [01:21:29] So that's one thing. [01:21:30] I mean, maybe people out there are feeling less safe than I am knowing that it's in the hands of law enforcement right now across the country. [01:21:38] Many law enforcement divisions already have this on you. [01:21:42] I'm still, my default is generally to be trusting of law enforcement. [01:21:46] I have a cop in the family. [01:21:47] I don't know. [01:21:48] But I feel very much less secure when it comes to private citizens having this stuff, because it's not even just, like, Megyn Kelly. [01:21:57] Okay. [01:21:57] Anybody can Google that, and see Kashmir Hill, you know, what comes up about Kashmir. [01:22:02] It's so much more. [01:22:03] Like you say, it's, like, private photos that you had no idea about. [01:22:07] Maybe whatever. [01:22:08] Maybe you went to some march at one point, or who knows what you did [01:22:12] when you were a stupid kid, that now private citizens or your enemies know about and could use against you. === Privacy Protections By Address (13:13) === [01:22:20] Or, like, the thing I worry about is the freaks out there, people who are crazy, who just want information on you that you would never voluntarily give. [01:22:28] Now they've got it. [01:22:29] And maybe they even have photos. [01:22:32] Like, for example, in my case, I don't publish any of my addresses, for very obvious reasons. [01:22:38] But what if my neighbor was out playing football with their kids on the front lawn and I got caught in the background of one? [01:22:45] You know, now am I going to have to run out there and be like, give me that photo, you know, delete that photograph, right?
[01:22:50] And you can't do that. [01:22:51] But I could be there, and I could be in front of my house, and now it's identifiable, like, all this stuff. [01:22:57] So let me take a quick break, and we're going to pick it up there and talk about that risk and the glasses, which is what got my attention to this whole case to begin with. [01:23:07] More with Kashmir Hill right after this. [01:23:10] I'm Megyn Kelly, host of the Megyn Kelly Show on SiriusXM. [01:23:14] It's your home for open, honest, and provocative conversations with the most interesting and important political, legal, and cultural figures today. [01:23:22] You can catch the Megyn Kelly Show on Triumph, a SiriusXM channel featuring lots of hosts you may know and probably love. [01:23:29] Great people like Dr. Laura, Glenn Beck, Nancy Grace, Dave Ramsey, and yours truly, Megyn Kelly. [01:23:36] You can stream the Megyn Kelly Show on SiriusXM at home or anywhere you are. [01:23:41] No car required. [01:23:42] I do it all the time. [01:23:44] I love the SiriusXM app. [01:23:46] It has ad-free music, coverage of every major sport, comedy, talk, podcasts, and more. [01:23:52] Subscribe now and get your first three months for free. [01:23:55] Go to siriusxm.com/mkshow to subscribe and get three months free. [01:24:01] That's siriusxm.com/mkshow, and get three months free. [01:24:06] Offer details apply. [01:24:13] Now, I've been thinking about this, including yesterday, when we had on Nancy Grace and we were talking about the Bryan Kohberger Idaho murders case. [01:24:21] And I want to play you this clip, because it relates to some of this technology, potentially, where she was stating her suspicions. [01:24:30] She made it clear this is her view about Bryan Kohberger and who he is, and then offered some facts about what we know about some of his past interactions with women. [01:24:41] Listen to this soundbite.
[01:24:42] And you know another thing which is not going to get brought up at trial, I guarantee you, because the defense is going to argue it's too incendiary and prejudicial, blah, blah, blah: incel, the theory that Bryan Kohberger is, in fact, an incel, an involuntary celibate, and hates women because he can't be with women. [01:25:01] And remember, he was also banned from a bar because he would go up to women and say things like, what's your home address? [01:25:08] I would run for the hills as if I had seen a monster. [01:25:11] If some creepy dude comes up to me at a bar and says, what's your home address, lady? [01:25:15] Uh-uh, N-O. [01:25:16] He had to get thrown out of that bar. [01:25:18] So if that happens to you in today's day and age, and you're not a public figure, right, like I am, you're usually fine, because the person doesn't know your name. [01:25:28] They can't Google you. [01:25:30] And you do have to spend some money and make some effort to find somebody's address, even a private citizen's, today, but it's knowable. [01:25:38] I mean, let me tell you, people, if you've gone down to the DMV and given them your real address, like most people do when they go to the DMV, because you register to vote and all that, it's findable on the internet for $40. [01:25:48] It's very easy to find somebody's home address. [01:25:50] Very easy. [01:25:51] So it's creepy to think that, you know, now, if this technology is available to private citizens, that bar encounter takes on a whole new meaning, because he'll know who you are like that. [01:26:06] He'll have your name. [01:26:08] He'll know you went to the concert, you know, where you were wearing the jacket, from Tokyo, you know, 20 years ago. [01:26:14] He will have so much information about you instantly.
[01:26:18] And that brings me to the glasses, because it's not going to require a big mainframe computer for him to get that info about you under this Clearview technology. [01:26:30] Yeah, I mean, so Clearview is working on these augmented reality glasses. [01:26:36] They had funding from the Air Force to develop them. [01:26:38] They would be used at military bases so that soldiers could theoretically identify threats from very far away. [01:26:45] But yeah, I mean, you can imagine this world in which maybe we do all start wearing augmented reality glasses. [01:26:52] And with tools like Clearview, you might be able to identify the people around you in real time. [01:26:59] I hate to tell you, we are already in that world to a certain extent. [01:27:03] Clearview has limited its tool to law enforcement and the government, but there are other copycat companies that have created the same kind of technology as Clearview. [01:27:14] Their databases aren't as big, but they're on the internet right now. [01:27:19] Sites that you can use for free, sites that you can pay a subscription to, where you upload a photo of somebody and it will show you other places on the internet where their photo appears, where you might be able to find out what their name is, you know, where they live. [01:27:33] So this is not a future scenario. [01:27:36] This could happen to you in a bar, you know, tonight, where somebody walks up to you. [01:27:43] They are creepy. [01:27:44] You never want to see them again. [01:27:46] They surreptitiously take your photo, and all of a sudden they could know who you are. [01:27:51] I mean, I do think that that is a very scary scenario. [01:27:56] On the other side, maybe you're talking to somebody who seems great. [01:27:59] They're saying all the right things.
[01:28:01] You take their photo, you look them up, and all of a sudden you see that they have this criminal record, or they have this online reputation that you find really disturbing, and you want to walk away. [01:28:13] So it's like technology in so many ways. [01:28:16] It's just this double-edged sword. [01:28:17] There are positive use cases and negative use cases. [01:28:20] And it really is about who's using it and how they're using it. [01:28:24] Oh my gosh. [01:28:25] I mean, I'll tell you this. [01:28:27] The one thing: you should not put your home address on your driver's license or give it to the government. [01:28:32] Get a PO box. [01:28:33] Just get a PO box. [01:28:34] It's a bigger pain in the ass to get your mail, but it will put a layer between you and, I mean, look, I've done it because I'm well known, but everyone's well known now. [01:28:46] There are no more civilians with technology like this out there. [01:28:50] So take those steps before it becomes a problem in your life. [01:28:54] I mean, you have to do it preemptively, before the weird guy is trying to find you. [01:28:59] It's just so dark. [01:29:00] I don't know. [01:29:01] I know you write about this in the book, but it's very much like Minority Report, where everything about us is out there. [01:29:07] There's that scene in Minority Report where Tom Cruise is walking through the shopping mall and all the ads are personalized to him, because they can see, I don't know if it's his iris or his face, but this is, the future's here. [01:29:20] And we cut it. [01:29:20] Here it is, just to remind those who haven't seen the movie in a while. [01:29:25] A road diverges in the desert. [01:29:28] Lexus. [01:29:29] The road you're on, John Anderton, is the one less traveled. [01:29:46] John Anderton, you could use a Guinness right about now.
[01:29:51] Get away, John Anderton. [01:29:55] Forget your troubles. [01:29:59] So I see that, Kashmir, and the only thing I think is, how do I opt out? [01:30:03] How do I say, I don't want them doing that to me? [01:30:06] I want to opt out somehow. [01:30:08] So can we opt out? [01:30:10] Yeah, it's funny. [01:30:11] I think that we see that movie that way, and people working in technology see it as what they aspire to do. [01:30:19] Hashtag goals. [01:30:22] So there are ways to opt out. [01:30:25] It depends on where you live. [01:30:27] Basically, your face has different privacy protections depending on your address. [01:30:33] So Clearview AI, for example, will allow you to get out of their database if you live in a state that has a privacy law that requires them to delete information about you. [01:30:45] And there's just a handful of states that have those privacy laws. [01:30:48] California, Colorado, Connecticut are examples. [01:30:53] If you live in those places, you can go to Clearview AI's website, and you'll have to upload a photo of yourself, and you'll be able to see your report, see what's in their database about you. [01:31:08] And then you can tell them to delete you. [01:31:11] And the same goes, I think, for Illinois. [01:31:14] We talked about how Illinois has this law. [01:31:16] It's a very unique law to protect people. [01:31:19] But yeah, if you live in Illinois, it says that companies can't collect your faceprint, collect your biometric information, without your consent, or they have to pay a very hefty fine. [01:31:32] So we talked earlier about Madison Square Garden and how they ban lawyers. [01:31:36] The company that owns Madison Square Garden is doing that at all their venues in New York City, but not at their theater in Chicago, because they would need the lawyers' consent to have their faceprints and ban them from coming in.
[01:31:51] And yeah, some of these other tools I talked about that are on the internet right now, where you can do this. [01:31:57] A lot of them have opt-outs. [01:31:59] But again, you have to submit your face, kind of tell them who you are, in order to get out of their databases, which not everyone is comfortable doing if you care about privacy. [01:32:09] Right. [01:32:09] It's like I was saying: when they make you enter your email to unsubscribe, and you're like, well, wait a minute, you emailed me to begin with. [01:32:16] So you have my email. [01:32:17] So what is this? [01:32:18] What am I being asked to enter into here? [01:32:20] It feels like our relationship is getting stronger, not weaker, which is my goal. [01:32:27] Okay, so I like that. [01:32:28] I like the potential of that. [01:32:30] I know that my friend, and a man I deeply admired, admire currently, argued this case on behalf of Clearview: Floyd Abrams, of Pentagon Papers fame. [01:32:43] He's the father of Dan Abrams, who's also a friend. [01:32:46] Floyd's 84 years old. [01:32:47] He's a giant in legal circles. [01:32:50] So Clearview hired him to go in and argue against the ACLU, which sued them over this. [01:32:56] And my pal Floyd, I guess either he didn't win or it wasn't looking like he was going to win. [01:33:02] And what happened? [01:33:03] So yeah, they hired Floyd Abrams because he is the expert on the First Amendment, which is the right to, you know, freedom of the press, freedom of information. [01:33:13] And Clearview was making this argument that they have a First Amendment right to collect public information that's on the internet and analyze it. [01:33:24] And they said they're just like Google, you know, they're just scanning the internet and collecting it and organizing it. [01:33:30] And instead of organizing it, you know, by name, they're organizing it by face.
[01:33:35] And so yes, Floyd made this argument in a few of the different lawsuits, and there have been quite a few against Clearview AI, including in Illinois, where the ACLU sued them. [01:33:46] And the case there didn't go all the way. [01:33:51] It wound up settling, but the judge said, no, the First Amendment is not going to protect Clearview AI. [01:33:57] The state of Illinois still has the right to say that you're not allowed to do this particular thing with somebody's faceprint. [01:34:04] And so that suit did settle, with Clearview agreeing in the future to only sell this database, you know, of billions of photos, to law enforcement, to the government. [01:34:16] And they said they won't sell it to private entities. [01:34:18] They won't sell it to individuals. [01:34:21] So the ACLU saw that as quite a win. [01:34:23] And Clearview saw it as a win, too, because they said, that's what we're already doing. [01:34:26] And that's just what we'll continue to do. [01:34:29] I wonder if it could catch, like, you know, one of those actresses who's had all that work done. [01:34:34] You know, like Melanie Griffith, or like the Meg Ryan of, you know, You've Got Mail versus the Meg Ryan of today. [01:34:41] Would it notice, you know? Like, are the criminals going to start getting plastic surgery to get past this? [01:34:47] Someone's going to come up with a moisturizing cream that dulls the lens, you know, that Clearview would put on you. [01:34:52] I don't know. [01:34:53] There's going to be some technological advancement, probably, to counteract the creep in the bar with the glasses, don't you think? [01:35:02] Maybe those 3D plastic masks that you pull over your head, like in Mission: Impossible. [01:35:09] It can be hard to evade facial recognition. [01:35:14] I did this experiment with some of my colleagues at the Times on one of these sites that's available to anyone to use, a site called PimEyes.
[01:35:22] And, you know, we uploaded photos where somebody's face was half covered, like with a COVID mask, their nose, their mouth, and it was still able to recognize them and find photos of them. === Evading Facial Recognition Systems (01:56) === [01:35:33] I mean, it is astounding how powerful facial recognition has gotten. [01:35:37] So it can be hard to evade it. [01:35:39] It is possible. [01:35:40] I talked to one lawyer who managed to evade the ban and go to a Knicks versus Cavs game at Madison Square Garden, even though she was on the list, by wearing a baseball cap, glasses, and a COVID mask. [01:35:54] That was enough to get through MSG security. [01:35:56] The lawyers. [01:35:58] All right, we only have like 30 seconds left, but is there anything you want to flag for us that we need to be worried about, in addition to all the stuff we've already discussed? [01:36:06] I guess just, knowing that this power is out there now, think about the photos that you do put online and whether they need to be public photos, or whether you want them not to be on the internet. [01:36:20] Or if you do post them, make them private, so that these companies, and there are more and more of them out there, aren't out scraping them and using them in ways that you wouldn't want or didn't expect. [01:36:31] You know, it's like yet another reason not to put your kid on the internet. [01:36:34] Do not put your kid's face all over the internet. [01:36:37] Be careful. [01:36:38] You don't know how those photos are going to come back to haunt him or her. [01:36:42] What a fascinating discussion. [01:36:44] Kashmir Hill, the book is Your Face Belongs to Us. [01:36:49] You'll learn a lot. [01:36:49] You'll be fascinated. [01:36:50] It's a quick, easy read. [01:36:51] Thank you so much for writing it. [01:36:53] Thank you, Megyn. [01:36:55] Wow. [01:36:55] Okay. [01:36:56] I want to tell the audience, we're going to be back on Monday with Maureen Callahan.
[01:36:59] She's going to come here, inside the studio, and we're going to talk to her about some of the latest shenanigans. [01:37:05] The Royals, and did you see the clip that's going around about Madonna? [01:37:10] I'm dying to talk to Maureen about this, among other things. [01:37:13] And we're also going to have the head of StopAntisemitism here to unveil their Antisemite of the Year. [01:37:20] Have a great weekend, everyone, and we'll see you then. [01:37:25] Thanks for listening to the Megyn Kelly Show. [01:37:27] No BS, no agenda, and no