The Megyn Kelly Show - 20220428_elite-panic-over-alternative-media-power-and-press Aired: 2022-04-28 Duration: 01:10:01 === Exclusive Feed Content (01:51) === [00:00:15] Welcome to the Megyn Kelly Show, your home for open, honest, and provocative conversations. [00:00:27] Hey, everyone, I'm Megyn Kelly. [00:00:28] Welcome to the Megyn Kelly Show. [00:00:30] In just a second, I'm going to be joined by Matt Taibbi. [00:00:33] But first, I wanted to tell you that today we uploaded exclusive content on our podcast feed for all of our podcast listeners. [00:00:42] It has insider details on our show. [00:00:45] My thoughts on how our first 18 months on the air have gone. [00:00:49] We've done 18 months of the podcast, and I don't know how many months since September now with Sirius. [00:00:57] And we've included a story about a recent interview we pulled. [00:01:01] We did not air it because the guest, who champions themselves as a warrior for free speech and non-divisive politics, right? [00:01:11] Lifting themselves above divisive politics, demanded an edit right after the interview that we refused to comply with. [00:01:18] Turns out the guest wasn't quite as courageous as they wanted our audience to believe. [00:01:23] And we'll get into what happened after that. [00:01:26] So you might find it interesting. [00:01:28] It's short and sweet, and I think you'll enjoy it. [00:01:31] Anyway, you can find the podcast on Apple, Spotify, Pandora, Stitcher, or wherever you get your podcasts for free. [00:01:36] And while you're there, you will find over 300 episodes. [00:01:40] Our archives are all on our podcast feed. [00:01:43] So go ahead and download, enjoy, follow the show so you can listen whenever you want, [00:01:48] if you're not able to catch us live on SiriusXM Triumph Channel 111. Joining me now, one of my favorite guests, Matt Taibbi, editor of the TK News Substack. [00:02:03] Matt, great to have you back. [00:02:04] Thanks for having me, Megyn. 
[00:02:06] All right, so let's start with. === Government Speech Censorship (15:18) === [00:02:07] I'm sure this is going to come as a great comfort to you that now the Department of Homeland Security is going to crack down on disinformation on the internet. [00:02:16] Okay, that's one of their homelands. [00:02:17] These guys used to be protecting us from radical Islamists who wanted to unleash terror attacks on us here at home, or so they said. [00:02:25] And now they are attacking us as disinformation purveyors, the random American citizen who has access to a computer or a microphone. [00:02:34] And the chief, the chief of this effort, is going to be a woman named Nina Jankowicz, who will be the executive director. [00:02:41] As far as I can tell so far, her most notable accomplishment is calling the Hunter Biden laptop a, quote, fairy tale, a, quote, Trump campaign product. [00:02:52] And this is the person who now will be in charge of regulating what is and is not, quote, disinformation. [00:02:57] What do we make of it? [00:02:58] Well, look, this has been going on for five or six years now. [00:03:02] Ever since Donald Trump was elected, there's been a pretty concerted effort on the part of mainstream politicians, really in both parties, but particularly in the Democratic Party, to make the internet a place that will be under some kind of governmental control. [00:03:24] And this began in 2017, when we had members of the Senate calling up executives from Facebook, Google, and Twitter to the Hill and essentially demanding that they come up with strategies to prevent what they called the foment of discord. [00:03:42] Back then, butter. [00:03:45] Right, exactly. [00:03:46] The boogeyman back then was Russian disinformation. [00:03:51] Then it became hate speech. [00:03:52] Then it was disinformation about the pandemic. [00:03:57] You know, now we're circling back to Russian disinformation with regard to the Ukrainian conflict. 
[00:04:04] And, you know, I think the problem is we're in a generation of people who agree that there's a problem with disinformation in the media landscape, but who don't understand that the biggest lies are always official lies. [00:04:20] And the only real defense against that is free speech. [00:04:24] And so they want this top-down system of control, which I think is very, very dangerous. [00:04:31] That's so true. [00:04:32] If you heard the Barack Obama remarks from late last week, he was longing nostalgically for the days in which it was just ABC, NBC, and CBS, and information was controlled, right? [00:04:45] We didn't have all these internet hacks and trolls out there pushing so-called disinformation. [00:04:51] And I'm sure that was a much more delightful time for people in a position like he had at the White House. [00:04:58] But think of all the lies that have been told to us over the years by people in that post that the evening news, for whatever reason, went along with or had an incentive not to check too far into. [00:05:08] And today it's no different. [00:05:09] You know, today it's like, okay, do we not think that the people at Fox were manipulated by Trump? [00:05:15] Do we not think that the people at every other network are manipulated by Biden? [00:05:19] Like, that's the way it works. [00:05:21] Right. [00:05:22] But the problem is, people have an alternative now. [00:05:24] They have a way to get around that, which they didn't have before, as President Obama noted. [00:05:30] And I remember this pretty vividly because I was a campaign reporter in 2004 and 2008, back in those alleged salad days, or I guess towards the end of them. [00:05:44] And I would be on the bus listening to journalists talk about which candidates they thought were serious or electable and which ones weren't. 
[00:05:55] So, you know, you'd be in a bus full of CNN and Fox and MSNBC anchors, and they'd be scoffing at Dennis Kucinich, saying, no, we're not going to take him seriously. [00:06:06] And then there would be some other candidate like John Kerry, like, oh, he's electable. [00:06:10] You know, and they made those judgments, and they were important judgments, because, you know, what they signaled to audiences back then had an enormous impact on how people, voters, behaved at the ballot box. [00:06:24] It's different now. [00:06:25] Like, you know, ironically, Barack Obama was a beneficiary. [00:06:29] He was one of the first people to lose the so-called invisible primary of the donors and still win the nomination. [00:06:40] But then when Trump broke through in 2016, that was really when the chokehold of those networks collapsed. [00:06:47] And they missed that. [00:06:48] They really did. [00:06:50] And so how do we think this is going to work? [00:06:52] I mean, this woman can't really crack down on anything. [00:06:56] Like, what, is the DHS going to come try to, what, censor what happens on your show, on this show, on your Substack? [00:07:05] Like, how on earth is this going to work? [00:07:07] I don't know. [00:07:08] I mean, but I think we've already seen that they'll go to pretty extraordinary lengths to try to have influence over information that's online. [00:07:20] We've seen in the last six years that there's been pretty extraordinary cooperation between the Senate, bodies like the CDC and the FDA, and platforms like Facebook, YouTube, and Twitter. [00:07:39] I mean, I did a story randomly about a podcaster who was having trouble with YouTube. [00:07:48] And when I called them up for comment and asked them how they decided what was misinformation and what wasn't, they just told me outright that they made those decisions in consultation with federal agencies. 
[00:08:02] So I think this is the world we're going to be living in, where we have basically a privatized speech landscape, but there are going to be political actors from the government influencing the moderation decisions of those platforms. [00:08:18] It's very, you know, we're getting a lot closer to blatant interference with First Amendment-protected speech. [00:08:25] Like, the reason Twitter and Facebook and YouTube can censor content is because they're not the government. [00:08:33] They're pretty close. [00:08:35] They're pretty close to having the power, and certainly the fingerprints of the government are all over their editorial decisions. [00:08:40] But technically, still, the law does not recognize them as the equivalent of, or certainly as, an actual government actor. [00:08:47] Not the case for DHS. [00:08:49] They're not allowed to censor our speech. [00:08:51] So I don't really know what they think they can do, but they may be sad to realize it's written right there in the First Amendment. [00:08:57] They're not allowed to censor speech. [00:08:59] There's a teeny tiny category they get to touch. [00:09:02] And the vast majority of what they're going to object to won't be in it. [00:09:06] That's true, but somebody still has to do the test case. [00:09:10] They have to file that lawsuit and win that First Amendment case. [00:09:16] And, you know, who's going to do that? [00:09:18] The reality is these government agencies have already been meddling with speech on private platforms, whether it's the FDA and the CDC sort of encouraging platforms like YouTube to go by their guidelines in deciding what's misinformation and what isn't, [00:09:39] or it's the FBI, which has been in consultation with some of these platforms about things like hate speech and which groups might need cracking down on. [00:09:49] So, yes, you're right. 
[00:09:52] I think there's already a powerful First Amendment argument that they've crossed the line, but that has to be challenged. [00:09:59] And who's going to issue that challenge? [00:10:02] It's a very difficult road ahead. [00:10:04] I mean, possession is nine-tenths of the law. [00:10:07] And if they're already doing this, it has to be undone. [00:10:11] It doesn't matter if it's illegal. [00:10:14] Well, speaking of possession being nine-tenths of the law, people on the right half of the aisle, and not just that half. [00:10:19] It's not just all righties. [00:10:21] It's a lot of centrist lefties too, who are rejoicing that Elon Musk is taking over Twitter, is going to buy Twitter. [00:10:27] Or so it appears. [00:10:28] And there's a stock problem at the moment. [00:10:30] And people, we have to keep our eye on the Twitter shares and on the Tesla shares, because he does need money in order to buy it. [00:10:37] Peter Schiff came out and explained all that to us yesterday. [00:10:39] I don't totally understand it, but it's not a done deal. [00:10:42] It's not a totally done deal. [00:10:43] He's very, very rich, but he does need his actual $44 billion, or at least $21 billion of it, to buy it. [00:10:49] So they're happy, but he's already taking crap from, I mean, all corners of Twitter, and the Twitter top executives. The chief legal officer, the general counsel, was reportedly in tears upon learning on Monday that he was actually going to close the sale, you know, that they agreed to the sale. [00:11:06] And Elon, I guess, liked a tweet or took a shot at her yesterday and then took all sorts of crap in response, right? [00:11:18] Like, he, let's say, he shared a meme Wednesday that mocked her response to accusations of the company's political bias. [00:11:26] And it was this thing involving Tim Pool and Joe Rogan, when they all went on there, and how she's got the circular reasoning, and she denies viewpoint discrimination, but we all know they do it. 
[00:11:34] And this is the thing: you know, the Twitter CEO from 2010 to 2015 responds, what's going on? [00:11:41] You're making an executive of the company you just bought the target of harassment and threats? [00:11:45] Bullying is not leadership. [00:11:47] And Elon kind of defended himself, saying, what are you talking about? [00:11:50] So this is the thing. [00:11:52] He doesn't behave like your normal CEO or owner when he takes over these massive corporations. [00:11:58] He's being sued right now for some of the tweets he sent out about Tesla. [00:12:02] But this is how they get you. [00:12:04] You offered your opinion. [00:12:06] His opinion is that this woman is biased and has been wrongfully manipulating content on Twitter. [00:12:14] He said it publicly. [00:12:16] The way they shut you down is, how dare you? [00:12:18] How dare you? [00:12:19] Online harassment, sexual harassment, comments, threatening, right? [00:12:24] We talked about this yesterday with Vivek Ramaswamy, who said they smuggle in viewpoint discrimination through the guise of hate speech, threats, you know, words are violence, all those principles, which are in active force at Twitter. [00:12:43] Yeah, absolutely. [00:12:44] And I'm one of the people who's enjoying a schadenfreude moment this week, you know, watching the reaction to Musk potentially buying Twitter, because, first of all, [00:12:58] I'm one of the few reporters who for years now has been covering the phenomenon of content moderation, who took it seriously from the very beginning, warned that this was going to be a problem that was going to become a bigger and bigger part of American life. 
[00:13:17] And I was laughed at by, you know, a lot of my colleagues, particularly my left-leaning colleagues who I thought were free speech advocates, you know, after the Alex Jones episode, which I thought was deeply troubling, [00:13:31] not because I like Alex Jones, but because the precedent of big companies like, you know, Apple and Facebook and Spotify getting together and making sort of an ex parte decision to kick somebody off the internet, that was a radically different approach to policing speech than what I grew up with, which is: if you make a mistake, you get sued, and there's an open forum, and there's a process, and it's all transparent. [00:14:01] I think what, you know, what I said from the beginning is: the issue isn't who's being censored. [00:14:07] The issue is how it's being done. [00:14:10] And if you were in favor of a handful of mega-wealthy executives back then kicking people like Alex Jones, and, you know, eventually Donald Trump, off the internet, you can't now cry that there's a different billionaire who you just happen not to like sitting in that same chair and meddling with speech in a way that you don't like. [00:14:36] You had to have objected on principled grounds before, and they didn't. [00:14:40] And so, you know, look, I don't have any sympathy for these people. [00:14:44] They had a chance to stand up for something like speech principles once upon a time, and they didn't do it, because they wanted to censor people. [00:14:53] And now they're, now they're feeling, you know, a taste of their own medicine. [00:15:00] Okay, let's talk about the Alex Jones case for one second, because I, you know, reported on all of that and lived through that in a weird way myself. 
[00:15:09] But I know very well that Alex Jones was in a weird place versus most people who get targeted on the internet, because he had been serially unleashing very personal, in-your-home threats, pretty much on purpose, on the most sympathetic group of people in our country, namely the Newtown grieving parents. [00:15:38] So people who had their first graders shot to death in class kept getting harassed by his listeners, whom he kept telling to believe this was all a hoax, that it was made up. [00:15:51] And one family had to go into hiding. [00:15:54] It was found in court that they had been receiving death threats from this lunatic inspired by Alex Jones. [00:15:59] Many of them had been having to deal with the Alex Jones listeners for years in deeply painful ways. [00:16:06] And honestly, Matt, it was like, I don't think you can compare that to, you know, James O'Keefe with his secret camera getting, you know, dishonest New York Times reporters saying something in a bar one way versus what they put in the pages of the Times another way. [00:16:25] I just think he's in a class of his own. [00:16:27] And it wasn't just the Newtown families. [00:16:29] I could go down the list for you of people who have been actually hurt by people he intentionally inflamed. [00:16:35] It's much closer legally to what we know as incitement, which is not protected speech. [00:16:41] Well, I don't think anything that he said with regard to Newtown was protected speech. [00:16:47] And I said that at the time. [00:16:48] I also said I think it was probably pretty obvious that he violated the terms of service of each one of those platforms. [00:16:57] Again, I had no interest in defending Alex Jones on any grounds. [00:17:03] The issue for me had to do with the method, right? 
[00:17:08] So once upon a time, the way we would have dealt with speech like Alex Jones's is he would have been sued, and the penalties, the financial penalties, would have been so great that he probably would not have emerged with a career at the end of it. === Clear Line Between Threats (14:40) === [00:17:25] I mean, that's typically how those things happened at the time. [00:17:30] The litigation against him took a long time. [00:17:32] Right. [00:17:33] And people were impatient to go through that process. [00:17:37] And I understand that. [00:17:38] Like, look, as this is happening, none of those private businesses want to deal with that. [00:17:45] I totally get that. [00:17:47] The problem is that by doing this, they opened the door for a new kind of speech policing that didn't involve any kind of open and transparent process. [00:18:02] All these companies got together, clearly coordinated. [00:18:05] You know, they all did it at the same time. [00:18:08] And they essentially decided, you know, this person is no longer going to be on the internet. [00:18:13] So that opens the door. [00:18:17] The next thing is going to be O'Keefe. [00:18:18] And then before you know it, it's the Babylon Bee. [00:18:21] And that was the issue that I had. [00:18:24] No, I mean, that is what happened. [00:18:26] That's why it's hard. [00:18:28] That's why it's hard, because for years they'd been censoring these radical Islamists who wanted to show people how to build bombs and commit terrorist attacks. [00:18:36] And I've got zero problem with that censorship. [00:18:39] Go for it. [00:18:40] We don't need that shit on the internet. [00:18:41] And, you know, people will die as a result of that. [00:18:43] And that's indisputably not okay to me. [00:18:47] I don't see anybody defending that censorship by the big tech companies. [00:18:51] But that's not even censorship, because those things are actually against the law. 
[00:18:56] Like, you know, the authorities can come in and they can stop actual imminent incitement to violence. [00:19:03] Well, no, but what I'm talking about, if I just sit in front of the camera and I show you how to make a dirty bomb, that's not against the law. [00:19:11] No, it's not. [00:19:12] But, you know, neither is hate speech, and that's part of what the American experiment is all about. [00:19:18] They raised that bar very, very high for a reason. [00:19:21] If you go back and look at those cases, you know, in the Supreme Court, that decided what's legal speech and what is not, they made it pretty clear that they were willing to tolerate some pretty extreme stuff in order to protect the principle of free speech. [00:19:42] No, I agree with you. [00:19:44] But that's why this is so dicey. [00:19:45] Because if I were running YouTube, I would not allow that. [00:19:49] I would not allow videos of how to make a dirty bomb to be posted. [00:19:54] I mean, it's not unprotected. [00:19:57] That is protected speech. [00:19:59] But am I going to let somebody sit there and show people how to create that level of dangerous weapon that could kill a bunch of people on my platform? [00:20:07] I'm not, because I'm not a government actor and I don't have to. [00:20:10] So I would draw some lines, but I don't know. [00:20:15] And would I have allowed the Alex Jones speech against the Newtown families over and over and over? [00:20:21] I don't know. [00:20:22] I mean, it's very sticky, right? [00:20:24] Like, there are gradations on this. [00:20:27] And if you wind up canceling the Babylon Bee, you've gone too far. [00:20:31] If you cancel James O'Keefe, you've gone too far. 
[00:20:34] So I just feel like, why aren't there adults in the room who can distinguish between genuinely dangerous behavior that can and has gotten people hurt or killed, and these false claims of words are violence, you know, like claiming what was said about this Twitter general counsel is somehow the same as this other stuff. [00:20:59] You know, look, I'm about as close as one gets to a free speech absolutist. [00:21:05] But even I, you know, grew up understanding that there were whole ranges of things that, as a journalist, I can't say, right? [00:21:14] You know, we're trained that we can't commit libel, we can't incite people, we can't do a whole list of things, and we have to run things through lawyers before we publish and all that. [00:21:27] And that's not the case on the internet, and I understand that we have to come up with some kind of process for dealing with difficult speech. [00:21:36] My criticism throughout this period has been that a lot of the people who are looking at this problem, I don't think they're really interested in solving those difficult issues that you talk about. [00:21:50] Like, if you ask me, I think, you know, for something like Alex Jones or, you know, making bombs, [00:21:57] I think there should be some kind of transparent, open process where you get to actually see how these things are decided. [00:22:06] But what you've seen instead is a lot of politicians who seem very, very anxious to use the, you know, quasi-monopolistic power of these platforms to push speech in a certain direction. 
[00:22:21] They're attracted by that power, and that's where the danger is, because as soon as somebody sees that, oh wow, if I just flick a switch, this person's gone, they're going to be tempted to take the next step and find the next person they don't like, and that's how you end up with the Babylon Bee. [00:22:41] But here's the problem. [00:22:42] So, okay, let's say that they do. [00:22:44] They do make it more transparent. [00:22:46] You know, we're going to be more open about how we ban somebody or what have you. [00:22:52] They don't care. [00:22:53] They don't care about saying, you, Babylon Bee, said Rachel Levine is a man, and that's hate speech, that's harassing. [00:23:00] We have a policy against harassing someone based on their gender identity. [00:23:03] That's what you're doing, in our view. [00:23:06] And then flash to the trans person on their board who says, that's hateful, you have no idea the suicide rate, right. [00:23:12] You get that. [00:23:13] Therefore, you're banned. And I don't think they'd have any qualms about owning what we see as viewpoint discrimination, but what they see as just this universal non-bullying campaign. [00:23:27] Yeah, I agree. [00:23:28] I would just quickly like to point out that when they started this campaign, obviously a lot of the people who were sort of discriminated against first, and you talk about viewpoint discrimination, a lot of them were on the right, but a lot of them were on the left too. [00:23:45] I mean, some of the companies, the media outlets, that saw enormous drops in traffic when companies like Google were told that they had to prevent the foment of discord, they were outlets like Truthdig and the World Socialist Web Site and even Democracy Now, because the new algorithms essentially just favored large carriers over small ones. 
[00:24:11] So I just wanted to point that out. But no, I agree with you. I do think that there has to be some way to do this that mimics the effectiveness of the litigation-based system that we had dating back to New York Times v. Sullivan in 1964. [00:24:34] You know, as a journalist growing up in that era, I always felt like the system worked extremely well, because the rules were very clear about what we were and were not allowed to publish. [00:24:49] There was a pretty high bar that you had to meet to prove that somebody had committed libel or slander. [00:24:56] And yet, when there was a real egregious violation, it was usually, if not career-ending, close to it. [00:25:03] And you just raised a good point, though, Matt. [00:25:06] You raised a good point, because back in those days, you know, these are the Barack Obama golden days, [00:25:10] and in this way, I see the point. [00:25:13] There was a self-imposed high bar of class, of dignity, of not, you know, unfairly targeting one individual over and over, or creating a circumstance where somebody could literally get hurt. [00:25:25] You know, you wouldn't have had Alex Jones in print, you know, in the Times and in the Post. [00:25:32] Back then, those papers were more respectable. [00:25:35] Yeah, sure, they still had a left-wing bias, but it was nothing compared to what it is today. [00:25:39] You know, they were definitely more committed to trying to be fair. [00:25:44] And they would not have allowed these types of things to appear in their papers. [00:25:50] So it was sort of a better approach on both sides. [00:25:53] They were less censorious, but they had a higher bar for what could be printed and who could be targeted in the first place. [00:26:01] Well, that's what I, I mean, I think that's what we're all striving for: a system where there's kind of sensible self-censorship before you print something. 
[00:26:13] I mean, I think the processes that we went through before we published things in major magazines, I always thought that was a good process, that we weren't afraid to use strong language. [00:26:26] We weren't afraid to say things about people if we had a strong opinion. [00:26:30] But when it came down to facts, you know, we had to be accurate. [00:26:35] We had to check. [00:26:37] And if it was a close call, we usually erred on the side of caution and left it out of the paper, because the penalties were high. [00:26:46] Now, on the internet, there's nothing like that right now. [00:26:49] And there's plenty of people who don't do any fact-checking at all. [00:26:53] No, there's no fact-checking. [00:26:54] And this has bled into quote-unquote mainstream media, which has learned that its audiences now forgive mistakes as long as they're in the right direction. [00:27:08] So they're not careful anymore. [00:27:10] They make constant factual errors. [00:27:13] They don't worry about it. [00:27:14] They don't worry about being sued for libel nearly as much as they would have once upon a time. [00:27:20] Again, I'm not particularly sympathetic to Kyle Rittenhouse, but I was shocked by the way he was described in the first days of that story. [00:27:30] Like, there were major news outlets that were calling him a white supremacist. [00:27:34] The president was calling him that. [00:27:36] Or the, sorry, the future president. [00:27:39] And again, once upon a time, you would have needed something to go on in order to use that terminology. [00:27:47] And they didn't. [00:27:48] They just did it, because the landscape has changed so much. [00:27:53] So yeah, it's a big problem. [00:27:56] I understand that there has to be something done to fix all the craziness on the internet, or at least address it. [00:28:04] But what they're doing instead, I think, is they want to leave the system in place so that they can push speech in a certain direction. 
[00:28:14] That's the sense, the clear sense, that I get from them. [00:28:18] I mean, I think if you asked me to sit there and say, what's the difference between, you know, threatening messages that actually could harm, you know, physically harm somebody, forget emotional harm. [00:28:32] That's just, we can't deal with that. [00:28:34] But physical harm, I could tell the difference between a tweet that did that and something that just expressed a controversial view. [00:28:42] And so, you know, I feel like maybe what Elon Musk needs is people who are just less ideological, you know, people who are committed to free speech as a principle, but people who are reasonable and don't want to see, you know, people get hurt unnecessarily because you've got some lunatic on the internet continuing to dox somebody and call for violence or, you know, stretch it, come close enough to the line. [00:29:05] But they don't have ideological diversity at these companies. [00:29:09] And, you know, I've told my audience before, Matt, I went out to Silicon Valley in 2016. [00:29:14] It was 2016, right before the election. [00:29:16] And I met with the heads of a lot of these companies. [00:29:20] I was on the campuses. [00:29:21] I was meeting with the top executives. [00:29:23] And they wanted to know my thoughts on how they could do better at what they recognized as their own ideological bias. [00:29:30] And I told them all the same thing, which is: get more ideological diversity on your boards. [00:29:36] Get more ideological diversity in your C-suite. [00:29:39] And certainly, if you have any sort of a monitoring or a censorship group, make sure it's totally even, totally even, right? [00:29:48] You can't just have a bunch of people on one side of the aisle making all these calls and not expect that to be reflected in your decision-making. [00:29:55] And guess what? [00:29:56] Nobody listened to me. [00:29:59] Well, of course. [00:30:00] Yeah. [00:30:00] No. 
[00:30:01] And again, I think this gets to the fact that although some people asked you for your advice, mostly people don't want to do that kind of self-reflection. [00:30:13] Mostly they want to exercise that authority in a certain way, which is unfortunate. [00:30:21] There is a clear line between threats and opinion. [00:30:23] There's lots of stuff that's already illegal that's allowed on these platforms. [00:30:30] And the platforms would do well if they just focused on, well, let's eliminate the stuff that's already against the law. [00:30:39] Let's try to cut down on libel. [00:30:41] Let's try to cut down on threats, because those are already against the law, right? [00:30:45] We don't need a special new policy to deal with that. [00:30:52] There are laws about that. [00:30:54] Where they get in trouble is where they try to establish things like factual truth and say that something is disinformation or misinformation, because that's a moving target that you basically can't get right in a way that's going to be fair. [00:31:13] And, you know, or if they're trying to define something that's an opinion as being beyond the pale and abusive and hurtful. Like, hurtfulness isn't a standard that can be applied in any way. [00:31:27] I think that's rational. [00:31:30] I agree. [00:31:31] It just can't hold up as something that's consistent. [00:31:36] That's why it's like, okay, well, what is bullying? [00:31:38] Perhaps if there's some large campaign, you know, directed at one person that just completely upends the person's life. [00:31:45] It would have to be massive, massive, not just a few tweets from the Babylon Bee that get a bunch of likes. [00:31:52] Maybe, I don't know. [00:31:53] But otherwise, like, we can't really do feelings. [00:31:57] We can't do feelings. [00:31:58] We can definitely take account of physical threats, but words are violence and my feelings are hurt? [00:32:04] Stay off the internet. [00:32:05] It's a cesspool. 
=== Conspiracy Theory Limits (04:12) === [00:32:06] If you don't know that, you know, I'm sorry, but big reveal. [00:32:10] And by the way, you can get smartphones that don't have an internet button on them. [00:32:14] Like there are ways of protecting yourself in modern day America if you just choose not to engage with forums that you know are hurtful and toxic. [00:32:22] They are. [00:32:22] Under Elon, they will be. [00:32:24] Right now, Twitter's totally toxic. [00:32:26] I love all these libs who are like, it's rainbows and unicorns. [00:32:29] Walk a mile in my shoes on the internet while you people run it because it's been disgusting. [00:32:34] Yeah, and totally humorless and miserable experience for quite some time now. [00:32:43] I also think they get into incredible trouble when they try to police misinformation and disinformation because I think most journalists understand. [00:32:55] I mean, Megyn, you know this, in the first days of any news story, there's always some error baked into the reporting that only comes out later, right? [00:33:06] So if you, you know, if you have some kind of star chamber of fact checkers who are declaring this or that to be the truth and everything else needs to be wiped out, inevitably what's going to happen is you're going to have fiascos like the lab leak business, where, you know, for some initial period, they're going to declare, well, this is an untruth. [00:33:30] This is conspiracy theory. [00:33:32] Oh, but six months later, it turns out it might be true. [00:33:37] Like the COVID lab leak theory. [00:33:39] Right. [00:33:39] Yeah, exactly. [00:33:40] And once you do that, you lose all credibility with audiences. [00:33:44] And now what's going to happen is they're not going to trust what you call the official trusted version of reality. [00:33:53] They're going to distrust that even more once you make a couple of mistakes like that. [00:33:57] And they're going to drift even more towards conspiracy theories.
[00:34:00] So that for me is like, that's a fundamental misunderstanding of how news consumers work. [00:34:06] If you try to weed out conspiracy theories and crackpots and all these other things in the name of truth, what you end up with most of the time is more of that. [00:34:17] And I think that that's not very well understood. [00:34:22] I'm going to squeeze in a break, but I'll read this from the very well worth your time Substack from Taibbi. [00:34:28] This site, talking about Twitter, used to be fun, funny, and a great tool for exchanging information. [00:34:33] Now it feels like what the world would be if the eight most vile people in Brooklyn were put in charge of all human life, a giant, hyper-pretentious thought Starbucks. [00:34:44] So good. [00:34:46] Hard standby, Matt. [00:34:48] More with Matt Taibbi after a quick break. [00:34:50] Loving this conversation. [00:34:51] And we'll tell you about Dr. Fauci's reversal and what Biden's doing that Trump never did before. [00:35:04] Back with me now, Matt Taibbi, editor of TK News Substack. [00:35:09] All right. [00:35:09] So the reason I stumbled on the intro is because I've got Joe Biden in my head. [00:35:13] This just in. [00:35:14] He made remarks this morning that Tom Cotton, Senator Tom Cotton of Arkansas, is tweeting out as, quote, alarming because of the little bit of slurring and a lot of stumbling. [00:35:26] Take a listen for yourself. [00:35:27] We're going to seize their yachts, their luxury homes, and all the ill-begotten gains of Putin's kleptocracy. [00:35:35] Yeah. [00:35:37] Kleptocracy, the guys who are the kleptocracies. [00:35:43] But these are bad guys. [00:35:45] Oh my God, Matt. [00:35:48] I too find it alarming.
[00:35:51] Well, I've told this story before, but trying to cover Biden's issues on that front was actually one of the reasons I ended up moving to Substack, because I was doing a feature on Biden on the campaign trail for Rolling Stone, and I was noticing what everybody else was noticing. === Title 42 and Mandates (05:25) === [00:36:18] Like, this guy's having trouble getting through sentences. [00:36:20] Every time he has to ad lib, he gets lost. [00:36:23] He forgets where he is. [00:36:25] He forgets what the question is. [00:36:28] And I called back some of the people I had talked to for a story about the potential use of the 25th Amendment to get Donald Trump removed on the grounds that he was mentally incompetent. [00:36:43] If you remember, there was a big drive to do that. [00:36:46] And I was assigned to cover that story. [00:36:48] And lots of psychiatrists were very happy to talk about that then. [00:36:53] But nobody would talk about the Biden issue. [00:36:56] And I just realized we were in a completely different media environment where, you know, certain things are just sort of off limits. [00:37:04] And I think it was, we did kind of the country a disservice by not talking about this a whole lot before he was elected. [00:37:12] Right. [00:37:12] Did you see the Title 42 thing last week? [00:37:15] No. [00:37:16] Oh, you've got to see it. [00:37:17] We have it. [00:37:18] So he was asked about, I think about Title 42. [00:37:24] My team will refresh me whether the question was about 42 or the mask mandate being struck down. [00:37:30] It was one or the other. [00:37:31] Hold on. [00:37:32] Go ahead. [00:37:34] Okay. [00:37:34] So the question was about the mask mandate being struck down by a federal district judge in Florida. [00:37:39] And he answered it about Title 42, the COVID immigration regulation that allows our border agents to reject everyone who wants asylum, just saying, you know, it's COVID, get out.
[00:37:53] So he gets totally confused about the two. [00:37:55] They start meandering. [00:37:57] He starts intertwining. [00:37:58] Just take a listen. [00:37:59] On Title 42, Mr. President, are you considering delaying lifting Title 42? [00:38:04] No, what I'm considering is continuing to hear from my, first of all, there's going to be an appeal by the Justice Department, because as a matter of principle, we want to be able to be in a position where if in fact it is strongly concluded by the scientists that we need Title 42, that we'd be able to do that. [00:38:27] But there has been no decision. [00:38:29] My God, so you hear he's asked about the mask mandate. [00:38:33] He starts meandering all over about 42. [00:38:35] He can't keep it straight. [00:38:36] Vice versa. [00:38:37] Neither can I right now, but I'm not the president. [00:38:40] And I wasn't facing the reporters and he had to issue a cleanup later in a written statement. [00:38:44] We've seen it happen time and time again. [00:38:46] Yeah, it's certainly not reassuring when you look up at the president of the United States and the emotion that's being betrayed in his eyes is terror, right? [00:39:00] Because he's not quite sure what the question is and or whether he's answering appropriately. [00:39:08] I've seen this with some other politicians in the past. [00:39:12] But Biden got worse quickly in the last election. [00:39:19] And again, I think the reporters just kind of decided to not talk about it because they had already decided that he was going to be taking on Donald Trump and they didn't want to give him ammunition, which I think was a huge mistake. [00:39:33] Did those presidents' last names rhyme with Megyn? [00:39:36] Because there was a real issue with one of them in his second term that went on to become quite a news story. [00:39:42] Right. [00:39:42] Yeah. [00:39:43] Well, Reagan was one of the ones I was thinking of. [00:39:45] I've seen it.
[00:39:46] I saw it with Boris Yeltsin when I lived in Russia. [00:39:50] You know, I think the issues there might have been a little bit different, but similarly, he had some cognitive issues. [00:39:59] But look, this is what happens when reporters start messing with things beyond their purview. [00:40:07] Like our job is just to tell you like what we see and worry about whether it's right or wrong. [00:40:14] And then it's up to the public to figure out what they think about it. [00:40:18] What started to happen in 2016 when Trump came on the scene is reporters suddenly were like looking at news stories. [00:40:25] Just to take an example, there was that issue with Hillary Clinton not filling up her crowds, right? [00:40:32] So she was having trouble filling the halls. [00:40:35] And reporters got together and they kind of silently decided not to make an issue out of that because they didn't want to make it look like her campaign was doing badly. [00:40:46] But that ended up hurting her because it created a false sense of security in the campaign. [00:40:52] And, you know, instead of doing something to try to fix it, they just kept going and they ended up losing. [00:40:59] So, you know, reporters should just, you know, tell us what they see and, you know, let the chips fall where they may. [00:41:07] And they won't affect history in a negative way, at least that way. [00:41:11] Well, and it's like, you know, when grandpa starts to lose his marbles, you know, when he starts to go south, grandpa can be easily manipulated. [00:41:20] You know, we don't do that because we love grandpa, but this is the sitting president of the United States. [00:41:24] And we were promised somebody who wasn't going to be some far left wokester and he has been. [00:41:30] And we were promised somebody who was going to be the voice of reason and he hasn't been.
[00:41:34] And we were promised somebody who said he was very skeptical of, quote, forgiving student loans because he understood the problems that would create and the fairness issues it would create. === Pandemic Phase Confusion (08:58) === [00:41:44] And now he's about to do it. [00:41:46] And one wonders, what did I buy? [00:41:49] What did I get? [00:41:50] Who is running the show? [00:41:52] Legitimately, who is making these decisions? [00:41:54] And if it is Joe Biden, who is manipulating him into these decisions? [00:41:57] Because I'm not sure I elected them. [00:42:00] Yeah. [00:42:00] And that was another question. [00:42:03] Because there was so clearly a competency issue with Biden. [00:42:08] There should have been a secondary news story. [00:42:10] Like, who's actually going to be running the country if this guy gets elected? [00:42:13] And there weren't a whole lot of those stories. [00:42:15] I mean, I blame myself. [00:42:17] I didn't really do it either. [00:42:19] But somebody needed to do that story and needs to do it now too. [00:42:25] And we're not really doing it. [00:42:28] We know that there's some infighting, but we don't know. [00:42:30] We don't know exactly how decisions are being made. [00:42:33] Well, so Joe Biden is doing something that Trump didn't do. [00:42:37] And that is as the sitting president, he's about to go to the now reborn White House Correspondents' Dinner, which is going to happen in Washington, D.C. this weekend. [00:42:48] Cue the vomit emoji. [00:42:50] I know that. [00:42:51] I've been calling it the White House self-congratulation dinner, but yeah, go ahead. [00:42:55] It's disgusting. [00:42:56] They're awful. [00:42:56] My favorite was I went to one where Pamela Anderson was. [00:43:00] She was, you know, they always invite these celebrities. [00:43:02] George Clooney was there once. [00:43:03] He was like the biggest star ever there, bigger than any president.
[00:43:07] Pam Anderson was at one and they said, so, you know, Ms. Anderson, what are you doing at the White House Correspondents' Dinner? [00:43:13] And she said, oh, I'm sorry. [00:43:15] I thought I was at the white trash correspondents' dinner. [00:43:22] Greatest thing ever to happen. [00:43:24] That's great. [00:43:25] That's great. [00:43:26] So Biden's going to go. [00:43:28] He's only going to, [00:43:29] he's not going to have the dinner out of COVID fears. [00:43:31] He wants to be responsible. [00:43:33] He's not going to sit for the actual dinner. [00:43:34] He's just going to go for the, you know, the humor and the roasts. [00:43:37] I mean, that's what everybody wants to do. [00:43:39] No one wants to sit for the damn dinner. [00:43:40] So he's basically just, you know, parachuting in for the comedian. [00:43:44] But then turns out the comedian's Trevor Noah. [00:43:47] So who wants to see that? [00:43:49] We all know what we're going to get. [00:43:50] And the other sort of subplot to all this, Matt, is that Dr. Fauci was supposed to go, but bailed because the four-time vaccinated Fauci doesn't think this is safe. [00:44:01] Yeah, I mean, that story is ridiculous on so many levels that it's just hard to even know where to begin. [00:44:08] But they've been consistently irrational about this from the very beginning. [00:44:15] You know, from the very start, they were signaling to us that they didn't really think the vaccines worked. [00:44:26] Why did we have to stay in lockdown if the vaccines were effective? [00:44:30] Well, they just don't really believe in them. [00:44:34] And I think they're sending mixed messages, which again gets back to the point of, you know, when people stop trusting you, that's when they drift even more towards conspiratorial interpretations of things. [00:44:45] So I think it sends a terrible message what he's doing. [00:44:49] It's so true, right?
[00:44:50] It's like, aren't the vaccines supposed to protect us from severe illness or death and reduce COVID to something rather mild that the average person can handle? [00:45:01] Yes is the answer. [00:45:02] So why are they behaving like this is the very first form of COVID, which actually was more severe, far more than what we're dealing with now, Omicron, whatever, the second version of Omicron. [00:45:13] Why are they pretending like it's still that version and we have no vaccine and we have no therapeutics, right? [00:45:19] They aren't going out and living their lives. [00:45:21] Or maybe it's just all one big massive virtue signal to try to cover for their overextended big government hand, which is still literally over the mouths in effect, I guess not literal, of little children in New York City, two-year-olds who are masked. [00:45:38] Yeah, clearly there were people who just loved all of the rules to a degree that was a little bit unseemly. [00:45:47] Like there were lots of policies in the last two years where I thought, well, maybe I agree with that. [00:45:53] It's possible that that might be the sensible thing to do. [00:45:56] But I was put off by the glee with which people were glad to impose some of these restrictions, especially with schools and kids where it was suddenly became taboo to talk about the fact that kids didn't really get sick with this very much. [00:46:15] That's disturbing. [00:46:15] I think there are people who just like it too much, like the rules too much. [00:46:19] And that's not a good thing. [00:46:22] It's like Brian Stelter. [00:46:23] Would you go to a party with no rules? [00:46:26] It speaks for so many of them. [00:46:30] Okay, listen, when we come back, I'm going to play you Dr. Fauci, who literally in the course of a few hours declared the pandemic was over, only to reverse himself moments later. [00:46:39] It's not over. [00:46:39] It's over. [00:46:40] Celebrate. [00:46:41] We finally. [00:46:42] No, it's not. 
[00:46:44] Is anyone surprised? [00:46:45] There's much, much more to go over, including the news we just got about what we're prepared to do in Ukraine, where Matt has had some good thoughts on Russia and what our potential role should be all along. [00:46:56] More with Matt coming up. [00:47:00] All right, Matt. [00:47:01] So staying on the subject of Fauci, literally in the course of a few hours, he said the pandemic was over, only to reverse himself and say, no, it's not over. [00:47:13] It's never going to be over for Dr. Fauci, I'm sure. [00:47:16] Take a listen to these butted soundbites. [00:47:18] We are certainly right now in this country out of the pandemic phase. [00:47:23] Is the pandemic still here? [00:47:25] Absolutely. [00:47:26] So when I said phase, I probably should have said the acute stage of the pandemic phase. [00:47:35] I see you laughing. [00:47:37] It is laughable. [00:47:40] It is. [00:47:40] It is. [00:47:41] And again, this just gets back to why you can't have YouTube or Google or Facebook or Twitter relying upon government officials to tell you what the truth is about something, because even they don't know. [00:47:59] They change their minds every 10 seconds about stuff, including like really important things like whether or not to wear a mask or, you know, whether the vaccine is actually going to protect you from getting infected. [00:48:12] Like that's why you cannot have top-down information controls because, you know, the truth is always a moving target. [00:48:24] I know, I feel like he just, I don't, either he had a momentary slip, you know, when he said it's over, because I don't think he's ever going to say that and really mean it. [00:48:33] He doesn't really want that. [00:48:35] Or he just got woodshedded. [00:48:36] He said it because it's actually a fact and he slipped into factual reporting for a second there only to get woodshedded by the administration. [00:48:42] They said, we're not admitting that.
[00:48:44] We have mandates in place. [00:48:46] We're still firing people for not getting it. Like, no, it's not over. [00:48:49] Get back on message. [00:48:52] Yeah, I think they took out the cattle prod and found a nice quiet room somewhere to set him straight about what the official message is. [00:49:03] Yeah. [00:49:04] Like, what is happening? A lot of these vaccine mandates are still in place. [00:49:08] People are still getting fired. [00:49:10] Even I wonder about my schools. They have vaccine mandates in our schools. [00:49:14] They don't kick in until they're 16 years old. [00:49:16] And I'm not there yet with my kids. [00:49:18] But I wonder, like, how do you justify that? [00:49:20] Right. [00:49:21] For the kids who are about to turn 15 to 16, you can't justify that anymore. [00:49:25] You got, you got Fauci on tape saying the pandemic phase at least is over. [00:49:31] It's over. [00:49:32] So what's going to happen? [00:49:32] Do you think, you know, do these politicians and bureaucrats and school administrators follow through with these things? [00:49:40] The writer Christopher Lasch once said the essence of propaganda was keeping the public in an ongoing state of emergency. [00:49:50] And I think we've, especially in the Trump years, we've fallen into the pattern of always being in an emergency and politicians finding ways to find that useful. [00:50:03] The pandemic has been extremely useful to politicians. [00:50:08] It has given them the ability to dictate all kinds of behaviors and to allow them to stick their fingers in things like the news and internet content moderation. [00:50:23] I don't think they want the emergency to end. [00:50:25] I think they like this new normal, you know, and it's a problem. [00:50:30] You know, the fact that there are people who aren't motivated to end crises is a big problem just generally, I think, in politics.
=== ESPN Punishes Anchor (06:59) === [00:50:42] So speaking of the vaccine mandates and how they've impacted people's lives, an interesting couple of cases in the news. [00:50:50] One has to do with the mandates, one doesn't. [00:50:53] Sage Steele of ESPN just filed a lawsuit against ESPN and its parent company, Walt Disney, alleging that the company treated her unfairly for comments she made on a podcast interview last September. [00:51:07] This made news at the time. [00:51:08] She had been one of the lead anchors for ESPN's flagship show, SportsCenter. [00:51:13] I know you're big into the NFL draft and things like that. [00:51:15] I am not. [00:51:16] I know nothing about sports. [00:51:17] So I'm reading this. [00:51:19] Okay. [00:51:19] But since that interview, she says she's been sidelined from the prime assignments. [00:51:24] She does continue to anchor the noon SportsCenter broadcast, but quite a few things were taken away from her and she was pulled off the air for some big assignments, she says. [00:51:32] So she had gone on former NFL quarterback Jay Cutler's podcast and shared her thoughts on ESPN's vaccine mandate, sexism in sports journalism, and on Obama's ethnicity, the fact that he selected black as his ethnicity on the census because he's biracial. [00:51:50] And she's also biracial and had some thoughts on it. [00:51:54] So here's what she said on the Jay Cutler podcast that she's now alleging she was punished for. [00:51:59] I respect everyone's decision. [00:52:01] I really do. [00:52:02] But to mandate it is sick and it's scary to me in many ways. [00:52:09] But I have a job, a job that I love and frankly, a job that I need. [00:52:16] But again, I love it. [00:52:17] I just, I'm not surprised it got to this point, especially with Disney. [00:52:21] I mean, a global company like this. [00:52:24] So ESPN melted down. [00:52:27] We embrace different points of view. [00:52:28] Dialogue and discussion are great.
[00:52:30] That said, we expect those points of view to be expressed respectfully in a manner consistent with our values and in line with our internal policies. [00:52:38] She got hit by, of course, Jemele Hill, who just, once again, lost yet another show over there on CNN Plus. [00:52:45] How many shows can Jemele Hill lose? [00:52:48] And then ESPN required her to issue an apology. [00:52:52] So the thing about ESPN, and normally they could punish her for her viewpoints because they are not a government actor. [00:53:00] But the state of Connecticut, where she is and where I am, they have apparently a law that actually says corporations can't always do that. [00:53:11] And she's taking advantage of that. [00:53:13] So what do you make of her fighting back against what this company allegedly did to her? [00:53:19] This is a difficult issue for me. [00:53:21] I'm of two minds about this because, you know, I remember when Liz Spayd, the former public editor of the New York Times, got in trouble some years ago, among other things, for talking about New York Times writers being on social media too much. [00:53:44] And, you know, I understand the rationale for that because once upon a time, you know, in my father's day, when he was on the news, viewers didn't really know a whole lot about the political views of reporters. [00:53:59] And that actually added to their credibility. [00:54:04] Like, you know, if you didn't know whether a person was liberal or conservative and they were just delivering the news, it did kind of tend to make people feel like they were more likely to believe just that they were watching a news program. [00:54:21] However, you know, nobody really is just a pure newsreader anymore. [00:54:26] And everybody has a social media presence. [00:54:28] So I also think you can't talk about that. [00:54:31] Especially at ESPN talking. [00:54:33] Yes. [00:54:33] Right, Matt at ESPN.
[00:54:34] They're encouraging these anchors to go out there and they've forced moments of silence on them and they've gotten very politically active on the air there. [00:54:43] So why single out Sage? [00:54:46] Yeah. [00:54:47] I mean, and again, I know a lot of people in the news business who were outright told by their bosses, like, you have to get a Twitter handle. [00:54:57] You've got to have more of a presence on social media. [00:55:00] Clearly, on ESPN, you know, they're trying to build up the brand, the individual brands of all of these on-air personalities. [00:55:08] So when they do that, but somebody does it in a way that doesn't fit with some kind of orthodoxy, [00:55:15] I don't think you can punish those people. [00:55:16] I think that's crazy. [00:55:19] It's once again, it's viewpoint discrimination. [00:55:21] By the way, the Connecticut law, just to clarify what I said, it states companies cannot discipline employees for exercising their First Amendment rights as long as the comments do not directly impact their work performance or the company. [00:55:32] She's arguing that her comments were made on a third-party podcast and that she should be considered a private citizen in this situation making these comments. [00:55:40] The thing is, like, I don't see how ESPN gets away with punishing just her, given its push to make its anchors go totally woke on the air. [00:55:50] And now you have one person here who happens to be a woman of color who pushes back on some of the narrative. [00:55:56] She didn't want to get the vaccine. [00:55:58] She didn't think it made sense. [00:55:59] Well, she didn't say she didn't like that Obama chose black, just to clarify what she actually said. [00:56:05] She said, Barack Obama chose black and he's biracial. [00:56:07] I'm like, well, congratulations to the president. [00:56:10] That's his thing.
[00:56:11] I think it's fascinating considering his black dad was nowhere to be found, but his white mom and grandma raised him. [00:56:16] But hey, you do you. [00:56:17] I'm going to do me. [00:56:18] Why is that an unfair point? [00:56:20] She's basically asking, why do you identify with one side of the family versus the other when it was the other that raised you? [00:56:26] Okay, you can say I'm offended by that. [00:56:28] I don't like that. [00:56:29] It's her POV. [00:56:30] Same as, you know, some audience members may get offended by the incredibly woke, anti-patriotic statements coming out of the mouths of the anchors sitting on set during the big basketball games or the big football games. [00:56:42] And we've heard that too. [00:56:43] ESPN has no problem with that. [00:56:46] Yeah. [00:56:46] And what I would say is as a sports fan, I don't want to hear it. [00:56:52] Like when I turn on ESPN, I'm turning it on, or I used to anyway, because I'm looking for an escape from politics. [00:57:01] That's the thing. [00:57:01] She didn't do it in the anchor chair, unlike those guys. [00:57:04] She did it on a podcast. [00:57:06] Right, exactly. [00:57:08] You know, so I don't know. [00:57:11] I don't know what they're thinking. [00:57:12] I mean, I think a lot of these companies have gotten away from what really works. You know, sportscasting used to be a really, you know, interesting and colorful and creative wing of the media world because they were able to write with style. [00:57:35] They were able to use humor and wit in ways that regular newscasters weren't really allowed to do. === Johnny Depp Victim Guilt (07:00) === [00:57:42] But it's become just as dreary in a lot of ways as the rest of media. [00:57:46] And I don't really understand why they would voluntarily do that. [00:57:49] What did you call the Starbucks? [00:57:51] What did you call Twitter now? [00:57:53] Thought Starbucks. [00:57:54] Thought Starbucks.
[00:57:55] There are some thought Starbucks, too. [00:57:56] Now, nobody wants to be that. [00:57:59] Okay, so the second lawsuit I wanted to ask you about, I realize you're not here in any legal capacity, but they're interesting. [00:58:05] And people are talking about them, these cases, is the Amber Heard Johnny Depp defamation case. [00:58:10] She claimed in the Washington Post that she was a domestic abuse victim in 2018. [00:58:15] This is two years after she had made sure she was caught on camera by the paparazzi with what she claimed was a bruise on her face from what she claimed was a phone thrown at it, the face, by Johnny Depp. [00:58:27] We've had witness testimony, so she's laying the foundation, I'm an abuse victim at his hands. [00:58:32] The WAPO op-ed did not name Johnny Depp, but everybody knew that's who she meant. [00:58:37] And now he's sued her. [00:58:39] Got fired from, I guess it was the fifth installment of Pirates of the Caribbean right after that and lost millions of dollars, not to mention reputational damage. [00:58:48] And he's filed a lawsuit for defamation against her. [00:58:51] And the trial has not gone well for her. [00:58:54] It has not gone well at all. [00:58:56] There's been plenty of testimony about how they're both hot messes and they're both way into drugs and violent and weird. [00:59:03] But it's certainly established at a minimum she has attacked him repeatedly. [00:59:09] And so, I mean, that's at a minimum. [00:59:11] Okay. [00:59:12] Best case scenario for her is they attacked each other. [00:59:15] She did it more, but he a couple of times may have hit her too. [00:59:19] That's best case scenario. [00:59:20] All inference is in her favor. [00:59:22] That does not necessarily support, I'm an abuse victim and I've had the, you know, the internet unleashed against me. [00:59:30] You abused him repeatedly. [00:59:33] You cost him the end of his finger. [00:59:35] You or your friend actually defecated in your marital bed.
[00:59:38] The evidence has shown. [00:59:40] I mean, it goes on, Matt. [00:59:42] And this is Johnny Depp's testimony in court this week in part. [00:59:45] Take a listen. [00:59:47] I lost a fucking finger, man. [00:59:49] Come on. [00:59:51] I had a fucking, I had a fucking can of mineral spirits thrown on my nose. [00:59:58] You can please tell people that it was a fair fight and see what the jury and judge think. [01:00:04] Tell the world, Johnny. [01:00:05] Tell them, Johnny Depp. [01:00:07] I, Johnny Depp, a man, I'm a victim too of domestic violence. [01:00:10] And I, you know, it's a fair fight. [01:00:12] And see how many people believe or side with you. [01:00:16] And what did you say in response when Miss Heard said, tell the world, Johnny, tell them Johnny Depp, I, Johnny Depp, a man, I'm a victim too of domestic violence. [01:00:27] I said, yes. [01:00:31] So that's her admitting basically on tape that she cut his finger off with a vodka bottle and him complaining about it and her kind of mocking him, like, oh, go ahead, good luck. [01:00:41] Tell the world you're the victim. [01:00:43] And him saying, you know what, live, I am. [01:00:47] Yeah. [01:00:48] I don't know. [01:00:48] This is a tough one. [01:00:52] You know, I've obviously gotten in trouble over the subject in the past. [01:00:56] And I do understand the idea that there needs to be an initial reaction that we believe women at least enough so that they get a hearing, you know, not to believe blindly, but keep it open. [01:01:17] Yeah, like at least accept the seriousness of the accusation, like initially. [01:01:23] You can't just dismiss it like you used to. [01:01:26] Right, which is what happened, you know, in the past. [01:01:29] And that's something that definitely needs to be corrected.
[01:01:33] I'm doing a story right now about, I can't really talk about who it is, but there's a company that's, you know, that's gotten in a lot of trouble and had all sorts of issues financially really over allegations and not really about substantiated conduct. [01:01:54] And this is something that's just become a little bit, I think, too easy in modern media, which is, you know, we raise an allegation of something or we imply that something happened. [01:02:05] And before you know it, you know, Twitter takes off and turns it into a fact. [01:02:11] And next thing you know, it's a reputational harm issue. [01:02:14] And we can't have that person working at our company because, you know, the staff will be upset about it. [01:02:23] That's just become too easy. [01:02:24] Like I think there has to be some kind of happy medium where you have to prove these things out before people really, you know, go through serious damage. [01:02:34] Yeah. [01:02:35] I mean, you're not wrong because I mean, I do think like the believe all women thing was always a lie and stupid and absolutely un-American. [01:02:44] Nobody gets a presumption of belief. [01:02:46] Nobody, right? [01:02:47] The worst case scenario is you're charged with a crime and the system says you get a presumption of innocence because the state has such an advantage over you when you're sitting there in shackles and he or she gets to go in on the other side in their suit saying, I represent the United States of America. [01:03:04] For those reasons, because the deck is stacked against the defendants, we give them a presumption of innocence. [01:03:08] We want to hold the system to account before we throw somebody in jail, take away their freedom. [01:03:13] But you don't get that presumption of truth telling in any forum, including a court. [01:03:20] And so I'm glad he brought this case because she really was painted as just this poor victim who'd been abused by him.
[01:03:30] And definitely he suffered from it financially and otherwise. [01:03:33] Not that he needs the money, but still, just the principle. [01:03:37] And I think this trial has exposed that, at a minimum, these situations can be a lot more complicated than we admit. [01:03:46] Well, yeah, I think you're absolutely right. [01:03:51] And this has been a big issue for me over the years, which is that a lot of reporters think that, you know, there's a playbook to news stories, or that you can, you know, lapse into cliches when you report things. [01:04:09] The reality is you have to clean your slate every time and approach every news story as a completely new set of facts, because, you know, what might be a Matt Lauer story in one instance, you know, you might have a completely different fact pattern the next time. [01:04:27] You can't carry over expectations from your previous reporting and just kind of shoehorn in a, you know, a cliched understanding of what happened. [01:04:38] And I think we've gotten away from doing that, of just wiping the slate clean each time. === Collective Guilt Drift (04:57) === [01:04:43] That's good. [01:04:43] It's part of our drift toward collective guilt. [01:04:47] You know, this, he must be guilty. [01:04:49] He did it. [01:04:49] He's a man. [01:04:50] He's a rich man. [01:04:51] He's a celebrity. [01:04:51] He did it. [01:04:53] That's just not the way life works. [01:04:55] He doesn't have any collective guilt because of his gender or because of his celebrity status. [01:05:01] Okay, hard turn now, because before we go, I do want to get your thoughts on Ukraine. [01:05:06] You've been really interesting on this whole conflict over there, which goes on. [01:05:11] And the news of the day is that Biden wants another $33 billion from Congress for Ukraine emergency funding. [01:05:21] It's a big price tag. 
[01:05:23] Germany has now reversed itself on sending arms to Ukraine after claiming it would tap into its reserves. [01:05:30] So some rollback from the Europeans, America sending more money. [01:05:34] There are still some calls from Republicans and even Democrats for us to get more involved, more weapons. [01:05:42] And even still, some people are saying no-fly zone and so on, though I don't think that's going to happen. [01:05:46] So what do you make of where the United States is now and where this conflict is now? [01:05:53] So, first of all, I was one of the people who got this wrong. [01:05:57] Like, I never expected Russia to actually invade Ukraine, or at least the western part of Ukraine. [01:06:06] And so I made a wrong call on that. [01:06:08] And then you did something extraordinary. [01:06:11] You admitted that you were wrong and you apologized to your listeners and your readers, which is all that's expected, but nobody does that anymore. [01:06:18] I mean, it's crazy. [01:06:20] Nobody takes accountability, responsibility. [01:06:22] Yeah, you do have to do that. [01:06:24] But, you know, I got that wrong. [01:06:26] And it's an unpredictable situation. [01:06:28] But I think what's happened over time is that we're not really reporting on what the United States' policy actually is. [01:06:43] You know, Secretary of Defense Austin said this, I thought, really fascinating thing this week, where he said that, you know, basically our plan is to weaken Russia so that it can't do this to the next Ukraine. [01:06:59] Now, that seems to me at cross purposes with Ukraine's mission in all this. [01:07:05] I'm sure Ukraine wants to defeat Russia militarily, but they may also come to a point where they just want to end the conflict with minimal damage. 
[01:07:14] And so if the United States is committed to a different policy where we're not going to give them the ability to negotiate, for instance, the end of sanctions, then Russia is really at war with us, not with Ukraine. [01:07:32] If Ukraine doesn't have that autonomy, then this is an immensely complicated situation. [01:07:41] And I also think the United States is delusional if they think that this is going to end in some kind of happy regime-change scenario in Russia. [01:07:48] The much more likely outcome is that you're going to get a more hardline leader who's going to come in after Putin, and they're going to drop vacuum bombs on every city in Ukraine. [01:08:00] That's my worry in the whole thing, that we're pursuing this with these sort of fairyland expectations about how it's going to end. [01:08:10] Yeah, I was joking the other day that Biden and those around him seem to think that if they could just get rid of Putin, they'd get Jed Bartlet. [01:08:19] There he is, just waiting. [01:08:21] He's dying for democracy. [01:08:23] If somebody could just take out Putin, I could come in with all my liberal ideas. [01:08:28] Right. [01:08:28] I mean, were they not paying attention the last 30 years? [01:08:32] I mean, that's the thing that's amazing to me, is the United States has already been around this track many times with Russia. [01:08:38] I was there during this process. [01:08:40] Like, you know, we tried to foist America-friendly leaders on Russia, and those people were hugely unpopular, mainly because they were friendly with the West. [01:08:54] And it was part of the reason we got Putin in the first place, because, you know, Boris Yeltsin was seen as too close to the United States. [01:09:03] Putin was seen as somebody who stood up to us. [01:09:05] And so he had popular backing. 
[01:09:08] So the person who comes in after Putin, if they think it's going to be like, you know, Emmanuel Macron or something like that, they're high, you know, like, they're not understanding what the situation really is. [01:09:22] And I worry, you know, this is like the All the President's Men thing. [01:09:27] Like, these are just not very bright guys, and things are going to get out of hand. [01:09:31] You know, that's what I worry about with this. [01:09:33] No wonder Yeltsin was drinking so heavily. [01:09:35] Nobody liked him. [01:09:36] His only friends were America. [01:09:39] Oh, right. === Pleasure Always Returns (00:20) === [01:09:41] It's been a pleasure, as always. [01:09:42] Thank you so much for coming on. [01:09:44] And to our audience, go check out Matt's Substack now. [01:09:46] TK News, well worth your time, as you can see. [01:09:49] All the best. [01:09:50] Thanks so much, Megyn, for having me on. [01:09:51] Take care now. [01:09:52] Don't forget to join us tomorrow. [01:09:53] Sharyl Attkisson will be here. [01:09:55] Talk to you then. [01:09:57] Thanks for listening to the Megyn Kelly Show. [01:09:59] No BS, no agenda, and no