The Megyn Kelly Show - 20220713_bidens-inflation-crisis-and-elon-musk-vs-the-bots- Aired: 2022-07-13 Duration: 01:35:49 === Rising Unemployment and Inflation (15:01) === [00:00:00] Welcome to the Megyn Kelly Show, your home for open, honest, and provocative conversations. [00:00:11] Hey, everyone, I'm Megyn Kelly. [00:00:13] Welcome to the Megyn Kelly Show. [00:00:15] Wow. [00:00:15] Oh, wow. [00:00:16] Those inflation numbers. [00:00:18] They are worse than expected, and they were expected to be terrible. [00:00:22] Hitting yet another 40-year high at 9.1%. [00:00:26] My God, that's just, that's just an eye roller. [00:00:30] Much of that is due to higher prices at the pump, at the grocery store, and at home. [00:00:35] The stock market is, of course, dropping on the news. [00:00:38] Meantime, President Biden just arrived in the Middle East, where gas prices will certainly be on the agenda. [00:00:44] But in a statement, he's insisting things are not as bad as they seem. [00:00:48] All is well. [00:00:49] Remember Kevin Bacon in Animal House? [00:00:51] All is well. [00:00:52] Remain calm. [00:00:53] Things are not as bad as they seem. [00:00:55] And he is once again blaming, guess who? [00:00:58] Putin. [00:00:59] Fair? [00:01:00] We'll get to it. [00:01:01] Plus, breaking up may be hard to do. [00:01:04] Elon Musk has now officially been sued by Twitter in Delaware in an effort to force him to complete the deal. [00:01:10] They want that, quote, specific performance that he promised. [00:01:14] They don't just want their billion-dollar breakup fee. [00:01:16] They want him to buy Twitter, and they're trying to force it. [00:01:19] We're going to be joined by a legal expert on that who's going to tell us whether Elon's likely to win that case or not. [00:01:28] And then a bit later, we're going to talk to an expert in bot behavior, and we'll ask her what Twitter's bot situation looks like.
[00:01:35] But first, we are joined today by David Sacks, a venture capitalist who runs Craft Ventures and is co-host of the tech podcast All-In. [00:02:17] David, welcome back to the show. [00:02:19] Hey, Megyn, good to be here. [00:02:21] So, my God, 9.1. [00:02:23] And it really is shockingly high. [00:02:27] And Biden's out there already saying, don't believe your lying eyes. [00:02:30] It's really not that bad. [00:02:32] Things have gotten a lot better since those numbers were calculated over the past 30 days. [00:02:37] It's out of date. [00:02:38] Energy alone comprised nearly half of the monthly increase in inflation. [00:02:44] And this data doesn't reflect the full impact of nearly 30 days of decreases in gas prices, says the president, also pointing out that other commodities like wheat have fallen sharply since this report. [00:02:56] And then he goes on, of course, to say other countries are suffering from inflation and battling, quote, this COVID-related challenge made worse by Putin's unconscionable aggression. [00:03:09] What do you make of it? [00:03:10] Well, you're right. [00:03:11] So the expectation on the part of analysts was that this inflation number would come in at 8.8%. [00:03:17] Like you said, it was 9.1%. [00:03:19] Last month, it was 8.6%. [00:03:21] So the number still hasn't peaked. [00:03:23] I remember a couple of months ago, the belief was that inflation would have peaked by now, because inflation is measured on a year-over-year basis. [00:03:32] And so as you start to lap bigger and bigger numbers from last year, you would expect the inflation rate to go down. [00:03:38] But that has not happened. [00:03:40] We're still setting new highs each month in inflation. [00:03:44] And you're right that gas is the biggest culprit here.
[00:03:48] However, groceries are also up 12% in the past year. [00:03:51] That's the biggest annual increase since 1979. [00:03:54] Chicken is up 19% in the past year. [00:03:56] That's the biggest increase ever. [00:03:58] Electricity is up 14%. [00:04:00] That's the biggest increase since 2006. [00:04:02] Rent is up about 6%. [00:04:04] That's the biggest increase since 1986. [00:04:06] So it's not just gas. [00:04:08] It's a broad-based inflation problem. [00:04:10] And yeah, I think we're still in the midst of dealing with it. [00:04:14] And the problem is that any wage growth we've seen, any employment numbers that look good on paper, all get dented. [00:04:23] They all get dinged up by inflation. [00:04:26] It's like, who cares if you get a 5% wage increase when the inflation rate on all of your groceries and so on is 9.1%? [00:04:36] That's right. [00:04:36] I mean, workers' real wages are not keeping up with the rate of inflation. [00:04:40] And so they can really feel it when they go to the pump or buy groceries. [00:04:44] And I think this is going to be foremost on voters' minds in November. [00:04:47] Just to give the president his due, gas prices have come down about 15 to 20 percent over the past month. [00:04:53] So if you were to measure inflation today in light of that decrease, it would be a little bit lower than this 9.1% number, but it still wouldn't be good. [00:05:02] You're talking about 8% roughly. [00:05:04] Inflation numbers like that, I think, are what you can depend on between now and the November election. [00:05:10] So the numbers just aren't going to get good enough, fast enough to help the administration. [00:05:14] I think that they've got a big problem here coming into November. [00:05:18] Of course, you have Biden coming out today and saying out of date, out of date. [00:05:23] But what we've been told, even in the face of last month's number, is really not to believe our lying eyes.
[00:05:30] In addition to the Putin price hike stuff that he keeps saying, here was the White House press secretary just a couple of days ago on how strong our economy is. [00:05:42] When you look at inflation, when we look at where we are economically, and we are in a strong, we are stronger economically than we have been in history. [00:05:51] When you look at the unemployment numbers at 3.6%, when you look at the jobs numbers, more than 8.7 million new jobs created. [00:06:00] That is important. [00:06:02] Stronger than we've been in history, and citing the unemployment rate, which is, I mean, it's just such cherry-picking and lacking context. [00:06:10] In any fair press, we'd have an immediate fact check. [00:06:13] But since it's the Biden White House, we won't. [00:06:16] Yeah, I mean, the unemployment rate as it stands today is low, but the labor participation rate is also low. [00:06:22] We've got millions of people who haven't gone back to work, and that's not really counted in the unemployment number. [00:06:27] The other thing that the administration should be really worried about is that the economy is slowing down really fast. [00:06:32] And this is a result of the rate increases that the Fed is now having to push through in response to inflation. [00:06:39] You already saw in Q1, we had a negative growth rate for GDP. [00:06:47] In another couple of weeks, we'll get the growth rate for Q2. [00:06:51] If that number is negative, we will officially be in a recession. [00:06:55] But regardless of whether that number is technically negative or not, if you poll most Americans right now, most already believe we're in a recession. [00:07:03] So they are feeling the pinch from this inflation. [00:07:07] I think that you're seeing companies slam on the brakes in response to the Fed rate increases. [00:07:14] And so the economy is definitely slowing down.
[00:07:16] And I think there's a pretty good chance that if we're not in recession by the end of this month, we will be by the end of this year. [00:07:23] It's a weird recession in a way, because I remember the recession of 1991. [00:07:28] I was in college and, you know, the graduating class a year or two ahead of me, they were all struggling to find work. [00:07:35] You know, I mean, when you think of recession, you think about a tough job market. [00:07:39] As you point out, this is not a tough job market. [00:07:43] You can find a job in this job market. [00:07:45] It's just a question of, when you get your salary and you take it home, what can you buy? [00:07:50] And how does it compare to what you could have bought with the same number 12 months ago? [00:07:55] But this unemployment rate is very interesting, and it's kind of frustrating. [00:08:00] I mean, I'll tell you, I've experienced it myself personally. [00:08:03] I've read accounts of other people who are going through the same thing. [00:08:05] We come to New Jersey during the summer months, and you cannot go out to a restaurant here because there's no staff. [00:08:14] All of these restaurant owners are begging for the college students, for anybody who can work, to apply for a chef or cook job, for a waiter or waitressing job, for a barback or busboy type job. [00:08:29] And I mentioned this guy before. [00:08:31] He's clearly a Republican here at the Jersey Shore. [00:08:34] And I was kind of laughing because on the Upper West Side, you have AOC action figures and Dr. Fauci superhero dolls. [00:08:40] And here on the Jersey Shore, where it's a little redder, you've got this guy who posted a sign at the place he runs, like a strip mall kind of place. [00:08:50] And this is what his sign reads. [00:08:52] Please be patient. [00:08:53] We are short staffed.
[00:08:54] Hopefully the government will soon cease in their endeavor to enslave people through handouts and crush small businesses. [00:09:02] Hopefully, but don't hold your breath. [00:09:04] So this guy's basically saying he can't get staff. [00:09:07] And I think people are feeling this all over the country, where you go somewhere and you can't get service because they just can't find employees. [00:09:12] So how can we both be in a recession and have a shortage of workers? [00:09:18] Well, we have a very low, by historical standards, labor participation rate. [00:09:23] And so a lot of workers have not gotten back into the workforce. [00:09:27] And you could lay some of the blame for that at the $2 trillion, that last $2 trillion of stimulus that Biden passed last year, the American Rescue Plan, along straight party lines. [00:09:37] And Republicans were accused of being cold-hearted when they pointed out that the stimmy checks and the super extended unemployment insurance would encourage people not to go back to work, or delay them from going back to work. [00:09:48] And so I think we're still seeing the residual effect of all the stimulus money flowing through the economy. [00:09:54] The technical unemployment rate is low, but there are a lot of unfilled jobs. [00:10:00] There's a lot of people not participating in the economy. [00:10:03] I think what you're going to see now, though, is that the unemployment rate is going to start to rise. [00:10:09] There's no question that the economy is slowing down. [00:10:12] And I think most analysts now believe it's just a matter of time before we're in a recession. [00:10:16] So I think you are actually going to see a lot of increased jobless claims over the next six months or so. [00:10:24] And you're going to start to see these things normalize and behave more like you'd expect. [00:10:28] Remember, the unemployment rate is really a lagging economic indicator.
[00:10:33] And so, you know, it's still reflecting the economy we had last year. [00:10:38] I think that the number will change over the next year or so. [00:10:40] Yeah, because haven't these checks stopped? [00:10:42] I mean, when you say it's a residual effect, it's like people saved up. [00:10:45] They put their stimulus checks in the bank, and now they're still living on mom's couch, just enjoying the remnants of those checks. [00:10:53] I mean, they were big, and they were unnecessarily large and generous for a portion of the population, but they weren't that big. [00:11:02] It's not like you can just retire at age 27 forever. [00:11:06] No, it's true. [00:11:07] I mean, it's a residual effect that's lagging. [00:11:09] I mean, people are using up those savings, but Americans still have, I think, quite a bit of excess savings stored up. [00:11:15] You also had a lot of things around eviction moratoriums, rent abatement, things like that, and most people's number one expense is their rent. [00:11:22] So if you don't have to pay rent, you can basically live off those stimmy checks for a lot longer. [00:11:27] Look, I don't think this is the predominant thing happening in the economy, but this is one variable at the margins that is creating that feeling that you described of not being able to staff up some of these service jobs. [00:11:42] I wonder about it too, because part of me wondered whether the teenagers who used to fill these jobs, like, where are they? [00:11:49] They didn't get any stimulus checks. [00:11:50] So where are they? [00:11:51] And part of it is, I feel like there's so much pressure on young people today: you have to be in 10 clubs, and you have to be the president or the captain of four sports. [00:11:58] It's like they can't work.
[00:12:00] They're doing stupid Model UN as if that's going to prepare them for life, you know, instead of doing the things that would probably cause a David Sacks to want to hire them, like working, shoveling fish guts for a summer and figuring out what it's like to get your fingernails dirty, you know, for a living. [00:12:16] Yeah. [00:12:17] I mean, I haven't searched for anyone with fish gut experience, but I would value that, I guess, if someone's willing to do that. [00:12:25] Yes. [00:12:26] Me too. [00:12:27] That's what I want my kids to do. [00:12:28] I definitely do not want Model UN. [00:12:31] So what do you see happening now? [00:12:32] Because we're already seeing it in your industry, tech. [00:12:35] We're seeing some scaling back in employment. [00:12:38] So you say you think it's going to happen elsewhere. [00:12:40] Lyft just announced it's going to slow hiring. Instacart said it's going to roll back growth. [00:12:44] Microsoft announced a small cutback on jobs. [00:12:46] Tesla revealed it's going to cut salaried staff by 10%. [00:12:49] Meta says it's going to reduce hiring over economic concerns. [00:12:52] And now today we got an announcement: Google's going to slow hiring through the rest of this year. [00:12:57] And they say they're heading for, quote, another rough patch. [00:13:03] So that's tech. [00:13:04] And number one, why is tech getting hit so bad? [00:13:06] Why are they sort of the leader on all these rollbacks? [00:13:09] Well, startups in particular are kind of the canary in the coal mine. [00:13:12] That's why I've been warning on your show for months now that we are seeing a slowdown. [00:13:16] What basically happened is that if you go back to November of last year, the stock market, specifically growth stocks, peaked in November of last year.
[00:13:25] That's when the Fed finally got serious about inflation, admitted it wasn't transitory, and started projecting a regime of interest rate increases. [00:13:34] Those expected rate hikes then caused growth stocks to go down. [00:13:38] And we've seen a huge decrease in the stock market. [00:13:41] I mean, really across the board. [00:13:42] But if you look in particular at the growth stocks and the new listings, the SPACs, the IPOs, they're down 60, 70, 80%. [00:13:50] So what happened is, over the last six months, venture investors obviously started noticing that. [00:13:56] The public markets and the valuations that are set in the public markets are the exit for all of us. [00:14:02] And so we realized that valuations were way off. [00:14:04] And that caused a constriction in the amount of venture capital that was available. [00:14:09] That's been happening over the last six months. [00:14:11] And founders are seeing that it's hard to raise money. [00:14:14] The valuations are lower. [00:14:16] So raising money is more dilutive. [00:14:17] In any event, all these forces basically caused startup founders to burn money more gradually. [00:14:24] They want to extend their runway. [00:14:26] So this is why I've been saying for several months now that we've been in a slowdown. [00:14:31] And what I saw in all of my board meetings going back several months is these companies were slamming on the brakes. [00:14:37] They were not hiring as quickly. [00:14:38] So again, the canaries in the coal mine are these startups. [00:14:42] But now it has spread to these big tech companies. [00:14:46] And it will then spread to other kinds of companies. [00:14:50] And I just think it's based on how sensitive they are to changes in the economy. [00:14:56] Startups are the most sensitive, then tech, then sort of the more traditional value companies.
=== Democrats Lose Working Class Voters (06:33) === [00:15:01] You know, switching to politics for a minute, the New York Times/Siena College poll came out this week with some really shocking numbers about President Biden's very low approval rating, 33% now, and the fact that some 63% of the Democratic Party wants a different nominee the second time around. [00:15:19] They don't want him for a second term, and how low his rating has fallen with independents and with the white working class. [00:15:25] He only has 20% support there. [00:15:27] Even with his core base, which the New York Times describes as black voters, more of the black voting base would prefer a different candidate than Joe Biden on the Democratic ticket. [00:15:37] So it's not good news for him. [00:15:39] But today on their daily podcast, The Daily, they were pointing out another side of this poll, showing that when it comes to the congressional midterms, the Democrats are doing a little better than they were and better than expected. [00:15:54] You know, if you looked at these polls three months ago, the prediction was a bloodbath. [00:15:58] And now it's getting tighter in the wake of Roe being overturned, in the wake of some of these mass shootings and some Supreme Court decisions on guns. [00:16:07] And, they believe, in the wake of the January 6th hearings, which may not be really pulling the heartstrings of a ton of Republican voters, but some of them, and certainly seem to be amping up the Democratic base, which was, I think, in part their purpose. [00:16:21] So what do you make of that possibility? [00:16:23] They're saying now Democrats have a one-point lead on the generic congressional midterm ballot among registered voters, and a one-point deficit among likely voters. [00:16:34] So pretty tight. [00:16:35] I mean, surprisingly tight given these economic numbers.
[00:16:40] Well, I think that if you look at the House, everyone's forecasting that Republicans are going to win back the House. [00:16:44] I think the question is the Senate. [00:16:46] And there is, you could call it, a candidate quality issue there in a few races. [00:16:51] The Republicans haven't necessarily fielded the best candidates. [00:16:54] And so that matters quite a bit at the margins. [00:16:58] I do think that this should be a red wave in November. [00:17:01] If you poll likely voters on the top issues that they care about, number one and two are inflation and the economy. [00:17:08] It's true that Roe is an issue, especially for the Democratic base, but for voters as a whole, it's something like a 5% issue. [00:17:17] So, you know, the Democrats are going to go out with the best issues that they have. [00:17:21] But I think that the paramount issues for most voters are going to be the economy and inflation. [00:17:26] And I don't see a big positive change happening in those numbers before November. [00:17:31] In fact, overall, it could get worse. [00:17:33] So I would expect the Republicans to do very well in November. [00:17:36] One of the things they were pointing out, and again, this is the New York Times talking about this issue within the Democratic Party, is something that you've been pointing out as well, which is that the Democratic Party is now a party of college-educated, white, so-called elites, right? [00:17:52] Meaning well-educated, well-off. And the working class has switched. [00:18:01] They're now Republicans, and people of color have migrated to the Republican Party in numbers never seen before. [00:18:10] Hispanics are in no way the reliable voting bloc they used to be. [00:18:15] And as I pointed out, he's even losing support amongst black voters. [00:18:19] So it's really shocking. [00:18:20] And you've been onto it.
[00:18:21] You've been pointing it out. [00:18:22] A lot of people have, but I know you've seen it in a way a lot of others haven't. [00:18:26] So what do you make of that, this switch in who represents the elites and who represents the working class? [00:18:34] Well, I've been onto this trend for a while because I read Ruy Teixeira, who is a Democratic political scientist. [00:18:41] Back 20 years ago, he wrote a book called The Emerging Democratic Majority, in which he argued that demographic trends were working in the Democrats' favor and would basically create Democratic majorities and Democratic presidents as far as the eye can see. [00:18:54] And he was basically hailed as a prophet when Obama got elected in 2008, based largely on the coalition that he was talking about: basically young voters, women, people of color. [00:19:06] But for the last few years, Teixeira has been warning that demographics are no longer working in favor of the Democrats, that they are losing their historic base because of what he calls professional class hegemony: the Democratic Party is basically catering to the college graduate elites who run the think tanks and the foundations and the big woke tech companies and the Fortune 500. [00:19:32] They are catering to that narrow group of voters and the issues they care about. [00:19:36] And they've lost sight of what matters to the average working class voter. [00:19:40] And that's why you saw, in that special election in Texas, Mayra Flores, a Republican, get elected for the first time by a largely Hispanic district that I think went 18 points for Biden, and then they just voted her in by a huge margin. [00:19:56] So you can see now that working class voters of all races are migrating from the Democrats to the Republican Party, because the Republican Party is speaking to their concerns about economic issues, inflation, and so on.
[00:20:09] Whereas the Democrats are really, you know, focused on these sort of elitist progressive woke policies. [00:20:16] And it's not just the economy, it's also issues like crime. [00:20:19] You know, the Democrats are very wedded to this progressive agenda of de-prosecution and defunding the police and allowing rampant homelessness. [00:20:30] And the average working class parent, they don't want their kids to have to get off the school bus and walk through a phalanx of drug addicts and junkies and homeless encampments to get to their school. [00:20:41] So, you know, these are ordinary quality of life issues that are motivating the electorate, and they're motivating the working class to move to the Republican Party. [00:20:50] Yeah. [00:20:51] That video out of San Francisco, we played some of that yesterday. [00:20:53] It was just horrifying: these young kids getting off the bus and having to walk through exactly that in this so-called progressive city that claims to care about the homeless. [00:21:01] Okay, how about the six-year-olds? [00:21:03] Do we care about them? [00:21:05] And then, yeah, we've seen that in place after place, especially a lot of families of color, Hispanic families and black families, standing up saying, we don't want this CRT nonsense in our schools. [00:21:16] Don't tell us we're second class, or that our kids arrive here behind the eight ball just because of their skin color. [00:21:22] We don't accept any of your presumptions about our children based on their melanin. [00:21:27] And I do think it's pissing people off, and it's causing a shift, especially with these economic numbers. [00:21:32] It's like, what do you have to lure us over? === Europe's Energy Independence Risks (07:03) === [00:21:35] Like, what are you selling? [00:21:37] And in the meantime, you get messaging from Joe Biden, as they pay almost $5 at the pump, that this is an amazing opportunity. [00:21:44] How does he put it?
[00:21:45] This is an incredible transition. [00:21:47] Don't worry. [00:21:48] This is an incredible transition for all of us in the U.S. economy away from fossil fuels. [00:21:53] So you're welcome. [00:21:55] Right. [00:21:55] And this is why I think Biden does bear substantial responsibility for the inflation and the economic mess we're in. [00:22:01] He basically baked this cake last year. [00:22:03] Remember, his first day in office, he cancels the Keystone pipeline. [00:22:06] He made it harder to drill and transport energy. [00:22:09] So, number one, he basically contributes to the higher gas prices we have this year. [00:22:15] He also pushed for this extra $4 trillion in stimulus last year, deficit spending. [00:22:21] Again, we mentioned the $2 trillion American Rescue Plan, which was stimulus that we didn't need. [00:22:26] COVID was basically winding down, at least as an economic issue. [00:22:29] You had economists like Larry Summers in his own party warning that if he passed the ARP, it could create inflation. [00:22:35] There was substantial risk there. [00:22:36] He did it anyway. [00:22:37] So that was the second thing he did. [00:22:39] And then, you know, let's talk about this Putin price hike idea. [00:22:41] Throughout all of 2021, Biden basically took a very tough-on-Russia position in which he would not use diplomacy to try and find an off-ramp to this Ukraine crisis. [00:22:53] As a result, earlier this year, we had a war. [00:22:55] Now, we could debate whether that was a wise foreign policy. [00:22:59] I personally don't think it was. [00:23:00] But even if you want to take a tough-on-Russia stance, why would you alienate the Saudis? [00:23:07] Why would you cancel America's energy independence? [00:23:10] If you knew you were going to be taking on Putin in the year 2022, you would want to use 2021 to create an energy glut. [00:23:17] And instead, Biden did the opposite.
[00:23:19] And now he's going hat in hand to Saudi Arabia on this trip. [00:23:22] It's very humiliating. [00:23:23] He's basically having to beg for forgiveness to get the Saudis to pump more after he basically said last year he's going to treat them like pariahs. [00:23:30] So there was no overall grand strategy or coherence to this administration's policies. [00:23:37] If they want to get tough on Russia, they should have maintained good relationships with the Saudis, and they should have basically encouraged domestic energy production. [00:23:45] Yeah, they went a different way. [00:23:46] You're right. [00:23:46] Now he's over there on bended knee to the so-called pariahs. [00:23:51] And I guess there was some news that he wasn't going to shake the hands of leaders over there because of COVID. [00:23:55] And everybody knows it's because he doesn't want that photograph, you know, some pariah, look at the two of you buddying up. [00:24:02] I don't know. [00:24:03] That may be fake news, but I heard that from my team. [00:24:05] Let me ask you about this. [00:24:07] If this year teaches us anything, it is that you cannot have security without energy independence. [00:24:12] We have to be energy independent. [00:24:14] The Europeans need to be energy independent. [00:24:16] What's happening in Europe right now is that the Russians are actually restricting the flow of gas from the Nord Stream 1 pipeline. [00:24:23] And Putin is really showing the Europeans who's boss right now. [00:24:27] And I think that there is a significant chance that come this winter, when the Europeans need to heat their homes, that is when the Western alliance on Ukraine may fracture. [00:24:37] I think this is what Putin is betting on. [00:24:39] So we've seen now, again, you cannot be secure as a country unless your source of energy is secure.
[00:24:46] And I think the Europeans are learning that the hard way, and we're learning it the hard way. [00:24:50] It's so crazy, because we were energy independent under Donald Trump, and Biden gave it away. [00:24:55] And we're seeing something similar because of his green energy policies; he's too beholden to the elite super-green faction of his own party that makes huge donations, unlike the working class members who used to be part of his party, who are going to get hit by these policies of making us not energy independent and, you know, dependent on wind and solar, which doesn't work nearly as well. [00:25:21] And that's why what's happening in the Netherlands is interesting, because it's a parallel. [00:25:26] We have these farmers who are about to get hit severely by these attempted green policies, out there protesting. [00:25:34] I mean, we never talk about the Netherlands. [00:25:36] In fact, the Netherlands confuses me, because my husband's Dutch and it always just confuses me. [00:25:41] Like, what is the Netherlands? [00:25:42] What is Dutch? [00:25:42] Why aren't you considered Netherlandian? [00:25:45] Why are you Dutch? [00:25:46] Anyway, I could go on, David, but for the viewers who haven't been following this: farmers are protesting around the Netherlands over the government's new policy, which would see the country slash nitrogen oxide and ammonia emissions by 50% by 2030. [00:26:04] This is in an attempt to go more green. [00:26:06] All right. [00:26:07] And all of this means they have to reduce their livestock numbers. [00:26:11] It could force some farms to shut. [00:26:13] They have to use less fertilizer in a very short time. [00:26:16] And they're saying this demand by the government is totally unfeasible. [00:26:19] And what's going to happen is now the government's going to try to buy up all the farms. [00:26:24] And it's like a takeover. [00:26:26] So they're protesting.
[00:26:28] They're burning bales of hay. [00:26:29] They're kind of doing what the Canadian truckers did, in a way, and fighting back. [00:26:33] And it's somewhat inspirational to see them pushing back on this government overreach, where their government's trying to do to them what Joe Biden's kind of doing to us. [00:26:42] Right. [00:26:43] Yes. [00:26:43] I mean, the Dutch farmers are the new Canadian truckers. These are working class folks who are basically being punished, and they're basically being legislated out of existence by their own government. [00:26:54] And for what? [00:26:55] I mean, basically, some bureaucrat in Brussels had the bright idea that we're going to cut this type of emission by 50% by 2030. [00:27:01] Those are suspiciously round numbers to me. [00:27:04] I'd like them to prove why 50% is the number and why 2030 is the number. [00:27:09] I mean, it doesn't really make sense, right? [00:27:12] They've just kind of picked these numbers arbitrarily out of thin air. [00:27:15] And then what happens is the Dutch legislators say, oh, we have to implement this new directive from Brussels. [00:27:20] They start confiscating these farms and banning a way of making a living that these farmers have been engaged in for generations. [00:27:30] So, you know, it is very similar to what's happened with energy, which is that the Europeans adopted a policy towards energy that made them dependent on Russia, because they refused to use nuclear or develop their own gas for environmental reasons. [00:27:48] They're about to do the same thing with food, which is make themselves dependent on other people's food, because they refuse to produce it themselves, even though they have an enormous natural advantage. [00:27:59] The Netherlands is actually the number two agricultural exporter in the world. [00:28:03] So they are very, very good at this.
[00:28:05] And the Dutch government seems intent on destroying the advantage they have. [00:28:10] It's lunacy. [00:28:11] And if you want to see where all of this leads, just look at Sri Lanka right now. [00:28:15] So, you know, the Sri Lankan government and society just collapsed. [00:28:19] Why? [00:28:20] If you go back to April of last year, they banned these same types of chemical fertilizers that the Netherlands wants to restrict. [00:28:28] And as a result of that, their agricultural output fell by something like a third this year. [00:28:33] And the production of rice, which is one of their main staples, fell by something like 43%. === Housing Market Correction Looms (02:25) === [00:28:38] So all of a sudden, people are going hungry. [00:28:41] They can't feed themselves. [00:28:42] It's a much poorer country, obviously, than the Netherlands. [00:28:45] But as a result, the whole society has collapsed. [00:28:47] Why did they implement that policy? [00:28:49] Well, it's the same types of policies that are being set out of Brussels. [00:28:53] It's this environmental extremism that doesn't take into account the needs of ordinary people. [00:29:00] One other point on the economy, and then I have to ask you a question about Elon and Twitter. [00:29:05] One of the things I heard you say on the All-In podcast was that here domestically, the next things to get hit after the unemployment rate starts to get shakier are nest eggs and homes. [00:29:16] So that's scary. [00:29:17] Do you mean 401(k)s, and that the housing market's likely to crash? [00:29:21] What are you projecting there? [00:29:24] Well, 401(k)s have already been hit. [00:29:25] I mean, we're in a bear market officially. [00:29:29] I think the S&P is down something like 22% for the year, the NASDAQ 30-something percent, maybe the Dow Jones a little bit less, but we're already in bear market territory. [00:29:38] And if you're a growth stock investor like I am, you know, it's even much worse than that.
[00:29:42] So, you know, if you haven't looked at your 401(k) lately, you probably don't want to. [00:29:47] It's going to be depressing. [00:29:49] Don't look at it. [00:29:50] Now, on the housing market, I think the issue there is that if you look at some of these charts showing the ratio of housing prices to median income, what you see is that the ratio has never been this high since around 2006, right before we had the global financial crisis driven by the real estate crash. [00:30:14] So essentially what that means is that people cannot afford home prices as they stand today based on their current income levels. [00:30:23] And now that interest rates have gone up so much, mortgage rates are rising rapidly. [00:30:29] So, you know, your typical home mortgage has gone from, call it, roughly 3% to almost 6% in just the last few months. [00:30:36] So that is also creating a huge amount of pressure on the real estate market, because people simply cannot afford as much house as they could just a few months ago, because they can't borrow as much. [00:30:47] So, you know, I think we are due for some sort of big correction in the housing market. [00:30:51] What tends to happen first is that inventories build up. [00:30:54] You get illiquidity in the market because sellers don't want to drop their prices. [00:30:58] Eventually, they capitulate, and then we see a price decrease in the housing market. === Twitter Deal Accountability Questions (14:24) === [00:31:03] Very, very bearish all around. [00:31:05] Okay, last question. [00:31:07] Elon has officially been sued now by Twitter, which is seeking to force specific performance of his promise to buy Twitter at $44 billion. [00:31:16] He says they materially breached the deal first by not disclosing all of the information on how many bots are actually on Twitter, and perhaps in other ways. [00:31:26] CNBC analysts are predicting Elon may go to jail if he loses this case. [00:31:32] What do you think?
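The borrowing-power squeeze Sacks describes above (mortgage rates jumping from roughly 3% to almost 6%) can be made concrete with the standard fixed-rate amortization formula. This is a minimal illustrative sketch, not anything from the show; the $2,000 monthly housing budget is a hypothetical assumption.

```python
def max_loan(monthly_payment, annual_rate, years=30):
    """Largest principal a fixed monthly payment can service on a
    standard fully amortizing fixed-rate mortgage."""
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # total number of payments
    return monthly_payment * (1 - (1 + r) ** -n) / r

budget = 2000  # hypothetical monthly payment budget
at_3 = max_loan(budget, 0.03)
at_6 = max_loan(budget, 0.06)
print(f"Borrowing power at 3%: ${at_3:,.0f}")
print(f"Borrowing power at 6%: ${at_6:,.0f}")
print(f"Reduction: {1 - at_6 / at_3:.0%}")
```

On these assumed numbers, the same monthly budget supports roughly 30% less loan at 6% than at 3%, which is the downward pressure on house prices he is pointing to.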
[00:31:33] That was a ridiculous comment, about Elon going to jail. [00:31:36] Even Jim Cramer, who's not exactly known for, let's say, not making hyperbolic statements, laughed out loud at that. [00:31:43] The worst thing that could happen to Elon, although I would consider it to be a good thing, is that he could be ordered by the Delaware Chancery Court to actually consummate the acquisition. [00:31:51] He could be ordered to basically perform, or alternately, he could end up having to pay damages, a kill fee, basically. [00:31:57] So that's the risk to Elon, and that's what Twitter is seeking. [00:32:02] What Elon has to show is that there was a material adverse effect related to this bot problem. [00:32:08] That basically the fake accounts, the bots, were massively understated by Twitter's public filings. [00:32:14] They wouldn't give him the information. [00:32:15] And that had basically permanently impaired the business. [00:32:19] And it meant that their revenue would be much lower. [00:32:21] That's his assertion. [00:32:23] And so I think, you know, what Elon has going for him is that the discovery around this for Twitter, I think, is going to be very messy. [00:32:30] The question very quickly is going to become: what did Twitter executives know, and when did they know it, with regard to this bot fake-account problem? [00:32:37] What did they do about it? [00:32:39] Is there an email anywhere in the company in which executives are saying, well, gee, do we really want to pursue this vigorously when we know it may decrease our revenue? [00:32:45] I mean, I'm not saying that email exists, but if it does, it's going to be very messy and embarrassing for them. [00:32:50] So I was a little surprised that Twitter moved forward with this, because I don't think they're going to like the discovery.
[00:32:56] On the other hand, you know, the battle for Elon is he's got to show this material adverse effect, which traditionally is a pretty tough thing to show in Delaware court. [00:33:06] So that's sort of the basic balance there. [00:33:09] You know him. [00:33:10] You've worked with him. [00:33:11] Do you think he still wants the company? [00:33:15] You know, I don't know. [00:33:16] The short answer is I don't know. [00:33:18] I mean, I have been a fan of the idea of him buying Twitter, because he would restore free speech to Twitter. [00:33:25] And I think that's an important cause. [00:33:26] We've talked about that before on the show. [00:33:28] So I still hope the deal goes through. [00:33:32] I'd love to see him restructure the company. [00:33:37] But I don't know if he wants it. [00:33:39] Maybe he realized this would be a big headache. [00:33:41] It surely would be. [00:33:44] And of course, we're operating against this backdrop of a massive stock market decrease. [00:33:48] So obviously it's hard to ignore that the deal may not be economically quite as good a deal as it was. [00:33:54] Say, no, but it's a public service. [00:33:57] It would be a public service. [00:33:59] I would really like it as a public service. [00:34:01] Although, as an investor in some of Elon's other companies, I'm not sure I want him signing up for the distraction. [00:34:06] But yeah, I mean, I think it would be a great thing for society if it actually happened. [00:34:11] Well, he is one of the disruptors, and he's certainly disrupted Twitter in a way that is fascinating to watch, but a little sad for those of us who want to see him close this deal. [00:34:19] David Sacks, always a pleasure. [00:34:21] Thank you. [00:34:22] Thanks for having me.
[00:34:23] And up next, we are digging much more into the Elon Musk lawsuit with a lawyer who had a fascinating piece in the Wall Street Journal today, in which he's got a very different take than the one you're going to hear in the mainstream. [00:34:40] Twitter now officially at war with Elon Musk, but who will prevail? [00:34:45] My next guest just co-authored a piece in the Wall Street Journal that says Twitter's lawsuit looks like a loser. [00:34:51] That's not what you're hearing anyplace else. [00:34:53] Todd Henderson is a law professor at the University of Chicago, one of the top law schools in the world. [00:35:00] And he joins me now. [00:35:02] Todd, thank you so much for being here. [00:35:04] Delighted to be here, Megyn. [00:35:05] So that's a fascinating and provocative headline, and it fairly captures your position. [00:35:11] Twitter's lawsuit against Elon Musk looks like a loser. [00:35:14] If you check any mainstream publication or television show, they're going to tell you that Elon will be the loser. [00:35:21] Twitter's got him. [00:35:22] You know what? [00:35:24] And that he's either gonna have to pay a billion dollars, or he's gonna be forced to buy this company, or, according to the CNBC analyst, will go to jail. [00:35:32] So, let's start with: do you think, on the merits, Twitter's gonna lose? [00:35:38] Or are you just taking a different position on the punishment to Elon if he doesn't go through with it? [00:35:43] Well, let's just start with, you know, don't believe me. [00:35:46] I mean, I'm just a law professor. [00:35:49] I will say I have some credibility, because I was law school classmates with David, who was your previous guest. [00:35:55] So this is the class of 1998 University of Chicago Law School day on The Megyn Kelly Show. [00:36:01] Small world. [00:36:02] But look, the stock market, Twitter's stock, I didn't check it this morning, but yesterday it was $33 or something.
[00:36:11] Twitter's argument is that the court should order Musk to buy Twitter at $54. [00:36:17] And that means if you're a stock market professional and you believe the court is going to do that, you can buy Twitter today for $33. [00:36:24] And when the court issues its order requiring him to buy it, you get $54. [00:36:29] That seems like a pretty good trade if you believe the court's going to do that. [00:36:32] And the stock price is nowhere near $54. [00:36:34] So a lot of pundits are saying that Twitter's got a good case. [00:36:38] I think the stock market, the wisdom of crowds, is kind of on my side. [00:36:44] Yeah, let me address your question, which is, you know, they've asked him, they said, basically, you promised to buy us. [00:36:53] And now that you've backed out, we're going to make you buy us. [00:36:57] And you went to law school, Megyn. [00:36:59] And so you were in a first-year contracts class. [00:37:02] And pretty much the first thing we teach our idealistic, change-the-world students when they come to law school is that not every wrong has a remedy. [00:37:12] And the rule in contracts generally is not "promise and you have to do it or we send you to jail," like these crazy analysts think, but "either do it or pay damages." [00:37:23] That's the general principle that animates our piece. [00:37:28] And so if he backs away and breaches the contract in the technical sense, and I understand, you know, you were talking with David about these bots and whatever. [00:37:37] We can talk about that, whether he is actually breaching or whether they breached. [00:37:40] But let's just imagine he just walks away and says, you know what, I changed my mind. [00:37:45] In that situation, probably the worst thing that could happen to him is that he would have to pay damages. [00:37:50] And of course, there's a question of what those damages would be.
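Henderson's stock-price argument at the top of this exchange is a merger-arbitrage calculation: the market price blends the deal price and a standalone (deal-fails) value, weighted by the market's odds that the deal closes. A minimal sketch; the $33 price and $54.20 deal price are from the conversation, while the $25 standalone value is purely a hypothetical assumption.

```python
def implied_deal_odds(price, deal_price, standalone_value):
    """Probability of the deal closing implied by the market price,
    assuming price = p * deal_price + (1 - p) * standalone_value."""
    return (price - standalone_value) / (deal_price - standalone_value)

# $33 market price, $54.20 deal price; $25 standalone value is assumed.
p = implied_deal_odds(33.0, 54.20, 25.0)
print(f"market-implied probability the deal closes: {p:.0%}")
```

On these assumed numbers, the market is pricing only about a one-in-four chance of a forced close at $54.20, which is Henderson's "wisdom of crowds" point.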
[00:37:53] Some have suggested, you know, Matt Levine at Bloomberg has suggested, that those damages are the damages of the shareholders. [00:37:59] But as we point out in the piece, shareholders are not part of this contract. [00:38:03] They can't sue through Twitter to get their damages. [00:38:06] It's only the damages that Twitter has. [00:38:09] So I don't think the courts will make him buy it. [00:38:12] And I think, for reasons we can talk about, it would be a disaster if we forced people to do things they don't want to do. [00:38:18] And the damages to Twitter are probably a lot less than a billion dollars. [00:38:21] All right. [00:38:21] So let's get to that, because that was an interesting point you raised, about Hearst Castle in California, as an example of what happens if the court in Delaware says: Elon Musk, you must buy Twitter now. [00:38:35] You must do it. [00:38:36] And if Elon Musk then does it, what does Twitter now have? [00:38:40] Who does it have running it? [00:38:41] How does that affect shareholder value? [00:38:43] Like, who wants to force a reluctant owner into running a several-billion-dollar company? [00:38:51] Yeah, as we allude to in the piece, I think it's irresponsible of Twitter's board. [00:38:58] Twitter's board has fiduciary duties to Twitter as an entity. [00:39:02] That means they have to put the interests of Twitter as an entity above anything else. [00:39:08] And forcing someone who doesn't want to run the company or own the company to buy the company seems antithetical to their obligation to do the best thing for Twitter. [00:39:19] If you're selling your house, you want someone to buy your house who's going to take care of it, not someone who would never cut the grass and would let it fall apart or whatever. [00:39:31] And that's the obligation the board of directors has.
[00:39:34] And so I think, you know, forcing a reluctant owner is a really bad idea. [00:39:40] You know, we use the analogy in the piece that if you contract with someone to paint your house and they back out and say, you know what, I got a better deal. [00:39:51] I promised to charge you $500. [00:39:54] Someone else down the street is going to pay me $1,000. [00:39:57] You don't want to go to court and compel that house painter to paint your house, because they'll do a bad job. [00:40:05] They'll shirk and they'll be lazy, maybe in ways that are very hard for you to detect. [00:40:11] And for courts, this raises another problem, which is: imagine that the court does order the person to paint your house, or Musk to buy Twitter. [00:40:20] And then he kind of does a bad job. [00:40:23] He doesn't show up on time. [00:40:25] He's not using a really high-quality paint. [00:40:27] Or in this case, Musk sort of slow-walks the funding with the banks and delays the acquisition, keeping Twitter in legal limbo. [00:40:36] Then Twitter's going to run back into court and say, look, he's not fulfilling. [00:40:40] He's not doing the best job he could closing the deal, or running the company, whatever it is. [00:40:45] And that would be sort of enmeshing the court in a continuing obligation to make sure that the order to specifically perform is actually being carried out right. [00:40:55] Because to think that the court's just going to snap its fingers and say, you do it, [00:40:59] and then Musk will say, yep, sure, I'll do it the best I possibly can, I think is naive in the extreme. [00:41:04] And people need to keep the real world in mind here. [00:41:07] I mean, they sued Elon in Delaware, I presume, because Twitter's incorporated in Delaware, like virtually every corporation.
[00:41:16] But that's a good thing for Elon, too, because this court deals with basically nothing but business disputes, and whoever's going to decide it understands the realities of his power and of how businesses work. [00:41:30] So, yes and no. [00:41:32] Can I agree with you about everything in the premise you said and then sort of push back a little bit in one sense? [00:41:40] The Delaware Chancery Court judges are experts, and they will understand what I just described, the limits of specific performance and those things. [00:41:50] They do, however, have a kind of, you know, we're-in-charge-of-business-disputes attitude. [00:41:56] We're worried that if we let Musk back out of this, about the kind of ramifications that has for other deals. [00:42:02] And they have, because of their expertise, a little bit of an arrogance and a willingness to sort of hold people to deals in ways that don't reflect the economic efficiencies or maybe what the right market conditions would be. [00:42:17] So I think that's a point in my favor. [00:42:20] That's a point in my favor, because I think the arrogance will make them not want to get in a position where they're ordering Elon Musk, the richest man in the world, to do something he's not going to do. [00:42:32] Yeah, we say in the piece there's a little bit of a game of chicken that could go on. [00:42:37] I mean, first of all, the fundamental problem here is something, again, that we teach and you learn in law school, which is that the lay person sees: Musk agreed to buy Twitter. [00:42:49] Well, that's not what happened. [00:42:51] The contract, or the merger agreement, was structured in a particular way. [00:42:56] Musk created some separate entities, X Holdings I and X Holdings II. [00:43:02] They agreed to be funded. [00:43:06] Twitter agreed to cancel the shares of its shareholders and give them certificates, and they could show up at these separate entities and exchange those certificates for cash.
[00:43:14] That particular structure was not Musk himself promising to pay each Twitter shareholder $54.20. [00:43:23] And that structural difference really matters. [00:43:25] The lawyers could have struck this deal in a way that really did bind Musk, that made him accountable for the promise that he effectively made to the shareholders, and he could have been liable for the difference between the $33 stock price today and the $54. [00:43:41] But the lawyers did not structure it that way. [00:43:42] And they're kind of stuck with the structure that they have, which forces Twitter in the first instance to sue these shell companies, X Holdings I and X Holdings II, to force them to do something. [00:43:58] And as we point out, there's no "them," really. [00:44:01] You can't put X Holdings I in jail. [00:44:05] The agreement does require Musk to use his best efforts to get those entities to fulfill their obligations, but it's a kind of second-order thing. [00:44:15] And I cannot see the Delaware courts holding Musk in contempt. [00:44:22] And it just shows a little bit of the limitations of law here in holding people to deals. [00:44:29] So what do you think is likely to happen? [00:44:30] Yeah, I know you end your piece with a good line. [00:44:32] You say they could have structured it that way, but either Mr. Musk's lawyers were too smart for that or Twitter's weren't smart enough to structure it in a way that would have really required accountability on his part to the shareholders. [00:44:42] Twitter, of course, is saying, we were injured. [00:44:44] You know, he's blown up the company. [00:44:46] He's damaged our reputation. [00:44:48] He's created doubt amongst advertisers and our customer base about how many real accounts we have, and so on. [00:44:54] So he came in, he damaged us, he left. [00:44:57] So we want to be made whole. [00:44:59] And maybe the wholeness is the difference between $33 a share and $54. [00:45:02] Who knows?
[00:45:03] But knowing both sides' arguments there, [00:45:06] what do you predict, [00:45:07] and I won't hold you to it, is likely to happen? [00:45:09] Yeah, so the first thing is on the lawyering, and that's a plug for law for my students. [00:45:14] Lawyering really matters. [00:45:16] And it seems like, you know, oh, they just agreed to this deal. [00:45:19] Lawyering really matters. [00:45:20] And that's a point we want to get across in the piece. [00:45:22] As for predictions, I mean, I'm a law professor, and so I'm not a prognosticator as such. === Automating Spam Verification Problems (12:52) === [00:45:28] And so, you know, with that caveat, I think, at the end of the day, the court is going to want to not force the sale of a $44 billion company to somebody who doesn't want it. [00:45:44] And I think the stock market, as I said, reflects that reality. [00:45:48] Musk agreed to pay this billion-dollar breakup fee. [00:45:52] And so that's the cleanest way out of this. [00:45:54] There is a question we raise in the piece whether or not that is actually Twitter's damages. [00:45:59] What pundits are doing by pointing to the shareholders and the $54 is just completely mistaken. [00:46:04] The shareholders are not a party to this contract. [00:46:06] The lawyers could have made them a party to the contract and brought their damages onto Musk. [00:46:12] They didn't do that. [00:46:13] Twitter is the party to the contract. [00:46:16] And so only Twitter can sue for its damages. [00:46:18] And you mentioned some things like their reputational harms or things like that. [00:46:23] Great. [00:46:24] If Twitter can go into court and prove that it is worth less today, and make a causal link between that reduction in its value, reduced profits, reduced asset value, market value, and Mr. Musk's behavior, then they could get those damages.
[00:46:43] But I haven't seen any evidence that Twitter is less profitable today than it was before Mr. Musk made his offer. [00:46:52] And that's what they'd have to prove. [00:46:54] And Twitter took a dive along with it. [00:46:56] Yeah, that's going to be a really tough standard. [00:46:57] So why don't you think, though, and we only have a short time, that they'll make him pay the billion bucks, the breakup fee? [00:47:05] Well, I'd say they may do that. [00:47:09] And that seems like the cleanest way out. [00:47:11] And for Musk, I think, you know, a billion dollars, that's couch change for him. [00:47:14] And so I think that's a win for him, if he can walk away here with just paying the billion dollars. [00:47:20] The reason I'm a little bit cautious about that is because breakup fees are supposed to reflect an actual estimate of the damages. [00:47:29] So imagine you're buying a house, something your listeners are all probably pretty familiar with. [00:47:33] You put up some earnest money. [00:47:35] The earnest money you put up when you try to buy a house and then walk away, that's what you lose. [00:47:40] You can't put in a contract: buy this house or pay me a billion dollars. [00:47:44] That's not the way it works. [00:47:45] The billion is supposed to be an estimate of Twitter's losses. [00:47:49] I don't think those are a billion dollars, and the courts are reluctant to enforce penalties. [00:47:54] Fascinating. [00:47:55] This is, like, totally different than what, you know, the people who hate Elon Musk, who control the rest of the media, say. [00:48:02] I appreciate the honest, straightforward analysis. [00:48:05] Todd Henderson, please come back. [00:48:07] I would love to, Megyn, anytime. [00:48:09] Thank you. [00:48:09] All right. [00:48:10] All the best.
[00:48:10] Coming up: the bots angle to this story, from a person who knows. She's an expert in how many bots there are and how they're manipulating you right now. [00:48:24] Joining us now is someone who kind of studies bots for a living. [00:48:28] So she knows a lot about what Elon Musk says is his problem with the Twitter situation, though Twitter says it's a ruse. [00:48:36] She has also studied social media disinformation campaigns for years. [00:48:40] And believe it or not, you have been a victim of a social media disinformation campaign. [00:48:45] They're ubiquitous. [00:48:47] They're everywhere. [00:48:48] And she can tell you some of the signs of, let's say, the Twitter account or the LinkedIn account that you may be interacting with that you have no idea is fake. [00:48:57] The person's fake, it's fake news. [00:49:00] So she's neck-deep in something that we're all either living with or about to know a lot more about in the coming decade. [00:49:06] Renée DiResta is the technical research manager at the Stanford Internet Observatory, and she joins me now. [00:49:13] Welcome, Renée. [00:49:14] Great to have you. [00:49:15] It's great to be here. [00:49:16] Thanks for having me. [00:49:17] So, yes, you've done far more than study bots online, exactly, but that's where we left off with our last two guests. [00:49:24] So we'll just pick it up there with you. [00:49:26] Putting aside whether that's genuinely Elon's problem with Twitter, you know, they'll hash that one out between themselves. [00:49:33] Bots on Twitter are a problem. [00:49:35] Twitter's acknowledged that too. [00:49:37] And bots online are a problem. [00:49:38] But let me ask it this way. [00:49:40] If Elon hired you and said, Renée, I need an expert in, like, how I can figure out how many bots there are on Twitter, can it even be done? [00:49:51] I mean, is it really knowable?
[00:49:54] So, I read the filing, you know, the lawsuit filing, and I'm not a lawyer, but I was very interested in the technical aspects of it. [00:50:01] And there's a couple things in play here. [00:50:04] So first of all, there's a strong public perception that bots are a huge problem on Twitter. [00:50:09] And we can talk about the history of that, why that is, you know, some of the dynamics in 2015 when they were particularly impactful. [00:50:16] Twitter actually did take a lot of steps to minimize the impact of bots after 2015, in the 2017-2018 timeframe. [00:50:24] But the first thing I'll say is that bots are not an evenly distributed problem on Twitter. [00:50:29] Meaning, if you're a person like Elon Musk and you are famous, you have millions of followers, you're active in spaces like cryptocurrency, where scams are abundant, you're going to see a lot more bot activity, in part because people make impersonation bots of you and then they try to kind of dredge in your replies to manipulate your followers to pump cryptocurrency scams. [00:50:53] So Elon no doubt sees a whole lot more of this stuff than the average person who's engaging on Twitter does. [00:50:59] So I think that perception is really key here. [00:51:02] He has, my team's telling me, he has 100 million followers right now. [00:51:04] So, I mean, yes, he's got a lot of incoming. [00:51:08] Go ahead. [00:51:09] Right. [00:51:10] So the terminology for "bot," you know, it's a little bit fuzzy. [00:51:15] It should mean automated accounts, [00:51:18] meaning a person doesn't sit there typing content into the user interface. [00:51:25] Instead, what's happening is the content is kind of pushed out at set time intervals. [00:51:30] Or when a famous person like Elon tweets, you have bots that will see that his tweet has come out and will immediately reply to it. [00:51:37] This happened with Donald Trump.
[00:51:38] This happens again with many, many famous accounts, because people kind of want to get the reply in first. [00:51:44] Under Donald Trump, you used to see people selling, like, "liberal tears" mugs, just a form of spam, right? [00:51:50] Economic spam. [00:51:51] And so, again, this is not an uncommon problem. [00:51:54] The question is, in this context, per this legal filing: Twitter is claiming that fewer than 5% of its monetizable daily active users, its mDAU, are these spam bots. [00:52:11] That is different from just the number of users on the site. [00:52:14] And so Twitter, in its calculation of mDAU, is theoretically already filtering out these spam bots that we all do know are on the platform. [00:52:22] Elon is saying, again, per my understanding of these filings, that Twitter is misrepresenting that number. [00:52:28] And so in response, what he asked for was access to, first, an understanding of Twitter's methodology, which, it's my understanding, they provided. [00:52:36] And I know in mid-May the CEO, Parag, was tweeting about this methodology. [00:52:41] You know, they're sampling, they have manual review. [00:52:45] It's almost impossible for an outsider, even a researcher like me, to concretely say that someone or something is a bot, is an automated account, or is a fake account. [00:52:55] Oftentimes, what people see is, for example, a bunch of accounts posting the same content over and over again. [00:53:01] So they think, oh, that's automated. [00:53:03] That's not always necessarily true. [00:53:05] What Twitter is looking at is information that it has about, did this account verify its phone number? [00:53:11] What mechanisms is it using to engage on the platform? [00:53:16] Does it have other social media platforms linked in some way? [00:53:18] You know, they've got a number of different types of information in the background that no outside person can see.
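The sampling-plus-manual-review methodology just described can be sketched as a simple estimation problem: draw a uniform random sample of accounts, have reviewers label each one using private signals, and put a confidence interval around the spam share. This is a hedged illustration, not Twitter's actual process; the 43-of-1,000 review labels below are purely hypothetical numbers.

```python
import math

def spam_share_estimate(labels, z=1.96):
    """Point estimate and ~95% normal-approximation confidence interval
    for the spam fraction, given manual-review labels (True = spam)
    on a uniform random sample of accounts."""
    n = len(labels)
    p = sum(labels) / n
    half = z * math.sqrt(p * (1 - p) / n)  # binomial standard error
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical review run: 43 of 1,000 sampled accounts flagged as spam.
sample = [True] * 43 + [False] * 957
p, lo, hi = spam_share_estimate(sample)
print(f"estimated spam share: {p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

Even with 1,000 reviewed accounts, the interval is wide: a 4.3% point estimate is statistically compatible with anything from roughly 3% to 5.6%, which is part of why a bright-line "fewer than 5%" claim is hard to confirm or refute from outside.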
[00:53:28] What we look at as researchers often comes through what's called the Twitter firehose. [00:53:32] And Elon asked for the Twitter firehose and it was provided to him, again, per the information in this lawsuit. [00:53:38] But you can't gauge bots by looking at the tweets that are coming through the firehose. [00:53:43] You can see maybe common uses of phrases. [00:53:46] You can see some repetition in terms. [00:53:48] But again, you can't verify that those are automated accounts, unfortunately. [00:53:52] And you can't verify that they have not already been filtered out of this monetizable daily active user count, the mDAU. [00:54:01] Wait, so that makes it sound to me like you might give Elon the point that he was not provided with satisfactory information to be able to tell what percentage of the monetizable daily active users are bots. [00:54:17] So there's two different things. [00:54:19] There's the data that he was given, which I'm saying is not particularly useful for answering the question that he wants answered. [00:54:24] But then what Twitter says in its lawsuit is that they did provide extensive briefings detailing their methodology, of how they arrive at mDAU, where their sampling happens, and the processes by which they go and check that background data to understand if this is a real account. [00:54:40] But not the actual data, not the underlying data. [00:54:42] That's where he's going to try to wiggle, right, Elon? [00:54:44] Probably. [00:54:45] You're the lawyer, not me. [00:54:47] I'm listening for exploitable points. [00:54:49] You know, if I'm Elon's lawyer and he's going up against, well, Twitter hired Wachtell, which is a great, great, you know, white-shoe law firm. [00:54:55] They both have these white-shoe law firms. [00:54:56] So it'll be the very best and brightest lawyers duking it out.
[00:55:00] Yeah, he's going to say, I don't need to accept their summaries when I'm paying $44 billion. [00:55:06] I want to see the actual data. [00:55:07] And then they're going to say, we gave you everything that was reasonable under the circumstances. [00:55:10] Oh, and by the way, you kind of waived your right to have a full vetting of all of our methodology when you made the offer. [00:55:17] So it'll go back and forth. [00:55:18] Okay, that's fascinating, though. [00:55:19] You know, until you just said that, I never really asked myself what exactly a bot is. [00:55:25] I just kind of assumed it was a computer-generated, like, human-created-at-some-point program that sent out a bunch of annoying messages to us all on various forms of social media. [00:55:39] So that's kind of the purist term, but colloquially it really has taken on a meaning where I think "bot" and "troll" are often used quite interchangeably. [00:55:48] You know, people's experience of Twitter, of course, everybody has the experience where some, you know, we used to call them egg accounts, with the old egg profile picture. [00:55:55] There was no profile picture. [00:55:56] There was just an egg, and it would send something nasty back at you kind of instantaneously. [00:56:01] And we've all had that experience. [00:56:04] As public awareness about bots increased, particularly in the 2016-2017 timeframe, when there was a lot of conversation about fake news, about small groups of accounts kind of trying to make things go viral, about manipulation using fake accounts to make hashtags trend, just kind of political shenanigans, [00:56:24] there started to be quite a lot of academic research on bots, which had actually gone back several years prior to it really coming to public awareness.
[00:56:31] But once it came to public awareness, there was this interesting phenomenon by which the more people learned that there were such things as bots on Twitter, the more they started to actually kind of immediately dismiss people who would reply to them with like a nasty or snarky response. [00:56:46] Oh, that must be a bot. [00:56:47] You know, this was particularly as the kind of polarization on Twitter really heated up and anybody who waded into any vaguely political topic, which now is, you know, much of Twitter, would get this response back and they would decide, oh, that's a right-wing bot. [00:57:01] Oh, that's a left-wing bot. [00:57:02] And so this idea that the person who was responding to the kind of like nasty throwaway account was a bot, that was the term that people came to use, even though most of those accounts are not actually automated. [00:57:14] Most of them are not really bots. [00:57:16] So, and I definitely want to get to like your experiences with identifying the fake ones. [00:57:22] And, you know, there are some tells which could be useful to, you know, our listeners and our viewers. [00:57:26] But one of the reasons I think this is so important, what you do, is I remember before I went over to interview Vladimir Putin the first time, I had a big briefing at NBC, like behind closed doors with FBI, CIA, like all sorts of top intel analysts and legit guys, you know, not hard partisans like we've seen in some corners, but I mean, legit guys who had taken a hard look at this. [00:57:51] And they showed me on a graph, not just how like the sort of Russians had been manipulating conversation in America leading up to the election, but even how I'd been targeted. [00:58:02] They could pull sort of the bot activity around my own name, especially after I got in the crosshairs of Donald Trump and how it was amplified, like how negative tweets were amplified in different pockets of the world. [00:58:14] It was crazy. 
[00:58:15] I mean, like the way that it can be seen if you care to, if you actually know what you're doing like you do, it's all right there. === Algorithmic Manipulation of Networks (15:14) === [00:58:20] It's like right there. [00:58:22] It's very traceable, or at least it was back then. [00:58:24] And so this was the point I try to make to my audience a lot, Renee, which is you don't, you can 100%, if you so believe, you can, you can say that Donald Trump won the 2016 election fairly. [00:58:38] And you can even believe that he won the 2020 election. [00:58:40] You can be, you know, sort of one of the people who believes his claim that that election was stolen from him. [00:58:45] But I'm telling you, the Russians interfered with the 2016 election and they've interfered with our national dialogue for years prior to that and after. [00:58:55] And they definitely were pro-Trump, but their agenda is so much bigger than that. [00:58:58] They were pro-Trump to the extent they didn't like Hillary. [00:59:00] But what they really want is division in America. [00:59:02] They want us weakened. [00:59:03] They want us fighting with one another. [00:59:05] They'll take both sides of the Black Lives Matter issue or whatever issue and just try to stoke a fight. [00:59:10] That's true. [00:59:11] That really happened. [00:59:12] And you know that firsthand. [00:59:15] Well, and that's something that I tried to emphasize actually in my own work also. [00:59:19] I think, you know, I led one of the teams for the Senate Intelligence Committee in 2018. [00:59:25] So there were tech hearings in 2017 as the public, you know, as investigations began to show that Russian interference in the election had happened. [00:59:34] Again, many people in the Obama administration knew about that in advance. [00:59:37] I think there's been a number of books written on this topic. [00:59:40] The FBI had been observing the activities of the GRU, which is Russian military intelligence.
[00:59:45] There was an excellent article in mid-2015 in the New York Times by a reporter named Adrian Chen. [00:59:52] I believe it was called The Agency. [00:59:53] And it talked about the Internet Research Agency, the Russian troll factory, which was really kind of put to use first, very domestically, actually, to try to justify the Russian invasion of Crimea. [01:00:03] Of course, this kind of harkens back to the interesting dynamic that we're in with the current Russian invasion. [01:00:08] But what was happening there was they realized that propaganda had changed. [01:00:15] And there's this very old Cold War phenomenon called the agent of influence, right? [01:00:19] And this was, have you ever seen the TV show The Americans? [01:00:22] That's what they are. [01:00:23] So they're these kind of like deep cover agents. [01:00:26] You know, they put them there and they recruit assets. [01:00:28] And you see in the narrative of The Americans, they recruit this black man. [01:00:32] This is during the civil rights era, I think, kind of talking about civil rights issues. [01:00:38] That phenomenon of agents of influence, you used to have to actually like put someone physically in country to go and infiltrate an activist movement and nudge things in a particular direction. [01:00:47] Or you started front media organizations. [01:00:50] That required a whole lot of effort. [01:00:52] You had to have front journalism. [01:00:54] You had to pay people. [01:00:56] It took multiple years at times for false news to go viral back then. [01:01:01] Viral was quite different in the age of just broadcast media and print. [01:01:05] But what the Russians realized is that you could actually pretend to be somebody else online quite easily. [01:01:10] And more than that, you could ingratiate yourself in online communities quite easily. [01:01:15] And so this was where, you know, it was just sort of an evolution of that tactic. [01:01:20] It wasn't that propaganda was new.
[01:01:21] It wasn't that Russian interference in elections was new or interference in American politics. [01:01:26] It was that social media had put us into these online factions where we were already telegraphing certain aspects of our identity, right? [01:01:35] If we were joining a Texas secessionist group or a Black Lives Matter group, we were saying, this is a belief that I hold, right? [01:01:41] And so when they decided that this destabilization effort was worth going for, what they did was they didn't use fake news, actually. [01:01:51] That was kind of a wholly separate thing that military intelligence, the GRU, was doing, but just staying focused on the Internet Research Agency, the troll factory. [01:01:59] What they started to do was create pages and just repurpose content from hyper-partisan and identity-based American media that already existed, or even just kind of memes that were very identity-based. [01:02:11] And they created dozens and dozens of these pages in which they really got very, very deep down into what it meant to be an American and who America was for and these kind of existential questions. [01:02:24] And, you know, they had extraordinarily niche, I mean, they really did their research. [01:02:29] If you were a person with an incarcerated spouse, there was a page for you. [01:02:33] If you were a Texas secessionist, there was a page for you. [01:02:37] There was a page for Chicanos, just all of these different facets of American identity, Muslims, feminists, liberals. [01:02:45] They just kind of ran the gamut. [01:02:47] Lots and lots of conservative and Black Lives Matter pages, though. [01:02:49] Those were the two that they really wanted to pit in opposition. [01:02:53] And by pretending to speak as a member of that community, the person who receives the message is much more receptive to it.
[01:03:00] They don't think, oh, I'm getting some incentivized political propaganda piece from some random news site I've never heard of. [01:03:07] Instead, they see like commentary on Twitter that looks like it's coming from a black woman that says, as black women, we shouldn't do this. [01:03:14] We shouldn't vote for Hillary Clinton. [01:03:16] As Texas secessionists, we need to rally at this park on this date at this time to preserve our Texas identity. [01:03:23] And so it was always framed very much as like, we are a member of your community sharing your views. [01:03:29] And that was how it was really done. [01:03:31] It was really entrenching people, creating pride, deep pride in their identities, and then pitting those identities as inherently in opposition to each other because of this question of who is America for. [01:03:43] It's so scary, manipulative. [01:03:47] And you don't know how many times it's happened to you. [01:03:50] What genuinely held belief do you have right now that's been planted there by somebody else intentionally in an effort to manipulate your vote or undermine our country? [01:04:03] It's scary to think about. [01:04:05] And you've also pointed out that it's not just Vladimir Putin. [01:04:10] He kind of worked in a way hand in hand with the social media companies in two ways. [01:04:15] Tell me what you think of this. [01:04:17] In the first way, that stuff did get on social media with absolutely no gatekeeping for a long time. [01:04:24] And in the second way, they too are trying to manipulate us. [01:04:29] You know, you were in The Social Dilemma, that film by Tristan Harris, who's been on the show. [01:04:34] That's one of the points the film makes, you know, that the social media companies want to stoke outrage, want to fire us up, want to divide us.
[01:04:44] My feed on Twitter or Facebook or whatever will look totally different from my neighbor's, and my news consumption will be completely manipulated based on my prior likes and so on. [01:04:52] It's not just about like giving you what's in the news today. [01:04:55] It's about trying to collate results that they think will fire you up or outrage you in particular. [01:05:01] And then when we all sit back and we ask, why are we so divided? [01:05:05] Why don't we feel patriotic anymore? [01:05:07] Why do we all hate each other? [01:05:08] And there really are answers to those questions. [01:05:12] There's a lot there. [01:05:14] So I'm trying to think of how to kind of explain this. [01:05:18] There's a financial incentive. [01:05:20] There's a business model incentive to social media, which is to keep you on site because most of these companies are ad-based, which means that in order to serve you an ad, you have to be there to see it. [01:05:31] This creates a kind of perverse incentive where they're constantly gathering data to try to make sure that they're serving you content that you're interested in so that you stay on site. [01:05:41] And what starts to happen is that, you know, they want to show you content that you're going to engage with. [01:05:46] So if you join groups, they're going to show you a lot of posts from those groups. [01:05:51] If you follow friends and you engage with, you know, baby pictures and wedding pictures, you know, that's always going to be kind of pushed into your feed because people love baby pictures and wedding pictures. [01:05:59] So it's not so much a deliberate intent to say like we want to rile people up, but the problem is that the intersection of social media dynamics with human nature and with the polarization that does exist creates an unfortunate feedback loop where the user does have agency. [01:06:19] You know, we can decide what to click on.
[01:06:21] We can decide what to share, but we're picking what to click on and what to share from content that's been curated for us or recommended to us. [01:06:30] So that's where the kind of agency intersects with the algorithm, right? [01:06:34] And then we have these tools that we've been given to be propagators. [01:06:37] And one of the things that social science regularly shows is that people tend to propagate messages that are, you know, they have a strong, what's called signaling factor. [01:06:46] I am a member of this political tribe. [01:06:48] I am a member of this American identity. [01:06:50] You know, I am a member of this particular group. [01:06:53] And oftentimes people really respond to language of like moral righteousness, right? [01:06:58] And so you're saying, I am outraged about this story and I think the world needs to know. [01:07:05] And so I am going to click that share button. [01:07:08] Again, this is content that's been curated for and recommended to me. [01:07:11] I do have the decision about what I want to do with it. [01:07:14] But in that moment, because of the norms and the behaviors that we've kind of come to embody on social media, the way that online behavior, you know, particularly on Twitter, which is really like the arena, the way that that's evolved, we do tend to share this stuff that has this strong component, the strong moral component where we're saying, you know, as a good Democrat, I am very outraged about this particular political decision that these other guys made. [01:07:40] And so I am going to share along that story that makes them look terrible, which is then going to continue to propagate. [01:07:47] It's a challenge because you do want social media to serve as an amplifier, as a way to call attention to speech. [01:07:57] This is where we can talk about Elon. [01:07:59] What he's actually asking for in the free speech kind of debate is actually a really interesting thing to interrogate.
[01:08:05] But we are using them increasingly as like kind of factional battles for attention, ways to activate political tribes, again, fundamentally in opposition to each other. [01:08:16] And so just to kind of connect it back to Russia, there are these, and China and Iran. [01:08:21] And I mean, every nation state has these accounts at this point. [01:08:25] It's not new anymore. [01:08:26] It's not novel. [01:08:27] Most of them are not particularly impactful. [01:08:29] That is one thing I really do want to caveat. [01:08:31] You know, there are hundreds of thousands of Chinese bots. [01:08:33] They just don't get much engagement. [01:08:35] So you have these things that are throwing accelerants on the fire, but the fire is really domestic at this point. [01:08:41] It's really domestic American influencers, domestic, highly activist crowds constantly fighting in this particular environment. [01:08:50] Again, through the sort of unintended consequences of the business model incentive structure and the curation and recommendation tools, the algorithms, that were developed. [01:09:01] But you say unintended. [01:09:02] That may have been true originally, but now that it's been called to their attention, you can't say that anymore. [01:09:08] They know what they're doing. [01:09:08] They don't care. [01:09:09] I mean, look at the whistleblower from Instagram and Facebook. [01:09:13] They know. [01:09:13] They don't care. [01:09:14] Their money is more important to them than the wellness of the country. [01:09:17] I think that, I mean, I'm not going to dispute that. [01:09:19] I think that there's a lot of really horrible stuff that came out in the whistleblower thing.
[01:09:22] But there's one thing that has been really interesting, which is this question of once those networks have been established, meaning once we've all kind of self-sorted over the last seven years into these very highly factional, you know, highly activist online factions, it's really hard to know what to do next. [01:09:38] So even when the platforms decide, hey, you know, uh-oh, we recommended QAnon to these hundreds of thousands of people. [01:09:46] What do you do after that? [01:09:47] Right. [01:09:47] This is where I kind of feel like we're paying debt, you know, the debt accrued from these terrible decisions that were made. [01:09:55] Maybe the answer is not in the curating. [01:09:57] It's in the obsessive, you know, need to make us part of their feed every two minutes, you know, all the notifications. [01:10:03] And like, maybe that's the answer. [01:10:05] Now you've created this monster by feeding us all this disinformation and trying to shove us into these groups that are highly partisan or what have you. [01:10:12] Maybe the answer is since you can't undo it, leave us alone. [01:10:15] Let us go on Facebook if we want to go on Facebook. [01:10:16] Stop tapping us on the shoulder all day, having the computer think about how to get into our heads. [01:10:22] You know, this is where I think, I have, I've had Twitter notifications off. [01:10:25] Twitter is my favorite, my favorite social media platform. [01:10:28] It's the one I spend the most time on. [01:10:29] You know, despite all the, you know, all the critiques, all the disasters, I really do find it an interesting place for like, you know, hearing novel things that I wouldn't hear otherwise and seeing people's perspectives. [01:10:40] I like Twitter, but I did turn off notifications about five years ago. [01:10:43] And I think that it means that I open it on my terms as opposed to getting some kind of push notification.
[01:10:51] You know, I think Tristan talks about this a lot in The Social Dilemma. [01:10:54] And we're starting to see Apple kind of come in. [01:10:57] You know, you might notice if you get a lot of notifications from a particular app, Apple will actually kind of pop up a prompt if you have an iPhone asking, Do you want to receive these? [01:11:05] Right. [01:11:05] And so it's constantly saying, you know, how do we avoid, you know, it's almost like the device manufacturer sort of serves as the gating function against the excessive push notifications. [01:11:16] But I get them from, you know, media apps. [01:11:18] It's just everything is a constant battle for attention. [01:11:21] And so whether that's social media or even again, unfortunately, like media properties, you know, media apps needing to compete for attention, wanting to drive people directly to their apps, this phenomenon of like the constant, incessant push notification constantly. [01:11:36] Well, yes, but it's not the same. [01:11:37] And look, I'll be the first to tell you that cable news is based on outrage and they want you to be on their websites as much as anybody else. [01:11:44] But they're not like, they don't have the access to so much personal information about you in the way that your phone does. [01:11:50] That's what's so dangerous about the social media companies. [01:11:52] You know, they can see everything and they use it. [01:11:55] You know, they use it for evil. [01:11:57] Don't get me wrong. [01:11:58] I go on these websites, but I'm very guarded. [01:12:00] And, you know, I'm now, thanks to movies like this one. [01:12:03] I'm constantly with my kids. [01:12:04] Like, don't say yes to the notifications. [01:12:06] No, the answer is no. [01:12:08] And don't give any information about yourself. [01:12:09] And don't, you know, it's like, it's, it's hard to raise kids in this era because what they want, what my kids want to do is just go on these games.
[01:12:16] You know, they don't, they don't do any social media, but the games want a lot of information about them. [01:12:22] And then it's like, year of birth, what is your full name? [01:12:25] What is a phone number to recover the account? [01:12:27] I'm like, ah, I don't want them to know all this stuff about my child, right? [01:12:31] But like, then he can't get on the game. [01:12:33] I don't, it's, it's very disconcerting. [01:12:35] Yeah. [01:12:36] Mine are eight, five, and two. [01:12:38] So I have little, little kids, but, um, the eight-year-old is definitely in the, uh, I want to play Among Us phase. [01:12:44] And I'm like, who, what are the chat restrictions on Among Us? [01:12:47] Let me go play it for a while first, you know. [01:12:50] Right. [01:12:50] Um, but there is also, like, you know, you're right. [01:12:53] This is where people go, though, for social connection now, right? [01:12:58] I mean, I was on America Online when I was in like, you know, sixth grade or something like that. [01:13:03] You seem too young. [01:13:03] So you would have been like two. [01:13:06] No, no, no. [01:13:06] I was, um, I, you know, was definitely one of those people who like got all those free CDs and like ran up my parents' phone bill secretly and got in a lot of trouble for that. [01:13:15] But no, I was, I was like very early to the internet. [01:13:18] And as a person who was and remembers like anonymous chat rooms and things, I definitely think about all the really stupid decisions I made as a kid and how I'm, you know. [01:13:27] Thank God I was an adult when this happened. [01:13:29] I was like 25 when the internet came out. [01:13:31] That was safe. [01:13:31] I had done all my stupid shit privately. === Political Bias in Censorship Decisions (14:56) === [01:13:35] Yeah, no, it's a real thing, though, right? [01:13:37] I mean, like teaching kids to understand the incentive structures, I think, is actually huge.
[01:13:40] And, but again, so much of it is, um, this is kind of human nature and our desire to connect with each other. [01:13:47] You know, it's fun to find groups where you have common interests. [01:13:51] And this is the one thing that I think is important to raise, right? [01:13:54] When we think about social media company obligations here, it's very, very clear. [01:13:59] There's some real bright lines where we say, okay, Russian, Chinese, Iranian bots, like that's a hard no. [01:14:04] And the policy that was developed to address that was a policy called inauthentic activity, right? [01:14:10] It was a policy around inauthenticity. [01:14:12] The argument they were making was not that this stuff was true or false. [01:14:16] Oftentimes the content wasn't even falsifiable. [01:14:18] It was just political propaganda, you know, which we're awash in everywhere. [01:14:22] But what they were saying was that you could not pretend to be a Texas secessionist from a troll factory in St. Petersburg. [01:14:31] And so the arguments for taking down these accounts, for disrupting these networks, were all around authenticity. [01:14:37] But when you ask the question of what do you do about the outrage machine, since so much of it is domestic, right? [01:14:43] Since so much of what is curated for you and what is recommended to you is not Russian at all. [01:14:48] It's just domestic hyper-partisan content or, you know, again, these sorts of facets of identity, identity-based content or interest-based groups. [01:14:56] The question that kind of confronts the platforms now is what can you recommend? [01:15:01] How should you decide what to recommend? [01:15:04] And this is where that really interesting conversation around free speech versus content moderation begins to come into play. [01:15:11] Because there is a sense that if the platforms nudge the algorithm, and let's use "the algorithm," it's not the best term, but I think it's the most colloquial at this point.
[01:15:22] If the algorithm, if Facebook nudges the algorithm, you might recall after January 6th, they had what they call these like break glass measures, where it chooses to deprioritize political content in the feed. [01:15:35] There are a whole lot of people in politics who get very, very upset about that, who see that as censorship, who see that reallocation of attention by changing curation as being fundamentally viewpoint-based discrimination. [01:15:50] And so that's where this tension really comes into play. They're not just there, though. [01:15:56] I mean, I understand, like we can dispute whether that was the right move or the wrong move, but being more on the right side of the aisle, I can see the censorship that they do to the conservatives and their viewpoints all the time. [01:16:07] And I know like COVID, that's not, it wasn't even a right left issue. [01:16:11] I mean, all my Democrat friends in New York practically are ready to vote Republican over what was done to their families during COVID and the information that was classified as misinformation online, all of which turned out to be true, you know, like whatever, the earlier stuff about this thing looks like it really came from a lab. [01:16:28] And, you know, you'd get censored for saying things like that. [01:16:31] And questions about the vaccine, my friend Dave Rubin, he got banned from Twitter for a while for saying the vaccines don't prevent the spread. [01:16:39] And now we know that's true. [01:16:41] Like they don't prevent the spread, you know, like anyway, things like that that were deemed disinformation that we later learned are not disinformation and we should have been allowed to have an open conversation about it and dispute it in the, however you want to call it, public square or in these forums that we're all on. [01:16:55] That's where the word censorship comes up. [01:16:58] And they were censored. [01:16:59] The Hunter Biden laptop story, right?
[01:17:01] Which Twitter wouldn't allow to be circulated, saying that was disinformation. [01:17:05] Meanwhile, their censorship campaign was disinformation. [01:17:09] The Hunter Biden laptop was real and people should have been able to see it and make up their minds about whether it mattered. [01:17:15] See, I actually agree with you. [01:17:17] Even approaching it largely from the center left perspective, I don't think that, I don't think that censorship, I don't think that the takedowns actually work. [01:17:26] And there's a variety of reasons for that. [01:17:27] But let's stick with COVID because I think that that's an interesting, that was an interesting time. [01:17:32] Because the problem with COVID was the platforms were trying to decide how do we take our health misinformation policies that we've had for years. [01:17:42] You know, Google since 2012 has had a policy called your money or your life, right? [01:17:46] And what it says is we shouldn't be returning search results to you based on what's popular, because what's popular can be manipulated. [01:17:55] Because if we're using engagements, or on a search engine backlinks, or likes or something like that, then we're just surfacing what's popular, which means, going back to the bots or the fake accounts, anybody who can generate those engagements can kind of trick the curation algorithms into returning stuff that is popular. [01:18:14] So the question that the platforms have to ask is, you know, are there certain areas where they should try to return accurate information? [01:18:22] And this is where those original policies came from, this idea of your money or your life, your health or your finances.
[01:18:27] If you get a cancer diagnosis and you go to Google and you search for the name of your cancer and what comes up is a bunch of like eat some mushrooms and have some peach pits, you know, cyanide cures cancer kind of alt health quackery, you're probably not going to necessarily get the, you know, the kind of most authoritative medical information or links to, you know, to the right hospitals that you might want to connect with. [01:18:50] Now, what's happening there is there's actually been a scientific consensus, and it's evolving, science is always evolving, but there's some sort of consensus where there's some sort of expert opinion that says, when you get this cancer diagnosis, these are the reputable centers that you should go to. [01:19:05] And here is the most reputable information about the various facets of the diagnosis you've just been given. [01:19:12] Versus if you go to a social media platform, particularly in the kind of 2015 to 2019 timeframe, and you typed in that same thing, you'd probably find some good support groups, which is very useful. [01:19:24] But you might also find just a lot of people who are creating content to try to pull in people who've gotten these diagnoses so that they can sell them something. [01:19:33] And so what was happening during COVID, even prior to the rollout of the vaccines, was you had this novel disease. [01:19:39] The health institutions were not producing good content. [01:19:43] They weren't saying anything, really. [01:19:44] They were very reticent to communicate. [01:19:47] There was no strong consensus about what had happened. [01:19:50] Nobody knew whether it was a lab leak or pangolins in a wet market. [01:19:54] And so the platforms, though, had to decide what results do we return in this environment of incomplete consensus. [01:20:02] And so that's where what you start to see is these policies that were made for much more established things. 
[01:20:10] The MMR vaccine stuff is very, very well established. [01:20:13] When people search for MMR vaccines on Facebook, particularly after the measles outbreak in Samoa and the measles outbreak in Brooklyn in 2019, they didn't want to surface the most popular content because oftentimes that was not medically reliable, or it was from anti-vaccine groups. [01:20:30] So what they tried to do was create this policy that said we're going to surface authoritative information from the CDC and WHO. [01:20:37] That all really collapsed during COVID because again, the consensus wasn't there and the institutions weren't producing content. [01:20:44] And most importantly, I think institutions are not adept at communicating with the public in this media environment. [01:20:52] They don't say, here's what we know and here's what we don't know. [01:20:55] They just wait until they think they know something and then they say that then. [01:20:58] I feel like I spent most of 2020 to 2022 writing articles about institutional failures and media overreach and social media trying to communicate and just the absolute disaster that the information environment had become. [01:21:12] At the same time, what I'm not comfortable with is the idea that large accounts that get a lot of attention because they have a lot of followers that they've managed to accrue in a totally unrelated space should be the things that platforms surface just because they have a contrarian perspective about a disease. [01:21:29] And that's why I think this curation phenomenon is actually really the question that plagues us. [01:21:34] It is, in fact, I think the most important question as we move forward in this environment that's not going away. [01:21:40] So, you know, how do we adapt to this? [01:21:41] What do we surface? [01:21:42] That I think is actually really the key question for us. [01:21:45] And these are, these are interesting issues.
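The idea discussed here, surfacing authoritative sources instead of whatever is most popular, can be sketched in a few lines. This is purely illustrative: the domain allowlist and the example results are made-up assumptions, not any platform's real ranking system, which would use far richer authority and quality signals.

```python
# Hypothetical allowlist of authoritative health domains (an assumption
# for illustration only).
AUTHORITATIVE = {"cdc.gov", "who.int", "mayoclinic.org"}

def rank_results(results):
    """Sort results so authoritative domains outrank raw engagement.

    Each result is a (domain, engagement_count) pair. Popularity alone
    is easy to game with fake engagement, so authority is made the
    primary sort key and engagement only breaks ties within each tier.
    """
    return sorted(
        results,
        key=lambda r: (r[0] in AUTHORITATIVE, r[1]),
        reverse=True,
    )

results = [("viralhealthtips.example", 90000),
           ("cdc.gov", 1200),
           ("randomblog.example", 50000)]
for domain, engagement in rank_results(results):
    print(domain, engagement)
# cdc.gov ranks first despite far lower engagement.
```

The design choice here mirrors the trade-off in the conversation: when the authoritative tier is empty or silent, as during early COVID, a ranker like this has nothing better than popularity to fall back on.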
[01:21:46] I mean, I agree with your point of like the person who gets the cancer diagnosis should be served up the best information from the most respected institutions. [01:21:54] And I see your point about how that, but that's an established thing. [01:21:57] You know, like the Mayo Clinic is not trying to mislead us on, you know, what pancreatic cancer means. [01:22:02] You know, we know that, we've lived on this earth long enough to know that. [01:22:05] And your distinction about how COVID things went off the rails because it was new and there was too much radio silence. [01:22:11] But there's another element to it too, which is now we know that, you know, Fauci and Collins actually did try to suppress, like, the Great Barrington Declaration by, you know, three very well respected doctors, Stanford, Harvard, um, the third one was equally respected, so, forgive me, I can't remember, uh, what university he was from. [01:22:30] Uh, Yale. Okay, yeah, so my point is like, and we've seen now, thanks to the FOIA requests and so on, like, the active attempt to silence really smart doctors who are thoughtful, who are infectious disease doctors, saying here's another way we might go about this. [01:22:45] You know, and so the institutions with all their power said no, let's silence them and let's disparage them as quacks, and the social media companies went along with it. [01:22:54] You know, that's where people get angry, totally distrustful. [01:22:59] You know, there's got to be a way of handling the politics that are behind some of these decisions. [01:23:04] You know, and in that instance the left eventually had its near total respect and trust in Fauci and the right had near total. [01:23:12] You know, the opposite. [01:23:14] Like the, the social media companies, the more they come down on one side in making those decisions, the more aggravating they are.
[01:23:21] The more they infuriate people, the more people choose other forums where you can really go into a rabbit hole, like Reddit. [01:23:27] You know, I just, I don't know what the solution is, but I see the problem very clearly. [01:23:31] Yeah, I spend a lot of my time trying to think about the solutions. You know, I think there's a couple things, again. [01:23:39] You know, I listened to your interview with, um, Robert F. Kennedy Jr., right, and, well, I'm stridently pro-vaccine. [01:23:47] That's just where my personal, uh, politics are, right? [01:23:49] You know, I have three kids, and, you know, for me, um, I did a lot of arguing in California, and I actually do think that childhood vaccines should be required for school. [01:24:01] So there's my kind of disclosure of bias. [01:24:03] Um, and RFK Jr. was involved in that kind of legal battle that we had in California. Um, but what I thought was very interesting about your interview with him was that you had the conversation, right? [01:24:15] There was the dialogue there, but you also did the fact check right alongside it. [01:24:20] You know, you spliced it in in some cases. [01:24:22] You asked hard questions, you pushed back, and one of the things that I think is really challenging about social media is that, again, we self-select to some extent and then the algorithms reinforce who we want to follow, and so it's very hard to see those ideas juxtaposed, to actually have that counter speech, to have that correction. [01:24:44] And so one of the questions for social platforms, I think, you know, just for any of your listeners who aren't familiar: platform moderation roughly falls into three buckets. There's remove, which is what it sounds like: they're going to take it down. 
[01:24:56] Then there's reduce, which is they're going to algorithmically throttle it, and we can talk about that. [01:25:00] That was used in the Hunter Biden laptop story by Facebook, even as Twitter went the remove route, right? [01:25:06] So there's differences in how platforms choose to respond to these things. [01:25:11] And then there's inform. [01:25:13] And inform is the posting of the fact check, right? [01:25:16] Or the posting of some sort of contextualization or counter speech right alongside the content in an effort to try to make people realize that there's a matter of debate. [01:25:29] I like that. [01:25:30] That third one. [01:25:33] And I never complain when YouTube wants us to throw up the CDC website or whatever. [01:25:38] That's fine. [01:25:38] I don't care. [01:25:39] I truly am like, the more speech, the better. [01:25:41] Like, you should check out what the CDC is saying. [01:25:43] It's up to you. [01:25:44] You're smart enough, viewers and listeners, to figure it out for yourself. [01:25:47] I have no problem doing that, but I do definitely have problems with the other two in most instances. [01:25:53] I mean, there's certainly some. [01:25:54] I know you've written about, sort of, like the ISIS videos on how to make a bomb. [01:25:57] No, they serve absolutely no social purpose and they shouldn't be allowed to stay on there, but that's very different than "I think COVID started in a lab." [01:26:05] Yeah, I agree. [01:26:06] I mean, hey, I am not defending the moderation choices of platforms. [01:26:10] What we try to do, again, you know, my team is at the Stanford Internet Observatory, so I am a researcher of this stuff. [01:26:16] And we ask these questions. [01:26:18] You know, we say, like, was content uniformly actioned? [01:26:24] And what that means, that's a very kind of nerdy way to put it. 
[01:26:27] But when a platform decides that something violates its policies, first, is the policy clearly articulated? [01:26:33] This is something that the Facebook Oversight Board in particular kind of publicly puts out assessments on. [01:26:38] Was this policy clearly articulated? [01:26:39] Then was it fairly applied? [01:26:42] And then was it uniformly applied? [01:26:44] And what we see sometimes is that, you know, one guy's content does get a label or come down and the other piece of content with the exact same claim does not. [01:26:54] And that's, I think, that feeling of unfairness, that feeling of enforcement unfairness, is one thing that people can constantly point to, because there's millions and millions of posts. [01:27:03] I think there's not a single group on social media that I can think of where I haven't seen some sort of claim that social media is biased against them. [01:27:10] But our ways of actually analyzing these things really come from making a very discrete, you know, a very, very small data set where we say, like, okay, here are all the platform actions on election misinformation in 2020 on URLs that we looked at that were fact-checked to be false. [01:27:28] Okay, then what happened? [01:27:29] And that's the kind of work that we try to do now, where we say, like, is the enforcement fair and is the policy fair? [01:27:37] These are kind of two different questions. [01:27:39] Do you have different ideologies on your team? [01:27:43] Do you have conservatives with you? [01:27:44] I would like us to have more conservatives. [01:27:47] I think that this is the kind of constant, chronic challenge of academia. [01:27:53] I think that there's a fair bit of healthy debate among the team, actually. 
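Renee's "was content uniformly actioned?" question can be made concrete with a toy audit script. This is a minimal sketch with entirely invented records and made-up action labels ("remove", "label", "none"), not the Stanford Internet Observatory's actual methodology: it just groups posts by claim and flags any claim that did not receive the same enforcement action every time it appeared.

```python
from collections import defaultdict

# Hypothetical audit records: (claim, account, action taken).
# The claims and actions below are invented for illustration only.
actions = [
    ("ballots were destroyed in county X", "user_a", "label"),
    ("ballots were destroyed in county X", "user_b", "none"),
    ("vaccine contains microchips",        "user_c", "remove"),
    ("vaccine contains microchips",        "user_d", "remove"),
]

def find_nonuniform(records):
    """Group enforcement actions by claim and return the claims that
    did NOT get the same action every time they were posted."""
    by_claim = defaultdict(set)
    for claim, _account, action in records:
        by_claim[claim].add(action)
    return {claim for claim, acts in by_claim.items() if len(acts) > 1}

# The first claim is flagged: one post was labeled, an identical one
# was left alone. The second claim was enforced uniformly.
print(find_nonuniform(actions))
```

In a real audit, the records would come from a fact-checked data set like the 2020 election-misinformation URLs she describes, rather than hand-written tuples.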
[01:27:59] And one way that we try to deal with this is through interinstitution partnerships, where we say, what are ways in which we can engage with civil society or other academic institutions that have a different perspective or that have different data sets? [01:28:15] And that's really the best thing; that's a solution. [01:28:19] That's the next move. [01:28:21] I've told my audience this, but I went out, I spoke at Google and Facebook, and I've been at Snapchat. [01:28:26] A bunch of these media companies have asked me to go out and talk to them about how they can be more fair. === Ideology vs False Claims Debate (03:20) === [01:28:31] And I did that. [01:28:33] And I told them all the same thing. [01:28:35] And listen, just so you know, I've been a registered Democrat. [01:28:37] I've been a registered Republican. [01:28:39] I've been a registered independent for the past 20 years. [01:28:42] I just don't like wearing anybody's team jersey. [01:28:44] And there's too many losers in each party for me to associate with them. [01:28:47] So I'm not, I'm not ideological. [01:28:49] You know, I would say my sensibilities lean center right. [01:28:52] As you said, you're center left. [01:28:53] But my advice to the companies was, wherever you stand, you've got to get more conservatives in on these decisions so that you can make sure, you know, you've got your hands at 10 and 2 and this thing pulls to the left. [01:29:05] So you've got to get somebody in there to make sure it doesn't pull to the left too much. [01:29:08] Otherwise, you will wind up with biased decisions and upset, you know, consumers and so on. [01:29:15] And I'm sure it is hard in academia, but talk to David Sachs. [01:29:18] He's in tech and he's a conservative. [01:29:20] He probably knows some people. [01:29:22] We know a lot of the same people, actually. [01:29:24] I think Silicon Valley is where I spent a lot of time in tech prior to going into academia. 
[01:29:31] So I've only been in academia for three years. [01:29:33] I'm not exactly, like, you know, an entrenched academic. [01:29:36] But the center left is good there. [01:29:38] We'll take it. [01:29:39] Pardon? [01:29:40] Even center left is good in academia. [01:29:43] Well, you know, I mean, I think it's also interesting, though, how you get read. [01:29:47] I think, you know, I am occasionally read as center right. [01:29:53] You know, I was pretty active in some of the, you mentioned COVID and outrage at how kids were treated. [01:29:59] And I was pretty outraged about a lot of that myself, a lot of the school closure stuff. [01:30:03] And so, you know, I think that there's a tendency to kind of reduce people down to an ideological persona. [01:30:10] And I think that, you know, it's kind of like a short-term heuristic, but I think the importance is having those people who, you know, lean more in one direction or another. [01:30:21] I will say that one of the things is, you know, there are certain things that are, in fact, quite demonstrably false. [01:30:28] And then there comes the question of how do we handle those, and where is the, because you have a free speech right to say nonsense whenever you want to. [01:30:37] Right. [01:30:38] And in anything, elections, COVID, vaccines, you name it, you have your right to your bad, wrong opinion and your right to express it. [01:30:46] Right. [01:30:46] And so the question becomes then, because everything is curated, what should be surfaced? [01:30:52] Like, what is the ranking function? [01:30:53] And that's where I think this tension really is: in what is upranked or downranked. [01:31:02] To what extent should factual accuracy be factored in? [01:31:04] No, I get that. [01:31:05] But I also think, to what extent should expertise be factored in? 
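The "ranking function" tension Renee describes can be illustrated with a toy scoring formula. The weights and inputs below are entirely hypothetical, not any platform's actual algorithm; the point is simply that how heavily accuracy and expertise are weighted relative to raw engagement determines what gets surfaced.

```python
def rank_score(engagement, accuracy, expertise,
               w_engage=1.0, w_accuracy=2.0, w_expertise=1.5):
    """Toy ranking score. All inputs are assumed to be in [0, 1];
    the weights are arbitrary illustrative choices."""
    return (w_engage * engagement
            + w_accuracy * accuracy
            + w_expertise * expertise)

# A viral but fact-checked-false post from a non-expert...
viral_false = rank_score(engagement=0.9, accuracy=0.1, expertise=0.2)
# ...versus a less-shared but accurate post from a domain expert.
expert_true = rank_score(engagement=0.4, accuracy=0.9, expertise=0.9)
print(viral_false < expert_true)  # True under these weights
```

Set `w_accuracy` and `w_expertise` to zero and the viral post wins instead, which is exactly the curation choice being debated here.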
[01:31:08] That's a, that's a real tough one too. [01:31:10] Well, I agree. [01:31:10] I agree with that, but I also think it gets so gray, because, like, yes, I agree with you on election misinformation. [01:31:20] There are definitely claims that you know are false, but there could also be political bias definitely playing a role in what gets deemed false and what doesn't. [01:31:30] You know, for one example, there's a very strong difference in opinion about whether mail-in ballots should have been allowed in Pennsylvania, you know, given the way they changed the voting, whatever. [01:31:40] There was like a legitimate dispute there. [01:31:42] So if somebody's tweeting out "the vote in Pennsylvania is not legitimate," there may actually be a very good basis to say that, and that has nothing to do with Kraken, Sidney Powell, Rudy Giuliani, or any of that, right? [01:31:51] So it's like, who's going to make that decision? === AI Blending Into Background Noise (03:36) === [01:31:52] John Stossel, my old pal, he was at Fox Business and he was at ABC for years before that. [01:31:58] He did this great expose. [01:32:00] If you haven't seen it, I'm going to send it to you, but it's basically on how he tried to do some environmental reporting that challenged some of the green energy crew's assertions. [01:32:09] And he had both sides. [01:32:11] He had both sides. [01:32:12] And they censored him, and he went through the layers to see, like, why was I censored? [01:32:16] And it truly was just a matter of opinion. [01:32:18] They couldn't point to anything he had said that was factually wrong. [01:32:21] It was infuriating. [01:32:21] Anyway, I've got to leave it at that because I've got to squeeze in this quick break, but I'm going to bring you back, come back for a couple of minutes on the opposite side before we have to end. [01:32:27] This is fascinating. 
[01:32:28] And I really do want to ask you about how people can tell whether they're interacting with a real-life human on LinkedIn, on Twitter, on any social media, because Renee is the person who knows. [01:32:37] Stand by. [01:32:41] All right, Renee. [01:32:42] So there you were online and you got a message on your LinkedIn. [01:32:48] Seemed normal enough, according to the report from NPR, from Keenan Ramsey wanting to connect with you. [01:32:54] You're both in a LinkedIn group for entrepreneurs. [01:32:56] And we have a picture of Keenan here. [01:32:58] And you thought Keenan looked a little weird, and your expertise led you to start asking questions. [01:33:04] And Keenan is fake news. [01:33:05] She's a fake person, just like my Lewis at Air France, fake news. [01:33:11] So how do we figure out when we're interacting with someone who is fake? [01:33:18] So AI can be used to generate wholly novel faces. [01:33:22] The technique is called generative adversarial networks. [01:33:24] GANs is the terminology that you'll sometimes see. [01:33:27] But AI can generate images, videos, text, right, at this point now. [01:33:31] And so what happened there was I got this message from this person. [01:33:36] You can usually tell by the eyes, nose, and mouth: when a computer generates a face, it uses a certain kind of grid. [01:33:42] And so it puts the features roughly in a particular line. [01:33:44] And if you superimpose a lot of these faces on top of each other, you'll actually see that the eyes, nose, and mouth are always in the same place. [01:33:53] A lot of times the hair is wrong. [01:33:55] It blends into the background in some way. [01:33:57] The collar melds into the neck, melds into the hair, melds into the background. [01:34:02] The teeth are often wrong. [01:34:05] The pupils, actually, for some reason computers don't do a great job with pupils. [01:34:09] The pupils are wrong. 
[01:34:10] The ears are weird, and it doesn't know what to do with jewelry. [01:34:13] It doesn't understand earrings. [01:34:15] So you'll have one earring, but not another. [01:34:18] I think it's one of these things where, if you've seen enough of them, they kind of jump out at you. [01:34:22] I thought it was interesting because, you know, it was novel to me to receive a message on LinkedIn in particular. [01:34:30] These things are all over Twitter, and they're used because it's very hard. [01:34:34] You can't just reverse image search and go figure out that it was a stock photo that this fake account is using. [01:34:39] But people do now, at this point, actually, interestingly, use them just because they want to be anonymous online as well. [01:34:45] So again, this question when we were talking earlier, what's a bot, what's a troll, what's a fake account, who's really behind something? [01:34:53] It is increasingly hard to tell. [01:34:54] And that's where I think platforms do have an important role to play here in identifying these networks themselves. [01:35:01] Well, Renee's done a lot of great work in figuring out, like, you know, how the AI on your computer can figure out what word you're typing or how you want to end your sentence. [01:35:09] It's so far beyond that. [01:35:11] They can generate whole articles now that get sent out that were not written by a human. [01:35:16] And the future is more of that. [01:35:18] And they're getting better at it. [01:35:19] And that's what we're up against, which is why we need Renee and her group to be out there foreseeing it and cluing us all in on, you know, how to avoid it or recognize it. === Future of Automated Content Generation (00:20) === [01:35:29] So we appreciate you and what your team does and your expertise, Renee. [01:35:32] Thank you. [01:35:32] Thank you so much for coming on. [01:35:34] Thanks for having me. [01:35:35] All right. 
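Renee's superimposition trick, that GAN-generated portraits tend to put the eyes, nose, and mouth at the same grid positions, can be simulated with a short script. Real detection would average actual image files; here the "faces" are small synthetic arrays with dark "eye" pixels at fixed coordinates, an invented stand-in for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
EYES = [(4, 3), (4, 8)]  # landmark coordinates shared by every fake face

def fake_face(size=12):
    """A stand-in for a generated portrait: noisy background with
    dark 'pupil' pixels at the same grid-aligned positions."""
    img = rng.uniform(0.4, 1.0, (size, size))
    for r, c in EYES:
        img[r, c] = 0.0
    return img

stack = np.stack([fake_face() for _ in range(50)])

# Superimpose (average) the faces: the aligned landmarks stay dark and
# sharp, while the unaligned background washes out toward its mean.
mean_img = stack.mean(axis=0)
for r, c in EYES:
    print(round(float(mean_img[r, c]), 2))  # ~0.0 at every eye position
print(round(float(mean_img[0, 0]), 2))      # ~0.7 everywhere else
```

The same averaging effect is why composites of many real faces look blurry everywhere, while composites of GAN faces keep crisp eyes in identical spots.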
[01:35:35] I want to tell you that this Friday, Andrew Schulz is making his return to the show. [01:35:39] This is going to be big news. [01:35:41] You'll find out why. [01:35:42] Stay with us. [01:35:43] See you tomorrow. [01:35:45] Thanks for listening to the Megyn Kelly Show. [01:35:47] No BS, no agenda, and no