Straight White American Jesus - The Sunday Interview: Governing Without Accountability: Silicon Valley's Ideology with Adrian Daub Aired: 2026-05-03 Duration: 01:04:34 === German Politics vs New Horror (03:08) === [00:00:07] Axis Mundi Welcome to the Straight White American Jesus Sunday interview. [00:00:34] I am Annika Brockschmidt, author of Amerikas Gotteskrieger, America's Godly Warriors, and Die Brandstifter, the Arsonists, and host of podcasts like Feminist Shelf Control. [00:00:44] Today, I'm very excited to be speaking with Adrian Daub about his new book, What Tech Calls Governing, which is out in German right now, if I've understood that correctly. [00:00:55] Under the title, I'm going to say the German title, because why not? [00:00:58] Was das Valley Herrschen nennt. [00:01:01] And Adrian is a professor of German studies and literature at Stanford University. [00:01:06] So he's right at the source, as well as the director of the Michelle R. Clayman Institute for Gender Research. [00:01:13] And basically, whether you want to learn about Novalis, Adorno, Wagner, by the way, I just realized that there are some Wagner notes right on the shelf behind me, or the cancel culture panic, or the weird gender and sex politics of the modern right, Adrian is your guy. [00:01:33] That's Adrian, that's your claim to fame now. [00:01:34] That is. [00:01:35] That is what you are known for. [00:01:37] Oh, God. [00:01:38] Life choices. [00:01:40] Yeah. [00:01:41] But this is actually. [00:01:42] Yes. [00:01:43] And this is actually the second time that we, as two native German speakers, will be recording a podcast episode in English. [00:01:51] That's right. [00:01:51] Because you were on In Bed with the Right to dispel myths and to explain the German election to our listeners, which is still a very, very popular episode. [00:02:01] Remains so. [00:02:03] Because every once in a while, people are seized by the question, what the fuck is up with Friedrich Merz? [00:02:08] I mean, that is a question that, I think, puzzles me daily. [00:02:14] I have daily moments of thinking about Friedrich Merz, and that's quite unpleasant, I have to say. [00:02:18] Yeah, no. [00:02:19] I don't care for it. [00:02:20] No. [00:02:21] And so, the last time we spoke in English, the German election results had just come in. [00:02:26] And let's just say things have gotten distinctly worse since then. [00:02:30] Yeah. [00:02:31] And the vibes were already bad. [00:02:34] And by the way, I don't know if you saw this, because it's quite early in the morning on the West Coast, but we just had the first ever nationwide poll come out that sees the AfD in first place. [00:02:48] Wow, that's massive. Damn. [00:02:50] 26% AfD, 25% CDU. [00:02:54] So, if there was an election this Sunday, that's what would be happening. [00:02:58] Anyway, this fun stuff aside, we're not here to talk about the horror show that is German politics today, but we're here to talk about your new book, which is great, but describes a different sort of horror show, if I can tease it in that way. === What Tech Calls Thinking (13:00) === [00:03:15] And this is actually the second book that you've written about the mindset behind the elites of the tech industry, behind Silicon Valley, right? [00:03:23] The first one was published, I think, a couple of years ago now. [00:03:26] 2020. It came out in October 2020 and then in German in November. [00:03:30] It came very, very quickly.
[00:03:32] And that was called What Tech Calls Thinking. [00:03:34] Yeah. [00:03:35] So, the title, for German listeners: What Tech Calls Thinking is a play on the way Was heißt Denken? by Martin Heidegger is usually translated. [00:03:46] But I liked the title beyond the joke, which probably only I really appreciated. [00:03:54] I liked it because I didn't want to say, sometimes it gets introduced as like, oh, it's about how tech thinks. [00:04:00] And I was like, I definitely don't want to write that. [00:04:01] And I don't know how to write that. [00:04:03] The "calls" thinking is really important, because my contention in it came out of an editor saying, why don't you write an intellectual history of Silicon Valley? [00:04:14] I said, I don't think you can, because I think an intellectual history confers a certain amount of gravitas, and you're like, this person has read this other person and has done something with their thought. [00:04:24] And I'm like, these people have, like, gleaned like a third of this book while, like, on ketamine, and then used the buzzwords and put them in their own private sort of mythology and cosmos. [00:04:35] I don't want to call that intellectual history. [00:04:37] So it's like, okay, but they're calling it thinking. [00:04:39] They perform it as thinking, they perform it as intellectual work, as thought leadership. [00:04:44] And the same is true for the new book, because they're in charge, but it's not quite clear they understand that they're governing. [00:04:50] It's like, they're very petulant when you are like, well, you're actually in control here. [00:04:56] Like, what do you want to do with that control? [00:04:58] Like, well, I mean, that feels like a very rude thing for you to say to me. [00:05:01] Like, well, no, I'm sorry. [00:05:02] You keep making these claims about how your technology will reshape the world and seem deeply offended when people think that that might create reciprocity, that this may create certain claims on you, your money, your person, that there might be ethical constraints that exist on you. [00:05:21] Yeah. [00:05:23] And so, again, I was like, well, it's not how they govern. [00:05:26] We can all see how they govern, but it's what they call governing. [00:05:30] So, what was it that made you want to dive into what tech perceives as, or the tech elites, we should probably say, what they perceive as both thinking and then later governing? [00:05:45] So, when did you start writing the first one? [00:05:47] The first one, I think, in 2019. 18, 19. [00:05:53] And it was very much a product of a particular moment. [00:05:57] I had started thinking about it around 2012, 2013, when some of my smartest students started going into tech, which is fine. [00:06:05] Like, this is how you make money in this area. [00:06:07] If you want to live here, I kind of get it. [00:06:10] But they did it with this kind of naivete that really shocked me, where they're like, I want to do good things. [00:06:14] And therefore, I'm working for Google. [00:06:16] And you're like, oh my God. [00:06:17] I thought I taught you better than that. [00:06:19] That's a lot to unpack. [00:06:20] And so I was really like, I need to understand how these people find this compelling. [00:06:26] These kids should know better. [00:06:27] Well, a funny thing happened on the way to the forum.
[00:06:30] By the time I started writing the book, a lot of those same students had spent time in the industry and had come away quite disillusioned and would come back to me and say, I now understand why you were so crestfallen when I said that all the way back then. [00:06:43] Let me tell you what's actually going on in these companies. [00:06:46] So suddenly I had this very privileged window. [00:06:49] And the interviews, they're not formally part of the book, but they would point me in the right direction. [00:06:57] Like, you got to check out this guy. [00:06:58] He's a fucking trip. [00:06:59] Or like, this is the thing they're all doing. [00:07:01] Can you look into that? [00:07:02] Right. [00:07:02] That often became newspaper articles. [00:07:05] And at the same time, I had a lot of friends in the Valley who, likewise, were getting older and starting to really kind of feel that things were off and sort of disillusioned with the whole place. [00:07:15] And so I just got a never-ending stream of, like, you got to check out this fucking joker. [00:07:20] And so I had really good seats for this moment when I think the kind of investment, the libidinal investment in Silicon Valley, kind of shifted. [00:07:34] There had been a kind of dominant sort of zeitgeist that had proposed that Silicon Valley somehow did things better and we all had to learn, we, whoever we is, we had to learn from them. [00:07:43] And then it sort of shifts right before the pandemic. It starts with the Cambridge Analytica scandal, but really by 2018, 2019, the idea that these people are kind of a problem predominates. [00:07:55] And that had knock-on effects in Silicon Valley, in that these people did not take it well that people were no longer just giving them big slobbery kisses, but being like, hey, you appear to be a massive multinational company. [00:08:12] Can we see your books? [00:08:14] Are you a fraud? [00:08:16] These kinds of rude questions were asked. [00:08:18] And the, as you say, top echelon didn't take it well. [00:08:22] And that's also, you've alluded to it twice now, and I thank you for that, that it's not a portrait of these companies. [00:08:29] I'm not saying anything about your average Googler. [00:08:31] I'm not saying something about, like, a guy who works at Meta or a person in Salesforce's legal division or whatever. [00:08:38] This is, and it has become more and more so, a story of the top echelon and really of the kind of media-visible billionaire class in Silicon Valley, who have indeed, since 2020, and I'm sure we'll get to that, radicalized in a massive way. [00:08:53] The rest of the place, as far as I can tell, hasn't shifted that much. [00:08:58] And really, it's the boss and investor class that really has done what your weird uncle on Facebook did, right? [00:09:07] Like, fully started sharing weird Nazi memes. [00:09:11] Yeah. [00:09:11] And, like, fully cooked their brains. [00:09:13] Yeah. [00:09:14] And, like, fulminate about how no one's working anymore, while you're like, you're going to tennis right now. [00:09:18] I'm not sure you're in any position to lecture anyone on working. [00:09:23] This is interesting. [00:09:25] Whenever that pops up, I only hear the Kim Kardashian soundbite. [00:09:32] No one's working anymore.
[00:09:33] Get off your fucking ass and work. [00:09:35] Sorry. [00:09:36] But yeah, Kim, Kim, she is. [00:09:39] We stan a working queen, which apparently she is. [00:09:41] I don't know. [00:09:42] I mean, she is. [00:09:44] It's weird, deeply weird. [00:09:45] There's a great, different book on that coming out, but that would derail me further. [00:09:50] Sorry. [00:09:51] It's called Kardashian Colloquium. [00:09:52] Anyway, very interesting. [00:09:54] With a K, I'm guessing. [00:09:55] With a K. Of course. [00:09:56] I think you would like it. [00:09:57] There's a lot of. [00:10:01] Political and philosophical angles to unpacking the other rich text that is the Kardashians. [00:10:06] Anyway, you're listening to Straight White American Jesus. [00:10:09] I will stop talking about the Kardashians now. [00:10:11] But, because you just said, so it was really something that shifted within the elite of the tech industry. [00:10:19] I think that's so important, because I'm sure we've all read and heard a lot about the alleged, so-called vibe shift, the shift to the right. [00:10:30] That, you know, generally politically speaking, but also in Silicon Valley. [00:10:34] And that kind of is an assumption that's often taken at face value without really interrogating who has shifted and who is claiming that. [00:10:45] And one of the people who really likes to claim that, and claimed that, I think, in 2019, Marc Andreessen, one of the names that keep popping up in your book, you know, one of the tech billionaires that we're talking about today, he claimed that there was a vibe shift that was sweeping the country. [00:11:04] And one that was even happening, as he claimed, in San Francisco, in Silicon Valley. [00:11:10] And what you show in this book, in your analysis, is that's not necessarily true. [00:11:15] So who is Andreessen talking about here? [00:11:18] And why did this happen so quickly? [00:11:21] Because it did happen in a rather short amount of time. [00:11:24] Yeah. [00:11:25] I mean, yes and no. [00:11:27] One of the problems is that the liberalism of the tech industry was largely overstated before, especially at the top echelon. [00:11:36] But there are, I think, you know, not to be reductively Marxist, but, like, you know, if someone makes $200,000 a year, they're going to be less likely to be a bleeding-heart tax-and-spender than someone who makes $40,000. [00:11:50] Like, these were often the richest people on the block. [00:11:54] Like, again, the people I know all vote Democratic, but, like, do they have certain conservative tendencies? [00:12:00] Like, yeah, I mean, that comes with having two cars that people can scratch, a front yard that someone can take a shit in, and, you know, maybe not knowing that many people who are really struggling anymore. Like, you used to know people who really were struggling, and you don't know that many anymore. [00:12:16] Yeah. [00:12:16] Like, yeah, it was never as hippie-ish as its outside reputation. [00:12:21] But still, I mean, we don't have data, obviously, on these companies. [00:12:25] That would be kind of creepy, even for them. [00:12:28] Sorry, just one more thing, because you just mentioned hippies. [00:12:31] Maybe for our non-US listeners: something that I think Europeans often project onto the sort of historical hippie movement in the US is that it was automatically staunchly anti-capitalist. [00:12:45] Yes. [00:12:46] Which wasn't the case, broadly speaking, I would say.
[00:12:51] Yeah, especially the Northern California variant was very compatible with founding usually small companies, but they valorized the form of the corporation. [00:13:01] My colleague Fred Turner has written about that in his book, From Counterculture to Cyberculture, where he says this isn't really that big a transition. [00:13:11] At the same time, in terms of their voting behavior, in terms of their social attitude, in terms of how they think about trans children, for instance, they're gonna be kind of on the left. [00:13:22] And I mean, just to give your listeners kind of a sense, I mean, the vibe shift, Andreessen never defines what he really means by vibe shift, but clearly it has something to do with Donald Trump. [00:13:32] So votes for Donald Trump would seem to be a pretty good proxy. [00:13:35] And I looked into this. [00:13:36] I mean, the voting data for the counties that Silicon Valley really draws its roughly 900,000 to 950,000 employees from, that would be San Francisco, San Mateo, Santa Clara. [00:13:51] Donald Trump did gain some votes, but, like, in the thousands, between 2020 and 2024. [00:14:01] Right. [00:14:01] These, I mean, Santa Clara County has something like 3.5 million. [00:14:05] Yeah. [00:14:05] That can't be right. [00:14:06] Is that right? [00:14:06] I forget exactly how big it is, but it's immense. [00:14:08] I think there's something like almost a million voters. [00:14:11] And so it's like a tiny shift. [00:14:12] Yeah. [00:14:13] The biggest shift is to non-voting. [00:14:15] Interesting. [00:14:15] From Democrats to non-voting. [00:14:18] This would be the enthusiasm gap. We can all debate what it was brought on by, but certain explanations suggest themselves. [00:14:28] And at the same time, there's just no real groundswell. [00:14:34] But that's, of course, extremely non-granular. [00:14:38] And you can look at other things. [00:14:45] So you can break it down a little bit by precinct. [00:14:47] And there is one good thing about these ultra-rich: they kind of self-segregate. [00:14:51] There are, like, two or three communities where they love to live, like Atherton, Woodside, Los Gatos. [00:14:57] These are places that tend to have a higher-than-FDA-recommended dose of CEOs in them. [00:15:05] And if you look at those, I'm still crunching the numbers. [00:15:08] San Mateo County unfortunately changed its precinct structure between 2020 and 2024, which makes this very hard. [00:15:15] But from my spreadsheet so far, even there, the shift is there, but it's extremely minimal. [00:15:23] What I think is happening, what my suspicion is, is the same way that in 2020, in your family chat, there was that one uncle who went crazy, and he went crazy precisely because everyone else is like, what the fuck are you talking about? [00:15:38] It's not a milieu that kind of moves. [00:15:41] It's the single contrarian. [00:15:43] And that's often how these guys perceive themselves. [00:15:45] They're surrounded by wives, children, et cetera, et cetera, co-workers, golfing buddies who are like, what the fuck are you on about? [00:15:52] But that actually, in some way, makes their radicalization stronger and kind of makes them think, oh, I'm swimming against the mainstream. [00:16:03] But also, there is this big movement, and I'm part of it.
[00:16:06] And that big movement is sort of suggested by the fact that, as you were already alluding to, a lot of these guys don't have that much human contact anymore, but are getting their brains cooked on Twitter. === Evading Regulation and Radicalization (10:52) === [00:16:16] So they do in a weird way create a collectivity, but they create it, like, curated or whatever. [00:16:22] They create it with white supremacist accounts on X. [00:16:25] They create it, you know, with other weirdos on message boards. [00:16:30] That's the kind of thing. [00:16:31] Like, they get radicalized in the way we think of our youth getting radicalized. [00:16:34] It's really quite remarkable. [00:16:36] But here's, like, a 62-year-old guy on his way back from a squash game, and he's like... and he's deeply dark. It's really dark, yeah. [00:16:44] And in some ways, you know, they're victims of the same services that make them rich. Like, they're Pablo Escobar just, like, snorting mountains of cocaine. [00:16:53] It's like, you guys should know this is harmful, you made it. [00:16:57] What the hell? [00:16:57] Why are you... Don't consume this. [00:17:00] Don't just gobble it up. [00:17:02] It's so interesting. [00:17:04] I especially find the stuff that Marc Andreessen says out loud into microphones fascinating, not because I think it's, like, intellectually stimulating, but because it's quite telling about how he thinks, or what he values, what he considers as intellect or influence. [00:17:25] If we take him and others like him, because even though, as you said, these people were never bleeding-heart liberals to begin with, and there's variance to this, Peter Thiel told us back in 2009 how he felt about democracy. [00:17:42] Sure. [00:17:43] But in their public behavior... Mostly those pesky ladies who don't like libertarians. [00:17:50] And that's a bummer, real bummer. [00:17:53] So democracy has got to go, you know? [00:17:55] You gotta. [00:17:55] Yeah. [00:17:57] Nothing else to do. [00:17:58] But if we look at these people who have, in these last couple of years, radicalized in those kind of weird silos of obscene wealth, can you tell us what sort of societal moments led these very, very rich, influential people, from what they've told us, down this path? [00:18:24] Sort of moments that they perceived as personal slights against themselves. [00:18:31] Because I feel like there's this sense of being owed something and being besieged at the same time. [00:18:41] So you're all-powerful, but you're also a victim. [00:18:45] Yeah. [00:18:46] That's quite fascinating to me. [00:18:49] Yeah. [00:18:49] And I mean, they've had this forever. [00:18:51] It's how they've been able to evade regulation for so long. [00:18:56] Always just a little scrappy, multi-billion-dollar company. [00:18:59] Just a mom-and-pop. [00:19:00] Yeah. [00:19:01] Mom-and-pop rideshare service. [00:19:03] Yeah. [00:19:04] And we were just defending ourselves against big evil, that cabbie over there. [00:19:08] Like, okay, I'm sure. [00:19:10] Operating with slightly skewed metrics there, but parameters. [00:19:13] Yeah. [00:19:14] But in general, I think Andreessen is a good case in point. [00:19:18] So, one story he tells is that he feels like he wasn't respected by the Biden administration. [00:19:24] He was un... [00:19:25] I'm sorry. [00:19:26] That choice of word is so wild. [00:19:27] Yeah, respected. [00:19:29] Yeah, exactly. [00:19:30] They wanted.
[00:19:31] Which is also this kind of moving goalpost. [00:19:34] It's like tone policing. [00:19:38] There was no perfect point. [00:19:40] This is your ex post facto rationalization for a radicalization process that you underwent. [00:19:45] One clear point is the attempts to start regulating some of these companies in earnest. [00:19:52] This would have been actually during the first Trump administration, which they don't remember, but, you know, there was in 2018 sort of a push to put Meta sort of on a chain, and that clearly upset a lot of investors in Silicon Valley. [00:20:04] But it was also the attempts to sort of bring crypto and later AI companies under some kind of regulation. [00:20:12] You'll note that, you know, Trump is basically banning state-level regulation of AI. [00:20:17] This is the kind of thing that tells you where they got upset. [00:20:21] But then there are other things. [00:20:22] Like, there is the pandemic, which really messed these guys up. [00:20:27] For one thing, because they were treated just like everyone else, which they found to be, you know, despicable and really just oppression. [00:20:37] Also, it was a moment of kind of personal humiliation, which I think was easy to miss because we were all like, what the fuck is going on right now? [00:20:46] But if you were here, you got to see all the headlines about their weird solutions to the pandemic. [00:20:52] They were all super confident, like, Silicon Valley would fix it. [00:20:56] And they were going to fix it the way they fix everything else. [00:20:58] They were like, oh, we're going to make an app, or, like, Palantir will contact-trace for us, or, you know, the blockchain will save us, in one case. [00:21:06] And you're like, okay, I don't think I'm following, but okay. [00:21:09] And all that, of course, was made ridiculous, very publicly, ex post facto, by the thing they hate most: established authority and state authority. [00:21:25] It was, you know, a biotech company and governments being like, you get your injection or else you're not going to the airport, that made it possible to return to some kind of normalcy. [00:21:36] It was not these weird gizmos. [00:21:38] At some point, I tried to count how many apps I had ended up downloading on my phone that were supposedly gonna be, like, the way we could go out. [00:21:47] I mean, I think Germany committed to one much longer than anyone else. [00:21:51] It was the one thing that was, like, one big centralized app. [00:21:55] Yeah. [00:21:55] Yeah. [00:21:56] And even that didn't really work. [00:21:58] But, like, in the end, you got rid of it the same way you got rid of smallpox. [00:22:02] And that, I don't think, sat very well with them. [00:22:05] The next thing that happened was that in 2021, 22, as things did sort of start opening up in the Bay Area, which had a longer-than-usual lockdown, which is also probably important for your listeners to realize, there was a spike in crime in San Francisco. [00:22:20] And this was largely not crime that targeted the wealthy. I mean, it targeted wealthier individuals. [00:22:31] But the number of VC investors who park their Bugatti on the street where it could get broken into, I think, is kind of limited. [00:22:41] It really hit people who had just rented a Lexus at SFO and then got that
[00:22:46] broken into, which sucks. [00:22:48] I'll acknowledge that. [00:22:49] But I don't think that was Marc Andreessen, if I can be quite honest with you. [00:22:52] I think his driver would have been in the car, and he would have been like, hey, don't break into my car. [00:22:56] But they experienced this as an attack on the wealth creators, the value creators. [00:23:01] And this was a rare moment where probably they were in agreement with sort of the lower rungs of the hierarchy in their particular companies. [00:23:13] So the kind of person who doesn't share Marc Andreessen's politics on anything else, who's, let's say, a corporate lawyer for, say, Google or OpenAI or whatever, might agree that, like, the number of tent encampments by the highway is a problem, et cetera. [00:23:31] And so there was, like, this brief moment of comity. [00:23:33] And this sort of came to a head in, I think it was 2023, when Bob Lee got murdered, the creator of Cash App. [00:23:41] And this was, like, a moment when, you know, David Sacks and people like that and Elon Musk just went off on San Francisco and basically portrayed the city as sort of deliberately, well, not deliberately, but, like, allowing psychotic homeless people to murder CEOs. Open season on CEOs. [00:24:00] It turns out that the killer was another tech person. [00:24:03] Those two knew each other. [00:24:05] It was a crime of emotion, of passion, some kind of long-simmering grudge that boiled over. [00:24:14] But it crystallized that moment where the thing they'd always wanted to be, victims, now felt really ready at hand. [00:24:23] And so you get these moments building up to the 2024 election, where the crime issue was a big one. [00:24:32] Crime was already going down. [00:24:33] It was very clear that this was a momentary spike. [00:24:37] But yeah. [00:24:38] Something like, you know, in San Francisco, for instance, the recall of our district attorney, Chesa Boudin, was big, was championed not just by, like, your weirdo billionaires, although they did champion it, but that actually went a little bit further down the chain. [00:24:55] And my friends, who I take to be perfectly dyed-in-the-wool San Francisco liberals, would be like, well, we got to get rid of this guy. [00:25:05] I'm like, I don't really think this does anything, but whatever. [00:25:07] But so there was kind of a readiness. [00:25:10] And then the next recall effort targeted a school board that was perceived as too woke. [00:25:15] So you can kind of see how this becomes more about culture, a culture of permissiveness, often a culture of permissiveness towards minorities. [00:25:24] And then that's the other thing: there is a post-Black Lives Matter, post-Me Too retrenchment in a lot of these companies. [00:25:33] And a lot of people like Andreessen did experience 2020, where their employees really pushed for a racial reckoning within the company as well. [00:25:44] They experienced that also as a kind of imposition and a kind of dictatorship, right? [00:25:48] And when you hear Andreessen talking about how now there's a changing of the elite, you're like, oh, wait, you think you weren't the elite and the woke were the elite? [00:25:56] How the fuck were they the elite? [00:25:58] What were they in charge of?
[00:25:59] But in his mind, this is basically the counter-revolution against 2020. [00:26:04] And so, again, that is true for a smaller slice of people. [00:26:08] So, from the crime stuff to the anti-woke stuff, you do lose a lot of people who are like, you know, maybe we were with you when it was about tormenting the unhoused, but who are not willing to just flat-out say racist shit. [00:26:22] But that, I think, is sort of the trajectory of sort of a prototypical VC investor who's sort of been part of this vibe shift. [00:26:31] That's so interesting, because you show, and I'm assuming that this is one of the reasons why you now wrote this second book, What Tech Calls Governing, because we're seeing how big tech and the US government, now under Trump, have become more tightly intertwined in a way that we didn't see before, but that was also foreshadowed in the months and the year before this last one. [00:27:02] And when we look at how some of these influential figures are talking, how they're behaving. === Fantastical Versions of Power (07:16) === [00:27:08] So we have Alex Karp, head of Palantir. He's been rambling on about the supremacy of the West that needs to be maintained, and how Palantir can and should be used to kill the enemies of the supposed, quote unquote, West. [00:27:25] Elon Musk has been retweeting Nazi content for forever now. [00:27:31] DOGE cuts have already killed so many people. [00:27:33] Peter Thiel has been touring the world with his own sort of weird autodidactic version of a bastardized René Girard-inspired theology that very conveniently justifies whatever he does. [00:27:47] It's so nice for him how that tends to happen. [00:27:50] It's very convenient. [00:27:51] By the way, did I ever tell you that I happened to listen to one of these talks, kind of as a jump scare, in person in Hungary? [00:28:00] Oh, yes. [00:28:02] Where he was announced as a surprise guest at the MCC Fest. [00:28:07] Of course. [00:28:07] The Mathias Corvinus Institute. [00:28:10] Yeah. [00:28:11] Collegium. [00:28:11] Yeah. [00:28:11] Collegium. [00:28:12] Yeah. [00:28:13] That was wild. [00:28:15] The lecture on the Antichrist was very slapstick-like, interrupted by an incredible thunderstorm, because this was open air. [00:28:24] So it was quite poetic. [00:28:25] But so. [00:28:26] Can I ask which one did you hear? [00:28:28] Because there's four of them by now. [00:28:30] Although, frankly, the fourth one is mostly filler. [00:28:32] Yes. [00:28:33] I heard the one where he rambles on a lot about the atomic bomb, Greta Thunberg, and the katechon. [00:28:41] You got to be more specific. [00:28:42] Oh, the katechon is, I think, mostly two. [00:28:44] The first one is very art-historical. [00:28:47] No. [00:28:48] Okay. [00:28:48] Yeah. [00:28:48] No, it's mostly complaining about how certain scientists he doesn't like and figures like Greta Thunberg will make... AOC. [00:29:05] He did bring his PowerPoint. [00:29:07] Yeah, he did. [00:29:09] I wanted to record it secretly. [00:29:11] Unfortunately, the only space left was right next to his security team. [00:29:14] So I was like, I'm not going to do this. [00:29:18] Anyway, so these people seem to have, in their own ways, radicalized in a way that makes it very interesting to ask the question that your book poses: what are the sort of common denominators when it comes to these people's ideas of power and ruling and governing and hierarchy in general? Have you...
[00:29:44] I mean, I know you found an answer to the question. [00:29:48] What stood out to you the most there? [00:29:50] So, one is definitely the fact that they have a picture of power that rules without friction and is recognized without friction. [00:30:01] So there is a kind of, it's almost a fantastical version of power. [00:30:06] And then I sort of talk about the fact that they fly into sort of science fiction so quickly. [00:30:11] It's often very clear in AI talk. [00:30:15] AI may well become dominant, but every technology known to humans had to be implemented in a society through social means. [00:30:26] It's not straightforward. [00:30:28] The train line can go this way or it can go that way. [00:30:30] A printing press can be, you can issue licenses for it, et cetera, et cetera. [00:30:35] It'll make some people rich and powerful, it'll immiserate some other people. [00:30:38] These are social decisions. [00:30:39] I mean, if you hear AI boosters talk about it, it's like this power will act almost on its own. [00:30:46] Kind of like a force of nature, almost. [00:30:48] Like a force of nature or a divine force. [00:30:50] I mean, again, I'm thinking of the Muad'Dib here in Dune. [00:30:53] Like, this is how they envision this. [00:30:55] Like, it's just fated, which in a weird way then makes them kind of bystanders of their own power. [00:31:01] And you can see that with Elon Musk very well, that this guy seems genuinely surprised sometimes by the fact that he has this influence. [00:31:07] Like, why would you be surprised? [00:31:08] You're basically the richest man on earth. [00:31:10] Like, why? [00:31:11] What do you think happens to the richest man on earth? [00:31:13] Do you think he might be influential? [00:31:14] I think he might be influential. [00:31:15] And this ability to wield power while being disidentified from it, I think, is very, very key to what tech calls governing. [00:31:26] And it is, in fact, the big commonality that it has with the Trump administration. [00:31:32] We have a president who's very capable of saying, someone really should look into this. [00:31:36] Like, yeah, man. [00:31:38] God, do you know someone who runs the government? [00:31:39] Yeah. [00:31:42] They're capable of incredible insurgent energy and claiming to stick it to the government. [00:31:48] I always think of, there's a scene, I mean, this dates me, but there's a scene in Apocalypse Now where Martin Sheen gets to this battle somewhere in the Vietnamese jungle, and he crawls around in these trenches forever. [00:32:01] And he keeps asking, who's in charge? [00:32:03] Where's your commanding officer? [00:32:04] Who's in charge? [00:32:05] And he meets these two guys. [00:32:06] He goes, who's in charge? [00:32:07] And they go, ain't you? [00:32:09] And it's like, yeah. [00:32:10] Okay. [00:32:10] If someone's in charge, it's you guys. [00:32:12] Like, if you want to talk to the manager, it is you. [00:32:16] You are the manager. [00:32:17] The call's coming from inside the house. [00:32:19] Exactly. [00:32:19] The White House. [00:32:20] Yeah. [00:32:21] And so that is to me a really constitutive element of this: a power that cannot fully identify with itself, partly because it doesn't fully believe itself. [00:32:33] That is to say, these guys have a kind of fascist theory of power, and they themselves judge themselves to be deficient in it.
[00:32:42] They kind of think, no, but, like, I'm a nerd, I'm a weirdo, I'm a creep. I cannot build this power, because the way I've been taught that power should look, I don't look like that. [00:32:54] If you want to get a picture of the ideal, you only have to look at the bizarre pictures, the AI-generated slop, that Elon Musk will retweet of himself. [00:33:05] That is the guy who would have the power. [00:33:07] That is not Elon Musk. [00:33:08] And Elon Musk knows that. [00:33:09] No amount of jaw chiseling will do that for him. [00:33:12] No amount of looksmaxxing. [00:33:13] Looksmaxxing. [00:33:14] I don't know if he looksmaxxes. [00:33:16] I hope he smashes his jaw with a hammer, but I don't think he does. [00:33:18] There is a chance. [00:33:20] Yeah. [00:33:20] No, no, I think he just gets tons of plastic surgery. [00:33:24] And my guess is the nootropics. [00:33:27] Luckily, those don't do anything, but, you know, they share this hope. [00:33:30] But yes, I think it is this kind of power that stands beside itself, because it partly can't fully believe itself, because it is actually more invested in the power trappings of American liberalism than it cares to admit. [00:33:42] It sort of carries that residually with it. [00:33:45] The same way some democratic parties that seem to be tilting towards the right sort of seem to think that these institutions that rein in fascists are just going to be around forever. [00:33:54] It's like, oh, you sweet summer children. [00:33:56] Like, no, those things die. [00:33:58] Yeah. [00:33:59] But they kind of can't quite believe it. [00:34:01] They're like, well, it's not going to be like back then. [00:34:03] It's like, well, that's what they said back then. [00:34:05] And the other thing, I think, is that part of why this power doesn't feel full and self-assured is that it ultimately is, you know, something of a sign of weakness as well as of strength. [00:34:22] And I know that's paradoxical. === Entitlement Meets Embattlement (15:19) === [00:34:25] These people are as dominant as they've ever been. [00:34:27] At the same time, on the whole, they should not like where they are. [00:34:31] And I don't know if they do like where they are. [00:34:33] There was a moment when Meta, Alphabet, et cetera, et cetera, were like, well, we're not American companies. [00:34:39] We're companies for the world. [00:34:40] Everyone Googled: Germans Googled, Pakistanis Googled, Americans Googled. [00:34:43] Great, wonderful. [00:34:45] Now we're getting to the point where they're, as you say, fused to the US security state, to where SpaceX is the space program, which means that suddenly you're identified with a national project. [00:35:00] That is, for a corporation, not the best thing, especially if the guy running your country keeps starting trade wars. [00:35:07] Very easy to imagine that the EU at some point is like, well, okay, we'll just ban X. Like, we know it pisses off a good friend of Donald Trump's. [00:35:14] And honestly, no one's going to miss this fucking sewer at this point. [00:35:19] Like, it's not like depriving our citizens of T-shirts. [00:35:22] We're depriving them of the Nazi swamp. [00:35:23] The Nazi swamp is gone? Horrible. [00:35:24] Yeah, we're depriving them of the Nazi swamp. [00:35:26] So there's that. [00:35:27] There is the fact that, with a lot of these undertakings, people just seem to be voting with their feet.
[00:35:32] I mean, the metaverse to me is always a really interesting thing. [00:35:34] It was sold to us with the same air of inevitability. [00:35:37] It's worth looking back at coverage of the metaverse in the media. [00:35:40] It was wild. [00:35:41] Yeah, people are like, oh, it's inevitable, it's all coming. [00:35:43] It sounds a lot like the way they talk about AI, to be honest. [00:35:46] It does. [00:35:46] And in the end, people were just like, I want legs, man. [00:35:49] I don't have legs in this. [00:35:50] This is weird. [00:35:51] Yeah. [00:35:52] I got sick from the Oculus Rift device that I wore. [00:35:55] I think it was kind of an early version, they were kind enough to demo an early version for me, but it was fun for 10 minutes. [00:36:01] And I was like, I'm going to fall over this trash can. [00:36:05] I need to take this thing off. [00:36:06] And they're like, yeah, it does tend to happen. [00:36:07] I'm like, OK, but how do you think someone's going to attend a conference in this? [00:36:12] Again, to game with, it was pretty fun for 10 minutes. [00:36:15] But the idea of having to hear someone lecture in this is, like, honestly among my top five nightmares, yeah. [00:36:22] And also, the identification with the Republican right. You know, depending on what happens in November, it does look like the Republicans did not succeed in fully coordinating the state and abolishing democracy to the extent that they'd hoped. [00:36:40] Let's be clear, which means that there is a non-zero risk of a Democrat eventually being in power. [00:36:46] And while one should never underestimate the Democrats' capability for yellow-bellied cowardice, I do think that the incentive to be nice to these people is close to zero. [00:37:00] And the appetite for really showing them the business, I think, is fairly high among the electorate. [00:37:10] Oh, yeah, definitely. [00:37:12] I mean, I think, you know, as much as I am not a fan of making political decisions based on the results from focus groups, I think that would poll very well. [00:37:25] Yeah. [00:37:26] There'd be no downside, right? [00:37:28] And the same with the EU and a trade war on Silicon Valley. [00:37:32] I think it's just, they've maneuvered themselves into positions where they're heavily identified with very specific projects that don't appear to be that long-lived. [00:37:45] That is not the bet they wanted to be making, I think. [00:37:48] I think the bet they made after 2008 was much more sustainable, which was to basically make yourself identical with neoliberal, sort of globalized capitalism and sort of very wishy-washy, you know, rainbow-coalition rhetoric, while being controlled entirely by mostly white men, right? [00:38:10] The bet on that surviving, I think, was a lot better than the bets they're making now. [00:38:16] And those are forced bets. [00:38:18] Those are not bets they wanted to be making. [00:38:20] So that's the other thing. [00:38:21] It's a power that sort of understands itself to be in decline. [00:38:24] I always think of this, I mean, sorry. [00:38:28] Quinn Slobodian and Ben Tarnoff, in their new book on Muskism, talk a little bit about Musk's South African background.
[00:38:35] And it's hard not to think of sort of late-apartheid kind of politics, where you can see which way the wind is blowing, but you have this weird politics of forestallment, politics of postponement, politics of sort of delaying the inevitable. [00:38:52] And I think that's kind of where they're at. [00:38:54] They're all living in the kraal somewhere. [00:38:56] And that's how they talk, right? [00:38:58] Like, Marc Andreessen basically can. [00:39:01] He can hear the, at the moment kind of imagined, but not really, masses at the gates of his lair, his mansion, his whatever. [00:39:13] I think that is a part of the sense of dread that seems to propel these people: they know that in spite of their immense power, there might be a reckoning coming for them, in whatever way that might look like. [00:39:30] Yeah, I mean, the bet has to be that parliamentary democracy or pluralistic democracy will not survive until the reckoning comes. [00:39:38] One big, I mean, again, I'm careful about pronouncements about AI. [00:39:43] I seem to be the only person on the planet who's careful about this. [00:39:47] I'm like, I'm a literature professor. [00:39:48] I don't know. [00:39:49] But I can tell you what I know as someone who's been around for many Silicon Valley hype cycles: that it is a hype cycle. [00:39:54] Like, sometimes there's something real behind a hype cycle, sometimes there's not. [00:39:57] What's very noticeable is how they talk about it. [00:40:00] For them, this is class warfare against the very people who humiliated them in the pandemic. [00:40:06] It's white-collar workers who said, I don't want to come in. [00:40:08] I don't want to get sick. [00:40:10] I want to have childcare leave. [00:40:11] I want to have childcare leave, et cetera, et cetera. [00:40:14] Like, they think they can make them obsolete. [00:40:19] And they can finally fire them, which means that they kind of also dry out their surroundings at the same time, poison the drinking water. [00:40:26] Right, there's that. [00:40:27] Yeah, I mean, that's a bad bonus, I guess. [00:40:30] But the interesting thing is, their hope is, how do you do that? [00:40:33] Well, you declare war against sort of the professional-managerial class, as Barbara Ehrenreich would call them, which can sometimes align with management, sometimes align with workers, because they're both. [00:40:45] Yeah. [00:40:45] And so their vision is that, sort of like a people's tribune, they bypass the white-collar workers straight towards the Trumpist base of just, like, proletarians, basically, or subproletarians, really, lumpenproletariat, right? [00:41:03] That they meet online. [00:41:05] But their bet, I think, on the size of that lumpenproletariat is wrong. [00:41:10] Like, I think it's actually a pretty foolish bargain, which means that they're going to have, you know, workers, their managers, everyone mad at them. [00:41:19] Like, I do think this is, ultimately, I wrote myself into a place of kind of hopefulness in this book. [00:41:23] I do think that they're not in a good place. [00:41:26] That's quite rare in our field of work. [00:41:28] I know. [00:41:29] I was kind of pleasantly surprised when I was reading the book. [00:41:31] I was like, oh, this is taking a slow but pleasant turn. [00:41:35] I think, like, I mean, yes, if they succeed in abolishing democracy, then we're fucked. [00:41:40] I don't know.
[00:41:41] I think they're not. These are the people who invented the metaverse. [00:41:44] Like, I'm not sure they can deliver the death of democracy. [00:41:47] We can hold on to that morsel of hope. [00:41:49] Yeah. [00:41:49] Killing democracy is best left to real professionals like Friedrich Merz. [00:41:53] Oof. [00:41:55] That man is just plowing forward. [00:41:58] Yeah. [00:41:59] I just. [00:42:00] We have to talk about Karin Prien at some point. [00:42:02] Oh, God. [00:42:03] Yeah. [00:42:03] Jesus. [00:42:03] Yeah. [00:42:05] What a sweetheart. [00:42:06] But yeah. [00:42:06] Every German listener just now at this moment heaved a very big sigh at my utterance of this name. [00:42:12] But yeah. [00:42:14] But yeah, so that I think is my hope. [00:42:20] I do think that I see the plan, I see the overall direction in which someone like Andreessen or someone like Thiel, I think, sees things going. [00:42:28] Yeah. [00:42:28] But I also know that these are billionaires ensconced in their weird palaces, surrounded by weird yes-men who are frankly grifting them. [00:42:35] Why don't you take this pill? [00:42:36] Why don't you run for governor? [00:42:38] Like, why don't you put a weird proposition on the ballot? [00:42:41] Like, they're mostly surrounded by, sorry, parasites, you know, like the kinds of people who waited on Louis XIV. [00:42:48] And I do think that that inhibits your ability. [00:42:53] And then they go online and are around these crazy-making spaces like Twitter. [00:42:59] Stop talking about IQ all the time. [00:43:01] Exactly, right? [00:43:02] It does become this kind of thing. [00:43:05] It's not a movement that is built to create greater and greater coalitions. [00:43:13] It is one that, in fact, does the opposite. [00:43:15] It just shuts more and more people out, which is ironically also, not ironically, which is very logically also true of Trumpism. [00:43:22] I'm working, as you mentioned, on the 1933 book right now. [00:43:26] And one of the things that's very spooky about the Nazis in their first year is how they figure out how to play to constituencies that were not convinced Nazis by January 30th, and how they really managed to sort of, like, not necessarily convince them, but make them sort of speak their language or participate in a particular aspect of it. [00:43:44] Yeah, make them feel like maybe just holding still and waiting is the right thing to do, or have them participate in this one kind of immediate way, but not fully go in. [00:43:54] And that's okay, that's okay. [00:43:56] And if you look at what Trumpism has done in its first two years back in power, it is not that. [00:44:01] Like, it essentially seems to be designed for the kinds of people who reply to Elon Musk and/or JD Vance tweets with three fire emojis. [00:44:12] And it's like, well, that's not how you build a coalition. [00:44:15] That is how you distill the next group that's going to storm the Capitol. [00:44:19] Like, these people are going to end up in a deeply regrettable place and/or a mass suicide. [00:44:25] But, like, it's not. [00:44:26] This is Jonestown logic. [00:44:27] It's not sustainable. [00:44:28] Yeah, it's not sustainable. [00:44:30] It's not a growth model. [00:44:31] And I feel the same way about Silicon Valley, or about its political ambitions. [00:44:36] Yeah.
[00:44:37] I find it quite interesting, because you also spend some time in your book, when you write about the way that this tech elite sees the world, on how they are trying to find proof, because you just said they're so insecure in their power. [00:44:55] They're trying to find natural, biological proof of their supremacy, which they themselves don't really believe in, but think they should have, but can't really seem to grasp. [00:45:07] So I think it's quite fascinating to see this. [00:45:10] There's this incredible cruelty and callousness to the way of thinking that you describe. [00:45:16] And at the same time, there's this sense of entitlement combined with the sense of embattlement that we've been talking about. [00:45:23] Neediness. [00:45:23] It's a desperate neediness. [00:45:25] It's not enough to make the rules, to dictate how the world works, what technology people use, who lives and dies when we're talking about USAID. [00:45:35] In their minds, it's offensive to be held accountable in even the vaguest of ways, yeah. [00:45:40] I mean, I'm sure we've all seen the DOGE bros, uh, you know, the equivalent of Big Balls' testimony. [00:45:47] Um, that's a sentence I never thought I would say, anyway. [00:45:50] Yeah, that's the other thing. [00:45:51] They debase the way we talk. [00:45:53] We all have to, like, talk about freaking Palantir. I haven't talked about palantíri since I was 14 and reading Lord of the Rings, and here now I'm discussing Palantir. [00:46:04] That's wild. But, you know, I find that... [00:46:07] You write about this posturing as the underdog, even as they're, I think that's the example you gave, as they're dismantling USAID, effectively condemning to death people they don't deem worthy of living. [00:46:22] And I hope you don't mind. [00:46:23] I'm going to do the awkward thing where I quote your words back at you. [00:46:28] And I will take full responsibility for the translation. [00:46:31] This is not Adrian's work. [00:46:33] I translated your German quote into English. [00:46:35] So sorry in advance. [00:46:37] You wrote: this makes them susceptible to modern populism, which does not believe in its own populism, and to authoritarianism, which never truly believes itself to be in power, not even when it exercises power in a dictatorial manner. [00:46:52] And that really reminded me again of Peter Thiel and the way he thinks about himself. [00:46:57] Yeah. [00:46:58] That's kind of the distillation of this. [00:47:00] Exactly. [00:47:01] A guy who thinks that it is bullying to out him, and it is not bullying to destroy the newspaper that outs you. [00:47:09] I mean, that's it. [00:47:11] Your theory of power is like that of a comic book. [00:47:17] It's not based on how society works or, you know, an objective analysis. [00:47:23] It's this kind of sense of aggrievement. [00:47:25] It's kind of adolescent. [00:47:27] It's very much hyper-individualistic. [00:47:31] It's fantasy. [00:47:32] We're living in their LARPing, essentially. [00:47:36] And when we look at how all of this relates to Donald Trump, I think the parallels are quite obvious. [00:47:41] But what you do is you basically go a bit deeper, you zoom out, and you write the following. [00:47:50] Yes, I do have a second quote about the way that this dominance play, like dominance LARPing, almost, is performed by these tech billionaires and those who think
[00:48:01] like them. You write: it draws its strength from a lack of understanding, indifference, even ignorance. [00:48:07] It does not understand the things it dominates, and it does not respond to this lack of understanding with shame, but with a perverse pride. [00:48:15] And the example you give, this is one of my favorite parts of the book, is how this relates to the Trump regime: Scott Bessent and what he said about the Federal Reserve. [00:48:25] Yeah, that's a great quote, isn't it? [00:48:27] Where he's like, I don't know what anyone over there does, basically. [00:48:32] First of all, Scott Bessent knows what they do. [00:48:34] He's been in the business for 20 years. [00:48:36] Second of all, it was very clearly a bid to become Fed chair. [00:48:40] The idea that you're like, I don't know this thing. [00:48:43] Oh, let's put you in charge of it. [00:48:44] Like, that's deeply frightening. [00:48:46] Yeah. [00:48:46] Yeah. [00:48:47] But also the fact that the ignorance is kind of feigned. [00:48:51] It's not real. [00:48:53] It's often that these people forget expertise that they have. [00:48:57] And that is the corrosive thing. [00:48:59] Like, Big Balls really didn't understand USAID, but there were plenty of people who did understand what it did and sort of chose to accede to that logic. [00:49:07] I think that's how a lot of us are with AI. [00:49:09] Every time we hear the phrase, oh, well, the AI is not fucking ready yet or whatever. [00:49:13] And I'm like, it's never fucking ready. [00:49:15] Have you noticed this? [00:49:15] Like, every time it's like, well, yeah, right now it gives you all this kind of garbage, but, but, but, down the line... And I'm like, okay, maybe, but it feels very weird that people are losing their jobs to a technology that is not quite ready for prime time. [00:49:33] Like, imagine that with self-driving cars. [00:49:36] Self-driving cars are pretty freaking good, I have to say. [00:49:40] But imagine if they ran over people at a constant clip. === The AI Readiness Myth (05:02) === [00:49:45] And you're like, well, I mean, it's not perfect, but we should fire all the Uber drivers. [00:49:50] It's wild. [00:49:51] I find it very strange the way we sort of allow our idea of something working to kind of become almost like a folk mythology of how something ought to work, to sort of become the way people decide it works, even if they know better. [00:50:10] Like, Big Balls didn't know any better. [00:50:13] He's 22 and an idiot, you know? [00:50:15] Yeah. [00:50:15] That's not to defend him. [00:50:16] He shouldn't have taken that job, I guess. [00:50:19] But he is, I think, genuinely an idiot. [00:50:21] But Scott Bessent, I think, is not genuinely an idiot. [00:50:24] He just plays one on TV. [00:50:25] Or for his boss. [00:50:27] And I think that is far worse. [00:50:29] And the fact that people who ought to know better are acceding to these things over and over again, and are sort of playing along with power that isn't yet fully matured, like AI, but that has to be respected and acceded to as though it were already mature. [00:50:46] That, to me, is the central pathology of our moment. [00:50:50] One key moment in the radicalization of the tech elite that you point out in your book is Mark Zuckerberg's congressional testimony in, I think, was it 2018? [00:51:00] I think it's 2018, yeah.
[00:51:02] Which is really interesting, because in that hearing, he basically got it from both sides. He got it from both Democrats and Republicans. [00:51:10] What to you was the difference? [00:51:12] Why was the Republican criticism less offensive to him? [00:51:15] Because I think this tells us quite a lot about the sort of switch-up in maybe affiliation with one party that kind of seemed to have happened here. [00:51:26] Yeah, I mean, Zuckerberg afterwards had this meeting with Facebook leadership where he basically talked about an existential threat to Meta and to Facebook. [00:51:37] And that was very clearly about Elizabeth Warren's questions, because they essentially were arriving at antitrust questions, which really does concern the integrity of the company. [00:51:48] Like, if followed to their conclusion, they would suggest that there cannot be a single Meta, that it has to be broken up, and that you'd apply the logics of US antitrust law, which once used to be quite robust. [00:52:00] But from the Republicans, of course, he got these kind of, yeah, what we today would call the anti-woke questions, about, like, free speech concerns, shadow banning, et cetera, et cetera, which are easy as shit to accede to. [00:52:15] One little tweak of the algorithm, like, make Tucker Carlson your fact-checker-in-chief, and Bob's your uncle. [00:52:20] Like, it's very, very easy. [00:52:22] So that is the thing. He obviously, just as an experience, hated being there and hated getting it from both sides. [00:52:31] He must have realized he had very few friends in that room. [00:52:34] But he must have also sensed, or whoever was with him must have been like, well, one of these is extremely easy to satisfy, because they're not upset at our power. [00:52:46] They're upset that our power doesn't uplift white men enough, right? [00:52:51] We can do something about that. [00:52:52] We can do something about that. [00:52:53] The other one is really about how we wield power. [00:52:55] And, well, that's fucking central. [00:52:58] That's the whole ball game. [00:53:00] And so it sort of makes sense. [00:53:03] That is the kind of forking path that a lot of these companies find themselves at. [00:53:08] And I mean, in Zuckerberg's case, it coincides with a bunch of other things that are happening. [00:53:16] I think the turn towards the metaverse is sort of him trying to prove to himself that he is a genius. [00:53:21] And it is just a kind of midlife crisis project, an $80 billion midlife crisis. [00:53:27] And at the same time, gender anxieties, I would also say. [00:53:29] Exactly. [00:53:30] The gender anxieties around Sheryl Sandberg, the fact that they had gotten professionalized by this lady. [00:53:37] So there's all this stuff that's sort of going on for him. [00:53:40] So it's not the only thing. [00:53:41] But I do think that that's a key moment. [00:53:44] And I do think, you know, on the one hand, I don't know how much the Republicans knew they were rolling out a red carpet for him, because it could have sounded like hectoring. [00:53:54] But at the same time, it's not an accident that they understood each other. [00:54:02] Like, they looked at that company and were like, well, this is very much an average American company where, you know, white male Americans are in charge. [00:54:10] Like, we can like this. [00:54:11] We don't like that it won't allow us to Holocaust deny.
[00:54:15] And Mark Zuckerberg is like, well, I can do something about that. [00:54:18] That's easy peasy. [00:54:19] Some light Holocaust denial just comes with the territory. [00:54:24] Sure. [00:54:26] And then it becomes this kind of thing where the Republicans, who also, of course, as you know, love this pose of victimhood, can identify with the CEO as a victim. [00:54:34] I mean, I would urge people to look back at early statements about Twitter when it got taken over by Musk. [00:54:42] Or at the reactions to actions against Tesla after Musk started DOGE. === Metaphysical Forces and Skynet (09:46) === [00:54:47] There was talk of, like, compelling advertisers to stay with the platform by law, because, like, that was infringing on Elon Musk's, you know, white supremacist free speech. [00:54:59] Delta Airlines was like, I don't know if we need to be next to, you know, CSAM material. [00:55:04] Like, I think we can just advertise on Instagram instead. [00:55:07] Yeah. [00:55:07] And it's like, no, you must speak there now, and you must, you know, hang out with Elon and his Nazi buddies. [00:55:14] Probably state power looked a lot more fun. [00:55:16] Yeah. [00:55:16] And I believe it was Pam Bondi who suggested that people who scratch people's Teslas could be prosecuted for terrorism. [00:55:23] Sure. [00:55:24] Suddenly, it's all about consumer choice until, you know, you can identify in victimhood with that CEO, and you're like, if you don't buy this guy's shit, you're a criminal, okay? [00:55:36] And this is what fusion with state power looks like. [00:55:39] It's a grievance complex on both sides, and they found the perfect way to express it together. [00:55:45] You point out that a lot of the way this tech elite thinks about governing is incredibly [00:55:55] dark, almost like a celebration of not just dystopia, but of nihilism, of dystopia as utopia. [00:56:05] And because you, like me, I would say, have moments of being chronically online, you cite an evergreen tweet, which is also, I think, very illustrative of the weird relationship, you've mentioned it already, that these guys have with science fiction and fantasy. [00:56:23] Yeah. [00:56:24] You quote it: [00:56:27] At long last, we have created the Torment Nexus from classic sci-fi novel Don't Create the Torment Nexus. [00:56:32] Yeah. [00:56:33] And given that Peter Thiel, a fantasy and sci-fi connoisseur, named his surveillance machine Palantir, why is that so bizarre? [00:56:41] And why is this such a weird thing with these guys? [00:56:45] Well, the whole point of the Palantir is that it's a technology that destroys you, that destroys the wielder. [00:56:51] The one main Palantir we meet in Lord of the Rings basically drives Saruman, who uses it, insane. [00:56:58] But it's par for the course for these guys. [00:57:00] They tend to identify. [00:57:03] I mean, how many AI ads have you seen that basically evoke Skynet from the Terminator movies? [00:57:09] Like, in a hype cycle, dystopian warning and utopian promise almost fall into the same thing. [00:57:17] And that's why I would always be careful, for people who think themselves critical of this industry, not to fall into AI doomerism. [00:57:26] I think you're giving up something very, very central, which is: [00:57:31] this is not an all-powerful technology.
[00:57:32] This is a technology that will be deployed and, at very crucial junctures, regulated by human beings one way or the other, whether it's state regulation or the companies themselves. [00:57:43] And short of, like, oh, it gets a hold of the nuclear codes or whatever, which, you know, if that happens, we're all just fucked. [00:57:51] But very, very likely, these p(doom) scenarios really are the same as the kind of, oh, we won't have to worry about X anymore because of AI. [00:58:03] Like, it's just hand-waving away the process of doing politics, the process of having a society where these things are decided, where people get together and say, well, I don't like the way this is being deployed, et cetera, et cetera. [00:58:14] Or, I don't like this being automated. [00:58:15] I don't think it should be. [00:58:16] Or, you know, where a company decides, we keep getting sued for this. [00:58:21] Yesterday I was introduced to our new AI financial tool at Stanford, which I can ask questions. [00:58:27] It's basically a glorified Google, but it does hallucinate. [00:58:31] So they're like, when it says you can get this approved, don't take that for gospel. [00:58:35] And I'm like, well, what happens if I do? [00:58:37] And they're like, well, for the time being, while the technology again is not fully matured, we'll let it go through. [00:58:44] And I'm like, okay, so basically you're double dog daring me to find bizarre shit that this thing will approve and just buy the fuck out of it. [00:58:52] Is that the idea? [00:58:54] This is bizarre, but I guess there you can sort of see where this is going, very likely, which is that there are going to be use cases for AI where companies judge that the possible exposure, in a legal sense, is minimal. [00:59:12] And then it's, okay, fine, we'll fire the people. [00:59:15] There can be other things where they're like, it's really important to have someone to blame, and it needs to be Steve, not Gemini. [00:59:20] And it has to be Steve who fucked up. [00:59:22] Like, who the fuck approved this thing? [00:59:24] So, to get back to our Palantir, I think the reason tech billionaires like this stuff is because it's total and it's mythical, and it's power that erupts into people's lives without having to integrate itself, without having to go through the layer of the social. [00:59:46] And it's a fantasy. [00:59:48] But it's also a promise. [00:59:49] That's how they think their technology is going to work. [00:59:51] When in fact, we know full well [00:59:53] how these technologies usually integrate themselves into society: by meshing with the social, where certain things will be prohibited, some things will be allowed, some things will indeed go by the wayside, some new jobs will be created. [01:00:07] Like, there's a kind of noise, or, what does Clausewitz call it, friction? [01:00:13] There's a friction that normally occurs when something like this is introduced. [01:00:20] And I think what something like the Palantir, or something like Soylent or whatever, [01:00:26] or Skynet, is supposed to evoke is the unremitting. [01:00:29] This is the angry god. [01:00:30] This is a metaphysical force. [01:00:33] And I have got some news for you. [01:00:34] I don't think it is. [01:00:35] I don't think it is a metaphysical force.
[01:00:36] I think it is a more or less okay plagiarism technology, or probabilistic generation technology. [01:00:46] And you will find that it has some very serious limitations. [01:00:49] And there are parts of society that it will not find entry into. [01:00:53] And it will call for more politics, not less. [01:00:55] But I think something like Palantir as a name is meant [01:00:58] to move it towards the metapolitical or apolitical. [01:01:04] So, one of the conclusions that you draw in your book, also when reflecting on your other work, is that what this tech elite calls thinking or governing is very compatible with fascism. [01:01:23] Yeah. [01:01:25] Can you give us an example of what you mean by that, that really distills this for you? [01:01:30] Yeah, the big thing is the hierarchies. [01:01:32] They position themselves as hyper-libertarian. [01:01:36] But if you looked closely and you asked, well, if the state were to withdraw, what would emerge? [01:01:42] Well, white men would naturally be at the top. [01:01:44] It's like, okay, so there is actually a very serious gendered and racialized hierarchy behind this. [01:01:51] And that is, I think, the big compatibility. [01:01:53] Like, someone like Marc Andreessen experiences Trumpism as a return to normal and to what it ought to be like. [01:02:00] Because ultimately, he thinks white men are better. [01:02:03] Like, a lot of the MAHA shit, which is very popular in Silicon Valley, is just eugenics with extra steps. [01:02:09] But honestly, this was true when, you know, a bunch of Silicon Valley influencers started thinking about Stoicism, when they started thinking about, you know, biohacking. [01:02:20] Like, all this stuff is ultimately eugenicist. [01:02:23] And it's soft eugenics, and it's often eugenics with a self-help book, but it is eugenics. [01:02:29] And I think that idea, that there is an elite, [01:02:33] and it is biological, it's innate, and that any encroachment upon it is government interference, is something that American fascism shares with Silicon Valley. [01:02:47] And before we finish up, can you tell us where listeners can connect with you and your work? [01:02:52] Because speaking of fascism, I hear you've just finished yet another book. [01:02:58] Yes. [01:02:59] So Project 1933, Fascism Then and Now, is coming out [01:03:03] probably late '26 or early '27. [01:03:07] It is going to be a hardcover. [01:03:09] It is going to be quite affordable, and I think people should pre-order it. [01:03:11] It is on Amazon and anywhere else non-evil where you want to buy your books. [01:03:16] You can also, of course, always check out In Bed with the Right, where I discuss gender and sexuality issues around right-wing politics, very broadly conceived, with my wonderful co-host and colleague, Moira Donegan. [01:03:30] We're just about to record an episode today on Guy Fieri, and do one on food, a food zone. [01:03:35] That's exciting. [01:03:36] Yeah. [01:03:37] Very exciting. [01:03:38] Who's not a terrible guy, it seems like. [01:03:39] So I'm excited. [01:03:40] Yeah. [01:03:41] No, the moment someone suggested it to us, we were like, yeah, maybe. [01:03:46] Sure. [01:03:47] And yeah, you can follow me on Bluesky and on Instagram. [01:03:51] And yeah, I'm always happy to hear from folks. [01:03:56] Oh, yeah, and What Tech Calls Governing will come out in English in August. August 18th, I believe, is the pub date.
[01:04:02] Normally, these things drop a couple of days early. [01:04:04] So, again, if you would like to pre-order, that would be great. [01:04:07] Yeah, I would. [01:04:09] I think it's quite affordable. [01:04:10] Again, my books are always cheap. [01:04:13] And I would love to hear people's opinions of it. [01:04:17] All right, I'm going to ask Adrian one more question about tech CEOs who post AI-generated pictures of themselves as gladiators [01:04:24] and in togas. Subscribers, you can stick around. [01:04:28] And if you are not a subscriber yet, today is the best time to sign up. [01:04:32] You can check our show notes to get access.