True Anon Truth Feed - Episode 340: Terminator vs. Avatar Aired: 2023-12-14 Duration: 01:37:08 === Grok's Woke (07:14) === [00:00:02] Welcome, babies and gentle daddies to the adult baby episode. [00:00:07] You've done this like four times to people. [00:00:12] Brace. [00:00:12] Yes. [00:00:13] Did you hear? [00:00:14] What? [00:00:14] Grok's woke. [00:00:16] Oh my god. [00:00:17] Wait, no. [00:00:17] Grok's. [00:00:18] We got to go talk about something else. [00:00:19] Grok's woke. [00:00:19] You're talking about the AI Grok is woke? [00:00:22] Grok's woke. [00:00:24] I want to say real quick, because Young Chomsky and I were just talking about this before you walked in. [00:00:30] I, one, I can't stop saying Grok's woke. [00:00:32] Grok is woke. [00:00:33] But I can't stop saying it like, one, in a kind of like feverish, like, Grok's woke, Grok's woke, Grok's woke. [00:00:40] Like, my brain has broken and I'm just like, Grok's woke, Grok's woke. [00:00:44] Like repeating kind of a Beautiful Mind style, though I haven't seen that movie. [00:00:48] Me either. [00:00:50] But that's what I imagine happens in it. [00:00:52] Grok's woke as he's doing the like calculations. [00:00:54] Grok's woke, Grok's woke. [00:00:55] But also, like, I feel like it should become a salutation. [00:01:00] A sort of like, you walk in the door and you're just like, hey, Grok's woke. [00:01:04] Oh, Grok's woke to you. [00:01:06] Yeah, guten Morgen. [00:01:07] Grok's woke. [00:01:08] What is, what is, can you explain what Grok is to any listeners that might not be familiar with Grok? [00:01:14] It's Twitter's AI bot. [00:01:15] Twitter's AI bot. [00:01:16] Yeah, that has the same logo as Deutsche Bank. [00:01:19] And can you explain to me what woke is? [00:01:22] In your own words. [00:01:25] Well, woke is like a religion when you think about it. [00:01:27] I am thinking about it. [00:01:29] Yeah. [00:01:30] So woke is. [00:01:31] But I just, I can't, there was a tweet from, unfortunately, from Tim Pool. [00:01:35] But I have to say. [00:01:36] Grok is woke. [00:01:37] He said Grok's woke, unfortunately, or something like that. [00:01:40] It's like stuck. [00:01:41] It's stuck in my craw. [00:01:42] Grok's woke. [00:01:43] Grok's woke. [00:01:44] Grok's woke. [00:01:45] This is the new Oatly. [00:01:45] Grok's woke. [00:02:16] Ladies and gentlemen, Grok may be woke, but my co-worker Liz is certainly not. [00:02:23] Well, actually, I would say she is, except for one ethnic group. [00:02:27] Welcome to True Anon. [00:02:29] My name is Brace. [00:02:30] I'm Liz. [00:02:31] I was going to say real quick, I thought you were going to go, whatever you just said, which I can't remember, which is like, Grok may be woke, but my co-host is sleepy. [00:02:40] He's sleepy. [00:02:41] I'm sleepy. [00:02:42] I know. [00:02:42] So that would have been my version of ISO to you. [00:02:44] Anyway, we are also, of course, joined by producer Young Chomsky. [00:02:47] You know the drill. [00:02:47] And the podcast is called True Anon. [00:02:50] Hello. [00:02:51] Switching it up a little bit. [00:02:51] Hello. [00:02:52] We switch him. [00:02:53] Full disclosure, I haven't slept in a long time. [00:02:58] Grok's woke. [00:02:58] Grok is woke. [00:03:00] I wonder if Grok can give me. [00:03:02] One thing I do want to spend a second on Grok actually before we move into an episode here. [00:03:07] We've talked before, you hate the term Grok. [00:03:10] I really don't like it. [00:03:12] Prior to the, first of all, it also sounds like a Marvel character. 
[00:03:15] Like, I thought it was. [00:03:16] Is Grok a Marvel character? [00:03:18] I don't think so. [00:03:19] Is there one that sounds like that? [00:03:21] I don't know. [00:03:21] It sounds like something from the Avengers universe. [00:03:24] It does, yeah. [00:03:24] Or does it sort of Star Wars, is your thing? [00:03:27] Guardians of the Galaxy. [00:03:28] That's what it sounds like. [00:03:29] Which I guess would be the Avengers Star Wars. [00:03:31] Yeah. [00:03:32] I'll say that. [00:03:33] Wow, that just blew me. [00:03:34] The term is borrowed from the book Stranger in a Strange Land by Robert Heinlein, who is one of the many fascist writers I like. [00:03:48] One of the ones who probably put out, starting in the 1960s. [00:03:52] So Robert Heinlein was, you know, kind of this, was basically a fascist for much of his career and very polyamorous as well. [00:03:59] Interesting. [00:04:00] To, you know, something we've been touching on lately. [00:04:03] But he was like a progressive at first, and then I think in the 40s became just a strange kind of view. [00:04:10] I don't know if fascist is the right word. [00:04:12] Probably, I would say his ideal government would be like that of Admiral Horthy. [00:04:16] But he, in the 60s, he started to get really into the concept of incest. [00:04:23] And then that became a defining feature of his later work. [00:04:27] I don't remember if incest was like. [00:04:28] Was like doing it or just like as a concept? [00:04:31] Well, yeah, no, doing it. [00:04:33] Yeah. [00:04:33] Does he have siblings? [00:04:34] No. [00:04:35] Oh, I don't mean in his personal life. [00:04:36] No. [00:04:38] He was mostly just wife swapping. [00:04:40] But I mean, in the great human family, he was certainly committing incest, but not legally committing incest. [00:04:47] But in his books, I just remember as a kid, I read every single one of his books except for like the last couple. [00:04:52] But I remember my dad being like, listen, you should just stay away from these. [00:04:55] Not because it's like X-rated content. [00:04:57] It's cute, but he's just like, they're bad. [00:05:00] And I was like, you're out of your mind. [00:05:01] I'm going to read about the, I'm going to read these kind of later books he wrote. [00:05:05] They're horrible and they're all about incest. [00:05:07] But Stranger in a Strange Land is one I read. [00:05:08] It was his most famous book. [00:05:10] Yeah. [00:05:10] And it sucks. [00:05:12] It's like his first really bad book. [00:05:14] And it's, I have no idea how it's so influential. [00:05:17] It's very like Bowie, Man Who Fell to Earth. [00:05:19] And that's what Grok is from. [00:05:20] That's what Grok is from. [00:05:21] And what is Grok? [00:05:22] Grok is woke, but it's like the woke AI that, oh, Grok is like a deeper understanding of meaning and something, like it's like understanding on like a, I mean, it's been a thing I read when I was like 15, but it's like, it's like understanding on like a higher level than just merely like intellectual understanding. [00:05:42] But wait, I thought that when people used Grok online before Grok was born and became woke, which is probably what happened at the same time, Grok the AI, that Grok meant, I thought it meant to like grasp, like to kind of like grasp at, not to have like a, like when people use it online in like Reddit lingo. [00:06:01] Am I wrong? [00:06:02] No, it means that. [00:06:03] It does? [00:06:03] It does? [00:06:04] To like Grok at something, right? 
[00:06:05] To Grok something means to understand it. [00:06:08] Well, no, but grasp at is like. [00:06:10] But I mean in a kind of like vulgar way. [00:06:13] I don't. [00:06:14] I think it's like, I have a hard time groking this means like I don't really get it. [00:06:18] Okay. [00:06:18] A grok in the book is like a deeper, a deeper meaning. [00:06:23] Anyways, that's all to say is that Grok is, frankly, it's woke. [00:06:27] Yeah, Grok's woke. [00:06:29] And I think we need to make an anti-woke. [00:06:31] Anti-woke Grok? [00:06:32] Well, I don't know if we could call it Grok. [00:06:34] Grog. [00:06:34] Grok, but I think just maybe borrowing a concept from another Heinlein book, I think we should make an AI called incest, like I-N-C-S-T, all capital letters. [00:06:44] Maybe with a lowercase T or something. [00:06:46] Incestly. [00:06:47] Incest.ly. [00:06:49] And I think that should sort of be like a, frankly, I'll say it, a racist AI that we pit against Grok to make it, to sort of balance it out, to make it, to neutralize the wokeness. [00:07:01] Then we can possibly create something new from that. [00:07:03] Interesting. [00:07:03] I like that. [00:07:04] Beyond woke and anti-woke. [00:07:06] Beyond woke and anti-woke. [00:07:07] I want to talk at the top of the episode. [00:07:10] Most of the episode is about something else, but I want to talk about something. [00:07:13] The episode can be whatever we want it to be. [00:07:15] That's facts. === Why Colleges Distract (11:39) === [00:07:16] But I want to talk about something that's really important to me, which is Ivy Leagues. [00:07:20] As many listeners know, I went to every one of the Ivy Leagues. [00:07:24] You collect them all. [00:07:25] I collected them all. [00:07:26] I have a business degree here, philosophy MA there. [00:07:30] I have a BA from other places as well, and a PhD from any number of places, and an honorary degree from Penn, which I did not know was an Ivy League. [00:07:42] How many are in the Ivy League? [00:07:45] What is there? [00:07:45] So there's. [00:07:47] I can't believe I'm doing that. [00:07:48] Well, you don't have to name them all. [00:07:49] Okay. [00:07:50] But there's like whatever. [00:07:51] Eight, seven? [00:07:52] They're all very important to me, and I believe they form the bedrock for our, what I like to call the little Eichmanns of the future. [00:08:00] That's lovely. [00:08:01] There was a hearing that featured three presidents of some of these schools, Dr. Gay, Claudine Gay of Harvard. [00:08:11] I know you got to say it like that. [00:08:12] Sorry. [00:08:14] Dr. Gay of Harvard, Elizabeth Magill of Penn. [00:08:20] Formerly. [00:08:21] Formerly of Penn and Sally Kornbluth of MIT, which is like a real good old down-home name right now. [00:08:27] Sally Kornbluth. [00:08:30] That's, yeah, notably missing some Ivies. [00:08:33] Well, these are the ones with the anti-Semitism problems, I guess. [00:08:37] I don't know. [00:08:37] Yeah, I'm not really sure what the criteria for picking these three was. [00:08:40] They got a bunch of suckers. [00:08:41] That's what it was. 
[00:08:42] They were called up before Congress and then sort of grilled by, you know, whatever, various people, including, I can't, God help me pronounce this woman's last name, but Elise Stefanik, New York Republican Congresswoman, who I got to tell you, if you actually watch the video of her like grilling these people about anti-Semitism on campus, I have no idea how all three of these women that she questioned about this were so ill-prepared. [00:09:09] Well, it's funny because immediately when you said they were called up by Congress, they were asked to come in, I'm like, first of all, TrueAnon rule, when Congress calls you to testify, you don't have to go. [00:09:19] You don't have to go. [00:09:20] It's not legal. [00:09:20] It's not legal. [00:09:21] Subpoena me, bitch. [00:09:23] You can't. [00:09:23] Because this ain't a judge. [00:09:24] You're not a judge. [00:09:25] I'm going to tell you this: Dr. Gay should have said, show me the warrant, motherfucker. [00:09:28] Yeah. [00:09:29] Now, and the second thing is sort of similar to that, which is that these people were treating it as if it were a legal hearing and not a political show trial. [00:09:38] Yes. [00:09:39] And so they were like relying, clearly, on bad legal advice that again was approaching this as if it were a legal hearing, to kind of lean on free speech and the Constitution, the old Constitution of the United States, and how that was sort of the guiding light for these policies when congressional leaders were more interested in the sort of like, when did you stop beating your wife style? [00:10:09] One of the all-time great questions to ask somebody. [00:10:12] That and define wokeness. [00:10:14] Yeah, what does woke mean to you? [00:10:17] But yeah, and they believe me. [00:10:18] But what is a woman? [00:10:19] These, these chicks. [00:10:20] Oh, that's a classic. [00:10:22] These chicks all failed the have you stopped beating your wife test. [00:10:27] I mean, my God. [00:10:29] And since then, Magill has resigned. [00:10:31] I think Dr. Gay is under a lot of pressure to resign, but they're not releasing her. [00:10:38] They're not releasing her. [00:10:40] Whatever. [00:10:41] Firing her, whatever. [00:10:41] Sally Kornbluth, we ain't heard much about her, but who gives a fuck about MIT nerd school anyway? [00:10:47] I want to tell you this. [00:10:51] I have never cared. [00:10:52] I ignore basically any news that comes out about higher education. [00:10:56] I could give less of a fuck about anything that happens at Harvard or Yale or whatever. [00:11:02] It is training the administrators of our misery for the future. [00:11:05] That's basically how I view all these places. [00:11:10] But now I'm like, now that I have to hear about it, I'm saying arrest all the administrators, shut the schools down, take all the endowments, give them to whatever, University of Phoenix, whatever the online college is. [00:11:24] Does that still exist? [00:11:26] I think it has to. [00:11:27] Let's reopen it. [00:11:28] Let's reopen. [00:11:30] If not reopen. [00:11:31] A thousand University of Phoenixes bloomed. [00:11:33] That's what they should do. [00:11:34] They should take the endowment from Harvard and Yale and put those into just a million online universities where you may be learning on Discord and maybe even meet an older man in there. [00:11:43] Who knows? [00:11:44] Anything could happen. 
[00:11:46] But it is so, like, it is such a, it is such a telling, like, not telling, I guess, is the word, but like, I'm sort of astounded at how well this counteroffensive has played out. [00:11:57] And I call it a counteroffensive because, you know, as a caveat, I am not one of those people who's like, oh, oh, like, this, the only reason we're talking about this is to distract from this or whatever. [00:12:06] I think that's usually a pretty lazy way to think. [00:12:09] In this case, I do think a large reason that we're hearing so much about Harvard and Yale and whatever, Penn, Cornell, is because there are people who are very invested in distracting from what's going on in Gaza. [00:12:25] Yeah, or I think even like, as you're saying, like less in kind of like a less of a psyop kind of way, but more of a like, how do we drum up enthusiasm to distract from the horrors that are coming out in media and get them to talk about something else? [00:12:39] Yeah, yeah. [00:12:39] Right? [00:12:40] Yeah. [00:12:40] And it's really effective. [00:12:41] And that really coincides with this sort of like growing movement to abolish DEI. [00:12:48] And again, I cannot, I have, I am not exaggerating here. [00:12:52] Like I don't know what happens on college campuses. [00:12:55] Yeah, I know. [00:12:56] And I don't really want to. [00:12:59] DEI seems like the like liberal like ideology that's like dominant on these campuses. [00:13:05] And there are often like HR has been like deployed into like corporations. [00:13:09] Yeah, yeah, yeah. [00:13:10] Yeah, I know they have like little deployed isn't the right word, but you know, but it is. [00:13:14] It is. [00:13:14] Diffused. [00:13:15] It's a bitter. [00:13:16] It's in all these different fucking companies and higher learning institutions or whatever. [00:13:21] And I think that it's just sort of this dovetails with a lot of people like the Bari Weisses of the world sort of like war on woke really nicely. [00:13:31] And as a result, now I have to fucking know who's in charge of Harvard. [00:13:35] I didn't know that two weeks ago. [00:13:36] And I don't want to know now. [00:13:37] That's an easy name to remember. [00:13:39] Doctor, and believe me. [00:13:40] And it's like one of your pseudonyms. [00:13:42] It's one of my pseudonyms. [00:13:43] An appellation that I have gladly received from many top-billed medical institutions in this motherfucking world. [00:13:48] Including Harvard. [00:13:49] Including Harvard. [00:13:50] That's where I got my, that's right. [00:13:52] That's where I got it. [00:13:52] Sports medicine. [00:13:54] But yeah, I don't really know if I have a lot to say about this, but for some reason, I just wanted to bring it up just because it's really fucking bothering me. [00:14:03] Because A, I hate college. [00:14:06] B, it's just, it's so psycho that this has been elevated to the same deal as like one of the worst civilian massacre campaigns in recent history. [00:14:16] It just, it is, it is astounding that it has gotten this much airtime and attention. 
[00:14:22] Yeah, it also sort of mirrors in a lot of ways like, well, I mean, we were talking about this, that if you read some of like what Israeli officials are saying, even what like Israeli press is reporting on, not even like, you know, like more critical press, trying to think of the degrees there, but like, you know, even just sort of like kind of supportive press, of which, you know, that's pretty much every, you know, whatever, [00:14:48] versus the way the West is covering the war in Gaza. [00:14:54] It mirrors the kind of, I'm not going to, I'm going to use a favorite term of the kind of anti-DEI, anti-woke crowd, which is the Orwellian nature of the kind of double speak that you see where Israeli officials will be like really fucking cut and dry about like what the objectives are and like what the goals are in a way that is like, oh, we want to develop in Gaza in the Gaza Strip. [00:15:21] We will know we're successful when there are Israeli apartment buildings on the Gaza Strip. [00:15:25] Yeah, and all the Palestinians have been kicked out of Gaza. [00:15:30] Right. [00:15:31] And then you hear the kind of the way that the West reports on it, whether it's from the New York Times or The Guardian or even like an obviously official state mouthpiece. [00:15:39] And that, by the way, the boundary between those two things is very fucking blurry. [00:15:43] Very clear how blurry it is, especially in moments right now. [00:15:46] But like, they'd be like, well, you know, it's hard to say. [00:15:51] They're trying to, you know, they're trying to do this one thing. [00:15:54] Everyone's kind of saving, they're trying to save some populations and go after Hamas in this way. [00:15:58] And so they're going. [00:15:59] And really, they do such a good job of obfuscating the very like clear, very like clear stakes and clear and clear actions that are happening. [00:16:11] And in the same way, like what's happening with this whole fucking thing about the campuses is just as fucking obtuse as anything else that's coming out of the Western press about this. [00:16:21] Well, it's so strange that like, you know, it's, it's, you see this, like this double speak kind of shit when they're talking about how like, well, Intifada is actually a coded call for genocide. [00:16:33] And, you know, from the river to the sea is an explicit call for genocide. [00:16:38] And so if you support Palestine, like having a Palestinian flag or supporting Palestine in any way is like, is your, you're a supporter of genocide against the Jewish people. [00:16:46] Not only that, though, that speech is also doing violence. [00:16:50] It's also doing violence, which isn't doing violence and it's making people unsafe. [00:16:55] Just mirroring again the same language that these types of people would get very angry at and accuse, like, I don't know, the cartoon blue-haired leftists of saying about the male canon in the humanities or whatever. [00:17:12] I mean, this is the classic, you know, fight over in the college campuses, right? [00:17:16] Yeah, yeah, and it's actually kind of a brilliant retort. [00:17:21] And not brilliant, actually, because it's pretty obvious, but it's a pretty effective, I think, cudgel to use is to essentially like use the same methods and the same language as a reaction to kind of kill two birds with one stone. [00:17:36] Yeah. [00:17:37] Um, it's just it's something that just boggles my mind. 
[00:17:41] And you know, we've said on the show before, and I obviously personally believe that like, you know, there's only so much runway you can get out of like pointing out hypocrisy or whatever. [00:17:49] Like it's usually a fool's errand because people often know they're being hypocritical. [00:17:53] And people don't really care if someone's being hypocritical if they're on their side. [00:17:58] But it's the fact that like, you know, if you support Palestine, that means that like, okay, like you, so you are in favor of the worst possible atrocity that could potentially occur or that has occurred in the past. [00:18:11] You know, you're in favor of explicitly in favor of every suicide bombing that's ever occurred in the past, every, you know, every murder of a civilian or whatever. [00:18:20] And then on the flip side, like, you can, there is no limit to the support that you can have for Israel. [00:18:25] Absolutely no limit. [00:18:27] You can fundraise for the army that is currently targeting civilians to massacre. [00:18:35] And that is seen as like, that is just, that is just run of the mill, normal, like, oh, it's, actually like a positive position. [00:18:44] That's like a normal thing that you're supposed to do in society. [00:18:46] Right. [00:18:48] And I think it just, it just boggles, boggles my mind. [00:18:51] And it's, this is just one more example. === Tech Ideology Shift (14:49) === [00:18:55] These fucking, these institutions just completely, completely bankrupt. [00:18:59] Well, not financially. [00:19:19] So, Liz, I want to talk about the future. [00:19:21] No. [00:19:21] I don't. [00:19:23] You don't? [00:19:27] I do. [00:19:28] Okay. [00:19:30] Imagine this, right? [00:19:33] It's 2024. [00:19:36] Wait, next year? [00:19:37] Yeah. [00:19:38] So we're in the immediate future. [00:19:39] Oh, no, I can't think of an episode to do on the podcast. [00:19:43] I don't know what I'm going to do. [00:19:45] At this point, we obviously just think using our brain-to-screen connection interfaces that we have implanted. [00:19:52] And we ask our, I'm going to be honest, our woke AI, and we say, what should we talk about next? [00:19:59] And you know what the woke AI says back to us? [00:20:01] What? [00:20:01] It says, there was a piece in the New York Times on Sunday called 'This A.I. Subculture's Motto: Go, Go, Go.' [00:20:10] Do you like that? [00:20:11] I really like that. [00:20:12] I have other computer voices to deploy in this episode. [00:20:15] Do you practice these at home? [00:20:17] I did kind of a little bit. [00:20:18] You did a little bit? [00:20:20] Do you talk in the shower? [00:20:22] You know, I don't audibly. [00:20:27] But I will inaudibly talk to myself. [00:20:30] Okay, that's weird. [00:20:31] I said, I really don't like that. [00:20:32] Yeah, when you're walking by and you just like kind of like pantomime a little bit. [00:20:36] That's how people do it. [00:20:38] I used to always walk in on my dad talking to himself, and it always kind of freaked me out. [00:20:42] Yeah, I talk to myself frequently. [00:20:45] Because oftentimes, if I don't talk during the day and I, and then I talk, like, I'm like in my house until like two, and then I walk out and I'll talk to somebody. [00:20:51] I'll be like, and so. [00:20:53] So you get cotton mouth? [00:20:55] I just like can't put my words to my mouth. [00:20:58] And then I, so I, a lot of times I just talk to myself kind of all day. [00:21:01] Like a way to kind of like wake up to yourself. 
[00:21:03] Yeah, yeah, basically. [00:21:04] I like that. [00:21:04] We are talking about effective accelerationism. [00:21:09] Yeah, okay. [00:21:10] So actually, like Woke Grok said, in the New York Times this past Sunday, there was a piece called 'This A.I. Subculture's Motto: Go, Go, Go.' [00:21:20] And this is a piece that Kevin Roose, which is just a nice name to say. [00:21:25] Kevin Roose wrote about an EAC rave in downtown San Francisco, our alma mater. [00:21:33] I want to get this out of the way here. [00:21:35] I told Liz to say this on Chapo last week. [00:21:37] She did not. [00:21:38] Yeah, do you want to do it? [00:21:39] EAC? [00:21:41] What? [00:21:42] They name it after the noises that women say when they see them. [00:21:46] See them? [00:21:47] The men who are involved in this subculture. [00:21:50] Oh, I thought you were going to say, see me. [00:21:52] See me, me? [00:21:53] No. [00:21:53] Women say that. [00:21:54] It's a self-deprecating joke. [00:21:55] Women go, wow. [00:21:58] That's the movement. [00:21:59] That's your AI movement. [00:22:01] Wow. [00:22:03] All right. [00:22:04] So what happened? [00:22:05] Okay, this is from the New York Times. [00:22:07] Mostly young, mostly male, wow, what a surprise. [00:22:10] Crowd danced to a DJ set by the musician Grimes. [00:22:14] A big banner on the wall read, Accelerate or Die. [00:22:18] Another sign showed a diagram of an AI neural network emblazoned with the motto, come and take it. [00:22:24] Okay. [00:22:25] An AI startup handed out promotional flyers that read, the messenger to the gods is available to you. [00:22:34] It's funny because I had just, you know, I was on Chapo when we were talking about EAC. [00:22:38] And so I figured like, ah, we're not going to like, we had talked a while about, you know, a while ago about doing an episode about these guys because this thing kept kind of coming back up. [00:22:47] It was like in like July, June or July or some shit. [00:22:50] Out of my hatred for Garry Tan. [00:22:52] Yes. [00:22:54] And then this, you know, we weren't going to do it. [00:22:57] And then this piece came out and I read this and I'm just like, just when I think I'm out, you drag me back in. [00:23:05] This is a, this is what he writes: loosely organized movement devoted to the no-holds-barred pursuit of technological progress. [00:23:11] AI and other emerging technologies should be allowed to move as fast as possible with no guardrails or gatekeepers standing in the way of innovation. [00:23:20] They just want to like keep AI weird, man. [00:23:24] Yeah, yeah. [00:23:25] It's funny because I've been seeing these people online for kind of a while. [00:23:29] I mean, I keep tabs on certain political enemies in San Francisco pretty closely. [00:23:37] As you should. [00:23:38] Garry Tan being one of them. [00:23:39] And a while ago, Garry Tan is the führer of Y Combinator. [00:23:44] I'm just so sick of people talking about Y Combinator. [00:23:47] It's very confusing to me. [00:23:48] Oh, yeah. [00:23:49] He's well, he is in charge of it. [00:23:51] He's like, oh, YC is doing this. [00:23:52] I'm like, leave me out of it. [00:23:54] Yeah. [00:23:54] Well, he is the top dog at Y Combinator, which is just. [00:23:59] I'm always asking why. [00:24:00] Why? [00:24:01] Combinator. [00:24:02] It's an incubator, which is one of my least favorite words in the English language. [00:24:06] Since you were a baby. [00:24:07] Incubator. 
[00:24:08] That, in San Francisco, funds a bunch of startups, blah. He is also probably one of the most annoying of these kind of tech gadflies who've glommed on to San Francisco politics. [00:24:21] I would say he is. [00:24:22] Also spells Garry with two R's, but he has been too much. [00:24:26] Too many R's there. [00:24:27] Too many R's. [00:24:28] Too many R's. [00:24:28] But you know what? [00:24:29] He is one of the biggest R's in San Francisco. [00:24:32] So he is like for the past year. [00:24:35] I mean, he is such a fucking cornball, like all these stupid motherfuckers. [00:24:38] But he has like been like, oh, like talking about accelerate. [00:24:42] And he put E slash ACC like in his Twitter profile. [00:24:46] I know what you're thinking. [00:24:48] Is this just like an episode about guys on Twitter? [00:24:50] And I want to be clear here. [00:24:52] A lot of these guys are highly, highly, highly online, but this is a real gestalt, I guess, or like a real milieu. [00:25:02] I don't know if you could call it like an actual movement, but it's moving towards something. [00:25:07] And I think it, I mean, we might have different views on this. [00:25:10] I think it presages something that's probably going to eventually come down the pipeline that we are not going to like. [00:25:15] Yeah, I mean, no, I don't think we disagree about that. [00:25:18] I mean, I said on Chapo, like, I am kind of a doomer about this stuff. [00:25:22] Yeah. [00:25:23] Or in this sort of about AI and some of this, kind of where this technology is going. [00:25:30] But I do agree that this isn't an episode about guys on Twitter, though these people are on Twitter. [00:25:35] Yes. [00:25:36] But it is kind of about a weird, you're trying to pick apart some of this moment to kind of figure out where it came from and kind of what it's doing, if that makes sense? [00:25:46] Yeah, and I think that something interesting is happening here is that we are seeing like the development of the tech kind of idea or a tech ideology, I guess. [00:25:57] I mean, there's been sort of, I would say, like the leading kind of like ideologist. [00:26:02] Is that how you pronounce that? [00:26:03] Sure. [00:26:04] Ideology. [00:26:05] Ideologist? [00:26:07] Ideologist. [00:26:08] Ideologist. [00:26:09] Ideologue is a little different. [00:26:10] Yeah, I feel like that has some different connotations. [00:26:12] But like, you know, you have ideas guy. [00:26:14] Ideas guy. [00:26:15] Yeah. [00:26:15] You have like someone like Peter Thiel or whatever, who has like an actual, you know, his plan or whatever. [00:26:20] But this is, to me, I think a step further into the future than even that, although it draws from many of the same sources and is kind of connected to that in many ways. [00:26:30] But I think it's worth noting for those of us who are not part of the Silicon Valley set, what a lot of these people are thinking and like what ideas are animating large parts of the space. [00:26:42] Some of them obviously, you know, just online losers, and then some who might be somewhat influential. [00:26:48] Yeah. [00:26:49] So there was this, you know, last week or two weeks ago, one of the big posters and kind of like spokesman for this EAC movement, there was an account that went by. [00:27:04] I always like fuck up the name because I get confused in my brain, but it's Beff Jezos. [00:27:10] I think it's Based Beff Jezos. [00:27:13] Yeah, but I always want to say Beef Gezos. [00:27:16] That's kind of how it goes in my head, too. 
[00:27:18] And I will say also, just, you know, the rules. [00:27:21] If you're using the word based, I got to say, you suck, dick. [00:27:25] It's, it's, I hate it. [00:27:27] I hate it. [00:27:27] I think you're stupid. [00:27:28] I think you're corny. [00:27:29] Yeah, totally. [00:27:30] Well, you are. [00:27:30] You are. [00:27:31] Not you. [00:27:32] But Based Beff Jezos, I can't remember. [00:27:34] I'm just going to call him Beef. [00:27:36] Beef. [00:27:36] Yeah, okay. [00:27:37] We'll call him Beef. [00:27:37] Or Beef Jezos. [00:27:38] But he was unmasked in the pages of Forbes magazine. [00:27:42] Yeah, online magazine. [00:27:43] Online magazine. [00:27:45] Whatever. [00:27:45] But he had a, I mean, they all got their little substacks. [00:27:50] And he had a post on EAC principles and tenets. [00:27:54] And this is what he says about it, just to kind of like get us into this. [00:27:57] He says, Effective accelerationism aims to follow the quote will of the universe, leaning into the thermodynamic bias towards futures with greater and smarter civilizations that are more effective at finding slash extracting free energy from the universe and converting it to utility at grander and grander scales. [00:28:16] Then he goes on much like way lower down in the post because the thing is really long. [00:28:22] Effective accelerationism, EAC, in a nutshell: stop fighting the thermodynamic will of the universe. [00:28:27] You cannot stop the acceleration. [00:28:29] You might as well embrace it and then accelerate. [00:28:33] And it has like a space between each letter. [00:28:35] So it's like, accelerate. [00:28:37] So this guy, Beeve, Based Beff Jezos. [00:28:40] Beef Jezos. [00:28:41] It was an anonymous account that was kind of influential in tech parts of tech Twitter. [00:28:50] There was a few others that are sort of like his lieutenants or co-imagineers. [00:28:56] Group chat buddies. [00:28:57] That's really quite a bit. [00:28:58] Imagineers is so good. [00:29:00] That's what they are. [00:29:01] They're Imagineers. [00:29:02] That denigrates The Imagineer. [00:29:04] Yes. [00:29:04] Which is really more of an artist. [00:29:06] I don't know if I love The Imagineer to begin with. [00:29:08] I understand that, but it's a small world. [00:29:10] It's very nice. [00:29:11] But it's also fascist. [00:29:13] It's a big world. [00:29:14] But also, you know, it's a great respite because of the air conditioning. [00:29:18] That's true. [00:29:19] It's also that and being scared by the monster on the Matterhorn are the only parts of my very young man's trip to Disneyland that I remember. [00:29:27] But I was really scared by the Matterhorn. [00:29:28] In fact, The Matterhorn is scary. [00:29:31] There's a Yeti in it, and that put me off a roller coaster. [00:29:32] Well, it depends on which side of the Matterhorn you're on. [00:29:34] That put me off of roller coasters forever. [00:29:36] As it should. [00:29:37] Yeah. [00:29:37] We are TrueAnon, anti-thrill-ride podcast. [00:29:39] I fucking hate roller coasters. [00:29:41] We don't do thrills. [00:29:42] Really, it was a lot of those people getting stuck upside down on the thingies. [00:29:45] But Matterhorn really isn't. [00:29:47] I know. [00:29:47] But that's, I understand that. [00:29:49] But I was very young. [00:29:50] Anyways, these are all like anonymous accounts that sort of like boost each other up and were kind of like the center of this. [00:29:57] Yeah, boosters. [00:29:58] Boosters. [00:29:58] They're boosters. 
[00:30:00] Based Beff, Beef Jezos was unmasked in Forbes as a former Google engineer, classic, that is now working on a very vaguely defined AI-related project called Extropic AI. [00:30:17] That is just like, I think it's got, you know, several million dollars in funding. [00:30:22] Who doesn't? [00:30:22] But who the fuck doesn't in this world? [00:30:24] Exactly. [00:30:24] $15 million in tech is fucking pennies. [00:30:28] It's nothing. [00:30:28] Brandon's economy? [00:30:29] There's literally. [00:30:31] I mean, this isn't a few years ago, but still. [00:30:33] And so this came out of the, and you, this is, of course, how could it come out of anything different? [00:30:38] The prevalence of Twitter spaces from 2021. [00:30:42] There was that moment in time when sort of the pandemic was like people were kind of starting to go out again. [00:30:49] It was, you know, sort of fading, but there was a lot of people who had gotten really acclimated to spending like 24-7 online. [00:30:56] And so they were, you know, it was Clubhouse, R.I.P. Actually, not R.I.P. I forgot about that. [00:31:04] R.I.P., you know, forever, peace to you. [00:31:08] But there was that, and there was Twitter Spaces, which I feel like kind of destroyed Clubhouse. [00:31:14] And so these guys would be in there and sort of like boosting each other up, like talking, you know, everyone kind of like getting like, we're going to make the fucking greatest tech ever fucking international. [00:31:24] They were also talking about crypto. [00:31:26] Crypto, yeah. [00:31:27] And NFTs. [00:31:28] I mean, that was still kind of going on. [00:31:29] That hustle. [00:31:30] For sure, for sure. [00:31:31] I've been in NFTs since 2020. [00:31:34] Spaces was a great way to get people to hear about your new crypto. [00:31:38] Your bullshit. [00:31:40] Yeah. [00:31:40] But much like, I'm trying to think of a disease that this fits. [00:31:48] What disease killed a lot of rich people? [00:31:52] I'm like, everyone gets breast cancer. [00:31:56] Much like, oh, gout. [00:31:59] Gout is called the disease of kings. [00:32:01] But let's pretend it really is the disease of kings. [00:32:03] Not anymore. [00:32:04] Not anymore. [00:32:05] No. [00:32:05] When it was, it really was. [00:32:08] But much like when, much like gout 600 years ago, this actually caught on in some of the upper echelons, including one of the ugliest men to ever be hatched out of a vile little egg. [00:32:23] And hatched he was. [00:32:24] Mark Andreessen. [00:32:28] Now, is that how you say it? [00:32:30] Andreessen? [00:32:32] I can't do it between Andresene. [00:32:35] Andreen? [00:32:36] And Andreessen. [00:32:37] Andreessen? [00:32:38] I don't really care. [00:32:38] Andreessen sounds better. [00:32:41] Andreessen sounds better, but yeah. [00:32:43] He is very, he looks like Humpty Dumpty. [00:32:46] Right? [00:32:47] What was that? [00:32:47] Wasn't there like, I feel like there was an SNL. [00:32:49] Just his head. [00:32:50] He's edgy head. [00:32:51] What was the fucking cone heads? [00:32:52] You feel like he does have cone heads. [00:32:54] He's ugly. [00:32:55] I'm going to say it. [00:32:56] You know, I'm not, I'm ugly, right? [00:32:57] But I'm ugly in a normal way that like can still exist in society. [00:33:02] You know, like I can go to the store and nobody looks at me. [00:33:05] If he walked into a store I was working in, I would pull out, I would do something totally non-action. [00:33:13] I would pull out. [00:33:14] Put him back in the carton. 
[00:33:16] Exactly. [00:33:16] Yeah. [00:33:16] I'd put him back in the goddamn carton. [00:33:18] I'd be like, hey, little guy, you're not supposed to be out of there. [00:33:21] I'd call damn animal control. [00:33:23] I'd say this, we got a hideous mutant here. [00:33:25] Come put his ass down. [00:33:27] But I will say, you see photos of him and I think his head is photoshopped because it looks like it's been kind of extended. [00:33:34] He does always look like somebody's fucking with him. [00:33:36] But he is one of the big wigs at the very prominent VC firm Andreessen Horowitz. [00:33:43] Yeah, A16Z. === Reads Like ChatGPT (10:18) === [00:33:44] A16Z, which I don't like. [00:33:46] Which also is like, I feel like it's just too close to A24 for everyone's liking. [00:33:51] Yeah, well, probably not, they probably like that. [00:33:54] But he has kind of glommed on to the EAC movement. [00:33:58] Yeah, I would say he globbed onto it. [00:33:59] He published this manifesto online back in October called the Techno-Optimist Manifesto, which is absurdly long while also saying nothing, which is my favorite combination. [00:34:13] I'll say this. [00:34:14] I read it when it first came out. [00:34:16] Which is sort of being mocked. [00:34:18] I read it. [00:34:18] I read the first printing. [00:34:20] I hate all these guys. [00:34:22] And so naturally, I consume the, well, I can't say I consume the products they put out because I'm not really on like NFT marketplaces. [00:34:30] Sure. [00:34:30] But I consume the writing they put out sometimes to make sure that I'm completely right, which I always am very gratified to find out that I am. [00:34:39] And I read it then, and then I read it once again for this episode. [00:34:43] And I got to tell you, I retained absolutely nothing from my first reading to my second. [00:34:50] Because it's so devoid of any actual content or ideas. [00:34:54] And has the amazing ability to be, you know, what is it? [00:34:58] Like a, like, give the appearance of deepness while actually being possibly one of the most shallow things ever written. [00:35:06] This is, yeah, shallow is a great [00:35:08] descriptor. [00:35:09] This is a quote from it to give a sense. [00:35:11] We believe technology is universalist. [00:35:13] Technology doesn't care about your ethnicity, race, religion, national origin, gender, sexuality, political views, height, weight, hair, or lack thereof. [00:35:23] Okay. [00:35:23] Okay, that sounds like a personal thing. [00:35:25] Technology is built by a virtual United Nations of talent from all over the world. [00:35:31] Anyone with a positive attitude and a cheap laptop can contribute. [00:35:34] Technology is the ultimate open society. [00:35:38] It's just like, we were talking about it over text, and we were saying that there's a way that these tech guys talk that I find so grating that is this like weird, it's like very saccharine and simple at the same time, where he's like, it's just always infantilizing. [00:36:00] Yeah. [00:36:00] Where they're just like, technology is, when you think about it, technology is just like the thing that makes the world go. [00:36:07] And shut the fuck up. [00:36:08] What are you talking about? [00:36:09] It doesn't, like, technology doesn't care about your ethnicity. [00:36:12] It's such an, it totally doesn't care about your feelings. [00:36:15] Yeah, that's true. [00:36:16] Facts. [00:36:17] It is such a, such an empty statement. [00:36:20] And the whole thing is like this. 
[00:36:22] And one thing that we were talking about is that like ChatGPT, you know how like the writing from it always seems inhuman? [00:36:29] Like it writes in a, what I think is like a distinct style of like no style. [00:36:35] And I've always been like, where is this like drawing from? [00:36:39] Right. [00:36:39] And I got to be honest, this reads like ChatGPT. [00:36:42] I believe that he wrote this, but this reads like ChatGPT. [00:36:46] He might have. [00:36:46] I mean, it really doesn't say a lot. [00:36:49] I mean, it really is just sort of like technology in the abstract. [00:36:52] Again, setting aside what he means by technology, which is really unclear. [00:36:56] He's like, technology is good. [00:36:58] It's important. [00:36:59] It's the only thing that matters. [00:37:01] And all these bad people have tried to convince us that technology is bad. [00:37:06] But we're saying technology is good and that we should just like let technology flourish. [00:37:13] Yes. [00:37:13] Yeah. [00:37:15] That society should orient itself around the acceleration of technology. [00:37:21] Right. [00:37:22] Yeah. [00:37:22] More and more and more and more. [00:37:23] Yeah. [00:37:23] I would say the most robust part of the manifesto, if you can call it manifesto, is a section about markets. [00:37:29] Because you can tell that's like where he actually has like maybe read some stuff. [00:37:33] He references Nietzsche, I think, a couple of times, and it's very clear that he has not. [00:37:38] It's like, it reads like Wikipedia kind of like understanding each other via BAP or something like that. [00:37:45] I would say not even that, but yeah, it's like that kind of level. [00:37:51] But when he gets to the market stuff, you can tell that this is like, okay, he's the free market capitalist. [00:37:56] He kind of understands more of that lingo. [00:37:59] But he shoehorns it into this kind of like tech-guy accelerationist, quasi-accelerationist speak, which we'll talk about. [00:38:07] And he says, combine technology and markets and you get what Nick Land has termed the techno-capital machine, the engine of perpetual material creation, growth, and abundance. [00:38:18] Okay, I'm going to set that aside for one second. [00:38:20] And then he says, we believe in accelerationism, the conscious and deliberate propulsion of technological development, to ensure the fulfillment of the Law of Accelerating Returns, to ensure the techno-capital upward spiral continues forever. [00:38:34] We believe the techno-capital machine is not anti-human. [00:38:37] In fact, it may be the most pro-human thing there is. [00:38:43] It serves us. [00:38:44] The techno-capital machine works for us. [00:38:47] All the machines, his emphasis, all the machines work for us. [00:38:52] Okay. [00:38:54] So it's funny because we're reading this and like there was another quote in that New York Times piece from over the weekend that also mentions Nick Land as a kind of guiding philosopher, I guess, of these EAC people. [00:39:12] The movement, this is from the New York Times, the movement also borrows from the works of British philosopher Nick Land, who wrote years ago that the accelerating forces of capitalism and AI would ultimately collide in a quote techno-capital singularity, a point at which technology would outstrip our ability to contain it. [00:39:28] And then there's a parentheses. [00:39:29] More recently, Mr. Land has fallen out of favor for endorsing far-right ideas about race and authoritarianism. 
[00:39:35] Although, question about the recently, what they mean by recently. [00:39:38] I was going to say, it's been kind of a while. [00:39:40] Yeah, it's been quite some time. [00:39:42] But this name keeps popping up, Nick Land. [00:39:47] And I don't know, I don't think we've ever really talked about him specifically. [00:39:53] I think we kind of did a little bit when we've talked about Dark Enlightenment and like Moldbug and those guys in the context of Thiel, but not like specifically. [00:40:05] And it's funny, like I was reading all this stuff and I like, I was not an accelerationist, which we can talk about what that actually is. [00:40:13] These guys certainly aren't. [00:40:15] But I was like very much a lurker of all of those kinds of conversations that were happening, starting in like when I first kind of like was becoming like a political kid. [00:40:32] Like, you know, I like dropped out of college and like was kind of like, then I found the blogs of like K-Punk and Blissblog and all, where all these kind of conversations were happening, through to like the financial crisis. [00:40:45] Yeah. [00:40:46] And so like seeing this, this fucking Nick Land's name be dropped by one of the biggest, like most influential power players in San Francisco, who's like, you know, fucking billionaire Netscape, computer guy, [00:41:01] and like in the New York Times, but also being called an optimist who like wants to just let the gentle AI machines kind of do their thing and then become slaves to us so we can like envision our pro-human utopia is like fucking psycho. [00:41:23] Yeah, you know, it's, it's, I, I have never been like a big Nick Land reader, I guess you could say. [00:41:29] Like I've read stuff, um, not kind of, well, until we started preparing for this episode, not for a while. [00:41:35] You know, it's, it's always, it's like, he's one of those things where it's like, having done a lot of meth, like I implicitly like kind of understand, like, this dude's just tweaking a little bit because you can have, you can really unlock new places in your brain with enough speed. [00:41:52] You can become like a real, it is a nootropic in a way that nothing else is. [00:41:59] I mean, when I was, when I was on meth, I was thinking things, I'm telling you that, like, I, the thoughts that I can no longer unlock, I'll say it like that. [00:42:06] And I'm not even joking. [00:42:07] Like, it is really like, it is, it can take you to strange places. [00:42:12] I will say I've always been impressed by Nick Land because he is probably the sole person who in a brazen reversal of how things usually go actually became more racist after he stopped doing methamphetamine. [00:42:32] Whereas the, whereas usually the opposite occurs because also one of my things is if you do enough meth, you eventually become a Nazi, no matter what your original race was. [00:42:40] Well, you definitely proved that. [00:42:42] Yeah, yes, yeah. [00:42:43] So rap to me. [00:42:44] Who is this fucker? [00:42:45] Well, I mean, so Nick Land was a professor. [00:42:50] He was a philosopher. [00:42:53] I think you can call him probably the most controversial philosopher of the past 30 years, I would say. [00:43:02] Certainly like in Western philosophy. [00:43:06] He was an academic at the University of Warwick for a period in the 90s. 
[00:43:11] And he basically, in an attempt to kind of grasp the outer reaches of human knowledge, let's say, drove himself completely insane and turned into a monster. [00:43:24] And that would be like, that's almost making it sound cool, but it is like almost like a Lovecraftian story. [00:43:33] Well, that's one of the words that you always sort of hear associated with him. [00:43:36] And I remember from when I was kind of perusing, because I've just never really been into like, you know, like I had like that, you know, I read Mark Fisher when I was younger and stuff. [00:43:46] And like, it was never that impactful for me. [00:43:49] I know it is for a lot of people, but I was just like, all right, you know. [00:43:53] But, but I do like, I remember like definitely the talk around Land and then the little stuff that I did read was read like more like science fiction than philosophy. === Clones and Utopias (08:35) === [00:44:03] Yeah. [00:44:03] Yeah, yeah, yeah. [00:44:05] He, I mean, a lot of his writing from his time at the CCRU is like more like science fiction. [00:44:13] Yeah. [00:44:14] He was, when I, you know, the CCRU, so he was a foundational member of this thing that I guess you would call like a kind of para-academic collective called the CCRU at the University of Warwick, which is the Cybernetic Culture Research Unit. [00:44:28] That's what it stood for. [00:44:30] And this is kind of like part of a British avant-garde that, I mean, it included Mark Fisher, other people that are kind of very, you know, on the fringes, Sadie Plant, Robin Mackay, these people. [00:44:41] But Land himself and the CCRU were like heavily influenced by Deleuze and Guattari, Bataille, Nietzsche. [00:44:52] And part of that, like, you say that now, right? [00:44:56] We're like, you know, we started by talking about kind of Ivy Leagues, right? [00:44:59] You say that now, and you're like, motherfucker, that's every asshole getting an MFA. [00:45:03] That's every asshole in grad school loves like Deleuze. [00:45:07] To the point that it's not even something you talk about anymore, right? [00:45:10] If someone started talking to me about that, I would try to struggle to think what I would do. [00:45:16] I guess I would call the police. [00:45:17] Well, babe, you're about to call the police because we're going to talk about Deleuze. [00:45:21] You're going to talk about Deleuze. [00:45:23] But at the time in the UK, this stuff was like really radical. [00:45:30] And I say that because of two things. [00:45:31] One is that, so we're in the early 90s, right? [00:45:34] And this is at a moment when, you know, Thatcher's out of office. [00:45:39] The Soviet Union just fell. [00:45:41] There is this weird, you know, it's the beginning of the so-called end of history. [00:45:47] When the 90s were this kind of like utopian decade, you know, the beginning of it is the end of history, and then it's kind of, you know, the end is Y2K, right? [00:45:57] It's this very weird decade. [00:45:59] And in the UK, it's also, you know, it's the time of Section 28, which is the ruling that made any kind of promotion of, you know, in culture, of homosexuality illegal. [00:46:10] It was this very kind of weird time. [00:46:12] So there's all these like illegal raves happening. [00:46:15] It's a very like weird kind of utopian time. [00:46:19] And the early internet is happening also as well. 
[00:46:23] At the same time, to kind of understand with the Deleuze stuff is that, like, and you know, I don't want to like go into too much depth about this because it's kind of a tangent, but there's a split in philosophy between analytic and continental philosophy, which is to say that, like, in the Anglophone world, there was a rejection of like what you would call like weird French philosophers to the point that like a lot of people weren't even translated. [00:46:46] So, like, now you'll hear about some asshole in comp lit at Columbia in Deleuze studies or whatever, or in Derrida studies or whatever. [00:46:56] But that just did not exist in the 80s and 90s. [00:47:00] And it was seen at universities in England, in America, as just like total nonsense, like absolute nonsense. [00:47:09] To the point that these students at the CCRU, I mean, they're literally like they're translating this stuff as they're working through it. [00:47:17] And so it really was this kind of like radical avant-garde, weird philosophical, like little collective. [00:47:28] But it sounds weird saying that now, considering how, you know, how kind of canonical even someone, even people like Deleuze and Guattari are, let alone Foucault and Derrida, which were just completely, you just didn't, you didn't read that stuff. [00:47:41] Okay. [00:47:43] So Land, when he's at Warwick, he publishes a book on Bataille and then a bunch of essays. [00:47:49] They're compiled in this thing called Fanged Noumena. [00:47:52] And I want to give a sense of his writing, this like sci-fi thing that you're talking about. [00:47:58] But I think one of his like big kind of lasting contributions is really like the style of his writing. [00:48:05] And it's funny because we were looking at, I sent you guys this, like, I don't know, weird brand deck or whatever. [00:48:13] And the writing of that. [00:48:15] Very reminiscent. [00:48:16] Very reminiscent. [00:48:17] And this was written in, I don't know when it was, like 93, 90. [00:48:19] I've actually, I have read this before. [00:48:21] Yeah, I'm sure. [00:48:21] Yeah, yeah, yeah. [00:48:22] And it is sci-fi. [00:48:23] It is. [00:48:23] It lives. [00:48:24] Yeah. [00:48:25] So it goes, the story goes like this. [00:48:26] Earth is captured by a techno-capital singularity as Renaissance rationalization and oceanic navigation lock into commoditization takeoff. [00:48:34] Logistically accelerating techno-economic interactivity crumbles social order in auto-sophisticating machine runaway. [00:48:42] As markets learn to manufacture intelligence, politics modernizes, upgrades paranoia, and tries to get a grip. [00:48:48] The body count climbs through a series of globewars. [00:48:51] Emergent planetary commercium trashes the Holy Roman Empire, the Napoleonic continental system, the Second and Third Reich, and the Soviet International, cranking up world disorder through compressed phases. [00:49:03] Deregulation and the state arms race each other into cyberspace. [00:49:06] By the time soft engineering slithers out of its box into yours, human security is lurching into crisis. [00:49:12] Cloning, lateral genodata transfer, transversal replication and cyberotics flood in amongst a relapse onto bacterial sex. [00:49:20] Neo-China arrives from the future. [00:49:22] Hypersynthetic drugs click into digital voodoo. [00:49:25] Retro-disease. [00:49:26] Nanospasm. [00:49:29] It has a rhythm to it. [00:49:30] It has a rhythm to it. [00:49:31] You know, it's good. 
[00:49:33] It's like William Gibson or something like that. [00:49:35] Yeah. [00:49:35] You know what I mean? [00:49:36] I will say cloning. [00:49:37] That's not something you hear much these days. [00:49:39] Cloning? [00:49:39] Yeah. [00:49:39] What are we talking about the fucking president of fucking Argentina? [00:49:42] Oh, yeah. [00:49:43] Clone dogs. [00:49:44] But yeah, you don't hear much. [00:49:46] It tells you about that. [00:49:46] Cloning was big, Dolly. [00:49:48] You know what, though? [00:49:49] There are, there are, you may not hear about it, but there are hella clones out there these days. [00:49:54] Yeah, one's president. [00:49:55] Not genetic, but swag clones. [00:49:58] So I have always sort of like, in trying to describe his philosophy as like what it is, I always kind of am like, it is a like cybernetic libidinal complexity pulp materialism that has a kind of like dash of Lovecraftian race science. [00:50:19] And like with a like North Star of like total oblivion. [00:50:24] The idea that any of it is utopian in any way, like whatever Andreessen Horowitz is saying, is like absolutely insane. [00:50:33] Like it is, like when I say oblivion, I mean like total oblivion. [00:50:37] Like he is trying to, he believes that like man is something to be overcome. [00:50:43] Yeah. [00:50:44] And one of the ways that's going to happen is that like AI, I mean, that's, that's always one of his, I think probably most famous ideas, is that like an AI from the future could have come back and is like assembling itself. [00:50:55] Yeah, yeah, yeah. [00:50:56] And it was like, how is then this thing going to assemble itself in order to like deliver us an anti-humanist or ahumanist future that, you know, can be engineered by this AI auto-materialist auto-production. [00:51:13] Well, that's the thing that I really is so striking to me about Mark Andreessen's like, you know, little fucking Christchurch manifesto is that, which I believe also name-checks Nick Land. [00:51:27] But it is that like, Nick Land's whole thing is that like he doesn't seem to like like humanity very much. [00:51:34] Like it's like he, it's like very pessimistic and like cold. [00:51:41] They call it cold. [00:51:42] Yeah. [00:51:43] Anti-human. [00:51:44] And like that's, that's what I don't understand. [00:51:46] All these people say like that they're directly drawing from his work and like, you know, sort of making his ideology real. [00:51:53] But like they gussy it up in the exact opposite. [00:51:57] I mean, even Andreessen's thing, like, we're going to be the masters of machines and like they're going to help humanity is completely at odds with what Land seems to say here. [00:52:06] Yeah, totally. [00:52:06] And I think that like, I mean, even, you know, there's a part in the Andreessen manifesto where he says something like, you know, oh, and AI is going to help us solve the issues with, you know, how to like mine rare earth stuff better and in like a way, like in some kind of like, you know, good and equitable way for the world or the planet or some shit like that. [00:52:29] And it's like, well, I mean, Land would say, well, yeah, we would just do it by like killing tens of thousands of people like we, like any other capitalist way. === Grapple With Reality (06:37) === [00:52:38] Like, what are you talking about? [00:52:39] Like, it's so, it's totally, totally absurd. 
[00:52:44] And I mean, like, I think, you know, in addition to the kind of idea that, you know, capitalist, capitalism is this sort of like AI from the future, like the other really famous thing about Land is that he went insane, like you said. [00:52:57] Yeah, he lost his shit. [00:52:58] Yeah, he was doing like a, I mean, everyone at the CCRU, they were doing all this like fucking crazy ass drugs. [00:53:03] They were listening to jungle. [00:53:04] He couldn't rock with the tweak. [00:53:06] He was, well, it was also like his tweaking was a praxis, which is really, which is different, right? [00:53:14] Like, so he, he was kind of driven to do this because, see if I can do this. [00:53:23] So he, one of his big concepts is like called the outside, right? [00:53:26] You see it in his Twitter handle, Outsideness. [00:53:28] And the name, I think it's like the name of his blog. [00:53:30] I haven't read his stuff in a very long time, except for what we repeated. [00:53:34] I feel like he's like a really low-rent, like, just like right-wing guy now. [00:53:42] Yeah, he's definitely just kind of like the Dark Enlightenment stuff, like all that kind of like angry racist uncle post stuff. [00:53:50] Yeah, he's just like a racist guy. [00:53:51] Yeah. [00:53:53] But his academic stuff is, I mean, to say that he is like not, wasn't a thinker or a philosopher is like just not. [00:54:00] I mean, some of the stuff he's wrestling with was pretty crazy stuff. [00:54:04] Yeah, yeah. [00:54:06] So this concept for him, the outside, like what he meant by this was, so there's this Kantian term called the noumenal, right? [00:54:16] Which means, this is the idea that there are things that exist independent of us that cannot be known. [00:54:23] This is like the thing in itself that exists without us that you can have no understanding of, right? [00:54:30] We cannot be, there's stuff in the world independent of our ability to sense it. [00:54:36] What we can sense, you would call like the phenomenal, right? [00:54:40] Those are the two Kantian terms, the noumenal and the phenomenal. [00:54:43] The phenomenal is what we can grasp through all of our senses. [00:54:45] It's what we can, it's how we experience the world, right? [00:54:50] It's, you know, what appears to us, what we can see, what we can feel, what we can taste, how we, you know, how we experience reality. [00:54:58] Yeah. [00:54:59] And so for Nick Land, he believed that Western philosophy worked from the inside out, okay? [00:55:07] So meaning that there were these knowable concepts that can get clarified through our understanding, through our kind of phenomenal understanding. [00:55:16] We can validate them, we can test them, we can do all of these things. [00:55:20] And that we project those concepts onto the outside as if they exist independently from us. [00:55:29] And he would say that that basically treats our understanding as absolute rather than how things actually are. [00:55:35] Yeah. [00:55:36] And what that meant for Land was that there was no possible way to actually discover phenomenal nature, but that we actually construct it. [00:55:45] And so we were never actually understanding the world, the universe, anything as it actually was. [00:55:52] And he basically was saying that, you know, there's so much more to the universe than is apparent to the subject. [00:55:59] And we are only conditioned to experience it in such a way that we can never, you know, but we can never really grasp what it is.
[00:56:08] How do we actually grasp what it is? [00:56:09] How do we get to that outside? [00:56:11] Right. [00:56:12] And basically he said, well, how you do it is you do it through drugs. [00:56:17] Well, yeah. [00:56:19] Yeah. [00:56:20] And I mean, I think that like hearing that, right, is it sort of sounds a lot like the experimental like hippie stuff, which is sort of like don't get weird. [00:56:29] Well, that's what like you hear so much from acid evangelists from like the 1960s and 70s that like it expands your mind and like lets you grapple, grok even concepts that like you think you wouldn't have. [00:56:42] Well, unfortunately, those concepts were woke, but you know, they didn't know at the time. [00:56:47] But you know, it lets you grapple with concepts that would be unknowable or even inconceivable to somebody who had previously not experimented. [00:56:55] And it's funny because I can fully, I have a lot, like meth to me is such a, or like speed in general, meth is just like the most widely available extreme version of it, although there are other sort of rarer versions of it as well. [00:57:14] But it really is like a semi-psychedelic experience. [00:57:17] You know, if it's, you know, a lot of that also has to do with, you know, you often stay up, you know, without any sleep for a long time, which does really crazy things to your brain. [00:57:26] But it's funny because like I can really understand how you would think that. [00:57:30] Like if you use speed essentially as a psychedelic, I think you could open up new vistas or begin to sort of grapple with new vistas in ways that you previously wouldn't be able to. [00:57:45] But I think one of the funnier things about that is like I was saying earlier, I thought all these crazy things when I was tweaked out. [00:57:52] I can't really think of them now. [00:57:54] It's like in the same way that a lot of people talk about acid, like, oh, you know, like it opens this door for you, but like it's very difficult to walk through that door. [00:58:00] And like, I'm not really sure that you can walk through that door. [00:58:03] I think the door itself is kind of an illusion. [00:58:05] And so the same thing could almost certainly be said for the psychedelic experience of taking too much amphetamine and not going to sleep for a long time, is that it presents you with many doors. [00:58:18] And it also allows you to imagine what might be behind the doors. [00:58:22] But I don't think it's actually possible to open the doors, or if it is, beyond there lies madness. [00:58:29] For him, it definitely did. [00:58:31] Because, I mean, I think he was committed. [00:58:33] And he wrote a story about his total, I would say, mental dissolving that's pretty horrifying. [00:58:45] Yeah. [00:58:47] I will say that for Land, that it wasn't so much that speed was a way to, it wasn't like a hippie experimentation thing, right? [00:58:57] It was actually like a sense of praxis for him in that it, so it actually comes from his reading of like Deleuze and Guattari. [00:59:09] Oh, okay. [00:59:10] I know, I'm sorry. [00:59:11] But I think I can do this and we can do it kind of quickly, and it'll be just like a little band-aid that we rip off. === Mapping Desire Beyond Oedipus (04:03) === [00:59:15] All right. [00:59:16] Well, I'm just saying right now that you can't see it, but I'm putting a timer on my computer screen. [00:59:21] For a billion years. [00:59:24] Well, we'll see how it, in subjective terms, perhaps.
[00:59:27] So Deleuze and Guattari have this method that they call schizoanalysis that you've probably seen thrown around by Twitter, anonymous Twitter people. [00:59:35] Definitely. [00:59:35] Anyone who's saying that, I just gloss over them. [00:59:38] So that comes from their critique of Freudian and Lacanian psychoanalysis as basically being overly reductive. [00:59:50] That everything in Freudian and Lacanian analysis reduces to Oedipus, right? [00:59:58] And for Deleuze and Guattari, kind of like contra what Freud and Lacan say about the Oedipus complex, they viewed the unconscious, what is being produced by the unconscious, as like something positive, right? [01:00:12] As a realm of possibility, that it was productive and positive, that it produced desire. [01:00:19] And they contrasted that to how Oedipal desire was seen as negative. [01:00:25] That Oedipal desire for Freud and for Lacan was seen as constructed from lack and that it was conservative. [01:00:33] And so they wanted to reframe desire as a positive, as production, as radical, right? [01:00:44] And they wanted to kind of put an emphasis on what the unconscious produced without worrying about why it was produced. [01:00:52] Because they said that was the problem. [01:00:54] It was that once you kind of were breaking everything down, you were reducing it into Oedipal desire. [01:01:02] But when you just tried to follow the flow of the material that was being produced by what we desire, maybe that could tell us something different and more radical about our desires, right? [01:01:15] So they wanted to kind of create this way of having a sort of impersonal survey of what was being produced, the material of desire, that they could kind of get you away from that reductive Oedipal logic, which would be the neurotic, right? [01:01:33] That would be the kind of the model of the Oedipal logic, the neurotic. [01:01:37] And so that's why they called it schizoanalysis, because the schizophrenic is the opposite of the neurotic, in the sense that the schizophrenic, for the schizophrenic, there is no regime of signs. [01:01:51] Like there's no meaning beneath. [01:01:53] There's no subtext. [01:01:54] Everything is text. [01:01:55] Everything is kind of like a bunch of floating symbols that then create their own meaning through connections. [01:02:00] Yeah. [01:02:01] Right? [01:02:01] And so it wasn't so much with their method that they were like, so you need to literally become a schizophrenic to understand the world. [01:02:09] They were saying, what if we were to kind of model our understanding or our reading or our, you know, what if we were to kind of like map out a history of desire like the schizophrenic that didn't try to reduce everything to this kind of like Oedipal meaning? [01:02:34] But basically, like, you know, what if we were to kind of like dethrone the conscious self as this master of wants and kind of try to create something new, right? [01:02:47] So Land was like, okay, I'm going to try and do that. [01:02:51] And I, and in order to do that, I'm going to destroy my mind. [01:02:55] I'm going to, like, he was like, I believe in this so much. [01:02:58] I mean, you have to kind of, it's like, you know, he put this theory into real practice. [01:03:02] Yeah. [01:03:03] Right? [01:03:04] And he was like, how do I actually experience reality outside of myself? [01:03:09] How do I get away from any kind of meaning?
[01:03:12] And it's really hard to articulate how you do that if you're like, okay, language and ideas are the enemy. === Dethroning Consciousness (05:33) === [01:03:19] How do you kind of come up with any sort of coherent description on how to like escape them? [01:03:24] And it was like, well, you fucking blast jungle and you do methamphetamine. [01:03:30] And like, that's kind of what happened. [01:03:32] Wow. [01:03:33] So thousands and thousands and thousands of party people in Berlin are doing schizoanalysis every night. [01:03:41] No, I mean, no, you know, and it's like you see what, you know, the writing that we read or that you read, you know, from his early work that's like that sci-fi stuff, you know, you see his work progress into just like fucking textual chaos. [01:03:56] Yeah. [01:03:57] You know, it's like numbers and like weird cybernetic cartographies and yeah, nothing. [01:04:05] Well, that's what I'm saying is like, like a lot of that stuff is like really not, so that's like just, it's tweak, you know what I mean? [01:04:12] Like that's the, that's the speed talking. [01:04:14] And like it, it, it allows you, I mean, there's also, and not to harp on this too much or whatever, but like, like if you do enough meth, you go insane. [01:04:23] Like you, I used to work in a, in a, in a fucking both a nuthouse and a fucking, and like a detox, right? [01:04:33] And like you had long-term speed freaks come into the detox, even off of speed, pretty indistinguishable from like an actual like institutionalized schizophrenic in many ways. [01:04:45] Like it gives the, it like, it has probably more than any other drug, more than acid or anything, it really has the capacity to essentially like synthesize schizophrenia in you. [01:04:58] And that can be a permanent thing. [01:04:59] Yeah. [01:05:00] I mean, I don't know what he came, I mean, what he came out the other side of. [01:05:06] I mean, well, besides like a Nazi, I mean, right? [01:05:08] Yeah. [01:05:09] But it burned him out. [01:05:10] I think that's another thing that happens. [01:05:11] It also, like, it's like a cancer that eats away. [01:05:14] I don't know about the science of that. [01:05:16] I think yeah, but he like welcomed that. [01:05:18] Yeah. [01:05:18] Because I mean, you can see how when you're kind of going through this sort of view of the world, and we can maybe get into some of the capitalist stuff too in a second, which is, to be fair, something he very much understands. [01:05:29] But like, it's this like deep materialist, anti, what you'd call like an anti-anthropocentrism, right? [01:05:36] Which it just, it goes to like some insanely dark places when you kind of think of its logic, right? [01:05:41] It's like ethics, if you're, you know, ethics becomes just a sort of like defense mechanism or a way to police the inner sort of sanctum of the self and something that kind of disrupts a natural homeostasis. [01:05:57] So we have to get away from ethics, right? [01:05:59] Or political economy is now just sort of like a way of engaging with entities that, you know, occupy production external to us. [01:06:08] All of this is external to us, right? [01:06:11] And it's really, really, really dark shit. [01:06:14] And it's shocking to see it even kind of groked at or not groked at by these, you know, techno freaks. [01:06:29] Well, I think, I think a funny thing is, is that like so much of, I mean, I think part of that has to do with just like how the internet digests things and sort of like shits them out, right?
[01:06:42] And so, like, what you see here is like almost a gentrified version of this, right? [01:06:51] I mean, there's been people, I think, I think if you spend too much time online and in like in Discord chats or whatever, like accelerationism has kind of like had its moments in the past like five, six years, right? [01:07:05] With like various people kind of claiming the mantle. [01:07:08] I mean, for a while, it was really in vogue for certain like you know, extreme right-wing people to kind of claim it or whatever. [01:07:16] Although in a slightly different, like, not really like a, and it might actually, I would say, a pretty different context. [01:07:23] But a lot of these people, like, it's, it's, I think it flatters them to an extent because I think it, it, it, it supposes that these people have in their hands currently the seeds of the future, which they can plant, and unspeakable monstrosities or beauties can grow. [01:07:39] And I think if there's anything I've learned from like our years of talking about these type of people, because it certainly is a type and not a very diverse type of personality that is involved in these sort of projects, right? [01:07:52] Like I'm talking about the tech, the tech, the builders here, is that more than anything, they want themselves to be, I think, really, I think, loved. [01:08:03] And if not, like, feared or respected. [01:08:07] And I think that this accelerationism, because if what we're really talking about here is they're not talking about this like Landian concept of building an AI from the future that's going to come back and like, you know, kill Sarah Connor or whatever. [01:08:21] Like, they're talking about like creating, like, this is one thing they always talk about, like, next Manhattan Project, right? [01:08:30] This like this great thing. [01:08:31] And I think A16Z has like this thing that's like directly talks about that or like directly references that. [01:08:39] They often directly reference the Manhattan Project. [01:08:43] And it's really just like what they're talking about at the core of it is just like deregulated tech sector. === Process of Deterritorialization (11:15) === [01:08:52] Yeah, it has turned into that. [01:08:54] I mean, your point about it being gentrified, I think it's interesting. [01:08:57] I think that's right. [01:08:59] And I think it's actually like ironically or non-ironically, there's a way to kind of explain that through the concept of accelerationism. [01:09:09] You would like me to ask you, how so? [01:09:11] No, I was just going to keep on steamrolling through. [01:09:14] Keep steamrolling. [01:09:15] I'm also going to say, interesting question, Brace. [01:09:19] Thank you. [01:09:19] Clock's still ticking, by the way, but it's all good. [01:09:21] No, clock paused, but now you're going to have to unpause it because I have to bring up Deleuze and Guattari again. [01:09:28] So there is a concept that is really key to understanding what Land and later Fisher and other kind of accelerationists mean when they say accelerationism, because it is not whatever Marc Andreessen is saying at all. [01:09:48] And it's this concept called deterritorialization, which comes from their book, Capitalism and Schizophrenia. [01:09:54] And the best way, again, I'm going to try and kind of go through this quickly and like painlessly. [01:10:00] I should say I'll take out student loans for this episode. [01:10:05] I think the easiest way is to kind of start with the word territory, right?
[01:10:08] Instead of deterritorialization. [01:10:10] Just pull out territory, which is like, you know, what is that? [01:10:15] How can we think of territory not just as like a place on a map, but as a kind of multitude of processes through which someone in power establishes and maintains kind of a sphere of influence? [01:10:31] Yinan. [01:10:32] Like Yinan. [01:10:33] Yeah. [01:10:34] I mean, like, think about, like, everything that goes into creating and maintaining a territory. [01:10:41] You need to map, you need to seek, you need to kind of like seek out different parts. [01:10:46] You know, you surveil, you conquer, you exclude at certain points, and then you include. [01:10:52] It's, you know, you're managing, you're controlling. [01:10:54] It's very active, right? [01:10:56] It's a constant moving process. [01:10:59] Does that make sense? [01:11:01] Something that's like kind of always articulating and always kind of moving. [01:11:07] So how can you kind of think about those movements that all go into this idea of territory outside of like geographic realm, but in a kind of like political realm or in a social realm or legal or economic? [01:11:24] Or I mean for Deleuze and Guattari and later Mark Fisher, like how can you think of it in a subjective sense, right? [01:11:32] How does that kind of all occur in a kind of like in our own subjectivity? [01:11:37] So territorialization then for Deleuze and Guattari is kind of like the engine of capitalism. [01:11:42] Like that's all of these kind of processes that are at work. [01:11:46] So to go back to deterritorialization then, right? [01:11:49] Because that's really what we're talking about. [01:11:51] Then think of it kind of the opposite. [01:11:53] It is dismantling, right? [01:11:55] It's dissolving. [01:11:56] It's doing all those things kind of backwards. [01:11:58] It's stripping away or decoding. [01:12:02] You know, you think of it as it's decoding and stripping away the social structures, spheres of power, all of this, right? [01:12:08] So that capitalism is kind of like always in that movement as it's sort of maintaining and morphing its sphere of influence and power, right? [01:12:19] So there's deterritorialization, and then there's the counter movement on top of it, which is re-territorialization. [01:12:27] And that's when the system then re-articulates itself, that it kind of reasserts itself after it's been stripped away, after it's been decoded, it recodes. [01:12:40] And all of this is sort of how, you know, for them, how capitalism like mutates to reincorporate new things and kind of new modalities in an attempt to overcome its own contradictions. [01:12:57] So it's not just like the system is like absorbing a critique from the outside, right? [01:13:02] I think that's like something that's really key, so much so that the process of deterritorialization is immanent. [01:13:09] It's like part of, it's something that like capitalism is doing as a part of its process and it cannot exist without that movement. [01:13:19] And so like for, you know, we were talking about desire, like for Deleuze and Guattari, capitalism, you know, it decodes our desires and then recodes us on to kind of like new patternings of production. [01:13:31] Okay? [01:13:33] So I think like the easiest way to kind of think of it is like as a muscle. [01:13:37] So it's like you lift weights, right? [01:13:39] And you are creating like tiny little tears in the muscle fibers.
[01:13:43] And through the process of repair, you get stronger. [01:13:47] I mean, it makes sense, right? [01:13:49] So there's a quote from Deleuze and Guattari, and they say, I mean, this is kind of where some of this comes from. [01:13:55] They say, which is the revolutionary path? To withdraw from the world market? [01:13:59] Or might it be to go in the opposite direction, to go still further? [01:14:04] That is, in the movement of the market. [01:14:06] Perhaps the flows are not deterritorialized enough, not decoded enough. [01:14:11] From the viewpoint of a theory and a practice of a highly schizophrenic character, not to withdraw from the process, but to go further, to accelerate the process. [01:14:20] So for Deleuze and Guattari and for the left, because they were militant leftists, the decoding power of capital showed conceivable pathways, like an escape out of the system. [01:14:34] It was like showing ways in which it was retreating that could possibly, if you accelerated that, where could that lead? [01:14:41] Where could that lead to a possible kind of vision of the future? [01:14:48] But for Land and for the right, but really Land is like the only right accelerationist, except for these like weird internet, like neo-fascists. [01:14:57] Yeah, totally. [01:14:59] He's like, no, accelerate the whole process. [01:15:02] Like capitalism has to go like harder, deeper, better, more, faster, crazier, conquer all. [01:15:12] Like he's like, you know, this is what he says. [01:15:15] The death of capital is less a prophecy than a machine part. [01:15:20] He saw all of this as like evidence of its own power, right? [01:15:23] And he was like, you know, I want to, I want to like follow this to like total annihilation. [01:15:32] That like the only way out was like kind of following this process of constant, you know, deterritorialization and re-territorialization following up into like total and complete obliteration, right? [01:15:51] That capitalism will continue to evolve so far that it will surpass the human species itself. [01:16:01] And that's, and then will what? [01:16:04] Well, then we'll all sort of cease to exist. [01:16:07] And that species ending moment of total annihilation will deliver us finally this sort of like sensual oblivion of the outside that we all are looking for. [01:16:21] I mean, it is total annihilation. [01:16:23] It is total death. [01:16:24] It is, that's it. [01:16:25] That's it. [01:16:26] It isn't, you know, we are going to build a supercomputer to do the will of us. [01:16:33] It's, it's the supercomputer will surpass us and then we'll all be gone. [01:16:37] So it's also like the Yudkowskian vision of the future. [01:16:41] Like it's like we're going to build like a, well, I guess Yudkowsky believes specifically that AI, but like this is, this seems to be also Land believes this as well, is that like we're going to invent essentially like a computer so powerful that it outstrips us so far that it also just like annihilates us. [01:17:00] Yeah, I mean, I think that he, but he also, he thinks that's good. [01:17:03] Yeah. [01:17:03] He wants that. [01:17:04] I mean, he thinks like capital is the agent of history. [01:17:07] Yeah. [01:17:08] So he's like, you know, he's like, politics, morality, people, everyone is getting in the way of the historical process of capital following into the future. [01:17:22] Okay, so it's like kind of like a millenarian philosophy. [01:17:24] It is, yeah. [01:17:25] But humans are in the way of it.
[01:17:26] We are not in charge. [01:17:28] And, you know, like you said, that it's this sort of like Terminator-esque, you know, retrochronically triggered artificial intelligence that he's like, we're the ones getting in the way. [01:17:42] And so we have to, yeah, we have to get rid of all barriers, but only so this thing can like fulfill its purpose in like evolutionary, like this is part of the evolution of the universe. [01:17:53] We are the ones meant to kind of step aside. [01:17:56] Yeah, but like, isn't there, you know, I see where he's coming from with that. [01:18:02] But isn't there also like, I guess, is there an opposing view from within the same camp? [01:18:09] Or like, is this just like, it's just kind of him talking? [01:18:12] From Land's camp? [01:18:13] Yeah, yeah, yeah, yeah. [01:18:14] Well, I don't think he has a camp. [01:18:16] I mean, that's the thing. [01:18:16] He's a man. [01:18:18] He's really like the only one to kind of theorize this stuff and it's really fucking dark. [01:18:25] I get, I guess my, my, my, the actual question I'm getting at is like, if you kill enough people, couldn't you prevent this? [01:18:30] Like enough certain people, I mean? [01:18:33] I mean that like realistically, like, I mean, is that, is that, maybe, maybe I'm, I'm, I'm just too tired to even be talking about this, but like, is, is, is, if this is, if this is the inevitability, right, of like, um, technological accumulation or whatever. [01:18:50] Yeah. [01:18:51] Then wouldn't the solution to that be a heavily armed reaction to that, or a counterattack to that? [01:19:04] Well, there might be, but for Land, he's saying like, I don't want any. [01:19:09] Yeah, yeah. [01:19:09] He's like, he want, he thinks he doesn't, he wants humanity to die. [01:19:13] But he thinks that's what it should be. [01:19:15] He's not trying to save anything. [01:19:16] Yeah. [01:19:18] So he sees this disruption, the way it is. [01:19:21] Yeah. [01:19:21] This kind of stripping away of anything getting in the way of technological development as, one, evidence of that kind of dissolving of capitalism that's productive. [01:19:32] And also something that should be intensified in order to speed up, accelerate this, you know, drive towards annihilation. [01:19:45] I mean, it's like fucking dark shit. [01:19:47] But you also see why it's like his writing is like very dense and complicated and intense. [01:19:54] And it's the kind of writing that when you're young can feel very flattering because you can kind of like understand the outlines without actually understanding what he's really saying. === Why The Haters Missed It (11:11) === [01:20:08] And so it makes you feel kind of like smart. [01:20:11] And I think that's why a lot of kids get into it, unfortunately. [01:20:15] It's just one of those things I feel like, I don't know, I guess like getting into this and like, because I remember there was a little while where there was like those online subcultures that were like accelerationist, blah, and I was always just like, oh, you're like, this is, this is like being into like goth or whatever, you know, as opposed to like being into a politics, which I think a lot of politics are kind of like being into goth now. [01:20:38] But like it is in a very real way, like kind of just like, this is just gussied up goth. [01:20:44] Well, I think that's like, I think that's true.
[01:20:47] I think too, that like, you know, there were all these like different tendencies that kind of came out of a lot of this sort of like early cybernetic thinking. [01:20:56] Like you're saying, like that's where all of the acc, the like slash-accs come from in a sort of like Tumblr-esque, very Tumblr-esque kind of like device. [01:21:07] I think a lot of the same kind of people too. [01:21:09] Yeah. [01:21:11] But I mean, it was a robust arena for a lot of discussion among the left and serious discussion. [01:21:15] I mean, you know, it all kind of got dropped after Christchurch happened, there were a lot of people that really stepped away from trying to contest the right's ownership over a lot of these ideas about accelerationism, which, you know, yeah, this was, you know, contra Land. [01:21:37] There was a lot of discussion about how, no, there's a way out of this that isn't what he's talking about. [01:21:43] And no, there's a reading of Deleuze and Guattari and all of this stuff that's not what he's talking about. [01:21:49] You know, there was, there was, you know, it was highly contested until Christchurch kind of like dissolved it. [01:21:54] The left sort of dissolved all of that talk into sort of the, what ended up becoming the kind of like populist, new populism that was very dominant, you know, post, I don't know, 2015. [01:22:10] You're talking about like in the West. [01:22:11] Yeah, yeah, yeah. [01:22:12] You know? [01:22:14] But for the right, like out of this, it just kind of went into bog-standard wishy-washy meme posting. [01:22:20] Well, that and like really sort of, I guess, nihilistic terrorism. [01:22:24] Absolutely. [01:22:25] I think that is, that is a, what kind of, I feel like until recently, for the past few years, or since Christchurch rather, kind of what it's been known for. [01:22:35] And before that, I guess a little bit as well. [01:22:37] But very popular in those sort of like, yeah, nihilistic circles. [01:22:42] There's this piece by Mark Fisher that's very good that came out. [01:22:47] It's called like, now I'm forgetting the name of it. [01:22:51] It's like Terminator versus Avatar, I think, is what it's called? [01:22:53] Avatar would smoke Terminator, dude. [01:22:55] Well, that's what he kind of talks about. [01:22:57] And he talks about Land in it. [01:23:00] And Mark was always a big contester of Land's, I guess, and a kind of interlocutor. [01:23:08] But he said, he writes, what does this have to do with the left? [01:23:11] Well, for one thing, Land is the kind of antagonist that the left needs. [01:23:15] If Land's cyber-futurism can seem out of date, it's only in the same sense that jungle and techno are out of date. Facts. [01:23:23] Not because they have been superseded by new futurisms, but because the future as such has succumbed to retrospection. [01:23:31] The actual near future wasn't about capital stripping off its latex mask and revealing the machinic death's head beneath. [01:23:38] It was just the opposite. [01:23:40] New sincerity, Apple computers advertised by kitschy cutesy pop. [01:23:44] The failure to foresee the extent to which pastiche, recapitulation, and a hyper-oedipalized neurotic individualism would become the dominant cultural tendencies is not a contingent error. [01:23:56] It points to a fundamental misjudgment about the dynamics of capitalism. [01:24:01] And I think that Fisher is super right here. [01:24:04] And you see it too.
[01:24:06] I mean, I was thinking about this when we were reading the Andreessen and these e/acc manifestos, where it's like, for me, this is not the first time I'm coming to this stuff. [01:24:18] And I feel like I'm watching a kind of sad cover band of it. [01:24:22] Like you were saying, like this weird gentrification, but it's almost like a parody or a pastiche of stuff that was being thought of at a time when it was really radical. [01:24:32] Well, it's funny though, because like when it was being thought of in like, you know, like rather in the 90s when this stuff was like actually kind of in the mix in its original form, it wasn't influential in the same way, in a real way, right? [01:24:47] Like it might be influential philosophically or whatever, but like you don't have people who like have companies or whatever, who have any actual capability to build anything as subscribers to the tenets of it. [01:25:00] Whereas now, it actually is in the hands of people who in Land's sort of like science fiction universe would be able to do something like this. [01:25:07] Or like, I mean, I don't think they can, but would possibly be able to do something like this. [01:25:12] And they've made it their own, right? [01:25:14] Like they've taken it from him and sort of like inverted it in this way, gentrified it or whatever, and made it the cutesy pop thing. [01:25:20] But like then that is like actually the one that is real, right? [01:25:24] Like Land's thing is all just theory and like, you know, and speed freak ramblings. [01:25:31] But like when you have somebody, like when you have people that are actually like building products, even though like I don't think these products are like, I don't think these guys are making the next Terminator or whatever. [01:25:41] In fact, I think most of their products I'm sure will be just fucking bullshit. [01:25:44] But like the actual, like the kitschy kind of like cutesy version of this is the real version of this. [01:25:53] Like this is the actual like accelerationist in the real world who is actually building something to accelerate us technologically is like a fucking dumbass egghead kind of like loser like Andreessen. [01:26:04] Yeah, I mean, I think you're saying exactly what Fisher was saying back then, which is like, we're seeing this kind of like, this is that recoding, the recapitulation of capital, doing it to the theory of accelerationism itself. [01:26:17] It's very much like Sublime versus Sublime with Rome. [01:26:20] Sublime, of course, was a band that, while I'm, of course, not as big a fan as you, like, you know, where they were like, oh, we're drunk and on drugs and crazy and all this stuff. [01:26:31] And then, of course, the main guy dies and they replace him with Rome, who is, while he sounds exactly like the original singer, brings a very different philosophy to the stage. [01:26:41] But it's also like, I mean, it's, you know, what you and Mark Fisher are saying is that like they have not, capitalism hasn't been able to deliver us these things that it's promised, the supercomputer crazy AI that's going to deliver us oblivion, like in Land, or even just, you know, actual meaningful life and robust care and all of the things that the left talks about. [01:27:05] But like, it hasn't been able to deliver that. [01:27:07] And so what it has to do is sort of like recombine and re-deliver these sort of like nostalgic modes, these like retro, like a retro futurism style of the past to kind of cover up or cloak its own inertia and stasis.
[01:27:24] And so like I'm seeing all of this and I'm like, well, shit, we already did this. [01:27:29] Like I feel like I'm now watching, like watching this version of accelerationism is like watching the like Britpop revival that Mark Fisher was. [01:27:40] Yeah. [01:27:41] Do you know what I mean? [01:27:42] I do. [01:27:42] And so it's this weird sort of, like, perfect circle that we're witnessing, but it's also all the fucking same. [01:28:02] I'll say this. [01:28:03] I, I. [01:28:04] I take solace in the fact that there's basically no problem right now that couldn't theoretically be solved by like 50 peasants with a bunch of rifles. [01:28:13] I do feel bad that I didn't explain any of the kind of like left-wing accelerationist stuff, which is interesting, but I'm not going to. [01:28:20] This is my problem with all this. [01:28:22] There's stuff there. [01:28:23] I know, but this is my problem, I got to say, with like a lot of people. [01:28:26] People just count it as just right-wing, because it's not just right-wing. [01:28:28] I get it. [01:28:29] Yeah, no, I'm just apologizing. [01:28:31] I'm not even taking issue with that. [01:28:34] My thing is like, what does this mean in the, you know, Epstein, Jeffrey Epstein famously would say when being lectured about scientific concepts, I don't mean it the way he meant it, but he would, you know, he would say, what's this got to do with pussy? [01:28:48] And I don't mean it like that. [01:28:50] I'm not saying what's this got to do with pussy because frankly, from the people that I've seen who've subscribed to this stuff, not a lot. [01:28:55] But I mean it in the sense like, what does this like mean in the real world, right? [01:29:01] And I think that's like my something that I've just never really like gotten into a lot of philosophy. [01:29:07] I mean, I do read some of it, but you know what I mean? [01:29:10] And that's why I have trouble taking seriously like a movement like left accelerationism because it's like, well, what are you doing in the real world? [01:29:16] Like, what are you, what are you fucking, what are you touching? [01:29:19] What are you feeling? [01:29:20] Or is this just a retreat from politics that masquerades as politics? [01:29:23] Because I do think that a lot of people's politics are that. [01:29:27] Because there are real movements in the real world that really try to contest real power. [01:29:37] And I feel like a lot of people started retreating from that in the 1990s. [01:29:41] Well, actually, Badiou is like real criticism. [01:29:45] He wrote that great anti-Guattari piece, The Fascism of the Potato. [01:29:52] Yeah, I never read that, but I don't know. [01:29:53] He originally was like, I believe this, I haven't read this stuff in a very long time. [01:29:58] I mean, Badiou is like the real badass of the 90s. [01:30:01] He's the only guy that I've ever. [01:30:02] I remember I got a copy of the, I think it's like the Communist Idea or whatever, the Communist Hypothesis. [01:30:10] The Communist Hypothesis. [01:30:11] Yeah. [01:30:11] Same thing. [01:30:12] Yeah, I guess that's the translation. [01:30:13] Same thing. [01:30:14] Yeah, yeah. [01:30:14] That's straight facts. [01:30:16] But you know, The Fascism of the Potato is really funny. [01:30:18] And he only, I think he first published it under a different name, like a fake name, because he was afraid of the backlash.
[01:30:25] But it was, it was him attacking the idea of the rhizome and a lot of the stuff that you're precisely talking about because Badiou was, of course, you know, he was defending his Maoism. [01:30:33] And like, listen, I don't begrudge anybody anything they're interested in, right? [01:30:38] Like, that's actually so not true. [01:30:41] You can get ranking with whoever. [01:30:42] Craziest haters in history. [01:30:44] I begrudge. [01:30:44] I'm the biggest hater. [01:30:45] So many people are so. [01:30:47] I'm not even going to tell people how much of a hater you are because you said a crazy hater thing to me yesterday. [01:30:52] What are you referencing? [01:30:52] Can you? [01:30:53] No, you acknowledged your own haterade with it. [01:30:55] What did I say? [01:30:56] Oh, I did say a crazy hater thing. [01:30:58] It was really yesterday. [01:30:59] But it was like so typical hater, but you acknowledged it. [01:31:02] I acknowledged that I was just being a hater, but I'm a hater. [01:31:05] I'm a hater. [01:31:06] I can't stop it. [01:31:07] I'm a hater. [01:31:08] And I got haters. [01:31:09] I've got so many haters. [01:31:11] You listeners would not be able to imagine the sheer number and volume of haters that I have. [01:31:17] But I love my haters. [01:31:18] My haters are my waiters. === Haters Weigh In (05:47) === [01:31:20] They are. [01:31:20] They are. [01:31:20] They're at the table of success. [01:31:22] Speaking of, Eric Adams was supposed to be arrested yesterday and it did not happen. [01:31:24] That was the rumor. [01:31:25] And I'm very disappointed it didn't. [01:31:27] But I will say this. [01:31:28] I'm like, that's something that's just like bothered me always. [01:31:33] I'm like, well, what are you doing? [01:31:38] You know, what are you doing this afternoon? [01:31:41] Me? [01:31:42] Not you. [01:31:43] I'm with you this afternoon. [01:31:45] We're quickly descending into evening. [01:31:48] But to the, to, to, in general, what are you doing this afternoon? [01:31:52] Yeah. [01:31:52] And I feel like the e/acc people have answered this. [01:31:54] I'm in the Twitter space pumping my AI ChatGPT-4 wrapper. [01:31:59] They're in the arena trying things. [01:32:01] They're in the arena trying things. [01:32:03] And one day, if you're in the arena trying things, I want to say this. [01:32:06] If you are in, what do they call that? [01:32:07] In Cerebral Valley? [01:32:09] If you're in Cerebral Valley trying things right now, you're in the arena trying things, one day soon, you will be in the arena fighting each other for blood sport while me and all of my fellow court Jews to the new Roman Empire shriek and giggle and laugh at you. [01:32:27] I do think that if AI progresses, and I say this without, and I non-actionably, with nobody specific in mind, and I mean that from the heart, I will say, if AI is promising to do the things that you are saying that it's going to do, that as many of its backers say it's going to do, I think that we can invoke the sacred right to self-defense. [01:32:50] I genuinely think that. [01:32:51] I think that if your promise is to put people out of jobs and create a new world order, I believe that, you know, I'm a reactionary then. [01:33:00] You know what I mean? [01:33:01] I think that there is a counter measure that would be morally, ethically, and probably legally okay to take. [01:33:11] That's why I think that all the battles of all this like e/acc versus EA shit. [01:33:16] Oh, we didn't even mention them.
[01:33:18] We don't need to, but like, it's sort of like, in the future it isn't going to really matter because it's really going to, they're going to have to join forces anyway against the people that don't want this shit. [01:33:28] Exactly. [01:33:28] And it's really a question about degree versus kind at that point because it's going to be people like New Ludditism or whatever people want to call it versus basically everyone in tech. [01:33:42] Exactly. [01:33:43] And like, I don't, I like, it's, it's, it's, I don't think that we, because we're witnessing the formation of like kind of like these ideologies of the future, right? [01:33:52] I mean, granted, like. [01:33:53] A lot of like cults too. [01:33:54] Yeah, yes. [01:33:55] That's my prediction is we're going to see a lot more techno-millenarian cults. [01:34:00] I, we, we, we literally see them. [01:34:02] We want to get Genkowski on the show. [01:34:04] But yes, they, there, there are a lot of those right now. [01:34:07] Um, but and I think that they will develop into something that's more coherent and more like probably widespread among people like that. [01:34:14] Yeah. [01:34:15] Uh, but one thing that we haven't really yet witnessed is the reaction to that from people who are not involved. [01:34:23] We haven't seen the development of those ideologies that form against it. [01:34:30] And while a lot of this is basically just like, much like many of their products are just ChatGPT in a different wrapper, so their ideology is really just like old-school neoliberalism in a different wrapper, but the wrapper being accelerationism. [01:34:49] I'm hoping that something forms in response to this because I'll say this, I fucking hate them. [01:34:56] I love human beings, and I don't think, I don't want technology to outstrip us. [01:35:01] And I don't think, we were talking about this the other day, what good new technology, what has actually helped us live? [01:35:09] I mean, we've said this ad nauseam on the show, like, live as human beings more effectively, you know, and like love and connect with each other and feel human. [01:35:18] Very little, if any. [01:35:20] And so I would say these people are a stain on the underpants of the earth, which is Silicon Valley. [01:35:26] Egg yolks stain. [01:35:27] A yolk stain. [01:35:28] Because he looks like an egg. [01:35:29] Oh, yeah. [01:35:32] Definitely. [01:35:32] Hey, Grox woke. [01:35:35] And that's what we need to fight against. [01:35:37] Like, people talk about AI from the future killing us. [01:35:40] Really, the number one enemy is these woke AIs. [01:35:43] Hey. [01:35:44] Grox woke. [01:35:45] Grock woke. [01:35:46] Grox woke. [01:35:46] And with that being said, my name is Brace. [01:35:51] I'm Liz. [01:35:52] There's crazy ass Grindcore. [01:35:55] That's not Grindcore, Liz. [01:35:56] That's not Grindcore. [01:35:57] That's not Grindcore? [01:35:58] Fucking sounds like Grindcore to me. [01:36:00] No, wait, let me listen to it. [01:36:02] There's a band practicing above it. [01:36:03] You can't hear it, listeners, because we have such an expert producer, but that is Math Rock or some shit like that. [01:36:09] I don't know what Math Rock is. [01:36:10] Might you agree with that? [01:36:11] That sounds pretty mathy to me. [01:36:13] I was revisiting a bunch of my kind of my like techno accelerationist music habits and it was like really fun. [01:36:22] I was like going full on nostalgia mode in this episode.
[01:36:27] I will say this. [01:36:28] Damn, I haven't listened to this kind of like minimal cold wave stuff since I was like 19. [01:36:34] The first three Ultravox records are some of the greatest futuristic style records to ever come out. [01:36:41] I love this stuff. [01:36:42] I love Ultravox. [01:36:43] I'm Liz. [01:36:44] We are, of course, joined by producer Young Chomsky. [01:36:47] And this has been, oh, True Anon. [01:36:51] Grox Woke. [01:36:52] We'll see you next time. [01:36:54] Grox, whoa shooting, Jeffrey Epstein, Jeffrey Epstein, John's got shooting, Jeffrey Epstein,