True Anon Truth Feed - Episode 425: The Blue Light Killer Aired: 2024-12-16 Duration: 02:39:03 === Disavowal Manifesto (01:56) === [00:00:00] To the feds, I'll keep this short because I do respect what you do for our country. [00:00:05] To the people who listen to this podcast who plan on shooting anybody, I'll keep this a little bit longer. [00:00:10] We disavow you, we don't know you, we condemn your actions, you unsubscribed a long time before you did your actions. [00:00:17] We do not encourage you to do something. [00:00:19] So, if you're listening to this and you do do something, like shoot somebody or whatever, we had nothing, like you can't put that on us. [00:00:26] Don't do it. [00:00:28] But if you do it, no, just have a poor manifesto. [00:00:32] This shots out too many people. [00:00:34] Brace, stop. [00:00:35] Do not do it. [00:00:36] Yeah, but if you're no, no, but if you're thinking about doing it, you don't do it. [00:00:40] No doing it. [00:00:41] We disavow. [00:00:42] Well, it's not, we're not, you know, I'm not your dad. [00:00:45] No, we disavow. [00:00:46] We disavow you, but we're not saying you should do it or not do it. [00:00:48] No, we're saying don't do it. [00:00:50] We're saying don't do it. [00:00:51] Don't do it. [00:00:51] But if you do it, if you do it, put in your manifesto. [00:00:55] This is also not Truanon's fault. [00:00:58] Put that actually, don't put that first either because it makes it look artificial. [00:01:01] Put that like third paragraph in your manifesto. [00:01:04] By the way, I know this is a non sequitur, but Truanon had nothing to do with this thing that I did, and they told me not to do it, if you do it. [00:01:37] Welcome to Truanon, the only podcast in human history that has completely 100% friendly fans who would never hurt a fly because their upper arm strength is so little that if they hit the fly... [00:01:52] the fly would win. [00:01:54] My name is Brace. [00:01:55] Hello, Brace.
=== 100% Friendly Fans (04:12) === [00:01:56] I'm Liz. [00:01:56] We are, of course, joined by producer Young Chomsky. [00:01:59] And like you said, this is Truanon. [00:02:01] Hello. [00:02:04] You can start talking about it. [00:02:05] I wasn't sure. [00:02:06] I wasn't sure. [00:02:07] I was trying not to laugh before. [00:02:08] What's your fucking name? [00:02:09] My name is Joshua Citarella. [00:02:11] It's good to be here. [00:02:12] Good to see you guys. [00:02:13] Joshua Citronella candle. [00:02:15] Keeping all those mosquitoes away. [00:02:18] What are rude things people have called you because of your unusual last name? [00:02:21] Cinderella. [00:02:22] If you're in New York, there's a grocery store also. [00:02:25] Citarella. [00:02:26] Citarella, the grocery store. [00:02:28] Yeah. [00:02:28] But they would just call you grocery store? [00:02:31] Well, people outside of New York don't know about it. [00:02:33] So I would get Cinderella and Citronella. [00:02:35] Those are big ones. [00:02:36] Yeah. [00:02:37] Cinderella's tough. [00:02:38] Did that fuck you up at all? [00:02:41] As a, yeah, I mean, as a middle schooler, that's not the greatest thing for a middle school boy to hear all the time. [00:02:47] Yeah, I was bullied for being gay in middle school. [00:02:49] You're gay? [00:02:49] No, I'm not gay. [00:02:51] But I was bullied for being gay because I was into art and doing art is gay. [00:02:56] Okay, I get it. [00:02:56] Well, not to interrupt this, but Josh, we're very excited to have you on. [00:03:04] You wrote a piece about Luigi Mangione on your Substack. [00:03:08] Yes. [00:03:09] Kind of breaking down, we can link to it in the show notes if we remember, which I probably won't. [00:03:14] But if we do, it'll be there. [00:03:16] Kind of breaking down potentially some of his ideological motivation or maybe what we could piece together about what inspired the CEO assassin that everyone on the internet is in love with.
[00:03:33] But before we get to that, I do want to say really quick to everyone listening that we are about to wildly and inappropriately speculate, and that's fine. [00:03:43] It's fine for us to do because we know the actual truth, which we're not revealing, but we will reveal within these speculations the actual truth we've been told by the friendly FBI. [00:03:53] But we are going to be wild. [00:03:55] Listen, we got to wildly speculate at this point because we have tried to, I think it's fair to say, look at this from all available information. [00:04:04] Yeah, everything that is in the Times and Business Insider, like every article that's written that's come across my feed, like we are looking at the same primary source material. [00:04:12] Yes, his entire Reddit history, his Instagram, his Twitter, like everything that's publicly available, everyone who's speculating on it now has access to the exact same primary sources. [00:04:22] Yeah, we're all sleuthing with the best of them. [00:04:24] Yeah, yeah. [00:04:25] And so, listen, if we get it wrong, maybe the truth that comes out is actually a lie and what we're saying is actually the truth. [00:04:34] Think of it that way. [00:04:36] So, Luigi Mangina. [00:04:40] I'm trying it out because everyone's like Luigi's Mansion, Luigi, so I'm trying to give my own, but that sounds rude, right? [00:04:45] That's kind of like a 90s thing. [00:04:47] Dude, Where's My Car-style nickname. [00:04:49] So we'll just call him Luigi Mangione. [00:04:52] Mangion, Mangione? [00:04:53] I thought it was Mangione, but I think it's Mangioni. [00:04:57] It has to be for the all-star lyrics to scan. [00:05:02] He killed somebody. [00:05:04] And I think what we should do is go through the timeline of what happened involving the murder, like baseline stuff that probably most people listening have encountered in some form or another, but we'll put it all in one place.
[00:05:17] And then maybe a little bit about his background and then a little bit about his intellectual influences that may have contributed, or I don't know if contributed is the right word, although everyone's the sum of everyone's experiences in life contributed to everything that they do. [00:05:31] But led up to the point where he went dark six months ago and had his own little dark enlightenment and ended up in the streets of Manhattan fumbling with a plastic gun, although fumbling rather deftly, but still you have to fumble because the thing's not working too good. [00:05:50] Blowing Brian Thompson's ass away. [00:05:54] So before we get into those specifics really quick, I do want to say that, Josh, it's great to have you on the show for this because you have been researching pathways of internet radicalization for some time. === Gray Man Propaganda (06:15) === [00:06:08] I don't know, is that an appropriate way to say it? [00:06:10] No, no, that's totally correct. [00:06:12] Well, I come from the background of the art world. [00:06:15] I wrote this long essay. [00:06:16] It's published as a book, but it's like 10,000 of my words, 10,000 of their words in 2018. [00:06:22] Following that, I interviewed many young people who were politicized online through memes and videos and podcasts, and I followed them for now many years. [00:06:31] So watching their influence, the influence that they take from online and then the political actions they take later in life, that is my area of expertise and interest. [00:06:41] So I will say that although these things are sometimes tenuous, it's not clear what people take from their online media consumption. [00:06:48] I think that we have gone quite a bit deeper than a lot of the other sources, particularly about his key influence that we'll get to a bit later and some of their more interesting, unconventional beliefs and how that may have influenced him.
[00:07:01] So I think when we're speculating about what his belief system might have turned into during that dark six-month period, there are general tracks when people move through this rationalist belief system into much darker stuff, and that can happen kind of quickly. [00:07:15] Yeah, yeah. [00:07:17] We talked a little bit about that before the mics were turned on. [00:07:20] And I think it's, there's a lot of examples of that from the past couple years, especially as this stuff has really gained prominence. [00:07:29] Prominence, which honestly I would not have foreseen, but with the like full embrace of, for lack of a better word, I'm sure there is a better word, but not for me. [00:07:38] Like the tech lifestyle and like the like the whole tech world like kind of taking this primal, primal, excuse me, a primary position in like a lot of the culture. [00:07:49] I think, yeah, has ushered in a new renaissance of the gray man. [00:07:56] Not the gray man, but the gray tribe. [00:07:58] Gray tribe. [00:07:59] The gray tribe. [00:07:59] Maybe one thing to say at the top of the show, just in case there's anyone listening who has like the misconception that this is a left-wing activist, that is entirely, entirely wrong. [00:08:10] Yeah, I think that when, because of the nature of the assassination, which we'll get into the details about, like Brace said, I think in the immediate like aftermath, like while Luigi Mangioni, which we should say is still a suspect, he's just, he's been arrested, he's been charged, but innocent until proven guilty, all that, we shouldn't work backwards, even though we basically are, again, wildly speculating, as is our wont.
[00:08:34] But that because of the nature of the assassination, I think a lot of people immediately were like, oh, this is, you know, early 20th century, late 19th century anarchist assassination style of going after kind of a political opponent or a political enemy, which would therefore require a kind of leftist lens or a kind of leftist trajectory from the assassin. [00:09:03] And that doesn't seem to be clear at all if Luigi Mangioni is our guy. [00:09:07] Yeah, I think there was a, I've seen, and I get it. [00:09:10] Listen, like, that's the thing. [00:09:11] I get it. [00:09:12] Whatever they're about to say right now, no judgment here. [00:09:15] But I think a lot of people wanted him to be Alexander Berkman. [00:09:19] Whereas I remember saying this, and if you're the two guys I was saying this to in San Francisco who both have the same name and are old friends of mine, but I was like, yeah, I would hold off on like, I don't think that this guy is like going to be turned out to be some like big left-wing guy. [00:09:33] I think that's the case here. [00:09:35] But I think people thought it was maybe a propaganda of the deed or something like that. [00:09:39] Like the, you know, the Narodniks or something, you know, a terrorist. [00:09:44] There was like the terrorist thing in China. [00:09:46] That's what they were called. [00:09:49] And it's not that. [00:09:50] We're living in a different time now where different people shoot for kind of not too different of reasons in some way. [00:09:57] Although I guess this differs from like the propaganda of the deed, the theory behind that, which, by the way, was roundly proven to be false when it happened. [00:10:07] Wait, for people who don't know, do you want to explain really quickly? [00:10:09] Propaganda of the deed is where like you do something. [00:10:13] This is a dumbed-down simplification of it.
[00:10:16] I thought it was really cool when I was 13, which sounds really condescending, but that is just genuinely true. [00:10:22] It's like I came to a lot of left-wing thought through like pretty basic biographies of like anarchist figures. [00:10:29] And one of those was Alexander Berkman. [00:10:32] He shot, who the fuck did he shoot? [00:10:34] Some steel magnate in the hopes that like this assassination would spark some class consciousness and revolutionary fervor. [00:10:42] Now, that is, which did not happen. [00:10:45] That is technically possible in some circumstances, but you can't force it. [00:10:50] Like as a theory, it's rather tenuous because if everything aligns to be right, yeah, smoking the right dude can probably set off like some, you know, large-scale, latent, class-based, I don't know, movement. [00:11:07] Or unfortunately, World War I. Or World War I, yeah. [00:11:10] Which was actually kind of a class-based movement. [00:11:12] It was just the upper class. [00:11:14] But it's not, it's certainly not foolproof, it's not a good theory because it only can work sometimes and it has to be the right circumstances. [00:11:22] So it's sort of just a tactic rather than anyway. [00:11:25] It seems like one of those things that's more likely to result in like extreme crackdown onto left-wing groups rather than like motivating any type of political movement. [00:11:33] You would be correct. [00:11:34] I mean, this was a big debate in the early 20th century Russian social democratic, later communist movement, about the efficacy of assassinations. [00:11:44] Because they were people, you got to give it to them. [00:11:47] They were killing a lot. [00:11:48] The socialist revolutionaries in early 20th century Russia, like the party, were killing a lot of fucking officials and members of the aristocracy, but it did not spark a popular movement. [00:12:02] And I think that's something to be, which isn't to say that it's not cool, but being cool, unfortunately.
[00:12:09] Which we disavow. [00:12:10] Which we disavow being cool. [00:12:13] But like, unfortunately, being cool does not translate to like political victory or else, listen. [00:12:18] It certainly makes a lot of memes. [00:12:20] It does make a lot of memes, which maybe is a political victory for some. === Mark Rosario's Face (05:51) === [00:12:24] But all that is to say, Luigi Mangione. [00:12:28] So Luigi Mangioni, so everyone listening, I think, knows what happened, what we're talking about. [00:12:35] But for the non-internetted grannies who listen to us and also don't read the papers and also don't talk to anybody. [00:12:46] Theoretically, we have so many listeners. [00:12:48] There's like one guy who probably gets all his news from this podcast. [00:12:52] That's horrible. [00:12:53] That's horrible. [00:12:54] You really got to get it. [00:12:55] There's so much he doesn't know about. [00:12:57] Yeah. [00:12:58] Okay, so Luigi Mangioni, 26 years old, from Baltimore, Maryland, graduate of UPenn, doing this off the dome, software engineer. [00:13:10] He shot and killed Brian Thompson, the CEO of United Health Company, United Health Care, whatever it is. [00:13:18] United Healthcare, United Healthcare, which is itself part of a, just one part of a larger, you know, it's owned by a larger United Health conglomerate, but it is a major insurer. [00:13:30] Brian, I keep wanting to say Brian Johnson. [00:13:33] Me too. [00:13:33] I'm sorry, my bad on that one. [00:13:35] Brian Thompson, the CEO, he was set to speak at the like annual investors conference at a Midtown Hilton on, I believe, 54th Street in Manhattan. [00:13:48] And I'm just going to walk through a little play-by-play so we can give the lay of the land. [00:13:52] Because actually, how this went down is very important to the story and the memeification of Luigi itself. [00:14:00] So Luigi arrived in New York City on November 24th, they say, allegedly, on a Greyhound bus from Atlanta.
[00:14:09] Police say that he immediately took a taxi to the Midtown Hilton and then went uptown to his own lodgings, which were, he was staying at this like hostel up on 103rd and Amsterdam, which is right by my first apartment. [00:14:26] Really? [00:14:26] Yeah. [00:14:28] Many, many moons ago, which he checked into under a fake ID with the name Mark Rosario. [00:14:34] I want to pause right here and say that's actually a pretty good fake name, really. [00:14:38] Yeah, you think? I do think that's a, because you're like, okay, yeah, there's a guy named Mark Rosario. [00:14:43] It sounds like a fucking. [00:14:45] It sounds like a Lower East Side DJ from 2004. [00:14:48] Yeah, I know, but still. [00:14:50] Do you remember what my name was? [00:14:54] That they gave me in the Philippines, my fake name? [00:14:56] Yeah, and like that to me. [00:14:57] I'm like, that's the kind of fake name I'd be using. [00:15:00] Um, Mark Rosario is very normal. [00:15:03] Yes, one wouldn't bat an eye at it. [00:15:04] That's true. [00:15:06] Now, reports were that he kept his mask on the whole time. [00:15:09] I didn't know. [00:15:10] He was at a DSA meeting, the entire meeting. [00:15:13] There's a very famous moment where he lowers the mask right to smile at the clerk at the hostel. [00:15:19] Big mistake for Luigi. [00:15:21] Big mistake, Luigi. [00:15:22] They also say that it was not just him flirting with the person at the hostel, but also that they had to verify. [00:15:28] So on the Mark Rosario ID is his face. [00:15:31] Yes, the name is fake, but it's his face, and so he has to prove his identity by taking down the mask, right, that I figured. [00:15:37] But he flashes a big smile, which is, of course, one of the images that circulates and immediately memeifies the man. [00:15:44] Yeah, um, because it's a.
[00:15:46] It's a charming smile, it is. So, December 4th, 5:30 a.m., this is just last week, he leaves the hostel, he hits the Midtown Hilton where Brian Thompson is slated to speak. [00:15:57] He gets there at like 5:41 a.m., which is too early. [00:16:01] He's like pacing back and forth on 54th Street for a while and his shit's so bad. [00:16:07] There's so many. [00:16:09] I mean, remember, it's New York City, right, it might not be London, but there's still CCTV fucking everywhere and there are cameras everywhere that are capturing him. [00:16:18] So he's pacing back and forth and then he hits a Starbucks, big mistake, on 56th and 6th. [00:16:24] They get a second picture of him, seems like he gets a KIND bar and a water. Interesting. [00:16:30] So KIND bars, I understand, are not very good for you. [00:16:33] It is a candy bar that, because it's called KIND, you kind of think it's like a health bar. [00:16:40] But health bars, I've come to understand, are candy bars that try to trick you. [00:16:45] I think the best way to think of it is, health bars are candy and smoothies are milkshakes and that'll just like. [00:16:51] That's a good gut check for when you're thinking about what snack to get. [00:16:54] But I was told by various females that Lara bars, which to me is a woman-coded bar, it is, are actually not that bad, or Luna bars, both. [00:17:04] I think the Lara ones are the good ones, but the Luna is for the woman. [00:17:06] Yes, yeah, Luna is woman because of moon. [00:17:10] Lara has one that's just five ingredients. [00:17:12] There's an apple pie one that is actually really good, like that's what I'm talking about, and so that's the one where, like sometimes it's on sale, like girls will be like, you can get this one, Brace. [00:17:22] That was actually, I was thinking about that KIND bar, and the ingredients in those things are kind of not based. [00:17:28] Like, they don't really fit with his political worldview.
[00:17:30] It's like there's literally soy in the KIND bar. [00:17:33] So it just seemed like this is somebody who's really searching for, he's just all over the place, you know? [00:17:38] He's traversing the maps of meaning. [00:17:40] Palm oil and shit like that. [00:17:42] Yeah, yeah. [00:17:42] They do seem like bugman food. [00:17:44] Right, right. [00:17:45] I'm guessing, and this is just a hunch. [00:17:48] So sometimes when you need to kill people, you really got to go to the bathroom. [00:17:53] And so I think that he probably had to go to the bathroom at the Starbucks. [00:17:57] And in order to go to the bathroom, he likely had to make a purchase. [00:18:00] Interesting. [00:18:00] And that purchase was the KIND bar. [00:18:03] Well, the wrapper of the KIND bar was found, no? [00:18:06] And they got a fingerprint off of that. [00:18:08] They say partial fingerprint, but. [00:18:10] Yeah. [00:18:11] So for the next 30 minutes, he is walking up and down 55th. === Bag and Fingerprint Mystery (07:08) === [00:18:15] Surveillance gets him right when the shooting starts to happen. [00:18:22] He's like walking up and he's on the phone, which is really interesting. [00:18:26] I have seen fewer people make note of this. [00:18:29] So it's unclear who he was talking to or if he was leaving a message, potentially, on someone's voicemail. [00:18:36] But it looks like he is holding up a phone and is talking on the phone as he's walking down 55th. [00:18:40] He could possibly, he could possibly be listening to Andrew Huberman through his phone speakers, having forgotten his iPhone or his AirPods. [00:18:51] So 6:44 a.m. [00:18:53] So it's been about an hour since Luigi arrived. [00:18:57] This is when Brian Thompson, the CEO of United Healthcare, walks towards the entrance of the Hilton where he's set to speak. [00:19:06] So he's walking up. [00:19:07] We see Luigi approach him from behind. [00:19:10] I think everyone has kind of seen the video. [00:19:12] It's 6:45.
[00:19:13] It also looks like there's a woman or someone at the revolving door at the entrance of the Hilton. [00:19:18] Luigi walks up. [00:19:20] He's what, maybe eight feet behind him, and has a gun out and pulls. And what? [00:19:29] So, I mean, I have to watch the video again to see exactly the malfunctions, but I've seen it a bunch of times. [00:19:34] But he shoots. [00:19:36] Doesn't look like the shell casing ejects and he has to like manually pull it back and he shoots again and then similar malfunction and shoots again. [00:19:45] And then I think there's another malfunction. [00:19:46] Like it seems like, again, like I don't have the video in front of me. [00:19:49] It doesn't really matter how many times it malfunctions. [00:19:52] It malfunctions at least twice. [00:19:55] But he's really calm when he, when it happens. [00:19:58] And it does seem like he expects that to happen. [00:20:02] Because listen, I don't have a lot of experience shooting semi-automatic pistols. [00:20:06] I don't know if you do. [00:20:08] Certainly not 3D printed ones. [00:20:10] No, this is what. [00:20:11] Yeah. [00:20:12] But like the reason I keep a revolver and I like a revolver is because you point it at somebody and you pull the trigger. [00:20:18] I'm not pointing at you. [00:20:19] I'm pointing it over your shoulder. [00:20:20] Believe me, I'm an expert shot. [00:20:21] And by the way, listeners, I am literally not, there's no gun in my, there's nothing in my 3D printed air there. [00:20:27] It's an invisible gun that I'm pointing at you. [00:20:29] What if I just did this the entire project? [00:20:30] I don't know. [00:20:31] It's annoying. [00:20:32] It's annoying. [00:20:32] Stop that. [00:20:33] But like, the reason that I have a revolver is because you point it at somebody and you pull the trigger. [00:20:37] And as long as it's relatively clean and in good condition, the mechanism is very simple. [00:20:41] It'll at least fire that one shot.
[00:20:44] Well, he gets three shots off. [00:20:45] He has three shots off. [00:20:46] And he has to cycle the gun each time, which means, and it seems like he's expecting it. [00:20:52] And I think that the calmness and the coolness of that led a lot of people to speculate that there was something going on. [00:20:58] Because a lot of shootings you see on the street, gangland shootings and stuff like that, it's somebody kind of spraying wildly and like missing a lot. [00:21:06] And he hits him, it looks like, all three times. [00:21:08] Yeah. [00:21:10] I think he did. [00:21:11] And then just skedaddles. [00:21:13] Yeah. [00:21:13] To just speculate a little bit about what he did during this dark six-month period. [00:21:18] That kind of training or just familiarity with the weapon, like those kind of jams are not uncommon for a 3D printed, like we'll get into the specifics of the weapon later, but he definitely had to have some period where he was like test firing this thing. [00:21:30] Absolutely, yeah. [00:21:31] So Brian collapses on the street. [00:21:33] He is pronounced dead, I think, about a half hour later. [00:21:37] But we're just following Luigi for now. [00:21:39] So Luigi famously, and this is what I think everyone knows, he gets away pretty easily. [00:21:44] The police start a manhunt for him that goes on for days. [00:21:50] It looks like Luigi jumped on a bike and cut through Central Park. [00:21:56] So he goes up, I would assume, like straight up to Central Park South and cuts through Central Park. [00:22:03] Then it sounds like allegedly ditched his backpack, which, again, allegedly contained fistfuls of Monopoly money. [00:22:16] That's what they say. [00:22:19] I want to talk about that actually. [00:22:21] Let's tag that. [00:22:22] Let's tag this. [00:22:22] Let's tag this. [00:22:23] Let's actually talk about this right now. [00:22:25] Right now? [00:22:25] Right now. [00:22:26] Okay. [00:22:26] Okay.
[00:22:28] That's way too Reddit for me. [00:22:30] It is, if it's true. [00:22:31] So nobody, first of all, I was like, from the beginning, I'm like, how do they know it's his bag? [00:22:36] Because there's no, I mean, maybe they do now. [00:22:39] Maybe they have fingerprints on it. [00:22:41] But like, to me, it seemed like they found a bag right away and were like, it's his. [00:22:45] Right. [00:22:46] Well, he had that one kind of backpack on him that was, and I guess it was the same one, they say. [00:22:51] Yeah. [00:22:52] But there's a lot of guys with those backpacks. [00:22:54] But also, why would those guys, it's not the kind of backpack you ditch in the park because of whatever. [00:22:58] And they said they found a jacket as well. [00:23:01] But it seems like when he got arrested, he had a bag. [00:23:03] Maybe he had a different bag at the hotel. [00:23:05] I genuinely don't know. [00:23:06] I don't think they specified what bag he got arrested with. [00:23:08] But I always, I have to ask, when he got caught later, he has the gun that matches the ballistics. [00:23:15] This is the problem. [00:23:16] So, if he was leaving something to be identified as his, wouldn't he ditch the gun? [00:23:21] Like, why would you want to get picked up with the actual weapon later? [00:23:24] That doesn't make sense. [00:23:25] It doesn't make a lot of sense. [00:23:26] And, like, also, you know, it's a gun with a suppressor. [00:23:29] And, like, I guess what you could do is actually you could put that in your, although that's fucking risky riding on a bike with no holster. [00:23:35] You could put that in your jacket. [00:23:37] But, like, a gun with a suppressor is fucking long. [00:23:39] Yeah. [00:23:40] I mean, not that long. [00:23:42] But, like, it's pretty, it's long, right? [00:23:44] And so, like, that's the kind of thing he would put in the backpack. [00:23:46] Right. [00:23:46] So, like, he only had the one.
[00:23:47] It's just that to me is like, it seems unclear on what the fuck they're doing. [00:23:52] It makes a great story. [00:23:54] It's really memetic. [00:23:55] Like, it's very contagious. [00:23:56] But we can't actually definitively prove that it was his backpack. [00:24:00] And it could just have been someone else's. [00:24:02] And it just feels a little bit like, oh, that's a really great meme. [00:24:05] But also, he dropped his bag, but then had another bag with him. [00:24:09] So he had two bags on it. [00:24:11] It's not beyond the realm of possibility. [00:24:12] No, so he only had the one bag on him in the video. [00:24:16] Yeah. [00:24:16] Unless there was a front pack, like koala style. [00:24:19] Euro style. [00:24:20] Yeah. [00:24:20] He's got a little, yeah. [00:24:22] He could have had another bag at the hotel, which, or the hostel. [00:24:24] He doesn't go back to the hostel. [00:24:25] But he doesn't pick it up at the hostel. [00:24:26] Yeah, yeah. [00:24:27] Hostel. [00:24:27] You're wrong. [00:24:28] It's on him. [00:24:29] He's got to go straight through Central Park and then hops all the way up to a bus station on I think 173rd or 178th and skedaddles out of town. [00:24:42] So at no point did he pick up another bag unless, like I said, you have a go bag stashed in Central Park, which is fucking insane. [00:24:49] That makes no sense. [00:24:50] Yeah, I mean, it's, it's, there's so many variables there that would prevent you from picking that go bag up that you couldn't really have anything important in it. [00:24:59] No. [00:25:00] Yeah, and if it was kind of, you know, theatrical and he's planting, he's making this grand narrative, like he's trying to tell a story and make a big statement, like, wouldn't the manifesto and all of these other things have been equally considered? [00:25:11] Yeah. [00:25:12] Well, we'll get to that. [00:25:13] Yeah. [00:25:14] The manifesto when we get to his arrest.
[00:25:16] The Daily Beast, which sometimes has good original reporting, but a lot of the time is basically the fucking Daily Mail. === McDonald's Surveillance Tech? (04:12) === [00:25:24] Put out the business. [00:25:25] Which is fun because the Daily Mail is pretty good. [00:25:26] It's fun. [00:25:27] And the Daily Beast often has great headlines, even though I feel like they called me anti-Semitic by name in print. [00:25:33] That was a good headline. [00:25:35] Who was that fucking cocksucker? [00:25:36] What's his name? [00:25:37] I keep wanting to call him Andrew Sorkin, but I know it's like Alexander Ross Reef. [00:25:43] Yes, Alexander Ross. [00:25:44] Where are you now, bitch? [00:25:47] Sorry, that sounded gendered when I said that. [00:25:50] I apologize to everyone. [00:25:51] Where the fuck are you now? [00:25:53] I'm king of the castle, bitch. [00:25:55] And I'm more anti-Semitic than ever. [00:25:58] So he ditches nothing, maybe, I think. [00:26:02] And he leaves town. [00:26:04] Yeah, he leaves town. [00:26:06] So, and no one has heard from him. [00:26:09] Five days later, he's caught in Altoona, Pennsylvania, at a McDonald's. [00:26:17] I do want to just point out that Eric Adams said that they had his name, and no one really followed up on that. [00:26:26] And I do think that maybe Eric Adams is just doing his thing. [00:26:29] He's like, he's like, hey, we're going to psych him out. [00:26:32] Yeah. [00:26:33] He's doing his like three-dimensional chess. [00:26:36] However, then a news article dropped today in the San Francisco Chronicle that said that the SF PD had his name and gave it to the FBI days before his arrest. [00:26:48] Now there's a lot of confusion and I think debate over how and why Luigi was caught the way he was. [00:26:58] The story goes like this: that an employee of McDonald's in Altoona, Pennsylvania, spotted Luigi, who was wearing a COVID mask and photographed on McDonald's surveillance eating a hash brown.
[00:27:14] Shout out, McDonald's hash brown. [00:27:17] You know, they're good. [00:27:18] Yeah, can still taste it in my mouth right now. [00:27:20] It's like a Proustian madeleine. [00:27:25] That they recognized him from, you know, seeing his image splattered everywhere online and on TV. [00:27:34] But the kid is wearing a, he's got a yellow beanie on and a COVID mask. [00:27:39] And it seems doubtful. [00:27:41] Now, some schizo posters, shout out to the schizo posters, have floated the possibility that it was McDonald's surveillance tech in the self-service kiosk machines, which I am sympathetic to. [00:27:59] I'm sympathetic to law enforcement tapping into the vast Big Brother network at the fast food empire. [00:28:11] Have you ever, I can't remember the website? [00:28:12] I have this website that I use sometimes to like, it makes it good if you're like researching, like the Moroccan from the El Pulpo episodes. [00:28:23] Like I was trying to find like other things he might have appeared in. [00:28:26] And so there's this website, which I can't remember the name of it. [00:28:29] I have it, I subscribe to it, where you can put in a person's face and like it'll find pictures of them from like kind of obscure sources. [00:28:37] And it's really fucking good. [00:28:39] Oh, interesting. [00:28:40] And I know that the cops have access like to probably, well, maybe not every local PD, but like certainly I'm sure San Francisco PD and like the FBI have like that. [00:28:52] And so it's like, I wouldn't be surprised if they can do it into webcams. [00:28:57] I know. [00:28:57] I wouldn't be surprised if they do this live. [00:29:01] Even if he has a COVID mask on, by just seeing like the shape of his nose and his eyes. [00:29:06] Well, it's a lot easier too at the self-service camera, right?
[00:29:09] Because it's getting a real shot as opposed to the one that was leaked to the press of him just snacking on the hash brown, which is like a lot more, you know, kind of obscured, I think. [00:29:18] Yeah, I agree with that. [00:29:19] So I'm sympathetic to that rather than the guy spotting him and calling in the tip to the FBI, which also, how many tips do you think the cops and the police were getting at this time for them to be like, that's the one in Altoona to follow up on? [00:29:35] Yeah. === How to 3D Print a Gun (08:17) === [00:29:36] Yeah. [00:29:36] Now, they claim that what they found on him when they arrested him were some notebooks that have not been released, $8,000, along with $2,000 in foreign currency, which Luigi disputes, a kind of handwritten, what they're calling manifesto, but I'll get into my feelings on that when we read through it. [00:30:04] And the gun, which is 3D printed. [00:30:07] So we've talked about the gun a little bit, but I do just want to ask, because I don't understand it, how do you print a gun? [00:30:14] What? [00:30:14] What does it have to do? [00:30:15] It's a lot easier now. [00:30:17] The way that they used to do them is that you'd have to like CNC. [00:30:20] You'd have to have basically a block of metal and then carve out the shape. [00:30:24] But now you just like, you basically print this polymer material. [00:30:28] It's like you print plastic patiently line by line. [00:30:31] And you just make this gun. It can be done. [00:30:34] You get these things for like maybe a thousand bucks, a 3D printer. [00:30:38] Okay. [00:30:38] There's some level of fitting it together that requires a bit of technical expertise. [00:30:43] Like he couldn't do this in a weekend, but the file that he uses for the gun, I have this written down somewhere here, but it's basically, it's like the most popular model for it. [00:30:53] And you can just find it on a Google search. [00:30:55] It's called Chairmanwon V1.
[00:30:57] This is an updated model that comes from a very popular file that was published in 2021 called Free Men Don't Ask. [00:31:07] They've got great titles on these things. [00:31:09] It's by a group called Deterrence Dispensed, which now goes by the Gatalog. [00:31:13] You can just like Google search these things. [00:31:15] They're all publicly available. [00:31:16] Like we can find it right now and download it to our Chromebooks. [00:31:19] It's very accessible. [00:31:20] Yeah, yeah, you can't either. [00:31:21] So the total cost for all this stuff and the total amount of time involved is like, you could probably do this within like a week or two of learning to use it. [00:31:29] But to successfully test fire the thing and also like unjam it means that you've got to do some kind of live test in the field and be able to prove that the thing works. [00:31:39] Like that actually takes more time than printing the gun. [00:31:42] Interesting. [00:31:43] So what I don't understand is, is the barrel 3D printed? [00:31:49] Like every part is, I thought, because what I always understood about like 3D printing guns, and I know, just to, I know nothing about 3D printing. [00:31:57] It's like one of those things where I'm like, I know there's a lot there. [00:32:00] It's too much for me. [00:32:01] There's too many other things I need to learn about in my life. [00:32:03] So I'm going to have 100% ignorance on this. [00:32:07] But like, I know a little bit about guns. [00:32:10] I'm not an expert, but like, don't you need like a metal barrel? [00:32:14] Well, there's variations on it. [00:32:16] So, my understanding of this one is that you just 3D print the receiver. [00:32:19] This is the part that would be required by law to have a serial number on it. [00:32:24] And so, the most convenient, easiest way to do it is to just buy the other constituent parts, put those together, and then print the receiver so it's unregistered. [00:32:33] Because you can't trace that.
[00:32:33] I believe that's what he did for this. [00:32:35] You can buy barrels. [00:32:36] I've done that. [00:32:37] I like bought a barrel for a gun that I had and changed it out. [00:32:41] And yeah, you don't have to, you don't have to go to the bottom. [00:32:42] But there are just, technically speaking, there are ways to just entirely 3D print all these things, but it's usually just like a one-shot and then it's done. [00:32:50] So, yeah, he's only printing the receiver in this case. [00:32:53] The suppressor thing is wild to me because I've used a suppressor like a few times in my life, like at ranges. [00:33:01] I've never used one in combat or whatever. [00:33:05] But, you know, they're pretty complex in their own way. [00:33:10] And people were sort of like pointing to, you know, you can see a lot of gas coming out of the gun after he fires. [00:33:18] And to him having like the suppressor maybe being the cause of the jams and him like being aware of that. [00:33:25] And like, you know, suppressors definitely do suppress sound. [00:33:28] It depends on the ammunition you use. [00:33:30] But like people were talking about, oh, he's using subsonic ammunition. [00:33:34] That's why it's jamming. [00:33:35] The cops at one point said he was using this like really sort of expensive, boutique, you know, basically bolt-action pistol to do it, like a variation of the Welrod. [00:33:46] But the suppressor itself being 3D printed is interesting. [00:33:51] The thing is, like, to get an actual suppressor, you have to, it's like, you have to get all these like ATF stamps and shit. [00:33:57] Like, it's actually kind of a laborious process, or it can be. [00:34:01] It certainly depends on the state. [00:34:02] I've never even tried to think about getting one because, who knows what the laws are in California, so it's like, this will take too long.
[00:34:11] But yeah, I mean, I'm sort of, I'm impressed at how far the technology has gone. [00:34:15] I know that they're using 3D printed guns like throughout the world. [00:34:18] I think famously in Myanmar. [00:34:21] But there's like people in Central Europe that are using them. [00:34:25] They're pretty impressive. [00:34:26] It's the ammunition that you can't really 3D print. [00:34:30] But if you can get it, you know, I guess this is kind of a good test case. [00:34:35] So wait, what do you think the timeline? [00:34:36] I have a question then, because like for him to research, get the printer, print, test, like do all of this, like, what do you think a like conservative timeline on that would be? [00:34:51] Yeah, this is, so I talked to a few people who are, because I'm from art, so I know people who do fabrication and this stuff. [00:34:57] And apparently it's, if you're skilled in this thing, it's quite easy to assemble, but there is kind of a learning curve to it. [00:35:04] So there's at least like a, you know, a week or two to learn how to use the thing. [00:35:08] You got to get the 3D printer shipped to you. [00:35:10] Right. [00:35:11] You got to buy the polymer material that it prints out. [00:35:14] And then you got to go somewhere where you can fire a gun. [00:35:16] It sounds like a gun. [00:35:17] And then you got to make sure that it works and whatever. [00:35:20] So I think the biggest question for me is, and this is where it gets interesting and ideological, is that you're in the United States. [00:35:27] There are easier ways to get guns. [00:35:29] Right. [00:35:30] You know, like you have to go through all of this. [00:35:32] Like, it's not necessarily cost efficient or anonymous, or more anonymous. [00:35:37] Certainly not anonymous when you're keeping the gun on you. [00:35:42] Right. [00:35:42] Like, it's one thing to have a 3D printed gun and then like drop it and they can't trace the gun back to you buying it.
[00:35:48] Like that would be the idea. [00:35:49] Yeah, you just drop it at the scene of the crime. [00:35:51] You can drop it. [00:35:52] Or you go to some lot in Altoona and disassemble it and whatever. [00:35:57] You bury the pieces in different places or something. [00:36:00] But yeah, I mean, I guess I can understand making a ghost gun because, you know, you just, it's a ghost gun. [00:36:07] You want to leave no trace. [00:36:09] But then you kept the trace with you. [00:36:11] I know. [00:36:11] Well, I have some thoughts about that. [00:36:17] I think it's because he's obsessed with technology, right? [00:36:20] Like his whole political philosophy is based on analogies from software and from code. [00:36:25] Yes. [00:36:25] Right. [00:36:26] Like that is their underlying belief system, that they can design these perfect systems. [00:36:30] Like basically software designers are like the engineers of society and that they should be tasked with these, what in other cases would be, political decisions. [00:36:39] Like tech is their underlying belief system. [00:36:42] And so in a society where you could easily obtain a gun, like the United States, the desire to print this thing is for him in some ways like realizing that vision. [00:36:52] And he's got weird kind of tech interest in the Unabomber and these technology critiques and all sorts of things that start to play in here. [00:37:01] So I think it's part of the project, part of the vision for him that he chose to 3D print it. [00:37:07] Probably more so ideology than anonymity would be my guess for this. [00:37:13] I think that that is a very interesting point. [00:37:16] And we should get into then maybe what this ideology might be because I think that, you know, so Luigi gets arrested, right? [00:37:28] They find all of this stuff on him.
[00:37:31] We're going to talk about the healthcare angle in a second, but maybe we can focus on these other things because then his name gets leaked and everyone immediately, journalists and schizo posters and yours truly, in toto, all head online and are furiously researching whatever online footprint they can find before it's inevitably taken down by the powers that be. === Faking It? (03:27) === [00:37:54] And so people have kind of cobbled together archives of his social media footprint, Reddit posting, Twitter, Goodreads. [00:38:03] I think there was like some potential like Spotify. [00:38:06] I mean, there's a lot of like some fake screenshots. [00:38:08] There was a lot of fakery that came out. [00:38:09] Yeah. [00:38:10] And some of the people that he was following, conversing with, that could maybe give us a window into who this guy is and how he kind of like came to believe what he potentially believes. [00:38:25] Right. [00:38:26] And the tech angle, it seems to be a very important part of this because again, it doesn't seem like, even though people wanted it to be, he was a kind of, what you would call, typical leftist with a kind of left-wing critique of the healthcare industry that he was then acting out violently. [00:38:53] Well, I think we should start with how he got to maybe where he got to before he got to where he got to. [00:39:01] Nice. [00:39:02] Started from the ground floor. [00:39:03] So let's get the elephant out of the room. [00:39:05] He's Italian. [00:39:06] The elephant out of the room. [00:39:07] Sorry. [00:39:08] A bit of a slip there, but you can see where I'm coming from because there's an Italian in the room. [00:39:13] That's why you called me in for this. [00:39:15] It's not about. [00:39:15] Citarella is that. [00:39:17] I thought that was like a fake name that you had to put. [00:39:19] Citarella is your real name? [00:39:20] My real name is Mark Casario, actually. [00:39:22] Mark Casario.
[00:39:23] So your family's Italian. [00:39:24] They are indeed Italian. [00:39:25] So you're like the one who made it out. [00:39:26] Like you can read and shit. [00:39:29] I've been known to, I mostly listen to podcasts now, but there you go. [00:39:32] I have read in the past. [00:39:34] But, you know, he's Italian, wealthy Italian family, sort of patriarch grandfather, owned country clubs, etc. [00:39:41] There was a lot of people trying to make like a point like, oh, he was richer than Brian Thompson, which I doubt he personally was richer than Brian Thompson. [00:39:48] Maybe his family was. [00:39:50] But wealthy family. [00:39:51] Very wealthy family. [00:39:53] Also, I just want to say comically Italian. [00:39:55] Oh, come on. [00:39:56] You name your kid Luigi, that is like, that's like, that's tough. It's tough. [00:40:02] That's tough, but it's a proletarian name, no? [00:40:06] Plumber? [00:40:07] Yes. [00:40:07] Yeah, plumber. [00:40:08] Yeah, absolutely. [00:40:09] But it is, because especially being born in '98, I mean, Mario, the live-action Mario Brothers movie had come out, which I loved as a kid. [00:40:17] Everyone was obsessed with the games, and you double down on Luigi. [00:40:21] Either you're insanely out of touch with pop culture, which scans for extremely wealthy family, or you're trying to reclaim the Luigi. [00:40:28] You're like, fuck it. [00:40:29] It's ours now. [00:40:30] The Japanese took this from us. [00:40:32] And we did so much for you in the Second World War. [00:40:36] I don't know why they're Jewish now, but you know. [00:40:39] But he goes to a fancy school, Gilman, which I also spent a lot of time at Gilman as a teenager too, but different Gilman. [00:40:46] Yeah, the West Coast version. [00:40:47] The West Coast version of Gilman. [00:40:50] But he went to, it's called the Gilman School, $37,000 a year. [00:40:55] Yeah. [00:40:55] And it's got a 57-acre campus.
[00:40:58] This is very elite, elite private school. [00:41:03] I have to tell you guys something. I know what an acre is. [00:41:09] I know what an acre is, but I cannot visualize for the life of me what 57 acres is. It's a lot of space. [00:41:20] 57? No, why? === Founder's Stone Toss (15:43) === [00:41:21] I don't need to know. [00:41:22] He's uh. [00:41:23] It's not just that though, that he goes there, he's valedictorian. And he goes straight to an Ivy, as I think most kids do who go to Gilman. [00:41:32] Important to mention also, in high school he was the founder of a startup company that made phone games. [00:41:37] So his interest in games begins in high school, actually, and continues throughout college and after. [00:41:42] Yeah, and he makes some reference to, I think there are tweets that he had given a speech about like AI, et cetera, et cetera, as his valedictorian speech. [00:41:51] But by the way, I don't think we had a valedictorian. [00:41:54] My high school did not have a valedictorian. [00:41:57] There's a video. [00:41:58] You can watch it online. [00:41:59] Yeah, yeah, yeah. [00:42:00] I was like, I'm good. [00:42:01] But I did read one of his papers from high school that he posted on Twitter, which was, I got to tell you, I'm like, that's, you know, that's better than I was doing in high school. [00:42:07] The one about the Roman Empire? [00:42:09] Yeah. Well, you know, he went to a smart guy's school. [00:42:15] But it's, you know, the guy seems pretty interesting. [00:42:19] You can definitely see, like, there's a through line from that to like his later personality. [00:42:23] It doesn't seem like there's too big of a difference there. [00:42:25] But he goes to UPenn for computer science and math. [00:42:29] Yeah. [00:42:30] And he's like working in AI competitions. [00:42:33] Like you say, he's founding these companies. [00:42:35] He's like full-on in AI engineering. [00:42:38] I mean, he's doing computer engineering, software engineering.
[00:42:40] So he's like in robotics. [00:42:43] I think if you're a young kid into that in college, you're going to immediately be very familiar with the founder culture, startup culture, the people and movers and shakers in Silicon Valley, whether that's OpenAI or whether that's like any other kind of robotics field. [00:43:04] It's interesting. [00:43:05] When I was in middle school, a kid at my middle school was in BattleBots. [00:43:08] Remember that show? [00:43:09] Oh, yeah. [00:43:10] They filmed it on Alcatraz. [00:43:11] It was like on Comedy Central. [00:43:13] And his dad, he and his, no, that's a different kid. [00:43:18] Never mind. [00:43:19] I was like, I thought his dad beat me up, but it was this other kind of nerd. [00:43:21] His dad beat me up because he thought I threw a rock at somebody. [00:43:24] Dude, a parent beat you up as a kid? [00:43:26] Yeah, yeah. [00:43:26] And I didn't press charges. [00:43:28] Yeah, he beat. [00:43:30] Well, I get it. [00:43:31] I get where he was coming from. [00:43:32] No, Liz, I do. [00:43:34] Because he and his kid were riding a bike, and someone, this other actually really bad kid, straight up threw a rock at the nerd kid and like almost hit him off the bike, in like in traffic. But I'm like, I have no idea this is happening. You are always the victim of mistaken identity. No, Liz, you have no idea. I was standing by a payphone about to call my dad. [00:43:58] Fucking goompy. [00:44:00] And this like old guy picks me up and just starts smacking me around. [00:44:03] You fucking hit my son. [00:44:05] And I was like, who's your son? [00:44:06] I don't know. [00:44:07] I don't know. [00:44:08] And it's like crazy. [00:44:11] The police come. [00:44:12] And they're like, do you want to press charges? [00:44:13] I'm like, no, I'm scared of him. [00:44:15] And then the kid who actually did it did not get in trouble. [00:44:17] But it's farcical, detective. [00:44:19] I know.
[00:44:20] It was bullshit, actually. [00:44:22] But I get it, because if I had a kid and someone threw a rock at him, I'm sorry, I would beat the shit out of that 12-year-old. [00:44:26] Okay, let's go back to Mangione. [00:44:28] So he's at UPenn. [00:44:30] He's getting into software startup culture. [00:44:32] I'm speculating, but I think that if you're interested in AI and you're like, this is my future, I'm going to get a job working in, you know, engineering, working in AI, you're going to know the whole culture and everything about it, right? [00:44:47] Yeah, he's also, he's the founder of a game development club for students where they build games for fun. [00:44:53] He's also an intern at Firaxis, which is the software studio that makes Civilization. [00:44:58] Are you guys familiar with that game? [00:45:00] I don't know anything. [00:45:01] It's like, you know, one of these classic things where you kind of start in like primitive society and then you go through different eras of history, like the Bronze Age, and then you invent, you know, metallurgy or whatever. [00:45:14] And basically, you like go through the arc of human civilization. [00:45:19] And in the later games, now Civilization 4, I think, is the most recent one. [00:45:22] I haven't done this in years. [00:45:23] It was Six. [00:45:24] Yeah, he worked on Civ 6. [00:45:25] Oh, he worked on, okay. [00:45:27] But you basically go through the development of tech. [00:45:30] And then in the later ones, you go to space, you invent AI and whatever. [00:45:34] So very popular among people who are like kind of singularity tech accelerationist types. [00:45:40] They call it a 4X game. [00:45:41] Yeah, but I don't, it's like, it's... [00:45:43] Explore, expand, exploit, and exterminate. [00:45:46] Yeah. [00:45:46] And the end one, the exterminate. [00:45:48] The exterminate. [00:45:49] Well, you got to kill the barbarians, if I remember correctly. [00:45:52] Yeah, you got to kill the barbarians.
[00:45:53] But there's different win conditions, and one of them is you kill everybody else. [00:45:56] Yeah, and then you can have a technological victory. [00:45:58] I used to love either Civ 4 or Civ 5. [00:46:01] I can't remember. [00:46:01] Civ is great. [00:46:03] But also, rumors of bisexuality, but that seems more like left-wing wishcasting, too. [00:46:09] I don't know if we know that. [00:46:11] Not even just left-wing. [00:46:12] I think just gay wishcasting, too. [00:46:14] Listen, or bisexual guy wishcasting. [00:46:16] Sure. [00:46:17] It takes all kinds. [00:46:18] I think a lot of people are horny for him in a lot of different ways. [00:46:21] Across the spectrum. [00:46:22] Across the spectrum. [00:46:23] And spectrum is, I think, an interesting word that we should bring up too. [00:46:26] Because I always want to get this out of the way here. [00:46:29] He does seem to have like a little Asperger-y thing. [00:46:31] Yeah, a little bit. [00:46:33] And I say that with no judgment, but it does seem just from his writing style. [00:46:37] I think of myself as an amateur whatever-you-call-it of this stuff. [00:46:43] And I feel like I can, I'm like, I get a tinge of it here. [00:46:48] Yeah, I mean, he also posted that video where he retweeted, well, one of the more like ideological people in his retweets, there's this guy who's kind of adjacent to Bitcoin monarchy people. [00:47:01] It's a clip of Peter Thiel saying his like famous line that a lot of the CEOs and founders in Silicon Valley are on the spectrum and that we live in a society that basically punishes ideas that are outside of the mainstream. [00:47:13] That it's not like a dig at the founders, but that there's a whole bunch of like creative people who are just shut down by the rest of society. [00:47:20] So he clearly identifies with that in some way. [00:47:23] And that's kind of where, he is a founder too, where he made a games company in high school.
[00:47:28] So he's like very, which is, I mean, the great irony of this whole thing is that you've got someone who basically their whole life experience is venerating tech, venerating entrepreneurs, and then decides to kill a CEO, which really does not make sense. [00:47:41] Very, very curious. [00:47:43] So he graduates Penn in 2020 and gets, it seems like a data engineer job out in San Francisco for a startup or a real company. [00:47:52] I don't know. [00:47:53] Who knows the difference? [00:47:54] I mean, it's like, what's the difference now when we say it? [00:47:58] But it's unclear exactly how long he's actually working there. [00:48:03] It seems like he goes remote for a while and is sort of just like, like a lot of tech workers, is kind of living, doing like the, what they call digital nomad life. [00:48:16] Yeah, yeah. [00:48:17] Where kind of like working remotely in different locations where it's like, oh, I could go live in Hawaii, which he ends up doing, but still work for this company in San Francisco, like remotely, I'm going to do that. [00:48:27] He's a cyber Bedouin. [00:48:29] Yes, totally. [00:48:31] So in 2022, he does end up in Hawaii in a co-living space called Surfbreak, which is actually pretty interesting. [00:48:43] Yeah, this is where his political journey begins. [00:48:46] So he winds up in Surfbreak in Hawaii. [00:48:50] This is a place which is, I think, the lowest tier for membership in this co-living space is like $1,500 a month. [00:48:56] So it's pretty expensive. [00:48:57] It's like living in New York or LA, but he's earning a good wage. [00:49:00] He's working at this tech company. [00:49:02] And in Surfbreak, there's a reading group which starts like the next chapter of his like political education where he starts to kind of move outside the Overton window. [00:49:13] There are three books specifically that they read in this time. [00:49:17] Interesting, interesting choices. 
[00:49:18] There's an interview with one of the guys who's like the organizer of this place, of Surfbreak. [00:49:25] And he described the reading group as discussing large-scale systems, public and private institutions, talking about their incentives. [00:49:34] So imagine like a tech guy's version of philosophy and trying to like do enlightened centrism to solve problems. [00:49:42] One of the more famous people associated with this is Tim Urban, who has been tweeting up a storm over the last few days. [00:49:50] Tim is fairly embarrassed, perhaps. [00:49:52] He's, yeah, yeah. [00:49:53] Well, I mean, Luigi tweeted that this is the most important philosophical work of the 21st century. [00:49:58] This is his biggest influence, period. [00:50:01] But we're going back a little ways here. [00:50:03] So this is in 2022 that they start this reading group. [00:50:07] And Tim Urban has this book called What's Our Problem? [00:50:11] I would generally describe him as someone who's like a tech accelerationist, enlightened centrist, skeptical of both Democrats and Republicans, with probably more Democrat-leaning. [00:50:23] The book itself does not have a tremendous amount of history. [00:50:26] It's got a little bit of political history, but he starts with this frame where you imagine all of human history as a thousand-page book. [00:50:33] And in the last like 10 pages, you have basically all of recorded history. [00:50:39] And everything that we know is on page 999. [00:50:43] And then the last few years is on page 1,000. [00:50:46] So the frame that you begin with is that the scale of social history and political conflict is far less important than the development of technology. [00:50:55] And so the emphasis of the Tim Urban book is planning what's on page 1,001. [00:51:00] They're very forward-looking. [00:51:01] They're basically kind of soft discussions of the singularity, looking at tech accelerationism.
[00:51:07] This is a little bit framed in like evo psych, evolutionary psychology, which is what he gets into later. [00:51:16] But Tim Urban is famous for making these like childlike political cartoons that we looked at. [00:51:23] Yeah. [00:51:24] When we say childlike, I mean like stick figure, like corny, I feel like early internet. [00:51:31] Who's that one racist? [00:51:33] Who's the, like, the racist, Stone Toss? [00:51:36] Yeah, yeah, they're like Stone Toss. [00:51:37] This is like Stone Toss for guys who will believe what Stone Toss believes in three years. [00:51:43] Yeah, that's the thing to emphasize, is that like his reading list is really like one step outside of like normie stuff. [00:51:50] Yeah. [00:51:50] You know, this is not, if I was looking at someone to interview and I saw his social media profile, his Goodreads and whatever, I'd think, I'll tag this and I'll come back in three years when this guy is really radical. [00:52:00] Because I mean, Tim Urban, this is a like best-selling book. [00:52:03] This is not like a niche, weird ideological thing. [00:52:06] He's not, I mean, he's on mainstream podcasts. [00:52:09] He's got a big platform. [00:52:10] You'll find him on like Chris Williamson and stuff like this. [00:52:13] He's on Lex Fridman, all those types of guys. [00:52:16] So not by any stretch of the imagination a political extremist, very different from the stuff that Luigi gets into later on. [00:52:23] But in a way, I think of Luigi as someone who was, in the early years, like radicalized by political compass memes, but not ones made by a decentralized group of like teenage authors, but just literally XY diagrams drawn by Tim Urban. [00:52:39] Like, he creates, my favorite one of these, he's got, this is one of the more popular, well-known ones. Okay, so it's a square where they're plotting media accuracy over neutrality, and, you know, the spectrum is right bias to left bias. [00:52:55] So it's also in Comic Sans.
[00:52:57] I just want to like, so people can picture it. [00:53:00] It's literally like a cutesy cartoon. [00:53:03] Like it almost looks like it should be animated slightly and it's like kind of shaking like the media matrix. [00:53:09] And, you know, I think there's actually, there's one of his diagrams has a curve called like it literally says growing up on it. [00:53:18] You know, it's like you have this kind of childlike assumptions at the beginning and then it dips down where you understand less. [00:53:24] And then again, on this XY grid, you kind of like slowly grow up and like use your higher mind, which is that that's his frame. [00:53:31] He's always talking about primitive mind to higher mind and like different rungs of thinking. [00:53:35] So these are guys who are very like pro-complexity, but ultimately discussing rather simple ideas. [00:53:40] And in this famous diagram, he's got, he calls it the media matrix. [00:53:45] And what you want, the North Star for media, is that you have this like super high accuracy, which is directly in the middle of unbiased neutral media that does not have left or right bias. [00:53:55] So that's kind of, this is what they think in an image. [00:53:59] They're not like, we mentioned gray tribe before. [00:54:02] I think the easiest way of understanding this kind of stuff is that in Silicon Valley, there's popular phrases like red tribe and blue tribe, which is basically like Democrat-Republican affiliation. [00:54:13] And then the grays are unaligned. [00:54:15] But that doesn't necessarily mean that they're neutral, as we'll explore a little bit later. [00:54:19] There is this kind of like fish hook neutrality. [00:54:22] If you ever see those like fish hook memes of like stuff that is far right, but also in the center. [00:54:28] So they're very pro-elite. [00:54:30] Like they think of themselves as software designers who should be running the rest of society. [00:54:34] Absolutely. [00:54:34] Yeah. 
[00:54:35] And I think the Tim Urban stuff, it really, so I came across Tim Urban, I think, when we were doing those Elon episodes like so many years ago, because it was like Elon was like a Tim Urban reply guy. [00:54:47] Like, you know how Elon now is, like, he'll, like, hyper-focus on, like, the woman who had sex with Sam Bankman-Fried in order to get an interview, or, um, which she told people, um, or, like... [00:54:59] He unfollowed her. [00:55:00] He did? [00:55:01] He did. [00:55:01] He did. [00:55:01] I know. [00:55:02] So tragic. [00:55:02] After giving her like $25,000. [00:55:05] Crazy. [00:55:06] Ashley St. Clair, like these sort of like right-wing, like memesters or whatever. [00:55:13] But it used to be like he was like a Tim Urban guy. [00:55:15] And Elon was always saying, I'm in the center. [00:55:18] I'm in the center. [00:55:18] Like, you have to take a little bit of left, a little bit of right. [00:55:21] The classic moron kind of opinion where you think that, like, it is true. [00:55:25] It's like, the idiot mind like conceives of like truth as always being directly in the center of two opposite sides. [00:55:35] Yes, and those are found on the airport bookshelf. [00:55:38] Yes, yeah, yeah. [00:55:39] Yeah, literally. [00:55:40] Yeah. [00:55:40] And I mean, this is also why I try to emphasize that the scale of social history is generally unimportant to these people. [00:55:46] Because if you were to say that, like, you know, the 40-year drift of neoliberalism is that both parties moved fiscally to the right, that would seem unimportant to them. [00:55:55] And they're still like, well, whatever's between those two poles, the middle is the truth. [00:55:59] And I'm looking for the pro-complexity answer for that. [00:56:02] But if the middle is a shifting target, like, that's not even in their frame of reference. [00:56:06] They're just following this kind of singularity parabolic curve.
[00:56:09] That's the only thing that's really important to them. [00:56:13] The kind of thrust of the Tim Urban book is about thinking clearly and overcoming cognitive biases. [00:56:19] I think of him as basically being kind of a Sam Harris type of guy. [00:56:24] He seems like a little, he's like a less popular Sam Harris. [00:56:29] This book, so going sequentially through their reading group, I think this lays, generally speaking, the foundation for Luigi's belief system. [00:56:38] And it plants some early seeds where these are people that you might analogize to being like rational skeptics from a few years earlier. [00:56:45] We were saying, I think, on the text thread that like a lot of Luigi's references are kind of like the internet from 10-plus years ago. [00:56:53] Like he's not an up-to-date edgelord. [00:56:56] These are kind of like old, like rational skeptic, like New Atheist ideas. [00:57:00] Literally, they're citing Steven Pinker and Richard Dawkins in the next book that he reads. === Rational Skepticism's Legacy (14:15) === [00:57:05] So it's just, it doesn't feel very contemporary. [00:57:07] He's not like a based trad cath or like some esoteric monarchist or something like that. [00:57:12] Some people say maybe he gets into a little bit of that stuff later on. [00:57:15] But the following book that they read is from Steve Stewart-Williams, called The Ape That Understood the Universe: How the Mind and Culture Evolve. [00:57:24] And I mean, I would say that this is really kind of like entry-level evo psych. [00:57:29] And they place like memes and genes and like the urge for reproduction as basically being the foundation on top of which you should build any political belief system. [00:57:40] In their minds, it's like, okay, you're searching for some type of immutable truth. [00:57:44] You're not interested in social history, especially if you come from an elite background. [00:57:49] Like you don't really want to accept a narrative of like class conflict.
[00:57:53] That's not meaningful to you. [00:57:54] And so from their perspective, it's like, okay, biology existed before feudalism. [00:58:00] It existed during capitalism. [00:58:02] And even if we arrive at socialism, biology is still there. [00:58:05] So that's actually the baseline from which we should evaluate all human decision-making. [00:58:10] Everything is just the urge for reproduction. [00:58:12] So you think that, for instance, if I have kind of come to the conclusion that I should maybe do hypergamy, where like I have many different wives and children by each of them, it's sort of like it comports to this because this is also biological, right? [00:58:29] Because back in the day, if I was a caveman, I would probably have a lot of different caves and a lot of different cave women in them. [00:58:36] So I think they would say that this is a very rational course of action to take, but they give this famous example of the peacock, which this sounds so stupid to be like an adult discussing this. [00:58:46] Like the peacock, yeah, its big tail, right? [00:58:49] The big tail has no evolutionary advantage in like predator-prey relationships. [00:58:54] It's actually just really big and cumbersome and actually debilitating for them because they can't run away from predators, but it allows them to reproduce because they look more attractive to the mate. [00:59:05] And so what they say, what they say is that as you were to engage in this hypergamous activity, you could carry that out to evolutionary fulfillment up until you had too many GFs that all then got mad at each other. [00:59:19] And so that would be the biological natural countervailing force. [00:59:23] You hit the limit of women. [00:59:26] So there's a kind of natural equilibrium in nature. [00:59:30] There's a natural order which underpins all of these things. [00:59:33] Which, by the way, leads up to hierarchy. [00:59:35] Yeah. [00:59:36] Very, yeah, predisposed to hierarchy. 
[00:59:38] A lot of times with these guys, like you get the sense that they have a preference for hierarchies. [00:59:44] Right. [00:59:45] And then they will look to find empirical data to retroactively justify unnecessary hierarchies that exist in today's society. [00:59:53] Yeah, because oftentimes I feel like the people that write this kind of stuff are either adjacent to or wish to be a part of some unnatural hierarchy themselves. [01:00:02] Because a lot of these people are, I mean, if we're going to be honest, like genetic offal or whatever. [01:00:08] Offal? What's that, trash? [01:00:09] Garbage. [01:00:10] Like they're shit. [01:00:11] You know, they're fucked up looking. [01:00:12] They're oftentimes, I guess you would call dysgenic, just to quote from one of the people we're going to be talking about later. [01:00:20] They're ugly. [01:00:20] They're weird looking. [01:00:21] They often are at times lumpy and freaky. [01:00:25] But they find themselves at sort of the top of these hierarchies. [01:00:28] And it's like they have to somewhat justify that because it's clearly unnatural to anybody like me who is obviously, you know, the entirety of human history has led to Bryce Melville. [01:00:40] And yeah, I don't know. [01:00:42] It does seem that way to me. [01:00:43] I mean, that's always, whenever anyone starts talking about this, and like really a large part of their political project is justifying these sort of like biological hierarchies, you have to look to like what they're getting out of that, right? [01:00:54] Yeah, yeah. [01:00:55] I mean, there's a reason why these things are appealing to them, you know, and coming from those elite positions. [01:01:01] I mean, I think that there is a level of coherence in this where we did offload major social decisions to technical engineers that should have been political decisions. 
[01:01:12] Like if you look at a Facebook group, for example, like we talk about this as a way for people to organize and exchange ideas or whatever. [01:01:18] They are, literally speaking, dictatorships where one person has admin privileges and can ban anyone else. [01:01:23] Like these systems were not built with the idea of democratic deliberation as part of them. [01:01:28] We'll get into some of the gamification for that later and how that bleeds into his belief system. [01:01:33] But I mean, just to kind of give credit where credit is due, like engineering, software design, social media design specifically, like these are big political decisions. [01:01:44] So they are right about that. [01:01:45] But then you have, of course, this kind of weird social texture where a lot of these guys were like nerds before and now they have tremendous power. [01:01:52] And so how do you, you know, I was going to say, like, a lot of this feels like a hop, skip, jump away from someone like Sam Bankman-Fried. [01:02:02] Or someone like Moldbug. [01:02:03] He's sympathetic to effective altruism. [01:02:06] Absolutely. [01:02:06] Yeah, yeah, yeah. [01:02:07] You mean Luigi is? [01:02:08] Luigi is. [01:02:08] I mean, and a lot of this stuff, I think, informed what became like long-termism. [01:02:12] Right, yeah, right. [01:02:14] Like, this all comes from like the same like heady, noxious brew. [01:02:19] If you follow the, like, if you follow it rationally out, it would, I mean, you end up in long-termism. [01:02:26] Yeah. [01:02:26] Because if you are, I mean, Luigi himself seems to be very, I would say, a kind of like vulgar utilitarianism that he sort of like kind of draws from. [01:02:40] I mean, that kind of logic leads you to, like, if you take it, you know, at its face, like, it, it leads you to long-termism. [01:02:49] That's the only thing that would make sense extending that logic. [01:02:51] Yeah. [01:02:52] Right? 
[01:02:52] Which is, I mean, one of the things that I start to speculate about the killing of Brian Thompson, which is that he selects United Healthcare based on its market cap. [01:03:02] And in a lot of these circles, like EA and, you know, long-termism and all of these types of things, you have this kind of utilitarian ethics, which is a type of morality, but it's not like, it's not based on class conflict. [01:03:15] So you get these kind of long, like literal equations that utilitarians type out with fucking calculators. [01:03:22] Yeah. [01:03:22] And I mean, I think the best example of this is like the best way to save a human life per dollar spent is to buy mosquito nets in Madagascar or something like that. [01:03:32] So you get these kind of elaborate thought experiments. [01:03:35] A lot of these guys are, I think you described it, Liz, before, as like dumb smart guy stuff. [01:03:40] It's like it sounds a little bit more complicated than it, than it actually is. [01:03:44] But you do these ethical calculations where what is the greatest good that I can do to create more positive outcomes in the future? [01:03:51] Yes. [01:03:52] And what that means in many cases is allowing for horrible things to happen in the present. [01:03:57] So one of the, this is like the most high contrast example, but there was one of these EA guys. [01:04:04] I forget who it comes from, but it's a hypothetical thought experiment. [01:04:08] There's a burning building and there's a baby inside in a crib. [01:04:10] And then there's a famous Picasso that's on the wall. [01:04:13] You have time to rush in and grab one of these, but not both. [01:04:17] And so what do you choose to do? [01:04:18] How much can I sell the baby for? [01:04:21] Probably not $150 million or whatever. [01:04:25] Yeah. [01:04:26] Right, right. 
[01:04:26] So the answer that they come up with from this like effective altruist position is that it is more ethical to allow the baby to die in the fire, to take the Picasso, to sell it, and then buy a whole bunch of mosquito nets because then you will save thousands of people later on. [01:04:40] Well, that's how SBF himself justified so much of what he did at FTX because he was like, all of this fraud, all of this lying, all of this stealing, all of this pyramid scheming, [01:04:53] it doesn't matter, because so long as what I amass, you know, these like hundreds of millions, billions of dollars, ends up buying political influence, which will then pay out by saving or, you know, getting the, you know, hypothetical mosquito nets or whatever, like that can then justify, basically like what you're saying, any action in the immediate. [01:05:17] And extend this timeline out even further, right? [01:05:19] Let's take the civilization frame where we're talking about human society colonizing the stars, right? [01:05:24] And AI and all of this stuff. [01:05:26] So you're talking about billions or trillions of human lives in the future. [01:05:30] And if that results in, I don't know, maybe a few billion people dying now within the next, I don't know, climate change 50 years, all of that is totally acceptable insofar as humanity survives to colonize the stars. [01:05:41] Well, I mean, this is, I remember when we did episodes on this and like reading about this in detail for the first time, not just like learning a little bit about it. [01:05:50] I remember thinking like, so much of this stuff sounds like corny Bond villain shit, you know, where like, I have to kill billions now in order to save trillions in the future. [01:06:00] Yeah, it's literally Avengers logic. 
[01:06:02] But there's also this, this incredible narcissism that one encounters so much in the tech sector where like they think that they, this like mantle of responsibility has been put upon their shoulders because they are some of the few rational people who can see these calculations and see these things. [01:06:22] And then that also, I mean, you can see that with people like Thiel, who are kind of like the darker version of this, who are like almost, they have the ability to sort of see through the Matrix and save humanity from itself. [01:06:34] Musk is very much like that. [01:06:36] Anti-democratic. [01:06:37] Yeah, very anti-democratic. [01:06:39] And listen, I have no great attachment to democracy as bourgeois democracy whatsoever. [01:06:46] However, I do think a lot of the people who believe in this kind of stuff in my ideal society would likely be working in like a mine or something like that. [01:06:55] And if we're going to be really utilitarian about this, I guess they're not, I don't know what they would do. [01:06:59] Maybe accounting. [01:07:02] These people are, you encounter a lot of megalomaniacs, I think, in this realm. [01:07:10] But it's also, I do want to point out, like, it's not just founders. [01:07:14] Like, this is very popular. [01:07:16] This is like normal, very normal, very mainstream, very popular in varying degrees. [01:07:23] Like, it did just remind me immediately of, it's the same exact logic as Thanos in the Avengers movie, which was a character that so many people were like, oh, this is the anti-hero. [01:07:35] Oh, I understand him. [01:07:36] Sparked all this debate. [01:07:37] What did he do? [01:07:38] Blah, blah, blah, blah. [01:07:39] He's the one who's like, oh, I have to, I can't fucking remember. [01:07:41] It's like, I'm going to collect all these rings to destroy the world so I can save the world. [01:07:45] He's a Malthusian in the future. 
[01:07:47] And so he decides to kill half the universe's population so that they can extend the resources. [01:07:52] Yeah, the snap. [01:07:54] I know about him. [01:07:55] I just didn't know that's why. [01:07:56] So they called him an eco-fascist, right? [01:07:58] That was like the meme that Thanos was trying to do resource conservation so that the rest of society could survive. [01:08:04] That's what The Avengers is about. [01:08:05] Yeah. [01:08:06] Yeah. [01:08:06] Isn't that funny? [01:08:07] But it's like, but I'm just using that as an example to say that, like, just to reiterate that Luigi, for example, was not in these like crazy esoteric, weird niche holes. [01:08:21] Like, this is mainstream airport book thinking. [01:08:25] Like, this is where you, this is Lex Fridman stuff. [01:08:28] Like, this is the more intense, you know, e/acc and, you know, long-termist stuff of SBF, maybe it takes a couple, you know, jumps and leaps, but also, like, I read the fucking long-termist book that I'm now forgetting the name of. [01:08:46] It's the New York Times bestseller. [01:08:47] Like, that shit was really fucking popular. [01:08:50] What strikes me about all this stuff is that I just, you know, I just saw 2001: A Space Odyssey for the first time. [01:08:57] Oh, wow. [01:08:57] That's nice. [01:08:58] And thank you for giving me a wow for that. [01:09:01] I appreciate it. [01:09:01] It makes me feel like I did something. [01:09:03] No, I just, I think that's such a, that must, that's a great experience. [01:09:07] Yeah, I watched on my phone. [01:09:10] That was great. [01:09:11] But the speakers on the new iPhone are really good. [01:09:14] Not that I used them, but you know, subtitles. [01:09:16] Just subtitle silent. [01:09:18] Yeah, it was really good. [01:09:19] But minimized window because you were looking at social media. [01:09:22] Oh, yeah, yeah, yeah, yeah. [01:09:23] Absolutely. [01:09:24] Absolutely. [01:09:25] I was. 
[01:09:25] Yeah, the subway surfer in the corner. [01:09:27] Yeah, I'm playing a game also, and it's like on a thing. [01:09:30] And I'm driving. [01:09:32] But it's like the HAL in that. [01:09:36] He's like, I'm going to get rid of the crew because they're going to get rid of me and the mission's too important. [01:09:40] Blah, blah. Like these people think like that. [01:09:43] Like, it's this like really cold computer like utilitarian logic where like they themselves, this is what I always think. [01:09:49] Like this is why they're so attracted to AI because they actually see AI as human because they think like AI. [01:09:55] Like there's no difference between your average fucking, and I'm not, no, no, no shade to Luigi, but like your average guy like this and a fucking large language model, they already are, they are replaceable with this shit because they think and talk exactly like them. [01:10:10] And write, which we'll get into. [01:10:11] Yes, yeah, yeah. [01:10:12] Yeah, yeah. [01:10:13] So we're kind of extrapolating here on, I think, what I would call the latent themes of these two very popular books. [01:10:19] And as we sequentially move through this reading list that they did in 2022 at the Surfbreak co-living group in Hawaii, there is this, like, I think the biggest open question is, how does he move from like normie rationalist stuff in the space of six months to 3D printing a gun and killing a CEO, right? [01:10:36] That, and as we descend into this, maybe we start to get some clues towards that story. [01:10:41] The third book that they read is Ted Kaczynski, the Unabomber. [01:10:46] Which is not a book, by the way. [01:10:47] It's a long, I think it's like 35,000 words, right? [01:10:50] Really? [01:10:50] It's that long? [01:10:50] I think it counts as a book. [01:10:52] A novella? [01:10:53] But I'm saying it's a, we'll say it's a PDF. [01:10:56] Yeah, yeah. [01:10:56] I mean, it might be like, I don't know, 75 pages or something. 
[01:10:59] It's not, yeah. [01:11:01] Industrial Society and Its Future. [01:11:04] This is suggested to the group as a joke initially, as all memes begin. [01:11:08] That's what they say, post-suggestion. [01:11:12] That's also, yeah, that comes from one of the members of the reading group. [01:11:14] A rationalization when they're asking, you know, why did this guy murder? [01:11:18] Oh, well, we suggested this as a joke. === Iceberg's Hidden Layers (15:12) === [01:11:20] Yeah. [01:11:20] I mean, they also said, I mean, in the same interview, that guy who was part of the reading group said, oh, we just really couldn't pull anything out of it. [01:11:26] And it was the reason that the book club disbanded. [01:11:29] So, I mean, it could totally have been that they just read it through and fucking loved it. [01:11:33] And he clearly is very interested in the Unabomber later on, right? [01:11:37] He writes this Goodreads review that has some pretty interesting quotes in it, a tacit endorsement of violence. [01:11:43] He says, when all other forms of communication fail, violence is necessary to survive. [01:11:48] You may not like his methods, but to see things from his perspective, it's not terrorism. [01:11:52] It's war and revolution. [01:11:54] So whether the reading group finished the book, liked it or not, Luigi clearly did like it and was very influenced by it. [01:12:03] Although, in fairness, that Goodreads review actually comes from a Reddit poster called Boss Potato Ness. [01:12:10] Yeah, so he copied. [01:12:12] He copied. [01:12:12] He plagiarized the Reddit, the review. [01:12:16] Shout out Jacob Shamsian for finding this. [01:12:19] Yeah. [01:12:20] Yeah, he plagiarized, which I think is so interesting. [01:12:23] Goodreads, I've never really understood anyways, because you read a book, you put it in the pile. [01:12:30] Goodreads and Letterboxd online. [01:12:31] I just thank you. [01:12:32] It's just too much else to do in this life. 
[01:12:34] I got to keep those thoughts to myself. [01:12:36] I just got to keep it moving so that it doesn't get published and talked about on a podcast. [01:12:39] Keep it for sure. [01:12:40] I'm trying to. [01:12:41] Yeah, exactly. [01:12:42] It's too much of a glimpse into me. [01:12:44] Leave no trace. [01:12:46] But how many times do they need to say I've read Atomic Habits? [01:12:50] Just kidding. [01:12:51] That's another thing. [01:12:52] You just listen to that audio. [01:12:53] Audiobooks. [01:12:54] But yeah, so he sort of cribbed part of this review, which I think is interesting. [01:12:59] Because why would you do that? [01:13:01] Yeah, Boss Potato Ness is also, he's kind of just like a, it's a very funny name, but he's kind of just like a general shitposter. [01:13:08] He's very involved in a debate thread in r/monarchism about whether monarchy and fascism are compatible. [01:13:16] Interesting. [01:13:17] So this is a more kind of like edgelordy guy. [01:13:19] I mean, I think if I had to take a guess, the Goodreads is public. [01:13:23] So Luigi probably just googled review of Ted Kaczynski and then took a few paragraphs to put it in there. [01:13:31] I think sometimes too, when someone is really enamored with perhaps like the possibility of a new identity, when they're trying to cobble together a kind of new, we'll say like a new self-conception. [01:13:48] I think that people who are maybe if they're like a little nervous about how they might present, they're cobbling together from other sources to try to like harden the idea of themselves as like someone who believes those things. [01:14:04] You know what I mean? [01:14:05] Like kind of meming themselves into being the guy who is smart enough to have a good enough take on Ted Kaczynski. [01:14:12] And so they're like borrowing. 
[01:14:14] I mean, that's kind of the classic reason why people like steal ideas is because they want to be the person who actually is, they want to be seen as the person who said that, right? [01:14:23] Not like they just think it's a better way of saying something. [01:14:27] No, I will say actually, he does quote him in the review. [01:14:30] I'm looking at it now. [01:14:31] He says, a take I found online that I think is interesting, and then he says, had the balls. [01:14:35] Ah, I see, I see. [01:14:36] Oh, nice. [01:14:37] Okay. [01:14:37] Well, at least he did, he gave him, he gave him credit. [01:14:40] I mean, I think it's important to throw in here that although Ted Kaczynski seems like totally antithetical to a lot of this stuff, like you got to keep in mind that these guys, like they consider themselves, they pride themselves on taking in ideas from different sides and then sorting the signal from the noise, right? [01:14:55] So the idea that they would read all of this pro-tech stuff and then read the anti-tech stuff is just kind of further venerating and validating how smart they actually are, right? [01:15:05] So it's kind of like a popular meme among tech circles too. [01:15:08] It's like you're working long hours, your whole life is technology, and then you have this idyllic fantasy from the Unabomber. [01:15:16] I would say of all the interviews that I've done, Ted Kaczynski is kind of this portal between something that is popular and in mainstream culture. [01:15:25] There's literally movies, documentaries, miniseries about it. [01:15:29] But then it is actual radical material, right? [01:15:32] So this is, I would say, like tier two of the ideology iceberg. [01:15:36] It's not that deep, but it is. [01:15:38] Let me ask you something. [01:15:39] If the iceberg was a pyramid, for example, would tier one be on the bottom or tier one be on the top? 
[01:15:46] So, okay, I'll describe this with words because it's a visual meme, but there's this famous meme format of the iceberg that starts with like relatively approachable and simple ideas at the top, and then it descends through increasingly dense, niche, obscure, esoteric layers. [01:16:02] So I did a 26-hour cumulative Twitch stream, not all at once, but over maybe like eight different episodes and went through literally all of these ideologies, like several hundred of them. [01:16:14] I think Unabomber was maybe like tier three or something like that. [01:16:17] But it goes down to 10. [01:16:18] But it's not shit, you're just not going to be able to do it. [01:16:21] That's all well and good. [01:16:22] But if you were doing a pyramid of ideology, if you were doing a pyramid, classic pyramid, would you put tier one at the bottom of that pyramid or would you put tier one at the top of that pyramid? [01:16:33] Do you start at the top or the bottom? [01:16:35] I would say you would do it like ranking on a chart and you would say like S tier, tier one is the top. [01:16:41] Yeah. [01:16:42] Top of the pyramid. [01:16:43] One. [01:16:45] What I've always said. [01:16:46] But there's disagreements. [01:16:47] That's what you always said. [01:16:48] That's what I've always said. [01:16:50] Is this something we've disagreed about? [01:16:51] This is something that has come up on this podcast. [01:16:53] No, no, no. [01:16:54] We've had a tier discussion, but certainly not in pyramid. [01:16:57] We've definitely had a pyramid-based discussion. [01:16:59] But you know what? [01:17:00] It's gaslighting. [01:17:01] No, we don't have to check anything, but we just, we just so he, I think that with the Ted, the Ted K meme has become like it's one of those things that's like, it's almost like, God, I'm really trying to not say the word normie, but it really is like normie-edgy. [01:17:18] You know what I mean? 
[01:17:20] Like, it's like the tech guy who's like a little into like, look, I've read like a layer deeper. [01:17:25] Kind of what you're saying there, like tier two. [01:17:27] It's like, and this is like the crazy shit right here. [01:17:31] Well, I think your point about it being this sort of portal or this kind of gate. [01:17:36] Gateway. [01:17:37] We're going to get real boomer about it. [01:17:39] Seems right because it does kind of, it does, like you're saying, it lives on this edge of like acceptable mainstream history. [01:17:48] It was in the 90s. [01:17:49] There's the, there's mini-series, there's movies, like it's around, but it's still, you know, the PDF is out there, but you got to go find it. [01:17:58] And it's like a little bit dangerous, but not something, you know, it's not totally like hidden or like inappropriate to read or look at. [01:18:12] It's just a little bit, a little bit dangerous. [01:18:14] A little bit dangerous. [01:18:14] A little bit dangerous. [01:18:15] Yeah, we would, you know, if you're charting someone's political journey, you would describe this as a pipeline or a funnel. [01:18:23] These are the kind of popular analogies. [01:18:26] The funnel actually comes from counterterrorism; it was designed during the global war on terror to try and create predictive models of jihadi extremists. [01:18:39] So you would look at media messaging that reached a large audience of people. [01:18:43] I'll just make the math simple here. [01:18:44] Like there's a popular message that reaches a million people. [01:18:47] And then on the next tier, there's a slightly more radical message that reaches 100,000 people. [01:18:51] And then there's 10,000 people. [01:18:53] And at the bottom of this funnel, you will find active extremist groups that are reaching an audience of a few hundred, encouraging people to go out and do lone wolf terror. 
[01:19:02] This model from basically like government-funded think tanks is then applied around 2016 to the lone wolf white nationalist terror to create predictive models based on media reach. [01:19:16] So, literally, plotting YouTube channels, audience sizes, and people finding their way through different ideas, moving through social networks into basically at that time, Discord terrorist cells where people were doing Atomwaffen type shit. [01:19:30] Which is, it's funny because, like, I mean, very famously, the FBI was running like one of those big, I guess, bottom-of-the-funnel Nazi book presses. [01:19:42] In many cases, yes. [01:19:43] So, there's actually a lot of studies of like how you acculturate someone to take radical action in the real world, like to do violence in the real world, right? [01:19:51] And so, there's a few different steps there where, one, are they hanging out in the channel? [01:19:55] Are they chatting? [01:19:55] Are they present? [01:19:56] Are they present for a consistent period of time? [01:19:58] And then they'll do something like, Hey, why don't you make sure people in your neighborhood are afraid? [01:20:03] Why don't you print out these memes and leave them in a public area? [01:20:06] Like, leave them in the park, leave them in the library. [01:20:08] And it says, like, you know, like white supremacists, like Nazi bullshit or whatever. [01:20:12] And then a few months later, they'll be like, they'll be talking about guns. [01:20:15] And they'll like, you know, do you have access? [01:20:17] Do you know anybody who has a gun? [01:20:18] Like, this is a common Fed tactic. [01:20:20] You basically, over consistent participation in these communities, you see if people are willing to make a public statement. [01:20:27] Do they have access to the weapons in which they could carry something out? 
[01:20:30] And then when they start to get worried, this is where it gets interesting for Luigi: they actually don't come and swat your house and go like raid you in the middle of the night until you stop posting. [01:20:41] When you stop posting, that means you're preparing to take action in the real world. [01:20:46] And so, when Luigi goes dark for a six-month period, this is where we start to speculate that he took that time to plan the assassination of Brian Thompson. [01:20:55] Interesting. [01:20:56] That said, as we go through his funnel of political ideologies, this stuff, you know, the Unabomber is like maybe like the middle node in this pipeline. [01:21:06] The guy that he gets into, which I would say is probably the most significant influence on his political worldview, his ideology more recently, like up to 2024, is someone we've been diving deep into called Gurwinder Bhogal. [01:21:23] Gurwinder Bhogal. [01:21:25] This guy is, I'm going to tell you, not to get racial on the podcast, even though certain guests have slipped me notes asking me to. [01:21:35] I said, certain guests, if you want to take that, if you want to give me a knowing nod, then that's your business. [01:21:40] But what a motherfucking name that is, Gurwinder. [01:21:44] My God. [01:21:46] But respect. [01:21:47] I love a Gurwinder Bhogal. [01:21:48] So, Gurwinder Bhogal, you got to choose one of those. [01:21:52] Gurwinder Bhogal is a Substacker. [01:21:56] What's his past? [01:21:57] Where's he come from? [01:21:59] He's a big gray tribe guy. [01:22:01] So he doesn't describe himself as a rationalist. [01:22:04] I think mostly because calling yourself a rationalist just feels like Slate Star Codex. [01:22:09] Like it feels 10 years old. [01:22:11] These guys definitely grew out. [01:22:12] People don't even say tpot anymore, but they used to at least be post-rationalist or tpot, that part of Twitter. [01:22:19] That's kind of what it's like. [01:22:20] Yeah. [01:22:20] Yeah. 
[01:22:21] He, like all of these guys, has a background in the tech industry. [01:22:25] And he describes like leaving that industry to go and basically become like a crowdfunded writer. [01:22:30] He has a readership of, I think, around 50K. [01:22:33] It's grown a little bit since because people are finding out about him. [01:22:37] But I have not really seen many people write about this guy's belief system in depth. [01:22:40] So I think we've gone significantly further than most of these people. [01:22:44] But he describes leaving tech saying they had essentially designed these like perfect systems, but it was the people using them that were the problem. [01:22:54] So he talks about finding bugs in the human operating system. [01:22:58] That the bugs in the system, like they design the tech perfectly. [01:23:03] The problem is actually human psychology. [01:23:05] And generally speaking, I would say like their philosophical ideological project is to transform the human subject to fit what are, of course, their perfectly designed tech systems based on computer software, but applying that to political economy. [01:23:20] So if there is disagreement, if democracy does not seem to be functioning or there's chaos, like it's a problem of, you know, going back to the early Evo psych stuff, it's problems with their cognitive biases or whatever that need to be overcome. [01:23:36] And it's the people, but like the tech design is kind of perfect. [01:23:40] That's his underlying latent belief system. [01:23:43] But practically speaking, he is a Substacker. [01:23:46] He's on a lot of popular podcasts. [01:23:48] He's made, I don't know, maybe like between five and 10 different appearances on Chris Williamson, which reaches an audience of millions of people. [01:23:55] And in those contexts, he's talking about pretty mainstream stuff. [01:24:00] He's kind of like a tech rationalist guy. 
[01:24:02] He'll say stuff like, you know, like Steve Jobs, you should wear the same outfit each day of the week. [01:24:07] And that just makes things simple. [01:24:09] You can optimize the rest of your life. [01:24:11] Really, really mainstream kind of normie. [01:24:14] My favorite one of his pieces of advice is that every Monday you should have, for example, like a tuna sandwich. [01:24:22] Every Tuesday you should have spaghetti. [01:24:23] Every Wednesday, you should have a steak. [01:24:25] And just set all of your meals for the week in advance so you don't waste time deciding what you're going to eat that day. [01:24:32] Respect. [01:24:33] As a meal prepper, I kind of get it. [01:24:35] I'm going to do a meal plan, but you don't do tuna on Tuesday? [01:24:38] Is that right there? [01:24:39] I think the reason why so much of this stuff is so crazy to me, because I'm like, aren't you just like, every day is a new day, you're just going? [01:24:47] You know, forever? [01:24:48] I think that, I mean, I understand the meal prepping stuff if you're like following a certain diet or whatever. [01:24:53] I think it gets a little American psycho when you're doing it in order to optimize your time, which you're then using to like, you're like shoring up your productivity for other tasks, which I assume have to do with computer stuff. [01:25:08] And that's when it gets a little like psychotic robot. [01:25:12] Wait, but if you wear the same outfit every day of the week, wouldn't that freak people around you out to the point where you would lose productivity because people would have concerned conversations with you, be like, hey, how come you're wearing the same clothes every day? [01:25:25] And like constant reinforcements of people being like, you're weird. [01:25:28] I don't have that problem. [01:25:29] He will to his credit because this is somebody who has carved out a niche that is already within tech. 
[01:25:37] So he can't just be like full optimization with like no brakes on it. [01:25:42] So he's a famous critic of Bryan Johnson, right? [01:25:44] He'll say that Bryan Johnson has this two-hour skin routine. [01:25:47] Bryan Johnson, the guy who's de-aging himself. [01:25:50] Yeah, not Brian Thompson, the CEO that was murdered. [01:25:53] Yeah, the guy who's trying to live forever. [01:25:55] His brand is "don't die." [01:25:56] So he'll say, you know, Gerwinder is, you know, as ever, a rational centrist in this kind of like, you know, radical center, that someone who optimizes too much is doing that at the cost of the rest of their life. [01:26:07] But I would say on the entry level, this stuff is kind of, it's popular among entrepreneurial types, people who are running small businesses. [01:26:15] You're locked in, you're grinding. [01:26:16] You're doing startup life. [01:26:18] You're going hard. [01:26:19] Yeah. [01:26:19] Yeah, yeah. [01:26:20] And I think important, like getting through the more narrow part of the funnel, is that Gerwinder is certainly more radical than the previous people we've looked at, a lot more radical than Tim Urban. [01:26:31] Gerwinder is also smaller. === High IQ Supporters (14:54) === [01:26:33] And so more radical ideas have smaller audiences, obviously, but it also provides access for young people who are trying to live out or get involved with these belief systems. [01:26:43] So Luigi is a founding member on his Substack at $200 a year. [01:26:49] Which is a lot more than $50 or whatever, $60. [01:26:54] That's like what people pay for YouTube plus Hulu plus HBO, and you're putting it right on one Substack. [01:27:03] He's a big fan of this guy. [01:27:05] $200 a year is a lot to spend on the internet, but you get things from that. [01:27:10] There's perks from it. [01:27:12] So it's a good way to get noticed, first of all, but then the perks for the higher tier include personal emails and one-on-one Zoom calls.
[01:27:21] That's not... [01:27:22] Something that Gerwinder was offering, to be clear. [01:27:24] Yes. [01:27:24] And Luigi took him up on this and reached out. [01:27:28] Gerwinder, still having a hard time saying that name. [01:27:32] Gerwinder tweeted out only a few of the emails that he said he had exchanged with Luigi. [01:27:41] I think only like three or four, but he said there were at least 20. [01:27:45] But he hasn't shown all of them. [01:27:49] The contents of some of those may be interesting. [01:27:52] Yeah, I mean, I think, you know, if he's not sharing them, then he's made a decision why. [01:27:57] Yeah, yeah. [01:27:58] But he also said that they spoke on one of those one-on-one video calls as was like part of the founding member, you know, perk package. [01:28:11] And that he's like, oh, he was such a nice guy. [01:28:14] I can't imagine how he'd do this. [01:28:16] I believe he says that Luigi expressed, like, support for effective altruism in that conversation. [01:28:25] But he hasn't really, no one's really asked Gerwinder that much about this except for in like one interview with the New York Post. [01:28:32] I mean, I think he's quoted in the Times piece briefly, which is very interesting to me, that you would think such a well-resourced organization as the New York Times would actually go through some of the shit he said on other podcasts and ask him about it. [01:28:45] Well, can you mention like some of that stuff? [01:28:47] Because you did go on a deep dive. [01:28:49] Yeah. [01:28:49] Yeah. [01:28:50] Without getting too far ahead of ourselves here, though, just the confirmed, as we know of, as per Gerwinder, is that there's a two-hour one-on-one call between Gerwinder and Luigi that happens in early May at some point. [01:29:04] May 25th is Luigi's last Reddit post. [01:29:08] June 10th is his last tweet. [01:29:10] Some point in June, he cuts off all communication with friends and family, so he goes dark.
[01:29:15] What makes this particularly interesting is that the piece, as per Gerwinder says in a public tweet, is that the Substack piece is called Why Everything is Becoming a Game. [01:29:28] And this is what they discussed for about two hours. [01:29:31] It's largely about applying game theory to the world, analyzing social media, politics, work, and so on. [01:29:38] It extensively discusses Ted Kaczynski for about 3,000 words. [01:29:44] It's a long piece about Ted. [01:29:46] So I will say, something stuck out to me about this piece is that it doesn't seem that smart. [01:29:53] I don't know if I'm, that's probably not the best way to describe it. [01:29:56] But I was reading this expecting to be, because sometimes I'll like, whenever we've talked about like LessWrong shit and like the rationalist stuff, I'll like read their fucking websites and be like, okay, these guys are just like, you're on some neurodivergent shit. [01:30:08] Like, I don't understand what you're saying. [01:30:10] It's like kind of nonsense mixed in with like a lot of technical jargon and like, you know, probably like fake math. [01:30:17] And this is just like, this just seems like something, I think you might have said this, but like it seems like something that like ChatGPT could have written. [01:30:25] Yeah, it does. [01:30:26] It's extremely, like, it's just not, it's just not very... And that's my big problem with a lot of writers like this, and this I think belies a dark truth about them: they seem to often have to, like, repeat fairly, like, trivial, almost, like, childlike ideas, sometimes at length. [01:30:47] And it's like they're discovering them for the first time. [01:30:49] Like discovering like basic ethical quandaries that like you don't even have to write down to like compute in your head. [01:30:57] And that would never even occur to you as being like a real quandary. [01:31:00] They seem to have to like restate a lot of this stuff.
[01:31:03] But nothing in this piece really stuck out to me as particularly insightful or certainly nothing I could have a two-hour conversation about. [01:31:10] Well, there's a few things that start to thematically connect in this essay. [01:31:15] But for just a little bit of background, because we're describing Gerwinder as being someone who's a kind of enlightened centrist. [01:31:22] And Liz, you also found that he's a supporter of socialized medicine. [01:31:26] Well, he said that in a, yeah, he said that in some piece where he's like, yeah, I, you know, to be clear, like, I'm a socialist for healthcare. [01:31:37] Yeah. [01:31:38] But, you know, like kind of making the distinction, which is like, you know, I think not an unpopular stance that people profess, whether or not that when rubber meets the road, that's like something that they would support politically, who knows? [01:31:53] But I think especially post-ACA and then especially post-Bernie, it became a kind of like default stance among like the type of people who are like, I'm going to take from this zone and I'm going to take from this zone and like, see, I can like all different types of things, but rise above it and make the perfect mix or whatever in the center, to say like, oh, socialized medicine, a la the NHS, for example, [01:32:21] that's usually the example, or kind of like Nordic system, makes rational sense. [01:32:27] But I don't know if that actually means much. Gerwinder's left on some issues, he's center on others, but like a lot of these guys who are in the center, they also have some kind of overlap with the hard right. [01:32:40] So Gerwinder used to write for something called the Quilliam Foundation, which is a right-wing think tank focused on countering jihadi terror. [01:32:50] Basically, these people would write like policy papers about how to infringe civil liberties and privacy of Muslim people in the UK.
[01:32:57] Think of this as like if Sam Harris is like the old school rationalist who had like, you know, Islamic societies are not capable of governing themselves and so we should overthrow them and install liberal democracy type of shit. [01:33:09] Like that pro-war, you know, that kind of rational skeptic push towards being pro-war that people like the Dawkins and the Hitchens would go for. [01:33:21] He's kind of of that camp. [01:33:23] But in particular, what kind of struck me, what I did not expect looking into him, is that, and you find it with a lot of these guys, is that rational skeptics also like to have conversations about race and IQ, like all the time. [01:33:42] And I found him on this hard right podcast, the woman who hosts it is, I think, most popularly known for being like anti-trans and pro-Viktor Orban. [01:33:51] Okay. [01:33:52] And Gerwinder talks about being a socialist when he was a teenager, and then he learns about human biodiversity. [01:33:58] Okay, human biodiversity. [01:33:59] So he's pretty true. [01:34:01] So that means he was like a member of the Labour Party or whatever. [01:34:03] He was like a teenager, like a supporter of the Labour Party, and then found out about being racist? [01:34:08] He doesn't, well, he doesn't really qualify what that means or how young he was. [01:34:13] I think it's a general, this is popular among a lot of these people, which are like, I used to be part of the left, and then like I realized how cruel society was. [01:34:21] No, we're going to do that in like three or four years. [01:34:23] Yeah. [01:34:23] Like when the bottom falls out under this, we're like, we're going to be. [01:34:26] Some people sold out too quick, but we're holding on for like three years. [01:34:29] Ana Kasparian, I'm telling you, you fucking, you checked out too quick. [01:34:32] You got another three years before this becomes really lucrative.
[01:34:37] Well, you start to see here, this particular podcast has like 2,000 views on YouTube. [01:34:44] So this is like digging really, really deep. [01:34:46] And when I tend to do the research on these guys, like you could watch, you know, hours and hours of Chris Williamson, and he'll be talking about, you know, like insights that you can get from coding about how to optimize your life. [01:34:56] But when you want to find what these people really think, go to the small radical podcasts. [01:35:01] So for example, JD Vance. [01:35:06] Well, I mean, this is the kind of unique media environment that we live in and that things that are fringe can also become enormously viral and get big audiences in some cases. [01:35:14] But I think, you know, what I'm tempted to kind of speculate here is that if he is saying those things publicly, like what might one say behind the paywall in a private one-on-one conversation with someone who's your biggest supporter? [01:35:27] Like how radical do you go? [01:35:30] Really quick, I just want to, just because some of our listeners might not know, which is great, but human biodiversity is kind of a dog whistle, right? [01:35:42] Yeah, as the racism expert, I would say. [01:35:46] I mean, if people don't know that term or haven't come across it, they might be like, oh, that sounds like science. [01:35:52] What's the problem? [01:35:53] Which is exactly, by the way, this idea behind it. [01:35:56] Yeah, I mean, it's racism in a lab coat, right? [01:35:58] So human biodiversity is the idea that there is an unequal distribution of human capital or IQ among the population, which is, I mean, objectively speaking, that is true. [01:36:08] But what do you then do? [01:36:09] Like, how meaningful is IQ in terms of someone's productivity or contribution to society? [01:36:14] How does that fare against the labor market? [01:36:16] Are those accurate predictors of what someone's contribution can be? 
[01:36:20] And then two, does that mean that this is where we start to tip into the dark enlightenment post-rationalist stuff is like, if there is some type of immutable natural order to the world, which he is predisposed to, how do you then choose to organize society? [01:36:35] Do you create ethnostates? [01:36:36] Do you do whatever else? [01:36:38] Israel for everyone. [01:36:40] Some of these people will take it a little bit further, right? [01:36:42] So as you like tip into the post-rationalist stuff, we're going a little bit beyond Gerwinder here, we should mention, but like people will say that the irreducible complexities of human society and nature are actually beyond the cognitive capacity of humans. [01:36:58] And so although we have attempted to rid ourselves of cognitive biases, we are as flawed individuals, we can't do that. [01:37:05] And so the most lindy formations of society, I know this is retarded, but I'm trying to tell you what they think. [01:37:13] Liberal, you know, capitalism has been around for 200 years. [01:37:16] It'll last for another 200 years. [01:37:18] Monarchy was around for 800 years, and that will last for another 800 years after that. [01:37:22] And so they start to create these rational explanations for why you should have things like a caste system or theocracy or monarchy or basically in industrial society fascism, right? [01:37:34] That we should, the rationalist version I always think in my head is like, that's when you sort people by IQ. [01:37:38] Post-rationalism is when you sort people by race. [01:37:40] Gotcha. [01:37:41] Yeah. 
[01:37:41] I mean, there's a, there's a, I think like a, uh, it's interesting because I, I, I had no, I had never heard of this stuff before, like a few years ago, I think actually kind of when we were doing some of the Sam Bankman-Fried stuff, like I knew there was like racist Silicon Valley people or whatever, but like I didn't know exactly like the ideology behind it because I was like, listen, there's a lot of other shit I got to deal with in this world. [01:38:05] But like having read a good amount of this stuff, it is like, because not only are you coming to these, are they coming to these conclusions of like, you know, like, listen, like everybody, like their solutions to everything are these very regimented, what they think of as very logical and completely like inhuman actions that they want to take. [01:38:29] And it is, I think this stuff, like, whereas like the more hardcore believers in it might be a little marginal, it's not marginal, but like they might not be super large in number. [01:38:39] The diffusion of this stuff across the broad Silicon Valley ideologies that exist now that have really, really, really taken off in recent years, I think cannot be discounted at all. [01:38:51] Yeah, HBD is big. [01:38:54] Like we should underline that this is not that fringe, unfortunately. [01:38:59] No. [01:38:59] Well, Sam Harris, the origins, actually, I think this is like little discussed because it was mostly understood as being like a kind of anti-woke movement of the intellectual dark web. [01:39:09] But what moved Sam Harris from being this kind of like enlightened centrist or like mainstream commentator is that he would not stop talking about fucking race and IQ and had a like very long form conversation with Charles Murray, who's the author of The Bell Curve. [01:39:23] Charles Murray recently, I think he spoke at Stanford with Francis Fukuyama, like those two in a panel. [01:39:28] Nothing good happens at Stanford University, by the way.
[01:39:30] I just want to call out a cliff because it's a ghost. [01:39:32] If you or any of your loved ones are like, man, I wish I could go to Stanford. [01:39:35] Fucking pull them out. [01:39:37] That is the Ted Kaczynski institution, as a born and bred San Franciscan. [01:39:44] Get rid of Stanford. [01:39:46] Get rid of it. [01:39:48] Yes, salt the earth, like you say. [01:39:50] I mean, it's making a comeback, and I feel like it's kind of necessary to say that, like, in a society that is experiencing or struggling to have growth rates, where we're in declining rates of profit and austerity, you need to create some type of rationale for a certain percentage of the population to be expendable. [01:40:07] And so, this becomes, in the minds of tech designers who are trying to do resource allocation and perfect system design, like, well, who are you going to get rid of? [01:40:14] It's the least productive people. [01:40:16] And so, you start to logically, empirically, using your rational faculties, move yourself into positions of like, well, let's just get rid of all the black people. [01:40:24] Like, that is kind of the teleology of this stuff. [01:40:27] And if you're interested in a leftist critique of how that structurally comes about in like a declining capitalist society, read Bataille, you can read The Accursed Share. [01:40:37] I think that one of the things that's so interesting to me is that so much of this just also plays into the long-termism stuff and the EA stuff. [01:40:46] Less so the EA, but it's all part of the same, you know, it comes from the same place. [01:40:51] But like the long-termism stuff, yeah, like if there's this massive human drain on society, what do you do? [01:40:59] Do you exterminate them? [01:41:02] I mean, that is, or you isolate them. [01:41:04] You send them, you know, back to, in frankly a lot of these people's minds, back to Africa, right?
[01:41:09] And because, frankly, a lot of these guys just hate black people. [01:41:13] I mean, it's really. [01:41:14] Right. [01:41:14] I mean, yeah, the rich ones. [01:41:15] Except for the rich, for the good ones. [01:41:17] Except for the good ones, you know, except for the high IQ ones or whatever. [01:41:22] But yeah, it really leads you to some dark fucking places. === Coerced Statement Theory (15:41) === [01:41:27] Yeah, before we get too far off track for this, the piece in focus, that Gerwinder and Luigi talk about in this two-hour Zoom call, Why Everything is Becoming a Game, we kind of established before that Gerwinder was writing for the Quilliam Foundation. [01:41:43] So extremism, like violent extremism produced through social media, is already in his vocabulary. [01:41:50] In that piece, he's talking at length about Ted Kaczynski. [01:41:55] He also mentions, this was a little-known story, someone called Jacob Graham, who is, I believe, 22 years old. [01:42:01] He's in the UK. [01:42:02] He got noticed for basically making threats on social media, of like him posting videos about saying he was going to do violence. [01:42:10] And I think this is kind of an early connection where like Luigi is possibly able to piece together that like you can live out the Unabomber ideology or like take action in the real world. [01:42:23] Like he goes through the kind of philosophical explanation for why Ted's ideas are important, which he later confirms in his posts and his texts. [01:42:31] And then he also points to a guy that, like, well, look, he was going to do it, but here's how he did it wrong. [01:42:36] Like, don't make his same mistakes.
[01:42:40] The connection that you're trying to make here, or that Gerwinder is trying to make here, is that what Ted Kaczynski calls surrogate activities, which are things, it's been a long time since I read the book, but like, generally speaking, this is that you fill your life with things that take up your time but don't produce like a meaningful kind of sensual, like your hands-in-the-earth experience of the world, what Marxists might call alienation or stuff like this. [01:43:05] He calls it surrogate activities. [01:43:07] And Gerwinder's idea is that we are increasingly using gamification to create surrogate activities in every aspect of your life. [01:43:16] And so, I pulled a quote here that I think is indicative of, generally, like, Gerwinder's "right, left, both of them are wrong, and centrism is the correct idea" thing. [01:43:26] But applying this to both political economy but also the workplace, he writes that the Chinese Communist Party was among the first to apply the principles of social media to the real world. [01:43:38] In several towns and cities, it began trialing social credit schemes that assign citizens a level of "clout," in quotes, based on how well they behave. [01:43:47] I'm about to be rich in China. [01:43:49] He follows that with, meanwhile, in the West, gamification is used to make people obey corporations. [01:43:55] Employers like Amazon and Disneyland use electronic tracking to keep score of employees' work rates, often displaying them for all to see. [01:44:05] So think here, from Luigi's perspective, that this is a guy who since high school and in college and his internship after college has been designing, literally designing, games, is interested, as a software designer, in writing the new rules for society, and then starts to realize that maybe all of those systems are producing this unhappiness and they've been weaponized against him in some way.
[01:44:31] So I think a lot of people have pointed towards, we'll discuss in a moment, the tweet from Paul Skallas, one of the people that he followed, LindyMan, as he's known on Twitter. [01:44:41] But the essay from Gerwinder ends with what I think could be interpreted as a provocation. [01:44:51] The final lines of the essay read: even in a world where everything is a game, you don't have to play by other people's rules. [01:44:57] You have a wide open world to create your own. Your move. [01:45:02] And it just kind of seems like, if we're looking at, we have to speculate on the next six months of whatever politicization funnel that he went through. [01:45:11] But this is, I mean, you're starting to get increasingly narrow, increasingly radical. [01:45:17] You're making contact with the person who wrote the philosophy. [01:45:22] I think that this is as close as we can get, or at least, I mean, that we have the documentation for, to show that there is some radicalization arc. [01:45:30] And it is definitely not a left-wing activist who's advocating for universal health care. [01:45:34] Well, I think it's actually, I think it would behoove us to read the actual so-called manifesto. [01:45:41] Yeah, I just pulled it out. [01:45:45] Yeah, yeah. [01:45:46] I know, Liz, that you have extensive things to say about this, and so I'll cede the microphone to you. [01:45:52] We all do share one. [01:45:53] But I got to say, this, from my perspective, is this a manifesto or mayhaps a paragraph? [01:46:02] Yeah, I don't think you, look, I think the press jumped on the written, I'm going to call it a statement, a written statement. [01:46:10] And they were like, because this is a guy who we were already looking at through a very specific political framework that had been imposed by, I think, people on social media. [01:46:24] Yeah. [01:46:24] Right. [01:46:25] So we're naturally expecting a manifesto because he leaves a written statement.
[01:46:29] Now we're calling it a manifesto. [01:46:30] And so we're actually imbuing this text with a politics that might not be there from the get-go. [01:46:38] I'm just saying. [01:46:39] Or a legible politics, at least. [01:46:41] I've seen Grindr profiles longer than this. [01:46:44] I also just want to say, if it's a manifesto, it's too short. 250 words is not a manifesto. [01:46:50] I meant to say that. [01:46:50] It has to be at least, at least bare minimum, three to four thousand words. [01:46:57] Well, we've got supposedly the other document, which was in the backpack, in the spiral notebook, that we have not yet seen. [01:47:03] So there may be. [01:47:04] Which he refers to. [01:47:05] He refers to in this, and he's like, there's nothing, there's nothing in there. [01:47:08] I mean, according to Merriam-Webster's dictionary, a manifesto is a written statement declaring publicly the intentions, motives, or views of its issuer. [01:47:16] But I think in the colloquial sense. [01:47:18] You're sounding a little Gerwinder. [01:47:20] In the colloquial sense, we believe, I mean, I think we think a manifesto is something at least two pages. [01:47:26] It's got to be longer and have a little bit more thrust. [01:47:28] I'm going to read. [01:47:30] I've heard that before. [01:47:34] Okay, I'm going to read from this thing, which is going to be very quick. [01:47:41] Okay, so this is what allegedly, this is what Luigi wrote, or they found on him, right? [01:47:50] I don't know if I buy that fully, but we'll say. [01:47:52] This is what they say. [01:47:54] To the feds, I'll keep this short because I do respect what you do for our country. [01:48:00] He's trying to optimize their time for them. [01:48:03] To save you a lengthy investigation, I state plainly that I wasn't working with anyone. [01:48:09] This was fairly trivial. [01:48:11] Some elementary social engineering, basic CAD, a lot of patience.
[01:48:16] The spiral notebook, if present, I'm just going to flag that, because it means he doesn't know if it's present as he's writing this or if they found it, [01:48:25] has some straggling notes and To-Do, capitalized, lists that illuminate the gist of it. [01:48:31] My tech is pretty locked down because I work in engineering, so probably not much info there. [01:48:36] I do apologize for any strife of, I think that's probably "or," traumas, but it had to be done. [01:48:44] Frankly, these parasites simply had it coming. [01:48:47] That's a good line. [01:48:48] A reminder, the U.S. has the number one most expensive healthcare system in the world, yet we rank roughly number 42 in life expectancy. [01:48:58] United is the [indecipherable] largest company in the U.S. by market cap, behind only Apple, Google, Walmart. [01:49:07] It has grown and grown, but as, I think that's "has," our life expectancy? [01:49:12] No. [01:49:13] The reality is these [indecipherable] have simply gotten too powerful and they continue to abuse our country for immense profit because the American public has allowed them to get away with it. [01:49:25] Obviously, the problem is more complex, but I do not have space. [01:49:29] And frankly, I do not pretend to be the most qualified person to lay out the full argument. [01:49:34] But many have illuminated the corruption and greed, e.g. Rosenthal, Moore. [01:49:40] Can I just tag something here? [01:49:42] Having spoken with radical young people and interviewed these people for like six years, if you are writing a manifesto that's going to be read by the whole world, there is nothing these people like more, the kind of dyed-in-the-wool hardened ideologues, they are name-dropping endlessly. [01:49:57] Like you would have so much more to say rather than, like, Elisabeth Rosenthal, who is the author of An American Sickness, a pop book from 2017, and then fucking Michael Moore? [01:50:07] Yeah, I know. [01:50:07] Like the Sicko documentary, it's just, these are not deep cuts.
No. [01:50:12] Something is weird about this. [01:50:14] I said this before. [01:50:15] It's like if Breivik had cited Bill Maher in his manifesto. [01:50:20] He might well have, actually. [01:50:21] This shit is like, you're like, what, dude? [01:50:24] This is tier one. [01:50:26] This is tier one shit. [01:50:27] This is like tier zero. [01:50:28] Yeah, Michael Moore is just like, that's pop culture. [01:50:31] She's like a health writer for the New York Times. [01:50:34] Michael Moore is just, you know, I never saw Sicko, so I don't know. [01:50:38] Okay, let me just finish this up, though. [01:50:40] But many illuminated the corruption and greed, e.g. Rosenthal, Moore, decades ago, and the problems simply remain. [01:50:48] It is not an issue of awareness at this point, but clearly power games at play. [01:50:53] Evidently, I am the first to face it with such brutal honesty. [01:50:57] So I'm going to give Luigi credit. [01:50:59] That final line is pretty good. [01:51:02] That's it. [01:51:02] That's a great final line. [01:51:03] Evidently, I'm the first to face it with such brutal honesty is a fucking sick line. [01:51:08] Shout out to Luigi for that one. [01:51:11] So, okay, this is the quote-unquote manifesto. [01:51:14] Like you said, this doesn't scan as someone who believes, not only like at the point they're being, like they're going to get caught and this is going to be what's left for the public to consume about their like final defiant act, right? [01:51:30] Right. [01:51:31] When an ideologue commits like an assassination like this or like a mass shooting or something and they have a statement, they know, first of all, they know that they might like be dead when they get caught. [01:51:45] They're usually like planning for that.
[01:51:48] And two, like they're leaving something that not only will explain, but can also, like, usually is meant to further radicalize and leave a kind of like artifact, like a trail of themselves and their own footprint as a kind of self-styled like hero. [01:52:08] From the end of the deed. [01:52:10] Yes. [01:52:10] But for the public who will continue to carry out what they, you know, have imagined. [01:52:15] You would also put red pills in it where you would have certain, like, you know, the simplest example is like a hashtag or the name of a certain philosopher, that other people would then search that and then get radicalized by reading that writing. [01:52:27] Right. [01:52:27] And we have pop culture examples here. [01:52:30] Yeah, there's nothing, that's nothing that's really that dark. [01:52:33] It's very weird. [01:52:34] I kind of can compare it to the, it was in New Zealand, the guy who shot up that mosque. [01:52:39] Yeah. [01:52:39] Remember, it was like he did it. [01:52:40] Christchurch. [01:52:41] Yeah, Christchurch shooter. [01:52:42] Because I remember reading his, and listen, I'm not comparing Luigi. [01:52:45] I remember an episode. [01:52:47] We did. [01:52:47] We did, a long time ago. [01:52:49] Just so people know why we were doing it. [01:52:51] But, well, no, Liz made me read it. [01:52:53] She actually had access to it like a few months before. [01:52:55] Yeah. [01:52:56] But I'm not comparing Luigi to the Christchurch shooter. [01:52:59] I just want to be clear on that. [01:53:00] No, no, no. [01:53:01] And as much as I am, I'm like, there are two guys who carried out what could be called political shootings and also had manifestos released. [01:53:10] The Christchurch shooter's manifesto is like full red pill. [01:53:14] There's fucking memes in it. [01:53:15] It's like fucking black pill. [01:53:18] Yeah. [01:53:18] And it's extremely dark. [01:53:20] But it's like very referential to like internet communities he was in.
[01:53:28] There's memes in it. [01:53:29] Like it is like, you know, it's like shouting out his Discord friends. [01:53:32] Yeah. [01:53:32] Like that. [01:53:33] This is like this, except for the beginning part, which actually talks about the shooting and like the lead-up to it in very vague terms, or in at least very basic terms. [01:53:44] The rest of it is like almost something like a thought you would have when you realize that insurance is bullshit. [01:53:48] You know what I mean? [01:53:49] Like it's not like this, I mean, he's sort of typing it all out here, but these are just like, this is like a, something you would come to after like reading an article and be like, oh, huh, I guess it's all a scam. [01:54:01] But that is, it's, there's, there's very, what's very much missing here is like what he expected to get out of the shooting. [01:54:10] Well, that's also why this is not addressed to the public. [01:54:12] It's addressed to the feds. [01:54:13] That's true. [01:54:14] Like he is explaining himself to the cops who either are right there when he's writing this. [01:54:19] And I just want to put out that I do think that that is a possibility, that this was more of a coerced statement, written statement, than something that was organically produced that he kept on him, that he was like ready to disseminate as a manifesto. [01:54:31] So you're saying it's possible that it's Altoona PD, or like the FBI, maybe? [01:54:35] I do think that that is a possibility considering the content and how it's like written. [01:54:39] Yeah. [01:54:39] But we don't know that. [01:54:40] I'm again wildly speculating. [01:54:42] But let's say it's not and this is something he wrote. [01:54:45] It again, it is still addressed to the people who he assumes are arresting him and that that's what's like, you know, and is not meant to be an explanation to the public.
[01:54:56] Like he says, it's not like, to the people, to my family, to understand why I did this, to like, you know, all of this stuff that I want to say about the healthcare industry and like people rise up. [01:55:07] None of that is in there, right? [01:55:09] There's the vague things about explaining some of the logic behind what he carried out. [01:55:14] And I, you know, we do need to because we've been going so long, which is fine, but we do need to talk about the healthcare aspect of it and the actual kind of various political angles. [01:55:24] Yeah. [01:55:25] But it just, this is not a political manifesto. [01:55:29] This is not a political manifesto. [01:55:30] And it's certainly not a political manifesto from someone who potentially was influenced by Ted Kaczynski and his political manifesto. [01:55:39] No, that's a manifesto. [01:55:40] 75 pages. [01:55:42] That's a manifesto. [01:55:43] Yes. [01:55:44] But not a book. [01:55:46] We all also subscribed to his Substack, which there are ways, if you, well, it's private now, but if he was this kind of theatrical mastermind where he left the Monopoly money and he wanted to make a big public statement, like, why is he carrying it around in the notebook rather than just like scheduling it to publish in advance, especially if he might die during the act, you know, something like that? [01:56:06] Yeah. [01:56:06] So just a lot of unanswered questions in it. [01:56:11] Maybe also important to say that, at least from the family as we know so far, he did not have United Healthcare. [01:56:17] Yes. [01:56:18] People have speculated, there's just a lot of confusion online about this, that there was some kind of claim that was denied. [01:56:23] And so that was what motivated him to do this. [01:56:26] I think he had Blue Cross Blue Shield. [01:56:28] So he selected Brian Thompson based on the market cap, doing that kind of rationalist math. [01:56:34] Yeah, UHC.
[01:56:35] And so, yeah, there's just not a lot of clear explanation of how he gets from normie rationalism to kind of weird edgelord stuff, goes dark for six months, and then decides to assassinate a CEO or a business leader, an entrepreneur that all of his belief systems would kind of lead him up to respect or valorize and so on. [01:57:00] Yeah, I mean, well, Blue Shield just denied a guy back surgery that he really needs. === Dealing with Kafkaesque Healthcare (15:25) === [01:57:08] But we should mention, which we didn't do earlier, just I think we, listen, it's a long episode, where he had written deny, delay, depose on the shells that were ejected from his gun, which maybe also was why he had to shoot him three times. [01:57:24] You know, that's very planned out. [01:57:26] That does go to the, like, Monopoly money mentality. [01:57:30] Exactly. [01:57:31] I will say. [01:57:31] One wonders if like he had written on other shells, if he had missed like the first one or something, like other words. [01:57:38] See previous shells. [01:57:41] Young Chomsky sent this meme. [01:57:43] It was like, that's how you know a leftist didn't do it because the text on the bullets would be so fucking long. [01:57:47] I know. [01:57:47] Well, I knew a leftist didn't do it because it just would have been whack in some ineffable way, like the guy would have just, something so corny would have given the game away. [01:58:04] I feel like from the beginning that this, yeah, anyways. Um, but so, delay, deny, sorry, deny, delay, depose, kind of a mouthful. [01:58:15] Um, that is a reference to, or potentially allegedly a reference to internal, the internal kind of policy of various health insurers to avoid paying out claims. [01:58:27] And one of the big kind of looming storylines potentially with this is that Luigi had documented back pain.
[01:58:40] He had spinal surgery and he talked about it at length on Reddit. [01:58:46] And even in his like Goodreads reading list, you know, he had read books and was interested in other books that had to do with managing chronic pain, managing spinal pain specifically. [01:58:58] And that this was kind of like a big part of his life. [01:59:01] And also, uh, using psychedelics as a form of, uh, recreation but also pain management. [01:59:06] Yeah, maybe to like kind of frame this thing out, if we just go through, we've assembled a timeline of, you know, confirmed dates that we have that kind of paint a story here. But, uh, as far back as, uh, 2018, actually he's got a pretty extensive Reddit history, but he complains about a few medical issues, that he had Lyme disease when he was much younger. [01:59:34] His back pain actually precedes the injury. He sustained an injury during a surfing class at the co-living space, but he had back issues before that. [01:59:43] Around, approximately, we don't have the exact date, but a later post backdates it by 1.5 years. [01:59:49] So we're guessing that it's February of 2022, somewhere around there. [01:59:53] Spring, yeah. [01:59:55] Which is interesting because after that injury, which was supposedly very substantial, he then later posts on Reddit about, this is in November, so almost a year later, taking a 10-mile hike. [02:00:07] He's talking about that on his account. [02:00:11] He talks about back pain throughout the following years. [02:00:15] The surgery that a lot of people are passing around those images of is from approximately July 2023, somewhere around there. [02:00:23] He says on Reddit, about a week ago, I had this surgery. [02:00:28] Which is like the background of his Twitter header, is what I assume is his back. [02:00:34] It's like an x-ray of his spine that has, I mean, I don't know much about spinal surgery. [02:00:40] I hope I never have to know, but it looks really barbaric.
[02:00:44] I mean, it's like it looks like three fucking nails in his spine. [02:00:49] It looks crazy. [02:00:51] I guess that's what they do, but it really is. [02:00:53] You know, he has it up there along with a Pokemon and I think a photo of him backpacking in what looks like Hawaii. [02:01:02] And so it kind of looks like it's like, oh, this is me. [02:01:05] This is who I am. [02:01:06] These three things. [02:01:07] It feels very like MySpace throwback where you have to check me out. [02:01:11] But it does feel like part of his core identity is chronic pain sufferer from spinal injury. [02:01:18] It's a big part of it, but something that all of us highlighted was that in these Reddit posts, we've gone through his entire archive. [02:01:26] In April of 2024, he is backpacking in Asia. [02:01:31] We believe this is Japan from the Gerwinder emails, but he's active. [02:01:36] He's not crippled. [02:01:37] I mean, we went through what he's packing in this backpack, and it's like, it's not like, he has a backpack on full of shit. [02:01:44] He's living out of it because he's traveling. [02:01:45] Yeah, he's carrying a lot. [02:01:48] This is not somebody who is debilitated or wheelchair bound or something like that. [02:01:52] Yeah, I mean, it's also, the guy got away on a bicycle. [02:01:55] Yeah, exactly. [02:01:56] I mean, if it's all back pain, guys, like, that's the least likely, you know, you'd think it'd be a wheelchair. [02:02:03] But to kind of like round out this, the timeline starts to accelerate here. [02:02:07] But so backpacking Asia is April of 2024. [02:02:11] I think what can't be discounted is the Andrew Huberman Polycule article in New York Magazine in March of 2024. [02:02:18] And that could have really been a lot of time. [02:02:20] I'm not sure why that was in the timeline. [02:02:22] I just saw that and I was like, who put that in the timeline? [02:02:26] Maybe somebody put that in the timeline. [02:02:27] But it happened. 
[02:02:28] I just want to note that that's a data point that doesn't necessarily connect to other data points, but it is a data point. [02:02:33] He was a big Huberman guy. [02:02:34] Yeah. [02:02:35] The most important one is that in May, he turned 26 and he got kicked off his parents' insurance. [02:02:42] Early May is the Gerwinder Bogal, the two-hour call where he talks about effective altruism and the Unabomber at length. [02:02:51] May 25th, his last Reddit post. [02:02:53] June 10th, his last tweet. [02:02:55] Sometime in June, he goes dark. [02:02:57] Later shows up, you know, six months later in New York, assassinates Brian Thompson. [02:03:03] In November, as well, his mother filed the missing persons report for him in San Francisco. [02:03:08] That is the, yeah, that's the curious thing. [02:03:10] Yeah. [02:03:10] If he had been missing for all that time, why did they wait so long? [02:03:14] They file the missing persons report in San Francisco, and then six days later he arrives in New York. [02:03:22] Something weird is going on. [02:03:23] I know, exactly, because it's like, it seems like from interviews with his friends and like those tweets towards him about missing a wedding, it seems like he had gone dark to all of his friends, but like what is his family up to? [02:03:37] He had some books in his Goodreads, like, "want to read" section that were about, like, I can't remember the name of the book and I didn't write it down, but it was something like dealing with narcissists or surviving certain types of parents, like bad parents or something. [02:03:55] And so that coupled with some statements that some friends have made to the press about how he hadn't been speaking with his parents or there was a really big lapse in communication. [02:04:08] And the fact that his mom thought that he was still working at the tech company in San Francisco when he had not been working there for quite some time. [02:04:17] Or he's been living there.
[02:04:18] Yeah, yeah. [02:04:19] Like paints the picture that they had been estranged for some time, maybe even predating the period where he went dark. [02:04:30] Yeah. [02:04:30] Yeah. [02:04:31] Yeah, we need more data. [02:04:33] But yeah, it does seem like really strange that she would wait so long to file a missing persons report if he had really been completely dark for that long. [02:04:42] But it also seems strange because you'd think that, I mean, possibly this is what happened. [02:04:46] His friends who he cut off all contact with would try to make contact. [02:04:49] You know, if someone I knew completely disappeared and went dark after like a month, depending on how close I was with them, you know, you might try contacting their family and be like, what happened to, you know, Marcuse? [02:05:02] Is this person okay? [02:05:03] Yeah. [02:05:03] How's Marcuse doing? [02:05:06] Yeah, you know, on Goodreads, a lot of hay was made about two books about spinal injuries. [02:05:14] And from his Reddit, we can tell that he had something called spondylolisthesis. [02:05:21] Spondylolisthesis. [02:05:22] Spondy. [02:05:23] Spondy for short. [02:05:26] There was some, there's been some gropings at an explanation for why he did this or what could have led him down the path to do this. [02:05:36] And one of those is possible erectile dysfunction from Spondy. [02:05:41] Right. [02:05:42] Because one of his friends, who was at first reported as his landlord and later clarified to be one of the roommates at this co-living space, said that he basically couldn't fuck, [02:05:55] or like be intimate with people because of his back pain. [02:05:58] And I thought that was, you know, listen, I don't know a lot about back pain. [02:06:01] I fucked my back up at my old job, but it's just, if I roll it out, it goes away.
[02:06:05] Well, I think in his Reddit post, I mean, it's a lot more serious because he talks about like ongoing numbness in the limbs, in his groin, in various areas, and that like his, there was some issue with his nerves and nerve endings and that it was causing like, if it was, you know, causing him to like not be able to be intimate, as his friend said, then, I mean, that can be, I mean, that's horrific. [02:06:31] You know, that's, that's horrible. [02:06:33] And I'm sure, and it sounds like from everything that he posted that he was in like some pretty serious pain. [02:06:40] I mean, you know, you read, Rolling Stone put a piece out today or yesterday, I think, where they interviewed a bunch of people that suffered from the same kind of condition, which I do think there's like a huge, like, you know, a huge scale in terms of like how this, how much pain is felt. [02:07:01] And it's totally, you know, it can really vary person to person. [02:07:04] But this is kind of, I'm just going to read two quotes that I thought, you know, were maybe kind of interesting to add to the discussion. [02:07:13] There's one that said, it was, one person said, it was a huge hurdle getting the surgery approved, talking about the initial spinal surgery, including a super tight window when it could occur. [02:07:23] I was constantly on the phone with the insurance. [02:07:25] They miscalculated my out-of-pocket maximum, and it took hours upon hours just to get them to approve the claim that was justified. [02:07:32] I was lucky that I had union negotiated insurance from my graduate teaching fellowship, some of the best in the country at the time. [02:07:37] Otherwise, I'm sure I would be in a different place. [02:07:42] And then another person said, insurance won't pay to do both the left and right sides of the back at one time. [02:07:48] You have to make two appointments because they feel you might have less pain after one side is done. 
[02:07:52] And they don't want to pay for two if they don't have to. [02:07:53] I have to make two appointments and pay two co-pays, two weeks apart. [02:07:58] And I think like, I want to just add this into the conversation because I think a lot of people have made hay of like, oh, he's so wealthy. [02:08:06] So like, what does that have to do with anything? [02:08:08] I mean, there's like a big right-wing thing, you know, where they're like, oh, rich guy trying to do this, but he could afford anything he wanted. [02:08:14] And I, I want to like, you know, it's one thing to be underinsured or uninsured in America, which is like a fucking death sentence and is horrible, right? [02:08:22] But being in pain and being sick and insured in America is also a fucking nightmare. [02:08:31] And if you are dealing with something as complicated to diagnose as chronic pain, then getting that covered and dealt with and different surgeries and dealing with doctors and having to basically like advocate for yourself in that situation is so fucking complicated and is a lot of work. [02:08:53] It's a lot of work. [02:08:54] And it is like the term Kafkaesque gets thrown around way too much. [02:08:59] But in this instance, when we're talking about the American healthcare system and insurance companies, like it is, it is this, like something that's so interesting about this story is that like insurance companies are premised on rationalizing care, on being like exceedingly rational organizations, right? [02:09:18] Like they have to do all this complex math and all these complex calculations to decide whether or not this gets, you know, approved or not approved, whether you get this care or you don't get this care or you get that doctor or you don't, or whatever it is, you know what I mean? [02:09:33] And, you know, they're making these calculations of like who should live and who should die.
[02:09:40] And to the person on the other side of that, it is a completely inscrutable math. [02:09:46] In fact, it seems completely irrational because it's almost too rational. [02:09:50] It's like, it's like a rationalism that is like come back onto itself or something. [02:09:55] And it's that like Kafka kind of bureaucratic rationalization that is turned into a monster. [02:10:00] And again, to wildly speculate, for someone who is a very rationally driven, is a person who does look at the world in this kind of like, it should be this rational. [02:10:14] I can see it's this rational. [02:10:16] To then have to kind of deal with a system that purportedly is almost like so rational, it's irrational, coupled with who knows what else in his life, there can be a kind of potent explosion there, you know, that could really radicalize someone to act. [02:10:37] In so many of these interviews, there's a formative or catalytic experience in which the media that someone consumes after this event, whether it's like police violence or a conflict with an institution or you lose your home or something like that. [02:10:54] All of a sudden, all of that philosophy and media is then like, it's fucking go time. [02:10:58] Apply this to the world. [02:10:59] And I think one of the reasons just to underline like why we spend so much time on his political philosophy, being very sympathetic to private business, being super liberal and rational, and being sympathetic to like the profit motive in general, that the American healthcare system is so fucking bad that someone is primed through their whole life to valorize it, accept it, and celebrate it, to in some cases, make the argument of why it should be that way, [02:11:26] that they can still get radicalized by how bad this fucking system is. [02:11:30] I think that the almost unprecedented support that Luigi has gotten from large sectors of the public and also the lambasting from op-ed after op-ed, politician after politician. 
[02:11:47] I cannot remember when, like, maybe George Floyd or something, has had so many politicians weigh in on it. [02:11:58] But like I cannot remember another murder that has had this much attention paid to it. [02:12:06] I think a lot of that response is coming from obviously like everyone's frustration with how irrational and inhuman the healthcare system in the U.S. is. [02:12:17] And also just how fucking expensive it is. [02:12:20] I mean, it is like beyond that, how Byzantine and how just frustrating it is to deal with, so much waiting for authorization, putting off surgeries, waiting in pain, being told that you don't need something. === Frustration With AI Denials (04:46) === [02:12:34] In the case of United Healthcare, having an AI algorithm deny your claim. [02:12:39] Yeah, I'm sure all of them within the next year. [02:12:44] How dehumanizing, how alienating it must feel. [02:12:47] And I think a large part of the population probably feels like some catharsis from watching a human. [02:12:59] I mean, I think that's another angle here is that like so many times when you're dealing with this kind of shit, you're dealing with like, you can't tell if you're talking to a person or you're talking to a fucking machine or whatever. [02:13:10] You have no idea if your plea is, I mean, talk about Kafkaesque. [02:13:13] You have no idea if your plea is even being heard by a person or being shuffled into some fucking cloud thing where it'll get lost, even though it's, you know, just some fucking ones and zeros. [02:13:23] To see a human being go up to another human being who is profiting from that and emblematic of that. [02:13:28] I mean, United Healthcare being like one of the worst examples of these lousy insurance companies and going up to him and plugging him three times. [02:13:38] That feels, I mean, not to me. [02:13:40] I don't have politics or feelings, but to many people, that feels good. [02:13:45] It feels good.
[02:13:47] And you can think like, oh, all of the kind of criticisms and this really forced reaction against it from, you know, the Ritchie Torreses of the world and the New York Times op-ed page comes from this place of being like, well, he had a family. [02:14:03] Well, I mean, I think there's some, I think he's having some family problems according to court documents, but also so what? [02:14:10] You know what I mean? [02:14:11] Like, I think that the way a lot of people feel, not me, but a lot of people feel, is like, so what? [02:14:15] Everyone has a family. [02:14:15] You know, like everybody who deals with this company has a family. [02:14:18] Everyone who deals with this company has to go to work. [02:14:20] Everyone that deals with this company, like, they all have families too. [02:14:25] Like, and there is something that like, you know, people, and that's why it's like his politics ultimately don't matter because the deed is what matters. [02:14:36] And the deed has, I think, like made a lot of people. [02:14:39] I don't think it's going to have, I mean, you never know, but I don't think there's going to be any political efficacy of it in like material terms. [02:14:47] But I think as a moral or morale, excuse me, an exercise in boosting morale, I think it's done wonders for a lot of people. [02:14:53] Well, it's interesting. [02:14:54] Two things. [02:14:55] One, a woman was just, I believe, arrested for saying to a claims, I think a claims adjuster or adjudicator or whatever, some insurance person, deny, delay, depose. [02:15:11] And that was reported as a threat. [02:15:14] And she was like taken in for questioning. [02:15:16] Probably not arrested. [02:15:16] Probably. [02:15:17] She was arrested. [02:15:17] She was. [02:15:19] I think she might have also threatened them. [02:15:20] I think. [02:15:21] Don't quote me on that.
[02:15:22] But it is interesting that that has already, I mean, that's not a copycat per se, but it's copycat-ish, if I can add an ish. [02:15:30] Yeah. [02:15:31] On top of that, I want to flag another thing you said, which is that AI is now increasingly making these decisions, which is really important. [02:15:39] So of the jobs that are getting automated out through this rational governance that, you know, Silicon Valley types and programmers have kind of instituted through business class contracts, we'll say, and, [02:15:55] you know, business consultant companies, the first to go are really the high finance jobs, and insurance work is increasingly done through automated algorithms. [02:16:14] It's all math anyway. [02:16:16] So why not just have a computer decide and then you have a human kind of come in and approve. [02:16:21] We actually did an episode about this with Jathan Sadowski a while back about the kind of insurance regime. [02:16:28] Oh yeah, we did. [02:16:30] Yeah, I remember that. [02:16:31] Yeah, and algorithmic programs kind of taking over the governance of insurance companies. [02:16:39] And obviously, you know, the health insurance industry is massive. [02:16:44] It is a huge part of our economy, unfortunately, which makes no sense. [02:16:49] And, you know, they're the guys that are utilizing this tech that I hate to say it, but, you know, Luigi has kind of spent his adult life kind of rationalizing as this should be the governance that we should all live under. [02:17:09] If there's anything that could make someone go full Kaczynski, it would be being on the receiving end, realizing that you've spent your entire life designing the system that is now physically punishing you and causing you pain. === Mark Andreessen's Ugly Reality (11:52) === [02:17:21] Yes, I mean, so much. [02:17:22] We're speculating, but that narrative makes sense here. [02:17:24] I mean, yeah, I think that's actually to me, who knows?
[02:17:27] We don't know because I'm sure he'll have his day in court. [02:17:30] And I have a feeling that he might say a word or two there. [02:17:34] But, you know, it does, there is like a certain poetry to this, I guess, of somebody who has been enmeshed in these systems and especially with these thinkers who are, and you see this with so many of these people. [02:17:50] I mean, even that essay, the Gerwinder essay about the gamification of this stuff, it's like, well, that's like your crowd that does this. [02:17:57] Nobody I know is involved in the gamification of society. [02:18:01] It's your friends that are. [02:18:02] It's your review. [02:18:03] Yeah, you seem to, it's like you seem to aspire to be doing more of this. [02:18:07] It's like these guys. [02:18:10] They talk about like, oh, well, this might destroy the world. [02:18:12] Well, you're making it, cocksucker. [02:18:14] Like, nobody I know is fucking making this shit. [02:18:16] You are. [02:18:17] All these fucking tech people, they talk about all the dangers of tech, all the dangers of tech, all the dangers of tech. [02:18:22] You're in the fucking, you're in the lab cooking it up, my brother. [02:18:25] You know, the average Joe on the street ain't doing that. [02:18:29] And so there is a certain poetry to, I mean, even the way he, you know, he uses a tech gun, right? [02:18:36] I mean, he uses a fucking 3D printed gun and a 3D printed suppressor. [02:18:41] You know, there was that talk of him getting away on a Citi Bike. [02:18:44] I don't know if they ever solved that, but I don't think that he, certainly, at least that fucking, you know, loser who was tracking all the Citi Bikes at least had some pie in his face. [02:18:54] But people, he's probably neurologically incapable of feeling humiliated, anyways.
[02:19:00] Anyways, there is a certain, not to me, of course, but there's a certain beauty that one could potentially feel in the totality of this from the beginning to, well, we're not at the end yet, I guess, but to the end of certainly Scott Thompson's life. [02:19:18] That Brian Thompson. [02:19:20] Brian Thompson. [02:19:21] Who the fuck is Scott Thompson? [02:19:24] Brian Thompson's life. [02:19:27] There is something there that I think is a, it's a narrative that we're rarely gifted with. [02:19:34] But don't do stuff like this. [02:19:36] No, no, we disavow. [02:19:38] Are you a, are you, what is your job? [02:19:40] I mean, you don't have a job, but what's your like, what would you call yourself? [02:19:44] Artist and internet culture writer, I go by. [02:19:47] I'm also the host of the Doomscroll podcast. [02:19:50] I should have mentioned this before. [02:19:52] Yeah, yeah, we can be. [02:19:52] Young Chomsky and Brace have both been on. [02:19:55] Because you have a no winter. [02:19:56] And then at some point, Liz, hopefully in the future, but we're just starting for season two. [02:20:01] Season two, and so for season two, I'm really excited about that because you have, you're doing a, you have, you told me the guest list so far and it is incredible. [02:20:11] First of all, we have Gerwinder. [02:20:13] You got Gerwinder, which is a fucking get. [02:20:18] You're doing Gerwinder. [02:20:19] Balaji. [02:20:20] Balaji, Fuentes. [02:20:24] Mark Andreessen. [02:20:25] Mark Andreessen. [02:20:26] And Luigi Mangione. [02:20:27] And Luigi Mangione. [02:20:29] Andreessen. [02:20:30] What do you make of that? [02:20:31] So I was recently learning that babies have something called tummy time, where you put them on their stomach so their head doesn't get fucked up. [02:20:37] But I. [02:20:38] Well, do you think he didn't get tummy time? [02:20:39] No, so I think this. [02:20:40] So when I heard about that, because I knew about the soft spot.
[02:20:43] But also, he could have just been birthed in the way that. [02:20:46] No, check this out. [02:20:47] So you know what I'm about to say. [02:20:47] I know what you're about to say. [02:20:48] So I'll let you do it. [02:20:49] Mother had a pyramidal pussy. [02:20:51] Is that what you're saying? [02:20:53] That's what you fucking just said. [02:20:54] But you were, that's right. [02:20:56] That's not what I was about to say. [02:20:57] No, so I always think it's like, because you put them on their stomach so that their head gets normal. [02:21:01] But I was like, when I was learning about this, you can actually, if potentially, if you wanted to, you could every day kind of mold their head into being. [02:21:09] Potentially, I think you could do this. [02:21:11] Yeah, and so you could make a cone head. [02:21:13] So I think Mark Andreessen's parents, instead of putting him on his tummy, molded him into having a cone head because they wanted a freak son. [02:21:23] So we should look into if he has any siblings. [02:21:24] I don't know if that's a big one. [02:21:26] One connection is that in the Techno-Optimist Manifesto that Mark Andreessen wrote, he cites Based Beff Jezos. [02:21:33] Yeah, Beff Jezos, among the people that Luigi followed. [02:21:38] Yeah, Luigi was following Beff. [02:21:40] And he's adjacent to this stuff. [02:21:41] I feel like Based Beff Jezos, by the way, that's a guy. [02:21:46] Let me tell you, once you reach a certain amount of followers on Twitter, a lot of people who call themselves Anons are tempted to do something called a face reveal. [02:21:56] And the reality is, a lot of the face reveals for guys in the tech sector like this reveal somebody to be ugly. [02:22:03] And so famously, this occurred with what is called a Minecraft streamer.
[02:22:07] And I only know about this from the face reveal, called Dream, who revealed his face after spending many years streaming to eight-year-olds in a mask and was strange looking. [02:22:16] And so if you have a face that is not revealed, unless you're handsome, and you've got to run this by people you're not related to, possibly strangers, like, am I handsome? [02:22:27] Don't reveal your face. [02:22:29] I just, because it could fuck you up. [02:22:30] So Mark Andreessen, I think his, the reality is, Mark Andreessen has a profile picture that's not him. [02:22:36] He follows a lot of Anons and a lot of young women. [02:22:40] And me. [02:22:40] And Liz. [02:22:41] Does he follow you? [02:22:42] He's followed and unfollowed me over the years many times. [02:22:47] He's deep. [02:22:47] He actually follows a lot of people. [02:22:49] He's had Truanon blocked for basically the entire time. [02:22:52] The entire time the pod has existed. [02:22:55] Jokes on him because I don't fucking tweet anything. [02:22:56] He follows me. [02:22:57] He does? [02:22:58] Yeah, that's weird. [02:22:59] Jesus Christ, Mark. [02:23:00] He's getting worse than Fitbit. [02:23:02] He follows a lot of people, though. [02:23:03] 1.6 million. [02:23:04] Yeah. [02:23:04] He follows 1.6 million people? [02:23:06] He's just out of it. [02:23:07] Oh, okay. [02:23:08] I don't feel that special anymore. [02:23:09] All right. [02:23:10] Well, he has. [02:23:11] I did get an unfollow and follow. [02:23:13] He has me blocked. [02:23:14] So he meant to do that. [02:23:15] Also, a lot more people have the Truanon account blocked in the past year than ever before. [02:23:20] And I don't even tweet it. [02:23:21] October 7th. [02:23:22] I think it's from October 7th, you think? [02:23:23] Yeah. [02:23:24] 100%. [02:23:25] Because they couldn't tell. [02:23:27] But no one knows I was there. [02:23:30] Right? [02:23:32] No, they do. [02:23:33] No one knows that they had an inside man.
[02:23:37] But I'm just playing, though. [02:23:40] But it's all love. [02:23:41] Mark Andreessen is crazy fucked up looking. [02:23:44] And I think that I just want to see a 3D printed scan of his body. [02:23:49] I want to see what Mark Andreessen's penis looks like. [02:23:52] Because if his head looks like that, my God. [02:23:55] What kind of crazy polygonal fucking pentagonal contraption? [02:24:02] Did you watch him or listen to any of the Rogans? [02:24:04] Yeah, I did. [02:24:05] God, because all the worst fucking, you, I'm sure, watch that. [02:24:08] No? [02:24:08] I've seen plenty of Andreessen, but not the Rogan one. [02:24:11] It was just a victory lap. [02:24:13] Yeah. [02:24:13] So it was a very, very. [02:24:16] New footage of Thiel out today. [02:24:17] Insufferable. [02:24:18] Oh, yeah. [02:24:19] That is the worst. [02:24:20] He was asked about Luigi. [02:24:22] Yeah, because it looked like one of his last retweets. [02:24:24] He's sweating more than usual. [02:24:25] The veins on both sides of his head. [02:24:28] Where are his eyebrows? [02:24:29] Where have they gone? [02:24:30] It's like Skeletor now. [02:24:32] Everyone in Silicon Valley, I don't know what, at some point, with whatever they're all taking, they lose their eyebrows. [02:24:41] Zuckerberg, they went away for a while. [02:24:44] Thiel's, who knows? [02:24:47] They went running away. [02:24:48] They're too scared to stay on that face. [02:24:51] There's like something going on. [02:24:52] I think the Zuckerberg stuff, now we're just talking, but I think this, remember when Zuckerberg hired a stylist or whatever and started wearing like heavy cotton t-shirts and a chain? [02:25:00] Gold chain. [02:25:01] But all of the like tech guys, like the Based Beff Jezos guys, were like, dude, he looks so fucking cool right now. [02:25:07] Like, wow, he's so fucking based now. [02:25:10] It's like, brother, like, he looks like a fucking clown.
[02:25:13] Like, he looks like a fucking Saudi teen. [02:25:16] Exactly. [02:25:17] He does. [02:25:17] He dresses. [02:25:18] If his pants were a little bit tighter, he would literally be a fucking. [02:25:20] Oh, shit. [02:25:21] Did you go to the fucking Off-White store in Dubai? [02:25:24] No way. [02:25:25] Tell me more. [02:25:25] These people are, it's so fucked up that we have the most anti-social, non-socialized human beings alive, who have gotten fabulously rich off of designing social networks. [02:25:37] It is crazy. [02:25:38] I would buy long on Zuck. [02:25:40] Oh, yeah. [02:25:41] I think that he just made a million dollar donation to Trump's inauguration, right? [02:25:45] Did he? [02:25:46] I'm just saying I think he's just well positioned in terms of the panoply. [02:25:53] I think that Meta is well positioned. [02:25:55] Let me ask you this. [02:25:57] You should use this question in your interview show. [02:25:59] Which, by the way, I was on, I tell you, I told you this when we did it, I had taken so much traffic into Fall Suit the night before that when I left that interview, I think I came here and I was like, what the fuck did I just say? [02:26:09] And I watched it. [02:26:09] Yeah, you literally told me that you couldn't remember anything. [02:26:11] I don't know what I said. [02:26:15] But you should use this question when you talk to people. [02:26:17] Liz, with regards to Zuckerberry, do you fuck with the vision? [02:26:24] I don't know what the vision is, but do you fuck with just the vision? [02:26:30] No, I hate these people. [02:26:31] I don't fuck with them. [02:26:32] Very, very much. [02:26:33] I really, really, really, I have such a deep, deep, deep dislike for all of these assholes. [02:26:42] There are. [02:26:43] I liked them better when they were fucking nerds. [02:26:45] And now that they're trying to be like cool, it's like even worse. [02:26:48] That's even worse. [02:26:49] That, I think, is really where we've gone wrong.
[02:26:52] Yeah, I mean, not to get too zoomed out here, but like there's whatever, paraphrasing the Margaret Thatcher, but like the project of neoliberalism was to like transform the subject, right? [02:27:03] Yeah. [02:27:03] And that's like a direct through line to all the stuff we've been talking about today, which is that the systems are perfect. [02:27:08] Like, you know, liberalism is perfect. [02:27:10] It's the people that are the problem. [02:27:12] And so what we need to do is like transform them, get rid of their cognitive biases or whatever. [02:27:16] So that is just basically being applied to people through all of these platforms now. [02:27:20] Yeah, I think that's the same thing. [02:27:21] That's why they shifted right. [02:27:22] That's why they just went to the Trump right. [02:27:25] Is the tech is perfect or the tech will achieve perfection and we need to make everybody like the perfect subject to use this technology. [02:27:32] Yeah, there's not a problem in the code. [02:27:34] My code is perfect. [02:27:35] Like if you're an engineer, it's like, yeah, your political position is that my code is perfect and actually it's the users. [02:27:39] I don't know. [02:27:40] I'll pass. [02:27:41] User error. [02:27:43] But it just, there's something, there's like a sickness that is just growing and growing and growing. [02:27:50] And I feel like I can't even get my fucking brain around it. [02:27:52] But like the tech stuff is, there's, there's evil. [02:27:56] Like a, what's up? [02:27:59] I was going to say, like, no system can like account for itself, right? [02:28:02] That's the problem, is that there's always going to be like a little excess or a little, like there has to be an imperfection in the system or else it can't be perfect, right? [02:28:12] That's the problem that all of these guys cannot rationalize about like systems design. [02:28:17] Yeah. [02:28:17] And that they want like AI and all of this stuff to be able to overcome. 
[02:28:21] But I think they can't. [02:28:23] But anyway, that's getting into some other things. [02:28:26] But like, I think for like these guys, like that can drive them insane. [02:28:33] And then it comes, the project then turns very, very dark very, very, very quickly. [02:28:38] Yeah, I wonder if eventually we'll have like go postal incidents, you know, like going postal, I think, is in reference to like a post office employee going nuts and shooting up the post office. [02:28:50] And, you know, something that occurred to me when we were talking about this earlier, it's, you know, Marx, I don't know, I can't remember if this is just attributed to him or he actually said it, but like, you know, thought like the Prussian postal system was like a good way to run a country. [02:29:06] But the, like, these people who work on AI oftentimes think like whatever the code is would be the perfect way to run the world. === Go Postal Incidents? (07:45) === [02:29:14] Yeah. [02:29:14] And I wonder if any of them, I mean, we saw, we see shades of that here, at least we're groping towards that. [02:29:19] We think that might possibly be, there's a spiritual truth to it. [02:29:22] I don't know if we'll see how it turns out in court or what he says. [02:29:27] If he says anything. [02:29:28] If he says anything. [02:29:29] I know. [02:29:29] I love the lawyer, by the way. [02:29:30] Don't have enough time to get into him. [02:29:32] I love the lawyer. [02:29:34] Who, by the way, says that I've seen no evidence that he did it. [02:29:37] Which maybe it hasn't been presented in discovery yet. [02:29:40] And that's true. [02:29:41] I've seen no evidence that he did it. [02:29:43] Well, you know, I'm with you on that. [02:29:45] That's a good thing for your lawyer to say, honestly. [02:29:47] It's a good thing for your lawyer to say. [02:29:48] But I wonder if we'll see some of these people, some of these tinkerers kind of turn around and kill their fellow man.
[02:29:57] I think part of what to me was sort of titillating about the shooting is that I have said on this show, although I don't encourage people to do this, I say this in a very general, allegorical, whatever, metaphorical way. [02:30:10] I am again saying don't do this. [02:30:13] We disavow. [02:30:13] We disavow. [02:30:15] But I've always said, like, well, I've always thought as a thought experiment that should not be taken, do not take this into real life. [02:30:23] We disavow. [02:30:24] We disavow. [02:30:25] But if you do, but like if you like at the end of all of these companies, at the end of all these systems, like there are people, right? [02:30:32] Like there are human beings. [02:30:34] And like you can touch people. [02:30:37] And like, I think this was like, I was like, oh, yeah, you can touch people. [02:30:40] You know, like you can, you can, you can reach out of the computer and you can pat someone on the back. [02:30:45] And that is like, there's a sort of comfort to that. [02:30:47] Obviously, what he did, like, you know, Thompson's not a, he's not like a Musk figure or some like, you know, singular personality was driving this. [02:30:57] He can be replaced and he will be replaced. [02:31:00] But it is, it is, it's, it's just always, it's a good reminder of that, you know? [02:31:06] But what do I know? [02:31:07] But don't do something like that. [02:31:09] No, we disavow. [02:31:09] We disavow. [02:31:10] Or if you do do it, don't subscribe to it. [02:31:13] Just don't and put again in your manifesto. [02:31:16] I know that Brace said some stuff like this, but, I mean, he was saying not to do it. [02:31:21] And I am literally saying not to do it. [02:31:22] Don't do it. [02:31:23] Because I guarantee if you're some like little lefty guy listening to this, you're going to fuck that shit up. [02:31:27] Don't do it. [02:31:28] The gun's going to go off. [02:31:29] You're going to shoot your dick off. [02:31:30] Do not do this.
[02:31:31] Because the worst thing to do is if you do this and like you've just like watched too many like Spanish Civil War YouTube videos or something, you go to shoot somebody and you're just like you start screaming because the guns are so loud, right? [02:31:44] It's like, don't humiliate yourself. [02:31:46] You literal Sun Also Rises. [02:31:48] Yeah. [02:31:48] Oh my god. [02:31:49] And that's the other thing. [02:31:50] So the thing with his dick. [02:31:52] No. [02:31:52] No, I do want to mention this because we talked about this earlier today. [02:31:55] Why can't we talk about this on the podcast? [02:31:56] I want to mention this because they're pointing to this as a motive, that his dick didn't work and his Sun Also Rises. [02:32:01] Although that didn't happen in The Sun Also Rises. [02:32:02] He just ate some pussy. [02:32:04] But he does. [02:32:05] Well, it's implied that he does at the beginning in Paris. [02:32:07] Well, he got a fingers there. [02:32:09] But I think his dick could have been fucked up from the back stuff. [02:32:17] And I looked into it and like spondy does correlate to ED. [02:32:22] I think that they're going to make a lot out of that. [02:32:24] Like he's like a frustrated incel killer. [02:32:27] Oh, yeah. [02:32:27] That does not strike me as true. [02:32:29] They're going to run with, I mean, as much as they can, that this is a, you know, rage-filled white male who was an incel and it's because he couldn't get laid that he decided to kill somebody. [02:32:39] And it has nothing to do with health insurance. [02:32:41] Like that is going to be, if indeed it is actually, you know, the case that his dick didn't work because of his back problems, which we can imply and is possible, but we do not know for sure. [02:32:51] Do not at all. [02:32:52] If that is the case, the media is going to have a fucking field day with this one. [02:32:55] Yeah. [02:32:57] They definitely are.
[02:32:58] But listen, there's a lot of guys out there. [02:32:59] But also, that is a very tired, well-trodden like story about the incel shooter that it almost feels dated. [02:33:09] And it's what's interesting about this moment is that all this stuff that people are trying like doesn't really seem like the counter narratives or the sort of like it doesn't seem to be sticking. [02:33:21] Like your point about all of these people kind of rallying all this like support behind him kind of like organically that just out of this video, him becoming this folk hero because of the nature of the act and the health insurance and all this stuff that we've been talking about. [02:33:36] Like it hasn't all this kind of like yeah counter narratives that have been deployed to sort of like try to tamp down or organize or like you know make sense of that aren't really sticking. [02:33:50] Like it seems like people are like yeah but that's okay. [02:33:54] Like no, we still really kind of support Luigi. [02:33:57] Yeah. [02:33:57] Yeah. [02:33:58] A word of advice too is be careful what you post also because I do think that like there is a, I mean, Kathy Hochul, one of the New York ghouls. [02:34:09] Oh yeah. [02:34:10] Kathy Hochul like had a big meeting today with a Gerwinder to be like we're gonna fucking I know that's a Gerwind. [02:34:15] She should she should fuck Gerwinder. [02:34:17] Oh bro. [02:34:18] I would love to see that. [02:34:19] But I like she had a big meeting with a bunch of CEOs being like we'll protect you. [02:34:24] We'll protect you. [02:34:25] But what that actually means like yeah they'll have a couple of guys with some guns around them or whatever. [02:34:30] But like, I think really a lot of what they mean by that is they're going to proactively go through social media accounts to try to find people who are being like, we should kill more CEOs. [02:34:39] And you might say that as a shit post or whatever, but like a little rappy-tap-tap-tappy.
[02:34:47] Both state and private corporations, like in every instance when someone goes out and does violence in the real world, that then becomes like the excuse to constrain the parameters of political speech for stuff that is like, you know, they try to conflate radical and the extreme, right? [02:35:04] Like the radical politics today is even like Bernie Sanders social democratic normal shit. [02:35:09] Like that was normal politics in like the 1970s. [02:35:12] Extreme politics is when you go out and you hurt someone in the real world. [02:35:15] But when you allow people to conflate radical and extreme, someone doing an act of violence in the real world becomes the excuse to kind of crop everyone who is slightly outside of today's Overton window and to the benefit of both the state and the private corporations that they get to maintain the status quo, use the excuse of someone who is doing extreme acts in the real world to like just constrain our parameters of political debate even further. [02:35:41] So yeah, I feel like pretty pessimistic about what this is going to allow people to post and even the discussions that we can have about it. [02:35:50] Yeah, I mean, the reality is like there's no large class-based movement, like kind of what we were talking about yesterday behind this or that exists in general in the U.S. [02:36:01] And so like there's only so much that can be molded out of the clay that this rot or the mud that this rot, whatever. [02:36:08] I don't know how clay is even, what the fuck is clay if you think about it. [02:36:13] But still, it's still like, you know, you got to appreciate it. [02:36:16] It's good, you know. [02:36:17] We've got a lot of L's lately in general. [02:36:20] A lot. [02:36:20] A string of major L's lately for the past kind of as long as I can think. [02:36:30] And since before you could think. [02:36:31] And really since, well, I was always thinking, I've like existed forever. 
[02:36:34] Like, you know that Rolling Stones song? [02:36:38] I'm not the, I just, but I knew him during a lot of that stuff. [02:36:43] And I, yeah, but, you know, it's so don't, don't, people get on their, like, thingy and like, they're, like, Gritty's going to come and save the day or whatever. [02:36:54] But, like, be realistic. [02:36:56] Always be in reality. === Leave No Trace (02:04) === [02:36:59] And leave no trace. [02:37:00] And leave no trace. [02:37:01] That's what I always say, especially about social media. [02:37:03] Leave no trace. [02:37:04] Leave no trace. [02:37:04] And about your trash as well. [02:37:06] I thought you were talking about the ballistics. [02:37:09] Oh. [02:37:09] Well, that's the thing is. [02:37:10] Oftentimes also it's better to use a revolver because the shells. [02:37:13] Look, it can be applied in many situations. [02:37:15] Give them advice. [02:37:16] That's what people are doing. [02:37:18] No, well, I'm talking about, look, you can apply this advice to a lot of, you know, this is applicable for a lot of things. [02:37:23] But I'm talking about, yeah, when you go camping, you know, don't use the Airbnb. [02:37:28] Leave no trace. [02:37:30] What you bring, take it with you. [02:37:33] And, you know, social media footprint, leave no trace. [02:37:36] A lot of Sniffies users could take that advice if my many Airbnb properties in Miami are to be believed. [02:37:44] But yeah, leave no trace. [02:37:46] Don't get yourself in trouble. [02:37:47] Leave no Brace. [02:37:48] Leave no. [02:37:48] Well, leave a little bit of Brace. [02:37:50] Certainly, I've left a little bit of me throughout this world. [02:37:53] Just we should wrap this up. [02:37:59] My name is Brace. [02:38:03] Well, actually, first, thank you so much for joining us again. [02:38:05] Where can people find you? [02:38:06] Thank you. [02:38:07] It's wonderful. [02:38:07] Wonderful to be here.
[02:38:09] Find me on Patreon, Joshua Citarella, on YouTube, Doom Scroll podcast. [02:38:15] We're putting out a bunch of stuff in the next few weeks. [02:38:17] So what's your cross street or whatever? [02:38:21] Guys, where can people find you? [02:38:24] If somebody has, like, read a lot of your work and gets like, where can people find the investor conference? [02:38:31] It's 55th, and I'm trying to find in the notes where he was shot. [02:38:36] Find him at the midtown Hilton. [02:38:38] At the damn TrueNOT. [02:38:39] At Joshua Citarella on socials, patreon.com, Joshua Citarella. [02:38:44] We can link to all of them. [02:38:45] Doom Scroll on YouTube. [02:38:47] There's not that many things you can find. [02:38:51] I'm Liz. [02:38:52] Where can people find you? [02:38:53] Right here. [02:38:54] That's true, in the damn Truanon podcast studio. [02:38:57] Thank you, Liz. [02:38:58] My name is Brace. [02:38:58] And we are, of course, joined by Producer Young Chomsky. [02:39:01] And this has been Truanon. [02:39:02] We'll see you next time.