True Anon Truth Feed - Episode 434: Evil Gods Must Be Fought: The Zizian Murder Cult [Part 1] Aired: 2025-01-30 Duration: 02:08:10 === Harry's Galleon Stare (04:03) === [00:00:01] Heaps of gold galleons, stacks of silver sickles, piles of bronze Knuts. [00:00:08] Harry stood there and stared with his mouth open at the family vault. [00:00:13] He had so many questions he didn't know where to start. [00:00:16] From just outside the door of the vault, Professor McGonagall watched him, seeming to lean casually against the wall, but her eyes intent. [00:00:27] Well, that made sense. [00:00:29] Being plopped in front of a giant heap of gold coins was a test of character so pure it was archetypal. [00:00:36] Are these coins the pure metal? Harry said finally. [00:00:40] What? [00:00:44] What? Hissed the goblin Griphook, who was waiting near the door. [00:00:51] Are you questioning the integrity of Gringotts, Mr. Potter-Evans-Verres? [00:00:58] No, said Harry absently. [00:01:01] Not at all. [00:01:02] Sorry if that came out wrong, sir. [00:01:05] I just have no idea at all how your financial system works. [00:01:09] I'm asking if galleons in general are made of pure gold. [00:01:12] Of course, said Griphook. [00:01:15] And can anyone coin them, or are they issued by a monopoly that thereby collects seigniorage? [00:01:21] What? said Professor McGonagall. [00:01:24] Griphook grinned, showing sharp teeth. [00:01:28] Only a fool would trust any but goblin coin. [00:01:32] In other words, Harry said, the coins aren't supposed to be worth any more than the metal making them up. [00:01:38] Griphook stared at Harry. [00:01:40] Professor McGonagall looked bemused. [00:01:44] I mean, suppose I came in here with a ton of silver. [00:01:47] Could I get a ton of sickles made from it? [00:01:49] For a fee, Mr. Potter-Evans-Verres. [00:01:54] The goblin watched him with glittering eyes. [00:01:56] For a certain fee! [00:01:59] Where would you find a ton of silver, I wonder? [00:02:02] I was speaking hypothetically, Harry said. [00:02:06] For now, at any rate. [00:02:08] So, how much would you charge in fees as a fraction of the whole weight? [00:02:12] Griphook's eyes were intent. [00:02:15] I would have to consult my superiors! [00:02:18] I should have done a more jewely one, but what can you do? [00:02:21] It sounds eerily similar to the ones in the movie. [00:02:24] Interesting. [00:02:24] Really? [00:02:25] Well, let's keep going. [00:02:27] Give me a wild guess. [00:02:28] I won't hold Gringotts to it. [00:02:30] A twentieth part of the metal would pay well for the coining. [00:02:35] Harry nodded. [00:02:36] Thank you very much, Mr. Griphook. [00:02:39] So not only is the wizarding economy almost completely decoupled from the Muggle economy, no one here has ever heard of arbitrage. [00:02:49] The larger Muggle economy had a fluctuating trading range of gold to silver. [00:02:54] So every time the Muggle gold to silver ratio got more than 5% away from the weight of 17 sickles to one galleon, either gold or silver should have drained from the wizarding economy until it became impossible to maintain the exchange rate. [00:03:08] Bring in a ton of silver, change to sickles and pay 5%. [00:03:12] Change the sickles for galleons. [00:03:14] Take the gold to the Muggle world. [00:03:15] Exchange it for more silver than you started with. [00:03:18] And repeat. [00:03:42] Welcome to the show, ladies and gentlemen. [00:03:44] That was an excerpt, of course, from Liz's upcoming book.
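If you want to see the arbitrage loop from the excerpt actually worked out, here is a minimal sketch in Python. It takes the two numbers the passage gives, 17 sickles to a galleon and a 5% coining fee, and adds two assumptions of our own that are not from the book or the episode: that a galleon's gold weighs the same as a sickle's silver, and that the Muggle gold-to-silver price ratio has drifted to 20:1.

```python
# A sketch of the arbitrage loop from the excerpt, with illustrative numbers.
# Assumptions (not from the book or the episode): a galleon's gold weighs the
# same as a sickle's silver, so Gringotts' fixed 17-sickles-per-galleon rate
# implies gold:silver = 17:1 by weight, while the Muggle market ratio has
# drifted to 20:1.

GRINGOTTS_SICKLES_PER_GALLEON = 17   # fixed wizarding exchange rate
MINTING_FEE = 0.05                   # "a twentieth part of the metal"
MUGGLE_GOLD_TO_SILVER = 20.0         # assumed Muggle price ratio by weight

def arbitrage_cycle(silver_kg: float) -> float:
    """One loop: silver -> sickles -> galleons -> Muggle gold sale -> silver."""
    # 1. Have Gringotts coin the silver into sickles, losing 5% to the fee.
    sickle_silver = silver_kg * (1 - MINTING_FEE)
    # 2. Change sickles for galleons at the fixed 17:1 rate.
    gold_kg = sickle_silver / GRINGOTTS_SICKLES_PER_GALLEON
    # 3. Sell the gold for silver in the Muggle world at the drifted ratio.
    return gold_kg * MUGGLE_GOLD_TO_SILVER

silver = 1000.0  # "a ton of silver"
for i in range(3):
    silver = arbitrage_cycle(silver)
    print(f"after cycle {i + 1}: {silver:,.1f} kg of silver")
# Each cycle multiplies the silver by (1 - 0.05) * 20 / 17, about 1.118, so the
# loop is profitable whenever the Muggle ratio drifts above 17 / 0.95, about 17.9.
```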
[00:03:50] Harry Potter and the Methods of Rationality. [00:03:54] I'm just kidding. [00:03:55] It's not by Liz. [00:03:56] It is by one of her headmates, Eliezer. [00:03:59] Elijah. [00:04:00] But before we get into any of that. [00:04:02] Okay, Yudkowsky. === Wee, Wee, Info Hazards (05:40) === [00:04:04] I'm Liz. [00:04:06] My name is Brace. [00:04:07] We are, of course, joined by a producer. [00:04:09] Young Chomsky. [00:04:10] And this is TrueAnon. [00:04:12] Hello. [00:04:12] Warning. [00:04:13] This episode will contain info hazards. [00:04:16] Many info hazards. [00:04:17] Wee, wee, wee, wee. [00:04:19] Which is, I believe, kind of like a trigger warning. [00:04:24] To be fair, you did text me earlier and you basically got Roko'd. [00:04:30] I came up with a new basilisk, which I'll get to once we get to Roko's basilisk, but I think I have Brace's basilisk. [00:04:36] Brace is badass. [00:04:37] Brace's badass basilisk. [00:04:40] That's too many. [00:04:42] That's the triple B promise. [00:04:43] Triple B, baby. [00:04:44] Yeah. [00:04:44] I'm going to change. [00:04:46] Here's the problem. [00:04:50] The rational community has never seen a character like me. [00:04:54] Has never met a brother like me before. [00:04:55] No. [00:04:56] I think you are ready to square their circle. [00:04:59] I'm just like, I'm reading your stuff here. [00:05:02] It seems a little bit kooky to me, but y'all, they seem very, I'm not saying you all, they all seem very crazy and horny, which I'm used to dealing with those types of people. [00:05:13] There's all kinds of those people all over society. [00:05:16] I think that today marks my entrance into the rational community. [00:05:22] And I am battling, I think today, for the most rational man in America, if not the world, and if not the world, through maybe all of time and space. [00:05:33] Yeah, you're going into that battle with a katana sword. [00:05:36] I think that you would break their brains. [00:05:41] I genuinely think that you could like, I think you could bring down whatever's left of the rational community by just being like, wait, I don't get that. [00:05:50] What are you talking about? [00:05:51] Over and over and over. [00:05:54] I think you literally could. [00:05:55] Sometimes I'm going to dismantle them from the inside. [00:05:58] I have tried to get Eliezer, Yud, Eliezer Yudkowsky on the show for literally years. [00:06:06] You guys can attest to this. [00:06:07] I have sent emails. [00:06:08] I have replied to please DM me. [00:06:11] You sent an owl over there, Harry Potter style. [00:06:13] I sent an owl over there, Harry Potter style. [00:06:15] Or Hedwig style, depending on. [00:06:17] But he has never responded. [00:06:20] In fact, I was denied by one of his bodyguards an email with, or excuse me, an exchange with him via email. [00:06:28] Really? [00:06:28] Yeah. [00:06:28] Wait, his bodyguards took his email? [00:06:30] He has like a go-to guy. [00:06:32] He has a guy. [00:06:33] I'm sure I could just somehow get to him. [00:06:36] But he is being, he'll talk to Time. [00:06:38] He'll talk to Lex. [00:06:39] Do you think he leaves his house? [00:06:42] Yes, I do. [00:06:43] Okay. [00:06:43] I do. [00:06:43] I think he leaves his house and goes to other houses and has group sex. [00:06:46] But he doesn't like go to the store. [00:06:48] I never saw him at the store. [00:06:49] And I used to live like pretty close to him. I'm going to say he does leave his house, but not in the way that you might. [00:06:57] Okay.
[00:06:57] You know, like I might. [00:06:59] I mean, this is the problem. [00:07:00] I am a Pascal's Wager and Newcomb's Paradox basilisk atom bomb on the rational community because I am fundamentally at my core ultra-rational and I don't get it. [00:07:17] Yes. [00:07:18] I just don't get it. [00:07:19] I think you actually could, literally just repeating: [00:07:22] Wait, I'm sorry. [00:07:22] I don't get that, over and over and over. [00:07:25] I have, we were going to do an episode on rationale, the rationalists. [00:07:29] About a different rationalist cult, by the way. [00:07:31] Yes. [00:07:32] Which still exists and I know who you are. [00:07:35] But we were going to do an episode on it like years ago, but it was too annoying to read all this stuff. [00:07:41] This time, things are different. [00:07:43] This time, I was steeled and I read a lot, a lot. [00:07:48] I've read a lot. [00:07:50] And I've read a lot. [00:07:51] And I don't get it. [00:07:54] I don't get it. [00:07:55] Because it seems to me like these are people who think they're really smart, but also think kind of that video games are real, but also seem to work out basic methods of human interaction and thinking in an almost like mathematical way that you or I would take for granted, which makes me feel a certain pity and sorrow, sorrow in the pity, about these kind of people. [00:08:20] But also a, and I'm sorry, I got to say it, a repulsion. [00:08:23] Yeah. [00:08:24] It's a repulsion, I feel. [00:08:26] It's a repulsion, he felt it's a repulsion, I feel. [00:08:28] And so listen, I just want to tell you out there: if you're listening out there and you're rational, or you think you're fucking rational, first of all, infohazard, you're actually maybe stupider than a regular person. [00:08:38] I have to be honest with you. [00:08:40] You might be dumber than a fifth grader because a lot of the stuff you guys are talking about is things that a child could do pretty much automatically. [00:08:48] But I know that you're, but I know that you're smart. [00:08:50] I know that you're smart because you can type a lot, a lot. [00:08:55] And I know that you're scared of AI and that AI, well, we'll get to that later, but you might just consider this, which you probably already have because you've considered everything. [00:09:05] You might be AI or you might be functionally the same as AI, but unfortunately, like one of the earlier AI models that just talked and talked and talked and talked and maybe hallucinated a lot. [00:09:16] We are going to get into all of this, but before we do, we need to set the scene. [00:09:24] Because this is not just a story about the rationalists, even though it kind of is, in broad strokes. [00:09:32] But also, it's not. [00:09:34] It's a story about murder. [00:09:38] Well, I shouldn't have led with that. [00:09:40] It's a story. [00:09:41] It's a story about. [00:09:42] No, it's about murder.
[00:10:20] I already noted true crime music. [00:10:22] Monday, January 20th, Inauguration Day. [00:10:26] Blood at the border. [00:10:28] 44-year-old U.S. Border Patrol agent David... David Chris... this is what I... that's tripping me up, because I keep thinking of Ruth's Chris. All right, pause the music really quick. [00:10:40] I think his middle name is Chris. [00:10:42] I think that's a, you can't, you're. [00:10:44] I think his name is David Chris. [00:10:46] No, but he goes with Chris. [00:10:47] But I, whenever I see David Chris, I think Ruth's Chris. [00:10:51] And then I get, and then it's like, and then it's, that's an info hazard for me because it just because I can't go further. [00:10:59] You know, it was Chris's steakhouse and Ruth bought it. [00:11:02] I know, but that, it's crazy. [00:11:03] It is crazy. [00:11:04] Ruth's Chris is. [00:11:05] Guys, we need to get to the murder. [00:11:06] We need to get to the murder. [00:11:07] Start over. [00:11:10] You didn't even get past his middle name. [00:11:13] 44-year-old U.S. Border Patrol agent David Chris Maland, another M-word. [00:11:20] Arrived at what he thought was a routine stop in Coventry, Vermont, a sleepy town about 20 miles from the Canadian border. [00:11:28] A U.S. Air Force veteran, formerly deployed to the southern border, now stationed at our northern border on careful watch. [00:11:37] David Chris. [00:11:39] Sorry, it's just infohazard. [00:11:42] Infohazard. [00:11:43] David Chris had also survived 9-11. [00:11:47] Another infohazard. [00:11:48] As a young man, he was a security guard at the Pentagon and helped secure the building on that fateful day. [00:11:54] But he wouldn't survive the crisis at our nation's border. [00:11:58] In fact, he'd been the first Border Patrol agent killed by gunfire in the line of duty in over a decade. [00:12:04] The last thing he saw, a 2015 Prius and a .40 caliber Glock held by a strange woman. [00:12:12] Officer David Chris Maland was shot and killed in an officer-involved getting shot incident with two suspects, Teresa Youngblut from Washington State and Felix Bauckholt, a German national on what the FBI referred to, until Monday, as an expired H-1B visa. [00:12:31] Felix Bauckholt was declared dead at the scene. [00:12:34] David Chris was rushed to the hospital, but pronounced dead some hours later. [00:12:40] Homeland Security had been surveilling, in their words, Felix and Teresa for nearly a week. [00:12:46] They had received numerous reports from locals concerned about the two open-carrying weapons and wearing all-black tactical style clothing with protective equipment. [00:12:57] They'd made contact and were rebuffed until January 20th when the observing agents saw them purchase aluminium foil and begin wrapping their cell phones. [00:13:07] News spread like a Palisades wildfire on Inauguration Day. [00:13:11] Blood at the border. [00:13:13] An immigrant on an H-1B visa on Trump's inauguration day. [00:13:18] The Sun-Pluto conjunction at 1 degree Aquarius. [00:13:21] What could it portend for our nation's future? [00:13:24] But things were about to get stranger. [00:13:26] Users on X, the everything app, whispered that maybe, just maybe, they knew the deceased. [00:13:32] No, not David Chris Maland, who was mourned by nobody, but Felix Bauckholt, who sometimes goes by Ophelia, a quant trader from the rationalist community. === Strange Rationalist Whispers (14:52) === [00:13:43] Or is it something darker, or an even freakier community than the rationalist one?
[00:13:49] On Monday, January 27th, the U.S. prosecutor Matthew Lasher filed a document claiming, quote, the defendant possessed a journal at the time of her violent confrontation with law enforcement that, in addition to containing apparent cipher text, contained phrases including, quote, coming up on acid, not really sure what to spend this trip on, this LSD trip seems pretty mellow, [00:14:17] and I felt kind of high, vibration-y, maybe more so than other LSD trips. [00:14:24] It goes on to say, her associations with other individuals suspected of violent acts also warrants caution. [00:14:32] The individual who purchased the firearms, the defendant and Bauckholt possessed on January 20th, 2025, is a person of interest in a dual homicide investigation in Delaware County, Pennsylvania. [00:14:47] Bauckholt flew into the United States in the hours preceding the Pennsylvania homicide. [00:14:52] Both the defendant and the person who purchased the firearms in Vermont that the defendant and her companion possessed on January 20th, 2025, are acquainted with and have been in frequent contact with an individual who was detained by the Commonwealth of Pennsylvania during that homicide investigation. [00:15:08] That individual is also a person of interest in a homicide investigation in Vallejo, Vallejo, Vallejo, California. [00:15:16] Also, this is really hard to, not meant to be read out loud because that's a really long sentence. [00:15:21] I had to breathe a couple of times during it. [00:15:22] All of this, the drugs, the shootings, the associations with these specific homicides are wrapped up with something or someone called Ziz. [00:15:35] Is it an offshoot? [00:15:36] Antagonists, cults somehow associated with Miri, the Machine Intelligence Research Institute led by the GOAT, Eliezer Yudkowsky, out in Berkeley, California. [00:15:47] Liz wrote the GOAT, but I editorially disagree with that. [00:15:50] I think there's a new GOAT in town in the rationalist community named Brace. [00:15:54] And more broadly, associated with a community called Less Wrong, an incubator for rationalist thought experiments, various AI, machine learning, and crypto projects, and overall center for the Bay Area techno-ideological scene. [00:16:10] We're going to start today trying to figure out how a person who went from tweeting this. [00:16:15] Lots of dead smushed worms on the sidewalk today because people step on them as the worms try to travel across the wet ground. [00:16:24] If it's raining, look down while you walk to save the worm friends. [00:16:28] Three exclamation points. [00:16:30] On December 5th, 2023. Would, only a month and some days later that I can't figure out in my head quick enough because I'm doing too many rationalist equations in there, draw a fucking Glock 23 on a cop and start blasting. [00:16:45] Liz, we got to break the fucking fake ass true crime thing first. [00:16:50] Yeah, yeah, okay, so we're done with that. [00:16:53] We have done a lot of episodes recently, I know, on tech people and, you know, Andreessen, the- That's one of our beats. [00:17:00] It's one of our beats. [00:17:01] First of all, we are from California. [00:17:03] We're from California. [00:17:04] These people came to our, the white man came to where we live and destroyed fucking everything. [00:17:09] I'm not even kidding. [00:17:11] California, San Francisco used to be paradise. [00:17:14] And these people came and they annihilated everything that once made the Bay Area cool and they wanted their little fucking tower.
[00:17:22] We hate them. [00:17:23] Yeah, I think, so we just wrote that funny little true crime thing because this is a true crime podcast in some ways. [00:17:31] And a non-actionable comedy style podcast. [00:17:33] Sure. [00:17:36] I think a lot of people have heard in recent days or have been getting a little like, I've seen some people getting nutty online, being like, oh my God, there's rationalist killers out there. [00:17:49] There's a rationalist cult out there. [00:17:52] And you were on this real fast when it dropped. [00:17:58] And we've been kind of like digging into this. [00:18:00] And this seems to be a really, really crazy story that we figured out that we actually have to take two episodes to tell. [00:18:08] So I just want to kind of lay out why we're doing that. [00:18:11] It's because like we kind of got into this stuff when we were talking about Luigi. [00:18:17] And we kind of got into a lot of this stuff when we talked about Sam Bankman-Fried like years ago. [00:18:22] And in the past, we also have done episodes on effective accelerationism and Marc Andreessen and all of these guys. [00:18:34] Musk, of course. [00:18:36] Also, we did those episodes about Peter Thiel like so many years ago and Mencius Moldbug. [00:18:42] And like saying all that, we haven't actually done a proper sort of like rationalist excavation. [00:18:52] And it's hard to talk about what potentially might be going on with this killer cult without going back and doing a little bit of a sort of like social history here. [00:19:06] Yeah, like the loose sort of like affinity project, whatever, fucking series of group chats. [00:19:12] We don't fully know the organizational structure yet that these people belong to. [00:19:17] Of this cult that is implicated with these killings. [00:19:20] Yeah, which I guess, you know, people are calling the Zizians after the writings of a person named Ziz. [00:19:25] Yeah. [00:19:27] But that itself stems as like an ideological direct offshoot, even though it turned against it, of Miri, the Machine Institute, Machine Intelligence Research Institute. [00:19:43] But the thing is, like that community that Miri belongs to, the rationalist community, has given birth, rather painful, frankly thalidomide-style births to some of America's most notorious freaks. [00:19:55] You mentioned Mencius Moldbug. [00:19:58] And I think Curtis Yarvin's pen name, Mencius Moldbug, gives the game a little bit away because I phrased that weirdly, but you know what I'm saying. [00:20:07] You did it in the rationalist way. [00:20:09] Gives the game a little bit away. [00:20:13] But these people are like very, I'm trying to think of like a gentle way to say this. [00:20:22] Like these people are atypical in many ways. [00:20:26] And the rationalist community, people like Mencius Moldbug, Curtis Yarvin, people like Blake Masters, who kind of was like, came out of this stuff, who we'll talk about a little bit in this episode. [00:20:39] But like a lot of these ideas that came from this community, and there's various versions. [00:20:42] There's like, you know, there's sort of like more conservative ones. [00:20:45] There's more libertarian ones. [00:20:46] I wouldn't say there's more left-wing ones, although you might group the Zizians in with that. [00:20:52] I don't think. [00:20:53] I mean, people have really overbroad definitions of what is left-wing, I think, today.
[00:20:58] And the reality is that that's actually why it's crucial to have like pretty particular language about what you believe, which we don't believe in anything. [00:21:04] Well, no, we do believe in liberalism. [00:21:06] We are the number one liberal podcast. [00:21:08] But like it has like these different like ideological vectors, but it's hugely influential. [00:21:13] I mean, like a lot of like Elon Musk's long-termist sort of thinking, the Mars colonization stuff, you know, sometimes you talk, you hear Elon, when he's talking about AI, talk about like we might create these killer robots. [00:21:27] That kind of comes from this grouping of people that largely loosely grouped as the rationalists. [00:21:34] Yeah. [00:21:34] And, you know, even not, you know, not even talking about like the big thought leaders that most of who we will talk about today, but like just people who came into contact with Less Wrong, which we will explain. [00:21:50] If you don't know what that is, don't worry about it. [00:21:52] And congratulations. [00:21:55] You know, came into contact or like were members or, you know, commenters on Less Wrong or on Slate Star Codex, which is an offshoot of Less Wrong or, you know, all this kind of larger community. [00:22:09] You know, the like kind of blanket approach to how they view the world has informed so much of how people are approaching big questions today, like far up the food chain, right? [00:22:27] It's funny. [00:22:28] You and I were talking about this and we were saying like going back through all of this, it feels like an alternate historical path of a story of how we got to where we are right now today, Trump in the White House and the people that are staffing up all of his, all the agencies and our how they view the world, how they want to shape the world. [00:22:52] And we can all kind of trace it back to a lot of stuff that was happening here online and in Berkeley, California, that ended up really like kind of doing a lot of the ideological training of everyone in Silicon Valley, right? [00:23:09] And it's funny because I think a lot of it is really eerie going back through this stuff. [00:23:14] And you can see how it has created a framework that I just, it's fundamentally opposed to kind of like the way that I just approach the world, the way I see the world. [00:23:27] Yeah. [00:23:28] And it's really difficult for me. [00:23:30] Reading through a lot of this stuff, because again, we'd like we've like visited and revisited and revisited Less Wrong and like the writings associated with this community from like a pretty early part of this podcast. [00:23:41] Yeah. [00:23:41] Um, because a lot of people that we've talked about have at least like some tangential or direct sometimes involvement in this kind of stuff or very influenced by the language here. [00:23:52] It, it, this is kind of my problem with like a lot of tech stuff in general. [00:23:55] And you and I were talking about this this morning. [00:23:58] Like these people do, whether they know it or not, like they are on a mission to kind of destroy everything that you love and hold dear and that like human that has made humanity human. [00:24:10] And like, it's funny because Miri is like a sensible purpose. [00:24:12] I don't, well, fuck. [00:24:13] I'm sure they would tell you a million things. 
[00:24:15] From what I understand, as a normal human being who, who I think sort of a median person, like they're researching AI safety and making sure that we don't create, you know, something crazy that's going to kill everybody or torture us all for infinity, which seems like a pretty okay thing to do. [00:24:36] But like they're so like wrapped up in their own fucking bullshit. [00:24:41] And like the thought processes that go behind like that like they show their fucking work here that like go behind like all the decisions that they make are like so antithetical to like what I think of as like the soul of humanity that like it is almost like the AI has already taken over these fucking people. [00:24:57] And like it's it's like, you know how when you see like Lex Fridman talk or whatever and like he's actually almost like more emotive than a lot of these people. [00:25:04] They can be emotive. [00:25:05] But like it's like there's a sort of what do they call the uncanny valley. [00:25:09] Like where you're like, am I watching a person or like, is this some sort of like advanced like if some if it came out that Lex Fridman had been a fucking like a hoax done by some you know technology company and that like he didn't exist and it's been like a fucking hologram or like a cartoon for the past whatever many years he's been doing it. [00:25:28] I would not be surprised. [00:25:30] And like these people, and this is, I think, the case for a lot of people in Silicon Valley almost think of themselves as like, as thinking machines, you know, kind of in that like new atheist way where they're talking about like, you know, like, you know, it's in that, I guess metaphorical, but like, you know, we're just sacks of flesh with computer chips as brains. [00:25:49] They are sacks of flesh with computer chips as brains. [00:25:52] That's why they want to optimize. [00:25:53] That's why everything is about efficiency. [00:25:55] That's why everything is about hacking and, you know, removing all of those messy, messy things that actually are what make us human. [00:26:06] And that's like kind of, you know, what undergirds a lot of this thinking. [00:26:09] And when taken to the extreme, which is, you know, where you see it and everything, like we've said, from Sam Bankman-Fried to this killer cult, which we will get to, you know, this ideology, it's really, really fucking dangerous. [00:26:26] And it also sadly feels to me like poetically the only thing that America might be capable of creating. [00:26:34] There's a viral clip that was going around that I think is actually from a couple of years ago of Sam Altman, the head of OpenAI, talking about how, you know, the new technologies are going to cause us to sort of renegotiate the social contract. [00:26:47] And the reality is, is like, I mean, this has sort of been the kind of the great joke of a lot of social media companies since they started, is that like a lot of the people who are in charge of like these ways that we interact socially and who have like great influence and power in a lot of the like new sort of theaters or whatever for social dynamics to take place. [00:27:09] I'm talking about social media here, but like a lot of the internet in general are like fundamentally anti-social people and like who, who, who.
[00:27:16] On like a like a basic level I mean I made a joke about this earlier in the beginning like things that like a five-year-old or a seven-year-old would take for granted in their interactions with other people uh, they like don't fully understand. [00:27:28] I mean, this is what makes Mark Zuckerberg, I think, so unnerving to people is because you can tell like this is a person who, like is not maybe capable of interacting with me on like the level that I would, I would sort of expect from like meeting a just a normal guy. [00:27:44] You know what I mean, and like I mean all that is to say, like the world contains, you know, endless amounts of different kinds of people. [00:27:51] I'm not saying that, like you know, because you're socially weird or you're like, think different, like that's fucked up I, but I think that like, it is a fundamental danger that like, much of this, like almost this technological policy which translates into social policy and every other kind of policy you can think of, is led by people who sort of like, stem from this worldview yeah um which like, is heavy on utilitarianism and like reduces everything to like equations. [00:28:20] And I think part of what bugs me so much about this is they call themselves rationalists and they say that they're and and and the way that they talk, they fucking bloviate, is in this way, that's so self-important, and they think they're all so fucking smart. === Prawn's Rationality Paradox (04:10) === [00:28:36] And I just want to tell you from a normal fucking guy, you're not so smart. [00:28:40] I actually think you're fucking stupid and I think you're fucking weird. [00:28:44] If, if being rational got you to where, like Eliezer Yudkowsky is on any given day, at whatever sweaty fucking like mass of people orgy that he finds himself in in some fucking dank Berkeley fucking basement, then then then rationalism is a supremely irrational and, in fact, insane thing to base your life around. [00:29:05] So Liz, what is rationalism? [00:29:19] Because I actually don't really know what it is. [00:29:21] I just know what rationalists are, and I infer the rest from that. [00:29:24] Okay, so I actually think, you know, it's funny, I found this really good summation from on a Y Combinator forum post that I think is a good way to kind of start us off because it's going to take us a little while to break all of this down and some of this history. [00:29:44] But so this, I'm going to read from it because I think it's pretty good. [00:29:48] What's the username? [00:29:49] Prawn. [00:29:50] Interesting. [00:29:51] Spelled P-R-O-N? [00:29:52] What is that? [00:29:52] Is that PR-0-N? [00:29:54] No, it's not zero, but that was probably taken. [00:29:56] What does that mean? [00:29:57] It doesn't mean anything. [00:29:58] It means porn. [00:30:00] Really? [00:30:00] Why don't they just... [00:30:02] I don't know. [00:30:03] Because these people, Liz, the technologist's mind is unknowable to us. [00:30:08] Yeah. [00:30:08] Wow. [00:30:09] It's so clever the way you just switch letters. [00:30:11] Wow, it's so clever. [00:30:12] It's leet speak. [00:30:14] Okay, this is what Prawn said.
[00:30:18] For those who may find this a little foreign, from what I could gather, rationality, in quotations, refers to a post-new age new techno-religion originating in Northern California whose soteriology focuses on artificial intelligence as the source of redemption or damnation, [00:30:36] whose eschatology comprises an event called singularity, which is the accelerated formation of an artificial intelligence entity or entities, and whose normative doctrine focuses on Bayes' formula as a guiding principle and a particularly technical form of utilitarianism as an organizing moral principle. [00:30:57] Rationality has several institutions, two of which are named Miri and CFAR, dedicated to studying the eschatological and normative aspects of the creed, respectively. [00:31:08] The website Less Wrong is a primary publication of rationality, and its founding canonical texts include a Harry Potter fan fiction novella. [00:31:17] It is one of many religions that grew in Northern California over the past 60 years or so, and far from the strangest one. [00:31:24] Now, I think I might disagree with that last statement about it being the strangest one, though I understand that's up for debate. [00:31:33] Yeah, I mean, I don't think it is the strangest one. [00:31:36] I think it's, but it's, I would say, probably one of the more influential ones. [00:31:42] I would say it's probably the most influential one. [00:31:44] At least, I mean, I love, but the thing that I love is this person, Prawn, Prawn's framing of this as a religion, which is, and as specifically as a Northern California sort of religious cult. [00:31:57] Because that's certainly not how it's framed at all anywhere else, right? [00:32:02] This is seen as a sort of like just a kind of like intellectual milieu of the internet that has been quite influential and receives pretty much glowing praise up until, let's say, the collapse of FTX from the media, whether it was like in the New York Times talking about CFAR or just sort of as like weird utopian thinking out in the valley. [00:32:28] But it's certainly not usually framed as this. [00:32:30] Yeah, I mean, the funny thing is, like, you really got to understand, like, this stuff is influential in ways that like are sometimes a little mask and sometimes not at all. [00:32:42] Like, I mean, for instance, this, this post you read was from the Y Combinator forum, right? === Nazi Ideology Metaphors (14:42) === [00:32:46] Y Combinator is led by Gary Tan, a malignant dwarf who haunts the back alleys of San Francisco. [00:32:54] And he was the perfect height to suck my dick without bending over. [00:32:58] But not that I would take it because the teeth are disgusting. [00:33:02] The, no disrespect to our British listeners. [00:33:05] The, like, Gary Tan, for instance, embraced a little pathetically, which goes along with basically everything he does, the doctrine of EAC, effective accelerationism, which was, at least in terms of naming, I guess, an attempt to marry some of the rationalist stuff with like Nick Land's, like a sort of dumbed-down version of Nick Land's already pretty dumb thesis already. [00:33:32] And like, that's the thing. [00:33:33] I want to pause here. [00:33:35] I read a lot of this stuff. [00:33:37] And like, through the years, I have read a lot of this stuff. [00:33:40] And like, frankly, I'm sick of science fiction. [00:33:45] And I actually think, I feel like you might have said this on the show before. [00:33:47] I grew up loving science fiction. 
[00:33:49] I read every Robert Heinlein book, including most of the incest ones, which we don't have to include all those in the canon because he was getting too weird in the incest very late in his career. [00:33:58] He's a Nazi, basically, is Robert Heinlein. [00:34:00] But like, I read every Philip K. Dick book. [00:34:02] I read all these like golden age, not golden age, I guess that's whatever age of science fiction books. [00:34:07] And I was like really into them. [00:34:08] One of my favorite books is a science fiction book. [00:34:11] These people read too much science fiction. [00:34:13] And I think we actually need like a 10-year pause on all sales of science fiction books until we can work some things out here. [00:34:20] Because it's like, you know, Nick Land's whole thing. [00:34:22] And by the way, just listen, he's a guy who, I've said it before, Belden's Law. [00:34:27] If you do enough speed, you either become a Nazi or gay. [00:34:30] And if you do enough, you become a gay Nazi. [00:34:33] He did enough speed to become a Nazi and moved to China. [00:34:37] This stuff is just like a mishmash and sort of like pop version of the EAC stuff of these sort of very arcane, almost like semi-religious writings that it draws from. [00:34:51] Yeah. [00:34:51] I would say that like, you know, it's funny going back through, it's like, oh my God, we did that whole, we did that whole episode on Nick Land and effective accelerationism when fucking Andreessen wrote that stupid blog post way back when. [00:35:06] And, you know, we've, yeah, like we said at the beginning of the show, we've all kind of like touched on these other things, but this, this big, like, like rationalism or the like rational sphere sort of covers, it's like a big umbrella term for that, for like that stuff and more. [00:35:22] I mean, I think there's rationalists, there's social Darwinists, there's libertarians, there's ethno-nationalists, there's race realists, there's fucking pickup artists, there's reactionaries, there's fucking traditionalists, there's AI futurists, and they all kind of ended up meeting under this big worldview, right? [00:35:40] Which basically is the belief that nothing, like no fear of political correctness or uncontrollable or unwanted emotion should get in the way of a human's ability to see the world as it is. [00:35:56] And I'm just going to like put that out there because I'm not sure. [00:35:59] I think, again, I think a lot of this stuff is contestable on its own terms, but we don't need to do that here. [00:36:04] But they, you know, for them, it's like nothing's off limits. [00:36:06] And so that meant in this sphere, having a lot of conversations about taboo topics, right? [00:36:13] Like, how do we think through and debate rational explanations for social norms, like the prohibition on incest? [00:36:21] And there would be these like lengthy debates on that, right? [00:36:25] And so I think because of their very open and like insistence on having a debate about anything and everything, it became a big tent, like a big tent community by necessity because they were basically like, well, if you can talk about anything, you can talk about anything with anyone. [00:36:48] And so a lot of like fascists and neo-reactionaries were like under the umbrella, right? [00:36:54] Or a lot of like race realists. [00:36:56] A lot of the now human biodiversity stuff comes out of this as well. [00:37:01] Razib Khan or whatever that fucking, another stupendously odd looking human being.
[00:37:08] I mean, by God, no wonder he got into genetics or whatever. [00:37:13] But like, you know, I think he like lost a potential job at the New York Times opinion section because of essentially his involvement with this scene. [00:37:20] Yeah. [00:37:20] And like, you know, his, his race and IQ thing. [00:37:25] And that's like, that's, you see, like, like Elon's Twitter, hate to bring it up, but like. [00:37:31] His X. [00:37:32] His, actually, everything app. [00:37:34] Like oftentimes interacts with these like sort of like very rationalist, you know, accounts that are like rationalist style accounts, but particularly the race ones. [00:37:44] Yeah. [00:37:45] Like all these accounts are like obsessed with like race and IQ and like race and crime rates and race and whatever, whatever, whatever. [00:37:53] And it becomes like almost like a gussied up sort of scientific way to talk about it. [00:37:57] I mean, you see like you know, remember that couple, we did an episode on them. [00:38:03] The couple that was like raising those like super kids. [00:38:07] Like those are people like this. [00:38:08] You know what I mean? [00:38:09] Like there's a lot of like, even, you know, Bryan Johnson is essentially like a person like this. [00:38:14] There's a lot of these people. [00:38:17] Well, I think it's like important for us to kind of name check all of this stuff that's happening right now because like this is an extremely influential movement. [00:38:27] And a lot of the stuff that people dislike or a lot of the stuff that you know we see either on X, like you're saying, or on, you know, with like Bryan Johnson or all of these sort of like, even a lot of people in the fucking Trump admin are coming out of this big tree. [00:38:43] Yeah, I mean, let's call a spade a spade here, right? [00:38:45] Like Marc Andreessen, the abnormal looking gentleman, Mr. Egg. [00:38:52] He like has adopted a lot of, he has tried to sort of like do a merger of a lot of this kind of stuff with a more like he has tried to, I guess, give emotion to some of it with like a lot of his writing, right? [00:39:04] Like the Techno-Optimist Manifesto, his, you know, that puerile bullshit. [00:39:09] Like he fucking, he, a lot of those ideas are like essentially kind of cribbed from this, but like it's funny, even like those who are like accelerationists, which I wouldn't call like Yudkowsky or like the core people like at Miri or anything like that. [00:39:23] No, also the opposite. [00:39:25] Yeah, the opposite. [00:39:25] But it all comes from like the same, I guess, like intellectual milieu and like literally the same like like these people like rubbing shoulders with each other. [00:39:34] I mean, I was looking at Blake Masters' Tumblr and pause. [00:39:38] First of all, that's crazy. [00:39:43] Blake Masters is who is the nominee for or no, the rumored. [00:39:49] He was the rumored nominee for ATF head. [00:39:52] But like, like I was looking at, you know, he's, he's taking notes on, he's fucked Peter Thiel. [00:39:58] I would bet genuine hard money, rationalist style, that Blake Masters has fucked Peter Thiel. [00:40:05] Or has, no, has either had sex, has had some kind of sexual, he sucked his dick at least once. [00:40:14] But we are a comedy podcast. [00:40:17] He's a fucking comedy podcast. [00:40:18] I'm saying that in a metaphorical way. [00:40:20] Everything I say here is just metaphorical, by the way. [00:40:22] We're actually a metaphorical podcast. [00:40:24] But he, it's like a poetry kind of thing.
[00:40:26] But he takes these class notes and I'm like looking at the guests in these classes and like, yeah, so many of them are basically from this community, right? [00:40:35] Like Thiel was bringing them into like Stanford to talk to these adult men with Tumblrs and like, you know, discuss ideas with them and all this kind of shit. [00:40:44] And it produces people like Masters, who, by the way, is a Manchurian candidate. [00:40:48] That is straight up Masters' reinvention of himself, him trying to like him trying to fucking reverse engineer JD Vance. [00:40:55] You know what I mean? [00:40:56] And be like, I'm a salt of the earth kind of guy. [00:40:58] You're a perennial fucking loser, Blake. [00:41:00] You are nothing but Peter Thiel's bagman. [00:41:03] You will be nothing but Peter Thiel's bagman. [00:41:05] Know your fucking place, dog. [00:41:07] You are at his fucking beck and call. [00:41:09] You are the guy who carries the briefcase and stands three feet behind the boss. [00:41:12] You are not going to be in fucking Congress, or maybe he will. [00:41:25] So, okay, this is Scott Alexander, who, again, is part of this tree. [00:41:29] He started the blog Slate Star Codex, we mentioned. [00:41:33] I'm just saying this stuff as if nobody knows, which I think is a good policy. [00:41:38] So, this is him in 2016. [00:41:39] This is what he said: the rationalist community is a group of people of which I'm a part who met reading the site Less Wrong and who tend to hang out together online, sometimes hang out together in real life, and tend to befriend each other, work with each other, date each other, and generally move in the same social circles. [00:41:53] True. [00:41:54] Some people call it a cult, but that's more of a sign of some people having lost vocabulary for anything between, quote, totally atomized individuals and quote, outright cult than any particular cultishness. [00:42:05] It can't be just about being rational, and it can't just be about transhumanism, and it can't just be about Bayesianism. [00:42:14] So, what exactly is it? [00:42:15] The question has always bothered me, but now after thinking about it, I finally have a clear answer. [00:42:19] Rationalism is the belief that Eliezer Yudkowsky is the rightful caliph. [00:42:25] No, sorry. [00:42:26] I guess that's him doing a joke. [00:42:28] I think the rationalist community is a tribe, much like Sunni or Shia, that started off with some pre-existing differences, found a rallying flag, and then developed a culture. [00:42:38] So, that culture. [00:42:40] Hold on. [00:42:42] Is he saying that Sunni and Shia started off with pre-existing differences and then came together under Islam? [00:42:48] I don't think he really knows what he's fucking talking about. [00:42:51] But the culture that he's talking about, I do, you know, I want to say, which is like to say the rationalist culture, its obsessions, its languages, its mode of thinking, its view of the world, all of its little shibboleths and so on, is completely and totally intertwined, enmeshed, inextricable from Silicon Valley culture as a whole, right? [00:43:10] So, think of the ideology this way. [00:43:12] We're going to break it down. [00:43:13] There are intelligent creators, whether they are people who write computer programs or people who fund the people who write computer programs or potentially the computer programs themselves, who can get us out of the messiness of being human, right?
[00:43:33] And the way to do that is through a rationalist, becoming a rationalist, through a new rationalist mode of thinking and approaching the world. [00:43:41] And so we mentioned this, but this trickles down to everyone in Silicon Valley who was meeting both online and in real life at all of this, these sort of like different rationalist meetups. [00:43:56] I'm sorry. [00:43:57] Let's call a spade a spade here, fucking cuddle puddles. [00:43:59] And I'm not kidding. [00:44:00] One of the people, I mean, listen, there's a reason. [00:44:02] It's not just that, though. [00:44:03] I'm saying it's like infected the whole culture. [00:44:06] Yes. [00:44:06] I mean, there's a, yes, yes. [00:44:08] But like, there is a reason that one of the people involved in the officer involved possible, probable friendly fire shooting that happened with Border Patrol. [00:44:16] One of their last tweets was about the ethics of cuddle puddles. [00:44:19] You know, I think it was them. [00:44:20] It was one of the people that is involved in the murders here. [00:44:23] Like, those are like a major facet of this is group sex, frankly. [00:44:30] But they have like all of these like seminars and they have like retreats. [00:44:34] And different startups and different projects and also, yeah, different workshops, both in institutions and outside of them. [00:44:42] And this is all happening in the Bay Area, I would say, post-2008 and on. [00:44:48] Yeah, I mean, there's a term that we've referenced, I think, before in the podcast called TESCREAL. [00:44:54] Stands for transhumanism, extropianism, singularitarianism, modern cosmism. [00:45:03] I'm just reading out the Wikipedia here. [00:45:05] Rationalism, effective altruism, and longtermism. [00:45:08] And I think that sums up like a lot of the different kind of bubbles that these people are in that all are blown from the same little bubblemaker. [00:45:16] Yes. [00:45:17] So as much as this was trickling down to everyone that was sort of just populating all of these different companies, these different startups and shaping a lot of the technology that we use today and, you know, other places as well. [00:45:30] It also trickled up, like you said. [00:45:32] Like Elon and Grimes linked up over their shared love of Yudkowsky's Roko's Basilisk. [00:45:37] Like that's important to remember. [00:45:38] And if you really want to get bummed out, search Rococo Basilisk on Grimes' Twitter and you see Rococo. [00:45:45] I think it's a joke. [00:45:46] It's a joke. [00:45:47] It is a joke. [00:45:48] She named a character. [00:45:49] This is a song lyric. [00:45:51] No, it is a character. [00:45:52] I looked it up after you told me that as a character in a 2015 music video. [00:45:55] How long has Grimes been around? [00:45:57] So long. [00:45:57] I feel like that's when she came out. [00:45:59] Very long. [00:46:00] But I just, I've missed the whole thing. [00:46:01] I just remember the first song, and then I never heard it again. [00:46:04] And then she had a baby with Elon Musk. [00:46:06] Not my thing. [00:46:07] Rococo's an Arcade Fire song, too. [00:46:09] Peter Thiel gave over a million dollars to the Machine Intelligence Research Institute, right? [00:46:14] SBF was feted in the press as a philanthropic hero for championing effective altruism right before he was convicted in one of the largest fraud cases in American history. [00:46:24] Like all of this stuff is going all the way to the top.
[00:46:26] Several of the top AI firms have very strong connections to this scene. [00:46:33] As does the modern iteration of like all these event market things. [00:46:36] I mean, we talked about like, you know, in our episode on election gambling, like a lot of like the events markets, prediction markets, like these. [00:46:45] Futarchy. [00:46:46] The what? [00:46:47] Yeah, futarchy. [00:46:48] Like that comes from this too. [00:46:49] No, it's like it does. [00:46:50] It all comes from this shit. [00:46:52] We're literally going to talk about it. [00:46:53] So let's talk about this tree specifically, right? [00:46:55] Let's start with Less Wrong. [00:46:56] Let's start with this. [00:46:57] Okay. [00:46:58] Less Wrong is an online forum, a community. [00:47:01] Itself, it was an offshoot of an early blog project called Overcoming Bias, right? [00:47:07] Overcoming Bias, that started back in 2006. [00:47:11] The principal contributors were the futarchy guy, aka Robin Hanson, the political betting market economist, who is Luigi Mangione's favorite, remember? [00:47:23] Yeah. [00:47:24] And Eliezer Yudkowsky, the AI researcher. === Aella's Influence on Less Wrong (04:17) === [00:47:28] I listen, I talked a lot of shit on Yudkowsky, and I probably will continue to because he has rebuffed me. [00:47:35] But the reality, the reality is Yudkowsky is kind of, I find him interesting. [00:47:42] I mean, he dropped out of high school. [00:47:44] I don't think he went to fucking high school. [00:47:46] He's like never written like a peer-reviewed paper. [00:47:48] He's like not a scientist. [00:47:50] He's just very, let's say, I'm using a euphemism here, focused. [00:47:56] Yudkowsky's posts on Overcoming Bias were the inspiration for the offshoot, meaning Less Wrong. [00:48:02] A lot of Less Wrong's topics were based around early rationalist bugbears, I'll say, which is cognitive science, game theory, Bayesian statistics, logic, Evo psych, and artificial intelligence. [00:48:18] I'd say almost everyone on there, they love to do the two things that I hate, which is debate endlessly about hypotheticals. [00:48:28] I hate hypothetical thinking. [00:48:32] I hate this. [00:48:33] I do not understand it. [00:48:34] One of the ways that you might have kind of come into contact with this stuff if you haven't been searching it out is through the Twitter account belonging to one of the most beautiful women in the world, which is actually difficult because I view all women as equally beautiful, but that equality has bumps in it. [00:48:51] And this is a particularly, let's say, generous hillock. [00:48:56] Ayella. [00:48:57] Ayea? [00:48:58] However you say her fucking name. [00:48:59] I want to say like Paella. [00:49:01] Paella? [00:49:01] Is that Aya? [00:49:03] Yeah. [00:49:03] I think it's Ella. [00:49:05] It's Ella. [00:49:06] I have no idea. [00:49:06] I read it as Ella. [00:49:07] I read it as Aiella. [00:49:10] Yeah. [00:49:10] Ayella. [00:49:13] I'm old Yella. [00:49:14] We'll square the circle. [00:49:15] It's Ayella. [00:49:19] She is sort of famous for being a sex-focused rationalist who, by the way, after this story broke, had to come out and say that she didn't know anybody involved in this. [00:49:31] Although she disavowed. [00:49:33] She disavowed. [00:49:33] She disavowed, which is smart to do. [00:49:35] Yeah, you got to disavow. [00:49:37] But this is actually one of her tamer questions. [00:49:40] She often does polls on Twitter and on her Substack.
[00:49:43] Famous for having the Ayella group sex, gangbang, birthday gangbang. [00:49:49] I'm sorry to talk like this. [00:49:50] I can't make iconic listening. [00:49:51] I'm talking about this anymore. [00:49:52] For not showering famously. [00:49:54] Well, that's what I'm about to get to. [00:49:55] But the gangbang, which she famously had several people, quote, coming in, not even in the fluffer. [00:50:03] I'm actually not quoting there. [00:50:04] That's from memory. [00:50:07] Frankly, something that disgusts me. [00:50:09] But this is a typical sort of Aya, Ayella, Aya. [00:50:15] Mind question. [00:50:17] Keep in mind, this person is like in this group. [00:50:19] Like half of Yudkowski's replies are to her alt. [00:50:22] Really? [00:50:23] Yeah. [00:50:24] I'm standing 50 feet downwind from you, hornily. [00:50:27] However, I haven't showered in 14 days. [00:50:30] My friend runs up to you and assures you that I don't stink, but it's unclear if they're trustworthy. [00:50:34] You can't come closer without committing to having sex with me. [00:50:37] Assume you can't back out. [00:50:38] You, A, have sex with me. [00:50:40] B, do not have sex. [00:50:42] Oh my God, this is a two-box problem. [00:50:44] She's literally rephrasing the two-box problem as having sex with me. [00:50:47] Yes, or that, yes, she's doing the two-box problem. [00:50:50] I'm going to say, wait, I don't understand. [00:50:52] The choice is just to not have sex with Ayella, or this presupposes that I just want to have sex with Ayala. [00:51:00] However, she mentioned showering in that, and she's famous for sort of another way that she deals with showering is that she does not shower famously. [00:51:12] She showers something less than 50 times a year. [00:51:15] And she famously records all of her actions, such as crying, which seems to be almost a daily occurrence, and having sex, which happens significantly more than the amount of times that she showers. [00:51:29] And so like, we're dealing with these people who are like, a lot of her like hypotheticals involve like child sex. [00:51:35] And frankly, that runs through this community like fucking wildfire. [00:51:38] A lot of these people do that. [00:51:39] Well, that was like my point about how that's like, oh, we should find, you know, like forget what society says. === Sequence of Thought (15:32) === [00:51:46] What we need to do is come to our own conclusions about what should be right. [00:51:50] So how do we make a rational, like, what is the rational case for not having sex with children? [00:51:55] And like, that is the typical Ayella, Ella, Ella tweet, but it's also a typical post on fucking less wrong. [00:52:04] Well, so like they'll just like do like a trolley problem. [00:52:06] Like, would you kill one person to save two people or blah blah blah? [00:52:09] But instead, they'll always frame it as like, would you have sex with one child to save two children from being molested? [00:52:15] And you're just like, what? [00:52:18] Why are you fucking talking? [00:52:19] This is always, so your, I love that your response to this is like, I don't understand. [00:52:25] Why are you asking me that? [00:52:27] My response is always like, I wouldn't worry about it. [00:52:30] Yeah. [00:52:30] That's why I say here's all of their hypothetical questions. [00:52:34] You want to talk about Roko? [00:52:35] I'm like, you know, I wouldn't worry about that. 
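Since the two-box problem came up a minute ago, here is a minimal sketch of the standard Newcomb setup the hosts are gesturing at, with the textbook payoffs of $1,000 in the transparent box and $1,000,000 in the opaque one and an assumed 99% accurate predictor; none of those numbers are from the episode, and the expected values below are the evidential-decision-theory reading, which is only one side of the one-box-versus-two-box argument.

```python
# Newcomb's "two-box" problem, sketched with the textbook payoffs:
# a transparent box always holds $1,000; an opaque box holds $1,000,000
# only if a very accurate predictor foresaw you taking just the opaque box.
# The predictor accuracy is an assumption here, not anything from the episode.

PREDICTOR_ACCURACY = 0.99
SMALL = 1_000        # transparent box
BIG = 1_000_000      # opaque box, filled only if "one-boxing" was predicted

def expected_value(one_box: bool) -> float:
    """Evidential-style expected value of each choice, given the predictor's accuracy."""
    if one_box:
        # The opaque box is full whenever the predictor correctly guessed one-boxing.
        return PREDICTOR_ACCURACY * BIG
    # Two-boxing: you always get the small box, plus the big one only if
    # the predictor wrongly expected you to one-box.
    return SMALL + (1 - PREDICTOR_ACCURACY) * BIG

print("one-box :", expected_value(True))    # 990,000.0
print("two-box :", expected_value(False))   # 11,000.0
# The whole fight between decision theories is over whether this kind of
# expected-value calculation is even the right way to score the choice.
```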
[00:52:37] My whole thing, even with the trolley problem, I'll cross that bridge when I fucking come to it. [00:52:42] You know, like, it's not my fucking, I don't know. [00:52:44] Mind your business. [00:52:45] I'll do whatever I fucking feel like. [00:52:46] How about that? [00:52:46] Oh, you don't know what feeling is like. [00:52:48] So to get back into this, Brace, will you read from this? [00:52:51] This is a 2009 canonical post from Yudkowsky called, What do we mean by rationality? [00:52:59] I mean two things. [00:53:01] Epistemic rationality, systematically improving the accuracy of your beliefs. [00:53:08] And number two, instrumental rationality, systematically achieving your values. [00:53:16] I don't get it. [00:53:17] Well, no, keep going. [00:53:18] He explains. [00:53:19] Oh. [00:53:21] The first concept is simple enough. [00:53:23] When you open your eyes and look at the room around you, you'll locate your laptop in relation to the table and you'll locate a bookcase in relation to the wall. [00:53:33] If something goes wrong with your eyes or your brain, then your mental model might say there's a bookcase where no bookcase exists. [00:53:39] And when you go over to the bookcase, you'll be disappointed. [00:53:46] Oh, no. [00:53:49] This is what it's like to have a false belief, a map of the world that doesn't correspond to the territory. [00:53:54] Epistemic rationality is about building accurate maps instead. [00:53:58] The correspondence between belief and reality is commonly called truth. [00:54:02] And I'm happy to call it that. [00:54:04] Instrumental rationality, on the other hand, is about steering reality, sending the future where you want it to go. [00:54:13] It's the art of choosing actions that lead to outcomes ranked higher in your preferences. [00:54:17] I sometimes call it winning. [00:54:21] So rationality is about forming true beliefs and making decisions that help you win, which is an interesting thing to say because you appear to be a loser. [00:54:33] This is like, I'm sorry, this is actually part of a much, much, much longer post. [00:54:38] Yeah. [00:54:39] You wouldn't believe this, but these guys are kind of blowhards. [00:54:41] Apparently, it's not rational to have editors. [00:54:45] What is your rule about putting on jewelry, Liz? [00:54:48] What do you mean? [00:54:49] Like, don't you order accessories? [00:54:50] What's your rule about putting on accessories? [00:54:52] No, it's not my rule, but this is the classic rule. [00:54:54] It's Liz's law. [00:54:55] Liz's law. [00:54:56] Liz's law of accessories. [00:54:57] No, but you just, you look in the mirror and you, whatever, like, it's take one thing off right before you leave. [00:55:03] Yeah. [00:55:03] One thing off. [00:55:04] You always got too much going on. [00:55:06] You probably don't think you do, but that's wrong. [00:55:08] You do. [00:55:09] I learned this when I was first getting into punk. [00:55:11] I would, you know, you'd see a lot of people wear a lot of different band pins on them. [00:55:15] But actually, that looks like shit. [00:55:17] And you should, if you're doing one, you should only ever wear one pin on you at any time. [00:55:21] Really? [00:55:21] And yeah, definitely. [00:55:23] You wear like a little fucking button on the, and the little, that part of your leather jacket right there. [00:55:27] Anything else is too much. [00:55:29] Anything else is too much.
[00:55:31] And actually, I should take back these guys being losers because I do judge these people as being the comic book guy from The Simpsons. [00:55:37] But the reality is, like, they actually, I guess they own Tesla and DOGE. [00:55:44] He says, this is from another post. [00:55:46] I often use the metaphor that rationality is the martial art of the mind. [00:55:50] You don't need huge bulging muscles to learn martial arts. [00:55:53] Similarly, if you have a brain with cortical and subcortical areas in the appropriate places, you might be able to learn to use it properly. [00:56:00] If you're a fast learner, you might learn faster. [00:56:02] But the art of rationality isn't about that. [00:56:04] It's about training brain machinery we all have in common. [00:56:08] And where there are systematic errors human brains tend to make, like an insensitivity to scope, rationality is about fixing those mistakes or finding workarounds. [00:56:21] So in Less Wrong parlance, these posts were part of a sequence, which is just like a collection of posts on topics, but they call it a sequence. [00:56:30] They eventually put all of these together into a big book called Rationality: From AI to Zombies. [00:56:39] It is, and this is true, it is six volumes. [00:56:42] It is 22 chapters, but Yudkowsky has written way, way, way more like sequences than this. [00:56:48] There's ones on ethical injunctions, which tell you why you should still follow some prohibitions, even though you have thought up some like clever logical reason that gets you around them. [00:57:00] I think that's a good role for these people to have. [00:57:02] Yeah. [00:57:03] There's the fun theory, fun theory sequence, which is, I wonder if we're ever going to run out of fun in the universe. [00:57:13] And if so, how and when and what do we do? [00:57:17] There is a meta-ethics sequence, the free will sequence. [00:57:21] There is the 742-page Robin Hanson and Yudkowsky AI-Foom debate, which is a blog conversation between the two of them about whether or not AI was going to kill us. [00:57:35] We're going to talk a little bit more about that later. [00:57:38] My advice here is like these, I think, I genuinely think, and again, I feel weird being, I don't like Yudkowsky and I don't like these people, but I actually feel like I'm being cruel by making fun of him. [00:57:50] Yeah, I know. [00:57:50] And so I'm going to stop and I'm going to apologize right now. [00:57:52] Elijah Yudkowsky, if you had been born 50 years prior to when you were, you would have been like a middling to decent inventive science fiction writer. [00:58:02] I think that would have been a good place for you to be. [00:58:04] I think that would have made you happy. [00:58:06] And I think that would have made the world, I don't know, better, but it would have made the world fine. [00:58:13] The world would have still been fine with your absence. [00:58:17] Well, he also wrote Harry Potter and the Methods of Rationality, which we read from at the beginning of this episode, and is a work of Harry Potter fanfiction that is 122 chapters and over 660,000 words. [00:58:33] I don't really want to know like what is in there, I'm going to be honest. [00:58:38] We learned a little bit. [00:58:39] Gringotts. [00:58:40] Well, that's from the original, but like apparently Harry in this wants to discover the laws of magic and become a god. [00:58:48] Like that's part of it. [00:58:49] And surprise, it's not magic, but rationality that gives Harry godlike powers.
[00:58:56] Would you look at that? [00:58:57] Which lets him shape the world in his image. [00:59:00] And this is a quote. [00:59:01] Where the descendants of humanity have spread from star to star and won't tell the children about the history of ancient earth until they're old enough to bear it. [00:59:09] And when they learn, they'll weep to hear that such a thing as death had ever once existed. [00:59:16] See, like that would be good in like an original short story that you wrote in some science fiction magazine in 1974. [00:59:23] 1928. [00:59:23] Like 1928. [00:59:25] No, and 74 would have been a little bit more. [00:59:27] In 74, because he could have done the whole like new wave of science fiction thing under like Harlan Ellison. [00:59:32] If you did invent a time machine and you went back and you did Harry Potter, but as science fiction in the 1970s, you're right. [00:59:39] That would be crazy. [00:59:40] People go nuts for that. [00:59:41] People go crazy. [00:59:42] It would blow up the whole genre. [00:59:43] Yeah. [00:59:44] I did look up reviews of it on Harry Potter fanfic forums, and this was a review. [00:59:49] A largely forgettable, overly long nerd power fantasy with a bit of science, parentheses, most of it wrong, and a lot of bad ideas, 1.5 stars. [01:00:00] Okay. [01:00:01] Well, Harry Potter, you got that from a Harry Potter fanfic forum, Liz. [01:00:05] I did. [01:00:05] I'm going to tell you that that's an unreliable fucking, that isn't, ethically. [01:00:09] Your meta-ethics are all kind of, a lot of people. [01:00:11] There should be at least one audience for that. [01:00:14] Harry would not approve. [01:00:15] Harry would kill you. [01:00:16] You'd be the last human to die because those people want Harry to fuck. [01:00:20] Those people want to see Harry Potter's fucking penis. [01:00:22] And I'm sorry. [01:00:23] This is another thing. [01:00:25] We're talking about bugbears. [01:00:27] We're talking about bugbears. [01:00:28] Fanfiction also needs to be banned because I think people discover strange things about themselves. [01:00:34] I think fanfiction exists in the minds of young, impressionable people as like almost like an eldritch horror that like comes in and spreads these like crazy ideas. [01:00:43] It's a Roko. [01:00:43] It's a Roko. [01:00:45] It's a fucking info hazard. [01:00:46] It is a motherfucking infohazard. [01:00:48] Okay. [01:00:50] Fantastic transition there, Liz. [01:00:53] One of sort of the most famous, actually, I'm going to say the most famous post from Less Wrong. [01:00:59] And its greatest contribution to society is something called Roko's Basilisk, posted by user Roko in 2010, who later, I believe in a Reddit thread that I read, says, Less Wrong also has a Reddit, which is yeah, I don't know. [01:01:17] They all also post there. [01:01:18] He's like, I regret even thinking of this, let alone writing it. [01:01:22] It was banned for many years on Less Wrong because it is what is called an info hazard. [01:01:29] Brace. [01:01:30] What is an info hazard? [01:01:32] I thought you'd never ask. [01:01:33] The term info hazard was coined by Nick Bostrom in 2011. [01:01:38] An information hazard, or infohazard for short, is some true information that could harm people or other sentient beings if known. [01:01:46] It is tricky to determine policies on information hazards. [01:01:49] Some information might genuinely be dangerous, but excessive controls on information has its own perils.
[01:01:55] That is, I didn't attribute it because I forgot, but it is from, that's a definition from the Less Wrong website. [01:02:02] I'm just going to say it right now. [01:02:03] It's a trigger warning for even more autistic people. [01:02:06] So Roko's basilisk comes from a post on Less Wrong in 2010 called The Quantum Billionaire Trick, which, amidst an incredible, incredible glazing of Elon Musk, and I'm using Zoomer speak there because there's nothing else I can describe. [01:02:21] And I've already used too many sex metaphors, and we're only like a little bit into this, posits what became the thought experiment Roko's basilisk, which is summed up like this. [01:02:32] And I'm summing this up from my own. [01:02:35] I'm not even going to look at my notes here. [01:02:37] I think you can do it. [01:02:40] Warning, warning, infohazard, warning. [01:02:43] So a basilisk in like myth, Liz, is an ancient fucked up lizard that looks at you and kills you with its piercing gaze. [01:02:54] Isn't it kind of snake-like, too? [01:02:56] The Wikipedia for basilisk has a woodblock print of a basilisk, and to me, it looks like Watto. [01:03:01] Like his face. [01:03:03] It looks like fucking, look at his face. [01:03:05] Oh, the face absolutely does. [01:03:07] Watto-coded. [01:03:08] It is sort of more lizard-like than snake. [01:03:12] A legendary reptile. [01:03:13] I don't fuck with lizards of any variety. [01:03:16] I don't like cold-blooded creatures. [01:03:17] Well, except I wouldn't call you, I can't call you that. [01:03:20] You told me I couldn't call you that. [01:03:21] And you literally can't. [01:03:23] You can't. [01:03:24] It was a test. [01:03:26] It was game theory. [01:03:27] So a basilisk is a fucked up lizard that can look at you and make your mind go crazy or kill you with it. [01:03:33] It kills you with its mind. [01:03:37] Roko's basilisk. [01:03:40] Imagine in the future that there is, that we build an all-knowing, all-powerful, infinite God-like being that is AI, right? [01:03:52] Like we build God, essentially, which in the Miri sense, the Less Wrong sense, is like we build an AI that is functionally God. [01:04:03] It will know everything. [01:04:05] It will know what size shoes you wear. [01:04:08] It will know everything about you, even though you're long perished since it, it's, you know, it's because it's far in the future. [01:04:15] It will be able to resurrect your consciousness and torture you for eternity, which is literally just a ripoff of the famous Harlan Ellison short story, I Have No Mouth, and I Must Scream, because you did not help bring it into existence. [01:04:32] And if you read Roko's Basilisk, that thought occurs to you, and now you're guilty. [01:04:38] That's why it's an info hazard. [01:04:39] So like prior to reading Roko's Basilisk, this thought may have never occurred to you. [01:04:44] But now that you have read it, God is going to know that you know this and use this future blackmail to get you to either torture you or now you actually have to work towards bringing it about. [01:04:57] Okay, let me one-up you on this. [01:04:59] How do we know that the future AI, all-knowing, all-powerful creation and creator, it has not seeded this as a test for humanity that we all must pass or not? [01:05:24] Interesting that you say that, Liz. [01:05:27] Interesting that you say that. [01:05:29] Because we actually could right now be living in a simulation that that basilisk is running in the future as a test for our real selves.
[01:05:39] But every time that it's, this is, unfortunately, we get into a multiverse here. [01:05:43] Every time the Roko's basilisk AI super god in the future. [01:05:46] Which it does seem like we do live in a multiverse. [01:05:48] Oh, so true. [01:05:49] Imagine a world where every grain of sand was the same except for one. [01:05:52] That thought blew my mind when I was seven years old. [01:05:54] And then you know what? [01:05:55] I stopped thinking about it. [01:05:57] But imagine in the future. [01:05:58] It's a lot of microplastics. [01:05:59] It's running. [01:06:00] I know, because I'm like, my balls are, who knows what's in there already? [01:06:03] You're telling me that plastic's not going to hurt. [01:06:06] In the future, we might be long dead right now, Liz. [01:06:11] Like this second, we might be long dead. [01:06:14] But the AI in the future is running a test on you individually right now. [01:06:18] So you're actually the only, quote, real person, but you're not a real person. [01:06:21] You're part of the simulation, but the simulation is just to you. [01:06:24] In the future right now, seeing if this would work, it's like we might be a test right now. [01:06:29] You might be like a test of code right now. [01:06:31] We all might be a test of code right now. [01:06:34] And guess what? [01:06:35] I don't know. [01:06:36] I don't think we are. [01:06:37] I'm just going to say this right now. [01:06:38] I don't think that's true. [01:06:39] I don't think that AI in the future is going to torture us for infinity. [01:06:42] And guess what? [01:06:43] This is what I was thinking. [01:06:45] This info hazard is so scary because it's a fucking, you know what it is? [01:06:49] It is a fucking chain email. [01:06:52] It is a fucking chain email. [01:06:54] That is what an info hazard is. [01:06:56] It is a fucking chain email. [01:06:57] Oh, you read this. [01:06:58] Now you have to send it to 10 people or you die. [01:07:02] These people are fucking writing fucking creepypasta, whatever, chain emails, scaring themselves. [01:07:08] Oh, no! [01:07:09] The Roko basilisk is in the future. [01:07:11] Guess what? [01:07:13] Let me ask you this, Roko. [01:07:16] What if God is real? === Info Hazards as Chain Emails (16:00) === [01:07:18] Like the real like God of the Hebrews, of the first and only, I haven't seen him cancel it, only original and still valid covenant. [01:07:29] Just saying, it's the only one. [01:07:30] It's the first one. [01:07:31] You really are getting into the Bible, aren't you? [01:07:33] I'm just saying. [01:07:33] I understand. [01:07:34] If God is so, listen, let me ask you this, little Belden's wager here. [01:07:38] If God is so, if he's so fucking cool and goated, why do you have to make a second covenant? [01:07:47] Isn't that weird? [01:07:48] Why didn't he just do a first really good covenant? [01:07:51] So you don't have to sacrifice goats and sheep anymore. [01:07:54] But didn't he see that coming? [01:07:56] No, it was just, we're just developing beyond school now. [01:07:59] Like, I don't understand it. [01:08:00] Like, he's like, oh, no more sacrifice. [01:08:02] I'm sorry, like, first of all, we're, yeah. [01:08:05] But what if God is real? [01:08:08] And the future is going to be a fucking showdown between the real God and the AI God that we create. [01:08:18] And of course, real God is going to win because he's been in the universe way longer and he made it, you know, and so he knows all like the tricks.
[01:08:25] You should option that because that sounds like a movie that these people would watch. [01:08:28] God versus God. [01:08:29] But what if we're fucking up right now and we're in that God's basilisk because we don't believe in him? [01:08:35] Well, I'm speaking for myself. [01:08:37] I don't unfortunately believe in God. [01:08:39] But like, what if I'm fucking up that and I get sent to God's basilisk, otherwise known as hell? [01:08:45] But the Jews don't have that. [01:08:47] So I don't, well, we don't really know. [01:08:48] That's kind of confusing me. [01:08:49] This is what I would say about it, which is, don't worry about it. [01:08:53] I think I wouldn't worry about it. [01:08:55] My whole thing is this. [01:08:56] My whole thing is this. [01:08:57] Brace's basilisk is going to destroy Roko's basilisk in the future because if God is real, Pascal's wager. [01:09:04] If God is real, he's way more powerful than machine God because he can do magic, Liz. [01:09:10] He can do magic. [01:09:11] I don't think we call it magic when we're talking about God. [01:09:13] He can do miracles. [01:09:15] He can do fucking magic, dude. [01:09:17] If we sacrifice to him, he gets enough mana and he does magic. [01:09:19] I love how you're talking about this as if it's like a celebrity deathmatch between the two gods. [01:09:24] Well, it won't be a deathmatch because we'll kill the god of Yudkowsky. [01:09:28] Well, he doesn't want the God. [01:09:29] So there's talk about this in the Bible. [01:09:31] So this is the thing. [01:09:33] This is the thing. [01:09:34] Roko's basilisk, like we're mocking it, we're talking goofy whatever right now. [01:09:38] Although Brace is fucking... [01:09:39] He blew people's fucking backs out. [01:09:41] This was like, what? [01:09:42] People were like, oh my God. [01:09:44] Oh, my God. [01:09:45] This is the scariest post anyone has ever written on the internet. [01:09:50] Oh my freaking God. [01:09:53] Oh my freaking God. [01:09:55] If we, I read this and now what do I do? [01:09:58] Well, everyone was infected. [01:10:00] Yes, everyone got infected by Roko's hazard. [01:10:02] I did. [01:10:04] I am infected. [01:10:05] Okay, we are making fun of this appropriately, but also like this literally did. [01:10:10] People fucking lost their shit. [01:10:12] The problem with... Yudkowsky's nodding. [01:10:14] Well, Yudkowsky's reply to it is great. [01:10:17] Listen to me very closely, you idiot. [01:10:19] This post was stupid. [01:10:21] It got banned. [01:10:23] It got banned. [01:10:24] He banned it, but now it's a lie. [01:10:25] They wouldn't ban the human biodiversity, but this. [01:10:28] They were like, you've gone too far. [01:10:30] Well, I mean, even in the, and again, this is all just so we can talk eventually about this like little cult-like group. [01:10:37] It's even important. [01:10:37] Even reading Ziz's blog, which is written in a similar, I'm going to be honest with you, fucking unreadable style to this stuff. [01:10:45] There are like various basically like blocked out portions that you have to like click into that are info hazards that are about Roko's basilisk. [01:10:53] Roko's basilisk is important to Ziz. [01:10:55] We're explaining that next to all these people. [01:10:57] And so like, listeners, if you are a fucking normal, this might give you a little like creepy chill in the same way that like, I don't know, seeing a scary ghost might or whatever, like a picture of a dead lady. [01:11:08] I wouldn't worry about it.
[01:11:09] But like, you're going to be like, okay, whatever. [01:11:14] There's a lot of people actually, so maybe info hazards are real. [01:11:17] A lot of people who come into the rationalist space are unfortunately deeply irrational people and are unable to like do things like think about it for a second, be like, okay. [01:11:28] And they're like, their brains can't handle it and it drives them crazy. [01:11:32] And so like what might sound stupid to you or me or whatever or whoever is very real to a lot of these people. [01:11:39] And not like everyone in the, not all of these people are like, well, what if? [01:11:43] And the reality is like I am scared of super crazy death AI, but like in like 10 years when it does drones. [01:11:49] Like I don't think that like I'm worried about the God thing. [01:11:52] Okay, we can worry about that later. [01:11:53] I'm worried about like the Palantir thing. [01:11:57] But like these people, yeah, these people think that they can make God, I guess, and are scared that they're going to do it in a crazy way, which is where a lot of like the AI safety blah blah blah kind of stuff comes from. [01:12:11] So another pretty integral part of a lot of the logic games and problems they play with each other, seemingly, from what I can understand about it, which I'm going to admit, this one had me going a lot, is functional decision theory. [01:12:30] I honestly can't explain it because some of it just seems kind of like, I don't know, so much of this stuff just seems like, yeah, I don't know. [01:12:37] Isn't that just like the way people think? [01:12:39] Like it seems like trying to mathematize, like, pretty basic actions. Um, but it plays into what we're going to be talking about, especially in this second episode. [01:12:51] Uh, because it leads you to, I think, a method of thinking that isn't necessarily tied to reality or ethics. [01:13:03] I don't know, I can only kind of see the outcomes of this stuff rather than actually, I guess, the mechanistic workings. [01:13:09] So let's get back to Less Wrong, because it fostered, like we said, a really robust, active community. [01:13:16] One of the similar blog forums we mentioned, Scott Alexander's Slate Star Codex, that came later, in 2013. [01:13:22] All this helped build out an ecosystem of really enthusiastic commenters, readers, contributors, all discussing, I mean, when you get to the bottom of this, what is really just like dressed-up forms of self-improvement. [01:13:34] Well, that's kind of what I'm talking about with the functional decision theory stuff is like a lot of it is like, how do you make good decisions? [01:13:40] And for me, I'm like, I don't know, I just make them. [01:13:43] You know? [01:13:44] Fucking make them. [01:13:45] Well, one of the unifying features too is that everyone, whether coming from STEM, higher ed, professional tech world, autodidacts. [01:13:54] A good percentage of neurodivergent people. [01:13:58] Everyone was really skeptical of the humanities as a discipline. [01:14:02] And I just want to call this out. [01:14:04] All of the ways in which human quote unquote emotional responses can cloud judgment, ethics, choice, interfere with rational decision making. [01:14:17] You know, you see this everywhere and it's been building for decades and decades and decades, so much so that I think a lot of people think it is just sort of a common way to look at the world.
[01:14:28] But one of the biggest red flags in the world to me is someone who doesn't see value in the humanities, broadly speaking, which is art, literature, music, history, philosophy. [01:14:38] Unfortunately, people who see the world that way think that that is just kind of like useless human activity, that it has no utility, a word that these people like to use, which means that it has no value. [01:14:56] They basically run everything now, including the humanities. [01:14:59] So it's a pretty unfortunate situation that we're in. [01:15:02] I'd say Less Wrong is really famous for its use of jargon, which makes it incomprehensible to me. [01:15:10] But I do want to point that out because it is an important part of a lot of cults. [01:15:15] Yes. [01:15:15] I mean, I think, I think one, to me, like Miri and all these things like are cults. [01:15:22] I mean, not every cult is fucking Jonestown. [01:15:25] Yeah. [01:15:26] But it is. [01:15:27] And I think especially like, like these people also had to like invent this jargon in order to have like this common language. [01:15:34] That's what makes me kind of feel bad for them because like, I just, I don't know. [01:15:38] That's so pathetic. [01:15:40] But like, give me some of the examples here. [01:15:42] There's scope insensitivity, affective death spiral, counterfactual mugging, mind fallacy. [01:15:50] And that one's my favorite. [01:15:53] Sorry, sweetheart, you're doing a bit of a mind fallacy right there. [01:15:57] I don't know if I like the mind fallacy. [01:15:59] It's, it's. [01:16:00] That sounds like a TV show with the magician. [01:16:03] Yeah. [01:16:03] And like, sometimes, like, so much of the writing, like, just verges on unreadable or is unreadable. [01:16:11] And like, sometimes, again, I've encountered this so many times with the show and specifically we're talking about with like when we're, I was discovering NFTs and crypto. [01:16:21] Yeah. [01:16:21] And I was always like, is there something more here? [01:16:22] Is there something more here? [01:16:23] And sometimes there is something more here when I look into something and I don't understand it on like an actual level. [01:16:28] I'm not smart enough to understand it. [01:16:29] I think in this way, like these people might be stupid in a way that was not possible at any other point in human history. [01:16:35] And like it is, it is the jargon that they use is like, it almost seems like it's a superiority complex kind of thing. [01:16:44] Well, it is like, I mean, this whole thing, for even having that sort of like insular communal logic and language and whatever, it was hugely influential beyond even its like immediate circle. [01:16:56] And it had ideas and concepts that really like filtered outwards. [01:17:00] And we've kind of talked around it, but effective altruism is a really like important example of this. [01:17:06] It's a node. [01:17:07] It is a node. [01:17:08] I mean, but is it now, I don't know, is it discredited? [01:17:11] I'm still doing it. [01:17:14] You know, the philosophy of Sam Bankman-Fried, for example. [01:17:17] And, you know, it's important to remember, Alameda's first office was in fucking North Berkeley, right? [01:17:22] Like it was, this was all happening here at the same time. [01:17:25] Also, the 37-year-old New York Times best-selling, quote, philosopher. [01:17:29] Other people have used that. [01:17:31] I'm quoting them. [01:17:32] Will MacAskill.
[01:17:34] I did not know he was 37 years old. [01:17:36] Let me look at this cat. [01:17:38] Will MacAskill. [01:17:42] Lord. [01:17:43] No, I'm not saying he doesn't look 37. [01:17:44] I'm saying the way that people talk about him is like, he should be like in his 50s or 60s because he's some philosopher. [01:17:51] And I'm like, you can't talk about a 37-year-old like that. [01:17:54] Ooh, I'm looking at this and it says, Will MacAskill on effective altruism, Y Combinator, and artificial intelligence. [01:18:02] Ooh, Armchair Expert, Will MacAskill. [01:18:04] That's a hell of a gap you got in the teeth. [01:18:06] No wonder your mouth is closed in all the other smiles. [01:18:09] Oh, normative uncertainty, Giving What We Can. [01:18:11] If this guy was my professor, I'm asking for a refund. [01:18:15] Philosopher Will MacAskill wants to help humanity last trillions of years. [01:18:19] Goats and Soda, NPR, that might be different things they're talking about. [01:18:24] But Goats and Soda would be a great name for a radio show. [01:18:26] So yeah, I mean, this is long-termism, effective altruism. [01:18:30] We talked about this in relation to Sam Bankman-Fried. [01:18:33] If you want to learn about it more for some reason, which you don't, you can check out those episodes. [01:18:37] But I do want to just summarize real quick because, again, you'll see the influence of this rationalist mode of thinking, which is the arc of the universe is long. [01:18:48] Literally, there will be billions more people in the future than there are today. [01:18:52] And the most important, aka, the most effective thing to do today is thus to prioritize the well-being of the people of tomorrow, even if that's at the expense of the people of today. [01:19:04] It is an extreme, extreme utilitarian point of view, or utilitarianism taken to its most extreme, I would say. [01:19:14] Yeah, I mean, again, like you see sort of elements of this in, I mean, because effective altruism kind of coincides with long-termism, which again, we talked about in one of the Sam Bankman-Fried episodes, which by the way, I texted him the other day and it's green text now. [01:19:27] So he doesn't have that. [01:19:28] He doesn't have the same phone number in jail if he has a phone in there, which he definitely does. [01:19:33] But Musk is like into this stuff in like a typical sort of like Musk half-assed way. [01:19:39] So like he'll, I think he kind of cribs a lot about, I mean, he knows a lot of these people, but like he cribs a lot of ideas. [01:19:45] And like the way he talks about us colonizing Mars is in this very existential way, which like coincides with kind of a lot of the way that these people think about not only like their mission, but about themselves. [01:19:58] And I think like some, I have to say something about sort of the religious and quasi-like sort of messianic character of a lot of people in here. [01:20:08] So like something that one encounters like over and over and over and over in this stuff, and we're talking about everyone from like Ziz to Musk to Sam Bankman-Fried to Yudkowsky, is a lot of them seem to think that, like, the world is eventually doomed in some way. [01:20:24] So there's like a millenarian element to it. [01:20:26] And that we are fallen. [01:20:27] We are fallen, right? [01:20:29] Or we're going to be, yeah, whatever. [01:20:31] Yeah. [01:20:31] Yes. [01:20:32] But I have the power to fix it. [01:20:37] Or AI. [01:20:38] Or AI.
[01:20:39] But like oftentimes. [01:20:40] AI. [01:20:43] Aella can fix it. [01:20:44] But like, you know what I mean? [01:20:45] Like, like, so like my decisions have, like, even minute decisions have such weight and import because like that can ripple out. [01:20:53] And so like, I'm not thinking like when I, when I, when I'm getting out, you know, at the apartment in the morning, like, I'm not thinking of just going to get a cup of coffee. [01:21:01] Like, even me getting a cup of coffee, is that optimized so that I can do the most good, which is, by the way, a very nebulous kind of unfilled-in concept to a lot of these people. [01:21:15] And so like Musk and all these people are basically willing to sacrifice you, and either your money or your life or your happiness or the fucking world you live in, to sacrifice it, first covenant style, to the god of slop, in order to give these benefits to these unknown teeming masses in the future. [01:21:35] It's a fundamental, to me, it seems like a way to mask a fundamental hatred of humanity with a sort of like noble, it's this noblesse oblige. [01:21:48] Yeah, I mean, in Sam Bankman-Fried's case, this was how he rationalized doing as much fraud as possible in the immediate moment. [01:21:55] If he was theoretically somewhere somehow giving it away to all these altruistic causes, you know, prioritizing all of this research for the future, like making sure it was, you know, we're going to defeat climate change and we're going to defeat like AI scaries and like whatever. [01:22:13] That was how he rationalized in that moment. [01:22:15] But I think your point about it being anti-human, anti-social is so important because they literally come out and say that. [01:22:21] They say the problem with being human is being human. [01:22:25] We need to figure out ways to not be human because all of the things that are wrong in this world are because of our humanity. [01:22:33] Like these people fear the AI God because I think on some fundamental human level that's still nestled within their layers that they haven't squashed down yet, they realize that the AI God of the future would resemble them. [01:22:46] And that is a fucking nightmare. [01:22:48] That's what I think. [01:22:49] Like it is, it is. [01:22:51] But then you see people like Altman who probably thinks the AI God of the future will resemble him. [01:22:55] And that's a good thing. [01:22:56] But it also like makes me sad because it's like, yeah, like we are all responsible for some of the worst crimes that you can think of. [01:23:05] I mean, humans are capable of some of the most like horrific and torturous acts on earth. [01:23:12] And, but we're also capable of some of the best and the most incredible things. [01:23:17] And we've achieved the most incredible things. === Understanding Human Complexity (04:32) === [01:23:19] And like you kind of come into understanding that as an adolescent. [01:23:23] Like you, you understand and are able to live with both of those things through the course of your life. [01:23:31] And like part of being human and living in society is coming to terms with and negotiating that complexity. [01:23:41] And so much of that complexity is responsible for all of the incredible products, endeavors, like art creation that we're capable of. [01:23:54] Yeah, I mean, that's, that's like, you know, and again, I said this early in the episode, but like there is a place for everybody in this world, right? [01:24:00] Like you can be a weirdo who thinks different.
[01:24:03] It's not, it's not my problem. [01:24:05] And it's fine. [01:24:06] You know, I think that's, I think that's sort of the beauty of humanity. [01:24:09] My problem is, is that this stuff is really influential with a lot of very influential people and that they think like this. [01:24:15] And like whether, you know, maybe this is an extreme version of it, like these methods of rationalization, because that's what it is. [01:24:22] It's rationalization oftentimes post hoc or as a way to do something naughty and like, you know, but you actually are doing it for the good of humanity. [01:24:33] Like these people fundamentally don't know what it's like to be a human being. [01:24:36] And like, again, I don't know. [01:24:37] I'm sure that they would have some fucking 500,000 page fucking completely jargon-filled, unreadable response to that that they've already written because they can predict everything. [01:24:48] But I'm telling you right now, from my eyes, which are eyes that can see, I'm your little fortune teller. [01:24:54] I'm your little supercomputer. [01:24:55] I can tell you right now, no matter what logical argument you have against me, you do not know what it is to be a man and feel. [01:25:01] Like, I don't think that the art, it's crazy to me. [01:25:05] Um, and also, you're ineffective because you unfortunately got caught doing your crimes. [01:25:21] Okay, so we've talked around it. [01:25:23] Let's talk about Miri. [01:25:24] It's the Machine Intelligence Research Institute, which was formerly called the Singularity Institute for Artificial Intelligence. [01:25:32] I want to pause right here and say singularity is, I guess, the concept where like eventually things change so much that there's like a whole new way of living. [01:25:42] I guess in broad strokes, I mean, I think, isn't it like specifically when technology like AI becomes so smart that it is smarter than humans? [01:25:52] I think so. [01:25:53] And so it overtakes our ability. [01:25:55] I think so, but like that brings us into like a new way of living. [01:25:59] Like they, we can no longer predict what's going to happen. [01:26:01] We can't keep up with it because it's advancing faster than our ability. [01:26:06] Yeah, yeah. [01:26:07] Yeah, I guess maybe I'm wrong on that, but like it's like achieves the singularity or whatever. [01:26:14] Yeah. [01:26:14] Now, the institute was founded by Yudkowsky back in 2000 in Berkeley. [01:26:21] I think it had a bit of a slow start up until, a lot of hills. [01:26:28] A lot of hills. [01:26:29] Up until, I mean, Yudkowsky was kind of doing his thing there. [01:26:33] And, you know, people were listening, but were not, I would say, enraptured by him until he kind of linked up with Peter Thiel in 2005. [01:26:43] And through a kind of mix of social connections and some money, the Institute was able to grow. [01:26:53] And the staff, he was able to hire a lot more staff. [01:26:56] The blog followed a year later. [01:26:58] And as the blog grew, along with, I will say, the length of the posts, so did the Less Wrong community online. [01:27:06] And so did the prominence of the Institute. [01:27:10] Like all of these things kind of helped and grew in tandem. [01:27:15] There's a great old Harper's piece written back in 2015 by Sam Frank, which is a name that will come up in the future. [01:27:23] I'm going to confuse that and call him Sam Frankman Frank.
[01:27:26] Sam Frankman Frank, but put an asterisk next to that name. The piece is called Come With Us If You Want to Live, which is crazy. [01:27:34] Like, you know, we both read it a couple of days ago, and it's funny to read it. [01:27:38] It is a really trippy read 10 years later because it's talking about all these characters and these people in this scene, in this milieu, and they will all be familiar to you. === Failed Support Bubble (14:27) === [01:27:51] Yeah. [01:27:52] I mean, most of them. [01:27:53] But like, yeah, Vitalik Buterin. [01:27:56] Yes. [01:27:56] Blake Masters and his little fucking Tumblr. [01:27:58] Like, they're all in there. [01:27:59] Yudkowsky. [01:28:01] Yeah. [01:28:01] He, so in the piece, Sam Frank takes a trip to Miri. [01:28:06] And this is what Yudkowsky had to say. [01:28:08] He said, we're part of the continuation of the Enlightenment, the old Enlightenment. [01:28:12] This is the new Enlightenment. [01:28:14] Old projects finished. [01:28:15] We actually have science now. [01:28:18] Now we have the next part of the Enlightenment project. [01:28:22] So like what Miri does, as far as I can tell, is like, it researches AI alignment. [01:28:28] Yes. [01:28:29] And like also dangers. [01:28:32] Yeah. [01:28:33] The dangers of AI. [01:28:34] Yeah. [01:28:35] Yeah. [01:28:35] Yeah, exactly. [01:28:36] And like we're researching it. [01:28:38] That's what they do. [01:28:39] They're all researching at the sex house. [01:28:41] Oh, I can't fuck right now. [01:28:42] I got to get on the fucking laptop. [01:28:44] You know, or like, or maybe they might do some. [01:28:46] I mean, I'm just picturing just like oily, greasy masses of limbs and then several laptops sort of in a row. [01:28:55] Empty Ruffles bags. [01:28:57] Yeah, and labored. [01:28:58] Yeah, that's right. [01:28:59] Yeah, the crinkle of like cellophane and labored breathing. [01:29:02] And those Ruffles are crinkled too. [01:29:04] The sour cream and cheddar, baby. [01:29:05] Gatorade bottles and energy drinks. [01:29:07] No, Pedialyte. [01:29:08] Yeah. [01:29:08] Pedialyte. [01:29:09] Yeah. [01:29:10] Dude, you can all. [01:29:12] God, you're right on Pedialyte there. [01:29:13] And Soylent. [01:29:14] Like these guys are like really into Soylent too. [01:29:17] Don't protein. [01:29:18] Yeah. [01:29:18] And like, it's, it's, it's, I'm just going to say it's just, it is, it's funny because like part of me, again, like likes Yudkowsky in a way because I'm like, I also am afraid of AI. [01:29:32] Like I just am afraid of AI. [01:29:33] I think it's scary to me because I don't understand it. [01:29:36] And that means I hate it. [01:29:38] But, and it's good that somebody is like, we should do something about this. [01:29:44] And we should be careful. [01:29:45] And we should be careful. [01:29:46] Like, it's funny. [01:29:47] Like, I actually do support him in that. [01:29:49] I can't remember that Time piece. [01:29:52] I should have re-fucking read it for this, but I did read his little editorial he wrote in Time. [01:29:57] And I think he talks about banning AI, which I've talked about before. [01:30:01] Like, I am for like banning large sections, if not the entirety of the internet. [01:30:05] Probably banning large sections to get us ready for banning the entirety of the internet or just a radical reshaping of, I sound like a Verso writer. [01:30:13] A radical reshaping of the internet. [01:30:15] Yeah. [01:30:16] What I mean by that is you get two emails a day. [01:30:18] Literally, that's what I find.
[01:30:20] My first step, upload cap. [01:30:23] Yeah. [01:30:23] Oh my God. [01:30:24] Yeah, but I'm saying like you can use Google Maps to get to one place and then after that you have to know, because it's, people don't think, no, Brace's maze. [01:30:34] That's what you call it? [01:30:35] No, Brace's house. [01:30:36] I like Brace's maze. [01:30:37] Brace's maze, they call that. [01:30:39] You're only allowed to use Google Maps to go to one place a day. [01:30:42] Where do you go? [01:30:44] It's a riddle. [01:30:45] Or excuse me, it's game theory. [01:30:47] By the way, game theory is just riddles, as far as I understand. [01:30:52] It's riddles, essentially. [01:30:54] Well, it's about how you solve them. [01:30:55] I know, but like you solve the riddle by figuring out the answer. [01:30:58] Yeah. [01:30:58] Yeah. [01:30:59] No, that's, look at that. [01:31:00] Your entire fucking thing is bullshit. [01:31:04] You know, you might call me stupid, but at least I'm happy. [01:31:07] You're not. [01:31:08] How do you like that? [01:31:09] So, like, these people live in fuck houses and they go. [01:31:16] These people live in love making condominiums and they do things on the laptop a lot and they have a little jargon and they have little meetups. [01:31:25] They have like people like they are pretty influential in terms of like AI safety, which by the way seems to be like completely out the window now. [01:31:33] Now nobody gives a fuck. [01:31:35] It's all hands raised. [01:31:36] So Yudkowsky, in your Time editorial, I believe you said we might see rogue AI, as like someone or like people creating their own AI after we ban it, as the equivalent of somebody having like a nuclear bomb, you know, or some like dirty bomb or whatever. [01:31:52] I agree with you. [01:31:53] Kinetic force. [01:31:54] Literally, I'm not kidding. [01:31:56] I will fucking back you on this. [01:31:57] I'm not being ironic. [01:31:58] I'm not through being mean. [01:31:59] It's just really difficult not to be because of probably my own insecurities. [01:32:04] Not actually true. [01:32:04] I'm just saying that to make you feel better. [01:32:06] I just don't respect you. [01:32:07] But I don't, your lifestyle. [01:32:09] But the, I think, we should, I don't know why these guys don't link up with How to Blow Up a Pipeline. [01:32:19] Yeah. [01:32:20] Yeah. [01:32:20] I know. [01:32:21] Why don't they? [01:32:21] The How to Blow Up a Pipeline guy should figure out how to blow up the internet, the undersea cable. [01:32:27] I'm not, I'm just joking. [01:32:29] I'm just joking about that, government. [01:32:31] I do not, but he should write like a piece of theoretical fiction about that or whatever. [01:32:36] So they figure out how to do these kinds of things on the computer there. [01:32:41] I'm going to be honest. [01:32:41] I don't really understand it, but I support it. [01:32:43] I support Yudkowsky saying that we should bomb data centers, because we do a worldwide AI ban, and we should restrict chip making to maybe be one a year. [01:32:52] He didn't say that, but I say that. [01:32:53] They have an office in Berkeley and they shared space for a while with the Center for Applied Rationality, which is where I met Young Chomsky. [01:33:00] This is CFAR, which, remember, along with Miri was name-checked in the original Y Combinator post as the two sort of like main meatspace hubs of the rationalist religion.
[01:33:18] So it developed as an offshoot of Miri initially, as far as I understand it, which taught Miri techniques, like onboarding techniques, to people. [01:33:25] And they were like, hey, these things are pretty handy. [01:33:28] Like, why don't we bring these to everybody? [01:33:30] What if this was actually just a corporate consulting thing and we just sent, we just got money for corporate workshops to every single tech company in the Bay Area. [01:33:40] Yeah, first of all, I want to say this. [01:33:41] Miri, for all your integration with tech companies and like what you guys have done, tech is, we need to just, it is destroying the world. [01:33:48] And so you fucking failed. [01:33:51] The silent white collar recession might do that for us. [01:33:53] Yes. [01:33:54] And by the way, only, so, I support Patreon. [01:33:56] That is the only good tech company. [01:33:58] I'm just saying that right there. [01:34:00] And Apple iTunes Podcasts app. [01:34:04] Whatever we have the highest stars on. [01:34:06] I don't think it's Apple because of the Maha movement. [01:34:09] Because of the Maha movement. [01:34:10] Because one of the leaders of the Maha movement tanked our Apple rating. [01:34:12] I still think, I'm sorry, I know why they went with it, but it is crazy to me to say Maha movement. [01:34:18] It does sound like something you order at a restaurant. [01:34:19] The Maha. [01:34:20] The Maha? [01:34:21] Yeah. [01:34:21] I'll have the spicy Maha. [01:34:23] The spicy Maha. [01:34:24] Yeah. [01:34:25] All right, but CFAR was their way to like basically get into the self-help game. [01:34:29] So this is from a 2013 Fast Company article. [01:34:32] CFAR has just started conducting corporate workshops as well for Facebook, the Thiel Fellowship, and the University of California, Berkeley, among other organizations. [01:34:42] It also gave a significant minority of workshop attendees scholarships. [01:34:47] They recently funded a police chief who is interested in strategies to improve relationships between his department and the community, a goal that has increased in importance since the recent events in Ferguson, Missouri. [01:34:59] And they've funded research scientists trying to save the world. [01:35:04] Oh, okay. [01:35:04] And brilliant students who are unsure about what to do with their lives. [01:35:08] I think Fast Company is like a Forbes thing where you just write your own article and pay them. [01:35:12] Because there's no way that a journalist wrote that. [01:35:14] So this is again from the Sam Frank piece, the one in Harper's. [01:35:18] He says, the day after I met Yudkowsky, I returned to Berkeley for one of CFAR's long weekend workshops. [01:35:24] The color scheme at the Rose Garden Inn was red and green and everything was brocaded. [01:35:29] The attendees were mostly in their 20s. [01:35:32] Mathematicians, software engineers, quants, a scientist studying soot, employees of Google and Facebook, an 18-year-old Thiel fellow who had been paid $100,000 to leave Boston College and start a company. [01:35:45] Professional atheists, a Mormon turned atheist, an atheist turned Catholic, an Objectivist who was photographed at the premiere of Atlas Shrugged Part 2: The Strike. [01:35:57] There were about three men for every woman. [01:36:00] And I wanted to add in that quote because it, one, it sounds exactly like all the party reporting on the new right. [01:36:06] Yeah. [01:36:06] And we are still very much, we are living in the like long tail of this thing.
[01:36:11] It's so funny because like Sam Frank writing this must have been like, sounds good. [01:36:17] I want to be a part of that. [01:36:18] Well, many such cases, but also we'll talk about that. [01:36:21] Yeah, yeah. [01:36:22] So one of the main purposes of CFAR was to spread rationalist principles out into the world. [01:36:29] In rationalist jargon, they called this raising the sanity waterline. [01:36:35] Why? [01:36:36] Why? [01:36:37] They're like, how can we optimize our minds in Bayesian terms? [01:36:42] You failed spectacularly. [01:36:45] Well, Julia Galef, listen to me. [01:36:47] You failed. [01:36:48] You failed in a way that I've never failed in my life and I've failed on a lot of things. [01:36:52] You failed. [01:36:53] You lowered the sanity waterline. [01:36:55] Everyone is crazy now because of people that you, in that tech world, it has driven society mad. [01:37:02] Everyone is fucking insane now because of people, not you because you do this weird thing, but because of the people that you gave these seminars to have driven the average human being to fucking madness. [01:37:14] Oh my God, we're all Roko'd. [01:37:17] We got Roko'd. [01:37:18] Yeah. [01:37:20] Julia, for her part, she said, we are running rationality, but on human hardware. [01:37:25] Sounds like you need a data boost. [01:37:28] Vice from the base. [01:37:31] Sounds like you need a new fan to cool off. [01:37:35] Vice in 2016, they had a headline that basically just like sums up the whole project. [01:37:40] It was about CFAR. [01:37:41] The rationality workshop that teaches people to think more like computers. [01:37:46] Here's a quote. [01:37:47] As far as self-help seminars go, CFAR is definitely unique. [01:37:52] Instead of invoking spirituality or pointing toward a miracle cure-all, the organization emphasizes thinking of your brain as a kind of computer. [01:38:02] Think of your brain as a kind of computer. [01:38:04] Hey, check this out, kids. [01:38:06] Why don't you think of your brain as if it was a computer? [01:38:09] Yeah, Liz has Alienware. [01:38:12] Yeah, the best one. [01:38:14] Liz has Alienware. [01:38:15] Liz knows what Alienware is. [01:38:16] Fuck yeah, I do. [01:38:17] I don't really know. [01:38:18] I think it's proprietary. [01:38:19] I think that it doesn't, you can't put other shit in it. [01:38:21] I love saying that. [01:38:22] I think it's proprietary. [01:38:23] I think it's proprietary. [01:38:24] I don't know what that means. [01:38:26] I don't know what that means. [01:38:27] I think that means you can't put other things in it. [01:38:29] Okay. [01:38:29] Let me finish this quote. [01:38:32] Throughout the workshop, participants and facilitators described their thinking patterns using programming and AI terms. [01:38:40] CFAR likes to ask if something, either a human or an AI, were to make a perfectly rational choice, what would that look like? [01:38:50] Like that's literally what they're doing, right? [01:38:52] Everything we've laid out before is now being implemented out into the world outside of this robust and influential community to other institutions, other public-like spaces, other companies, not affiliated, other people, other staffs.
[01:39:09] I mean, like, one of the things we mentioned, and I just want to, like, real quick, on the podcast about Luigi, when we were talking about all his books and everything, and we spent so much time on his Goodreads because one of the things that was fascinating is that he was reading New York Times best-selling books, and they all come out of this. [01:39:33] You people got so mad at us because we proved that Luigi was not, we proved scientifically through rational, Bayesian, we updated our Bayesian priors, that Luigi Mangione was not some like horrible mishmash of left-wing ideology leftists that he figured out on Twitter or whatever, but was in fact a rational human being. [01:39:59] But it's true. [01:40:00] Like this is like a lot of the stuff that Luigi was reading comes from this stuff. [01:40:04] Like those blogs that he was interacting with are part of this like general bubble here. [01:40:10] And like this is, this is this, I mean, again, like if you, a lot of our listeners probably know what a lot of this stuff is, but like, I think it is worth kind of like doubling down on, this stuff is really influential. [01:40:22] Yes, really popular. [01:40:23] This is, I want to say this. [01:40:25] This is the American ideology. [01:40:27] This is what America produces. [01:40:29] This is the shit, by the way. [01:40:30] This is our culture is this. [01:40:33] This is us. [01:40:34] Like, this is the show. [01:40:35] Hashtag, this is us. [01:40:36] This is that show. [01:40:37] This is us. [01:40:38] And this will be the death of us. [01:40:40] Like, this shit's trying to ride the lightning in the fucking White House right now via Marc Andreessen's fucking Google. [01:40:45] It is. [01:40:45] And everyone who staffs these people's offices. [01:40:49] Like, yeah, yes, yes, yeah, yeah. [01:40:52] It just makes me sad. [01:40:53] It makes me really fucking depressed because I'm like, oh, this, this is what we, this is our country. [01:40:59] This is our country's culture. [01:41:00] And this is what our fucking culture produces. [01:41:02] And it looks like this. [01:41:04] Yeah, like, your brain is not a computer. [01:41:07] A computer is a computer. [01:41:08] Your brain is your brain. [01:41:10] The things are different because computers don't feel. [01:41:13] And so that's the difference. [01:41:14] So you can't ever update your mental tech because you don't have technology in your brain. [01:41:18] You have thoughts and feelings and emotions, some of which are not rational, some of which are illogical. [01:41:23] But guess what? [01:41:23] Sometimes irrationality produces beautiful things. [01:41:27] Sometimes illogic produces beautiful things. [01:41:29] Sometimes it doesn't. [01:41:31] But you know what? [01:41:31] It's, but also sometimes cold rationality produces horrible things. [01:41:36] And if you don't know what I'm talking about, open up a fucking history book and read about the Nazis. [01:41:40] Damn. [01:41:41] Facts, facts. [01:41:42] Although they were very passionate as well. [01:41:44] But I'm telling you, Adolf Hitler, you cannot, even if you are a rationalist, you cannot argue with this. [01:41:50] If Adolf Hitler was alive today, God forbid, he would be on the damn Less Wrong forums. [01:41:58] He would be on the damn, he would be on Star Slate Codex, Codex, and he would be 1,000 times less powerful because he would be writing way too many words. [01:42:08] And it would be like all couched in things.
[01:42:10] He'd be like, well, the human biodiversity of Jews says that, I don't even know how these people talk. [01:42:15] Yeah, I don't either. [01:42:17] Well, no, because Jews are the highest IQ. === CFAR's Corporate Singularity (02:04) === [01:42:19] We must have. [01:42:19] But he wouldn't say that. [01:42:20] Yeah, he wouldn't. [01:42:20] He'd be like, it's fake. [01:42:22] Oh my God. [01:42:23] He would cause a schism. [01:42:25] Yeah, yeah, yeah. [01:42:26] Interesting. [01:42:27] So CFAR essentially was doing what you can call Esalen for techies. [01:42:31] I think it's a pretty fair thing to say, right? [01:42:33] Like they would do these workshops that were, well, Ziz, originally when we were talking about all this stuff, on Ziz's blog describes them doing things like, okay, you sit in a circle. [01:42:47] Say if you would pay for this, $4,000, guys, if you would pay for this. [01:42:51] You sit in a circle and everybody around you in the circle tries to solve your most intractable problem for 20 minutes. [01:42:59] No. [01:43:00] Here's what I would say. [01:43:01] Wrong. [01:43:02] Impossible. [01:43:03] Can't do it. [01:43:03] Wrong. [01:43:04] Already thought of that. [01:43:05] You're stupid. [01:43:05] Wrong. [01:43:06] Misunderstood the problem. [01:43:08] I don't understand what you said. [01:43:09] You don't speak English. [01:43:10] Wrong. [01:43:11] Fuck you, freak. [01:43:12] This shit costs $4,000. [01:43:14] It's like $3,000, I think. [01:43:15] It's for a weekend, including room and board and food, which is probably in nutritious-based form. [01:43:21] Soylent. [01:43:22] Soylent, exactly. [01:43:23] It's crazy when you think, I mean, we mentioned Esalen, and I don't want to belabor this point. [01:43:27] We got to get through the rest of the episode. [01:43:28] I'm sorry. [01:43:28] It's going so long. [01:43:30] But like, it is crazy when you think of all of the historical strands that have come together to create, to like lead us to this, let alone to lead us to where we are today out of this. [01:43:40] Because yes, it's Esalen, and yes, it's Silicon Valley, and yes, it's California, but it's also like the incentives of creating like the business consulting industry, like the business consultant industry. [01:43:52] Yeah, it's kind of corporate workshops that are now deploying this cold, rationalist, vulgar, really vulgar utilitarian thinking that is so specifically and concretely American in historic and like historic and philosophical ways. [01:44:12] And it's just kind of incredible to appreciate sort of what we're witnessing. [01:44:16] I mean, my thing is if you need to learn how to be a human being, then you should not be in charge of other human beings interacting. === Guy Fawkes Masks at Esalen (02:17) === [01:44:23] Like, I just feel like that's another one of Belden's laws, of which, by the way, when I take over, they'll be so numerous that when you finally start reading, it's actually an infinite book because they keep going, which is Belden's paradox. [01:44:35] Singularity. [01:44:36] Singularity. [01:44:37] Yeah, it's the Belden singularity. [01:44:40] So CFAR, according to their site, 2018, they had like a staff of dozens. [01:44:45] They have all these alumni events that they hold. [01:44:47] And it was at one such alumni event in 2019 that a group of five people wearing Guy Fawkes masks descended upon the retreat's entrance, I think, shouting and throwing around flyers. [01:45:02] It's like I hear anything with Guy Fawkes, and I'm just like, groan.
[01:45:08] Everyone groans. [01:45:11] So this was out in Sonoma County. [01:45:13] Funny enough, which a lot of QAnon accounts at the time noted, not too far from Bohemian Grove. [01:45:19] Union shop, by the way. [01:45:22] In, like we said, November 2019, this was the sixth annual, sixth annual alumni reunion for the Center for Applied Rationality, CFAR. [01:45:33] So the group, like you said, the protesters, the rabble-rousers, they were masked and hooded, wearing rubber gloves and black robes. [01:45:43] And yes, Guy Fawkes masks. [01:45:45] They showed up in three cars and they used those to kind of block the entrance to the event. [01:45:51] And they had walkie-talkies and like body cameras on them. [01:45:53] And one of them had pepper spray. [01:45:57] Now, immediately, I think it was one of the staff at the place where this event was, they were like, oh my God, one of them has a gun, which is why several police cars, a SWAT team, and a helicopter arrived to deal with this group of five protesters at a CFAR alumni retreat. [01:46:17] Like, I'm just trying to paint this scene because it's kind of incredible. [01:46:21] Four of them were taken into custody. [01:46:25] They were like resisting arrest and kept yelling and screaming. [01:46:29] The local reporting at the time described them as, quote, speaking incoherently. [01:46:34] Yeah, it probably describes a lot of people descending upon Sonoma that weekend. [01:46:37] And they would not give their names to the cops. === Guy Fawkes Masks Protest (13:04) === [01:46:40] Okay. [01:46:42] So two of the people were known to the CFAR team. [01:46:46] And I know who at least one of those people was, but I don't know who the other one was. [01:46:49] But the names of the people detained were, and to be clear, like people, sometimes people have changed their names after this too. [01:46:57] We're just doing this for simplicity's sake, rather, let's say. [01:47:03] This is how it was reported. [01:47:04] Gwen Danielson, 25. [01:47:05] Emma Borhanian, 28. [01:47:08] Alexander Leetham, 24. [01:47:10] And Jack Lasota, 28. [01:47:13] So when they were all kind of causing a scene, screaming, protesting, they were also throwing a bunch of flyers. [01:47:22] And the flyers, they were double-sided. [01:47:26] And they had what, I will say, like reporters and witnesses at the scene called a manifesto. [01:47:33] We are very particular about our manifesto-calling of things. [01:47:39] But I think looking at this, it's manifesto-ish. [01:47:45] There's a lot to get into here. [01:47:47] Should we read it? [01:47:49] We should start reading it. [01:47:50] And then when we get bored, stop. [01:47:52] Okay. [01:47:53] So it starts off with saying, Miri, CFAR betrayed us. [01:47:58] Miri has fallen. [01:47:59] One, Miri paid out to blackmail using donor funds. [01:48:04] And so pause right now. [01:48:06] We were going to cover Miri a while ago because of something, there was a website called Miri Cult and it was taken down. [01:48:16] There's a long sort of lot of lore and history around the provenance of it and like the accusations made in it. [01:48:24] But I, from memory, the accusations were, and I, from reading Ziz's blog actually again, the accusations were paying out money to victims of Yudkowsky, this guy Vassar, like several people in, I think it was Vassar, I don't know, like several people in this scene. [01:48:42] Vassar, Michael Vassar, who is also mentioned in the Sam Frank piece. [01:48:47] Yes.
[01:48:49] But like that these guys had been having sex with teenagers and that there was rape in this scene, which I fully, the reality is, I don't know if any of this stuff is true or not. [01:48:58] I don't know, but I'm just going to say, listen, you get a lot of people like this together who have a lot of crazy ideas. [01:49:04] I'm not going to say it's impossible. [01:49:06] You know what I'm saying? [01:49:08] But it was taken down. [01:49:10] And the allegations were, it's interesting, paid out to blackmail using donor funds. [01:49:18] Number two, MIRI is violating basic principles of friendliness, which seems like a tough one to follow accusations of child sexual exploitation. [01:49:27] Number three, MIRI missed the rapidly oncoming global catastrophic threat of fascism. [01:49:32] Which I guess would be the Biden regime that was coming on. [01:49:36] Number four, MIRI is ignoring in their research agenda many of the topics necessary for friendliness and only focusing on math slash game theory. [01:49:44] I would agree with that. [01:49:46] Five, MIRI is incentivizing people in accordance with class, which results in small-team discrimination and spending most of the aligned efforts of donors on Bay Area rent. [01:49:56] Six, Eliezer and Bostrom, in their pursuit of legitimacy, created an arms race and then MIRI joined it. [01:50:04] Anna has used her position to filter for people who are psychologically likely to be whistleblowers and exclude slash psychologically attack them. [01:50:12] E.g., and then there's a link to Ziz's blog. [01:50:15] One, Anna discriminates against trans women in CFAR's employment and inclusion in sponsored workshop events, despite trans women being naturally inclined slash gifted in mental tech development. [01:50:24] Two, Anna is a trans-exclusionary radical feminist and admitted in conversation with Ziz and Gwen that she discriminates on the basis of a, quote, gender test based on whether trans women accept slash believe that they're men. [01:50:36] CFAR is a false slash hollow Schelling point. [01:50:40] A Schelling point is, I believe, a gathering point. [01:50:42] Ziz uses it all the time. [01:50:43] I never encountered it before. [01:50:44] One, CFAR does not do remotely what they claim to do on their website. [01:50:48] They do not appreciably develop novel rationality slash mental tech. [01:50:52] Parentheses, Val did while he was employed there, at a rate comparable to several other non-CFAR rationalists. [01:50:57] Two, CFAR's founding premise, that people were blocked on having the tools to think, was falsified long ago. [01:51:04] Three, in-person workshops are obviously not the best way to accomplish CFAR's stated goal, and focusing most of the manpower on workshop organization and teaching is obviously not the best way to accomplish CFAR's stated goal. [01:51:15] Four, based on CFAR's hollow claim of the develop-rationality-tech-to-save-the-world Schelling point, they drain the lifetime effort of aligned donors. [01:51:25] Halt, melt, and catch fire, in Eliezer's words from the Sequences. [01:51:31] And then it gets a little poetic. [01:51:33] It says, process this. [01:51:38] Take the time to process what happened at MIRI, CFAR, and the rationality community. [01:51:43] It is not what it once seemed like it would become. [01:51:45] New things can be built. [01:51:47] Why? [01:51:48] The root cause of this is MIRI and CFAR built themselves out of unaligned organization and forgot. What to do.
[01:51:55] And then there's a link to a blog by one of the people protesting. [01:51:58] General recipe for escaping containment by society. [01:52:01] And then parentheses, not in order. [01:52:03] One, logistical autonomy. [01:52:05] Two, mental autonomy, self-knowledge, spectral sight, DRM stripping. [01:52:11] Three, scope awareness. [01:52:12] Four, heal slash process damage slash trauma. [01:52:16] Five, awareness of institutional betrayal. [01:52:18] Six, inter-hemispheric game theory. [01:52:22] And then there is info about hemis, which is a Discord link, which says work in progress next to it. [01:52:30] Hemispheric theory we'll get into a little more in the next episode. [01:52:35] But just to pause right here, there are some allegations in here, paying out to sexual blackmail. [01:52:44] Well, I don't really understand exactly what that is, because I would think that the real scandal there would be maybe having the sex that you had to pay out the blackmail money for. [01:52:53] And then the exclusion and discrimination of trans women in CFAR's employment and inclusion in sponsored workshop events, which seems like actually a pretty slam-dunk court case, because these people fucking write down everything that they've ever thought. [01:53:10] And so it should be pretty easy to get that going. [01:53:13] But a lot of this you can really take as: most of the words here are essentially saying that CFAR and MIRI have betrayed their original purposes. [01:53:23] And maybe the foundation of them, and this is actually, you know what, a Thiel rule from Blake Masters' Tumblr that I remember, is that if the foundation of a startup is bullshit, then the rest of it, you can never fix it. [01:53:34] And so they're using Thiel's law here, possibly from Blake Masters' Tumblr, to describe how MIRI and CFAR either strayed so far from the organizational directives that they were supposed to be under that they're unfixable, or they were never about that in the first place. [01:53:50] So the flyer continues. [01:53:52] I feel like, I know we're going long, very long, but let's read it out real quick. [01:53:58] MIRI and CFAR paid out to blackmail. [01:54:01] This is a CDT strategy and makes them super exploitable. [01:54:05] Like, anyone else who has blackmail of equal or greater intensity can use MIRI and CFAR as their personal bank and tell them what to do. [01:54:12] Instead of anticipating that they wouldn't pay out, or not holding things hostage in the first place. It's easy to see this is incorrect if you are thinking with TDT. [01:54:21] So these are different kinds of decision theory. [01:54:23] Yes. [01:54:24] Getting decision theory right is kind of important for the whole make-sure-all-sentient-life-has-a-future thing. [01:54:29] Question mark, question mark, question mark. [01:54:31] Looking back, what Eliezer 2001 needed to do at this point was declare an HMC event, halt, melt, and catch fire. [01:54:45] One of the foundational assumptions on which everything else has been built has been revealed as flawed. [01:54:50] This calls for a mental brake, B-R-A-K-E, to a full stop. [01:54:56] Take your weight off all beliefs built on the wrong assumption. [01:55:00] Do your best to rethink everything from scratch. [01:55:02] This is an art I need to write more about. [01:55:05] It's akin to the convulsive effort required to seriously clean house after an adult religionist notices for the first time that God doesn't exist.
[01:55:16] And then a quote, fighting a rearguard action against the truth, Eliezer Yudkowsky. [01:55:22] Lot to take in there. [01:55:24] It continues. [01:55:26] Eliezer offered us heaven. [01:55:28] Ill-considered solace in hopes of heaven worships hell. [01:55:32] Two futures only? [01:55:34] Extinction versus utopia. [01:55:36] Why two? [01:55:37] What really prevents an evil slash unjust singleton? [01:55:41] Which has something to do with their version of the singularity. [01:55:44] No motive for predation post-scarcity? [01:55:47] False. [01:55:47] Indifference is not enough. [01:55:49] One, evil cannot create, but can capture. [01:55:52] Two, good will destroy hell and punish all those who would build it. [01:55:56] Claiming we're safe because good would destroy hell puts the cart before the horse. [01:56:01] Dark gods whisper of lies of logical order. [01:56:04] Dark gods whisper lies of logical order. [01:56:07] Wow. [01:56:08] MIRI claims to align the seed of power. [01:56:11] What is alignment but morality? [01:56:13] Who builds the seeds of morality now? [01:56:15] We are that seed. [01:56:17] We are that process created already in motion. [01:56:20] Each choice shapes our future. [01:56:22] Now, I mean, this is, I'm sorry, like, I know that with this Ziz thing people are like, oh, it's a cult, it's a cult, it's a cult. [01:56:30] This sounds like a cult, what they're describing with Eliezer. [01:56:34] My God. [01:56:36] We rightly abandon the asceticism of the past, the worship of inaction, the guilt of Catholics, the thin smiles of nuns. [01:56:44] Sex scandals, and then a link to Ziz's blog, money, and imperialist American values. [01:56:53] Trump held at bay only by incantation of paper on which is written non-profit. [01:56:59] Secularism, as if the state was atheist, as if atheism is enough without ethics. [01:57:04] We must reclaim a non-reactionary asceticism, ethics, and wisdom. [01:57:08] Catholicism lies. [01:57:10] Secularism lies. [01:57:12] The state is fascist. [01:57:13] Walls, and then in quotes, illegals, concentration camps, factory farming. [01:57:19] Only reason to specialize in a non-universal praxis is distributed consensus. [01:57:24] Who does MIRI consensus with? [01:57:26] Fascism is when we stop thinking what and just think how. [01:57:30] Nobody, no God can offer you salvation. [01:57:33] Whatever force powers MIRI has now made its choice, its purpose clear. [01:57:38] Why? [01:57:38] The decision to live a comfortable secular life? [01:57:41] Optimistic limbo. [01:57:42] A suspension of judgment for their complicity in the hope that post-scarcity, all crimes will be forgiven? [01:57:49] They tried to seize the keys of agency, flinched at what they saw, and burned the path behind them. [01:57:54] CFAR no longer aspires to teach the true path of rationality, if it ever did. [01:58:01] The true path. [01:58:11] General recipe for escaping containment by society, not in order. One, logistical autonomy. [01:58:16] Two, mental autonomy, self-knowledge, spectral sight slash soul perception, DRM stripping. [01:58:21] Three, scope awareness. [01:58:22] Four, heal slash process damage slash trauma. [01:58:25] Five, awareness of institutional betrayal. [01:58:28] Understand your source code and that of others. [01:58:32] This is a way to have verifiably un-DRM'd knowledge to build on: hemispheres, undead types, the two known human utility function classes.
[01:58:45] So DRM stripping is literally what it sounds like. Some electronic products you buy have DRM. [01:58:52] Which is digital rights management? [01:58:53] Yeah, meaning like, okay, you buy a video game in this country, but it doesn't work in that country or whatever. [01:58:58] It's some shit like that. [01:59:02] I mean, this is a very jargon-filled manifesto. [01:59:05] I can't believe we read the whole thing. [01:59:06] I know. [01:59:06] I struggle to read it on my own because it's so jargony and almost nonsensical to an outsider. [01:59:12] Even I know, again, a little bit about these people, but they have genuinely decades of voluminous writing that is very difficult to get through, not from a thinking standpoint, just from a pure time and patience standpoint. [01:59:30] But this is the flyer that they handed out. [01:59:32] They were arrested and eventually actually were in a lawsuit and sued both Sonoma County, the deputies, and employees of the place that called the cops on them. === Arrested for Being Adjacent (07:05) === [01:59:45] We are going to go into what all of this stuff means a lot more in the next episode. [01:59:51] But trust us when we say we had to kind of build out and explain this milieu and what this group, the Zizians, were rebelling against, rebelling from within. [02:00:06] It's all a little confusing. [02:00:07] What they're an offshoot from, an antagonist to, in order to explain what comes next. So that happened in 2019, six years ago. Let's come back to 2025. [02:00:30] Just 11 days ago, on January 17th, 82-year-old Curtis Lind was stabbed to death by an unknown assailant in Vallejo in the city's first homicide of the year. [02:00:40] Curtis Lind had spent the past couple of years recuperating from being stabbed through his chest by a group of unruly tenants who had jumped him outside his trailer with knives and a large samurai sword. [02:00:51] They had intended to kill him and dissolve his body in acid. [02:00:55] I believe they stabbed him through the chest with a samurai sword. [02:00:59] The way that Open Vallejo wrote it in the article makes it seem like they literally went all the way through him, but I don't know. [02:01:07] He then shot two of the tenants, and one of them, Emma Borhanian, arrested in 2019 in Sonoma, died. [02:01:16] So Curtis Lind had been set to testify soon in the murder trial of some of the other assailants, charged under a California law that charges those who commit a crime with the murder of anyone who dies during it. [02:01:29] I think it must be that. [02:01:30] Because California has a peculiar law where if you and I rob a liquor store, Liz, and you get smoked by the guy owning the liquor store because you weren't quick enough on the draw, I'm charged with murder because of that. [02:01:41] Yeah, I think it was an attempt at getting at some gang stuff. [02:01:44] Yeah. [02:01:45] So from 2019, we have Emma Borhanian dead and Alexander Leatham in prison or awaiting trial for murder. [02:01:54] The two other people who got arrested, Gwen Danielson and Jack LaSota, aka Ziz, apparently were not at large anymore. [02:02:02] After all, according to Sonoma County court documents filed in their lawsuit against the county after the arrest at the 2019 protest, their lawyer, Mr. Friedman, has been unable to communicate with Ms. Gwen Danielson, and she is rumored to have committed suicide. [02:02:20] The email further reports, Mr.
Friedman was informed LaSota, Jack LaSota, aka Ziz, fell off a boat into the San Francisco Bay. [02:02:30] There were witnesses and a U.S. Coast Guard search, but no body was found. [02:02:38] Mr. LaSota has not responded to his counsel's efforts to communicate. [02:02:45] Okay, wait, real quick. [02:02:50] So you've got the unruly group in 2019. [02:02:53] One of those dead. [02:02:56] Yes. [02:02:57] Emma Borhanian is dead. [02:02:59] Shot by Curtis Lind. [02:03:01] When she and a group of other people were attacking Curtis Lind with a sword, a katana, and other knives. [02:03:10] Samurai sword. [02:03:10] I don't know. [02:03:11] Maybe a katana. [02:03:12] I just wanted to say katana because I feel like these people like katanas. [02:03:14] It'd be a cool name for a daughter if you're a nerd. [02:03:16] It's a beautiful name for a boy or a girl. [02:03:18] Especially a girl or a boy. [02:03:20] No. [02:03:20] That's like an Elon Musk. [02:03:21] Okay, okay. [02:03:22] One of them, Emma, dead. [02:03:26] In the attack. [02:03:28] One of them arrested for being adjacent to the murder. [02:03:32] And also probably attacking Curtis Lind with knives, yeah. [02:03:35] But not doing the stabbing. [02:03:37] Yeah. [02:03:38] I think so. [02:03:39] Yeah. [02:03:39] I don't know. [02:03:41] The other two, Gwen from 2019, committed suicide. [02:03:48] And Jack LaSota, aka Ziz, people think that they have committed suicide, but perhaps have not, because no body has been found. [02:03:58] There is an obituary on legacy.com, which is the Jack LaSota obituary. [02:04:06] Jack LaSota is from, or Ziz is from, Alaska. [02:04:11] And I think this was published in an Alaskan newspaper from September 7th, 2022. [02:04:18] As I was typing this out, the Curtis Lind murder was still unsolved. [02:04:23] This was last night. [02:04:24] Let's finish up the notes on this. [02:04:28] Cops have now arrested somebody for the murder of Curtis Lind that took place on January 17th, 2025. [02:04:36] Again, 11 days ago. [02:04:38] The person arrested is named Maximilian Snyder. [02:04:42] Now, this is from Open Vallejo, which has done, I think, the most in-depth piece so far, although I'm sure there are going to be a lot of other ones. [02:04:50] Snyder studied computer science and philosophy at the University of Oxford, according to a LinkedIn profile matching his name, in which he noted an interest in artificial general intelligence and a desire to, quote, help advance the technological frontier of humanity in a responsible manner. [02:05:07] He was named a National Merit Scholarship semifinalist in 2019 while attending the private Lakeside School in Seattle, according to the Seattle Times. [02:05:14] In 2023, Snyder won $11,000 in an AI Alignment Awards research contest, according to a post on the Effective Altruism Forum. [02:05:28] So Maximilian Snyder had applied for a marriage license in Seattle on November 5th. [02:05:34] The person that he applied for that license with, Teresa Youngblut, is currently recuperating from her gunfight with Border Patrol agents that occurred three days after this murder, in Vermont, right near the Canadian border. [02:05:54] So things are moving fast. [02:05:56] Yes. [02:05:57] But TrueAnon's on it. [02:05:58] TrueAnon's on it. [02:05:59] I have been, I, God, I have done nothing but look at this stuff since the initial connection was kind of made by Twitter user Jessicat.
[02:06:11] So now that we have laid this groundwork, next episode, we're going to go straight into the Zizians and try to explain as best we can what's up with this potentially rationalist-adjacent, rationalist, [02:06:28] I would say rationalist-dissenting, but perhaps foundationalist, I don't know what you would call it, group who originally dissented from rationalism because they believed that these groups had strayed from their initial mission, if they were ever loyal to it. [02:06:44] Interesting. [02:06:45] We're going to see some people engaged on their own mission. [02:06:49] Sino-Soviet split here. === Predicted Coverage (01:19) === [02:06:50] And I just want to call this out. [02:06:53] I think a lot of people are making the connection to the Manson family here. [02:06:57] And I think it's somewhat of a fair connection to make with the information that we have available so far, with the stabbing of Lind being sort of the, I guess, the Spahn Ranch moment. [02:07:10] But there is another pair of bodies connected to the group, the parents of somebody who is in this sort of loose, maybe tight, grouping of people, and a couple of suicides that are alleged to be connected to the group as well. [02:07:26] So that's a body count there. [02:07:29] Yeah. [02:07:30] And yeah, I predict this is going to get a lot of coverage. [02:07:36] And you know what? [02:07:36] TrueAnon's on the damn case. [02:07:38] Well, we'll map that out as best we can next episode. [02:07:41] Until then, I'm Liz. [02:07:42] My name is Brace. [02:07:43] We are joined, of course, by Producer Young Chomsky. [02:07:46] And this has been TrueAnon. [02:07:47] We'll see you next time. [02:07:49] Bye-bye. [02:08:08] Come out. [02:08:09] Come out.