Behind the Bastards - Part Two: The Zizians: Birth of a Cult Leader Aired: 2025-03-13 Duration: 01:31:33 === Frustrating Internet Arcana (02:49) === [00:00:04] Oh my goodness. [00:00:05] Welcome back to Behind the Bastards, a podcast that is be interested to see how the audience reacts to this one. [00:00:14] Talking about some of the most obscure, frustrating internet arcana that has ever occurred and recently led to the deaths of like six people. [00:00:27] My guest today, as in last episode, David Borey. [00:00:31] David, how you doing, man? [00:00:33] I'm doing great. [00:00:34] I really can't wait to see where this goes. [00:00:39] Yeah. [00:00:41] I feel like anything could happen at this point. [00:00:44] It is going to. [00:00:45] It is going to. [00:00:48] A lot of frustrating things are going to happen. [00:00:53] This is an iHeart podcast. [00:00:55] Guaranteed human. [00:00:58] When a group of women discover they've all dated the same prolific con artist, they take matters into their own hands. [00:01:06] I vowed I will be his last target. [00:01:09] He is not going to get away with this. [00:01:11] He's going to get what he deserves. [00:01:13] We always say that, trust your girlfriends. [00:01:17] Listen to the girlfriends. [00:01:19] Trust me, babe. [00:01:20] On the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:01:29] What's up, everyone? [00:01:30] I'm Ago Modern. [00:01:31] My next guest, it's Will Farrell. [00:01:35] My dad gave me the best advice ever. [00:01:38] He goes, just give it a shot. [00:01:40] But if you ever reach a point where you're banging your head against the wall and it doesn't feel fun anymore, it's okay to quit. [00:01:47] If you saw it written down, it would not be an inspiration. [00:01:49] It would not be on a calendar of, you know, the cat just hanging in there. [00:01:56] Yeah, it would not be. [00:01:58] Right, it wouldn't be that. [00:01:59] There's a lot of life. [00:02:01] Listen to Thanks Dad on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:02:08] In 2023, bachelor star Clayton Eckard was accused of fathering twins, but the pregnancy appeared to be a hoax. [00:02:15] You doctored this particular test twice, Miss Owens, correct? [00:02:19] I doctored the test once. [00:02:21] It took an army of internet detectives to uncover a disturbing pattern. [00:02:26] Two more men who'd been through the same thing. [00:02:28] Ray Gillespie and Michael Mancini. [00:02:30] My mind was blown. [00:02:31] I'm Stephanie Young. [00:02:33] This is Love Trapped. [00:02:34] Laura, Scottsdale Police. [00:02:36] As the season continues, Laura Owens finally faces consequences. [00:02:41] Listen to Love Trapped podcast on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:02:48] 10-10 shots fired, City Hall building. [00:02:51] How could this have happened in City Hall? === The Sentience Debate (16:10) === [00:02:53] Somebody tell me that. [00:02:54] A shocking public murder. [00:02:56] This is one of the most dramatic events that really ever happened in New York City politics. [00:03:02] They screamed, get down, get down. [00:03:04] Those are shots. [00:03:06] A tragedy that's now forgotten and a mystery that may or may not have been political. [00:03:11] That may have been about sex. [00:03:13] Listen to Rorschach, murder at City Hall on the iHeartRadio app. [00:03:16] Apple Podcasts are wherever you get your podcasts. 
[00:03:25] So we had kind of left off by setting up the rationalists where they came from, some of the different strains of thought and beliefs that come out of their weird thought experiments. [00:03:35] And now we are talking about a person who falls into this movement fairly early on and is going to be the leader of this quote-unquote group, the Zizians, who were responsible for these murders that just happened. [00:03:48] Ziz Lasota was born in 1990 or 1991. [00:03:52] I don't have an exact birth date. [00:03:54] She's known to be 34 years old as of 2025. [00:03:57] So it was somewhere in that field. [00:04:00] She was born in Fairbanks, Alaska, and grew up there as her father worked for the University of Alaska as an AI researcher. [00:04:07] We know very little of the specifics of her childhood or upbringing, but in more than 100,000 words of blog posts, she did make some references to her early years. [00:04:17] She claims to have been talented in engineering and computer science from a young age, and there's no real reason to doubt this. [00:04:23] The best single article on all of this is a piece in Wired by Evan Ratliff. [00:04:27] He found a 2014 blog post by Ziz where she wrote, My friends and family, even if they think I'm weird, don't really seem to be bothered by the fact that I'm weird. [00:04:37] But one thing I can tell you is that I used to de-emphasize my weirdness around them, and then I stopped and found that being unapologetically weird is a lot more fun. [00:04:46] Now, it's important you know, Ziz is not the name this person is born under. [00:04:50] She's a trans woman, and so I'm like using the name that she adopts later, but she is not transitioned at this point. [00:04:57] Like this is when she's a kid, right? [00:04:59] And she's not going to transition until fairly late in the story after coming to San Francisco. [00:05:04] So you should just keep that in mind as this is going on here. [00:05:08] Hey, everyone, Robert here. [00:05:09] Just a little additional context. [00:05:10] As best as I think anyone can tell, if you're curious about where the name Ziz came from, there's another piece of serial-released online fiction that's not like a rationalist story, but it's very popular with rationalists. [00:05:24] It's called Worm. [00:05:26] Ziz is a character in that that's effectively like an angel-like being who can like manipulate the future, usually in order to do very bad things. [00:05:39] Anyway, that's where the name comes from. [00:05:42] So, smart kid, really good with computers, kind of weird, and you know, embraces being unapologetically weird at a certain point in her childhood. [00:05:52] Hey, everybody, Robert here. [00:05:54] Did not have this piece of information when I first put the episode together, but I came across a quote in an article from the Boston Globe that provides additional context on Ziz's childhood. [00:06:07] Quote: In middle school, the teen was among a group of students who managed to infiltrate the school district's payroll system and award huge paychecks to teachers they admired while slashing the salaries of those they despised, according to one teacher. [00:06:20] Ziz, the teacher said, struggled to regulate strong emotions, often erupting in tantrums. [00:06:27] I wish I'd had this when David was on, but definitely sets up some of the things that are coming. [00:06:33] She goes to the U of Alaska for her undergraduate degree in computer engineering. 
[00:06:38] In February of 2009, which is when Eliezer Yudkowsky started LessWrong, Ziz starts kind of getting drawn into some of the people who are around this growing subculture, right? [00:06:53] And she's drawn in initially by veganism. [00:06:56] So Ziz becomes a vegan at a fairly young age. [00:06:59] Her family are not vegans. [00:07:01] And she's obsessed with the concept of animal sentience, right? [00:07:06] Of the fact that like animals are thinking and feeling beings just like human beings. [00:07:11] And a lot of this is based on her interest in a foundational rationalist and EA figure, a guy named Brian Tomasik. [00:07:25] Brian is a writer and a software engineer as well as an animal rights activist. [00:07:30] And as a thinker, he's what you'd call a long-termist, right? [00:07:34] Which is, you know, pretty tied to the EA guys. [00:07:37] These are all the same people using kind of different words to describe aspects of what they believe. [00:07:43] His organization is the Center on Long-Term Risk, which is a think tank he establishes that's at the ground floor of these effective altruism discussions. [00:07:53] And the goal for the Center on Long-Term Risk is to find ways to reduce suffering on a long timeline. [00:08:00] Tomasik is obsessed with the concept of suffering, and specifically obsessed with suffering as a mathematical concept. [00:08:08] So when I say to you, I want to end suffering, you probably think like, oh, you want to like, you know, go help people who don't have like access to clean water or like who have like worms and stuff that they're dealing with, have access to medicine. [00:08:21] That's what normal people think of, right? [00:08:24] You know, maybe try to improve access to medical care, that sort of stuff. [00:08:29] Tomasik thinks of suffering as like a mass, like an aggregate mass that he wants to reduce in the long term through actions, right? [00:08:39] It's a numbers game to him, in other words. [00:08:42] And his idea of ultimate good is to reduce and end the suffering of sentient life. [00:08:49] Critical to his belief system and the one that Ziz starts to develop is the growing understanding that sentience is much more common than many people had previously assumed. [00:08:58] Part of this comes from long-standing debates with their origins in Christian doctrine as to whether or not animals have souls or are basically machines with meat, right? [00:09:07] That don't feel anything, right? [00:09:09] There's still a lot of Christian evangelicals who feel that way today about like, at least the animals we eat, you know, like, well, they don't really think. [00:09:18] It's fine. [00:09:19] God gave them to us. [00:09:20] We can do whatever we want to them. [00:09:21] They're here to eat. [00:09:23] And to be fair, this is an extremely common way that people in Japan feel about, like, fish, even whales and dolphins, like the much more intelligent, they're not fish, but like the much more intelligent ocean-going creatures. [00:09:34] It's like they're fish. [00:09:35] They don't think, you do whatever to them, you know? [00:09:38] This is a reason for a lot of like the really fucked up stuff with like whaling fleets in that part of the world. [00:09:44] So this is a thing all over the planet. [00:09:46] People are very good at deciding certain things we want to eat are machines that don't feel anything, you know?
[00:09:53] It's just much more comfortable that way. [00:09:56] Now, this is obviously, like, if you go back to the pagans, the pagans would have been like, what do you mean? [00:10:01] Animals don't think or have souls. [00:10:03] Animals think, you know, like, you're telling me like my horse that I love doesn't think, you know? [00:10:13] That's nonsense. [00:10:14] But it's this thing that, in like early modernity especially, gets more common. [00:10:19] But this is also when we start to have debates about like, what is sentience and what is thinking? [00:10:24] And a lot of them are centered around trying to answer like, are animals sentient? [00:10:29] And the initial definition of sentience that most of these people are using is, can it reason? [00:10:35] Can it speak? [00:10:37] If we can't prove that like a dog or a cow can reason, and if it can't speak to us, right? [00:10:44] Then it's not sentient. [00:10:46] That's how a lot of people feel. [00:10:47] It's an English philosopher named Jeremy Bentham who first argues, I think, that what matters isn't can it reason or can it speak, but can it suffer? [00:10:56] Because a machine can't suffer. [00:10:59] If these are machines with meat, they can't suffer. [00:11:02] If these can suffer, they're not machines with meat, right? [00:11:06] And this is the kind of thing where how we define sentience is a moving target. [00:11:12] Like you can find different definitions of it. [00:11:15] But the last couple of decades in particular of actually very good data has made it clear, I think inarguably, that basically every living thing on this planet has a degree of what you would call sentience. [00:11:27] If you are describing sentience the way it generally is now, which is a creature has the capacity for subjective experience with a positive or negative valence, i.e. can feel pain or pleasure and also can feel it as an individual, right? [00:11:47] You know, sometimes people use the term affective sentience to refer to this, to differentiate it from like being able to reason and make moral decisions. [00:11:55] Um, you know, uh, for example, ants, I don't think, can make moral decisions, you know, in any way that we would recognize. [00:12:04] They certainly don't think about stuff that way. [00:12:06] But 2025 research published by Dr. Volker Nehring found evidence that ants are capable of remembering for long periods of time violent encounters they have with other individual ants and holding grudges against those ants, right? [00:12:21] Just like us, they're just like us. [00:12:23] Um, and there's strong evidence that ants do feel pain, right? [00:12:26] We're now pretty sure of that. [00:12:28] And in fact, again, this is an argument that a number of researchers in this space will make. [00:12:32] Something like this kind of sentience, the ability to have subjective positive and negative experiences, is probably universal to living things, or very close to it, right? [00:12:43] Um, it's an interesting body of research, and it's fairly solid at this point. [00:12:50] And again, I say this as somebody who like hunts and raises livestock. [00:12:53] Um, I don't, I don't think there's any solid reason to disagree with this. [00:12:57] So you can see there's a basis to a lot of what Tomasik is saying, right? [00:13:01] Which is that, if what matters is reducing the overall amount of suffering in the world.
[00:13:09] And if you're looking at suffering as a mass, if you're just adding up all of the bad things experienced by all of the living things, animal suffering is a lot of the suffering. [00:13:18] So if our goal is to reduce suffering, animal welfare is hugely important, right? [00:13:22] It's a great place to start. [00:13:23] Great. [00:13:24] Fine enough, you know? [00:13:25] A little bit of a weird way to phrase it, but fine. [00:13:30] So here's the problem, though. [00:13:32] Tomasik, like all these guys, spends too much time thinking. [00:13:37] None of them can be like, hey, had a good thought. [00:13:40] We're done. [00:13:41] Setting that thought down. [00:13:42] Moving on. [00:13:43] So he keeps thinking about shit like this, and it leads him to some very irrational takes. [00:13:48] For example, in 2014, Tomasik starts arguing that it might be immoral to kill characters in video games. [00:13:56] And I'm going to quote from an article in Vox. [00:13:59] He argues that while NPCs do not have anywhere near the mental complexity of animals, the difference is one of degree rather than kind. [00:14:06] And we should care at least a tiny amount about their suffering, especially as they grow more complex. [00:14:14] And his argument is that, like, yeah, it mostly doesn't matter individually killing a Goomba or a bot or a guy in GTA 5, but because they're getting more complicated and able to like try to avoid injury and stuff, there's evidence that there's some sort of suffering there. [00:14:29] And thus the sheer mass of NPCs being killed, that might be like enough that it's ethically relevant to consider. [00:14:36] And I think that's silly. [00:14:38] I think that's ridiculous. [00:14:40] Come on, man. [00:14:41] I'm sorry, man. [00:14:42] No, I'm sorry. [00:14:46] I hate to do this to the guy, but that's a lot of the fun of the game. [00:14:48] Yeah. [00:14:48] Killing the NPCs. [00:14:51] If you're telling me, like, we need to be deeply concerned about the welfare of like cows that we lock into factory farms, you got me. [00:14:59] Absolutely. [00:14:59] For sure. [00:15:00] If you're telling me I should feel bad about running down a bunch of cops in Grand Theft Auto. [00:15:07] It's also one of those things where it's like, you got to think locally, man. [00:15:10] There's people on your street who need help. [00:15:12] I mean, and he does say, like, I don't consider this a main problem, but like the fact that you think this is a problem at all means that you believe silly things about consciousness. [00:15:24] Um, yeah, I, anyway. [00:15:27] Um, so the fact that he leads himself here is, I think, kind of evidence of the sort of logical fractures that are very common in this community. [00:15:36] But this is the guy that young Ziz is drawn to. [00:15:39] She loves this dude, right? [00:15:41] He is kind of her first intellectual heartthrob. [00:15:44] And she writes, quote, my primary concern upon learning about the singularity was how do I make this benefit all sentient life, not just humans. [00:15:53] So she gets interested in this idea of the singularity. [00:15:56] It's inevitable that an AI god is going to arise. [00:15:59] And she gets into the, you know, the rationalist thing of we have to make sure that this is a nice AI rather than a mean one. [00:16:07] But she has this other thing to it, which is this AI has to care as much as I do about animal life, right?
[00:16:14] Otherwise, we're not really making the world better, you know? [00:16:19] Now, Tomasik advises her to check out LessWrong, which is how Ziz starts reading Eliezer Yudkowsky's work. [00:16:26] From there, in 2012, she starts reading up on effective altruism and existential risk, which is a term that means the risk that a superintelligent AI will kill us all. [00:16:37] She starts believing in, you know, all of this kind of stuff. [00:16:41] And her particular belief is that like the singularity, when it happens, is going to occur in a flash, kind of like the rapture, and almost immediately lead to the creation of either a hell or a heaven, right? [00:16:55] And this will be done by what they call the singleton, which is the term they use for this inevitable AI, right? [00:17:00] That's what they call the AI god that's going to come about, right? [00:17:05] And so her obsession is that she has to find a way to make the singleton a nice AI that cares about animals as much as it cares about people, right? [00:17:14] That's her initial big motivation. [00:17:16] So she starts emailing Tomasik with her concerns because she's worried that the other rationalists aren't vegans, right? [00:17:22] And they don't feel like animal welfare is like the top priority for making sure this AI is good. [00:17:28] And she really wants to convert this whole community to veganism in order to ensure that the singleton is as focused on insect and animal welfare as human welfare. [00:17:38] And Tomasik does care about animal rights, but he disagrees with her because he's like, no, what matters is maximizing the reduction of suffering. [00:17:46] And like a good singleton will solve climate change and shit, which will be better for the animals. [00:17:51] And if we focus on trying to convert everybody in the rationalist space to veganism, it's going to stop us from accomplishing these bigger goals, right? [00:18:00] This is shattering to Ziz, right? [00:18:03] She decides that Tomasik doesn't care about good things, and she decides that she's basically alone in her values. [00:18:10] And so her first move. [00:18:11] It's time to start a smaller subculture. [00:18:14] This sounds like we're on our way. [00:18:18] She first considers embracing what she calls negative utilitarianism. [00:18:23] And this is an example of the fact that from the jump, this is a young woman who's not well, right? [00:18:29] Because once her hero is like, I don't know if veganism is necessarily the priority we have to embrace right now. [00:18:39] Her immediate goal is to jump to, well, maybe what I should do is optimize myself to cause as much harm to humanity as possible and, quote, destroy the world to prevent it from becoming hell for mostly everyone. [00:18:52] So that's a jump, you know? [00:18:54] That's not somebody who's doing well, who is healthy, right? [00:18:58] No, she's having a tough time of it. === Cult Leader Logic (14:00) === [00:19:03] So Ziz does ultimately decide she should still work to bring about a nice AI, even though that necessitates working with people she describes as flesh-eating monsters who had created hell on earth for far more people than those they had helped. [00:19:16] That's everybody who eats meat. [00:19:19] Okay. [00:19:20] Yes, yes. [00:19:21] And it's ironic. [00:19:22] Large group. [00:19:23] It's ironic because, like, she really wants to be in the tech industry, she's trying to get in with all these people in the tech industry.
[00:19:30] That's a pretty good description of a lot of the tech industry. [00:19:33] They are in fact flesh-eating monsters who have created hell on earth for more people than they've helped. [00:19:37] But she means that for like, I don't know, your aunt who has a hamburger once a week. [00:19:42] And look, again, factory farming, evil. [00:19:44] I just don't think that's how morality works. [00:19:50] I think you're going a little far. [00:19:52] No, she's making big jumps. [00:19:54] Yeah, you're making. [00:19:55] Bold thinker. [00:19:58] Yeah. [00:19:59] Now, what you see here with this logic is that Ziz has a massive case of main character syndrome, right? [00:20:06] All of this is based in her attitude that I have to save the universe by helping to create, or figuring out how to create, an AI that can end the eternal holocaust of all animal life and also save humanity, right? [00:20:23] That's a lot of shoulders. [00:20:24] That's me. [00:20:24] It's a lot on our shoulders. [00:20:26] And this is a thing. [00:20:28] Again, all of this comes out of both subcultural aspects and aspects of American culture. [00:20:34] One major problem that we have in this society is Hollywood has trained us all on a diet of movies with main characters that are the special boy or the special girl with the special powers who save the day, right? [00:20:49] And real life doesn't work that way very often, right? [00:20:54] The Nazis, there was no special boy who stopped the Nazis. [00:20:57] There were a lot of farm boys who were just like, I guess I'll go run at a machine gun nest until this is done. [00:21:04] Exactly. [00:21:05] There were a lot of 16-year-old Russians who were like, I guess I'm going to walk into a bullet, you know? [00:21:10] Like, that's how evil gets fought usually, unfortunately. [00:21:15] All reluctant, like, yeah. [00:21:18] Or a shitload of guys in a lab figuring out how to make corn that has higher yields so people don't starve, right? [00:21:25] These are really like how huge world problems get solved. [00:21:30] It's not traditionally people who have been touched, you know? [00:21:33] Yeah. [00:21:33] It's not people who have been touched. [00:21:34] And it's certainly not people who have entirely based their understanding of the world on quotes from Star Wars and Harry Potter. [00:21:47] So some of this comes from just like, this is a normal deranged way of thinking that happens to a lot of people in just Western culture. [00:21:55] I think a lot of this leads to why you get very comfortable middle-class people joining these very aggressive fascist movements in the West, like in Germany. [00:22:05] It's mostly like middle-class and upper-middle-class people in the U.S., especially among like these street-fighting, you know, Proud Boy types. [00:22:13] It's not because they're like suffering and desperate. [00:22:16] They're not starving in the streets. [00:22:19] It's because they're bored and they want to feel like they're fighting an epic war against evil. [00:22:25] Yeah. [00:22:25] I mean, you want to fill your time with importance, right? [00:22:27] Right. [00:22:28] Regardless of what you do. [00:22:29] You want to, and you want to feel like you have a cause worthy of fighting for. [00:22:33] So in that, I guess I see how you got here. [00:22:36] Yeah. [00:22:36] So there's a piece.
[00:22:37] I mean, I think there's a piece of this that originates just from this being something in our culture, but there's also a major chunk of this that gets supercharged by the kind of thinking that's common in EA and rationalist spaces. [00:22:49] Because rationalists and effective altruists are not ever thinking like, hey, how do we as a species fix these major problems, right? [00:22:57] They're thinking, how do I make myself better, optimize myself to be incredible? [00:23:05] And how do I like fix the major problems of the world alongside my mentally superpowered friends, right? [00:23:15] These are very individual-focused philosophies and attitudes, right? [00:23:20] And so they do lend themselves to people who think that like we are heroes who are uniquely empowered to save the world. [00:23:28] Ziz writes, I did not trust most humans' indifference to build a net positive cosmos, even in the absence of a technological convenience to prey on animals. [00:23:37] So like, I'm the only one who has the mental capability to actually create the net positive cosmos that needs to come into being. [00:23:46] All of her discussion is talking in like terms of I'm saving the universe, right? [00:23:51] And a lot of that does come out of the way many of these people talk on the internet about the stakes of AI and just like the importance of rationality. [00:24:00] Again, this is something Scientology does. [00:24:01] L. Ron Hubbard always couched getting people on Dianetics in terms of we are going to save the world and end war, right? [00:24:08] Like this is, you know, it's very normal for cult stuff. [00:24:12] She starts reading around this time when she's in college, Harry Potter and the Methods of Rationality. [00:24:18] This helps to solidify her feelings of her own centrality as a hero figure. [00:24:23] In a blog post where she lays out her intellectual journey, she quotes a line from that fanfic of Yudkowsky's that is, it's essentially about what Yudkowsky calls the hero contract, right? [00:24:36] Or sorry, it's essentially about this concept called the hero contract, right? [00:24:40] And this is a psychological concept among academics, right? [00:24:47] It's about analyzing how we should look at the people who societies declare heroes and the communities that declare them heroes, and see them as in a dialogue, right? [00:25:04] As in, when a country decides this guy's a hero, he is through his actions kind of conversing with them, and they are kind of telling him what they expect from him, right? [00:25:16] But Yudkowsky wrestles with this concept, right? [00:25:19] And he comes to some very weird conclusions about it in one of the worst articles that I've ever read. [00:25:26] He frames it as hero licensing, to refer to the fact that people get angry at you if you're trying to do something and they don't think you have a hero license to do it. [00:25:37] In other words, if you're trying to do something that they don't think you're qualified to do, he'll describe that as them not thinking you have like a hero license. [00:25:47] And he like writes this annoying article that's like a conversation between him and a person who's supposed to embody the community of people who don't think he should write Harry Potter fanfiction. [00:25:57] It's all very silly.
[00:25:59] Again, all this is ridiculous, but Ziz is very interested in the idea of the hero contract, right? [00:26:06] But she comes up with her own spin on it, which she calls the true hero contract, right? [00:26:11] And again, the academic term, the hero contract, means societies and communities pick heroes, and those heroes and the community that they're in are in a constant dialogue with each other about what is heroic and what is expected, right? [00:26:27] What the hero needs from the community and vice versa, you know? [00:26:31] That's all that that's saying. [00:26:32] Ziz says, no, no, no, that's bullshit. [00:26:35] The real hero contract is, quote, pour free energy at my direction and it will go into the optimization for good. [00:26:44] Classic. [00:26:45] In other words, Ziz is saying it's not a dialogue. [00:26:49] If you're the hero, the community has to give you their energy and time and power and you will use it to optimize them for good because they don't know how to do it themselves because they're not really able to think, you know? [00:27:03] Because they're not the hero. [00:27:04] Because they're not the hero, right? [00:27:05] You are. [00:27:06] You are. [00:27:07] You are the all-powerful hero. [00:27:10] Now, this is a fancy way of describing how cult leaders think, right? [00:27:15] Everyone exists to pour energy into me and I'll use it to do what's right, you know? [00:27:20] So this is where her mind is in 2012. [00:27:23] But again, she's just a student posting on the internet and chatting with other members of the subculture at this point. [00:27:30] That year, she starts donating money to MIRI, the Machine Intelligence Research Institute, which is a non-profit devoted to studying how to create friendly AI. [00:27:39] Yudkowsky founded MIRI in 2000, right? [00:27:42] So, this is his like non-profit think tank. [00:27:45] In 2013, she finished an internship at NASA. [00:27:48] So, again, she is a very smart young woman, right? [00:27:51] She gets an internship at NASA and she builds a tool for space weather analysis. [00:27:55] So, she's a person with a lot of potential. [00:27:57] Very, very smart, even as all of the stuff she's writing is like dumb as shit. [00:28:01] But again, intelligence isn't an absolute. [00:28:03] People can be brilliant at coding and have terrible ideas about everything else. [00:28:08] Yes, exactly. [00:28:09] Yeah. [00:28:11] I wonder if she's telling, you think she's telling people at work? [00:28:15] Um, I don't, I don't think at this point she is because she's super insular, right? [00:28:21] She's very uncomfortable talking to people, right? [00:28:23] Okay, she's going to kind of break out of her shell once she gets to San Francisco. [00:28:28] Now, I don't know. [00:28:29] She may have talked to some of them about this stuff, but I really don't think she is at this point. [00:28:33] I don't think she's comfortable enough doing that. [00:28:36] Um, yeah, so she also does an internship at the software giant Oracle. [00:28:41] So, at this point, you've got this young lady who's got a lot of potential, you know, a real career as well. [00:28:46] Yeah, the start of a very real career. [00:28:49] That's a great starting resume for like a 22-year-old. [00:28:53] Yeah. [00:28:53] Um, now at this point, she's torn. [00:28:56] Should she go get a graduate degree, right? [00:28:59] Or should she jump right into the tech industry, you know?
[00:29:02] Um, and she worries that, like, if she waits to get a graduate degree, this will delay her making a positive impact on the existential risk caused by AI, and it'll be too late. [00:29:12] The singularity will happen already, you know. [00:29:15] At this point, she's still a big, a fawning fan of Eliezer Yudkowsky. [00:29:19] And the highest-ranking woman at Yudkowsky's organization, MIRI, is a lady named Susan Salomon. [00:29:26] Susan gives a public invitation to the online community to pitch ideas for the best way to improve the ultimate quality of the singleton that these people believe is inevitable. [00:29:36] In other words, hey, give us your ideas for how to make the inevitable AI god nice, right? [00:29:41] Um, here's what Ziz writes about her response to that: I asked her whether I should try and alter course and do research or continue a fork of my pre-existing life plan, earn to give as a computer engineer, but retrain and try to do research directly instead. [00:29:56] At the time, I was planning to go to grad school, and I had an irrational attachment to the idea. [00:30:00] She sort of compromised and said I should go to grad school, find a startup co-founder, drop out, and earn to give via startups instead. [00:30:09] First off, bad advice, Susan. [00:30:12] Bad advice. [00:30:13] Just be Steve Jobs. [00:30:16] Being Steve Jobs worked for Steve Jobs. [00:30:19] Well, and Bill Gates, I guess, to an extent. [00:30:21] It doesn't work for most people. [00:30:23] No, no, no. [00:30:24] It seems like the general tech disruptor idea, you know? [00:30:29] Yeah, and most people, these people aren't very original thinkers. [00:30:32] Like, yeah, she's just saying, like, yeah, go do a Steve Jobs. [00:30:35] So, um, Ziz does go to grad school. [00:30:39] Uh, and somewhere around that time in 2014, she attends a lecture by Eliezer Yudkowsky on the subject of inadequate equilibria, which is the title of a book that Yudkowsky wrote around that time. [00:30:50] And the book is about where and how civilizations get stuck. [00:30:54] One reviewer, Bryan Caplan, who, despite being a professor of economics, must have a brain as smooth as a pearl, wrote this about it: Every society is screwed up. [00:31:04] Eliezer Yudkowsky is one of the few thinkers on earth who are trying at the most general level to understand why. [00:31:10] And this is like, wow, that's it. [00:31:14] Please study the humanities a little bit, a little bit, a little bit. [00:31:19] I mean, fuck me. [00:31:20] Like, one of the first and most influential works of modern historical scholarship is The Decline and Fall of the Roman Empire. [00:31:28] It's a whole book about why a society fell apart. [00:31:32] And like, motherfucker, more recently, Mike Davis existed. [00:31:36] Like Jesus Christ. [00:31:41] I can't believe this guy continues to get traction. [00:31:44] Nobody else is thinking about why society is screwed up but Eliezer Yudkowsky. [00:31:48] This man. [00:31:49] This man who wrote this. [00:31:50] This guy. [00:31:51] This man who wrote this Harry Potter novel. [00:31:54] Yeah. [00:31:54] No, I was trying to find another. [00:31:55] I read through that Martin Luther King Jr. speech. [00:31:58] Everything's good. [00:32:00] Yeah. [00:32:01] Oh, boy. [00:32:03] Oh, my God. [00:32:04] Oh, my God. [00:32:05] Like, motherfucker, so many people do nothing but try to write about why our society is sick. [00:32:12] You know? [00:32:15] On all levels, by the way. [00:32:16] On all levels.
[00:32:17] Thinking about it. [00:32:18] Everybody's thinking about this. [00:32:21] This is such a common subject of scholarship and discussion. [00:32:26] In the barroom. [00:32:28] What everyone's talking about. [00:32:30] Always. [00:32:31] It would be like if I got really into like reading medical textbooks and was like, you know what? [00:32:37] Nobody's ever tried to figure out how to transplant a heart. [00:32:40] I'm going to write a book about how that might work. [00:32:44] And I think I got it. [00:32:50] Oh, these fucking people. [00:32:55] So, yeah, speaking of these fucking people, have sex with. === Everyone Is Talking About This (03:48) === [00:33:03] Nope. [00:33:04] Well, that's not something. [00:33:05] No, no, I don't know. [00:33:06] I don't know. [00:33:06] Don't fuck. [00:33:08] Listen to ads. [00:33:11] What's up, everyone? [00:33:12] I'm Ego Modem. [00:33:14] My next guest, you know, from Step Brothers, Anchorman, Saturday Night Live, and the Big Money Players Network. [00:33:21] It's Will Farrell. [00:33:25] My dad gave me the best advice ever. [00:33:28] I went and had lunch with him one day, and I was like, and dad, I think I want to really give this a shot. [00:33:33] I don't know what that means, but I just know the groundlings. [00:33:35] I'm working my way up through it. [00:33:37] I know it's a place they come look for up and coming talent. [00:33:39] He said, if it was based solely on talent, I wouldn't worry about you, which is really sweet. [00:33:44] Yeah. [00:33:45] He goes, but there's so much luck involved. [00:33:48] And he's like, just give it a shot. [00:33:49] He goes, but if you ever reach a point where you're banging your head against the wall and it doesn't feel fun anymore, it's okay to quit. [00:33:58] If you saw it written down, it would not be an inspiration. [00:34:00] It would not be on a calendar of, you know, the cat just hang in there. [00:34:07] Yeah, it would not be. [00:34:09] Right, it wouldn't be that. [00:34:10] There's a lot of luck. [00:34:12] Listen to Thanks Dad on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:34:20] In 2023, former bachelor star Clayton Eckard found himself at the center of a paternity scandal. [00:34:27] The family court hearings that followed revealed glaring inconsistencies in her story. [00:34:32] This began a years-long court battle to prove the truth. [00:34:35] You doctored this particular test twice, Miss Owens, correct? [00:34:39] I doctored the test once. [00:34:40] It took an army of internet detectives to crack the case. [00:34:44] I wanted people to be able to see what their tax dollars were being used for. [00:34:48] Sunlight's the greatest disinfectant. [00:34:50] They would uncover a disturbing pattern. [00:34:52] Two more men who'd been through the same thing. [00:34:55] Greg Gillespie and Michael Marancini. [00:34:57] My mind was blown. [00:34:59] I'm Stephanie Young. [00:35:00] This is Love Trap. [00:35:02] Laura, Scottsdale Police. [00:35:04] As the season continues, Laura Owens finally faces consequences. [00:35:09] Ladies and gentlemen, breaking news out of Maricopa County as Laura Owens has been indicted on fraud charges. [00:35:15] This isn't over until justice is served in Arizona. [00:35:20] Listen to Love Trapped podcast on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:35:30] 10-10 shots fired, City Hall building. [00:35:33] A silver .40 caliber handgun was recovered at the scene. 
[00:35:37] From iHeart Podcasts and Best Case Studios, this is Rorschach, murder at City Hall. [00:35:43] How could this have happened in City Hall? [00:35:45] Somebody tell me that, Jeffrey. [00:35:46] What did I? [00:35:47] July 2003. [00:35:49] Councilman James E. Davis arrives at New York City Hall with a guest. [00:35:54] Both men are carrying concealed weapons. [00:35:57] And in less than 30 minutes, both of them will be dead. [00:36:05] Everybody in the chamber ducks. [00:36:08] A shocking public murder. [00:36:10] I scream, get down, get down. [00:36:11] Those are shots. [00:36:12] Those are shots. [00:36:13] Get down. [00:36:13] A charismatic politician. [00:36:15] You know, he just bent the rules all the time, man. [00:36:17] I still have a weapon. [00:36:20] And I could shoot you. [00:36:23] And an outsider with a secret. [00:36:25] He alleged he was a victim of flat down. [00:36:28] That may or may not have been political. [00:36:29] That may have been about sex. [00:36:31] Listen to Rorschach, murder at City Hall on the iHeartRadio app. [00:36:35] Apple Podcasts, or wherever you get your podcasts. [00:36:44] There's two golden rules that any man should live by. [00:36:48] Rule one, never mess with a country girl. === Ethics vs Career Success (15:23) === [00:36:52] You play stupid games, you get stupid prizes. [00:36:54] And rule two, never mess with her friends either. [00:36:58] We always say, trust your girlfriends. [00:37:02] I'm Anna Sinfield, and in this new season of The Girlfriends, oh my God, this is the same man. [00:37:07] A group of women discover they've all dated the same prolific con artist. [00:37:12] I felt like I got hit by a truck. [00:37:14] I thought, how could this happen to me? [00:37:16] The cops didn't seem to care. [00:37:18] So they take matters into their own hands. [00:37:21] I said, oh, hell no. [00:37:23] I vowed I will be his last target. [00:37:25] He's going to get what he deserves. [00:37:30] Listen to The Girlfriends. [00:37:31] Trust me, babe. [00:37:32] On the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:37:45] We're back. [00:37:47] So Ziz is at this speech where Yudkowsky is shilling his book. [00:37:51] And most of what he seems to be talking about in this speech about this book, about why societies fall apart, is how to make a tech startup. [00:38:00] She says, quote, he gave a recipe for finding startup ideas. [00:38:03] He said, Paul Graham's idea, only filter on people, ignore startup ideas, was partial epistemic learned helplessness. [00:38:10] That means Paul Graham is saying, focus on finding good people that you'd start a company with. [00:38:15] Having an idea for a company doesn't matter. [00:38:17] Yudkowsky says, of course, startup ideas mattered. [00:38:20] You needed a good startup idea. [00:38:21] So look for a way the world is broken. [00:38:24] Then compare against a checklist of things you couldn't fix. [00:38:27] You know, right? [00:38:28] Like, that's what this speech is largely about, is him being like, here's how to find startup ideas. [00:38:34] So she starts thinking. [00:38:35] She starts thinking as hard as she can. [00:38:37] And, you know, being a person who has very much got the tech industry brain rot at this point, she comes up with a brilliant idea. [00:38:47] It's a genius idea. [00:38:48] Oh, you're going to, you're going to love this idea, David. [00:38:53] Uber for prostitutes. [00:38:57] You're fucking with me. [00:38:59] No, no.
[00:39:03] That's where she landed. [00:39:05] She lands on the idea of, look. [00:39:07] Oh, wow. [00:39:09] Sex work is illegal, but porn isn't. [00:39:12] So, if we start an Uber whereby a team with a camera and a porn star come to your house and you fuck them and record it, that's a legal loophole. [00:39:25] We just found a loophole to legal prostitution. [00:39:27] Is that not just the bang bus? [00:39:30] She makes the bang bus in the gig economy. [00:39:39] It is really like Don Draper for a moment. [00:39:41] What about Uber, but a pimp? [00:39:44] It's so funny. [00:39:45] These people. [00:39:50] You gotta love it. [00:39:51] You gotta love it. [00:39:52] Wow. [00:39:52] It's wow. [00:39:53] Wow. [00:39:54] What a place to end up. [00:39:55] I would love to see the other drafts. [00:39:58] Yeah. [00:39:58] Yeah. [00:39:58] What came first? [00:40:02] Oh, God. [00:40:04] Yeah. [00:40:06] Man, that's, that's, that is the good stuff, isn't it? [00:40:10] Yeah. [00:40:10] Wow. [00:40:11] Wow. [00:40:14] We got special minds at work here. [00:40:16] Oh, man. [00:40:17] Ultimately, to save it all, I have to be smart. [00:40:21] I have to make pimp Uber. [00:40:24] That's so wild. [00:40:26] Yes. [00:40:26] Yes. [00:40:26] The Uber of pimping. [00:40:28] What an idea. [00:40:29] Now, Ziz devotes her brief time in grad school, while she's working on pimping Uber, to trying to find a partner, right? [00:40:38] She wants to have a startup partner, someone who will embark on this journey with her. [00:40:42] I don't know if that's an investor you need to. [00:40:47] It doesn't work out. [00:40:48] She drops out of grad school because, quote, I did not find someone who felt like good startup co-founder material. [00:40:55] This may be because she's very bad at talking to people and also probably scares people off because the things that she talks about are deeply off-putting. [00:41:04] Yeah, I was going to say, it's also a terrible idea. [00:41:07] And at this point, she hasn't done anything bad. [00:41:09] So I feel bad for her. [00:41:10] This is a person who's very lonely, who's very confused. [00:41:13] She has by this point realized that she's trans, but not transitioned. [00:41:17] This is like a tough place to be. [00:41:20] Right. [00:41:21] That's a hard time. [00:41:22] That's, that's hard. [00:41:23] And nothing about her inherent personality is going to make this easier for her, right? [00:41:28] Who she is makes all of this much harder. She also makes some comments about dropping out because her thesis advisor was abusive. [00:41:41] I don't fully know what this means. [00:41:43] And here's why. [00:41:44] Ziz encounters some behavior from other people, which I will describe later, that is abusive, but she also regularly defines abuse as people disagreeing with her about the only thing that matters being creating an AI god to protect the animals. [00:41:58] So I don't know if her thesis advisor was abusive or was just like, maybe drop the alien god idea for a second. [00:42:05] Like the AI god. [00:42:06] Yeah, yeah. [00:42:07] Maybe focus on like finding a job, you know, making some friends. [00:42:14] Go on a couple dates. [00:42:15] Go on a couple of dates, something like that. [00:42:17] Maybe, maybe, like, maybe put God on the back burner here for a second. [00:42:24] Whatever happened here, she decides it's time to move to the bay. [00:42:27] This is like 2016. [00:42:29] She's going to find a big tech job.
[00:42:31] She's going to make that big tech money while she figures out a startup idea and finds a co-founder who will let her make enough money to change and save the world. [00:42:39] Well, the whole universe. [00:42:41] Her first plan is to give the money to MIRI, Yudkowsky's organization, so it can continue its important work, imagining a nice AI. [00:42:50] Her parents, she's got enough family money that her parents are able to pay for like, I think like six months or more of rent in the bay, which is not nothing. [00:42:58] Not a cheap place to live. [00:43:01] I don't know exactly how long her parents are paying, but like that, that implies a degree of financial comfort, right? [00:43:09] So she gets hired by a startup very quickly because, again, very gifted. [00:43:14] Yeah, with that résumé, right? [00:43:15] Yes. [00:43:17] And it's some sort of gaming company. [00:43:18] But at this point, she's made another change in her ethics system based on Eliezer Yudkowsky's writings. [00:43:26] One of Yudkowsky's writings talks about the difference between consequentialism and virtue ethics, right? [00:43:34] Consequentialists are people who focus entirely on what will the outcome of my actions be. [00:43:40] And it kind of doesn't matter what I'm doing, or even if it's sometimes a little fucked up, if the end result is good. [00:43:47] Virtue ethics people have a code and stick to it, right? [00:43:52] And actually, and I kind of am surprised that he came to this. [00:43:55] Yudkowsky's conclusion is that, like, while logically, on paper, you're more likely to succeed as a consequentialist, [00:44:03] his opinion is that virtue ethics has the best outcome. [00:44:06] People tend to do well when they stick to a code, rather than, like, anything goes as long as I succeed. [00:44:12] Right. [00:44:13] And I think that's actually a pretty decent way to live your life. [00:44:17] No, I was going to say it's a pretty reasonable conclusion for him. [00:44:21] It's a reasonable conclusion for him. [00:44:23] So I don't blame him on this part. [00:44:24] But here's the problem. [00:44:25] Ziz is trying to break into and succeed in the tech industry. [00:44:30] And you are very unlikely to succeed at a high level in the tech industry if you are unwilling to do things, and have things done to you, that are unethical and fucked up. [00:44:44] I'm not saying this is good. [00:44:45] And this is the reality of the entertainment industry too, right? [00:44:50] When I started, I started with an unpaid internship. [00:44:53] Unpaid internships are bad, right? [00:44:55] It's bad that those exist. [00:44:57] They inherently favor people who have money and people who have family connections. [00:45:01] You know, I had like a small savings account from my job in special ed, but that was the standard. [00:45:07] It's like there were a lot of unpaid internships. [00:45:09] It got me my foot in the door. [00:45:11] It worked for me. [00:45:12] I also worked a lot of overtime that I didn't get paid for. [00:45:15] I did a lot of shit that wasn't a part of my job to impress my bosses, to make myself indispensable so that they would decide like we have to keep this guy on and pay him. [00:45:26] And it worked for me.
[00:45:27] And I just wanted to add, because this was not in the original thing, a big part of why it worked for me, and I'm talking about a few different companies here, but particularly at Cracked, where I had the internship, is that my bosses, you know, made a choice to mentor me and, you know, to work overtime on their own behalf to, like, make sure I got a paying job, which is a big part of the luck that I encountered that a lot of people don't. [00:45:54] So that's another major part of why things worked out for me is that I just got incredibly lucky with the people I was working for and with. [00:46:02] That's bad. [00:46:04] It's not good that things work that way, right? [00:46:07] It's not like set up for you either. [00:46:10] Like you kind of defied the odds. [00:46:11] It's, like you said, the rich people who get the job. [00:46:15] Exactly. [00:46:16] It's not even. [00:46:17] Yes. [00:46:17] That said, if someone wants to know what is the most likely path to succeeding, you know, I've just got this job working at this production company or a music studio. [00:46:31] I would say, well, your best odds are to like make yourself completely indispensable and become obsessively devoted to that task. [00:46:40] Right. [00:46:41] That's it. [00:46:42] I don't tend to give that advice anymore. [00:46:45] I have, and I have had several other friends succeed as a result of it. [00:46:49] And all of us also burnt ourselves out and did huge amounts of damage to ourselves. [00:46:54] Like I am permanently broken as a result of, you know, the 10 years that I did 80-hour weeks and shit. [00:47:01] You know, now you're sounding like somebody who works in the entertainment industry. [00:47:05] Yes. [00:47:05] Yes. [00:47:06] And it worked for me, right? [00:47:08] I got a, I got a, I succeeded. [00:47:11] I got a great job. [00:47:12] I got money. [00:47:13] Um, most people, it doesn't work for, and it's bad that it works this way. [00:47:17] Ziz, unlike me, is not willing to do that, right? [00:47:21] She thinks it's wrong to be asked to work overtime and not get paid for it. [00:47:26] And so, on her first day at the job, she leaves after eight hours. [00:47:29] And her boss is like, What the fuck are you doing? [00:47:31] And she's like, I'm supposed to be here eight hours. [00:47:35] Eight hours is up. [00:47:36] I'm going home. [00:47:37] And he calls her half an hour later and fires her, right? [00:47:41] And this is because the tech industry is evil, you know? [00:47:44] Like, this is bad. [00:47:46] She's not bad here. [00:47:48] It is like a thing where she's not doing, by her standards, what I would say is the rational thing, which would be, if all that matters is optimizing your earning power, right? [00:47:59] Right. [00:48:00] Well, then you do this. [00:48:01] Then you do whatever it takes, right? [00:48:03] Um, so it's kind of interesting to me, like, that she is so devoted to this virtue ethics thing at this point that she fucks over her career in the tech industry because she's not willing to do the things that you kind of need to do to succeed, you know, in the place that she is. [00:48:20] But it's interesting. [00:48:21] I don't like give her any shit for that. [00:48:23] So she asks her parents for more runway to extend her time in the bay. [00:48:27] And then she finds work at another startup. [00:48:29] But the same problems persist.
[00:48:30] Quote, they kept demanding that I work unpaid overtime, talking about how other employees just always put 40 hours on their time sheet no matter what. [00:48:38] And this exemplary employee over there worked 12 hours a day and he really went the extra mile and got the job done. [00:48:43] And they needed me to really go the extra mile and get the job done. [00:48:48] She's not willing to do that. [00:48:49] And again, I hate that this is part of what drives her to the madness that leads to the cult, to the killings, because it's like, oh, honey, you're in the right. [00:48:57] It's an evil industry. [00:48:59] You see a flash of where it could have gone well. [00:49:01] It really, there were chances for this to work out. [00:49:04] No, you are 100% right. [00:49:07] Like this is fucked up. [00:49:08] Yeah. [00:49:09] You know what I mean? [00:49:10] And that's super hard. [00:49:12] I really respect that part of you. [00:49:13] Oh, it's sad that this is part of what shatters your brain. [00:49:21] Like that really bums me out. [00:49:25] So first off, she kind of starts spiraling, and she concludes that she hates virtue ethics. [00:49:30] This is where she starts hating Yudkowsky, right? [00:49:32] She doesn't break entirely with him yet, but she gets really angry at this point because she's like, well, obviously virtue ethics don't work. [00:49:41] And she's been following this man at this point for years. [00:49:44] Exactly. [00:49:45] Exactly. [00:49:45] So this is a very like damaging thing to her that this happens. [00:49:51] And, you know, and again, as much as I blame Yudkowsky, the culture of the Bay Area tech industry, that's a big part of what drives this person to where she ends up. [00:50:01] Right. [00:50:02] So that said, some of her issues are also rooted in a kind of rigid and unforgiving internal rule set. [00:50:09] At one point, she negotiates work with a professor and their undergraduate helper. [00:50:13] She doesn't want to take an hourly job and she tries to negotiate a flat rate of 7K. [00:50:18] And they're like, yeah, okay, that sounds fair, but the school doesn't do stuff like that. [00:50:23] So you will have to fake some paperwork with me for me to be able to get them to pay you $7,000. [00:50:29] And she isn't willing to do that. [00:50:31] And that's the thing where it's like, ah, no, I've had some shit where this was like, there was a stupid rule. [00:50:36] And like, in order for the people, me or other people, to get paid, we had to like tell something else to the company. [00:50:43] Like, that's just, that's just knowing how to get by. [00:50:47] Yeah, that's, that's living in the world. [00:50:49] You got, yeah, you did the hard part. [00:50:51] Yeah. [00:50:51] They said they were going to do it. [00:50:53] They said they'd do it. [00:50:54] Yeah, that's like, they already said, we don't do this. [00:50:56] That's where you're like, all right. [00:50:57] You just, you can't get by in America if you're not willing to lie on certain kinds of paperwork, right? [00:51:04] That's the game. [00:51:05] Our president does it all the fucking time. [00:51:07] He's the king of that shit. [00:51:12] So at this point, Ziz is stuck in what she considers a calamitous situation. [00:51:16] The prophecy of doom, as they call it, is ticking ever closer, which means the bad AI that's going to create hell for everybody. [00:51:25] Her panic over this is elevated by the fact that she starts to get obsessed with Roko's basilisk at this time.
[00:51:31] I know. [00:51:32] I know, I know. [00:51:33] Worst thing for her to read. [00:51:35] Come on. [00:51:36] What do they call it, an info hazard? [00:51:38] An info hazard. [00:51:39] She should have heeded the warnings. [00:51:40] Yep. [00:51:41] And a lot of the smarter rationalists are just annoyed by it. [00:51:45] Again, Yudkowsky very quickly decides it's bullshit and bans discussion of it. [00:51:52] He argues there's no incentive for a future agent to follow through with that threat because by doing so, it just expends resources at no gain to itself, which is like, yeah, man, a hyper-logical AI would not immediately jump to, I must make hell for everybody who didn't code me. [00:52:08] Like, that's just crazy. [00:52:10] There's some steps skipped. [00:52:11] Yeah. [00:52:12] Only humans are like ill in that way. === Info Hazards and Bans (04:10) === [00:52:15] That's the funny thing about it is it's such a human response to it. [00:52:18] Yeah. [00:52:18] Right, right. [00:52:20] Now, when she encounters the concept of Roko's basilisk, at first, Ziz thinks that it's silly, right? [00:52:26] She kind of rejects it and moves on. [00:52:28] But once she gets to the bay, she starts going to in-person rationalist meetups and having long conversations with other believers who are still talking about Roko's basilisk. [00:52:38] She writes, I started encountering people who were freaked out by it, freaked out that they had discovered an improvement to the infohazard that made it function, got around Eliezer's objection. [00:52:48] Her ultimate conclusion is this. [00:52:50] If I persisted in trying to save the world, I would be tortured until the end of the universe by a coalition of all unfriendly AIs in order to increase the amount of measure they got by demoralizing me. [00:53:02] Even if my system two had good decision theory, my system one did not. [00:53:06] And that would damage my effectiveness. [00:53:09] And like, I can't explain all of the terms in that without taking more time than we need to, but like, you can hear, like, that is not the writing of a person who is thinking in logical terms. [00:53:17] No, it's, it's a, it's so scary. [00:53:22] Yes, yes. [00:53:23] It's very scary stuff. [00:53:25] It's so scary to be like, oh, that's where she was operating. [00:53:28] Those are the stakes. [00:53:29] This is what she's dealing with. [00:53:32] That's, that's. [00:53:33] It is. [00:53:34] You know, I talk to my friends who were raised in like very toxic chunks of the evangelical subculture and grow up and spend their whole childhood terrified of hell. [00:53:44] That like everything, you know, I got angry at my mom and I didn't say anything, but God knows I'm angry at her and he's going to send me to hell because I didn't respect my mother. [00:53:53] Like that's what she's doing, right? [00:53:55] Exactly. [00:53:56] Exactly. [00:53:56] She can't win. [00:53:57] There's no winning. [00:53:58] Yes. [00:53:58] Yes. [00:53:59] And again, I say this a lot. [00:54:02] We need to put lithium back in the drinking water. [00:54:05] We got to put lithium back in the water. [00:54:07] Maybe Xanax too. [00:54:09] She needed, she could have taken a combo. [00:54:12] Yeah. [00:54:13] Getting rid of that. [00:54:15] Before it gets to where it gets, at this point, you really, you really feel for her, like, just living like that. [00:54:23] Every day, she's so scared that this is what she's doing.
[00:54:27] It's, it's this, this is, she is the therapy-needingest woman I have ever heard of at this point. [00:54:32] Oh my God. [00:54:34] She just needs to talk to, she needs to talk to people. [00:54:37] Again, you know, the cult, the thing that happens to cult members has happened to her where her, she, her, the whole language she uses is incomprehensible to people. [00:54:47] I had to talk to you for an hour and 15 minutes so you would understand parts of what this lady says, right? [00:54:54] Exactly. [00:54:54] Because you have to, because it's all nonsense if you don't do that work. [00:54:57] Exactly. [00:54:58] She's so spun out at this point. [00:55:00] It's like, how do you even get back? [00:55:02] Yeah. [00:55:03] How do you even get back? [00:55:04] Yeah. [00:55:04] So she ultimately decides, even though she thinks she's doomed to be tortured by unfriendly AIs, evil gods must be fought. [00:55:11] If this damns me, then so be it. [00:55:13] She's very heroic. [00:55:16] She sees herself that way, right? [00:55:18] Yeah. [00:55:19] And even like just with her convictions and things, she does. [00:55:23] She does. [00:55:24] She does it. [00:55:25] She's a woman of conviction. [00:55:27] You really can't take that away from her. [00:55:30] Those convictions are nonsense. [00:55:32] No, that's the. [00:55:34] But they're there. [00:55:35] They're based on an elaborate Harry Potter fan fiction. [00:55:39] It's like David Icke, the guy who believes in like literal lizard people. [00:55:42] And everyone thinks he's like talking about the Jews, but like, no, no, no, no, he does. [00:55:46] Just lizards. [00:55:47] It's exactly that. [00:55:48] Where it's just like you want to draw. [00:55:51] You want to draw something so it's not nonsense. [00:55:54] And then you realize, no, that's. [00:55:55] No, no, no, no. [00:55:56] And like David, he went out. [00:55:57] He's made like a big rant against how Elon Musk is like evil for all the people he's hurt by firing the whole federal government. [00:56:04] People were shocked. [00:56:05] It's like, no, no, no. [00:56:06] David Icke believes in a thing. [00:56:08] It's just crazy. [00:56:10] Yeah, yeah, yeah. [00:56:11] Like those people do exist. [00:56:14] Yeah. [00:56:15] Here we are talking about them. [00:56:16] And here we are talking about them. [00:56:18] Some of them run the country. [00:56:20] Well, actually, I don't know how much all of those people believe in anything, but. [00:56:23] No, I don't think they're flying any flag. === Sponsors Believe in Lizards (03:59) === [00:56:25] Yeah. [00:56:27] Yeah. [00:56:27] Speaking of people who believe in something, our sponsors believe in getting your money. [00:56:38] What's up, everyone? [00:56:39] I'm Ago Modem. [00:56:40] My next guest, you know, from Step Brothers, Anchorman, Saturday Night Live, and the Big Money Players Network, it's Will Ferrell. [00:56:51] My dad gave me the best advice ever. [00:56:55] I went and had lunch with him one day, and I was like, and dad, I think I want to really give this a shot. [00:57:00] I don't know what that means, but I just know the Groundlings. [00:57:02] I'm working my way up through and I know it's a place to come look for up-and-coming talent. [00:57:06] He said, if it was based solely on talent, I wouldn't worry about you, which is really sweet. [00:57:11] Yeah. [00:57:12] He goes, but there's so much luck involved. [00:57:14] And he's like, just give it a shot.
[00:57:16] He goes, but if you ever reach a point where you're banging your head against the wall and it doesn't feel fun anymore, it's okay to quit. [00:57:24] If you saw it written down, it would not be an inspiration. [00:57:27] It would not be on a calendar of, you know, the cat just hang in there. [00:57:34] Yeah, it would not be. [00:57:36] Right, it wouldn't be that. [00:57:37] There's a lot of luck. [00:57:39] Listen to Thanks Dad on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:57:47] In 2023, former bachelor star Clayton Eckard found himself at the center of a paternity scandal. [00:57:54] The family court hearings that followed revealed glaring inconsistencies in her story. [00:57:59] This began a years-long court battle to prove the truth. [00:58:02] You doctored this particular test twice, Ms. Owens, correct? [00:58:06] I doctored the test once. [00:58:07] It took an army of internet detectives to crack the case. [00:58:11] I wanted people to be able to see what their tax dollars were being used for. [00:58:14] Sunlight's the greatest disinfectant. [00:58:17] They would uncover a disturbing pattern. [00:58:19] Two more men who'd been through the same thing. [00:58:21] Greg Gillespie and Michael Mancini. [00:58:24] My mind was blown. [00:58:25] I'm Stephanie Young. [00:58:27] This is Love Trapped. [00:58:29] Laura, Scottsdale Police. [00:58:31] As the season continues, Laura Owens finally faces consequences. [00:58:35] Ladies and gentlemen, breaking news out of Maricopa County as Laura Owens has been indicted on fraud charges. [00:58:42] This isn't over until justice is served in Arizona. [00:58:47] Listen to Love Trapped podcast on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:58:56] 10-10, shots fired, City Hall building. [00:59:00] A silver .40 caliber handgun was recovered at the scene. [00:59:04] From iHeart Podcasts and Best Case Studios, this is Rorschach, murder at City Hall. [00:59:10] How could this have happened in City Hall? [00:59:12] Somebody tell me that. [00:59:12] Jeffrey Hood did. [00:59:14] July 2003. [00:59:16] Councilman James E. Davis arrives at New York City Hall with a guest. [00:59:21] Both men are carrying concealed weapons. [00:59:24] And in less than 30 minutes, both of them will be dead. [00:59:32] Everybody in the chamber ducks. [00:59:35] A shocking public murder. [00:59:36] I screamed, get down, get down. [00:59:38] Those are shots. [00:59:39] Those are shots. [00:59:40] Get down. [00:59:40] A charismatic politician. [00:59:42] You know, he just bent the rules all the time. [00:59:44] I still have a weapon. [00:59:46] And I could shoot you. [00:59:49] And an outsider with a secret. [00:59:51] He alleged you to be a victim of flat down. [00:59:54] That may or may not have been political. [00:59:56] That may have been about sex. [00:59:58] Listen to Rorschach, murder at City Hall on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [01:00:11] There's two golden rules that any man should live by. [01:00:15] Rule one, never mess with a country girl. [01:00:18] You play stupid games, you get stupid prizes. [01:00:21] And rule two, never mess with her friends either. === Morally Valuable Self Help (15:45) === [01:00:25] We always say, trust your girlfriends. [01:00:28] I'm Anna Sinfield, and in this new season of The Girlfriends. [01:00:32] Oh my God, this is the same man. [01:00:34] A group of women discover they've all dated the same prolific con artist.
[01:00:39] I felt like I got hit by a truck. [01:00:41] I thought, how could this happen to me? [01:00:43] The cops didn't seem to care. [01:00:45] So they take matters into their own hands. [01:00:48] I said, oh, hell no. [01:00:49] I vowed I will be his last target. [01:00:52] He's going to get what he deserves. [01:00:56] Listen to the girlfriends. [01:00:58] Trust me, babe. [01:00:59] On the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [01:01:12] We're back. [01:01:13] So, she is at this point suffering from delusions of grandeur, and those are going to rapidly lead her to danger. [01:01:21] But she concludes that since the fate of the universe is at stake in her actions, she would make a timeless choice to not believe in the basilisk, right? [01:01:30] And that that will protect her in the future, because that's how these people talk about stuff like that. [01:01:37] So she gets over her fear of the basilisk for a little while. [01:01:42] But even when she claims to have rejected the theory, whenever she references it in her blog, she like locks it away under a spoiler with like an infohazard warning. [01:01:51] Roko's basilisk family, skippable. [01:01:54] So you don't like have to see it and have it destroy your psyche. [01:01:58] That's the power of it. [01:02:00] Yeah. [01:02:01] The concept does, however, keep coming back to her, like, and continuing to drive her mad. [01:02:06] Thoughts of the basilisk return and eventually she comes to an extreme conclusion. [01:02:10] If what I cared about was sentient life and I was willing to go to hell to save everyone else, why not just send everyone else to hell if they didn't submit? [01:02:19] Can I tell you, I really, it felt like this is what was this is where it had to go, right? [01:02:25] Yeah, yeah, yes. [01:02:29] So what she means here is that she is now making the timeless decision that when she is in a position of ultimate influence and helps bring this all-powerful vegan AI into existence, she's promising now ahead of time to create a perfect hell, a digital hell, to like punish all of the people who don't stop, who like eat meat ever. [01:02:52] She wants to make a hell for people who eat meat. [01:02:54] And that's the, yeah, that's the conclusion that she makes, right? [01:02:59] This becomes an intrusive thought in her head, primarily the idea that like everyone isn't going along with her, right? [01:03:06] Like, she doesn't want to create this hell. [01:03:08] She just thinks that she has to. [01:03:09] So she's like very focused on trying to convince these other people in the rationalist culture to become vegan. [01:03:17] Anyway, she writes this: I thought it had to be subconsciously influencing me, damaging my effectiveness, that I had done more harm than I could imagine by thinking these things because I had the hubris to think infohazards didn't exist, and worse, to feel resigned, a grim sort of pride in my previous choice to fight for sentient life, although it damned me. [01:03:35] And in the gaps between, do not think about that you moron, do not think about that you moron, pride which may have led to intrusive thoughts to resurface and progress and progress to resume. [01:03:45] In other words, my ego had perhaps damned the universe. [01:03:49] So, man, I don't fully get all of what she's saying here, but it's also because she's like just spun out into madness at this point. [01:03:58] Yeah, she's, she, that's, she lives in it now.
[01:04:01] It's so, yeah, it's so far, we've been talking about it for however long, and she's already, she's so far away from us, even. [01:04:08] Yeah, and it is, it is deeply, I've read a lot of her writing. [01:04:12] It is deeply hard to understand pieces of it here. [01:04:16] Man, but she is at war with herself, clearly. [01:04:19] She is for sure at war with herself. [01:04:21] Um, now Ziz is at this point attending rationalist events by the bay, and a lot of the people at those events are older, more influential men, some of whom are influential in the tech industry, all of whom have a lot more money than her. [01:04:36] And some of these people are members of an organization called CFAR, the Center for Applied Rationality, which is a nonprofit founded to help people get better at pursuing their goals. [01:04:47] It's a self-help company, right? [01:04:48] It runs self-help seminars. [01:04:50] This is the same as like a Tony Robbins thing, right? [01:04:54] We're all just trying to get you to sign up and then get you to sign up for the next workshop and the next workshop and the next workshop, like all self-help people do. [01:05:03] Yeah, there's no difference between this and Tony Robbins. [01:05:06] So Ziz goes to this event and she has a long conversation with several members of CFAR who, I think, are clearly, kind of, my interpretation of this is that they're trying to groom her to get a new recruit, because they think, yeah, chick's clearly brilliant. [01:05:22] She'll find her way in the industry and we want her money, right? [01:05:25] You know, maybe we want her to do some free work for us too, but like, let's, let's, you know, we got to reel this fish in, right? [01:05:33] So this is described as an academic conference by people who are in the AI risk field and rationalism, you know, thinking of ways to save the universe, because only the true, the super geniuses can do that. [01:05:46] The actual reason why I'm really glad that I read Ziz's account here is I've been reading about these people for a long time. [01:05:53] I've been reading about their beliefs. [01:05:55] I felt there's some cult stuff here. [01:05:59] When Ziz laid out what happened at this seminar, this self-help seminar put on by these people very close to Yudkowsky, it's almost exactly the same as a Synanon meeting. [01:06:12] Like it's the same stuff. [01:06:14] It's exact, and it's the same shit. [01:06:16] It's the same as accounts of like big self-help movement things from like the 70s and stuff that I've read. [01:06:24] That's when it really clicked to me, right? [01:06:26] Quote, here's a description of one of the, because they have, you know, speeches and they break out into groups to do different exercises, right? [01:06:34] There were Hamming circles: per person, take turns having everyone else spend 20 minutes trying to solve the most important problem about your life to you. [01:06:42] I didn't pick the most important problem in my life because secrets. [01:06:45] I think I used my turn on a problem I thought they might actually be able to help with. [01:06:49] The fact that it did, although it didn't seem to affect my productivity or willpower at all, i.e., I was inhumanly determined basically all the time, I still felt terrible all the time, that I was hurting, to some degree, from relinquishing my humanity. [01:07:03] I was sort of vaguing about the pain of being trans and having decided not to transition. [01:07:07] And so, like, this is a part of the thing.
[01:07:10] You build a connection between other people and this group by getting people to like spill their secrets to each other. [01:07:15] It's a thing Scientology does. [01:07:16] It's a thing they did at Synanon. [01:07:18] Tell me your darkest secret, right? [01:07:21] And she's not fully willing to because she doesn't want to come out to this group of people yet. [01:07:28] And, you know, part of what happens. [01:07:29] Hey, forget that she's also dealing with that entire. [01:07:32] Yes. [01:07:33] Wow. [01:07:34] Yeah. [01:07:35] And the Hamming circle doesn't sound so bad. [01:07:38] If you'll recall, as you mentioned this, I was really good at it in part one. [01:07:42] Synanon would have people break into circles where they would insult and attack each other in order to create a traumatic experience that would bond them together and with the cult. [01:07:49] These Hamming circles are weird, but they're not that. [01:07:53] But there's another exercise they did next called doom circles. [01:07:57] Quote, there were doom circles where each person, including themselves, took turns having everyone else bluntly but compassionately say why they were doomed using blindsight. [01:08:08] Someone decided and set a precedent of starting these off with a sort of ritual incantation. [01:08:13] We now invoke and bow to the doom gods and waving their hands, saying, doom. [01:08:18] I said I'd never bow to the doom gods. [01:08:19] And while everyone else said that, I flipped the double bird to the heavens and said, fuck you instead. [01:08:24] Person A, that's this member of CFAR that she admires, found this agreeable and joined in. [01:08:31] Some people brought up that they felt like they were only as morally valuable as half a person. [01:08:36] This irked me. [01:08:37] I said they were whole persons and don't be stupid like that. [01:08:41] Like if they wanted to sacrifice themselves, they could weigh one versus seven billion. [01:08:45] They didn't have to falsely denigrate themselves as less than one person. [01:08:49] They didn't listen. [01:08:50] When it was my turn concerning myself, I said my doom was that I could succeed at the things I tried, succeed exceptionally well. [01:08:57] Like I bet I could in 10 years have earned to give like $10 million through startups and it would still be too little too late. [01:09:03] Like I came into this game too late. [01:09:04] The world would still burn. [01:09:08] And first off, like this is, you know, it's a variant of the Synanon thing. [01:09:12] You're going around, you're telling people why they're doomed, right? [01:09:14] Like why they won't succeed in life, you know? [01:09:17] But it's also one of the things here, these people are saying they feel like less than a person. [01:09:22] A major topic of discussion in the community at the time is: if you don't think you can succeed in business and make money, is the best thing with the highest net value you can do taking out an insurance policy on yourself and committing suicide? [01:09:38] Oh my God. [01:09:39] And then having the money donated to a rationalist organization. [01:09:42] That's a major topic of discussion that like Ziz grapples with. [01:09:45] A lot of these people grapple with, right? [01:09:47] Because they were obsessed with the idea of like, oh my God, I might be net negative value, right? [01:09:52] If I can't do this or can't do this, I could be a net negative value individual. [01:09:56] And that means like I'm not contributing to the solution.
[01:09:59] And there's nothing worse than not contributing to the solution. [01:10:03] Were there people who did that? [01:10:06] I am not aware. [01:10:10] There are people who commit suicide in this community. [01:10:13] I will say that. [01:10:14] Like there are a number of suicides tied to this community. [01:10:17] I don't know if the actual insurance con thing happened, but it's like a seriously discussed thing. [01:10:24] And it's seriously discussed because all of these people talk about the value of their own lives in purely like mechanistic terms: how much money or expected value can I produce? [01:10:37] Like that is a person and that's why a person matters, right? [01:10:41] And the term they use is morally valuable, right? [01:10:44] Like that's, that's what means you're a worthwhile human being, if you're morally valuable, if you're creating a net positive benefit to the world in the way they define it. [01:10:53] And so a lot of these people are, yes, there are people who are depressed and there are people who kill themselves because they come to the conclusion that they're a net negative person, right? [01:11:03] Like that is a thing at the edge of all of this shit that's really fucked up. [01:11:08] And that's what this doom circle is about, is everybody like flipping out over and telling each other, I think you might only be as morally valuable as half a person, right? [01:11:19] Like that's, people are saying that, right? [01:11:20] Like that's what's going on here, you know? [01:11:23] Like it's not the Synanon thing of like screaming, like, you're a, you know, using the F slur a million times or whatever, but it's very bad. [01:11:32] No, this is, this is, this is awful. [01:11:35] For like one thing, I don't know. [01:11:37] My feeling is you have an inherent value because you're a person. [01:11:42] Yeah, that's a great place to start. [01:11:44] You know? [01:11:45] This is also leading people to destroy themselves. [01:11:48] Like it's not even. [01:11:50] It's, it's so, it's such a bleak way of looking at things. [01:11:55] It's so crazy, too. [01:11:56] Where were these meetings? [01:11:56] I just, in my head, I'm like, this is just happening in like a ballroom at a Radisson? [01:12:01] I think it is. [01:12:02] Or a convention center. [01:12:04] You know, there's different kinds of public spaces. [01:12:06] I don't know. [01:12:06] Like, honestly, if you've been to like an anime convention or a Magic the Gathering convention somewhere in the Bay, you may have been in one of the rooms they did these in. [01:12:13] I don't know exactly where they hold this. [01:12:16] So the person A mentioned above, this like person who's like affiliated with the organization, who I think is a recruiter looking for young people who can be cultivated to pay for classes, right? [01:12:29] This person, it's very clear to them that Ziz is at the height of her vulnerability. [01:12:34] And so he tries to take advantage of that. [01:12:36] So he and another person from the organization engage Ziz during a break. [01:12:41] Ziz, who's extremely insecure, asks them point blank, what do you think my net value ultimately will be in life? [01:12:49] Right. [01:12:50] And again, there's like an element of this. [01:12:52] It's almost like rationalist Calvinism, where it's like, it's actually decided ahead of time by your inherent immutable characteristics, you know, if you are a person who can do good. [01:13:01] Quote, I asked person A if they expected me to be net negative.
[01:13:05] They said yes. [01:13:06] After a moment, they asked me what I was feeling or something like that. [01:13:09] I said something like dazed and sad. [01:13:11] They asked why sad. [01:13:13] I said, I might leave the field as a consequence and maybe something else. [01:13:16] I said I needed time to process or think. [01:13:18] And so she like, she goes home after this guy saying like, yeah, I think your life's probably net negative value, and sleeps the rest of the day. [01:13:26] And she wakes up the next morning and comes back to the second day of this thing. [01:13:33] And yeah, Ziz goes back and she tells this person, okay, here's what I'm going to do. [01:13:39] I'm going to pick a group of three people at the event I respect, including you. [01:13:44] And if two of them vote that they think I have a net negative value, quote, I'll leave EA and existential risk and the rationalist community and so on forever. [01:13:54] I'd transition and move probably to Seattle. [01:13:57] I heard it was relatively nice for trans people, and there do what I could to be a normie, retool my mind as much as possible to be stable, unchanging, and a normie. [01:14:05] Gradually abandon my Facebook account and email, use a name change as a story for that. [01:14:10] And God, that would have been the best thing for her. [01:14:13] That's what I'm, oh, you see like that sliver of hope. [01:14:16] Like, oh man. [01:14:18] She sees this as a nightmare, right? [01:14:21] This is the worst case scenario for her, right? [01:14:25] Because you're not spun out, right? [01:14:27] You're not part of the solution. [01:14:28] You're not part of the cause, you know? [01:14:31] You have no involvement in the great quest to save humanity. [01:14:35] That's worse than death almost, right? [01:14:36] That's its own kind of hell, though, right? [01:14:38] To think that you had this enlightenment and that you, that you weren't good enough to participate, your best efforts. And she writes a lot about how, I'd probably just kill myself. [01:14:48] You know, that's the logical thing to do. [01:14:50] Um, it's so fucked up, it's so fucked up. [01:14:54] And also, if she's trying to live a normal life as a normie, and she refers to like being a normie as like just trying to be nice to people, because again, that's useless. [01:15:04] Um, so her fear here is that she would be a causal negative if she does this, right? [01:15:09] Um, and also the robot god that comes about might put her in hell, right? [01:15:15] Because that's also looming after every, for every decision, right? [01:15:19] Yeah, and a thing here, she, she expressed, she tells these guys a story, and it really shows, both in this community and among her, how little value they actually have for like human life. [01:15:29] I told a story about a time I had killed four ants in a bathtub where I wanted to take a shower before going to work. [01:15:35] I'd considered, can I just not take a shower? [01:15:38] And presumed me smelling bad at work would, because of big numbers and the fate of the world and stuff, make the world worse than the deaths of four basically causally isolated people. [01:15:48] I considered getting paper and a cup and taking them elsewhere, and I figured there were decent odds if I did, I'd be late to work and it would probably make the world worse in the long run. [01:15:55] So, again, she considers ants identical to human beings.
[01:15:59] And she is also saying it was worth killing four of them because they're causally isolated so that I could get to work in time because I'm working for the cause. === Becoming a Rational Psychopath (10:49) === [01:16:10] She's also in such a bad place here. [01:16:13] Yeah. [01:16:13] The crazy thing about her is, like, the amount of thinking just to like get in the shower to go to work. [01:16:22] You know, you know what I mean? [01:16:23] Like that, that, ah, man, it just seems like it makes everything. [01:16:28] Yes. [01:16:29] Every action is so loaded. [01:16:31] Yes. [01:16:32] Yes. [01:16:32] The weight of that must be. [01:16:34] It's so, it's, it's wild to me, both this like mix of like fucking Jain Buddhist compassion of like, an ant is no less than I, or an ant is no less than a human being, right? [01:16:46] We are all, these are all lives. [01:16:48] And then, but also it's fine for me to kill a bunch of them to go to work on time because like they're causally isolated. [01:16:53] So they're basically not people. [01:16:55] Like it's so weird. [01:16:59] Like, um, and I, and again, it's getting a lot clearer here why this lady and her ideas end in a bunch of people getting shot. [01:17:09] Yeah. [01:17:09] And stabbed. [01:17:11] Okay. [01:17:12] There's a samurai sword later in the story, my friend. [01:17:16] That's the one thing this has been missing. [01:17:18] Yes, yes. [01:17:19] Um, so they continue, these guys, uh, to have a very abusive conversation with this young person. [01:17:25] And she clearly, she trusts them enough to- This is a conversation where she asks for the two. [01:17:29] Yeah. [01:17:29] Okay. [01:17:30] Yeah. [01:17:30] And she tells them she's trans, right? [01:17:33] And this gives you an idea of like how kind of predatory some of the stuff going on in this community is. [01:17:38] They asked what I'd do with a female body. [01:17:41] They were trying to get me to admit what I actually wanted to do as the first thing in heaven. [01:17:45] Heaven being, there's this idea, especially amongst like some trans members of the rationalist community, that like every, all of them basically believe a robot's going to make heaven, right? [01:17:54] And obviously, like there's a number of the folks who are in this who are trans are like, and in heaven, like you just kind of get the body you want immediately, right? [01:18:02] So these guys, they were trying to get me to admit that what I actually wanted to do as the first thing in heaven was masturbate in a female body. [01:18:11] And they follow this up by sitting really close to her, close enough that she gets uncomfortable. [01:18:17] And then a really, really rationalist conversation follows. [01:18:21] They asked if I felt trapped. [01:18:23] I may have clarified physically. [01:18:25] They may have said, sure. [01:18:26] Afterward, I answered no to that question under the likely justified belief that it was framed that way. [01:18:31] They asked why not. [01:18:32] I said I was pretty sure I could take them in a fight. [01:18:35] They prodded for details, why I thought so, and then how I thought a fight between us would go. [01:18:39] I asked what kind of fight, like a physical unarmed fight to the death right now, and why? [01:18:44] What were my payouts? [01:18:45] This was over the fate of the multiverse, triggering actions by other people, i.e., imprisonment or murder, was not relevant.
[01:18:51] So they decide to make this into, again, these people are all addicted to dumb game theory stuff, right? [01:18:56] Okay, so what is this fight? [01:18:58] Is this fight over the fate of the multiverse? [01:18:59] Are we in an alternate reality where like no one will come and intervene and there's no cops? [01:19:04] We're the only people in the world or whatever. [01:19:06] So they tell her, like, yeah, imagine there's no consequences, legally, whatever, to what you do, and we're fighting over the fate of the multiverse. [01:19:12] And so she proceeds to give an extremely elaborate discussion of how she'll gouge out their eyes and try to destroy their prefrontal lobes and then stomp on their skulls until they die. [01:19:21] And it's both, it's like, it's nonsense. [01:19:24] It's like how 10-year-olds think fights work. [01:19:27] It's also, it's based on this game theory attitude of fighting that they have, which is like, you have to make this kind of timeless decision that any fight is, you're just going to murder. [01:19:37] So you have to go to the hardest confrontation, right? [01:19:39] Yes. [01:19:39] So you would have to be the most violent. [01:19:42] Yes. [01:19:42] Because that will make other people not want to attack you, as opposed to like what normal people understand about like real fights, which is if you have to do one, if you have to, you like try to just like hit them somewhere that's going to shock them and then run like a motherfucker. [01:19:58] Yeah. [01:19:58] Right. [01:19:58] You get the fastest possible. [01:20:01] Like if you have to, like, ideally just run like a motherfucker, but if you have to strike somebody, you know, yeah, go for the eye and then run like a son of a bitch, you know? [01:20:09] Like, but there's no run like a son of a bitch here because the point in part is this like timeless decision to, anyway, this gives you, tells you a lot about the rationalist community. [01:20:21] So she tells these people, she explains in detail how she would murder them if they had a fight right now. [01:20:26] As they're like sitting next to her, super close, having just asked her about masturbation. [01:20:31] Here's their first question: Quote, they asked if I'd rape their corpse. [01:20:35] Part of me insisted this was not going as it was supposed to, but I decided, I decided inflicting discomfort in order to get reliable information was a valid tactic. [01:20:43] In other words, them trying to make her uncomfortable to get info from her, she decides is fine. [01:20:49] Also, the whole discussion about raping their corpses is like, well, if you rape, obviously, like if you want to have the most extreme response possible that would like make other people unlikely to fuck with you, knowing that you'll violate their corpse if you kill them is clearly the like, and like that really is that. [01:21:04] Okay, sure. [01:21:05] I love rational thought. [01:21:08] Oh, man. [01:21:11] Damn, this is crazy. [01:21:12] Sorry. [01:21:14] This is so crazy. [01:21:15] It's so nuts. [01:21:17] So then they talk about psychopathy. [01:21:20] One of these guys had earlier told Ziz that they thought she was a psychopath. [01:21:25] But he told her that. [01:21:26] He told her that doesn't mean what it means both to actual like clinicians, because psychopathy is a diagnosis, or like what normal people mean.
[01:21:34] To rationalists, a lot of them think psychopathy is a state you can put yourself into in order to maximize your performance in certain situations. [01:21:44] That's because they, they've, again, there's some like popular books that are about like the psychopath way, the dark triad, and like, well, you know, these are the people who led societies in the toughest times. [01:21:55] And so like you could, you need to optimize and engage in some of those behaviors if you want to win in these situations. [01:22:02] Based on all of this, Ziz brings up what rationalists call the Gervais principle. [01:22:07] Now, this started as a tongue-in-cheek joke describing a rule of office dynamics based on the TV show The Office. [01:22:14] When you said, I was like, there's no way. [01:22:16] Yes, it's Ricky Gervais. [01:22:17] Yes. [01:22:18] And the idea is that in office environments, psychos always rise to the top. [01:22:22] This is supposed to be like a negative observation. [01:22:25] Like the person who wrote this initially is like, yeah, this is how offices work. [01:22:28] And it's like, why they're bad, you know? [01:22:30] It's an extension of the Peter principle. [01:22:32] And these psychopaths put bad, like, dumb and incompetent people in, like, in positions below them for a variety of reasons. [01:22:41] It's trying to kind of work out why and in which ways, like, offices are often dysfunctional, right? [01:22:46] It's not like the original Gervais principle thing is like not a bad piece of writing or whatever. [01:22:50] But Ziz takes something insane out of it. [01:22:53] I described how the Gervais principle said sociopaths give up empathy, as in a certain chunk of social software, not literally all hardware-accelerated modeling of people, not necessarily compassion, and with it happiness, destroying meaning to create power. [01:23:08] Meaning, too, I did not care about. [01:23:10] I wanted this world to live on. [01:23:12] So she tells them she's come to the conclusion, I need to make myself into a psychopath in order to have the kind of mental power necessary to do the things that I want to do. [01:23:23] And she largely justifies this by describing the beliefs of the Sith from Star Wars, because she thinks she needs to remake herself as a psychopathic evil warrior monk in order to save all of creation. [01:23:38] Yeah, no, of course. [01:23:39] Yep. [01:23:39] So this is her hitting her final form. [01:23:42] And true to form, these guys are like, they don't say it's a good idea, but they're like, okay, yeah, that's not the worst thing you could do. [01:23:49] Sure. [01:23:52] I think the Sith stuff is kind of weird, but making yourself a psychopath makes sense. [01:23:55] Sure. [01:23:55] Yeah. [01:23:56] Of course. [01:23:56] I know a lot of guys who did that. [01:23:58] That's literally what they say, right? [01:24:01] And then they say that. [01:24:03] Also, I don't even think that's what they really, they say that because the next thing they say, this guy, person A, is like, look, the best way to turn yourself from a net negative to a net positive value, I really believe you could do it, but to do it, you need to come to 10 more of these seminars and keep taking classes here, right? [01:24:19] Right. [01:24:19] Of course. [01:24:21] Here's a quote from them, or from Ziz. [01:24:24] She says, conditional on me going to a long course of circling, like these two organizations offered, particularly a 10-weekend one, then I probably would not be net negative. [01:24:36] So things are going good.
[01:24:39] This is, this is, you know. [01:24:43] Yeah. [01:24:45] Great. [01:24:47] How much does 10 weekends cost? [01:24:49] I don't actually know. [01:24:50] I don't, I don't fully know with this. [01:24:52] It's possible some of these are like, some of the events are free, like, but the classes cost money. [01:24:58] But it's also a lot of it's like there's donations expected, or by doing this and being a member, it's expected you're going to tithe, basically. [01:25:07] That's what I was thinking it's going to be. [01:25:08] It's something like 50% of your income, right? [01:25:10] More than they're like worried about this money at the time. [01:25:12] I mean, with right, I don't know, the format, is she not going to be like super suspicious that people are like, you know, faking it or like going over the top? [01:25:22] She is. [01:25:23] She is. [01:25:24] She gets actually really uncomfortable. [01:25:25] They have an exercise where they're basically doing, you know, they're playing with love bombing, right? [01:25:30] Where everyone's like hugging and telling each other they love each other. [01:25:33] And she's like, I don't really believe it. [01:25:34] I just met these people. [01:25:36] So she, she is starting to, and she is going to break away from these organizations pretty quickly. [01:25:40] But this conversation she has with these guys is a critical part of like why she finally has this fracture. [01:25:49] Because number one, this dude keeps telling her you have a net negative value to the universe, right? [01:25:56] And so she's obsessed with like, how do I, and she comes to the conclusion, my best way of being net positive is to make myself into a sociopath and a Sith Lord to save the animals, of course. [01:26:11] It feels like the same thinking though as like the robot's going to make hell. [01:26:15] It seems to always come back to this idea of like, I think we just got to be evil. [01:26:23] Well, I guess the only logical conclusion is doom. [01:26:28] Yep. [01:26:31] Yeah. [01:26:31] Yeah. [01:26:32] It's like it feels like it's a theme here. [01:26:36] Yep. [01:26:37] Anyway, you want to plug anything at the end here? [01:26:42] I have a comedy special you can purchase on Patreon. [01:26:45] It's called Birth of a Nation with a G. You can get that at patreon.com backslash David Borey. [01:26:53] Excellent. [01:26:53] Excellent. [01:26:56] All right. [01:26:57] Folks, well, that is the end of the episode. === Birth of a Nation (02:06) === [01:26:59] David, thank you so much for coming on to our inaugural episode and listening to some of the weirdest shit we've ever talked about on this show. [01:27:09] Yeah, this was, I don't really, I'm going to be thinking about this for weeks. [01:27:14] I mean, to me, it's, there's, I feel like it's kind of fair because your co-host likes it, kind of payback for the Elders of Zion episodes. [01:27:24] Yeah. [01:27:24] Yeah. [01:27:25] Okay. [01:27:26] I wanted to, I was initially going to kind of just focus on, all of this would have been like half a page or so, you know, just kind of summing up, here's the, the gist of what this group believes. [01:27:35] And then let's get to the actual cult stuff when like, you know, Ziz starts bringing in followers and the crimes start happening. [01:27:42] But that Rollings or the Wired article really covers all that very well. [01:27:47] And that's the best piece. [01:27:48] Most of the journalism I've read on these guys is not very well written. [01:27:52] It's not very good.
[01:27:53] It does not really explain why they are what they are or why they do it. [01:27:57] So I decided, and I'm not, the Wired piece is great. [01:28:00] I know the Wired guy knows all of the stuff that I brought up here. [01:28:03] He just, it's an article. [01:28:04] You have editors. [01:28:05] He, he, he left, he left out what he thought he needed to leave out. [01:28:09] I don't have that problem. [01:28:11] And I wanted to really, really deeply trace exactly where this, this lady's, how this lady's mind develops and how that intersects with rationalism. [01:28:21] Um, because it's interesting and kind of important and bad. [01:28:26] Yeah. [01:28:28] Okay. [01:28:29] So interesting. [01:28:30] Anyway, thanks for having a head fuck with me. [01:28:34] Um, all right. [01:28:36] That's it, everybody. [01:28:37] Goodbye. [01:28:40] Behind the Bastards is a production of CoolZone Media. [01:28:44] For more from CoolZone Media, visit our website, coolzonemedia.com or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [01:28:53] Behind the Bastards is now available on YouTube. [01:28:56] New episodes every Wednesday and Friday. [01:28:59] Subscribe to our channel, youtube.com slash at behind the bastards. === Available on YouTube (02:27) === [01:29:06] When a group of women discover they've all dated the same prolific con artist, they take matters into their own hands. [01:29:14] I vowed I will be his last target. [01:29:16] He is not going to get away with this. [01:29:18] He's going to get what he deserves. [01:29:21] We always say that, trust your girlfriends. [01:29:25] Listen to the girlfriends. [01:29:27] Trust me, babe. [01:29:28] On the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [01:29:37] What's up, everyone? [01:29:38] I'm Ego Mode. [01:29:39] My next guest, it's Will Ferrell. [01:29:43] My dad gave me the best advice ever. [01:29:46] He goes, just give it a shot. [01:29:48] But if you ever reach a point where you're banging your head against the wall and it doesn't feel fun anymore, it's okay to quit. [01:29:55] If you saw it written down, it would not be an inspiration. [01:29:57] It would not be on a calendar of, you know, the cat just hang in there. [01:30:04] Yeah, it would not be. [01:30:06] Right, it wouldn't be that. [01:30:07] There's a lot of life. [01:30:09] Listen to Thanks, Dad, on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [01:30:16] In 2023, bachelor star Clayton Eckard was accused of fathering twins, but the pregnancy appeared to be a hoax. [01:30:23] You doctored this particular test twice, Miss Owens, correct? [01:30:27] I doctored the test once. [01:30:29] It took an army of internet detectives to uncover a disturbing pattern. [01:30:34] Two more men who'd been through the same thing. [01:30:36] Greg Gillespie and Michael Mancini. [01:30:38] My mind was blown. [01:30:39] I'm Stephanie Young. [01:30:41] This is Love Trapped. [01:30:42] Laura, Scottsdale Police. [01:30:44] As the season continues, Laura Owens finally faces consequences. [01:30:49] Listen to Love Trapped podcast on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [01:30:56] 10-10 shots fired, City Hall building. [01:30:59] How could this have happened in City Hall? [01:31:01] Somebody tell me that. [01:31:02] A shocking public murder. [01:31:04] This is one of the most dramatic events that really ever happened in New York City politics. [01:31:10] They screamed, get down, get down.
[01:31:12] Those are shots. [01:31:14] A tragedy that's now forgotten. [01:31:16] And a mystery that may or may not have been political, that may have been about sex. [01:31:21] Listen to Rorschach, Murder at City Hall on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [01:31:30] This is an iHeart podcast. [01:31:32] Guaranteed human.