True Anon Truth Feed - Episode 260: God-Sperm of Mad Men Aired: 2022-12-23 Duration: 01:31:13 === Father Christmas Vibe (05:50) === [00:00:00] Christmas, Christmas, man, move, move. [00:00:04] Ho, ho, ho. [00:00:06] Oh, my God. [00:00:07] Here we go. [00:00:08] What do we have here? [00:00:10] Underneath my little Christmas tree. [00:00:13] Do you have a Christmas tree at home? [00:00:14] Fuck no. [00:00:15] Why not? [00:00:16] Me? [00:00:17] Man, man, one doesn't simply invite nature into their home. [00:00:22] I don't have plants. [00:00:23] I don't have trees. [00:00:23] I don't have leaves. [00:00:24] You don't have plants. [00:00:25] No, but you can't kill a Christmas tree. [00:00:27] Oh, I could kill a Christmas tree. [00:00:28] You set those fuckers on fire like early February or something. [00:00:32] I don't know if that works on the East Coast because it's wet out here. [00:00:35] San Francisco, you said that. [00:00:36] You dry as my bone. [00:00:37] You dry as my bone. [00:00:38] Yeah. [00:00:39] Do you see the earthquake that happened in Humboldt? [00:00:41] Really? [00:00:41] Yeah, 6.3 or something. [00:00:43] Really? [00:00:44] Yeah. [00:00:44] That's pretty high. [00:00:45] It is pretty high. [00:00:46] Also, you know what else is high? [00:00:47] Humboldt County. [00:00:48] Mm-hmm. [00:01:13] Ladies and gentlemen, Merry Christmas. [00:01:17] Hey! [00:01:18] Hanukkah. [00:01:18] It's not Christmas yet. [00:01:20] What day is it? [00:01:21] It's the 21st. [00:01:23] But these people might be listening to it in the future. [00:01:25] How hard do you think Christ was kicking at this point in his mama's belly? [00:01:30] Yeah, he's like, let me out, let me out. [00:01:32] The wise men are just like, oh, fuck, we gotta find, it's getting late. [00:01:34] We gotta find this kid. [00:01:36] We gotta get to the manger. [00:01:38] Let me tell you. [00:01:40] I'm son of God. [00:01:42] I want to be born in Cedars-Sinai Hospital. [00:01:44] I don't want to be born in a manger. [00:01:45] You know what? [00:01:46] Great hospital. [00:01:47] Great hospital? [00:01:47] Oh, yeah. [00:01:48] Bear fantasy. [00:01:49] Okay. [00:01:49] I hope to be there somewhere. [00:01:50] Febby Hillies. [00:01:52] I hope to be getting a, I don't know, some kind of transplant. [00:01:54] I believe that is where Kourtney Kardashian pulled her baby out of her body. [00:01:59] Did we talk about this in the show? [00:02:01] You told me about this. [00:02:02] I love this story so much, so I'm sure I have many times. [00:02:05] This is germane for us. [00:02:06] And if I have, don't comment in the comments that I already have told the story. [00:02:12] Tell me. [00:02:12] Can I tell it? [00:02:13] Yeah, tell us. [00:02:14] There's an episode of Keeping Up with the Kardashians, R.I.P., where Kourtney, who is now Kourtney Barker. [00:02:24] Yeah. [00:02:26] She's giving birth to one of her babies. [00:02:29] And the doctor, she's like, the doctor's like, do you want to pull out the baby? [00:02:34] I see the feet. [00:02:35] And she literally just, you know, kind of like yoga poses, leans over, stretches on over, and just yanks it all out. [00:02:45] Yanks the baby out. [00:02:46] By its feet? [00:02:46] By its feet. [00:02:47] Its feet? [00:02:48] She's just like. [00:02:49] Wait, actually, no, not by its feet. [00:02:50] No, by its feet. [00:02:51] By its head. [00:02:51] Yeah. [00:02:52] She clutches the head. [00:02:54] For the woman to pull the baby out. [00:02:56] The feet part would be the best.
[00:02:58] The feet part would be busy. [00:02:59] That would make it around. [00:03:01] No, but by the shoulders and head. [00:03:03] But it's not, I don't believe it's normal for the woman to, for the doctor to implore Kourtney Kardashian to do it herself. [00:03:12] So are women, is this a common thing? [00:03:13] She's pretty drugged up. [00:03:14] Are women doing the bends now and just popping that thing out like a claw? [00:03:18] I have a friend who went to that same doctor, and the doctor asked her if she wanted to do it too. [00:03:22] No, it's just his thing then. [00:03:24] Here's my thing. [00:03:25] I came out, like, I was like one leg down, one arm up, and then like my head poking out sideways. [00:03:31] I was like, hey, what's going on out here? [00:03:34] Yeah, then you like, pulled your iron cap. [00:03:37] We're worn out. [00:03:38] I was like, hey, baby, how about an American Spirit blue? [00:03:41] Yeah, you're lighting up with the nurse. [00:03:42] Yeah, I'm like, I know, I know, I know, I was born with the knowledge that you ladies love this type of cigarette. [00:03:48] And that's why you were the inspiration for the Boss Baby. [00:03:52] Hello, everyone. [00:03:54] Hi. [00:03:55] Merry Christmas. [00:03:56] Happy Hanukkah. [00:03:59] My name, Chris Grangle. [00:04:02] That's nice. [00:04:02] I'm Liz. [00:04:03] We are, of course, joined by Father Christmas himself, Young Chomsky, and the podcast is called TrueAnon. [00:04:12] I feel like Father Christmas is kind of like an inspo vibe. [00:04:15] I'm Uncle Hanukkah. [00:04:17] I feel like you could kind of like make a, I could see a Pinterest board of Father Christmas. [00:04:21] Father Christmas. [00:04:22] How about Uncle? [00:04:23] You want Uncle Hanukkah? [00:04:25] Uncle Hanukkah. [00:04:26] Uncle Hanukkah's coming over. [00:04:27] I don't like that. [00:04:28] Hey, Uncle, you don't live in a school. [00:04:31] Uncle Hanukkah's coming over. [00:04:33] It's good, right? [00:04:33] He's cool. [00:04:34] I know, like, he's just, you know, he technically he did his time. [00:04:38] Yeah. [00:04:38] You know, he's not. [00:04:38] I don't want to hide the whiskey. [00:04:39] Uncle Hanukkah's coming over. [00:04:41] I was walking through, I was walking through the damn Times Square the other day, the other night. [00:04:46] Why? [00:04:47] To do the show. [00:04:48] Okay. [00:04:48] And there was a group of cavorting young Jews on the street. [00:04:53] You seem to. [00:04:54] Yes, you sent us. [00:04:55] Some may call them, in fact, and I believe this was Young Chomsky's joke. [00:04:58] Dancing Israelis. [00:04:59] Wow, okay. [00:05:00] We don't know. [00:05:01] I mean, they did seem, but it was like a group of like eight youngsters of the Hebrew religion. [00:05:08] Yeah. [00:05:09] Besuited. [00:05:10] And they were just blasting some sort of techno music. [00:05:14] Yeah, besuited in the New Yorkian manner. [00:05:17] And just like blasting techno and dancing. [00:05:20] And I said, you know, what's all this then? [00:05:23] He said, are you Jewish? [00:05:24] I said, look at me, brother. [00:05:25] He gives me a menorah. [00:05:27] And do you still have it? [00:05:30] I knew it. [00:05:30] No, I gave it to a guy that night. [00:05:32] I gave it to a rather loathsome individual that evening. [00:05:38] There was a talent show at the show, and there was like a guy there who wasn't loathsome. [00:05:42] He was just like, I was like, I get your schtick.
[00:05:43] But it was like, he was a Jewish rapper named Kosher, Kosher Dills. === Dancing Triplets and Menorahs (05:58) === [00:05:50] Are you kidding me? [00:05:51] I'm not kidding you anymore. [00:05:52] Why did you not go with Kosher Dilla? [00:05:54] I don't know. [00:05:55] Maybe it was that. [00:05:56] I wasn't really, I was just like, I get your whole thing. [00:05:58] Who was the big Jewish rapper? [00:06:00] Matisyahu. [00:06:01] Matisyahu. [00:06:02] Yeah. [00:06:03] And I got to be honest with you. [00:06:04] Matisyahu. [00:06:06] Look at him now. [00:06:07] I once worked with a very small lesbian who was a big fan of his. [00:06:10] Really? [00:06:11] I don't think I ever encountered. [00:06:12] He was kind of a hosier figure in my mind. [00:06:15] You know? [00:06:16] He's no longer looking like that. [00:06:19] If you want to hit him with a Google image search, he's not, he ain't. [00:06:22] He's changed. [00:06:24] In my head, he's a Sacha Baron Cohen character. [00:06:27] Yeah. [00:06:28] Another guy I don't like. [00:06:30] All right. [00:06:31] Well, you know what? [00:06:32] Liz, funny you mention babies coming out being yanked into this horrible world unceremoniously by the clawed hand of a parent. [00:06:42] Because we're actually talking about those little wah-wah babies today. [00:06:47] Young Chomsky, can you hit me with the baby noise? [00:06:52] Do we have a baby noise? [00:06:54] Do we have a phrase baby noise? [00:06:55] I know this baby noise. [00:06:58] Oh, you don't want me to make one? [00:06:59] No, no, I think we need two baby noises. [00:07:00] We get two baby noises. [00:07:03] We're talking about babies. [00:07:05] And we're not just, okay, not just talking about babies. [00:07:09] We're talking about a certain kind of baby. [00:07:12] And one of the biggest babies of them all, Elon Musk. [00:07:17] Well, we're starting with Elon, and then we're going to, like his children who will no doubt, of course, launch their own incredible careers in science and arts, we're going to use Elon Musk as a little launching off point. [00:07:27] So, Liz, we did our Elon Musk story, our series, and, you know, listen, it's no secret that this neurodivergent billionaire loves to fuck, right? [00:07:40] And he loves more than that to receive the fruits of his fucking in the form of children. [00:07:47] He has 10 children, which is actually, I think it kind of makes it. [00:07:51] I think it's rumored. [00:07:52] It hasn't been confirmed that he has 10 children. [00:07:55] We haven't. [00:07:56] He could have more. [00:07:56] He could have more. [00:07:57] I am saying, it's about as far as you can get a rumor until we see the baby. [00:08:03] But, you know, there's registered, there's 10 registered babies out there. [00:08:08] I'm going to go and say that there's probably more. [00:08:10] I'm going to go through really quickly some of these kids. [00:08:13] I mean, the first two we had were Griffin and Vivian Musk, who are the, as far as we know, the only named children that we've encountered that have normal names. [00:08:22] Yeah, I was just going to say that. [00:08:24] And Griffin is already like, that's on the borderline of not normal name. [00:08:29] Griffin, I'm going to say it's a normal name. [00:08:31] Don't love it, though. [00:08:32] No disrespect to anyone. [00:08:33] No disrespect to all the Griffins out there. [00:08:35] I just think that it's a little, it's got a, you know, a tinge on the palate of Harry Potter. [00:08:41] You know what I mean?
[00:08:42] So he had Griffin and Vivian. [00:08:44] Well, funny you say that because he had both of these little babies with his ex-wife, Justine Musk, who is a Canadian science fiction author. [00:08:53] That's true. [00:08:54] One of those children, Vivian, changed their name due to, as she wrote in the form, gender identity and the fact that I no longer live with or wish to be related to my biological father in any way, shape, or form. [00:09:13] I mean, if my biological father was Elon Musk, I'd say, yeah, I get it. [00:09:17] Fully relatable. [00:09:19] I think we actually see here that sort of the beginning of Elon Musk, or not the beginning, but a glimpse into Elon Musk's mindset here because he does blame the fact that his daughter doesn't talk to him on full-on communism in schools and a general sentiment that if you're rich, you're evil. [00:09:37] The relationship may change, he says, but I have very good relations with all the other children. [00:09:42] Can't win them all. [00:09:44] What? [00:09:44] That is a crazy thing for a parent to say about their kids. [00:09:48] Can't win them all. [00:09:48] Can't win them all. [00:09:50] My whole thing is, and check this out. [00:09:52] You actually can win them all. [00:09:54] Yes. [00:09:54] 100%. [00:09:55] Absolutely. [00:09:56] In fact, I'd say that's one of the parents' jobs. [00:09:59] If you're a parent listening to this, and listen, I know that I'm notoriously childless due to my, well, I mean, I got a lot of kids. [00:10:05] I just don't take care of them. [00:10:06] But like, you actually, you should definitely make that your number one priority to win them all. [00:10:11] Yeah. [00:10:12] Next, he and Justine, Elon and Justine, had three kids named, and this is where the names start to get a little uncomfortable for all British people. [00:10:19] Okay. [00:10:20] Kai. [00:10:21] Interesting. [00:10:22] Saxon. [00:10:23] Okay. [00:10:24] And the devil name, Damien. [00:10:26] Now, I'm going to go ahead and say that those are the names of a wealthy person's German shepherds. [00:10:35] Absolutely. [00:10:36] Yes. [00:10:37] Kai is like, this is my dog, Kai. [00:10:40] Saxon? [00:10:41] Saxon should be tearing me apart on like the outer, like on the outer perimeter of like a rich man's like hunting grounds where he let me loose after I agreed to come to his mansion in order to receive. [00:10:53] Kai is the oldest dog who is a German shepherd that they got after their engagement in Hawaii, actually in Kauai. [00:11:02] And then Saxon is a more masculine hunting dog that the father procured after the wedding when he was feeling like a little emasculated. [00:11:12] Yeah. [00:11:13] Damien is the child's shitty Chihuahua. [00:11:17] Yeah, yeah, but it's a black Chihuahua. [00:11:19] Evil cat dog. [00:11:20] Well, black Chihuahua is like, Chihuahuas are like cats, kind of. [00:11:23] But anyways, these are all triplets. [00:11:24] They were all born using in vitro fertilization, which we'll get to in a moment, with Justine Musk. [00:11:32] Unfortunately, those were the last of the brood that they sired together. [00:11:36] Elon later gets married to Talulah Riley. [00:11:39] It was a, as far as we know, childless marriage. [00:11:42] I want to make something clear here. [00:11:44] Talulah Riley is not Canadian. === Spy Plane Secrets (04:29) === [00:11:48] I just want you to, I want to put a pin in that right now. [00:11:51] I want listeners to understand. [00:11:52] Talulah Riley, unlike Justine Musk, is not Canadian.
[00:11:56] That's one thing I like about Talulah Riley. [00:11:58] Also, that she left Elon Musk. [00:12:00] No children. [00:12:01] Okay. [00:12:01] Mr. Musk then meets the famous Canadian waif singer Grimes via Twitter in 2018. [00:12:12] These two, I'm sure many of you gossip hounds out there, Lipstick Alley readers, were often spotted together necking at various makeout points in America. [00:12:22] But they were also fucking in various vaginas and penises as well. [00:12:27] So they had a baby that is named, and listen. [00:12:31] Famously named. [00:12:32] Famously named. [00:12:34] X, and then that little Æ, like the Aeropostale thing, like where they're combined. [00:12:39] Æ? [00:12:40] Aeropostale? [00:12:41] Aeropostale? [00:12:42] Did I make that up? [00:12:43] That's cute. [00:12:44] No, I like it. [00:12:45] I was thinking of Aeon Flux, but. [00:12:46] Maybe I was thinking of Aeon Flux. [00:12:48] No, I think you were thinking of the other Abercrombie. [00:12:50] Abercrombie. [00:12:51] No, American Eagle? [00:12:52] No, Aeropostale is the other Abercrombie. [00:12:56] Okay, gotcha. [00:12:56] Yeah, American Eagle is too. [00:12:58] X, Æ, one letter. [00:13:01] And then A, hyphen 12. [00:13:03] According to a tweet from Grimes, the baby is named this because. [00:13:06] Okay, X is the unknown variable. [00:13:09] So true. [00:13:11] Always saying that. [00:13:12] Æ, she says, is my elven spelling of AI, meaning artificial intelligence, or possibly love. [00:13:25] And then A-12, this is very weird, is the precursor to the SR-17, quote, our favorite aircraft. [00:13:35] No weapons, no defenses, just speed, great in battle, but non-violent. [00:13:43] Plus, A equals Archangel, which is my favorite song. [00:13:46] Is that one of her? [00:13:47] That's got to be one of her songs. [00:13:48] Yeah, isn't that the big song? [00:13:51] No. [00:13:51] That's Oblivion. [00:13:52] Oh, okay. [00:13:53] Named after the video game. [00:13:54] Stop! [00:13:55] You violated the law! [00:13:56] I don't know anything about her. [00:14:00] So the A-12, for those of you who don't know — the SR-17 is pretty famous, but the A-12 was a spy plane. [00:14:09] So I know there are a lot of, how should I describe you? [00:14:16] Cruel men out there who are like cackling in delight. [00:14:20] Liz got the name with a fucking spy plane wrong because she's a girl. [00:14:24] She doesn't know what the SR-71 is. [00:14:26] She said SR-17. [00:14:28] I'm just going to go ahead and put a baby noise right here for you. [00:14:33] Liz read that tweet verbatim. [00:14:36] It's Grimes who doesn't know what the SR-71 is, the successor to the A-12, which she included in her baby's name. [00:14:43] The A-12 was a spy plane built by the CIA in order to spy on the Soviet Union and Cuba. [00:14:50] Cuba. [00:14:50] Although it didn't actually spy on the Soviets thanks to Gary Powers's stupid fucking ass. [00:14:55] He's crashing. [00:14:56] It photographed military installations in communist countries with less sophisticated air defenses than the USSR, like North Vietnam, North Korea, and, of course, Cuba. [00:15:05] So the A-12, I guess, is a precursor to the SR-71. [00:15:08] SR-71, very famous spy plane, A-12, I guess, less so. [00:15:11] I think it's real crazy to be like great in battle, but non-violent. [00:15:15] Because I'm going to be honest, we actually, you know, I'm not a big plane guy, as well documented on this show.
[00:15:23] I don't think that the A-12 was ever engaged in battle by anyone, nor was the SR-71. [00:15:29] It just doesn't make any sense. [00:15:30] Like, a spy plane is like not. [00:15:32] That's not really what, even if you're shot at, you're not in a battle. [00:15:38] You're like running from, clearly very not great at spying. [00:15:46] I mean, they fucking, you know, listen, these spy planes, Gary Powers, again, his funky ass shot down over the Soviet Union. [00:15:52] Bridge of Spies, Tom Hanks. [00:15:53] Yeah. [00:15:54] You know, this is not. [00:15:55] And then, of course, we dealt with him in our rules for TrueAnon episode. [00:16:00] So Musk appears to just call this baby X and views him as a sort of protege. [00:16:05] X, I believe, of course, an homage to the famous late rapper DMX, who during his Behind the Music episode begins to tear up while talking about the death of his mother and then pulls himself together and barks like a dog. === Silicon Valley's New Protege (14:54) === [00:16:18] So now, Musk and Grimes break up not too long after this baby is born. [00:16:22] They now say their relationship is fluid. [00:16:27] That's very Bushwickian of them. [00:16:29] Sure. [00:16:30] That did not stop them from having another baby, this one by a secret surrogate. [00:16:36] This baby's name is Exa Dark Sideræl Musk. [00:16:43] Yeah, I don't know what any of that means, but that baby, I don't even know what gender that baby is. [00:16:49] So I'm going to say that baby. [00:16:51] It's a girl, I believe. [00:16:52] Was born December 2021. [00:16:55] There has been, as far as I know, no public explanation over the name of this, but let me give you a little hint here. [00:17:03] Grimes is being what we in the industry call weird and crazy. [00:17:08] Are you familiar with Invader Zim, Liz? [00:17:10] No. [00:17:11] Young Chomsky, you're familiar with Invader Zim? [00:17:14] Well, there was a certain kind of person during my time in middle school who'd wear an Invader Zim shirt over a long-sleeve striped shirt. [00:17:24] This person might be a little crazy, they might have a chain, and they would be, above everything else, hella random. [00:17:31] So what Grimes is doing here is she is being hella random. [00:17:35] This might be a little bit digressatory, which is a way to describe something that's a digression, but I do want to give, this is going to situate us a little bit for the rest of the episode. [00:17:45] This is from the Vanity Fair piece, which revealed the birth of their second child. [00:17:49] So this is actually Grimes questioning the reporter for Vanity Fair. [00:17:54] Do you know what a protopia is? [00:17:56] The reporter says, no. [00:17:58] Parentheses, a state of gradual progress towards utopia. [00:18:02] Then she asks, effective altruism? [00:18:06] I mean, I know what those words mean. [00:18:09] Using data analysis to maximize resource deployment to help others. [00:18:12] Fact check on that. [00:18:14] Vanity Fair. [00:18:16] Then Grimes asks, the Overton window? [00:18:20] I thought so, but I looked it up while she was in the bathroom and I was wrong. [00:18:25] Parentheses, the spectrum of accepted discourse and achievable ideas. [00:18:30] That's kind of crazy that a Vanity Fair writer didn't know what the Overton window is. [00:18:33] I really want to know what she thought it was before she looked it up and corrected herself. [00:18:37] What did she think? [00:18:38] I'm like, fuck, I thought it was a kind of scientific window.
[00:18:41] Or just like literally a window. [00:18:43] Oh, the Overtone. [00:18:44] It's like a window that weighs. [00:18:45] It's over. [00:18:46] Yeah, it's overton there. [00:18:49] Yeah, that's kind of extraordinary, but I think the middle question there, the middle query posed by Grimes there, do you know what effective altruism is? [00:18:58] Taken in conjunction with all the rest of the things that we'll be going over in this episode, does make one go, hmm, and do the little resting your chin in the crook of your thumb and forefinger. [00:19:11] Yeah, we've been talking about effective altruism a little bit in the past couple episodes because of Sam Bankman-Fried, of course, who was caught up, thrown in the old hoosegow. [00:19:26] The Gray Bar Hotel. [00:19:27] Yeah, the old Gray Bar Hotel, the Bahamian Gray Bar Hotel where he still resides. [00:19:31] I believe today he is actually being extradited. [00:19:34] Congratulations, Sam. [00:19:35] Here to New York City. [00:19:36] Thank God, because we're here. [00:19:38] Yes. [00:19:38] Praise noise. [00:19:41] Not baby noise. [00:19:43] Other praise noise. [00:19:44] So Elon Musk did not stop his busting and nutting there, though. [00:19:49] He actually also, and this was revealed in a sort of like, this was revealed in, I guess, as low-key a way as something like this can be revealed. [00:19:57] I believe it was Business Insider that broke this. [00:19:59] He also sired two babies with a woman that same year. [00:20:05] In fact, one month prior to Baby Y, which is what they call that second fucked up baby with Grimes, about a month prior in November. [00:20:14] He gave birth, excuse me, he didn't do shit. [00:20:17] A woman named Shivon Zilis gave birth to those two children. [00:20:22] Shivon Zilis is, like Grimes, like Justine, Canadian, which is, I am, I don't know what's going on here. [00:20:30] Someone look into that. [00:20:31] But he is clearly seeking out Canadian women to have sex with and give a baby to. [00:20:36] We don't know if he actually had sex with Shivon Willis or Zilis. [00:20:40] He says he's cheap and he likes the health care. [00:20:42] Well, I think they're happy. [00:20:43] Yeah, he is actually very cheap. [00:20:46] I know some unconfirmed things about his personal life that it's just, he does live in a weird way. [00:20:52] Yeah. [00:20:53] She is a high-performing VC who is on the board of directors for OpenAI, worked on Autopilot at Tesla. [00:21:02] That didn't work out right now. [00:21:03] And is now director of special operations for Neuralink. [00:21:09] Also, I believe, lives in Austin, Texas. [00:21:11] That brings the total of Elon Musk's children that we know of up to 10. [00:21:17] Now, as far as I know, those two, I think, I don't know, but I'm pretty sure that those two babies with Shivon were born via IVF. [00:21:28] Obviously, his other three children with, or his second set of three children with, Justine, also IVF. [00:21:35] Yes. [00:21:36] And, you know, Elon Musk, I think a lot of people have actually noticed this in the past couple years, keeps talking about like, oh, the problem isn't underpopulation, or excuse me, overpopulation with the world. [00:21:47] The problem is we're actually not going to have enough people. [00:21:50] Right. [00:21:50] And he's been talking a lot about the future of civilization and like how rich people, well, that's the implication, but people need to be having more and more and more and more kids. [00:22:00] Yeah, he's posted on Twitter.
[00:22:01] I mean, even before he became CEO in that takeover, he's posted on Twitter like bizarre graphics from like right-wing blogs of like falling birth rates and demographic decline and all that kind of stuff, which is all very popular in certain right-wing circles. [00:22:21] Absolutely. [00:22:21] And so you might be like, wow, what a strange thing for him to believe, but he's actually not the first Silicon Valley type to be obsessed with the fruits of his testes. [00:22:31] So one of the first guys on the scene in the Valley was a guy named William Shockley, who invented the transistor, which is, I got to say, good on him. [00:22:40] Pretty good on him. [00:22:41] That's crazy. [00:22:41] Anyways, he undergoes- That's a big invention. [00:22:44] That was a pretty big one. [00:22:45] I got to say. [00:22:46] I think of all the things to invent, that's like, a transistor? [00:22:49] Yeah, I would have called it the Shockley. [00:22:52] Yeah, that would have been very good. [00:22:53] It's baller to name shit after yourself. [00:22:55] Yeah. [00:22:57] Anyways, he undergoes a midlife crisis as a result of his wife's uterine cancer, which is, we're getting some psychology here. [00:23:04] She recovers. [00:23:06] He actually sort of nurses her through that. [00:23:08] And then he's like, you know what? [00:23:09] Sick of you. [00:23:10] Complain, complain, complain. [00:23:12] Cancer, cancer, cancer. [00:23:13] I'm going to marry a nurse, or I don't even think he married her. [00:23:15] I'm going to shack up with a nurse and get the fuck out of here to what's going to become Silicon Valley. [00:23:21] So there he starts a semiconductor business in what would become known as Silicon Valley. [00:23:26] His business was a bust, but everyone who works there actually, it's like that, it's like that fucking Sex Pistols show that everyone was at in Manchester, where there were very few people at the show, but everyone who was there went on to start Joy Division. [00:23:39] Oh, yeah, yeah, yeah. [00:23:39] You know what I'm talking about? [00:23:40] I hate that quote. [00:23:41] You know what bands those guys started? [00:23:43] Skrewdriver. [00:23:44] Yeah. [00:23:44] Those guys were at the fucking Sex Pistols show. [00:23:46] And so he moves out there, starts this company. [00:23:48] He's trying to make semiconductors. [00:23:50] He's such a terrible boss and a horrible person to hang out with that basically everybody abandons him in this big betrayal, goes and starts Fairchild, which in turn gives birth to Silicon Valley as we know it. [00:24:01] But Shockley did not think he could fail. [00:24:03] Like many people, like many people of his caliber, he actually thought he was a genius. [00:24:08] So he goes on to work at Stanford. [00:24:12] Ah, my favorite. [00:24:14] There needs to be a Nuremberg trial for everybody who's gone to or worked at Stanford. [00:24:18] Absolutely. [00:24:19] Like throughout the years. [00:24:20] We can go Mormon style, dig up the dead. [00:24:22] Here's the thing. [00:24:23] It's not, like, guilt by association. [00:24:26] We've just got questions. [00:24:27] I just got some questions. [00:24:28] We've got some questions. [00:24:29] And the best place to ask questions when you're sitting on trial. [00:24:33] I want to hit you in the leg with a sock with an eight ball in it. [00:24:36] And then someone is fucking pointing a flashlight directly in your eyes. [00:24:41] Just kidding. [00:24:42] So, you know, he starts feeling himself.
[00:24:45] You know, he's like back in his groove at Stanford. [00:24:48] And he starts talking publicly about what he viewed as the biggest problem of the age. [00:24:53] Smart people just aren't having enough babies. [00:24:56] And stupid people are having too many. [00:24:59] Damn, I've heard that one before. [00:25:00] Well, unfortunately, Shockley also believed in a heavy correlation between race and IQ. [00:25:06] So a lot of these stupid people, he starts enumerating as, what he would say, black people. [00:25:11] So he's like, black people are having too many babies. [00:25:14] They're fucking stupid. [00:25:15] Indians are having too many fucking babies. [00:25:17] We hate them. [00:25:18] So he starts taking on debates. [00:25:20] He actually turns out to be a complete fucking moron, loses a lot of these debates. [00:25:25] And throughout the 1960s and 70s, becomes a pariah figure on both campuses and debate stages, sort of this worm who seems delighted in being painted as a racist eugenicist, because that's really what he is, right? [00:25:37] He's pretty bald about it. [00:25:40] But then in 1979, sort of after Shockley's star has fallen pretty far from the sky, we kind of get a glimmer of the future. [00:25:49] There's a guy named Robert Graham who made bank off of making plastic lenses for eyeglasses, and he opens up a sperm bank. [00:25:57] This he called the Repository for Germinal Choice. [00:26:02] Now, a little background on Graham. [00:26:04] And he is classic. [00:26:05] Like, we actually see a lot of parallels in some stuff we've covered on this show. [00:26:09] So he'd been involved in a lot of stupid fucking schemes before. [00:26:12] At one point in the early 1970s, he instructed the vice president of his company, which was called Armorlite, not. [00:26:20] I know you Irish people out there, you're like, oh, ArmaLite. [00:26:23] Oh, I can blast one of those in a town square. [00:26:27] No, Armorlite. [00:26:30] He instructs the vice president of this company that he owns to buy a private island that he could found a nation on. [00:26:37] We like the sound of that, don't we? [00:26:39] So he spends years designing this like ultra-scientific, data-driven sewer system, food processing, storage, all this kind of stuff. [00:26:45] And he made this one thing about his project super clear. [00:26:49] Yes, he wanted to start a nation, but he wanted it to be a center for the world's scientists to get together to do high-tech research. [00:26:57] Remind you of anybody? [00:26:58] Yeah, I mean, I think a lot of people are probably like, wait, doesn't that sound like Jeffrey Epstein? [00:27:02] Yes, exactly. [00:27:03] It sounds like Jeffrey Epstein. [00:27:04] We've actually talked about the Repository for Germinal Choice back when we covered some of Epstein's predilections with this stuff, which we'll get into, again, to remind you. [00:27:18] But it is not, like you're saying, it's not that uncommon among this milieu. [00:27:24] Exactly. [00:27:25] It's strange. [00:27:26] These guys do believe they are blessed with sort of holy balls. [00:27:31] But, you know, like many mercurial sort of genius types like him, his ideas fell apart. [00:27:36] And he kind of flits from thing to thing. [00:27:40] He was also an author. [00:27:43] He wrote a book called The Future of Man in 1970, which I have flipped through. [00:27:48] It can basically be summed up by this excerpt.
[00:27:51] Although the contributions of the intelligent, parentheses, control of disease, efficient food production, and a multitude of inventions, permit great masses of the less intelligent to survive, when these masses gain sufficient preponderance, they no longer remain part of a united people. [00:28:10] Instead, they choose class war. [00:28:13] They are led to turn on the haves of their own people and liquidate them by the millions. [00:28:20] The killing includes not only those who have property, but those who have perceptive minds. [00:28:27] Intelligence has come to have extinction potential instead of survival value. [00:28:34] The most dangerous game. [00:28:36] Exactly. [00:28:37] Maoism. [00:28:37] Yeah, I know. [00:28:38] That's the funny thing about this. [00:28:39] And that's sort of a subtext you see in a lot of this stuff, is that the danger in like sort of these like elite smart, you know, and the subtext also, white people having, not having enough babies, is that the great unwashed masses will have babies and that those babies will grow up to become communists, essentially, whatever word they would use for it now, although Elon Musk is still calling them communists. [00:29:03] And of course, as we know, the countryside does surround the city, encroach on these fucking smart people and kill them because their contribution. [00:29:11] It's like Idiocracy. [00:29:12] Yes. [00:29:13] Which is really, I mean, you know that movie changed a lot of guys' lives in Silicon Valley. [00:29:19] So he's like, I'm going to fix this by hooking up smart women with smart men's sperm. [00:29:24] Let me actually correct that real quick. [00:29:26] Smart white women with smart white men's sperm. [00:29:30] So he actually founds his little sperm bank in his basement in 1980. [00:29:35] Which, by the way. [00:29:36] Okay. [00:29:36] I'm not jizzing in a basement. [00:29:38] Yeah. [00:29:39] Ladies, you know, look, you don't need a man. [00:29:43] Okay. [00:29:44] It's modern times. [00:29:45] You're good. [00:29:46] You're good. [00:29:47] Don't go to a sperm bank that's in someone's basement. [00:29:50] Hey, hey, do you want a smart kid? [00:29:54] Like, just little, here's a little TrueAnon tip: don't go to the sperm bank if it's in a guy's basement. [00:29:59] Sweetheart, you have such intelligent eyes. [00:30:01] Do you want, uh, do you want to come to my basement? [00:30:04] Sweetheart, you're so white. [00:30:05] You're so fucking white. [00:30:06] Oh my god, you're so fucking white and hot and smart. [00:30:10] Like, do you want to just come to my basement and fucking get some Nobel Prize sperm? [00:30:14] Because that was the big selling point for the Repository for Germinal Choice. [00:30:19] It had the testicular offal of several Nobel Prize winners. [00:30:29] Hey, that's what it is. [00:30:32] Of several Nobel Prize winners. [00:30:34] And I mean, allegedly. [00:30:36] Allegedly. [00:30:37] It's in a little thing. [00:30:38] It's like, who knows? [00:30:40] Exactly. [00:30:40] I mean, you know, this is too far of a digression. [00:30:43] I know how people get mad at us, but I got to tell you, I've told this story on the podcast before. [00:30:47] Graceland Too, no longer existing, but I believe the guy killed somebody in Memphis. [00:30:52] The guy claimed to have a pillow with Elvis's on it. [00:30:56] And that kind of could be anybody's. [00:30:59] Anyways, the selling point for the Repository for Germinal Choice is that they're going to hook up smart women with smart men.
[00:31:05] So they're advertising in Mensa towards women. [00:31:08] And then Graham was going around to smart guys he knew or had connections to. === Mutants, Reddit, and Eugenics (15:42) === [00:31:12] Unfortunately, by the 1980s, eugenics had long been out of vogue. [00:31:18] So the ideas, of course, of eugenics, kind of too big of a topic for us to fully get into the history of in this episode. [00:31:24] I think a lot of people generally know the contours of it. [00:31:26] But I got to tell you, the Nazis, just like the goose step and the Roman salute, and the swastika and the name Adolf and the name Hitler and the name Goebbels, they really kind of put an end to that. [00:31:41] Yeah. [00:31:42] Right? [00:31:42] Like, eugenics were immensely popular. [00:31:45] Nobody's named Goebbels anymore. [00:31:47] Exactly. [00:31:47] Well, I got a buddy, Jose Goebbels. [00:31:52] And it's like, okay, cool. [00:31:54] I just have to call him Jose. [00:31:55] I can't even just call him by his. [00:31:56] Okay. [00:31:56] But anyways, so the Nazis, of course, big believers in eugenics, as was really every Western country. [00:32:06] Eugenics were huge among like really all swaths of society, but especially among the elite, obviously. [00:32:12] The Nazis had several eugenics programs. [00:32:14] Of course, the most famous of those was the Lebensborn program. [00:32:19] That gave birth to, I believe, someone from ABBA. [00:32:23] I believe that's true, yeah. [00:32:25] Yes, one of the people from ABBA was born in one of those. [00:32:28] But after World War II and the Holocaust and all that, people were like, you know, maybe this racial hygiene thing, it's got some bad connotations. [00:32:37] I want to just pause for a second because you said hygiene, and I think that's a good, you know, for just a good way to think about, not a good way to think about eugenics, but 20th century, late 19th century, early 20th century eugenics really was about like purifying, right? [00:32:49] In the classic Nazi way, purifying the genetic population of what they would say would be, I guess, unwanted traits. [00:33:00] Yes. [00:33:00] Yeah. [00:33:00] Exactly. [00:33:01] So it was like a, yeah. [00:33:02] A great cleansing, as they would call it. [00:33:05] Yes. [00:33:05] Not my words. [00:33:06] I believe the term eugenics was actually coined by Charles Darwin's half-cousin. [00:33:11] It was. [00:33:12] So there's a lot of like, you know, once it was, you can sort of see the evolution, by the way, it's a lie. [00:33:18] Those things all just were born that way. [00:33:19] Yeah. [00:33:20] Chicken or the egg. [00:33:21] I'm sorry. [00:33:21] I'm not answering. [00:33:22] Yeah, it was seven days. [00:33:23] Everyone knows. [00:33:24] Seven days. [00:33:26] But, you know, these ideas had sort of by the 1980s gone out of vogue. [00:33:32] Yeah. [00:33:32] And so while Graham kind of tried to hide some of the nature of the sperm bank, he could unfortunately not help but run his mouth. [00:33:41] And then it was found out that every single one of the many donors he claimed were in there was white and that every single person that they accepted to have a baby with one of those donors was also white. [00:33:56] The bank received some negative publicity. [00:33:58] Yeah, I would think. [00:33:59] Did not help that the only Nobel Prize winner to come forward to say that his, yes, his semen was in there was the famously racist eugenicist Nobel Prize winner, William Shockley. [00:34:12] Oh my God.
[00:34:13] Was it only his? [00:34:15] Nobody else ever came forward. [00:34:17] So he's, I think, the only confirmed comer that we have in there. [00:34:21] About 230 babies, including myself, were born from this sperm bank. [00:34:26] You know, the place, I think, closed, I think in the maybe 90s. [00:34:32] I'm not exactly sure. [00:34:34] Who knows? [00:34:35] By the way, that's too late for a place like that to close. [00:34:37] I mean, I'm like, do you got to re-up it or is it like frozen? [00:34:40] No. [00:34:40] You got to go back every few days or something? [00:34:42] Refractory. [00:34:44] Anyways, like we were saying, these things might call to mind a certain person that we've talked a lot about on this show named Jeffrey Epstein. [00:34:58] Yeah, it came out back in 2019 when we started this podcast that Jeffrey Epstein had planned to, and this is from the New York Times, quote, seed the human race with his DNA, which is a classic Jeffrey Epstein quote. [00:35:13] Classic. [00:35:14] He planned to do that at, of course, his New Mexico ranch, Zorro Ranch, where he, you know, had, he told people at least, many people, that he had a plan worked out to inseminate dozens of women with his, you know, sperm. [00:35:32] Yeah. [00:35:33] You really. [00:35:35] You should have seen the face Liz made. [00:35:37] I realized where I had to go with that sentence. [00:35:39] So, you know, the article in 2019 talks about Epstein's sort of long-standing fascination with transhumanism and eugenics, which led to me getting into an argument with Milton Friedman's grandson. [00:35:52] This is a... [00:35:53] Memories. [00:35:53] Yeah, I remember that. [00:35:54] I remember that. [00:35:56] He's a little fucking liar, too. [00:35:57] I immediately caught him in a very easily provable lie. [00:36:01] No one puts baby in the corner, you know what I'm saying? [00:36:03] From the article, it says, once, at a dinner at Mr. Epstein's mansion on Manhattan's Upper East Side, Mr. Lanier said he talked to a scientist who told him that Mr. Epstein's goal was to have 20 women at a time impregnated at his 33,000 square foot Zorro Ranch in a teeny town outside of Santa Fe. [00:36:23] Mr. Lanier said that a scientist identified herself as working at NASA, but he did not remember her name. [00:36:29] By the way, I want to say 20 women at a time. [00:36:32] No one ever defined at a time. [00:36:34] Yeah, I'm like, a couple hundred. [00:36:36] Like, what is that? [00:36:38] Like, what are we talking about there? [00:36:40] The article does go on to state that this same NASA scientist claims that Epstein had modeled his sperm bank on the Repository for Germinal Choice. [00:36:51] I want to highlight that name because we're having it. [00:36:53] Germinal choice. [00:36:54] It's such an interesting. [00:36:55] It really is. [00:36:56] And I don't know. [00:36:59] I think that the name itself, Germinal Choice, the idea that you get to now choose what the genes will be of your child, I think, really points to where we're going with this episode. [00:37:10] Absolutely. [00:37:11] And according to an acquaintance of Jeffrey Epstein's, Mr. Epstein said he was fascinated with how certain traits were passed on and how that could result in superior humans. [00:37:22] I don't like the term superior human, like X-Men? [00:37:26] Like X-Men, yes. [00:37:28] Would you call X-Men superior or like just different? [00:37:32] You know, I, I gotta be honest, was not a giant superhero guy growing up.
[00:37:37] But the impression I always got from the X-Men, how they had to like live at that house and stuff, was that they were kind of disabled. [00:37:45] What? [00:37:46] Like, the main guy's like in a wheelchair and like blind, right? [00:37:50] Professor X? [00:37:51] Is Cyclops blind? [00:37:52] He just has to wear a special visor. [00:37:54] But the whole thing with the X-Men is all their powers are like double-edged swords. [00:37:57] Yeah. [00:37:58] Yeah, because they're mutants. [00:37:59] Yeah. [00:38:00] So are they mutants or superheroes? [00:38:01] Well, they're mutants, I guess, but they're superheroes. [00:38:04] I mean, yeah, they're superheroes, but they're mutants. [00:38:07] I will say, the Fantastic Four, even though their powers all suck ass, I like that they just like live in their own apartment. [00:38:15] One guy's just made out of rock. [00:38:18] How bummed are you? [00:38:19] You're like Mr. Elastic and then the hot chick and then there was another guy and then like, I'm just fucking made out of stone, I guess. [00:38:25] What was his name? [00:38:26] It? [00:38:27] It must be really heavy for the Thing? [00:38:29] Was he the Thing? [00:38:30] That might be like, I'm not a Fantastic. [00:38:32] Yeah, me either. [00:38:33] They suck. [00:38:34] Yeah, there's a reason there's never been a Fantastic Four movie. [00:38:36] Just kidding, I'm sure there has been. [00:38:38] I think Jessica Alba's in it. [00:38:40] But like, I don't know what that is. [00:38:42] What? [00:38:43] Yeah. [00:38:43] Now she just makes baby lotion, but she was a thing for a second. [00:38:48] What, is she in the Repository for Germinal Choice's basement? [00:38:52] So this leads us to just nary a few weeks ago, or maybe two weeks ago, don't have the date in front of me, but to an article that came out in Business Insider. [00:39:02] Our favorite paper. [00:39:03] Our favorite rag. [00:39:05] Yes. [00:39:06] It was called, or still is, I believe, Can Super Babies Save the World by Julia Black. [00:39:12] And what we're talking about are not mutants or superhero babies. [00:39:17] I'd say they're mutants. [00:39:18] Okay. [00:39:19] But it's like social mutants, not like they're not actually endowed with any powers. [00:39:22] Although I bet hanging out with these people for five minutes would make me kill myself. [00:39:26] So that in its way is a sort of power. [00:39:28] It's an exploration of the world of Malcolm and Simone Collins, a sort of pair of bespectacled techies who live in the world of what is called in the article, hipster eugenics. [00:39:41] Which is a term that I just don't think we should write. [00:39:44] I don't think that that should be a term that we use at all. [00:39:48] No, we need to combine the eggs of a longtime American Apparel employee with the sperm of somebody who played in a mid-level indie rock band but escaped the Burger Records MeToo-ing. [00:40:01] And once we get those two kids together, the child that's born from them will open up a vintage store on Sunset. [00:40:09] That's literally just like everyone in our milieu in San Francisco. [00:40:13] Correct. [00:40:13] Yes. [00:40:15] So these people are fucking freaks, right? [00:40:18] We read this article. [00:40:19] Not our friends. [00:40:19] The Collinses. [00:40:20] The Collinses. [00:40:22] These people are freaks. [00:40:23] They're off-leash. [00:40:24] And they look kind of funny, too. [00:40:25] They look awful. [00:40:28] Yeah.
[00:40:29] My God, Terry Richardson would never take a picture of them. [00:40:33] But I think sort of reading the article to me, I think was important because I guess the term I would use was they sort of blew out the spot for everybody. [00:40:42] It blew up the spot rather for everybody else, right? [00:40:44] Because they're so ostentatious and they're so like excited about this what they call pro-natalist project that they have that they kind of like make everybody who's more serious and maybe more moneyed and maybe have these like foundations. [00:40:57] And maybe doesn't talk about it in the press. [00:40:59] Yeah, certainly not like this. [00:41:00] Yeah. [00:41:01] They make them look even more freakish than really they are. [00:41:04] They sort of tar them by association. [00:41:06] So they have some kind of wacky idea. [00:41:08] They want to have eight children and have those children have eight children and so on and so on and so on and so on, which is whatever. [00:41:14] But the important thing I think to pay attention to is that they call themselves the Underground Railroad to Gattaca for their children. [00:41:24] So far we have Octavian and Titan Invictus. [00:41:31] Invictus? [00:41:33] I don't know. [00:41:34] All right. [00:41:35] Now, one. [00:41:37] Yeah. [00:41:38] Don't reference Gattaca. [00:41:40] I like Gattaca. [00:41:41] No, no. [00:41:41] Great movie. [00:41:42] Marin County Courthouse. [00:41:43] But it's not a positive thing. [00:41:45] No, we don't. [00:41:46] No, that was. [00:41:47] It's like if someone was like, you know what I'm thinking about? [00:41:50] In the future, I'm so excited to finally watch The Hunger Games, the real thing. [00:41:54] It's like, what? [00:41:55] Well, my thing is, you know, this quote, the Underground Railroad to Gattaca, I mean, I kind of, you know, not to compare myself to a black person under chattel slavery in the 1800s, but if Harriet... [00:42:10] Not your funky ass. [00:42:10] Not my funky ass being led by Harriet Tubman to fucking Gattaca. [00:42:16] Like, I'm sorry, that's a little different than like going up to Boston to like fucking escape slavery. [00:42:22] Gattaca? [00:42:24] My God. [00:42:26] Okay, then I want to say it too, Octavian and Titan Invictus. [00:42:30] Absolutely not. [00:42:31] No. [00:42:32] I'm going to tell you, if you want your eight children to survive to adulthood without getting the shit kicked out of them to the point where their balls don't work because they've been kicked in them so many times, do not name your kid Octavian. [00:42:46] Good God. [00:42:47] It should not be a surprise to anyone listening that they are both tech workers and one of them used to work at Google, the other one for Peter Thiel. [00:42:56] So he actually proposed to her on Reddit. [00:43:00] Gonna let that hang in the air there. [00:43:05] Was the ring made out of Reddit gold? [00:43:10] Excuse me. [00:43:11] May I have your hand in marriage? [00:43:14] I've already sent a PM to your father, offering him a large dowry of Reddit gold and several awards. [00:43:23] I mean, I don't know what to tell you about that, man. [00:43:25] He fucking proposed to her ass on Reddit. [00:43:27] There were articles about it. [00:43:30] There's a lot of fan art he commissioned. [00:43:33] They're also very active in the long-termist community, which is very much related to our old bugaboo effective altruism.
[00:43:42] So they're open about being scientific Calvinists too, which is to say there's – No, I'm just going to say, if any of you know anything about Calvinists, that's not something we need to bring back. [00:43:51] No, we don't need to bring, we don't need to bring back Calvinism. [00:43:54] No. [00:43:54] And scientific Calvinism? [00:43:56] That's even worse. [00:43:57] That sounds even crazier than OG Calvinism. [00:44:00] My thing is, if you're naming your kid Titus Invitus, Invictus. [00:44:04] Whatever you fucking named your kid. [00:44:06] That sounds like a fucking Hulu drama. [00:44:08] I'm telling you, your kid is predestined to be a dumb little baby dork, okay? [00:44:14] So you've already, I guess you are a secular Calvinist because you've doomed your kids to getting made fun of for the rest of their lives after anybody in high school Googles their name and sees this Super Babies Save the World article. [00:44:27] But they have an obsession with scientifically optimizing the best future for their children. [00:44:33] And they come right out and say, they said they view themselves as the elite, and the elite should have more children, as those children will also likely be elite. [00:44:42] Vampire people. [00:44:43] Well, they're freaks, right? [00:44:45] But actually, this worldview is shared by quite a few rich people. [00:44:50] Yeah. [00:44:51] I mean, Elon Musk, definitely one of them, right? [00:44:54] Mm-hmm. [00:44:55] According to an anonymous source in that same Business Insider article, he believes, quote, IQ and wealth are closely linked. [00:45:02] And he says that wealthy people should have more children because those children will likely be smarter. [00:45:07] His father is also quoted in the article as being concerned about population collapse. [00:45:11] However, the article does not mention that Elon Musk's father had a daughter with his own stepdaughter. [00:45:18] I want to repeat that again. [00:45:19] Elon Musk's father, who is concerned about population collapse in, quote, productive countries, has had a baby with his own stepdaughter, à la Woody Allen. [00:45:31] Yeah. [00:45:33] So Elon is also a follower of long-termism. [00:45:36] And Liz, let's talk about long-termism for a second. [00:45:39] It's such a big, it's a pretty big topic. [00:45:42] Yeah, I think we could probably devote a whole episode just to that. [00:45:45] Yeah, to that and effective altruism. [00:45:47] I told you I read that book, the William MacAskill book. [00:45:51] That's crazy. [00:45:52] What circumstances did that happen in? [00:45:54] Because everyone was talking about it. [00:45:56] It's like New York Times bestseller. [00:45:57] And I mean, I kind of like flip through a lot of it. [00:45:59] Yeah. [00:46:00] You read the intro and you have the, I think I told you that I was like, it reminded me so much of the how to quit smoking book in the sense that it just, you can tell that the tool of the how to quit smoking book, which by the way didn't work for me, although I don't know. [00:46:16] You don't smoke. [00:46:16] No, but when I tried to quit smoking through the book, it didn't work for me. [00:46:19] Gotcha. [00:46:22] But it just like repeats the same things over and over and over again to kind of like hypnotize you. [00:46:26] And the book is written very similarly and as like kind of simple-mindedly as a book about how to quit smoking. [00:46:34] I got to be honest, I think that works because, you know, we do a podcast.
[00:46:38] Like a podcast is probably the worst way to convey information to people. [00:46:41] Sure. [00:46:42] But if we just repeated, if we had just like three things that we just repeated over and over again, we could probably get our point across easier. [00:46:49] Paul is dead, you mean? [00:46:50] Paul is dead style. [00:46:51] Yes. [00:46:52] But yeah, the reason I read it was just because it was kind of everywhere. === Rationalism And Risk (03:05) === [00:46:55] And I was like, God, everyone is like talking about this effective altruism thing and this long-termism thing. [00:46:59] Like, what is this all about? [00:47:01] I hate nouveau rationalism, right? [00:47:03] I really, like, really don't like these guys. [00:47:07] So I wanted to check it out. [00:47:09] And it's completely and totally ridiculous. [00:47:12] Well, long-term, I mean, effective altruism, I think we discussed a little bit on the SBF, the first SBF episode that we did. [00:47:19] But long-termism is sort of a subset of it, that's kind of related to it. [00:47:24] I would say, from what I understand, it's also a faction within the EA community of people who are more concerned with that rather than like maybe like orthodox effective altruism. [00:47:35] I mean, it's all hokey baloney, right? [00:47:37] But there's still factions within it. [00:47:39] So the thinking here is that if humanity breaks out of our planet, like goes and colonizes space and our solar system and beyond, the potential amount of people that, God and technology willing, the universe can sustain is so massive and huge that even the tiniest leap towards that future, no matter how negatively it affects people now in the present, is actually helping a massive, [00:48:08] almost uncountable number of people in the future. [00:48:11] So while I could be greatly harming the world and people around me right now, as long as I say that I'm doing it because I'm doing it in the pursuit of technological advance, which will help humanity, I know how ridiculous this sounds, break out of the planet and into the solar system. [00:48:31] I'm actually doing way more good because I'm actually helping all of these people who just haven't been born yet. [00:48:38] Which is like a kind of mode of like vulgar rationalism that I think would make Hobbes blush. [00:48:44] I think some of this stuff would make Hitler blush, to be completely honest with you. [00:48:47] I mean, it is fucking, you can like, it is, it is. [00:48:50] It's completely absurd. [00:48:51] It's absurd and it's like, it's, it's so self-serving and really almost like, if you cut down all the bullshit about it, fucking nihilistic. [00:49:01] Yeah, absolutely. [00:49:03] So Elon gets this idea from this philosopher named Nick Bostrom, who he supports and, you know, cites and stuff. [00:49:09] He is the guy that kind of came up with that whole we live in a simulation thing that I know you love. [00:49:13] He also coined the term existential risk, which once you realize that that is like a long-termist thing, once you start seeing that, it's sort of like a dog whistle for long-termists. [00:49:24] Bostrom runs the Future of Humanity Institute at Oxford, which is part of a huge network of long-termist institutions at elite universities that are funded in part by a cadre of tech royalty like Peter Thiel, Jaan Tallinn, who co-founded Skype, and Sergey Brin. [00:49:42] So there's also another guy named Sam Altman, who ran Y Combinator.
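A toy sketch of the long-termist expected-value move the hosts just described: harm today is weighed against a probability-weighted astronomically large future population. Every number below is an invented placeholder in the Bostrom style, not from the episode or any actual long-termist calculation; the point is only the shape of the argument being mocked.

```python
# Toy illustration of the long-termist expected-value argument paraphrased above.
# All numbers are hypothetical placeholders, not real estimates.
people_harmed_now = 1_000_000            # concrete harm done in the present
potential_future_people = 1e38           # assumed astronomically large future population
probability_bump = 1e-15                 # the "tiniest leap" toward that future

# Expected number of future people "helped" by the tiny probability increase.
expected_future_people_helped = potential_future_people * probability_bump  # 1e23

# The move: any present harm is "outweighed" whenever this inequality holds.
print(expected_future_people_helped > people_harmed_now)  # True
```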
[00:49:47] He was, I think, the second guy who ran Y Combinator, and also the founder of OpenAI, where Elon's newly revealed baby mama is also from, and he's heavily invested in and working with a lot of these guys. === Genomic Prediction Controversies (09:22) === [00:50:00] And he's invested in a lot of fertility tech, including something that the Collinses used. [00:50:06] It's called Genomic Prediction. [00:50:09] Yeah, that's one of the companies, Genomic Prediction. [00:50:11] There's kind of a, there's a whole clump of these companies, Orchid Health, Genomic Prediction, ReproCare, Genetics. [00:50:19] There are all these kind of like very, very well-funded startup labs that are basically, some of them just labs, some of them that contract out to other clinics that offer IVF, in vitro fertilization. [00:50:36] That basically, these companies, they offer this new service where you can select embryos based on what they call polygenic scores, ESPS, embryonic selection from polygenic scores. [00:50:51] Now, riddle me this, young trickster. [00:50:54] What's a polygenic score? [00:50:55] Well, that's a good question. [00:50:57] It's pretty confusing. [00:50:59] A polygenic score, it represents predictions. [00:51:04] It's like a measure, right? [00:51:05] Of predictions of health outcomes derived from genome-wide studies in adults to basically predict health outcomes of these embryos, right? [00:51:17] So it reads the kind of the genome, the genetic code of the embryo, compares it with a data set from a population of adults, and says, okay, looking at all these markers, we're giving it a kind of score, a statistical score, that says, okay, this embryo might be, you know, [00:51:43] more likely to have all of these certain like genome-predicted characteristics. [00:51:49] Okay, like we're talking about like height. [00:51:52] Height, BMI, certain behavioral, you know, and cognitive traits, but specifically, and most importantly, like risk factor for diseases. [00:52:01] Okay. [00:52:01] Which a lot of that is, you know, not new science, right? [00:52:05] Some of this is like totally normal testing done now that's like really routine, for cystic fibrosis or Tay-Sachs. [00:52:11] Yeah, yeah, yeah. [00:52:14] But the polygenic score basically puts together and figures out the combined effect of all of these genetic variants on a certain trait and then predicts what the individual, meaning the embryo, what their traits will be. [00:52:26] You know, a while ago, it was really controversial with in vitro fertilization to select for sex. [00:52:33] Yeah. [00:52:34] Right. [00:52:34] I mean, I remember, I think it was like in the 90s where there was like a huge like uproar about it, like Time Magazine cover like debate style. [00:52:41] You know what I'm saying? [00:52:42] Of like, should you be able to pick that? [00:52:44] And at first, you know, doctors and some of the medical boards that govern this, well, not govern, but like, you know, oversee sort of softly some of their stuff really pushed back on it. [00:52:56] But parents or prospective parents were really for it. [00:53:01] And so that now is, you know, pretty routine with IVF. [00:53:07] So now there's a ton of companies that offer ESPS and this polygenic score. [00:53:12] And a lot of them, especially the ones that are getting all of this crazy tech money and are enjoyed by people like the Collinses, use machine learning algorithms to run through these massive, massive data sets to stratify these populations.
[00:53:31] And so they'll say, okay, you might have a risk of having IBS as one of them. [00:53:39] Okay, why are you pointing at me? [00:53:41] So I want to be clear here. [00:53:42] I don't have IBS. [00:53:43] I know, but I'm pointing at you because we're talking. [00:53:45] Okay, I know, but you're pointing at me. [00:53:47] Well, you haven't pointed at me for the entire rest of the conversation when you say like a disease that a lot of Jewish people have. [00:53:51] You did point at me. [00:53:52] Breast cancer is another one that they test for. [00:53:54] You're also still pointing at me. [00:53:57] Alzheimer's, right? [00:53:59] They kind of assign a score for. [00:54:01] But now, the New England Journal of Medicine put out a pretty, you know, an interesting piece on this and really was ripping apart a lot of these companies, you know, on ethics grounds. [00:54:12] Yeah. [00:54:13] And they were saying that one of the companies, MyOme, which I got to say, I don't like that. [00:54:18] Mee-ome. [00:54:20] My-ome. [00:54:22] That they were providing patients with embryonic polygenic scores for education, household income, cognitive ability, and subjective well-being. [00:54:32] Wait, hold on a tick. [00:54:34] Lass. [00:54:35] You're telling me that like this company, you know, I'm like, oh, I take my, I'm like, I want to have a baby, go to this company. [00:54:41] And this company is telling me like, hey, your kid's going to make $65,000 a year and they're going to work at New Gawker. [00:54:49] Not exactly, right? [00:54:50] None of this stuff can actually predict, you know, obviously. [00:54:54] Yeah, that sounds like hokum. [00:54:55] It is hokum. [00:54:56] But what they do is they assign a kind of statistical score. [00:55:00] So they say your kid might be more likely, or your kid is at risk, and we'll get into that, is at risk of being poor. [00:55:08] Your kid might be a psycho, that kind of thing they offer. [00:55:13] Sir, ma'am, I don't know how to tell you this, but the data here suggests that your daughter will be a bisexual woman with mental health issues. [00:55:22] You know, for whatever it's worth, that journal article, it was published in 2021, and it linked to a report from MyOme that has now since been taken down and removed from the Wayback Machine. [00:55:33] You know when it's been removed from the Wayback Machine that it's serious. [00:55:36] Yeah. [00:55:36] You don't, yeah. [00:55:38] I was reading this interview with the co-founder of Genomic Prediction, which again is the company that the Collinses used. [00:55:45] And this guy is Stephen Hsu. [00:55:48] Su? [00:55:48] Chu? [00:55:49] Shu? [00:55:50] I don't know how to say it. [00:55:50] It's a name. [00:55:51] You can say it however you want. [00:55:52] You can say it however you want. [00:55:55] He had a couple really interesting quotes that I think might give us some insight into some of this stuff. [00:56:01] He said there's been a huge spike in these papers in the last year or two, and the results are really impressive. [00:56:08] Talking about these kind of polygenic risk scores and the data that these companies are able to kind of extract and work through. [00:56:17] He says the results are really impressive to the point where a lot of health systems are becoming aware of it and realizing that sooner than later, they may be deploying widespread, inexpensive genetic testing to predict risk for adults.
[00:56:32] And he says, my feeling is companies should not get too far out ahead of what society is comfortable with. [00:56:37] We want there to be a broad discussion in society about what people think is appropriate. [00:56:42] The ASRM, which is the American Society for Reproductive Medicine, I'll talk about them in a second. [00:56:47] They're very clear about what they think is appropriate. [00:56:49] The UK, the UK has the HFEA. [00:56:52] Every country has a different regulatory scheme. [00:56:54] The difficult question may arise if some country decides they are comfortable with something further out on the edge, right? [00:57:01] Now, what he's talking about is if a country were, say, for example, totally fine with people doing genetic testing and using these risk scores for, say, skin color or eye color. [00:57:16] Interesting. So, like, wow. [00:57:18] So, like, being like, oh, I want children with like lighter colored skin. [00:57:23] Yeah. [00:57:23] Essentially. [00:57:24] I mean, he goes on, he says, like, you know, it'll be a tough decision for us. [00:57:28] Suppose we have this huge customer, right? [00:57:30] Because they're a lab, right? [00:57:32] Essentially. [00:57:33] Suppose we have a huge customer that is ordering 100,000 tests a year from us, and they really want this feature, right? [00:57:40] Like, what he says is, you know, imagine a big clinic in Korea or something, and they're demanding that they want to be able to do, you know, they want to know who has lighter skin and who has darker skin. [00:57:52] What if we've got a client that's going to order 100,000 tests from us and we can actually do that and it's legal, what are we supposed to say? [00:58:00] Yeah. [00:58:01] And it's interesting, right? [00:58:02] Because, I mean, there aren't actually any regulations, at least in the U.S., right? [00:58:05] I can't speak for Korea or the UK or, you know, he talks about Singapore. [00:58:10] I can't speak to the regulatory systems and bodies in those countries. [00:58:14] But in the U.S., like IVF and all of this stuff is completely unregulated. [00:58:21] Really? [00:58:21] There are, yeah, I mean, the ASRM is not a regulatory body in the sense that it says you can do this, you can do that. [00:58:29] It gives guidelines, but that's it. [00:58:32] It's not actually enforceable, right? [00:58:35] The U.S. is, and I think this should not come as a surprise to anyone, basically completely and totally unregulated, as opposed to many, many other countries, when it comes to IVF and reproductive care. [00:58:50] So basically, I mean, all of these questions, right, of whether or not you should be able to test for, you know, or screen or choose an embryo or whatever for something like hair color or eye color or skin color, or, you know, even these kind of weird polygenic risk scores that seem to rest on some strange data sets, right, on things like, you know, [00:59:19] income or education or IQ. === Genetics Research and High Income Predictions (03:48) === [00:59:22] I'm just like, I mean, that just sounds so ridiculous to me. [00:59:26] You can't be like, that embryo is going to have a high income or like, you know, like, really, I'd be like, well, do you guys have a high income? [00:59:34] Because then your kid is probably going to have a high income.
[00:59:36] Well, basically, all of these questions about whether or not this should exist are really up to a handful of small companies that have a lot of money and also have very wealthy clients that have their own interests. [00:59:48] And wealthy investors, right? [00:59:51] And quasi-bogus science and basically zero regulatory oversight. [00:59:56] So, okay, that puts my fears to rest. [00:59:58] Yeah, absolutely. [01:00:00] But I mean, these polygenic risk scores, I mean, the thing you're saying is like, yeah, you know, if you, if you got, if your parents are wealthy, then most likely your kid is going to be. [01:00:08] Yeah, no shit. [01:00:09] Yeah. [01:00:09] I mean, it's just such an absurd thing to kind of associate. [01:00:14] But I mean, when they look into this kind of like genetics research, right? [01:00:18] All of the human genetics research up to now basically has been conducted with research participants from European ancestry. [01:00:25] Yeah, yeah, of course. [01:00:27] And, you know, all that predictive power is going to get a lot lower when you start comparing it to participants who are not from European ancestries, right? [01:00:39] There's a lot of bunk in a lot of this science is basically what I'm going to say. [01:00:43] That's funny. [01:00:43] That really reminds me of the, you know, the sort of like genius all-white sperm bank. [01:00:49] You know what I mean? [01:00:50] Like they're just like, it's like they're kind of like, I don't know, there just always seems to be like a weird, like, even if it's not on purpose, which it often used to be, like a racial element to a lot of this. [01:01:01] Yeah, absolutely. [01:01:03] And even just like, I mean, statistical modeling being completely imperfect, right? [01:01:08] And it's bizarre when it gets kind of, you know, put to use in this way. [01:01:12] I mean, he puts it well. [01:01:14] He says, you know, I mean, he compares it to auto insurance, which is a good way to look at it, right? [01:01:19] He says, say you got three speeding tickets last year. [01:01:22] The fact that you did doesn't mean that you're at a higher risk for an accident this year. [01:01:27] That conclusion statistically can only be drawn on a population level, not an individual one. [01:01:32] That's true. [01:01:34] But when they issue a policy and charge you 10% more, that's an individual decision. [01:01:40] That's the jump that is made all the time. [01:01:42] This is what statistics is used for. [01:01:45] It's very weird when you say, okay, but now let's apply that to choices about genetics, right? [01:01:52] And children. [01:01:56] Yeah, yeah. [01:01:57] I mean, that's like, it's funny because, I mean, like, so many of these things, like, I guess the way I'm thinking of it is like, a lot of this seems to be coming from, I guess what I would call more the tech sector than the health sector, which are both very, I don't know if the word I want to use is corrupt, because it's not like something pure that was corrupted in either case, but you know, like both, both shady in their own rights, right? [01:02:23] But like, you know, I wonder how much of this is like either so unworkably expensive right now that it's like really just going to be available to rich people for a while, or how much of it is just like fucking hokum? [01:02:36] Well, the thing is, is the tech is going to get cheaper and better and more widely adopted. [01:02:41] And the question is how? [01:02:43] Yeah. [01:02:43] Right.
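[Editor's note: a toy sketch of the population-to-individual jump in the auto-insurance analogy above. The accident rates and the surcharge rule below are invented; the point is only that a rate estimated for a group gets applied as a price to a single driver, which is the same move the risk-score companies make with embryos.]

```python
# Illustrative only: accident risk is estimated at the population level
# (per ticket-count group), but the premium surcharge lands on one driver.

# Hypothetical accident rates, by number of speeding tickets last year.
ACCIDENT_RATE = {0: 0.04, 1: 0.06, 2: 0.09, 3: 0.13}

BASE_PREMIUM = 1000.0  # hypothetical annual premium for the baseline group

def individual_premium(tickets: int) -> float:
    """Price one driver using the rate estimated for their whole group."""
    group_rate = ACCIDENT_RATE.get(tickets, max(ACCIDENT_RATE.values()))
    baseline = ACCIDENT_RATE[0]
    # Surcharge proportional to the group's excess risk over the baseline.
    return BASE_PREMIUM * (1 + (group_rate - baseline) / baseline)

# Three tickets doesn't mean *you* will crash this year, but you pay as
# if the group statistic were a fact about you individually.
print(individual_premium(3))  # -> 3250.0
```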
[01:02:44] And it's like one of those things I think, I mean, it's sort of like, you know, what is it? [01:02:49] It's like if you're looking for your keys under the lamp, it doesn't mean that your keys are there. [01:02:54] It just means the light's on there. [01:02:56] You know what I'm saying? [01:02:57] It's like, so are we making choices about what we're testing for because that's what we've decided, these are the things we want to test for? [01:03:04] Or because this, what we can test for right now, this is what we can look at. [01:03:08] Yeah, okay, I do see what you're saying. [01:03:09] Yeah. === Greatest Human Achievement? (02:06) === [01:03:10] And I don't think anyone's really having these kind of public discussions of like what we actually want. [01:03:15] And they're being driven by a bunch of private companies. [01:03:18] You know, and this stuff gets really sticky and scary. [01:03:21] I mean, genetics, it's, you know, I'm of such two minds about all of it. [01:03:26] We were talking about this. [01:03:27] You know, it's like, you know, on the one hand, you know, you think about it, it's like, of course, like I get, of course, if you could eliminate Alzheimer's, right? [01:03:38] Yeah. [01:03:39] Of course, of course, that's an admirable goal. [01:03:41] Of course you would want to do that, you know? [01:03:43] And the science, I mean, molecular, like molecular biology and the, I mean, the mapping of the human genome, right? [01:03:52] It's like one of the greatest human achievements in history. [01:03:56] I mean, even calling it the greatest human achievement in history seems to like kind of belittle what an achievement it was, right? [01:04:03] And it's like one of those things that's like kind of like thinking about it and is really like awe-inspiring to me. [01:04:10] I know it's kind of corny, sorry. [01:04:12] But just like our capacity and our ability to kind of, I don't know, like to advance that far and see, I don't know. [01:04:22] It's really, really an incredible achievement, right? [01:04:25] And it's enabled like so much. [01:04:28] And I think that like what's exciting about molecular biology when you think about it, right, [01:04:35] genetics, is that the more and more we understand the genome, the more we understand that it's, you know, it's something that's coded, but something that's rewritten as we move through life, and because of environmental factors and social factors and all these things, right? [01:04:50] And so the kind of the traditional sort of pillars of is it nature, is it nurture, that stuff, right, like fall apart. [01:04:58] And we see that so much of what we call natural is socially constructed. [01:05:06] And so much of what we've socially constructed is natural. [01:05:11] Like that, those, those lines, the kind of ontological difference between those two, like, completely falls apart. === Socially Constructed Risk (03:15) === [01:05:17] Yeah. [01:05:17] The more and more we learn about the genome, right? [01:05:20] But at the same time, you see throughout history how this science gets used in the most horrific and horrible ways, like how it's used to justify these kind of eugenics programs. [01:05:37] Yeah, eugenics programs, some of the most monstrous acts in human history. [01:05:41] I mean, I mean, even like, I can't remember, I think it's North Carolina, maybe South Carolina, sterilized like hundreds of people up until, I think, maybe the 80s or 1990s.
[01:05:51] You know, like it's, it's that, that, I guess, like, you know, like, like we were talking about yesterday, like, I'm also of two minds about this, right? [01:05:59] Like, of course, you know, if we can eliminate disease or whatever, I mean, how could I be against that? [01:06:07] You know, what's like, I have no, there's not like a religious reason or whatever for me to be against it. [01:06:12] Of course I would be for it. [01:06:13] Like, I believe in, I believe in science in that, in that sort of the, I guess what you might call the Marxian way, rather than like the "believe science" kind of people way. [01:06:22] But like, the thing is, like, this technology is not, like, that's not really where I think the money would lie, at least in the immediate future. [01:06:36] Because while the people, many of the big proponents of this stuff might be long-termists or whatever, who, you know, by their ideology, you might think would want to do that and get this out to the masses as soon as possible. [01:06:48] You know, they are still investors and these are still businesses, right? [01:06:52] And like, where would the business be in this? [01:06:54] It would be for high-profile, or not high-profile necessarily, but like for wealthy clients who are trying to have, you know, for lack of a better word, super babies. [01:07:02] Right. [01:07:02] I mean, not like these babies will be super, but like, you know, to select the most like choice embryo for them. [01:07:10] And it's just like, I see the major proponents of this stuff. [01:07:15] And like, you know, it just, and I got to tell you, you know, listeners of the show will understand. [01:07:21] Last time I took a science class was like 2006, right? [01:07:24] Not a passion of mine. [01:07:25] I'm a man of poetry. [01:07:27] So, you know, I'm not going to claim to understand like all of the different science behind this. [01:07:31] But from what I can understand, from the layman's perspective, from the normal person looking at this from the outside perspective, you know, some of the stuff seems really great. [01:07:38] And then some of the stuff seems fucking terrifying, right? [01:07:41] And especially, you know, I look at those names, you know, Thiel and Musk and these people. [01:07:48] And again, like, I know they're not like, you know, even Sam Altman, the fucking AI person. [01:07:52] It's like, you know, this all seems to be like part and parcel of this like general like technocratic trend, I guess, that we see. [01:08:02] And so like, in a way, like it is kind of the railroad to Gattaca, like, but like the Gattaca society rather than like us being super like the Gattaca people. [01:08:11] Yeah, I want to go back to that quote that I read from the Genomic Prediction guy, Stephen Hsu, or Su, or Zu, who knows? [01:08:24] I read it already, but I'm going to read it again because I kept thinking about this when we were kind of looking at stuff for this episode. === Risk in Technocratic Trends (10:42) === [01:08:32] He said, sooner or later, they may be deploying widespread, inexpensive genetic testing to predict risk for adults. [01:08:42] And I've just like in general been thinking about risk a lot and this term we use, risk. [01:08:49] You know, it gets studied in like all fields, I guess, now, like psychology and obviously economics, that's just where it comes from. [01:08:56] Well, it comes from insurance, but you know what I'm saying. [01:08:59] Philosophy, political science.
[01:09:01] It's now in obviously ecology, sociology, technology. [01:09:05] Every like investment, every project, every business. [01:09:08] I mean, every business has like entire buildings devoted to risk assessment. [01:09:14] Every political decision is based on risk. [01:09:16] But not just that. [01:09:16] It's like even the kind of non-income-generating facets of our lives. [01:09:24] People talk about risk with like dating. [01:09:26] People talk about risk with other things. [01:09:29] Absolutely. [01:09:30] Absolutely. [01:09:31] Yeah. [01:09:32] And I think, you know, it's interesting. [01:09:33] This is like a relatively new mode of thinking, surprisingly enough. [01:09:37] I mean, you know, relatively, it's like, you know, risk assessment really started being applied in social theory just in like the late 80s and early 90s. [01:09:47] There was this book that Ulrich Beck wrote called Risk Society. [01:09:52] And he claimed that, I mean, it's kind of, I think it's kind of like a dumb claim, but it was very popular. [01:10:00] That like with new technological advancements came like a new global era of modernity. [01:10:06] He's talking kind of about globalization. [01:10:08] And he says one that, quote, freeing itself from the contours of the classical industrial society and forging a new form, the risk society. [01:10:17] It's a very catchy little term. [01:10:20] But he claimed that there was this new stage of development because the logic behind risk production and distribution differs from the logic behind the distribution of wealth. [01:10:30] That I can see. [01:10:31] And he says that now, with the advent of modernization, risks were global. [01:10:35] So what he's saying here basically is like, instead of, you know, like in any like productive undertaking or whatever, like any business, anything like, or not any, but like a wide, like large-scale globalized one, there's more risk involved, because, for instance, if I'm exporting grain to South Africa, you know, I have to, like, you know, come up with all the different risks that can happen to that getting there. [01:11:02] Yeah, basically. [01:11:02] But also like, think of nuclear power. [01:11:06] Yeah. [01:11:06] Right. [01:11:06] And large scale, the kind of large-scale production required of a modern society is going to have like larger risks that will affect larger amounts of people. [01:11:20] And like his book got really popular, right, because like he published it and then Chernobyl happened. [01:11:25] And everyone was like, damn, there really are risks that affect us all. [01:11:28] It's so funny because a phrase like risk society, like saying stuff like that was so popular in the 80s. [01:11:33] I know, right? [01:11:34] You could like see the cover of the book. [01:11:36] I have no idea what the book looks like, but I have like a vision of the hardcover in my head. [01:11:40] And it's very like PBS NewsHour. [01:11:42] 100% PBS NewsHour. [01:11:44] Yeah, Reagan era style. [01:11:45] But he argues that like safety becomes the ideal in risk societies and that evaluating risk and future potentialities basically becomes central to our behavior. [01:11:57] But not just our behavior, like how we conceive of ourselves and then how we relate to others. [01:12:02] Yeah. [01:12:02] You know, and it's true. [01:12:04] I mean, if every, if we conceive of every choice in our lives, I mean, this goes back to the kind of non, the like non-economic spheres of our lives as well, right?
[01:12:13] In education, in dating, in whatever. [01:12:17] If we conceive of them as all investments that we're making, if every choice we make is an investment in our future, then of course it makes sense to evaluate those decisions based on risk, right? [01:12:28] Because risk is always about investment and time and reward, right? [01:12:38] And I think that, you know, I was just like thinking about all of this when we were kind of looking at this stuff with the Collinses and the genomic prediction, and thinking about it with the kind of earlier, like late 19th century, early 20th century iterations of eugenics, you know, the old school flair. [01:12:56] Old school, yeah. [01:12:57] Yeah. [01:12:58] And I think this like risk discourse, if we can use that term, is one of the things that really distinguishes like this current genetic project from the more classic, classico-style, eugenic ones in the early 20th century. [01:13:17] I mean, this diagnostic stuff, you know, these genetic diagnostic procedures, you know, it's like they can't predict, you can't predict the future, right? [01:13:25] You can't predict with certainty whether or not someone's going to develop a disease or have a particular life outcome. [01:13:34] But they can contribute to producing this new social category, if you can think about it, which is an individual at risk. [01:13:44] So rather than I'm going to have this, I'm now an individual at risk for X, Y, and Z. [01:13:50] So you could potentially sort of see this taken to its scientific extreme, with babies being sort of assigned these individual risk scores. [01:13:59] And it's like, your life path might, I mean, again, this is far-off science fiction, you know, sort of like theorizing here. [01:14:07] But like, you know, your life path could be, you know, affected by that, right? [01:14:12] Because if you're assigned this sort of like score at birth, then that could dictate, I mean, who, you know, I don't know what the legality around that would be, but like that would dictate a lot of your choices. [01:14:25] That would sort of, that would, it's almost like scientific Calvinism. [01:14:29] Kind of, yeah. [01:14:30] And I think like what's really interesting about it, I mean, you know, imagining a future where it's not something that you're assigned from an authority or like imposed on you, but information that you're now sort of not required, but encouraged to know about yourself, right? [01:14:50] And what's interesting with risk as a kind of discursive tool in that way is that it doesn't really depend on a state body for authority. [01:14:58] Right. [01:14:58] Like these old eugenic regimes were really about, you know, you can't do that. [01:15:05] We're taking away your ability to do this. [01:15:08] We're going to sterilize you. [01:15:08] We're sterilizing you. [01:15:09] We're taking away your ability to procreate, whatever, right? [01:15:13] And what this does, what risk says, right, is it actually depends on the individual and their choices. [01:15:25] It's actually empowering you to choose more and better. [01:15:28] Yeah. [01:15:29] Right. [01:15:30] And it's like, and rather than like kind of reducing the responsibility of the individual based on kind of genetic markers in that kind of old top-down state authority way, right? [01:15:41] It's like this new genetic knowledge really expands kind of individual duties. [01:15:46] It further responsibilizes the individual.
[01:15:50] Yeah. [01:15:50] Yeah. [01:15:51] You know, empowering them to kind of, you know, basically take responsibility for their own risk, which is really just social risk, right? [01:16:01] It's further removing the responsibility of social risk from the greater population and now putting it onto the individual. [01:16:11] And I think like, yeah, I mean, I just think in so many ways this mirrors or is like concurrent with like other kind of developments or shifts in production, you know, the state, and the dismantling kind of the social and kind of collective over the past some odd decades. [01:16:29] Well, it's funny to me because it's like so much of this depends on this like raw data being captured or whatever and then interpreted. [01:16:38] But like if there's anything that I've learned, again, as very much the layman, from the past, you know, my whole life, in basically every subject, it's that data people are often fucking totally wrong, you know? [01:16:51] And like it's often bullshit, you know, especially like even some of the data people were talking about in this episode, right? [01:16:58] Like it's actually like oftentimes I think like data interpreted through some authority, maybe not like a central body like a government, but like, you know, some company or something like that, can be twisted for whatever means or lied about. [01:17:14] I mean, in a way, like I have, I kind of always think of any of these like med tech companies, or really like a lot of just tech companies in general, as having this kind of like Theranos underbelly to them, right? [01:17:27] Like where it's all kind of built on air. [01:17:30] And like it, that's, and the thing is, like, I do know enough to know that like sometimes that doesn't matter. [01:17:36] Sometimes like you can bullshit your way enough into something being like, quote, real, right? [01:17:40] Or being like accepted by people. [01:17:43] But, you know, I think of like, like, the way I see this being implemented would be like in a sort of like 23andMe kind of way, right? [01:17:50] I mean, you know, like, as it stands now, there are multiple companies where you send them fucking, you know, a little bit of your snot or whatever. [01:17:57] I've never done it. [01:17:57] You spit. [01:17:59] And then they give you your entire genetic history or whatever. [01:18:02] And then you'd be like, ooh, look, I'm 3% African or whatever. [01:18:05] Like people love to do. [01:18:05] Or like, I'm fucking, I'm 1% Jewish. [01:18:10] And I mean, that would be, I mean, again, like, I kind of think it's bullshit, but it's like, you're essentially sending like, just like sending someone just your DNA to be like, you know, looked at or pored over and then collected by them and stored by them. [01:18:25] And I don't know, just like the combination of all of those things. [01:18:27] And it's like this sort of like, this like worship, this like religious devotion to like science, such as it is here. [01:18:37] There's something like unsettling about it that like I have a hard time even putting into words. [01:18:42] Yeah, I mean, I think a lot of people are more on board with this stuff than they realize. [01:18:49] And I guess the thing that I would like kind of like point at is, you know, it's one thing, I mean, you know, I don't think it's bad or wrong to have this information about yourself. [01:19:03] You know, this kind of genetic testing they offer, you know, seeing that you have a marker for breast cancer, for example, right?
[01:19:12] That's not bad or wrong information to have. [01:19:14] No, absolutely not. === Rationalism and EA's Individual Devotion (05:45) === [01:19:15] Yeah. [01:19:15] But I question why this is being pushed on us as a responsibility of the individual. [01:19:23] You know what I'm saying? [01:19:24] And I think that like you start to understand these kind of larger political changes not as declines in state authority, right? [01:19:34] But like as promotions of governments that foster and enhance individual responsibility. [01:19:41] This stuff fits in line perfectly with that, right? [01:19:43] If you think of kind of like, you know, I hate using the word, right, but the neoliberal kind of revolution as privatizing risk management, as, you know, enhancing and multiplying choice in order to kind of find and promote new markets everywhere and completely and totally gut whatever it is that we understand about social insurance, [01:20:06] which was, I mean, a clear project of just the fucking Bush administration, if not the past, like, you know, some odd decades. [01:20:14] Like the kind of entrepreneurial models that then start to pop up in all of these sort of different social arenas, like health being one of them, then it makes sense that a lot of people, you know, [01:20:28] when you start to see that as kind of like a governing logic of everything, then of course like you start to see how rationalism and effective altruism and the kind of ideology behind a lot of this stuff like fits in with it and can be popular amongst a certain like, you know, influential upper middle class, bourgeois class of like tech guys, not even just the Elon Musk and the Peter Thiel like vampire people. [01:20:55] The Collinses. [01:20:56] Yeah, totally, right? [01:20:57] Upper middle class families and, you know, people trying to feel empowered in their choice, right? [01:21:03] Well, it's funny to me too. [01:21:04] Like, I mean, you know, it's like Elon Musk and the Collinses are these people. [01:21:08] It's like this ideology of rationalism and EA, all of this, and like even getting back to what you're talking about, the risk stuff, is so individualized. [01:21:16] And it's like, or it's so, so the worship of yourself as an individual, or as individuals in general, and individualism is so important to this. [01:21:29] And it's funny because the overriding goal of these long-termists, right, is to have this like great society of the future and to do all of these things for like our future generations, this like collective future, right? [01:21:42] Not even just your own family, although the Collinses seem to be particularly enamored with that, but in general, like the Bostromites or whatever, this devotion to the future and to these great masses of people in the future, these trillions and trillions of people. [01:21:59] But to do that, I need to be this like complete individualist now. [01:22:03] And it's like this, this, like the only way we're going to get to space is through this individual effort or like whatever. [01:22:08] It's funny because it runs so contrary to what you would think would actually be the logical and rational approach to that, which is that like, well, if we actually want to make this, you know, if we want to make this earth a paradise, or if we want to get people out to space, you know, and have the greatest future possible, it seems like it would make sense as a collective effort. [01:22:30] But collectivism in any form is anathema to these people.
[01:22:36] And so I think really wiping away all of that, like the rationalism or the EA, the long-termism, all of that, it's really, I think, window dressing. [01:22:47] And like even all this fucking genetic testing, all this, like, I mean, not all this genetic testing, but like in the way that like the Collinses at least are going about it. [01:22:54] It's like, it really is like, at the end of the day, like it's, it's, it's a means to worship yourself, I think, and your own genius. [01:23:05] And, you know, reading interviews of these people, or like the way that they talk about themselves, is like a way that like, I would be aghast if anyone ever said these things about themselves to me. [01:23:17] Like, you know, like, I'm, you know, like, what's better? [01:23:20] Like, you know, the top 1% elite. [01:23:22] And then pointing to yourself, not even talking about your money, but talking about your intelligence or your ability. [01:23:29] I mean, that is an extraordinary thing to say about yourself. [01:23:33] And I would be appalled if someone said that to me. [01:23:36] I mean, it's just a bizarre way to view the world. [01:23:40] But I don't know. [01:23:41] There's something about it that like seems sick to me. [01:23:48] Like this just like complete and total devotion to oneself, and using that as an excuse, like, or using the welfare of the future as an excuse. [01:23:59] Plus, if you're so smart, then why the fuck are you talking to a journalist? [01:24:21] Ho, ho, ho, ho, ho, ho, ho. [01:24:28] Merry Christmas. [01:24:30] Thanks, Brace. [01:24:31] Oh, no, sorry, I was talking to the ho. [01:24:36] You know what I got you? [01:24:38] No, what did you get me? [01:24:40] A donation to OpenAI in your name. [01:24:43] Really? [01:24:44] Yeah. [01:24:45] That's crazy. [01:24:46] I donated, in your name, to the Black Lives Matter Global Foundation. [01:24:52] It is going to a brand new wing on the Studio City mansion. [01:24:57] I was just going to say, you are going to love it. === Elon's Leg Lengthening Surgery (06:12) === [01:25:00] Young Chomsky, don't think I forgot you either. [01:25:04] I donated $7,000. [01:25:08] That's right. [01:25:09] To Modi from India? [01:25:18] Thought about him for a second. Yeah, he kind of fell off, but I'm just, you know, if like something happens in the future and like you got to go to India, you know, kind of greasing the wheels for you there, like that. And for myself, of course, a little self-care here. [01:25:33] I got chad surgery. [01:25:35] Oh, speaking of, I would like, I'm going to implore the little gumshoes out there to do some homework over the Christmas break, because I have a theory that I would like us all to collectively work together on. [01:25:51] You think Taylor Swift would like me? [01:25:53] No. [01:25:54] I think that Elon Musk had leg lengthening surgery. [01:25:59] You haven't told me this. [01:26:00] You think Elon Musk got taller? [01:26:02] I think he had tall guy surgery. [01:26:03] And so what I need to do, we need to find every early photo of Musk and compare it. [01:26:11] Because have you looked at his body? [01:26:14] Now, it's an odd bod. [01:26:16] He's 6'2". [01:26:18] No, that's what they say. [01:26:19] That's what they say. [01:26:20] He's an odd bod, just proportionally speaking. [01:26:24] Check out the legs. [01:26:25] And then, so I saw an early photo of him with Kimbal, and he was shorter than Kimbal. [01:26:29] And I was like, wait, that looks weird.
[01:26:32] And I was like, maybe they're like, maybe he's leaning. [01:26:34] Maybe he's like, you know, it's like he's doing a little crip walk or whatever. [01:26:38] But I don't think so. [01:26:41] And then I was looking at him and I was like, his stance is weird. [01:26:45] Yeah, his stance is fucked up. [01:26:46] He's funky. [01:26:46] He's got odd heads. [01:26:47] He stands funky. [01:26:49] Yeah, he's weird. [01:26:50] And I think he had, because I think he had, I have it, it's in me. [01:26:57] Like, I really believe this. [01:26:58] And so what I want is for Christmas break, little homework, little project, let's all work on it. [01:27:06] Collectively, let's decide whether or not Elon Musk has allegedly had tall guy surgery, leg lengthening. [01:27:12] I am getting wildly different answers from the internet. [01:27:15] StarUnfolded.com, which is like one of those age, wife, girlfriend, children, family websites. Those are always fake. [01:27:23] It says he's 5'11". [01:27:25] How tall is Elon Musk? [01:27:26] And then, parentheses, explained, from thecoldwire.com. [01:27:30] Never heard of that. [01:27:32] 6' and 1.5 inches. [01:27:35] And then from Quora, is Elon Musk really 6'2"? [01:27:39] This is a person who answered, I spent one afternoon with him at a Model S launch event, but was struck by a few things. [01:27:46] He's a much more imposing presence than TV appearances suggest. [01:27:49] I was surprised at how tall and portly, yes, I mean, he looked quite fat, he was. [01:27:53] He's seriously a big guy, and I must admit, I was pretty intimidated by him. [01:27:56] He has a real nervous energy about him. [01:27:58] He has a fast walking pace, wanted to get things done quickly. [01:28:02] You could see he was always thinking ahead, not wanting idle time to go to waste. [01:28:06] Yeah, or on drugs and putting on a big fake show for his stupid little play that he was doing. [01:28:11] E.g., while being introduced to go on stage, he was simultaneously listening to the speaker while answering emails on his phone. [01:28:16] You mean texting? [01:28:17] And he comes across quite cold. [01:28:20] He's an asshole. [01:28:21] Yeah, he's an asshole. [01:28:22] Not unpleasant, just direct, polite, with no airs or pretenses. [01:28:25] While speaking with a small team of us, he answered our questions sincerely. [01:28:28] But when someone asked what he considered a dumb question, he didn't hide the fact that he thought so. [01:28:34] You know what? [01:28:36] Anyway, I have a theory, and let's all be little scientists out there, little gumshoes, and do some investigations, please. [01:28:42] I got to tell you, once again, open call. [01:28:45] If you have fucked Elon Musk, if you are a sex worker somewhere on the West Coast, there is a decent percent chance that you have. [01:28:56] I need to get penile information. [01:29:00] Ew. [01:29:01] I'm sorry. [01:29:01] I don't. [01:29:02] I don't. [01:29:03] You don't. [01:29:03] You hit the hotline to Brace. [01:29:06] Hit the Sally. [01:29:06] Don't hit the Sally, actually. [01:29:08] I'm taking, you know what? [01:29:10] Yeah, don't hit the Sally. [01:29:12] I don't need to know that. [01:29:13] I just want to know about the leg surgery. [01:29:14] I want to know about the robot. [01:29:17] But doesn't it make sense? [01:29:18] I mean, look, I got to say, I really hate thinking and talking about Elon Musk. [01:29:24] I'm just like. [01:29:25] You made us do a mini-series. [01:29:26] And we got it all out.
[01:29:28] By the way, that was a long time, a year ago. [01:29:31] Yeah. [01:29:31] Right? [01:29:32] Over a year ago. [01:29:33] Over a year ago, yeah. [01:29:34] Yeah. [01:29:34] So, yeah, I got it all out. [01:29:37] But it fits his little approach. [01:29:41] I mean, this was a man who went bald quite early on in his life. [01:29:45] Yeah. [01:29:45] Didn't have a lot of money. [01:29:47] Then had a lot of money. [01:29:49] And loves frivolous vaporware-type technologies. [01:29:53] Yeah, that is. [01:29:54] Like leg lengthening. [01:29:56] Which is very difficult for me to say. [01:29:58] Leg lengthening? [01:29:59] I want to get my toes lengthened. [01:30:01] That'd be crazy. [01:30:02] Extendoid surgery. [01:30:03] What about having Go Extendo? [01:30:04] Four-inch toes. [01:30:05] Fantastic forces. [01:30:07] Yeah. [01:30:08] Yeah. [01:30:08] Mr. Fantastic. [01:30:09] He did Mr. Fantastic. [01:30:10] He's fantastic. [01:30:11] Horsepower. [01:30:12] What about horror? [01:30:14] Stretch Armstrong. [01:30:16] That's the same power, but yeah. [01:30:18] Yeah, same sort of deal. [01:30:20] What if I had four-inch toes? [01:30:21] That would be weird. [01:30:23] Long boys? [01:30:23] How tall are fingers? [01:30:25] A finger? [01:30:26] That's like. [01:30:27] Oh, like you had little toes for fingies? [01:30:29] Well, that'd be crazy. [01:30:30] If you had tiny little fingies and long toes? [01:30:36] Oh, like switch-ems. [01:30:37] Switch-ems. [01:30:37] If you had toe fingers and finger toes. [01:30:39] That'd be freaking crazy. [01:30:41] What if you had two noses instead of ears and then an ear for a nose? [01:30:45] And then what if your eyes were like a cat's? [01:30:51] All right, everyone. [01:30:52] I'm Liz. [01:30:53] My name is Brace, but like, what if, what if my name was Gretchen? [01:30:56] And we have with us today, on the ones and twos. [01:31:00] Whatcha, whatcha, Young Chomsky. [01:31:02] And the podcast is called, it's called TrueAnon, everyone. [01:31:07] And Merry Christmas. [01:31:09] Happy holidays. [01:31:11] And Hanukkah. [01:31:12] We'll see you next time.