Behind the Bastards - Part Three: How Peter Thiel Became the Gravedigger of Democracy Aired: 2024-11-05 Duration: 01:41:26 === The Future President Debate (01:43) === [00:00:01] Cool Zone Media. [00:00:04] What's dying in darkness? My democracy. [00:00:08] I'm Robert Evans, host of Behind the Bastards. [00:00:11] We're coming on, recording this the day that the Washington Post is getting attacked online for not endorsing anybody in the election, which I'm grateful for because it means that no one has noticed that Cool Zone has also not put in our endorsement for the 2024 election, which is really good because every year, you know, we advise people to vote for the same man, Richard Milhous Nixon. [00:00:36] Now, to talk about our greatest president, and I think our greatest future president, Noah Shachtman. [00:00:43] Noah, how are you feeling? [00:00:44] Do you think Nixon's got it this year? [00:00:45] You think he's going to pull out a win? [00:00:48] I thought you were saying I was your greatest future president. [00:00:53] You could be. [00:00:53] You could be, but you need to be a little more Nixonian, you know? [00:00:58] Have you considered trying to destroy the world while drunk, with only Henry Kissinger being the one that can stop you? [00:01:08] No, I haven't. [00:01:09] So I guess I'm not qualified. [00:01:11] Yeah. [00:01:12] That is a bummer. [00:01:13] Noah, you are a contributing writer at Rolling Stone, contributing editor at Wired, and you're here to talk about P. Tizzy, which Peter Thiel does not go by. [00:01:23] And if he was not already committed to destroying us after the first two episodes, that nickname is probably going to get us attacked. [00:01:31] Yeah, you're definitely sued. [00:01:33] Yeah, we're done. [00:01:34] We're done here, everybody. [00:01:36] Yeah. [00:01:37] How do you feel about the news today? [00:01:39] Is it good? [00:01:40] You happy? [00:01:41] Happy about the news? [00:01:43] The Washington Post thing? 
=== Ready To Talk About Him (02:41) === [00:01:44] I don't know. [00:01:45] Whatever news is happening today. [00:01:46] I assume something else went down, right? [00:01:48] Somebody died. [00:01:50] I'm excited the Yankees are playing in the World Series. [00:01:53] That's good. [00:01:54] That's good. [00:01:54] Bill Clinton called Kari Lake attractive. [00:01:56] It's been an exciting week for everybody. [00:01:58] I mean, you know, a tiger can't change his stripes, right? [00:02:05] Yeah. [00:02:06] Yeah, if we want to call him a tiger. [00:02:08] Yeah. [00:02:09] Yeah. [00:02:09] So, uh, yeah, let's, uh, I guess let's get back into the old Peter Thiel game. [00:02:14] Um, I'm ready to talk about him. [00:02:17] You're ready to talk about him. [00:02:18] Bum us out, Robert. [00:02:19] Tell us. [00:02:21] I will. Bums away. 
=== Tech Journalism Under Fire (15:37) === 
[00:05:27] If you're a journalist, which, you know, two-thirds of us on this call are, December 7th, 2007 ought to be a date that lives in infamy. [00:05:37] That was my Pearl Harbor joke. [00:05:39] But it's also a joke referencing the Gawker lawsuit, because that is the day that Gawker, via its tech website, Valleywag, published an article with the title, Peter Thiel Is Totally Gay, People. [00:05:52] Now, Valleywag, which was, again, like the tech imprint of Gawker, had been writing about Peter Thiel for a while, and they had published articles kind of insinuating that Peter was gay for quite a while. [00:06:05] The company founder Nick Denton was, to his credit, someone who recognized early on that Peter was not just another rich investor guy, but somebody who was amassing significant power and had a weird ideology and should be covered. [00:06:19] Unfortunately, the downside of it was that Nick's instincts were, you know, this was a messy time for digital media, shall we say. [00:06:29] And Valleywag was not at this point entirely conducting itself in the best traditions of a journalistic enterprise, right? [00:06:36] And while I think an argument can be made, a strong one, that Peter being gay, given his funding of the Republican Party, is to a degree relevant to the public interest, the way in which Valleywag reported on this initially was not a public interest story, right? [00:06:53] Like that title, Peter Thiel Is Totally Gay, People? That's not a we're-getting-out-necessary-information title, right? [00:06:59] That's kind of, that's being extremely catty, right? [00:07:02] Um, by the way, uh, our guest today is Noah Shachtman, contributing writer at Rolling Stone. [00:07:08] I tried to get it. [00:07:09] I introduced him. [00:07:11] Why are you slipping me in on the catty outing here? [00:07:14] I introduced him this much. 
[00:07:19] I haven't cattily outed anybody in New York. [00:07:21] Yes. [00:07:22] Yeah. [00:07:23] Robert definitely did forget to introduce you last time. [00:07:26] Noah. [00:07:28] We redid it. [00:07:29] We did it and it was fine. [00:07:30] I got it this time. [00:07:32] I'm just making sure Noah gets his credits because they're impressive. [00:07:35] Thank you, Sophie. [00:07:36] Thank you. [00:07:37] Thanks for sticking up for me. [00:07:38] Your credits. [00:07:40] Robert, I would credit you too if I could. [00:07:42] I don't know. [00:07:43] Do you remember this, Noah, when Gawker outed Peter? [00:07:47] Because I didn't catch it really at the time, but I was a little baby at this point. [00:07:51] I didn't either. [00:07:53] Yeah. [00:07:53] But I did know, you know, the Nick Denton crew and Nick a little bit back then. [00:08:00] And honestly, so many of the people involved were so fucking whacked out on powders and pills, they probably forgot they even did it. [00:08:08] With all of the money that was going around in digital media back then, I think it would have been hard not to be whacked out on powders and pills. [00:08:14] But this is not, like, the Post wouldn't have done this reporting in this way, right? [00:08:18] Like, or the New York Times, you can think of that what you will. [00:08:21] But like, this was, this was a little messy. [00:08:23] I think probably, I mean, Peter never sues over this, but this is the inciting incident of why he gets angry at them. [00:08:30] So I don't think this would have been something that could have been adjudicated in court. [00:08:34] But it is something that if you're kind of going on, where does Peter Thiel have a right to privacy? [00:08:41] Like if you're arguing that because of his advocacy, you know, this is relevant, which I think is an argument that can be made strongly. [00:08:49] You probably want to be a little bit clearer in making that argument than Peter Thiel is totally gay, folks. 
[00:08:58] Yeah. [00:08:58] Yeah. [00:08:59] Now, again, I don't think the fundamental issue here is that they outed him. [00:09:05] I think it's just more that, like, yeah, it's kind of a, kind of a grody way to do it. [00:09:09] Did you say grody? [00:09:10] Grody, I did. [00:09:11] I did. [00:09:12] I am a high school girl in 2004. [00:09:16] Wow. [00:09:17] That would have been late 90s, right? [00:09:19] Yeah. [00:09:20] I thought it was more of an 80s thing, but hey, you know. [00:09:22] Yeah, it's probably more of an 80s thing. [00:09:23] Yeah. [00:09:25] So, you know, Valleywag had been kind of poking at Peter for a while, right? [00:09:30] They had been making, before that article, some kind of veiled claims about him being gay. [00:09:36] And Valleywag is certainly writing more on, like, the, what do you call it? Tabloid end of things, right? [00:09:43] At this period of time, Gawker is going to professionalize in the period, you know, before they get sued into oblivion by Peter, but in 2007, they are still very much, like, new kids on the block. [00:09:55] We don't really give a shit. [00:09:58] Now, the question that comes to mind, if you've read about Peter Thiel, is like, why did he get so offended at the fact that he was outed? [00:10:06] Because by all accounts, he was pretty open in his personal life. [00:10:10] Like, it doesn't seem like this shocked even his Republican colleagues, people who had gone to Thiel parties, who knew him personally, who had gone to his nightclub. [00:10:20] Like, he didn't go to extreme lengths to hide this fact about himself. [00:10:26] Instead, what seems to have enraged him was not the specifics of the fact that he was outed, but this line from the Valleywag article: [00:10:34] The only thing that's strange about Thiel's sexuality: why on earth was he so paranoid about its discovery for so long? 
[00:10:41] Now, that line doesn't really stick out to me. [00:10:46] But here's from an article in The Atlantic, which interviewed Ryan Holiday, who wrote a book about the Gawker case. [00:10:51] Here's what Ryan said about why that line in particular tweaked Thiel: [00:10:58] He thought Denton was implying that Peter had psychological problems. [00:11:03] When you read the comment, it doesn't feel that way. [00:11:05] But Thiel thought, here's the publisher of a media outlet, not just a blogger, going after me. [00:11:09] The blog post felt like the first article after years of negative Gawker coverage against Thiel. [00:11:14] I mean, look, I do think it feels weird when you're on the other side of it. [00:11:18] And I think, like, you know, for those of us that, like, write and broadcast, right? [00:11:22] You sometimes want to take a spin on the other side of the camera, so to speak, and see how that stuff feels. [00:11:30] On the other hand, like, what's the, he doesn't seem to have made it a secret, doesn't seem to have been a big deal. [00:11:37] On the other hand, you know, I think outing people is fucked up. [00:11:41] And yeah. [00:11:42] And I feel like, you know, people's sexuality is their own, is their own choice. [00:11:49] On the other hand, like, you know, if you're going to embrace some weirdo, like, you know, retro 17th century ideology about religion and power, then, you know, then you might have to grapple with its inconsistencies and hypocrisies. [00:12:08] So I don't know. [00:12:09] It's a tough one. [00:12:10] Yeah, it is. [00:12:10] It is kind of a, like, this is, I think, a useful thing for people who are interested in the ethics of journalism to comment on. [00:12:17] I do think it's interesting, like, the specifics of why Peter gets angry, this idea that he was mostly pissed that Gawker had maybe insinuated that he was not emotionally balanced. 
[00:12:26] Now, the other argument you'll hear here is that the primary real reason Peter was pissed about this is that it was fine for him to be gay and kind of open about it in his private life with the people who hung out around him, but not publicly open about it, because who he really wanted to keep this from, or to maintain plausible deniability with, is the Saudis, right? [00:12:50] He has a lot of business involvement in the Middle East and the Emirates, as well as in Saudi Arabia. [00:12:56] And he didn't want to be an out gay man traveling to and dealing with these countries, right? [00:13:02] Like, he felt that that would be damaging to his business interests. [00:13:06] So that's the other argument that you'll hear. [00:13:08] And I'm sure it's probably a number of things. [00:13:11] And in any case, I think when I first was aware of this case, most of the casual reporting was like, you know, Peter Thiel got involved in wanting to sue Gawker into oblivion because they outed him, right? [00:13:25] Like, that was the A to B. [00:13:28] I think it's a little less direct than that. [00:13:30] And I think this is the picture Chafkin paints, the picture Holiday paints, and Holiday is the guy who really seems to have gone into this the most: this is what kind of gets Gawker on Peter's radar. [00:13:42] It annoys him, but he's not committed to taking them down yet, right? [00:13:46] Like, that's not going to happen for years and years. [00:13:48] So this is just kind of like the beginning of the conflict that they have with each other. [00:13:53] So we're going to move on here, and later we will come back to the story. [00:13:57] But like, yeah, this is how he's kind of starting to get angry at Gawker. [00:14:03] And I do think it's useful: Holiday suggests that there's another reason why Peter's pissed as a result of this. 
[00:14:10] And it has more to do philosophically with the kind of reporting that Gawker is doing and what they represent about the media in the digital age that Peter is kind of personally repelled by, maybe even frightened of. [00:14:25] And I'm going to quote from an interview that he did again here: [00:14:28] From 2007 up until 2012, Denton was on a devil-may-care ride of breaking rules as a media publisher. [00:14:34] And that was so diametrically opposed to Peter's vision of quiet individuality, this belief that weirdos needed to be left alone if they were going to change the world. [00:14:42] Peter saw that Gawker would punish people for that weirdness. [00:14:49] Yeah, I'm not sure how much, I mean, it's perfectly reasonable on Holiday's part to be like, yeah, Peter's doing this because that's just how he feels. [00:14:57] I do think that's a very silly, like... [00:15:01] Come on, give me a break. [00:15:03] What was Thiel's Stanford paper again? [00:15:06] What were they doing? [00:15:06] Yeah, yeah, the Stanford Review, where they were, like, outing professors and stuff based on their political ideology, and, like, his best friend who wrote those anti-AIDS columns, like, screaming about how he hopes that this fucking gay professor dies of AIDS or whatever. [00:15:21] Like, yeah. [00:15:22] Right. [00:15:23] And so now he's worried about what exactly? [00:15:25] Yeah, yeah, that Gawker is making it unsafe to be weird. [00:15:30] You know, and Holiday is more sympathetic to Peter in this than I certainly come out of it. [00:15:37] Like, if that is how Peter justified this to himself, it's stupid. [00:15:43] Or maybe it's just Ryan kind of needing to find a more reasonable reason. [00:15:49] I think there's, let me put it a different way, man. [00:15:52] Like, okay, I've been, as you can see from the gray hair, I've been involved in journalism for a long time, and I've been involved in tech journalism for a long time. 
[00:15:59] And back then was, like, the time of maximum subservience in tech journalism. [00:16:07] It was too critical of the industry? [00:16:09] Yeah. [00:16:10] Not at all. [00:16:10] I mean, like, if you think the political press of 2025 towards the future president and stuff is bad, this is like, you know, that would be a pale imitation of the Silicon Valley press of that era. [00:16:24] And so I think Valleywag, in its own, you know, fucked up weirdo way, was, like, the only people that were bothering to penetrate or interrogate that, or one of the few people. [00:16:34] They weren't reflexively, yeah, just completely subservient. [00:16:38] Yeah. [00:16:39] Now, they were reflexively gross in some other ways. [00:16:42] And I say that, yeah, a friend of mine worked on that, but like, you know, I feel like some of it was just like, how dare you actually, you know, not follow the rules of obeisance here. [00:16:55] And yeah, how dare you not kiss my ass? [00:16:59] That feels like more part of it. [00:17:01] Yeah. [00:17:02] Yeah. [00:17:02] I think that's probably fair, because this is actually where I came up, in tech journalism. [00:17:07] And yeah, Valleywag was one of the rare places where you would get people who were trying to be confrontational to these guys who were kind of worshipped at the time. [00:17:18] You would have to go far to find really critical reporting on Zuckerberg, on fucking Steve Jobs, on a guy like Thiel in this period of time. [00:17:28] Like, 2007 is when that Forbes article on the PayPal Mafia that we quoted from comes out, where they're taking a picture of Peter and all of his friends and framing it like it's a film poster and stuff. 
[00:17:41] So yeah, I do think that's important context. Like, I'm not trying to deny how gross a lot of the way Valleywag framed things was, especially today, but it is important to note also the value of what they were doing, that, like, well, at least they were confronting these guys, you know. And it was 2007. [00:18:02] It was a different era. [00:18:03] Digital media was new. [00:18:04] We can talk about what the ideal way to confront them would have been, but, like, at least they were. [00:18:10] So I don't know. [00:18:11] It's a messy time. [00:18:13] Nobody handled it perfectly. [00:18:16] Yeah. [00:18:17] Now, in terms of how Chafkin interprets this, because Chafkin gives a lot less slack to Peter. [00:18:26] And his argument is that, like, after this Gawker article comes out, Peter is angry, but at the same time, he's kind of, like, liberated by the fact that he's been outed now. [00:18:41] And that this is a big part of why he becomes such an open funder, not just of the far right, but of a lot of his weird libertarian pet causes. Now that he has been outed, it's like, well, you know, maybe it's going to fuck with some of my business in the Middle East, but at least I can be who I am openly now, right? [00:19:00] And I think there's a good case to be made for that, because it's after this article comes out that Peter starts, for example, sinking huge amounts of money into seasteading, an idea championed by weird libertarians who wanted to build their own cities, independent of the government, in the ocean. [00:19:17] Peter backs the Seasteading Institute. [00:19:19] He starts funding these guys who are doing, like, little Burning Man-style events, which actually do sound kind of cool, where they're, like, living on the sea or rivers for days at a time. 
[00:19:29] And he's funding this libertarian, you know, kind of failson dude who's a major seasteading advocate. [00:19:39] And he's giving, like, speeches and stuff. [00:19:41] He's actually more into this. [00:19:42] This is not just, because when you've got Thiel money, you can just be a dilettante about something that you're casually interested in. [00:19:48] Peter is, like, giving speeches and writing essays about how seasteading, he thinks, might hold some of... [00:19:55] It's an S-E-A. [00:19:56] S-E-A, yeah, yeah, yeah. [00:19:58] Like homesteading, but on the sea. Seasteading, yeah. === Seasteading And Dolphins (09:26) === [00:20:03] Look. [00:20:03] Tell me again, how did we get from Peter Thiel is totally gay to Peter Thiel is totally seasteading? [00:20:11] Well, Peter Thiel, who has now been revealed as gay, can reveal himself also as a weirdo libertarian and be like, look, you know, I've been outed on this thing that I actually wanted to keep quiet. [00:20:23] So I might as well be open about the fact that I think that we can replace governments by living on the ocean and building floating cities. [00:20:31] Why not? [00:20:34] Again, like, I feel like at every step, it's like, dude, this guy is, like, stuck in second semester freshman year. [00:20:42] Yeah. [00:20:42] It's like he took some bong hit that he never quite recovered from. [00:20:49] We all took a bong hit we didn't quite recover from, Noah. [00:20:51] Let's not judge him for that. [00:20:53] Guilty as fucking charged. [00:20:55] I'm just saying that, like, his particular, like, techno-libertarian, utopian dude-brah, what if we homesteaded on the sea? [00:21:04] Yeah, and then no government could touch us. [00:21:05] We could be pirates. [00:21:08] Are you kidding me? [00:21:09] This is the toughest part of the Peter Thiel story for me here, because I have to report on this and I don't like Peter. 
[00:21:15] Obviously, I wrote, like, 17, 18,000 words on why he's a bad guy. [00:21:22] I think this kind of rules. [00:21:23] I do think it kind of rules. [00:21:24] I don't like it as a political thing. [00:21:27] It's like, we're going to replace all the governments, but I love the idea of, look, I watched too much seaQuest as a kid to not be attracted to the idea of taking to the ocean to build your, it's cool. [00:21:39] I'm sorry. [00:21:39] It's a cool idea. [00:21:41] I like it, fucking, forgive me. [00:21:45] Wow. [00:21:46] I like it. [00:21:46] I think it's neat. [00:21:50] Sorry. [00:21:50] What is seaQuest? [00:21:52] What is seaQuest? [00:21:53] Oh, my God. [00:21:54] Unbelievable. [00:21:57] Noah. [00:21:58] So back in, like, the mid-to-late 90s, after Star Trek: The Next Generation really blew up, when it was kind of like season three or so, starting to hit its stride, there was a TV show that was basically Star Trek: The Next Generation, but set in a future where humans had taken to the ocean to, like, expand their living territory. [00:22:16] And the lead actor, like their Picard, was Roy Scheider, the sheriff from Jaws. [00:22:22] And he ran this giant, like, submarine city that traveled around and kept the peace in the underwater frontier. [00:22:30] It was a good show. [00:22:32] It was a good show. [00:22:33] Yeah. [00:22:33] Wait, is Jaws on it too? [00:22:35] No, no, but there was a dolphin character. [00:22:38] There was a talking dolphin. [00:22:39] There was a talking dolphin, which there was supposed to be in the original Star Trek: The Next Generation, I think. [00:22:45] That was something. [00:22:46] Yeah, because Gene Roddenberry was a Posadist. [00:22:49] He was a believer that once we have a nuclear war, dolphins and humans will ascend together. [00:22:56] Hell yeah. [00:22:56] Yeah. [00:22:57] He might have been friends with that guy who raped that dolphin. 
[00:23:00] I'm not saying he was in favor of raping dolphins, but there was a guy, John C. Lilly, who raped a dolphin. [00:23:04] Yeah. [00:23:05] What? [00:23:05] You guys know about John C. Lilly. [00:23:07] Peter Thiel raped a dolphin? [00:23:09] Yes. [00:23:10] That is the allegation we're making on Behind the Bastards. [00:23:12] Thank you, Noah, for stating it. [00:23:14] Because now you and the iHeart Radio Corporation are both on the hook for making that statement. [00:23:21] Hey, in for a dollar, in for a fucking $100 bill. [00:23:26] To be honest, I don't think Peter Thiel cares about dolphins very much, or else he would have different politics. [00:23:35] Dolphins are cool. [00:23:36] Although I also don't think he's molested a dolphin. [00:23:39] You know, some of those pro-dolphin guys did. [00:23:42] Where do you stand on molesting dolphins? [00:23:44] Email us. [00:23:45] Sophie, do we have an email? [00:23:47] Technically. [00:23:48] Okay. [00:23:48] Well, I'm not going to read it. [00:23:52] So we've hit about the point of 2008 or so. [00:23:55] Peter is getting into funding seasteading. [00:23:58] He's getting more open. [00:23:59] He's starting to put out more money to libertarian causes. [00:24:02] Are dolphins allowed on the seastead? [00:24:04] You know, I think that's going to vary. If I'm seasteading? [00:24:08] Yes, dolphins are independent citizens with independent rights, but also they have to abide by our laws, which is going to be hard for some dolphins, because some dolphins are scum. [00:24:18] But that's a separate question. [00:24:23] So this is right around the time, 2008 or so, that Peter Thiel starts reading the work of a fairly new blogger on, like, the right-wing scene, this kind of underground hit who's particularly popular in the Bay Area tech industry scene, a guy named Curtis Yarvin, who at this point is writing under the name Mencius Moldbug. 
[00:24:42] Now, we did our episodes on Mencius, you know, pretty recently, so I'm not going to go into a ton of detail on him, but he advocates a return to monarchy based around, like, small city-states ruled by CEO kings. [00:24:55] Like, that's his idea: wouldn't it be better if tech CEOs ran the world and it was a series of small city-states that you could travel in between? [00:25:04] Which, if you've ever, like all of us have, if you've ever used any of the products these companies make, the idea of them running an entire government is a nightmare. [00:25:15] But Peter is starting to think in this period that maybe that's the right way to do things. [00:25:20] And the open question always here is, like, does he actually believe this is better for mankind? [00:25:26] Because the thing you'll get in Chafkin's writing and in Peter's own writings, if you're trying to figure out, like, why he thinks this, is he has this belief that the tech industry has ground to a halt, that human innovation is frozen, right? [00:25:41] That all of the stuff the tech industry is putting out are these, like, bullshit little products and gadgets and stuff that don't actually take us forward in the way that, you know, we had dreamed of going when Peter was a young kid. Which is to a degree true, although Peter's one of the guys funding and investing in these bullshit projects that absolutely don't take the species forward but make a lot of money. [00:26:05] So does he believe that, like, we need to do this in order to actually increase innovation again, because capitalist democracy can't do it? [00:26:13] Or is he just a guy who wants to be more powerful? And he's like, well, if I'm a CEO king, I'm more powerful, right? [00:26:19] Yeah. [00:26:19] Isn't that the, I mean, that's the Occam's razor answer. 
[00:26:22] I mean, like, what, the guy that in the future will fund, like, the right-wing YouTube is upset that tech isn't being innovative enough? [00:26:30] Yeah. [00:26:31] Like, you know, that's the real motivation here? [00:26:34] I find that hard to believe. [00:26:36] That's what I always come back to. Like, but he funded all of the bullshit projects. [00:26:41] Like, he's backing Facebook. [00:26:43] He's, like, the first Facebook investor. [00:26:45] There was never any chance that that was going to take us into the Star Trek future, right? [00:26:50] Right. [00:26:51] Yeah. [00:26:51] Like, I want to establish a multi-planetary species. [00:26:56] And the way I'm going to do it is by putting some money behind my Sphinx 2.0. [00:27:01] Yeah. [00:27:01] This guy built a website to rate chicks on how hot they are. [00:27:04] That's really got to, that's going to bring us to the hoverboards. [00:27:08] Oh, my God. [00:27:09] Yeah. [00:27:10] Put us right on the seaQuest. [00:27:12] Yeah. [00:27:12] Or put us on the seaQuest DSV. [00:27:14] That's right. [00:27:15] Thank you. [00:27:16] Thank you. [00:27:17] And also, everyone, R.I.P. Roy Scheider. [00:27:19] You know, if there's ever been a better drunk sheriff in film history, I haven't seen him. [00:27:24] You know, go watch Jaws tonight, people. [00:27:27] It's a nice Halloween movie. [00:27:29] So Peter starts shotgunning money to Yarvin during this period of time as well. [00:27:34] He invests, you know, like a million dollars, something like that, in the stupid tech company project Yarvin has. [00:27:39] And I think there's probably an additional chunk of dark money there too. And this is where, you know, we can laugh about how inconsistent or unethical his, like, motivations are, but the way he does this is smart, because he recognizes: I really like this guy's writing. 
[00:27:55] This guy is putting out some stuff that's legitimately subversive on, like, the role of democracy and how it's, like, doomed, that I think is useful towards where I want to see things go. [00:28:06] And he's writing in such a way that is inherently attractive and magnetic to other tech bros. [00:28:14] So I want to fund this guy as a way of, like, slipping this drug into the supply of the Silicon Valley power elite that's going to warp the way they think about the world. [00:28:26] And this is a very successful project. [00:28:28] You know, I don't know the degree to which all of that is a plan from the beginning, but he really liked him. Like, Yarvin goes to parties at Peter's house and stuff. [00:28:38] Like, they are tight. [00:28:39] And I think this is very much, like, kind of part of his increasingly cohesive plan, that, like, this is a guy who's a reflexive contrarian. [00:28:48] He kind of hates ordinary people. [00:28:50] He wants to be able to rule them. [00:28:51] He certainly wants to be locked in forever as someone who is above them. [00:28:57] And he, I think, finds very attractive this idea of, if we go back to a system that's this kind of neo-monarchist system, I can be enshrined, like the House of Windsor, as a permanent, especially if I never have to die, right? [00:29:11] As a permanent power. [00:29:14] And speaking of never dying, Noah, you know who can't die, who cannot be killed, absolutely cannot be killed? [00:29:21] I've tried to kill them. [00:29:22] They won't die. [00:29:23] It's the sponsors of our podcast. === Neo-Monarchist Ambitions (05:50) === 
[00:29:48] And in this new season of The Girlfriends, oh my God, this is the same man. [00:29:53] A group of women discover they've all dated the same prolific con artist. [00:29:58] I felt like I got hit by a truck. [00:29:59] I thought, how could this happen to me? [00:30:01] The cops didn't seem to care. [00:30:03] So they take matters into their own hands. [00:30:06] They said, oh, hell no. [00:30:08] I vowed I will be his last target. [00:30:10] He's going to get what he deserves. [00:30:15] Listen to The Girlfriends. [00:30:16] Trust me, babe. [00:30:17] On the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:30:27] Hey, I'm Norah Jones, and I love playing music with people so much that my podcast called Playing Along is back. [00:30:33] I sit down with musicians from all musical styles to play songs together in an intimate setting. [00:30:38] Every episode's a little different, but it all involves music and conversation with some of my favorite musicians. [00:30:43] Over the past two seasons, I've had special guests like Dave Grohl, Leve, Mavis Staples, Remi Wolf, Jeff Tweedy, really too many to name. [00:30:53] And this season, I've sat down with Alessia Cara, Sarah McLachlan, John Legend, and more. [00:30:58] Check out my new episode with Josh Groban. [00:31:01] He related to the Phantom at that point. [00:31:04] Yeah, I was definitely the Phantom in that. [00:31:06] That's so funny. [00:31:07] Share each day with me each night, each morning. [00:31:16] Say you love me. [00:31:18] You know I. [00:31:20] So come hang out with us in the studio and listen to Playing Along on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:31:28] I'm Laurie Segall and on Mostly Human, I go beyond the headlines with the people building our future. [00:31:33] This week, an interview with one of the most influential figures in Silicon Valley, OpenAI CEO Sam Altman.
[00:31:40] I think society is going to decide that creators of AI products bear a tremendous amount of responsibility to products we put out in the world. [00:31:47] From power to parenthood. [00:31:49] Kids, teenagers, I think they will need a lot of guardrails around AI. [00:31:52] This is such a powerful and such a new thing. [00:31:54] From addiction to acceleration. [00:31:57] The world we live in is a competitive world, and I don't think that's going to stop, even if you did a lot of redistribution. [00:32:01] You know, we have a deep desire to excel and be competitive and gain status and be useful to others. [00:32:08] And it's a multiplayer game. [00:32:10] What does the man who has extraordinary influence over our lives have to say about the weight of that responsibility? [00:32:17] Find out on Mostly Human. [00:32:18] My highest order bit is to not destroy the world with AI. [00:32:21] Listen to Mostly Human on the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows. [00:32:30] What's up, everyone? [00:32:30] I'm Ago Moda. [00:32:32] My next guest, you know, from Step Brothers, Anchorman, Saturday Night Live, and the Big Money Players Network, it's Will Ferrell. [00:32:43] My dad gave me the best advice ever. [00:32:46] I went and had lunch with him one day, and I was like, and dad, I think I want to really give this a shot. [00:32:51] I don't know what that means, but I just know the Groundlings. [00:32:54] I'm working my way up through and I know it's a place to come look for up and coming talent. [00:32:58] He said, if it was based solely on talent, I wouldn't worry about you, which is really sweet. [00:33:02] Yeah. [00:33:03] He goes, but there's so much luck involved. [00:33:06] And he's like, just give it a shot. [00:33:07] He goes, but if you ever reach a point where you're banging your head against the wall and it doesn't feel fun anymore, it's okay to quit.
[00:33:16] If you saw it written down, it would not be an inspiration. [00:33:18] It would not be on a calendar of, you know, the cat just hang in there. [00:33:25] Yeah, it would not be. [00:33:27] Right, it wouldn't be that. [00:33:28] There's a lot of luck. [00:33:30] Listen to Thanks Dad on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:33:38] In 2023, former Bachelor star Clayton Echard found himself at the center of a paternity scandal. [00:33:45] The family court hearings that followed revealed glaring inconsistencies in her story. [00:33:50] This began a years-long court battle to prove the truth. [00:33:53] You doctored this particular test twice, Miss Owens, correct? [00:33:57] I doctored the test once. [00:33:59] It took an army of internet detectives to crack the case. [00:34:02] I wanted people to be able to see what their tax dollars were being used for. [00:34:06] Sunlight's the greatest disinfectant. [00:34:08] They would uncover a disturbing pattern. [00:34:11] Two more men who'd been through the same thing. [00:34:13] Greg Oespi and Michael Marancini. [00:34:15] My mind was blown. [00:34:17] I'm Stephanie Young. [00:34:18] This is Love Trap. [00:34:20] Laura, Scottsdale Police. [00:34:22] As the season continues, Laura Owens finally faces consequences. [00:34:27] Ladies and gentlemen, breaking news out of Maricopa County as Laura Owens has been indicted on fraud charges. [00:34:33] This isn't over until justice is served in Arizona. [00:34:38] Listen to the Love Trap podcast on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [00:34:48] We're back. [00:34:49] No, no, no. [00:34:50] You called me Nova before the first time. [00:34:52] Did I call you Nova? [00:34:54] Yeah, shit. [00:34:55] That might be my new nickname. [00:34:57] There's a crap. [00:35:03] Because I too have a shiny helmet and I'm a third-rate Marvel superhero. [00:35:08] Oh, no. [00:35:08] You're a second-rate easily.
[00:35:11] You're like above Morbius tier. [00:35:14] Wow. [00:35:14] Thank you, dude. [00:35:15] Yeah, I'm like better than the Morbs. [00:35:18] Yeah, you're Madame Web. [00:35:18] You're a Madame Web style character. === Founders And Life Extension (12:08) === [00:35:20] You know what? [00:35:21] I'd go so far as Ant-Man. [00:35:23] And you know why that's a big compliment is everybody likes Paul Rudd. [00:35:27] Everybody, it's true. [00:35:28] I assume not everyone likes Paul Rudd, but a lot of people do. [00:35:31] He's very popular. [00:35:32] Um, um, he's a dad in Brooklyn. [00:35:35] I like him. [00:35:36] Oh, yeah, yeah. [00:35:36] And speaking of Paul Rudd, not at all, yeah, because Paul Rudd has not aged in 30 years, has he? [00:35:42] And Peter, Peter puts a lot of money into life extension projects. [00:35:46] We connected it, we made it work. [00:35:48] Yes, specifically trying to steal the blood of Paul Rudd to figure out what's going on there, what's going on there. [00:35:54] How can I still play as a 39-year-old man? [00:35:58] It's incredible. [00:35:58] He takes the blood of teenagers and Paul Rudd. [00:36:01] Yeah. [00:36:02] Um, hold on. [00:36:04] I want to get back to the Curtis Yarvin thing that we were talking about before the break. [00:36:07] Absolutely. [00:36:07] Isn't a little bit of it like you know, Peter Thiel himself got sort of seed funding money as a weirdo reactionary writer back when he was a kid. [00:36:19] And so he's just kind of like paying it forward to this next weirdo reactionary. [00:36:25] Yeah, paying it forward, I think, credits Peter maybe with a degree of generosity, which is a weird term to use for like right-wing bullshit. [00:36:32] But I think maybe it's I think maybe if I'm trying to psychoanalyze Peter, and I'm not being fair here, but fuck it, it's my podcast. [00:36:40] Peter was willing to take that money and preferred it to not having the money and not having like a platform.
[00:36:46] But I also think he probably found it kind of like emasculating maybe to need someone else's money. [00:36:53] That like his first plans had failed, and that's why he had to take that right-wing influencer grifter money in the first place. [00:37:01] And I think maybe there's a satisfaction to him in having the shoe on the other foot, now being the guy who is funding those influencers, right? [00:37:10] Obviously, he sees the value in that kind of funding, right? [00:37:13] So I think he's always been kind of supportive of that. [00:37:15] And maybe, if you're giving him more credit as like being less of a dick, although again, this is still an evil thing to do. [00:37:24] Maybe it's that he genuinely is like, well, this money was there for me when I was a fledgling right-wing shithead. [00:37:30] I got to pay it forward, you know, and invest in the career of another asshole. [00:37:35] But the difference here is Yarvin is specifically, right. [00:37:40] I mean, correct me if I'm wrong, but Yarvin is like specifically promoting guys like Peter Thiel as, yes, god-kings, yes, right, god-kings of the city-states that are going to replace the United States as uh as the doom of democracy comes down. [00:37:59] Um, right, yeah, yeah, they're all neo-feudalists, right? [00:38:04] At the end of the day, all they want is a fucking coat of arms and a goddamn uh fucking uh march to listen to. [00:38:12] You go to the comments, you like look up old czarist Russian band music and shit. [00:38:17] You look in the comments about all these guys being like, Oh, if only we could go back to the beautiful days of the Romanovs. [00:38:22] All of those guys are no less intellectually courageous than fucking Peter Thiel, right? [00:38:27] They just dress it up a little less by masturbating over the fucking czar. [00:38:31] All of these guys just want a czar, or they want to be the czar, right?
[00:38:34] Though they want to be, you know, the czars. [00:38:36] I think Peter wants to be the grand duke or some shit, right? [00:38:40] The czar is probably a little bit too much exposure. [00:38:43] Um, this would all be much funnier if you think of them all with those giant curlicue mustaches, absolutely knee britches, yeah, like you know, powdered wigs, yeah. [00:38:54] Or if you think about them all, we could just talk about what happened to the czars in a basement, but you know, that's probably hey. [00:39:00] So Peter starts throwing money into living forever. [00:39:03] He invests a lot in a guy named Aubrey de Grey, who's running something called the Methuselah Foundation. [00:39:09] And what? [00:39:11] Yeah, de Grey is like the most prominent we could live forever if we just figure out the right things advocate up to the present day. [00:39:19] He's still sort of like one of the big names in this industry. [00:39:23] Depending on how you kind of read into things, I think he's a guy who got a lot more credit because I used to be interested in some of what he had to say. [00:39:31] I think maybe I've come around and he's more of a con man in the modern era. [00:39:35] That's not an allegation, but like, that's kind of my gut feeling about the dude. [00:39:40] But he certainly, he is not a right-wing figure in this period. [00:39:43] De Grey, if anything, would be more on the progressive side of things in the early 2000s, like progressive left. [00:39:50] So the fact that Peter is funding him, again, libertarians are kind of like more aligned with the left in this period of time because of their opposition to the Bush Party as a general rule. [00:40:03] So the fact that Thiel, who is kind of a neocon in some ways, is funding a guy like de Grey would not have been seen as like, oh, it's this weird right-wing billionaire foisting money on us, right? [00:40:15] Like that's just not how it would have looked. [00:40:16] He also puts money into cryogenics.
[00:40:19] And he's, there's some interesting interviews with him where he's like asked what he thinks would be a good human lifespan. [00:40:25] And he was like, I don't see why people shouldn't live forever, right? [00:40:29] But specifically, he's the, what's kind of important to note here is that Peter doesn't really have an interest in making sure people live forever. [00:40:36] That's not what he's about. [00:40:39] He wants to live forever. [00:40:40] And he even makes some specific statements about how I don't agree with the ideology that death for every person is necessary. [00:40:48] Right. [00:40:48] We talked about that at the top. [00:40:50] Yeah. [00:40:50] Yeah. [00:40:51] And I think what's happening here is that like, again, Peter is this kind of to his bones contrarian. [00:40:57] He rejects other people. [00:40:59] And one of the things that bonds all people together, no matter what, how smart you are, how rich you are, who you are, is that everybody dies. [00:41:07] And that, I think, is what's most offensive about death to Peter is that it kind of forever locks him in as one of the herd, right? [00:41:17] Like you're not fundamentally above the rest of mankind if you die like everyone else. [00:41:21] And I think that's the primary reason why he's so obsessed with this, you know? [00:41:25] Like he is, he wants to be a pharaoh. [00:41:28] He sees himself as like a pharaoh type, right? [00:41:31] That he is owed this kind of eternity of power and influence because he is so special. [00:41:35] And the idea that like, no, man, when it all comes down to it, you wind up in the dirt like everybody else. [00:41:40] Like that is the most offensive part of this to him, even more than like any fears about, you know, the final cessation of consciousness. [00:41:49] It's, it's being inextricably bound to everyone else who exists.
[00:41:54] That is like more than that than just like scared little man-child with too much money who is just like, oh no, this, you know, this might happen to me. [00:42:05] And therefore, I'm going to like support this like guy who looks like a wizard who's going to tell me. [00:42:10] Yeah. [00:42:11] He does look like a wizard. [00:42:12] Did you, had you looked up a picture of Aubrey de Grey or did you just guess that he looks like a wizard? [00:42:18] Just to remind myself, this guy has got a beard. [00:42:20] Oh, man. [00:42:22] He looks like such a wizard. [00:42:24] It's crazy. [00:42:25] So we pull up that wizard ass motherfucker. [00:42:29] It's wild. [00:42:30] It's like, honestly, it looks like one of those things like there's like a kid hiding inside the beard. [00:42:37] It's so big and unwieldy. [00:42:39] Gandalf the Grey-ass son of a bitch. [00:42:42] We'll just have Malcolm throw a picture up while we're talking about it. [00:42:46] Yeah. [00:42:46] Yeah. [00:42:46] Find one that really makes him look like a fucking wizard. [00:42:49] There's really none that don't. [00:42:51] They are all wizard pics. [00:42:53] Yeah. [00:42:54] You know, did Thiel fund this guy directly or did he fund him through the Founders Fund? [00:43:02] I think he, I think it's through the Founders Fund that he starts at Clarium. [00:43:06] I think that's where most of his money comes in. [00:43:07] Yeah. [00:43:08] Yeah. [00:43:09] So I like semi, like I ran across the edges of this when I was reporting back in the day. [00:43:18] Yeah. [00:43:19] I definitely like there's a couple of other Founders Fund partners who are also equally into you know wizardry and life extension and stuff like that. [00:43:30] And I was doing a story on them. [00:43:33] Specifically, they hired an in-house meditation teacher and guru who claimed that he could personally enlighten them and bring them like universal consciousness and oneness with the Buddha. [00:43:46] Yeah.
[00:43:46] I mean, as far as I can tell, it totally worked, right? [00:43:48] I mean, what else would you do with your time than support the end of American democracy if you're enlightened? [00:43:56] Yeah. [00:43:58] So anyway, and yeah, they were all into life extension and all kinds of stuff like that. [00:44:04] I didn't see Thiel at that time, but definitely like there's a lot of his people that were in there. [00:44:11] Well, and I think it's natural you kind of get super rich in your early 20s. [00:44:16] And then, you know, your first concern after that, when you have more money than you could ever spend, is like, well, I want to live long enough to spend all this, right? [00:44:25] Right. [00:44:25] I've got more money than God. [00:44:26] I want to be God. [00:44:28] Yeah. [00:44:28] I don't know. [00:44:29] Some of it's probably that like, I think as a general rule, by the time you get really rich, you're usually maybe probably closer to your 30s than your early 20s for most of these guys. [00:44:37] And that is when feelings of mortality, you know, you start to, and you start to also, at the early stages of aging, there is a lot that if you have shitloads of money for the right kind of drugs and the right kind of like personal training and shit, you can kind of push off the early steps of aging significantly. [00:44:57] And you can also do stuff like you see this with both Thiel and Elon Musk. [00:45:01] Once they get rich, physically they change a lot initially. [00:45:04] You know, you get the hair transplants like Jeff Bezos. [00:45:07] You get on HGH. [00:45:08] You get a personal trainer. [00:45:10] And you start to convince yourself, wow, so much of what I, you know, when I was like a young kid just working, I couldn't have a body like this because I didn't have the resources to pay experts to maintain it for me. [00:45:21] And I'm able to, what else is possible? [00:45:24] Right.
[00:45:26] And I think that's probably part of what's going on there. [00:45:28] Also, just getting a lot of money all at once breaks your brain. [00:45:32] Bad for you. [00:45:33] That was a good part of like Bojack Horseman, right? [00:45:36] Where there's that line where it's like the age at which you suddenly, at which you become a millionaire is the age that you're like frozen at forever. [00:45:42] You don't really progress mentally past that point. [00:45:45] Yeah. [00:45:46] May we all get to that point. [00:45:47] May we all get to that point, you know, but hopefully when you're like 40, right? [00:45:52] As opposed to taking the Zuckerberg route. [00:45:56] So at this point, all of these gifts that Peter, you know, has been giving, it's interesting. [00:46:02] Like he's a the primary thing that he's done at this stage, right? [00:46:06] For all of you know, the money and the high ambitions is he has started, cashed out on and abandoned PayPal. [00:46:13] And then he has launched an investment firm called Clarium. [00:46:16] And by kind of the second Bush term, Clarium is becoming a really big deal. [00:46:22] By 2004, they had $260 million under management. [00:46:26] And within like a couple of years, the fund was worth more than $2 billion, which is that it's double and triple digit growth for most of its early years. [00:46:35] It changes. [00:46:36] People will say it completely changed how venture capital works in the Valley, right? [00:46:41] Because it was such a successful company. [00:46:43] The kind of bets that Peter and his, because all of the people staffed there are his friends from PayPal and his like right-wing buddies at Stanford, right? [00:46:51] Who were also in large part a lot of his buddies from PayPal. [00:46:55] You know, and he's, he's kind of picking, he's finding guys who are starting companies. [00:46:59] Some people will allege they're all guys he finds attractive. [00:47:02] You know, I don't know. 
[00:47:03] I think that sometimes it's just people being like, oh, this guy's handsome. [00:47:06] Peter's gay. [00:47:06] That must be part of it. [00:47:07] I don't know that it actually is. [00:47:10] But he's finding these other founders and he's bringing them in. [00:47:14] It is noted there's a couple of things that make Peter's fund really different from other funds. [00:47:18] For one, he's not interested in people constantly making moves. [00:47:23] He's fine if you only make one investment a year, right? [00:47:26] And again, he doesn't really fire people. === Peter Thiel's Investment Style (03:39) === [00:47:28] He's bad at that. [00:47:30] He's bad at confrontation. [00:47:31] You can kind of wind up shuffled off to a part of the company where you don't have much connection to Peter if you fuck up enough. [00:47:36] But like, he doesn't like conflict for as many sort of evil fucking confrontational things and people as this guy invests in. [00:47:45] He personally doesn't seem to have much stomach for conflict, especially not with people he likes. [00:47:50] So when it comes to like what made Clarium super wealthy, one of the things that was hugely influential in their growth was backing one of the most toxic corporations on the planet, Opti Canada. [00:48:03] Now, Opti is an Israeli-Canadian company that is involved in like taking bitumen and extracting oil from it. [00:48:10] And this is of all of the ways to get oil out of the ground. [00:48:13] Bitumen extraction is like the most fucking poisonous, right? [00:48:17] It is, this is the absolute worst way to get oil for the environment. [00:48:22] It is a hideously toxic thing to do. [00:48:25] And Peter and his company put a shitload of money behind this and it secures returns of like 60% for them. [00:48:31] And this is the period Peter is very much anti-kind of climate change. [00:48:36] I don't know how much he actually sounds like he's pro-climate change. [00:48:39] He's very pro-climate change. 
[00:48:41] Yeah. [00:48:41] I think he's anti-anti-like, I think he would say anti-the ideology of climate change, right? [00:48:47] And one of the things that's happened here is right around like 2006, seven, he, he, Elon Musk and David Sacks fund a movie called Thank You for Smoking, which I actually just watched the day that Biden dropped out of the election. [00:48:59] It still holds up. [00:49:00] Yeah. [00:49:01] It's a good movie. [00:49:02] It's a fucking, what's his name? [00:49:04] The guy who played Two-Face in the new in the Chris Nolan Batman movies. [00:49:08] That guy, yeah. [00:49:10] Yeah, him. [00:49:10] Yeah. [00:49:11] He was a disturbingly handsome guy. [00:49:12] Incredibly like right at his, Aaron Eckhart, right at his like the peak of him being a handsome, charming son of a bitch. [00:49:18] It's a good movie. [00:49:20] Like it's easily the best thing that those three guys were ever involved with. [00:49:24] It's based off a book by William F. Buckley's son, which is Rhodesia lover, William F. Buckley. [00:49:32] But it's a good book. [00:49:34] It is extremely libertarian and it is extremely early aughts libertarian. [00:49:40] And it's one of those things, I think if you take the ideology that the book's characters have completely seriously, then it's a lot less enjoyable. [00:49:49] But it's impossible to really do that when you're watching it because there's just so many talented people involved. [00:49:54] And it's a good script. [00:49:56] Again, Aaron Eckhart is just soaking up the screen. [00:50:01] And you've got fucking J.K. Simmons is kind of the antagonist. [00:50:05] There's a lot of great people in that movie. [00:50:07] Anyway, go watch Thank You for Smoking. [00:50:09] It holds up. [00:50:10] Like, believe me, people. [00:50:12] But you also, as you watch it, think like this is pretty true to Peter's actual unironic beliefs about politics in the early 2000s, right?
[00:50:20] As opposed to just like, well, this is a fun movie about like these absolutely amoral merchants of disinformation, right? [00:50:27] Right. [00:50:28] Yeah. [00:50:29] Come on, stop being so serious. [00:50:31] Wise up. [00:50:32] Have fun. [00:50:33] Right, right, right. [00:50:34] We're never going to wind up behind power. [00:50:36] Yeah. [00:50:37] Yeah. [00:50:41] Peter's putting money into thank you for smoking and bitumen extraction. [00:50:46] And he's also kind of, this is the period where he's really started to relish being the famous founder guy here. [00:50:54] And he gets more open about everything in his life. [00:50:56] And I'm going to quote from Max Chafkin's book, The Contrarian here. [00:51:00] He began telling close friends and then co-workers that he was gay, socializing at bars or on the roof of his new house often with the handsome young men he was hiring, many of whom were out. === Apocalypse Preacher Persona (10:21) === [00:51:08] Thiel's self-actualization would pay off in August 2007, four months before the recession began and close to a year before most Americans realized the economy was collapsing. [00:51:17] He sent a letter to investors declaring that the economic expansion was officially over. [00:51:22] We've begun a long post-boom phase that can be called the long goodbye, the letter said. [00:51:27] And this is one of Peter's great successful predictions, right? [00:51:32] Which is that he calls, he starts writing in 2007 about the global financial crash that's going to really hit in 2008. [00:51:41] He is very much ahead of the curve on this. [00:51:44] A few months after that 2007 letter, right at the start of 2008, just weeks after he'd been outed by Gawker, Peter sends out a 10,000-word essay to investors.
[00:51:54] And this is another thing that he's kind of famous for as like a hedge fund guy, is he will periodically write these massive political and philosophical, sometimes even religious essays and send them out to all of his funders, right? [00:52:06] To kind of explain their philosophy at the moment. [00:52:09] And a lot of this is like, you gotta have a lot of confidence in your fucking dorm room ass musings if you're sending this kind of shit out to people who are trusting you to invest their money. [00:52:17] I'm guessing a lot of these wound up just kind of like thrown into the trash. [00:52:21] But so he predicts correctly that there's this crash coming, but he also, I think because just of who he is, he overcatastrophizes, right? [00:52:32] The 2008 crash was really bad. [00:52:34] He sees that coming. [00:52:36] He also thinks it's definitely going to cause a depression, right? [00:52:39] That no one is going to bail out anything and that the cycle of collapse will continue absolutely unabated. [00:52:46] It will go on like a runaway sort of freight train kind of deal, right? [00:52:50] And so instead of doing what would have been the smart thing as an evil investor, which is shorting the housing market, right? [00:52:57] Have your people end their risky positions with like companies that owe a lot of money and fucking short the housing market, right? [00:53:04] In order to make money off of what you can in the immediate term and kind of avoid the consequences of this cruel winter coming. [00:53:11] Peter, he kind of continues in this like apocalypse preacher persona and states that he is, quote, recommending prayer and repentance in lieu of investment analysis, which is an insane thing to write to investors. [00:53:26] Yeah, he's like, you should all repent to Christ. [00:53:28] We're dude. [00:53:31] That's unbelievable. [00:53:33] Was he on a sandwich board? [00:53:35] Is he what? [00:53:36] Sorry?
[00:53:38] Yeah, he's basically doing like a fucking sandwich board kind of thing, right? [00:53:41] Like the end is nigh. [00:53:44] Is there any chance? [00:53:45] Come on. [00:53:46] Was there any chance that he was just, that was just a bit or he was just making a joke there? [00:53:50] I think he's being, I think maybe there's an element of that, but he doesn't, his, what he does financially is also what you would do if you legitimately expected total collapse, right? [00:54:01] Like he does initially short the dollar and there are initial like high yields, right? [00:54:07] Like he bets against some companies that are taken up by large loans and their yields in the first half of 2008 go up to like five times their prior rate, right? [00:54:16] Kind of at the height of this, they've got between $6.4 and like $8 billion under management, right? [00:54:21] And this is a fund that back in like 2002 or three was 260 million, right? [00:54:26] So you can see why people are like, wow, this is the future of investing. [00:54:29] What a genius Peter is. [00:54:30] And he looks like a genius in kind of the early stages of that financial collapse. [00:54:34] But again, we're just talking about the first half of 2008 here. [00:54:38] Now, he described his school of thought on these matters as being a global macro investor, which in his terms meant looking out at world events and basing your economic predictions on kind of the vibe you felt about the times and large, as opposed to the specific situation each of those companies was in. [00:54:57] He urged investors at Clarium to make one trade per week, which Chafkin credits to his combination of indecisiveness and high tolerance for risk. [00:55:05] Quote, Thiel argued that the world was heading to end times. [00:55:08] Investment analysts often employ religious metaphors, speaking of the second coming of bond yields or an equities apocalypse, but Thiel was not speaking metaphorically. 
[00:55:17] The entire human order, he wrote, could unravel in a relentless escalation of violence, famine, disease, war, and death. [00:55:23] Against this future, it is far better to save one's immortal soul and accumulate treasures in heaven in the eternal city of God than it is to amass a fleeting fortune in the transient and passing city of man. [00:55:33] And when you read it like that, it is kind of hard to see that as not. [00:55:36] Like, I don't know, man, if that's just totally tongue-in-cheek, you're really going far with it. [00:55:42] Yeah, no, that is deep into the bit. [00:55:44] I mean, you are really committing you're far too committed to this bit, and it's not a great bit. [00:55:49] Yeah, that is a whole Life of Brian. [00:55:52] Yeah, kind of right, right. [00:55:54] Holy shit. [00:55:55] That's so weird. [00:55:56] Yeah. [00:55:57] This is this is a deeply weird guy. [00:55:59] Yeah. [00:55:59] It's such a fucking strange fella. [00:56:02] It's pretty weird when like the young blood transfusions and like bankrolling the, like, roided-out wrestler's lawsuit over the nudes are like sort of the bottom-tier weird things you do. [00:56:17] Like the really weird stuff is the stuff he says out in the open to his own investors. [00:56:22] Yeah. [00:56:22] What the hell? [00:56:23] He's so strange. [00:56:26] Oh man. [00:56:27] Oh my God. [00:56:28] I'm not going to be as rich, I personally may not be as rich as I might have been before. [00:56:34] My like Wall Street buddies aren't going to be able to rapaciously divide up loans the way they were before. [00:56:41] Society is doomed. [00:56:42] We're all fucked. [00:56:44] If you hate people and you fundamentally think that they're like messy little scum who need to be ruled, you can't imagine things would just go bad if you're scared about a financial collapse. [00:56:56] You have to imagine they are on the edge of eating each other, right?
[00:56:59] Because they're not any better than animals, right? [00:57:02] And I'm not trying to shit talk animals. [00:57:04] I'm just saying I think that's how Peter thinks about things, right? [00:57:07] I think animals are much better than people usually. [00:57:10] But that's, I think, I think I'm accurately describing. [00:57:14] That's how I think Peter feels. [00:57:16] I'm basing that off of vibes. [00:57:17] Like Peter was basing thinking about the collapse of the world, right? [00:57:22] But I have as good a record with vibes as Peter does, at least. [00:57:26] So yeah, yeah. [00:57:28] Here's some ads.
[00:58:53] Kids, teenagers, I think they will need a lot of guardrails around AI. [00:58:56] This is such a powerful and such a new thing. [00:58:58] From addiction to acceleration. [00:59:01] The world we live in is a competitive world, and I don't think that's going to stop, even if you did a lot of redistribution. [00:59:05] You know, we have a deep desire to excel and be competitive and gain status and be useful to others. [00:59:12] And it's a multiplayer game. [00:59:14] What does the man who has extraordinary influence over our lives have to say about the weight of that responsibility? [00:59:20] Find out on Mostly Human. [00:59:22] My highest order bit is to not destroy the world with AI. [00:59:25] Listen to Mostly Human on the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows. [00:59:33] Hey, I'm Nora Jones, and I love playing music with people so much that my podcast called Playing Along is back. [00:59:39] I sit down with musicians from all musical styles to play songs together in an intimate setting. [00:59:44] Every episode's a little different, but it all involves music and conversation with some of my favorite musicians. [00:59:50] Over the past two seasons, I've had special guests like Dave Grohl, Leve, Mavis Staples, Remy Wolf, Jeff Tweedy, really too many to name. [00:59:59] And this season, I've sat down with Alessia Cara, Sarah McLaughlin, John Legend, and more. [01:00:04] Check out my new episode with Josh Grobin. [01:00:07] He related to the Phantom at that point. [01:00:10] Yeah, I was definitely the Phantom in that. [01:00:12] That's so funny. [01:00:13] Shar each day with me each night, each morning. [01:00:22] Say you love me. [01:00:25] You know I. [01:00:26] So come hang out with us in the studio and listen to Playing Along on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [01:00:34] What's up, everyone? [01:00:35] I'm Ago Moda. 
[01:00:36] My next guest, you know, from Step Brothers, Anchorman, Saturday Night Live, and the Big Money Players Network. [01:00:43] It's Will Farrell. [01:00:47] My dad gave me the best advice ever. [01:00:50] I went and had lunch with him one day, and I was like, and dad, I think I want to really give this a shot. [01:00:55] I don't know what that means, but I just know the groundlings. [01:00:58] I'm working my way up through and I know it's a place that come look for up and coming talent. [01:01:02] He said, if it was based solely on talent, I wouldn't worry about you, which is really sweet. [01:01:07] Yeah. [01:01:07] He goes, but there's so much luck involved. [01:01:10] And he's like, just give it a shot. [01:01:11] He goes, but if you ever reach a point where you're banging your head against the wall and it doesn't feel fun anymore, it's okay to quit. [01:01:20] If you saw it written down, it would not be an inspiration. [01:01:22] It would not be on a calendar of, you know, the cat just hang in there. === Betting On Initial Successes (16:06) === [01:01:29] Yeah, it would not be. [01:01:31] Right, it wouldn't be that. [01:01:32] There's a lot of luck. [01:01:34] Listen to Thanks Dad on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [01:01:42] In 2023, former bachelor star Clayton Eckard found himself at the center of a paternity scandal. [01:01:49] The family court hearings that followed revealed glaring inconsistencies in her story. [01:01:54] This began a years-long court battle to prove the truth. [01:01:58] You doctored this particular test twice, Ms. Owens, correct? [01:02:01] I doctored the test once. [01:02:03] It took an army of internet detectives to crack the case. [01:02:06] I wanted people to be able to see what their tax dollars were being used for. [01:02:10] Sunlight's the greatest disinfectant. [01:02:12] They would uncover a disturbing pattern. [01:02:15] Two more men who'd been through the same thing. 
[01:02:17] Greg Oespi and Michael Marancine. [01:02:19] My mind was blown. [01:02:21] I'm Stephanie Young. [01:02:22] This is Love Trap. [01:02:24] Laura, Scottsdale Police. [01:02:26] As the season continues, Laura Owens finally faces consequences. [01:02:31] Ladies and gentlemen, breaking news at Americopa County as Laura Owens has been indicted on fraud charges. [01:02:37] This isn't over until justice is served in Arizona. [01:02:42] Listen to Love Trapped Podcast on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [01:02:51] And we're back. [01:02:57] Anyway, that's all very weird that Peter is so kind of married to the fucking Bible as a hedge fund guy. [01:03:05] But the weirdness does match. [01:03:06] There's a real insight here, right? [01:03:08] In that Peter, again, he's going to fuck up on taking advantage of this. [01:03:13] He extends it too far. [01:03:14] I also think his fundamental analysis is correct, which is he argues in that paper that investors are going to be unable to, as the housing market comes unwound and as these increasing contradictions in the way our economy is set up become impossible to ignore. [01:03:29] And I think, although Peter won't admit this, climate change is a big part of that. [01:03:32] Investors are going to be unable and unwilling to accept that things can't continue growing at the rates that they'd always been growing, right? [01:03:40] That that's not possible. [01:03:41] And rather than accept the inevitability of contraction or even collapse, they will start a process of massively overvaluing every asset systematically, causing an endless cascade of bubbles in every sector. [01:03:55] And that is what happened, right? [01:03:57] That's today. [01:03:58] That's the last 20 years, right? [01:04:00] Like he's not fundamentally wrong, but he also overextends how bad it's going to be and how quickly it's going to be that bad, right? 
[01:04:08] And the other bad move here is that if you are a hedge fund guy, even though I think this is fundamentally not incorrect analysis, it's a bad thing to put out to the people investing money with you that you think the end times are coming, right? [01:04:23] That does not make people want to keep money with you. [01:04:26] It doesn't make them want to invest more money with you. [01:04:28] It kind of makes them want to build bunkers and maybe feel like they need some of that money liquid to build bunkers, right? [01:04:34] Right. [01:04:35] Yeah. [01:04:36] So Peter starts to panic increasingly later in 2008 as the signs get worse. [01:04:41] He holds meetings to warn employees that he thinks every brokerage in the country is going to go under and there's going to be no currency. [01:04:49] And, like, he literally has his company making sure they have at least a couple grand or a thousand or something on hand for every employee, so that he can keep his employees fed if all of the currency collapses. [01:05:00] They're talking about, like, buying gold bricks. [01:05:02] Like, this is apocalypse-hoarder nonsense. [01:05:07] Like, Peter is worried that his employee best friends, who are his entire social group, are going to starve to death, and he has to make sure he has cash to pay for their food. [01:05:19] Which is actually kind of sweet. [01:05:21] Like, it does show Peter is to some extent capable of caring about other people, if that's accurate. [01:05:28] Not in a way that makes him a good guy, Noah. [01:05:31] But like, that is... [01:05:32] There's a degree of care there. [01:05:34] I mean, I think it's like, who else will live to serve him? [01:05:37] Right, right. [01:05:39] I need to have cash so I can buy food, so I can maintain a degree of control over these other people, so they have to continue being my friends. [01:05:45] Maybe that's it. [01:05:46] Did they do anything else? [01:05:47] Did they learn kung fu? [01:05:49] Did they start shooting?
[01:05:50] Did they have to hoard food? [01:05:52] You have to assume there's some stuff. [01:05:56] I can't imagine that you could believe this about the future and not be buying guns and stuff, right? [01:06:03] Now, Peter's strong belief that the tech bubble is going to burst and cause a depression causes him to change course. [01:06:08] So, like, Clarium's standard operating procedure is that we're going to bet against the future stability of the U.S. economy. [01:06:15] And in the first half of 2008, that makes them a fuckload of money, right? [01:06:19] And in the second half, it's going to cost them everything, right? [01:06:24] So because Peter's word is law, very little is expected of his workers on a day-to-day basis. [01:06:28] So life at the company is pretty chill during these early apocalypse stages. [01:06:32] According to Chafkin, people played a lot of chess and spent their free time debating how they'd run a theoretical country if they had the freedom to build it from the ground up. [01:06:40] Quote, everyone spent a lot of time talking politics, although it was important that those politics always be of the right-wing variety. [01:06:46] An employee told me that it was common to hear talk about climate change denial and to see web browsers open to VDARE, a far-right website with a long record of publishing white nationalist writing. [01:06:57] Oh, we'll be talking about that. [01:06:58] It gets a lot worse in terms of VDARE's shit, my mans. [01:07:03] This has been a good enough strategy for years, but Peter's inherent distrust of tech businesses is going to cause him to miss a lot of opportunities here, as well as his belief that, like, collapse is inevitable. [01:07:13] He turns down a chance to invest in Tesla, which might be understandable given his history with Elon, right? [01:07:20] If I can be like, well, I get why you would miss this, because, like, you know what a mess he is, right?
[01:07:25] But undoubtedly, it's a bad financial decision to not get involved with Tesla in 2008, right? [01:07:33] He also turned down the chance to put money into YouTube when it was still a startup. [01:07:38] And that is a catastrophic miss. [01:07:40] As a tech founder, missing YouTube is a big one. [01:07:44] That's also particularly funny because then later on he'll fund Rumble, the right-wing YouTube. [01:07:50] Yeah, right. [01:07:50] Yeah. [01:07:51] So I couldn't get YouTube, but I got something even better. [01:07:54] I got right-wing YouTube with Russell Brand. [01:07:58] And right around the same time he neglects to invest in YouTube is when he neglects to take part in, like, the second funding round for Facebook. [01:08:05] So he misses a lot in a row here. [01:08:07] And then 2008 comes about. [01:08:09] Right. [01:08:09] And after these initial successes, he bets on an idea. [01:08:13] So he's made money in the first half of 2008, the collapse year, by betting against the dollar. [01:08:20] But then he has this belief that, as the collapse starts to run away, not only are some banks going to collapse, every bank is going to be nationalized, right? [01:08:31] And because he thinks that the government is going to have to take over all of the banks... I think this is just because he's also a libertarian, so his nightmare is, you know, the government taking over everything. [01:08:41] He decides, unlike all of the guys who made money off of the collapse, unlike all the big short guys, he decides not to short any banks, because he thinks that since the government's going to nationalize them all, once they get nationalized, their stock value is going to soar.
[01:08:56] And so, after predicting the collapse, he puts nearly a billion dollars into stocks in all of these banks, and, like, another one and a half billion or so into Google, and, of all of the places to put, like, 800 million dollars, fucking Yahoo. [01:09:11] He puts 800 million dollars into Yahoo. [01:09:13] Oh my God, that's awesome. [01:09:17] It's such a whiff. [01:09:18] It's such a fuck-up. [01:09:21] Oh my God. [01:09:22] It's so funny, man. [01:09:24] It's so fucking funny. [01:09:25] That's incredible. [01:09:26] And again, as a general rule, compared to most of our guys, I actually respect Peter, you know, not in a positive way, but just in a, like, it's-dangerous-not-to way. [01:09:36] But I also had thought he had been much more successful than this. [01:09:40] And it's so interesting to me that he fucks up on these big investments while getting the bigger picture kind of fundamentally right, which is so humanizing, right? [01:09:52] Because that's such a human thing to do. [01:09:53] I've been there myself, where, like, you predict a big thing correctly, and then your micro response to that, you fuck up because of who you are, right? [01:10:04] That's so human. [01:10:06] Yeah. [01:10:07] But still, I mean, "I think all banks are going to fail, therefore I'm going to put a billion dollars into them" is wild. [01:10:16] That is "the government's obviously going to nationalize every bank," right? [01:10:19] Yeah. [01:10:19] Yeah. [01:10:20] That's really sniffing too much of your own glue. [01:10:23] It is. [01:10:23] And it's this belief that, like, the, you know, the fucking Democrats who are going to come into office are legitimately communists. [01:10:32] They would never do the capitalist thing of propping up the banks and just giving these people free money to paper over their fuck-ups, right?
[01:10:41] Obviously, they're going to make this bid for ultimate power and nationalize everything, right? [01:10:47] And, nope, that's not what they did. [01:10:49] Turns out they are just owned by bankers. [01:10:52] Yeah, sorry. [01:10:53] Sorry, Peter. [01:10:54] You guys won too much and now you've lost your money. [01:10:57] Just because you call every Democrat a socialist does not mean that, in fact, that's what they are. [01:11:02] Not what they are. [01:11:04] Not what they are. [01:11:06] Good job. [01:11:08] So at the end of the year, by the time Obama takes office, Peter has lost billions, putting him in a similar bucket to people who had failed completely to see that the crash was coming. [01:11:18] After being up, like, five times in early-to-mid 2008, by the end of the year, he's down 5%, right? [01:11:26] And then, to make matters worse, once the collapse hits, he dumps a shitload of his holdings while stock prices are low, because he assumes a depression is coming, nothing is going to rebound, and he needs cash. [01:11:39] And then all of this shit does rebound. [01:11:41] And so he misses out twice in a row here. [01:11:44] Investors begin to abandon Clarium, which had been worth almost $8 billion, I think, at its height, and by the end of the year was down to $2 billion. [01:11:52] Like, by 2009, it's, like, a quarter of what it had been at its height, right? [01:11:55] Which is a major fuck-up, you know? [01:11:57] Yeah. [01:11:59] So, yeah, fascinating stuff. [01:12:02] Now, this is where we also get back to Gawker, right? [01:12:05] Around 2008 or so here. [01:12:08] I stated earlier that Ryan Holiday argues Peter responded to Gawker's confrontational tabloid exposé style, right? [01:12:16] Which he saw as a danger to weird geniuses like him who moved the species forward. [01:12:20] Chafkin makes a somewhat different argument.
[01:12:23] Quote, Thiel came to believe that the real reason for the mass redemptions (which is, like, all of the people taking their money out of Clarium) was Gawker Media. [01:12:32] Some of Clarium's big investors, according to former employees, were Arab sovereign wealth funds controlled by governments that considered homosexuality to be a crime. [01:12:40] Thiel has never explicitly acknowledged this, but he has hinted at why he may have wanted to keep his sexual orientation out of view. [01:12:46] So a bunch of people pull their money out of his fund in this period where he is making repeated fuck-ups, and he blames it on Gawker outing him, right? [01:12:57] I invest $800 million in Yahoo of all places. [01:13:01] In Yahoo, in 2008, homie? [01:13:04] Yeah. [01:13:04] What? [01:13:05] And, like, somehow that's Nick Denton's fault. [01:13:07] Yeah, Nick Denton made you put a billion dollars into Yahoo, brother. [01:13:12] I let my own weird, like, coming to steal your underwear. [01:13:17] Yeah, and so technology warped my brain. [01:13:20] And somehow that's Valleywag's fault. [01:13:22] Almost a billion dollars into Yahoo; like, almost a billion dollars into Google. [01:13:27] Google, that's probably, definitely a smart long-term investment. [01:13:29] But, like, Yahoo? [01:13:31] Yahoo. [01:13:32] In 2008, I was 20, and I knew that Yahoo was a bad investment. [01:13:36] Yeah, no, Yahoo is crazy. [01:13:40] That's so funny. [01:13:42] That's so fucking funny. [01:13:45] Oh, man. [01:13:46] Oh, my God. [01:13:47] So I think the other reason he's angry at Gawker here is that Gawker is reporting on a lot of his fuck-ups too. [01:13:53] And I think in such a way that he kind of blames them for why investors start pulling out, right? [01:13:58] Gawker revealed our screw-ups publicly in a way that hurt us, right? [01:14:02] And that, by the way, again, is this kind of view that, like, oh, they outed him and so he destroyed them.
[01:14:09] That, you know, makes the case for, like, the dangers of a petty billionaire seem clearer. [01:14:15] But this is much more standard evil-rich-guy stuff. [01:14:18] They damaged my business interests by reporting on me screwing up. [01:14:22] And so I wanted them out, right? [01:14:23] That's perfectly normal rich-guy evil bullshit, right? [01:14:26] Yeah. [01:14:28] So this period, though, the end of the Bush years and the start of the Obama era, while he is apparently burning with rage at Gawker, it's not a wholly bad time for him either. [01:14:37] Because, again, he's super rich, so he's insulated personally from any real consequences. [01:14:42] And while his Clarium investments fail and he stops being, like, the lauded hedge fund genius of the new generation, Peter sees success in another venture, which had been launched based on the Igor software he'd been developing. [01:14:56] I've heard a couple of different versions of the story. [01:14:58] One is that Levchin, his co-founder buddy from PayPal, codes it. [01:15:03] The Intercept's reporting says it was another guy at PayPal who coded it. [01:15:07] I don't know which one of them coded it, but Igor is this software that they had started at PayPal to stop Russian scammers, right? [01:15:13] There are allegations and lawsuits that it was kind of stolen in part from another company, but that's a story for another day. [01:15:21] Peter is as obsessed with security and fighting terrorism as any neocon. [01:15:25] And he starts to focus on the idea that Igor might be useful for something besides fighting fraud. [01:15:31] Perhaps this could also allow the government to hunt and kill the terrorists that had caused Peter to fear for his own life. [01:15:38] I'm going to have to go back in time here. [01:15:40] And I hate to jump around like this, but Peter's involved in too many things to not do this from time to time.
[01:15:44] So let's go back to July of 2004, right? [01:15:48] When he and a conservative chaplain from Stanford organize a six-day seminar with René Girard, the scapegoat-philosopher guy. [01:15:56] Thiel had attacked the Bush administration then for not being cruel enough to Muslims and had gone after the ACLU for their unhinged support of civil liberties at the expense of security. [01:16:05] He had encouraged the creation of a new system, which he billed as a replacement for the UN. [01:16:10] He was like, instead of the UN, we should have an international consortium of public and private intelligence companies all working together to, quote, forge the decisive path to a truly global Pax Americana, right? [01:16:22] This American peace that intelligence agencies can bring us, if we give them enough money and power to murder people with drones. [01:16:32] It's so wild, yeah. [01:16:34] And Igor, at this point, is just a fancy way of doing what I call making a crazy board, right? [01:16:39] That thing you see in, like, movies and TV, where you've got all the pictures connected by bits of string, right? [01:16:45] Igor is a way of doing that on the computer, right? [01:16:47] Where you're plugging things in, and you can also have it pull from records: oh, I know that we're looking for a guy who lives in this state, drives a blue car, and has a DUI. [01:16:57] I want you to pull up, from all of these records you have access to, everyone who fits that, right? [01:17:00] And we can add them to the crazy board, you know?
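The "digital crazy board" workflow Robert describes, pulling everyone who matches a set of traits out of a pile of records and then linking records that share an identifying field, can be sketched in a few lines. This is a made-up illustration of the general pattern, not Palantir's actual code; every field name and record below is invented.

```python
# Toy sketch of the "crazy board" idea: filter records by attributes,
# then link records that share an identifying field (the bits of string).
# All field names and data here are invented for illustration.

def match(records, **criteria):
    """Return every record whose fields match all of the given criteria."""
    return [r for r in records if all(r.get(k) == v for k, v in criteria.items())]

def shared_links(records, key):
    """Group records by a shared field (e.g. a phone number) to find links."""
    groups = {}
    for r in records:
        groups.setdefault(r.get(key), []).append(r["name"])
    # Only groups with two or more people are interesting links on the board.
    return {k: v for k, v in groups.items() if k is not None and len(v) > 1}

records = [
    {"name": "A", "state": "CA", "car": "blue", "dui": True,  "phone": "555-0100"},
    {"name": "B", "state": "CA", "car": "blue", "dui": False, "phone": "555-0100"},
    {"name": "C", "state": "NV", "car": "blue", "dui": True,  "phone": "555-0199"},
]

# "Lives in this state, drives a blue car, has a DUI" -> only record A.
suspects = match(records, state="CA", car="blue", dui=True)
# A and B share a phone number, so a string gets drawn between them.
links = shared_links(records, "phone")
```

The real systems do this over far larger, messier data with fuzzy matching, but the filter-then-link shape is the core of what the quote above calls connecting the pictures with string.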
[01:17:03] And it was like, look, starting at 9/11 especially, there was, like, the fantasy, the uber-fantasy of all these spook and spook-adjacent types, and all the tech types that wanted to make money off of them, that, you know, you could have sort of the equivalent of maybe what we'd call an AI today that could basically predict terrorists were going to strike before they struck. [01:17:28] You know, they were going to stop the next 9/11 before the guys had even, you know, really hatched the plan. === Palantir And The CIA (12:18) === [01:17:36] And so there had been a bunch of programs that started and failed, you know, before then. [01:17:41] And, you know, there were some that were in existence in law enforcement, but they're clunky. [01:17:46] They were, you know, government software. [01:17:48] So they look shitty and, you know, they didn't work so well. [01:17:53] And they didn't have real Silicon Valley talent like these guys did. [01:18:00] Yeah. [01:18:01] So I think it could be a little hard for people to visualize what Igor did, what this software does, this software that becomes Palantir. [01:18:08] So I'm going to quote from an article in Bloomberg by Peter Waldman, Lizette Chapman, and Jordan Robertson here. [01:18:14] The software combs through disparate data sources, financial documents, airline reservations, cell phone records, social media postings, and searches for connections that human analysts might miss. [01:18:24] It then presents the linkages in colorful, easy-to-interpret graphics that look like spider webs. [01:18:29] So, based on this technology, Peter founds Palantir that same year, 2004. [01:18:35] And Palantir is the name of the infamous seeing stones in The Lord of the Rings, the most famous of which is owned by the evil wizard Saruman, right?
[01:18:46] So if you're naming your company, which exists to provide this, like, seeing stone to the highest bidder, Palantir, that's explicitly evil, right? [01:18:56] This isn't a case where, like, the good guys have their own seeing stones. [01:19:00] It's just the bad guys. [01:19:01] It's a bad thing to have. [01:19:03] It connects you, inevitably, to Sauron. [01:19:06] Like, anyway, very fun. [01:19:09] I love fiction. [01:19:11] Now, one of the things that's interesting about fucking Palantir is that, like with most of his companies, Peter has an interest in this, but he also has one of his close friends actually running shit, being the day-to-day guy organizing everything. [01:19:25] And Peter's friend who helps him run Palantir is famously always described as his most liberal friend, a guy named Alex Karp, who Bloomberg identifies as a self-described neo-Marxist. [01:19:38] Now, I don't know about Alex, you know, like, what the fuck do you mean? As the guy starting the capitalist spy firm, how can you consider yourself a neo-Marxist? [01:19:47] But also, some Marxists are bootlickers. [01:19:49] So maybe that's the kind of guy that Alex Karp is. [01:19:52] I mean, look, plenty of communists had spy agencies, right? [01:20:00] Yeah, yeah, yeah, yeah. [01:20:01] Or no, what I was going to say is, you know, whatever the far-left equivalent of a limousine liberal is, those people are definitely out there. [01:20:07] Yeah, a fucking CIA socialist or something like that, I guess. [01:20:10] Yeah. [01:20:11] Okay. [01:20:11] There you go. [01:20:12] Keep going. [01:20:12] Yeah, there you go. [01:20:14] So getting off the ground is slow going at first for Palantir. [01:20:18] I mean, this is a hard thing. [01:20:19] It's a really hard industry to break into, right? [01:20:21] Because intel agencies are first and foremost government agencies.
[01:20:25] And if you know anything about the government, getting a government agency to adopt a new software tool is a brutal thing to do, right? [01:20:32] It is very hard to convince them to make moves like that. [01:20:36] It doesn't matter what kind of agency it is. [01:20:38] This is always an uphill struggle. [01:20:40] So in order to aid Palantir in getting the buy-in they needed to really start to take off, Peter brought in some of the most ghoulish neocons he could find to apply pressure in DC. [01:20:50] And this included friend of the pod John Poindexter. Old JP had worked for Ronald Reagan until he was convicted of lying to Congress about Iran-Contra. [01:21:01] He then got kind of quote-unquote exonerated and gets hired by Dick Cheney to craft the Total Information Awareness program. [01:21:10] This was a Bush-era intelligence mission with a seemingly kind of reasonable goal, right? [01:21:15] We're going to collect all of the data we can on everything happening, food prices, yada yada yada, in these different areas that we have military interests in. [01:21:24] And we're going to have algorithms comb that data to spot patterns that might be indicative of terrorist groups operating beneath the surface, right? [01:21:33] It's one of those things that sounds reasonable on its face. [01:21:37] If you look at how the war on terror went, none of this ever works out quite as well as they think it does, right? [01:21:43] But the smart guys in the room are all like, obviously, this is how we win the war on terror, back then. [01:21:48] Yeah, I mean, that was the whole shit. [01:21:50] That was the whole thing for these guys. [01:21:52] Yep. [01:21:53] It was going to be, you know, what's the Tom Cruise movie? [01:22:00] Minority Report. [01:22:01] Right. [01:22:01] Yeah. [01:22:01] We're going to know about it before they do. [01:22:04] Yeah. [01:22:04] Yeah.
[01:22:04] And they were going to, you know, feed the information to the precogs, and they were going to feed you a red ball. [01:22:11] That was the whole thing there. [01:22:13] And yeah, Palantir got every single last one of the, you know, members of this, like, weird spying church to advocate for them. [01:22:26] The other thing was, like, you couldn't get on a train in DC without there being a fucking Palantir ad. [01:22:31] Oh, yeah. [01:22:33] Yeah. [01:22:33] And, like, every guy that, you know, fucking hated Muslims and loved Star Wars was promoting this. [01:22:43] How crazy. [01:22:44] What a brand they have. [01:22:45] Yeah. [01:22:49] So they bring out John Poindexter. [01:22:51] Now, it's important here to note that Igor, which is what Palantir is selling at this point, didn't gather or create new information of its own, right? [01:22:59] This is not a Big Brother system. [01:23:01] This is an algorithm that works off of the extant Big Brother system, right? [01:23:06] Organizing all of the info that the existing surveillance operations can put at your disposal, right? [01:23:12] It's data mining, you know. [01:23:14] Now, one early concern is that analysts using Igor would use the vast array of cataloged data at their disposal to stalk and harass their ex-girlfriends, which, if you know how cops work, is a thing that happens constantly. [01:23:25] Anytime you've got a database that, like, a certain chunk of employees have access to, some of them are going to use it to stalk their girlfriends, right? [01:23:33] This is a known issue. [01:23:35] Palantir, one of the ways in which they sell themselves to a lot of these intel agencies is that Alex Karp promises: we're going to make it impossible for cops to stalk their ex-girlfriends, because we're going to log every search request made into the system in a way that will allow you to audit them, right?
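The audit-log pitch Karp is making here is a simple, common pattern: don't block any query, just record who ran it and when, so it can be reviewed later. A minimal sketch of that pattern, with all class names and data invented for illustration (this is not Palantir's actual design):

```python
import datetime

class AuditedIndex:
    """Toy search index that records who searched for what, and when.
    The point of the pattern: access isn't blocked, just logged for audit."""

    def __init__(self, records):
        self.records = records
        self.audit_log = []  # every query lands here, hit or miss

    def search(self, analyst, query):
        # Log the query before serving it, so even fruitless or
        # flimsy-pretext searches leave an attributable trail.
        self.audit_log.append({
            "analyst": analyst,
            "query": query,
            "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })
        return [r for r in self.records if query.lower() in r.lower()]

idx = AuditedIndex(["Jane Doe, Portland", "John Roe, Austin"])
idx.search("agent_47", "jane")           # the search succeeds...
idx.search("agent_47", "ex-girlfriend")  # ...and the flimsy one is logged too
# idx.audit_log now holds both queries, attributed to the analyst.
```

As the Chafkin quote below notes, logging alone is exactly this weak: the query still runs, and deterrence depends entirely on someone actually reading the log.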
[01:23:50] So, I mean, part of their selling point is, like, we're going to actually provide accountability in the spook process, right? [01:23:56] Chafkin describes this as a key part of the company's pitch, but he also writes: one of Palantir's former engineers recalled meetings during which government clients would suggest trying to use the database to look up an ex-girlfriend immediately after hearing the whole privacy spiel. [01:24:09] Palantir employees would never object to these requests, this person said. [01:24:13] Instead, they would remind clients that searches were logged and then allow them to look up whoever they wanted, no matter how flimsy the pretext. [01:24:20] Yeah, that's how that always works, huh? [01:24:22] Yeah, every time. [01:24:25] Every time. [01:24:26] Every time. [01:24:26] You don't need a predictive algorithm to predict that one, do you? [01:24:31] Yeah, no, you sure don't. [01:24:32] You sure don't. [01:24:33] Yeah. [01:24:33] You give people a computer spying device and they are going to stalk their girlfriends. [01:24:38] Yeah, that's the worst. [01:24:40] Now, thanks to their famous friends and infinite cash reserves, Palantir managed to get contracts with the CIA and the NYPD. [01:24:47] But actual adoption on a wide scale wouldn't happen without corporate purchases, because no one in intelligence trusted a product that only the government used. [01:24:55] Peter tried to force the government's hand here by selling Igor to hedge funds and other finance companies, right? [01:25:03] And he marketed it as Palantir Finance. [01:25:05] So: we've got this software that the government's interested in using. [01:25:08] It's the spy software, but you can use it to, like, predict which kinds of investments are going to work out best, right? [01:25:14] By gathering all of this data and using it to predict the future of finance.
[01:25:18] Now, this is a massive failure as an actual finance product, because it doesn't work very well. [01:25:24] People don't really make a lot of money off of Palantir Finance advising their trades, but it works out as a business decision, because they're able to get a couple of different finance companies to buy into it. [01:25:34] And then they can go back to the government and say, hey, look, this company and that company are already using it to make trades. [01:25:40] Obviously, DC should be using this to fight terrorism, right? [01:25:43] And the government's like, oh, well, some bank bought it. [01:25:45] So I guess we should as well. [01:25:47] Yeah. [01:25:48] Yeah. [01:25:49] And, I mean, look, in government, they're constantly being like, oh, we're falling behind. [01:25:54] You know, the private sector really knows what it's doing; they wouldn't make a big fuck-up. [01:25:58] Yeah, exactly. [01:26:00] Yeah. [01:26:01] If only we could run government more effectively, like a business. [01:26:05] Yeah. [01:26:06] And I mean, it's totally how it worked. [01:26:10] It's totally how it worked. [01:26:12] Our adversaries can use these tools freely. [01:26:15] Why shouldn't we? [01:26:16] Why shouldn't we? [01:26:17] And obviously, as soon as the public sector, as soon as, actually, the CIA and the FBI and the NYPD start putting money into Palantir, then even though Palantir Finance had kind of not done great, a lot of banks and finance agencies start to be like, oh, I guess we have to get involved; the government's buying this stuff, so we've got to buy it. [01:26:36] And so, in 2009, JPMorgan CEO Jamie Dimon, future subject of the pod, makes a contract with Palantir.
[01:26:49] Now, they're doing this as like a security measure: we have like a division in our company that's looking for evidence, based on like the communications our employees have internally, that we might have an employee who's breaking the law, right? [01:27:03] That's part of compliance, right? [01:27:05] There's a degree to which we're legally obligated to spy on our employees making investment decisions, to make sure nobody's doing anything criminal, right? [01:27:12] Sure. [01:27:13] So that's why they get this software. Almost the instant they get it, [01:27:16] their head of security, who's like using the Palantir software, uses it to spy on the entire C-suite for no real reason, right? [01:27:22] Like he loses his mind with power and starts stalking all of the people running the company. [01:27:29] This guy's name was Peter Cavicchia, and he was a former Secret Service man who ran a group in the company that was tasked with using algorithms to monitor employees. [01:27:38] And I'm going to quote again from Bloomberg here. [01:27:40] Aided by as many as 120 forward-deployed engineers from the data mining company Palantir Technologies, Cavicchia's group vacuumed up emails and browser histories, GPS locations from company-issued smartphones, printer and download activity, and transcripts of digitally recorded phone conversations. [01:27:56] Palantir's software aggregated, searched, sorted, and analyzed these records, surfacing keywords and patterns of behavior that Cavicchia's team had flagged for potential abuse of corporate secrets. [01:28:05] Palantir's algorithm, for example, alerted the insider threat team when an employee started badging into work later than usual, a sign of potential disgruntlement. [01:28:14] That would trigger further scrutiny and possible physical surveillance after hours by bank security personnel. [01:28:21] So that's nuts, but that's also what he's supposed to be doing.
[01:28:25] But right as soon as they get access to this, Cavicchia goes rogue. [01:28:29] And I'm going to continue with that quote. [01:28:30] Former JP Morgan colleagues described the environment as Wall Street meets Apocalypse Now, with Cavicchia as Colonel Kurtz ensconced upriver in his office suite, eight floors above the rest of the bank's security team. [01:28:42] People in the department were shocked that no one from the bank or Palantir set any real limits. [01:28:47] They darkly joked that Cavicchia was listening to their calls, reading their emails, watching them come and go. [01:28:52] Some planted fake information in their communications to see if Cavicchia would mention it at meetings, which he did. [01:28:58] It all ended when the bank's senior executives learned that they too were being watched, and what began as a promising marriage of masters of big data and global finance descended into a spying scandal. [01:29:10] Nice. [01:29:11] Very funny. [01:29:12] Extremely funny. [01:29:13] Oh my God. [01:29:14] So good. [01:29:15] No one deserved it more. [01:29:17] No, no, no. [01:29:18] And just like the most predictable thing that could have happened, right? [01:29:21] This is what happens every time you give these guys this toy. [01:29:24] Yeah. [01:29:25] Cavicchia gets fired, but the promise of Palantir remains undimmed, even though there is tremendous debate up to the present day as to whether or not much of what they do works. [01:29:35] This is less the case now; we'll talk about this some in the last episode. When it comes to providing targeting solutions based on data, the jury is still out on how well that's working in like Ukraine, but certainly in this period, there's a big debate: does any of this shit really work, right? [01:29:50] Right. [01:29:50] Palantir is going to take credit for the bin Laden assassination.
=== Pizza Orders As Evidence (02:49) === [01:29:54] Very unlikely they had anything to do with it, but they do kind of, they take like oblique credit for it. [01:30:01] And the software is swiftly adopted by police departments in cities like New York, Chicago, and LA. [01:30:06] Palantir's software was often used, allegedly, to single out innocent individuals because the connections in their lives looked suspicious to the algorithm. [01:30:14] And here's Bloomberg again. [01:30:16] The platform is supplemented with what sociologist Sarah Brayne calls the secondary surveillance network: the web of who is related to, friends with, or sleeping with whom. [01:30:25] One woman in the system, for example, who wasn't suspected of committing any crime, was identified as having multiple boyfriends within the same network of associates, says Brayne, who spent two and a half years embedded with the LAPD while researching her dissertation on big data policing at Princeton University, and who's now an associate professor at the University of Texas at Austin. [01:30:42] Anybody who logs into the system can see all these intimate ties, she says. [01:30:46] To widen the scope of possible connections, she says, the LAPD has also explored purchasing private data, including social media, foreclosure, and toll road information, camera feeds from hospitals, parking lots, and universities, and delivery information from Papa John's International and Pizza Hut LLC. [01:31:03] Now, this is fucked up. [01:31:07] Pizza Terror connection. [01:31:09] Yeah, no, I do think that the government should have access to our Papa John's records, but mainly to make sure, like, hey, man, you've ordered 14 extra large pizzas this month. [01:31:18] Everything okay? [01:31:19] You doing all right? [01:31:20] Do you need like a hug? [01:31:21] Can we send a guy by your house to give you a hug? [01:31:24] The proper number of Papa John's orders is zero. [01:31:27] The proper number of Domino's orders is zero. [01:31:30] Yeah.
[01:31:30] Pizza Hut, man. [01:31:31] Come on. [01:31:32] No, the proper number of pizza orders is zero. [01:31:36] No, I like a good stuffed crust. [01:31:38] I like a good stuffed crust. [01:31:40] Oh, yeah. [01:31:41] Once a year, I love a big Pizza Hut pizza. [01:31:44] You know what? [01:31:44] You know what's good, though? [01:31:45] Is those Shaq Papa John's commercials? [01:31:48] They get me every time. [01:31:49] Oh, yeah, okay. [01:31:50] No, no. [01:31:52] You can't get me into a Papa John's. [01:31:53] But you know what they don't get me to do? [01:31:55] Still eat Papa John's. [01:31:57] No, no. [01:31:58] I would eat Shaq before I'd eat a Papa John's. [01:32:01] Wow. [01:32:02] Honey. [01:32:06] He's got to have marbling, you know? [01:32:07] Yeah. [01:32:08] I've always endorsed cannibalism. [01:32:09] I'm very pro-cannibal. [01:32:10] Like one of those freaks that said they wanted to eat Moo Deng. [01:32:14] I don't know who that is, Sophie. [01:32:15] Well. [01:32:16] But speaking of eating people, let's talk about the tragic case of Manuel Rios. [01:32:20] He doesn't get eaten. [01:32:22] I hope not. [01:32:23] Manuel is a guy who grew up in East LA and had a lot of friends who joined a local gang, Eastside 18. [01:32:30] In 2016, he was seated in a parked car with a friend who'd been jumped into the gang when police rolled up. [01:32:35] His friend ran, but Rios, who had not been breaking any laws, didn't run. [01:32:39] He was like, why the fuck would I run? [01:32:41] Like, my buddy's in a gang. [01:32:42] Like, of course, he's going to flee. === Privacy Is Dead (03:27) === [01:32:43] I'm not doing anything wrong. [01:32:45] But because, you know, cops be how they are, he gets added to the LAPD's gang database, right? [01:32:52] And Palantir's software, because of how many other gang dudes he just kind of socially knows because of where he lives, identifies him as a high-priority target, and he starts getting stopped constantly by the cops.
[01:33:04] Quote, the police on autopilot with Palantir are driving Rios towards his gang friends, not away from them, worries Maria Saba, a neighbor and community organizer who helped him get off meth. [01:33:15] When whole communities like East LA are algorithmically scraped for pre-crime suspects, data is destiny, says Saba. [01:33:21] These are systemic processes. [01:33:22] And when people are constantly harassed in a gang context, it pushes them to join. [01:33:26] They internalize being told they're bad. [01:33:29] And that's kind of, you know, one of the, we will talk a lot more about Palantir even in part four, but that's like one of the really dark things about this: it's masquerading as this genius predictive system, but all it's really doing is like, oh, you live in a poor neighborhood. [01:33:44] A lot of your friends grew up to be in gangs. [01:33:45] Cops should probably fuck with you constantly. [01:33:48] You know, fuck with this guy constantly, right? [01:33:50] And that's all it is. [01:33:51] It's stop and frisk that you threw an algorithm over. [01:33:54] Yeah. [01:33:54] Or it almost reminds me of, you know, the shitty early days of like MapQuest and Google Maps, where you'd get some drivers that would just follow the directions no matter how unhinged they were. [01:34:09] Like that Office scene, right? [01:34:11] They drive down to a river. [01:34:12] Yeah. And it's like, guys, use your eyes, homie. [01:34:18] What's up? [01:34:18] Yeah. [01:34:19] What's up, man? [01:34:19] You know this. [01:34:20] Yeah. [01:34:21] And instead, they just follow, they blame the algorithm. [01:34:26] Yeah. [01:34:26] Now, if it's just going to cost you an extra 10 bucks on an Uber, that's one thing. [01:34:30] But when it costs, you know, some poor kid getting harassed over and over again, that's something totally different. [01:34:35] And that totally happened. [01:34:37] Yeah. [01:34:38] Yeah.
[01:34:38] It sure did. [01:34:40] So in 2011, Peter did an interview with Bloomberg. [01:34:44] At this point, civil libertarians, who had previously been kind of Peter's constituency, had started sounding the alarm over Palantir. [01:34:50] Peter felt a need to make the case to his fellow libertarians on the need to embrace being spied on. [01:34:56] He argued that data mining was less harmful than the quote crazy abuses and draconian policies the Bush administration had pushed after 9-11. [01:35:03] And I would desperately love to hear which of those policies you didn't agree with, Peter. [01:35:07] Right? [01:35:07] Because it seems like you think they didn't go far enough, right? [01:35:10] But he's like, look, if we want to avoid a police state, obviously you have to let me, Peter Thiel, build a surveillance state. [01:35:16] That's all that can stop us from having an evil police state, right? [01:35:19] I'm a libertarian. [01:35:21] That's why I'm selling this shit to the CIA to spy on us, because it keeps liberty. [01:35:25] A libertarian police state. [01:35:27] Yeah. [01:35:27] You know, I hadn't thought of this before, but I remember around this time, libertarians in my life, or self-professed libertarians in my life, moved from like really caring about this stuff to throwing up their hands and saying, well, privacy is dead. [01:35:40] Yeah. [01:35:41] Nothing you can do. [01:35:42] It's the 9-11 effect. [01:35:44] It's so much because a big part of this is like you have this kind of same thing that happens where you've got all these guys who in the late 90s and the early part of the Bush era had been like atheist activists who were super anti-the-religious-right. [01:36:02] And then after 9-11, they all get really racist against Muslims and it pushes them towards conservatives. [01:36:08] And it's like, oh, so you guys didn't have any principles ever. === Touch Grass And Pet A Dog (05:14) === [01:36:11] Right.
[01:36:11] Okay. [01:36:12] I get it. [01:36:12] I get it. [01:36:13] Yeah. [01:36:13] Yeah. [01:36:13] And so, and on the data side, it's like, oh, yeah, we were really for civil liberties, but now that privacy is dead, you might as well have a libertarian do it. [01:36:23] Yeah. [01:36:24] Yeah. [01:36:24] What's the difference? [01:36:25] Just let it happen. [01:36:26] Or if someone's going to spy on you, let it be Peter Thiel. [01:36:30] Yeah. [01:36:31] He's one of us at least. [01:36:33] Yep. [01:36:34] Speaking of Peter Thiel, Noah, you got anything to plug? [01:36:41] Yes, I have my new. [01:36:42] I don't know. [01:36:43] I have some. [01:36:44] I tried to come up with a lame joke there on the front. [01:36:46] No, I have nothing to plug, but you can find me at Noah Shachtman. [01:36:49] That's N-O-A-H-S-H-A-C-H-T-M-A-N at most social platforms. [01:36:58] Well, check out Noah and, you know, figure out. [01:37:05] No, I'm not going to tell people to do anything illegal. [01:37:08] Crazy board. [01:37:09] Find out who I'm connected to. [01:37:11] Yeah, make a crazy, go make a crazy board. [01:37:13] Go make a crazy board. [01:37:14] Put me at the center. [01:37:17] May I also suggest instead: touch grass and pet a dog. Or a combined touch-grass, pet-a-dog, make-a-crazy-board-on-your-wall. [01:37:25] Stop hanging out with your friends. [01:37:27] Cut off all contact with your family. [01:37:30] Live alone in a dark room. [01:37:31] Just try to be more like Matthew McConaughey in True Detective, right? [01:37:36] Just, like Matthew McConaughey, thank your daughter for dying and sparing you the sin of being a father. [01:37:41] Do all that good, good Matthew McConaughey in True Detective stuff. [01:37:45] Have fun with it, buddy. [01:37:46] Just have fun with it. [01:37:47] Also, don't eat. [01:37:48] Do not eat. [01:37:49] Do not eat anything but amphetamines.
[01:37:51] Nothing but amphetamines. [01:37:52] Yeah, cigarettes. [01:37:53] All right. [01:37:53] Just chew them up. [01:37:55] Make a cigarette shake every morning. [01:37:58] Okay. [01:38:02] Behind the Bastards is a production of CoolZone Media. [01:38:05] For more from CoolZone Media, visit our website, coolzonemedia.com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [01:38:14] Behind the Bastards is now available on YouTube. [01:38:17] New episodes every Wednesday and Friday. [01:38:20] Subscribe to our channel, youtube.com/@BehindtheBastards. [01:39:37] In 2023, Bachelor star Clayton Echard was accused of fathering twins, but the pregnancy appeared to be a hoax. [01:39:45] You doctored this particular test twice, Miss Owens, correct?
[01:39:48] I doctored the test once. [01:39:50] It took an army of internet detectives to uncover a disturbing pattern. [01:39:55] Two more men who'd been through the same thing. [01:39:57] Greg Gillespie and Michael Mancini. [01:39:59] My mind was blown. [01:40:01] I'm Stephanie Young. [01:40:02] This is Love Trapped. [01:40:03] Laura, Scottsdale Police. [01:40:05] As the season continues, Laura Owens finally faces consequences. [01:40:10] Listen to the Love Trapped podcast on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [01:40:17] 10-10, shots fired, City Hall building. [01:40:20] How could this ever happen at City Hall? [01:40:22] Somebody tell me that. [01:40:24] A shocking public murder. [01:40:25] This is one of the most dramatic events that really ever happened in New York City politics. [01:40:31] They screamed, get down, get down. [01:40:33] Those are shots. [01:40:35] A tragedy that's now forgotten. [01:40:38] And a mystery that may or may not have been political, that may have been about sex. [01:40:42] Listen to Rorschach: Murder at City Hall on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. [01:40:52] I'm Laurie Segall, and this is Mostly Human, a tech podcast through a human lens. [01:40:56] This week, an interview with OpenAI CEO Sam Altman. [01:41:00] I think society is going to decide that creators of AI products bear a tremendous amount of responsibility for the products we put out in the world. [01:41:07] An in-depth conversation with the man who's shaping our future. [01:41:10] My highest order bit is to not destroy the world with AI. [01:41:13] Listen to Mostly Human on the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows. [01:41:22] This is an iHeart podcast. [01:41:24] Guaranteed human.