Timcast IRL - Tim Pool - Trump LOSES IT Learning Iran Leader IS GAY | Timcast IRL w/ Bryan Callen & Liv Boeree Aired: 2026-03-16 Duration: 02:31:46 === Iran Supreme Leader Scandal (01:44) === [00:02:25] So Iran's new supreme leader is gay. [00:02:27] And apparently, when Trump found out, he and the administration started busting out laughing. [00:02:32] Apparently, one senior administration official has been laughing about it for days. [00:02:36] And I'll tell you what I really think. [00:02:38] This is a PSYOP meant to cause harm to the reputation of the new Supreme Leader of Iran before he actually starts running this country. [00:02:46] Because, you know, you can't be gay in Iran. [00:02:49] Now, the truth is the guy might be dead. [00:02:52] No one actually knows. [00:02:53] They've not seen him. [00:02:53] There are reports that he was flown to Moscow for surgery. [00:02:56] He may have been maimed or in a coma. [00:02:58] His leg may have gotten blown off. [00:03:00] Apparently, he was in the building when they bombed it. [00:03:02] They killed Ali Khamenei, the Supreme Leader, but this guy was outside in the garden and may have survived. [00:03:07] We don't know for sure. [00:03:08] The one thing we can say is he didn't show up to his own coronation. [00:03:12] He wasn't there. [00:03:13] And the only statement he's released is written, so many people think he's actually dead. [00:03:18] Without confirmation, I guess the West's play is just to call him gay so that the people of Iran are like, I don't want to follow that guy, which is honestly kind of clever. [00:03:27] And it's lame. [00:03:29] It's great. [00:03:30] Oh, someone is gay, at least. [00:03:31] At least our leader isn't gay. [00:03:33] I think it might work. [00:03:34] It's Iran, though. [00:03:35] So that's apparently a big story. [00:03:40] And then I actually think substantially more interesting is that Cuba's power is completely out. [00:03:44] Their grid has failed totally, and the U.S. 
is about to come in and, quote unquote, save them. [00:03:51] So it looks like Cuba's going to be falling back into the Western fold. [00:03:53] If this plays out, sanctions will likely be lifted and trade will be normalized, which all in all, I think, is actually a good thing coming off of what happened in Venezuela. [00:04:01] And then, of course, the escalation of the war in Iran. [00:04:04] And then Megyn Kelly upset that Mark Levin has a micropenis and they're fighting about it. === Cuba Sanctions and Trade (02:49) === [00:04:10] I'm not even kidding. [00:04:11] Welcome to whatever. [00:04:13] I don't know. [00:04:13] The world of clicks. [00:04:15] The world of clicks. [00:04:16] He gets it. [00:04:16] Before we get started, my friends, we got a great sponsor for you. [00:04:19] It is Beam Dream. [00:04:20] Make sure you guys head over to shopbeam.com/TimPool and pick up your nighttime blend to support better sleep. [00:04:28] I absolutely love Beam Dream. [00:04:30] I drink it every single night. [00:04:31] It is a delicious cup of hot cocoa you drink before bed that helps you sleep. [00:04:34] It's got magnesium, reishi, melatonin, L-theanine, all the good stuff. [00:04:37] And it is a delicious cup of hot cocoa. [00:04:39] They got a bunch of flavors. [00:04:40] They got cinnamon cocoa. [00:04:41] They got chocolate peanut butter. [00:04:43] They got sea salt caramel. [00:04:44] Actually, that one's my favorite. [00:04:46] The cinnamon cocoa was my favorite for a while, but I like the sea salt caramel one now because it's not a cocoa, right? [00:04:50] 15 calories, no added sugar, and legit. [00:04:53] It helps me sleep better. [00:04:54] My sleep score has improved, and I'm a huge fan of this product. [00:04:57] When they first reached out to us, I said, you know, I don't think I need this, but you know, we're hoping it's fine. [00:05:01] I love this stuff. 
[00:05:02] And then I started drinking it every night before bed, and legit, my sleep score started improving dramatically. [00:05:07] So for guys, listen up. [00:05:09] Your testosterone and HGH are produced in REM sleep and deep sleep. [00:05:12] So if you're not getting good sleep, you're going to be irritable, tired, and fat. [00:05:15] And I don't think you want to be like that. [00:05:17] So check out shopbeam.com/TimPool and you can get up to 40% off. [00:05:23] Don't forget to also go to Timcast.com, join the Discord, make this possible. [00:05:28] There's tens of thousands of people hanging out every single day. [00:05:30] They're building new projects. [00:05:31] There's pre-shows, after shows, but more importantly, community is our strength. [00:05:35] If you guys want to help change the world, you've got to connect with other people to build those projects. [00:05:40] And at the same time, you're helping support the work that we do. [00:05:42] So smash that like button right now and share the show with everyone you know if you want to support the work that we do. [00:05:47] You already noticed we have a great guest here. [00:05:49] It's Bryan Callen. [00:05:50] Thank you, ladies and gentlemen. [00:05:52] Good to see you, buddy. [00:05:52] Absolutely. [00:05:53] Why are you here? [00:05:54] Yeah. [00:05:55] What's that? [00:05:55] Who are you? [00:05:56] What do you do? [00:05:56] Oh, I'm just a comic. [00:05:58] I'm just a man. [00:05:59] I like saying simple things. [00:06:00] You know what I mean? [00:06:01] Just a man. [00:06:01] I get from point A to point B the best way I know how. [00:06:04] Sometimes it's dangerous. [00:06:05] Facts. [00:06:06] That's to the point. [00:06:07] It's saying things like that. [00:06:08] I just want to be that guy one time. [00:06:10] You know what I mean? [00:06:10] I want to be the guy who's like, that's a tale for another time, my friend. [00:06:15] You know those guys who have got scars and a million stories? 
[00:06:18] And speckled whitish gray beards. [00:06:20] Yeah, yeah, yeah. [00:06:21] Exactly. [00:06:21] You're not telling the story. [00:06:22] You can use that whenever you want. [00:06:24] I could, actually, I could. [00:06:25] Play it off. [00:06:26] I just want to say wise things like, a tree grows as fast as a tree grows. [00:06:29] Doesn't it? [00:06:31] Do you hear him? [00:06:32] You use a lot of ellipses when you talk, right? [00:06:34] Yes. [00:06:35] Indeed. [00:06:36] We do have another guest. [00:06:36] It's Liv Boeree. [00:06:37] Hello. [00:06:38] Yeah, she's way more interesting. [00:06:39] Who are you? [00:06:40] What do you do? [00:06:41] I used to be a pro poker player for a long time, which is how we met. [00:06:46] Yeah. [00:06:46] Playing poker. [00:06:48] And now I don't even know how to describe what it is I do. [00:06:51] I kind of do research and make content around the intersection of game theory, risk, and technology. === Poker, Game Theory, and AI (15:07) === [00:06:59] So talk a lot about AI and how to make us not be in these race-to-the-bottom spirals. [00:07:06] Whatever job title that is, I don't know. [00:07:07] I feel like you could rob a bank. [00:07:09] She could rob a bank with a bag. [00:07:10] Podcaster. [00:07:10] There we go. [00:07:11] I know. [00:07:11] Podcaster. [00:07:12] Somebody with that accent. [00:07:13] First of all, you sound very smart, and I would put all the money in a bag. [00:07:16] It's just a British accent. [00:07:17] Yeah, I know. [00:07:18] You could rob a bank, and I'd be like, of course. [00:07:20] Just say this to me. [00:07:21] Just go to Sisko. [00:07:22] Put all the money in the bag. [00:07:23] We got to open source the AI. [00:07:25] Put all the money in the bag. [00:07:27] She's going to rob a bank. [00:07:28] Watch this. [00:07:29] Do you think we should open source the AI? [00:07:31] Yes. [00:07:32] Sort of prize. [00:07:33] You should make me do a thing. [00:07:34] I don't know. 
[00:07:35] I don't know if this is like a single thing. [00:07:36] Did you know that for a long time, all of the villains in our cartoons were British? [00:07:42] Of course. [00:07:42] Like in Disney? [00:07:43] There's something about being British. [00:07:45] There is, and it depends on the British accent. [00:07:47] You're either extremely intelligent or extremely stupid, right? [00:07:50] Like you have a posh accent. [00:07:51] People are going to think you must be smart. [00:07:52] If you're like a cockney, they're gonna be like, this guy's an idiot. [00:07:55] Well, you can say things, you can say horrible things, and somehow it feels like they're being polite. [00:08:00] You can say something like, I'm going to have to fillet you now. [00:08:03] And it defends. [00:08:04] And it won't be painless. [00:08:06] Well, put all the money in a bag, innit? [00:08:09] That's more like a bag right now. [00:08:14] Right now. [00:08:14] That's serious. [00:08:15] No, it's not. [00:08:16] All right. [00:08:17] We got to talk about this news, which is probably the stupidest story I've ever seen. [00:08:21] It's from the New York Post. [00:08:22] Trump briefed that Iran's new Supreme Leader, Mojtaba Khamenei, is probably gay and president has a priceless reaction. [00:08:29] Indeed, he busted a gut. [00:08:31] Others in the room also found it hilarious and joined the president's reaction. [00:08:35] While one senior intelligence official has not stopped laughing about it for days, said one person familiar with the briefing. [00:08:42] So here's what's interesting. [00:08:43] Actually, there's been intelligence going back for quite some time that this dude might be gay. [00:08:47] Apparently, in the late 80s, early 90s, he had to fly to the UK for impotence treatment because when he got married, he couldn't get it going. [00:08:55] You know what I'm saying? [00:08:56] And so they were like, what's wrong with this young man who, for some reason, can't get it going? 
[00:09:01] And they're trying to insinuate he's gay. [00:09:04] And just, I just want to stress the juvenile South Park-esque political strategy of we need to find a way to discredit the new Supreme Leader. [00:09:15] We can call him gay. [00:09:16] They're just gay. [00:09:17] I just imagine Scott Bessent being like, that's not funny. [00:09:20] Maybe the gay thing is not funny. [00:09:21] It's true, they shouldn't bring it up. [00:09:22] It's so stupid. [00:09:23] You know what I mean? [00:09:24] But, you know, at least our leader's straight. [00:09:26] But for a country like Iran, it may be effective. [00:09:30] Well, I have to be honest, because I'm 12 years old. [00:09:33] I'm looking at his face. [00:09:34] He looks gay. [00:09:35] I know, he looks like David Cross. [00:09:37] I swear to God, now I'm like, hey, he looks kind of gay, which is so unfair, but he has a soft look. [00:09:41] You say he has a gay face. [00:09:42] Yeah, he looks like he's got a case of gay face. [00:09:44] He looks like an intellectual. [00:09:45] The show Arrested Development. [00:09:46] I mean, he looks like David Cross. [00:09:48] He's very borderline. [00:09:50] I think he's bisexual. [00:09:51] He doesn't look a little bit like an Iranian round, kind of, you know. [00:09:54] And look, look at his shirt. [00:09:55] He blew himself. [00:09:56] Oh, new wild. [00:09:59] Oh, come on, yeah. [00:10:00] Maybe he's just another nude. [00:10:02] Yeah. [00:10:03] I hope he's gay. [00:10:04] That's why he couldn't get it up. [00:10:05] I bet the staffer that keeps laughing, he's been laughing for days, is gay. [00:10:09] Trump's staffer is what I said. [00:10:11] He's like, no, guys, guys. [00:10:13] No one's laughing about it. [00:10:15] It's like seven intel guys in a room with Trump, deadpan serious. [00:10:19] We have to call him gay. [00:10:21] It's the only way to discredit him. [00:10:22] And Trump's like, do you think it'll work? [00:10:24] They're just very serious about war. 
[00:10:26] But I always wonder about what they say before that. [00:10:29] It's like, who wants to tell him? [00:10:30] Let me tell him. [00:10:30] Let me tell him. [00:10:31] You know what I mean? [00:10:32] It's so dumb. [00:10:33] Mr. President, sorry to interrupt your incredibly busy day. [00:10:36] I know you're dealing with a war and everything else, but I imagine someone just busted in. [00:10:41] Khomeini's gay. [00:10:42] I just imagine someone bust into the Oval Office and be like, you're not going to believe this. [00:10:47] No, no, no. [00:10:48] I imagine it's much more serious. [00:10:49] Like Trump is going, listen, this war is not going good for me. [00:10:52] The polls are not so good. [00:10:53] We need to get something. [00:10:55] We need to get the new government in very quickly. [00:10:57] And they're like, I think the only move we have now is to call him gay. [00:11:01] And Trump's like, suggestions? [00:11:02] Like, we call him gay. [00:11:03] And he's like, okay, make it so. [00:11:06] And that's their plan. [00:11:07] He was watching South Park and he was like, I got a great idea. [00:11:11] Yeah. [00:11:11] This is how we win the war, guys. [00:11:13] That's right. [00:11:13] When you know your leader is gay, Supreme Leader. [00:11:17] You know, he's not actually in a hospital then. [00:11:20] I think he's dead. [00:11:21] He could be gay. [00:11:22] He's not in the hospital. [00:11:23] He's out meeting boys. [00:11:24] Wait, this guy. [00:11:24] Yeah. [00:11:25] A lot of people think he's dead. [00:11:26] He's dead. [00:11:27] He hasn't shown his face for like two years. [00:11:30] He has to go out and sow his wild oats in Russia to make sure that he has a good time before he goes and actually brings it. [00:11:37] Yeah, maybe he's just a little bit more. [00:11:39] I think he's dead. [00:11:41] The rumor going around for a while now is that he was killed. 
[00:11:45] And the reason why the Iranians are still pretending like he's actually in charge is because they would have to say the supreme leader died, the second in line died, top 40 officials died, and then their government's going to collapse. [00:11:58] There's nothing left. [00:11:59] Yeah. [00:11:59] So they're like, now that he's dead, posthumously nailing with the gay card, he's not coming out to be like, I'm masculine, you guys. [00:12:07] It's just going to be all the way. [00:12:07] Here's my sex tape with a girl, guys. [00:12:09] Oh, maybe they will start deep faking sex tapes with this guy, which then will get him in the trap, which is what they always wanted. [00:12:14] It was a bunch of command form. [00:12:17] That's a good point. [00:12:18] Yeah. [00:12:20] I can't imagine being a man and realizing that the entire U.S. military and the Israeli military were trying to kill you. [00:12:29] Like, it's over. [00:12:30] Yeah. [00:12:31] You're not, they're going to find you. [00:12:33] You know, every time we see like F-35s and F-15s, we're like, yeah, but can you imagine being on the receiving end of that stuff? [00:12:39] I mean, good luck. [00:12:41] I'm surprised they haven't surrendered. [00:12:42] And maybe because they haven't had internet for 14 days, so the people have no idea what's going on. [00:12:46] I don't think they would have Starlink at this point, at least. [00:12:49] I think they probably haven't surrendered because there's an idea that if they can wait this out, right? [00:12:56] Because I think they're thinking a couple of things. [00:12:59] The only way for real regime change is boots on the ground. [00:13:02] Yep. [00:13:03] And do Americans have the stomach for that? [00:13:06] If they don't, and the Americans pull back and settle for some kind of a deal, what the command structure can then claim is that they ultimately stood up to America and Israel and won. [00:13:19] Yep. [00:13:20] And it actually consolidates their power. 
[00:13:22] So there's still profit to be had with resistance. [00:13:26] So this was a big mistake. [00:13:28] I mean, I understand the reasons for doing it. [00:13:32] A lot of people just want to say, like, oh, it's Israel, which is a small component, but not the principal reason why the West in general wants regime change in Iran. [00:13:40] The problem is, the Iranian strategy is, we're in a midterm year. [00:13:43] Trump cannot sustain a military operation for a long time. [00:13:47] And after he is forced out, either by the Democrats winning Congress or just attrition in general, like he can't sustain this economically. [00:13:59] Like you said, they're going to say we defeated Israel and the United States. [00:14:02] We held our own. [00:14:03] Trump has no choice but to make sure this is done and done quickly. [00:14:08] But apparently now the reports, they're saying it's going to last until September or longer. [00:14:12] And now Trump, there's that viral video where he's saying we need NATO assistance to go in and keep the Strait of Hormuz open, which is not a good sign. [00:14:20] But then he was like, well, we don't need the help, you know. [00:14:22] So he went back and forth. [00:14:23] I think it's, I think he's, I think it's cooked. [00:14:25] The other thing people don't talk about is that Iran sells 90% of their crude oil to China. [00:14:33] And in this AI war, which is very real, you need energy. [00:14:37] Yep. [00:14:38] You know, all these people, AI is going to take over. [00:14:40] Are they? [00:14:41] You know how much energy it takes? [00:14:43] So China is, I think they get 30% of their energy from Iran. [00:14:48] That's a very significant amount of energy. [00:14:50] 30% of their oil. [00:14:51] Yeah, that's huge. [00:14:53] You can talk about them having green energy, but they need that oil. [00:14:55] Well, no, I mean, China's got an all-in approach to energy production, and China's actually crushing the U.S. 
[00:15:01] But you point to AI, like the bottleneck isn't actually going to be chips coming up in the next couple of years. [00:15:06] The bottleneck's going to be energy production. [00:15:07] China's got nuclear power. [00:15:09] They've got the Three Gorges Dam. [00:15:11] They've got another dam they're building. [00:15:13] They've got some solar as well. [00:15:14] Yeah, they've got an all-in perspective on it. [00:15:16] The U.S. is lagging behind, and the U.S. really needs to do a lot to catch up. [00:15:21] Right now, we have the lead because we have the most advanced chips. [00:15:25] But in the future, in the next couple of years, the actual bottleneck is going to be energy production. [00:15:29] And that's kind of what China's goal is. [00:15:31] It's going to come down to manufacturing too. [00:15:32] Like ships. [00:15:33] I think we make 5% of the ships. [00:15:34] They make 40% of the world's ships. [00:15:37] Any ship that's made in China has certain requirements such that the military could take it. [00:15:42] So any luxury ship, any kind of ship that's made, China could actually commandeer from the private owner and could say, we're going to use this for military operations. [00:15:51] But that's not to say that they have a real Navy. [00:15:55] I think they have something like two aircraft carriers. [00:15:57] They need oil for a Navy too. [00:15:59] And that's a great point. [00:16:00] The oil that they do import from Iran, like that's all for the trucking industry and for their military arm. [00:16:07] So the U.S. taking that away from China or even taking the edge off, right? [00:16:12] So even if they can impact their imports by, I don't know, 10, 20%, right? [00:16:17] Like they don't get all of the oil, but it has an effect. [00:16:20] That's a big deal for China's military. [00:16:22] Plus, China's got like 20% unemployment in young men. [00:16:25] So they're in a real, real bad pickle. 
[00:16:27] And this kind of like pressure from the U.S. on the energy side, it's a real big deal in China. [00:16:34] Yeah, but also the oil they get is gay. [00:16:36] Yeah. [00:16:36] Yeah, you know, I wrote a piece on my Patreon about this. [00:16:40] The whole Venezuela and Iran in conjunction are really actually trying to push China into a direction. [00:16:47] Because Trump has a meeting with Xi, I believe, the end of this month. [00:16:51] I think it's the end of March, could be in April. [00:16:54] But the U.S. is going to walk in there. [00:16:55] Remember, the last time the U.S. and China met, China straight up said, you are not negotiating from a position of power. [00:17:02] Trump's trying to change all that. [00:17:03] When he meets with Xi, again, he's going to be like, look, all of the things that you thought before, that is not the way that it is. [00:17:10] The United States is negotiating from a position of authority. [00:17:15] Do you think this is going to, like, how bad do you think it's going to be in the midterms over this? [00:17:19] Well, actually, I'll start with, are you guys in favor of these strikes on Iran? [00:17:24] I don't know. [00:17:24] I find myself saying I don't know more and more. [00:17:26] Yeah. [00:17:27] I thought he was. [00:17:29] I don't know how to predict the ripple effect. [00:17:31] I don't know. [00:17:32] My feeling is that every time we go into a country, all gung-ho, we tend to make this big mistake, which is maybe not being as informed about the culture and the ramifications. [00:17:46] Nobody thought that these two wars in Iraq and Afghanistan would last 23 years, but they did. [00:17:51] And I don't know. [00:17:53] I think the obvious reason for the invasion of Iraq and Afghanistan was to set up the invasion of Iran. [00:17:59] When you look at where we set up all these military bases along the border, in Iraq and Afghanistan, to the west and east of Iran, yeah, we're surrounding Iran. 
[00:18:07] Well, the Israelis, a lot of people don't know the Israelis were telling the Americans to invade Iran, not Iraq. [00:18:12] Right. [00:18:13] In 2003. [00:18:16] But the issue is, Iran is mountainous, defensible. [00:18:20] They've got surface-to-air missiles. [00:18:22] The U.S. could not just go in. [00:18:24] So they needed to establish effectively a land beachhead, essentially, along Iran's western border. [00:18:31] They're also a very homogenous group. [00:18:33] They're all Shia. [00:18:34] They're all Iranian. [00:18:35] They're Persian. [00:18:36] It's not like Iraq, which had Shia and Sunni. [00:18:39] And, you know, that was a very significant divide. [00:18:41] Same thing in Afghanistan. [00:18:42] Yeah. [00:18:43] Well, in Afghanistan, [00:18:44] the Hazara, they've got the Tajik, they've got the Pashtun, the different tribes of Pashtun. [00:18:50] Afghanistan's always been a series of tribes that were always fighting with each other. [00:18:53] So it was always easy to divide and conquer. [00:18:54] Don't you just really want that cheap petro-dollar oil? [00:18:59] You know, where you as an American can be fat and not think about it. [00:19:02] And Hillary Clinton comes back from, you know, she comes back in office. [00:19:06] She's withered into a cane. [00:19:07] She's like, I want to go to war with everybody. [00:19:08] And then, but your gas is a dollar a gallon. [00:19:10] Yeah. [00:19:11] You know? [00:19:12] Hey, the 90s, you know, let's bring it back. [00:19:15] This energy generation argument might be a red herring. [00:19:18] They keep saying whoever creates the most electricity is going to win, but they're developing chips with this company Iron Lattice where they put the memory in the processor so there's no more memory bus. [00:19:28] It's 10 million times less electricity to run programs. 
[00:19:32] So it could be, it could be like what they're doing with the oil is they're trying to control the energy and prevent others from doing it. [00:19:40] If we go like fusion power, everyone's got infinite. [00:19:42] If these machines all start requiring 10 million times less, everyone goes infinite. [00:19:46] But it's like, whoever does it first, it kind of doesn't matter who's got electricity at that point. [00:19:51] It's just who has the dominating force and intelligence first. [00:19:54] And then they stomp and clear the rest. [00:19:58] That's right. [00:19:58] They swallow everything up. [00:20:00] Not to take your point for granted, but I think the U.S. is still planning for the modern architecture to be what is used moving forward. [00:20:13] Because even if you're right about the chips, it's going to take some time to get those chips into production and get them out in the quantities that they need. [00:20:19] I mean, AI takes an entire warehouse full of GPUs to be able to do the processing that it needs. [00:20:27] So I don't disbelieve what you're saying about the chips, but it's going to take time to actually make them in enough quantities to have the kind of data processing centers that AI needs. [00:20:37] Liv, I wanted to get your response to the question earlier too about Iran. [00:20:40] Like, what's your take on this? [00:20:42] I mean, I was largely informed by just all my Persian friends who were desperate for Trump to step in because they're just seeing like thousands and thousands of their people being slaughtered, right? [00:20:56] By a regime that they fundamentally hate. [00:21:00] But of course, like, you know, this is me speaking to people in the diaspora who aren't necessarily representative of the people who live within Iran. [00:21:08] And who knows the amount of propaganda coming out on both sides. 
[00:21:12] But from what my experience was, like I said, I spoke out a little bit about some of the slaughter of the protesters that happened in sort of January and February. [00:21:24] And I've never received more messages of like thanks from what seemed like legitimate Persian people, like anonymous, well, not anonymous, but people I didn't know, being like, thank you so much for speaking out about this. [00:21:35] Everyone thinks that we're happy under this thumb of Islam, and we are so desperate for them to go. [00:21:42] And then you look at all the people celebrating. Those aren't fake videos, people celebrating in the streets. [00:21:47] When the strikes happened and, you know, the original Khamenei died, people were over the moon. [00:21:54] So I mean, I ultimately am being sort of guided by what the people who live there, or the people who have family living there, are saying, and they were ecstatic about this. [00:22:04] Now, but what's the long-term consequences? === Foreign Actors Manipulating US (15:46) === [00:22:06] Of course, you know you have to. [00:22:07] You know nature abhors a vacuum. [00:22:10] So what are we going to put in place so that it doesn't turn into another Iraq or another Afghanistan? [00:22:15] But on the point about the protests and the celebrations, consider an Iranian watching BLM protests. What do you think their influencers are telling them? [00:22:26] You know, saying on their podcasts, they're saying, when I called out the Trump government and highlighted the protests, I was getting messages. [00:22:35] People were saying, thank you so much for highlighting this. [00:22:37] The people of America deeply hate their government and want it overthrown, [00:22:41] but no one can stop them. 
[00:22:42] So the issue is, the people of America, yeah, the leftists who are marching for BLM and throwing Molotov cocktails, those are the people that are going to message the Iranians saying we need your help. [00:22:54] So, the messages you get are going to be from the activists and the establishment. [00:22:58] Yeah, it's only something. [00:22:59] I'm not saying they're very important. [00:23:00] I'm not saying it's one for one. [00:23:01] I'm just saying, consider the PSYOP, the propaganda, the manipulation attempts. [00:23:04] Yeah, of course. [00:23:04] I'm not, yeah, but at the same time, it's just like we have to go off the evidence that is available. [00:23:08] And, like, I don't know, every single person I've spoken to, they were just sad because they were like, this is going to cause a lot of bloodshed. [00:23:15] But they're just like, in the long run, we are not an Islamic country. [00:23:18] We never were. [00:23:19] We were colonized by this ideology. [00:23:22] They treat us like cattle, basically, and they have to go. [00:23:28] And they're willing to, you know, pay in blood for that to go. [00:23:32] It's a true theocracy. [00:23:34] And 70% of the population, I think, roughly is under 30. [00:23:37] And they are seeing what the world is doing. [00:23:40] You know, and they want to be part of the world. [00:23:41] And they've been an international pariah forever. [00:23:44] And a lot of it has to do, sure, they sponsor different proxy armies, blah, blah, blah. [00:23:48] A lot of people do. [00:23:49] But I think, really, think about it. [00:23:50] Living under that theocracy is oppressive. [00:23:53] Having lived, you know, I lived in the Middle East and I lived in Saudi Arabia for three years as a kid. [00:23:58] And the mullahs had a lot of power. [00:24:00] And, you know, things are kept pretty strict. 
[00:24:03] So if you're somebody like us and you want to talk, you're an artist, you want to express yourself. [00:24:08] Think about the bottled-up frustration. [00:24:10] You're just not allowed to express yourself. [00:24:12] You're not allowed to do anything that doesn't fall within. [00:24:16] Because one of the things about the Quran, especially in a country like Iran, at least they try, is that having separation of church and state within an Islamic country is very difficult, because the Quran is really a blueprint for how to run everything from your marriage to even banking. [00:24:32] And a lot of people don't know that. [00:24:34] So it's very difficult to kind of like enjoy the kinds of liberties that Western democracies do with all our problems, all our warts and everything else. [00:24:44] So there is a fundamental difference. [00:24:45] I think the frustration is very real. [00:24:47] And I do think those crackdowns are not fake. [00:24:48] I agree. [00:24:49] I'm just pointing out the propaganda, the PSYOPs, because I think if you actually look at the global effect of what's going on in Iran, there's not very many Americans fleeing to Iran for comfort. [00:24:59] There are a great deal of Iranians fleeing Iran all over the world to get away from the oppression. [00:25:03] Exactly. [00:25:03] So that's the easiest way to look at it. [00:25:05] So when you hear these stories of, you know, I think it's important to consider the propaganda is my point, but you can look at the real-world effects. [00:25:11] And the left often says America is oppressive and awful, yet everyone in the world is trying to get in. Exactly. [00:25:17] I know. [00:25:18] Because it's great. [00:25:19] Yep. [00:25:19] When they said that those 10,000 protesters, you know, that you mentioned a month ago, were getting killed in the street. [00:25:24] First thing I was like, well, I think we should obliterate that government if they're going to do that. 
[00:25:30] And then the second thought was this could be all fake news. [00:25:33] And I sat there, like, paralyzed in this strange state. And like, as a military commander, if I had been the commander, all those people would have died on my watch, if they really died. [00:25:44] And those, that was a vanguard to overthrow that government from the inside. [00:25:47] Now they're dead. [00:25:47] I don't even know if they were real. [00:25:49] And does anyone know for sure if any protesters got killed? [00:25:52] I think they do know for sure. [00:25:53] Yeah. [00:25:54] I think there's a lot of evidence. [00:25:55] And I think there's video as well. [00:25:57] Yeah, the thing about PSYOPs is that I think you can make the argument it was a false flag, which is really hard to pull off in Iran when we're not there or attacking them yet. [00:26:08] But you're not going to be able to pull off grand claims if it didn't happen. [00:26:12] Right. [00:26:12] So usually when you get these claims of an atrocity or whatever, one side may exaggerate for political purposes, but you're not going to be able to just lie and claim a bunch of people died if they didn't die. [00:26:26] Okay. [00:26:28] And there are so many people, like Persian people who have family members still, like who are in the diaspora whose families live there, who either know someone who died or had a friend of a friend die, like by huge numbers. [00:26:40] All in the last month? [00:26:41] Yes. [00:26:42] The frustration that's not. [00:26:44] I mean, maybe the numbers are exaggerated, but I think it's far crazier to claim that nobody died when, you know, there are lots of people putting tons of effort into trying to establish the numbers now. [00:26:58] What is the range of the numbers? [00:27:00] Is it between 5,000 and 10,000 or is it between 10,000 and 70,000 or even 500 and 1,000? 
[00:27:06] We don't know. Maybe that's the harder thing to pin down, but to say that nothing happened at all and it was all made up just seems completely ridiculous. [00:27:13] And there are literally people saying, I know this person and they died. [00:27:15] They are no longer with us. [00:27:16] I think I know where this is all going. [00:27:18] And we're going to segue. [00:27:20] We're going to dip a little into the AI stuff. [00:27:22] So we're just talking right now about psychological operations, the protests in Iran, who died. [00:27:29] I think one of the biggest problems the U.S. is facing right now is that our social media is inundated with foreign actors with foreign political agendas to manipulate the people of the United States so the U.S. government will do their bidding. [00:27:40] And I know a lot of people immediately just say, oh, Israel is doing it. [00:27:43] Well, you know, Israel is, but a lot of other countries are as well. [00:27:46] The direction I see this going is mandatory IDs for internet usage. [00:27:51] Elon already implemented it on X. [00:27:53] I say Elon, but X already implemented it as a company. [00:27:56] You can click someone's profile and see what country they're from. [00:27:59] And this exposed a bunch of Bangladeshis masquerading as Native Americans. [00:28:03] Really? [00:28:03] Yeah, and they were woke indigenous rights activists. [00:28:06] But they were Bangladeshis, and it was predominantly Bangladesh, some Pakistan, because they make money doing it. [00:28:12] They know that the rage bait will get clicks. [00:28:14] It'll make them money. [00:28:15] But then you've got to take a look at the fact that there are a lot of personalities that may be for or against the American military industrial complex plans or whatever. [00:28:24] Here's my prediction for the future. [00:28:28] They're already talking about needing IDs to log in. [00:28:31] It's been a thing that's been brought up for quite a long time.
[00:28:33] Discord is talking about facial scans and ID requirements and things like this. [00:28:38] You will have foreign actors locked out, right? [00:28:42] A lot of people have complained that X allows the Ayatollah, or allowed him, to spread propaganda on the platform, but you knew it was him. [00:28:51] The bigger question is if they've got cyber command, like their cyber army, going up on our social media platforms and then spam-blasting comments. [00:28:59] And I'm going to tell you this. [00:29:02] There was a story earlier today about a judge blocking RFK Jr.'s vaccine changes. [00:29:09] And so I commented, judges are the supreme authority of the nation, just as the Founding Fathers intended. [00:29:16] Anybody who speaks English knows that the extreme language that I use indicates sarcasm. [00:29:22] And anybody who knows the function of our government and checks and balances knows it was a joke. [00:29:25] I got a response from a guy who looks like an American who said, this is incorrect, Tim. [00:29:31] The Founding Fathers established three branches of government to keep balance between the three, with no one being greater than the other. Which no American, in my opinion, would actually say, because it's first grade, it's kindergarten-level stuff. [00:29:43] So there's two scenarios I see. [00:29:45] A foreigner. [00:29:46] So another example is, I once made a tweet that said, Israel has never done anything wrong because Israel is the nexus of morality. [00:29:54] If Israel does it, it is good. [00:29:57] Clearly sarcasm. [00:29:58] I tweeted it. [00:29:59] And I got responses from people that were taking it literally and saying things like, you know, you're hiding the yarmulke or whatever. [00:30:08] My theory on that is these are foreign actors who don't speak English. [00:30:13] So they can't detect sarcasm. [00:30:15] When you click translate and it converts English into whatever language, they don't see my joke.
[00:30:19] They see me saying something like, Israel is a force for good and we support Israel. [00:30:24] They don't actually see what I meant. [00:30:27] Indeed. [00:30:27] And then the point about the judges being the supreme authority: either it's AI that can't understand a joke, but AI actually can grasp that, so I think these are foreign individuals clicking translate or using a translator, not understanding the context. [00:30:41] And this is how you kind of weed them out. [00:30:43] This means, and I think it's fair to say, many people, and you pick which side, left, right, or otherwise, are being heavily influenced by foreign financing. [00:30:52] And guys, we've heard the reports about Israel paying $7,000. [00:30:56] The truth is it's not $7,000. [00:30:58] That was just an average based on how much they had spent throughout the year. [00:31:01] But there were individuals who all of a sudden, on a dime, were just pro-Israel. [00:31:04] So I think there's probably truth to that. [00:31:07] But I also think it's fair to say that we have foreign cyber armies that train people explicitly to run 50 accounts at once and blast you. [00:31:17] I think in the future, they are going to mandate that you have an ID. [00:31:20] And we already see this somewhat with X Premium, right? [00:31:23] You've got to prove who you are. [00:31:24] And if you don't have Premium, you're getting a second-tier thing. [00:31:27] This is phase one. [00:31:29] Sounds good. [00:31:29] Do you like that? [00:31:30] Sounds like a good idea, I guess, on the surface. [00:31:33] Pros and cons. [00:31:33] Pros and cons. [00:31:34] There's two arguments here. [00:31:36] One is that dissent can only happen if people are allowed to have anonymity. [00:31:42] Like, the Founding Fathers used pseudonyms. [00:31:45] They knew that if they spoke out against the Crown or British Parliament, they could be hanged for treason.
[00:31:51] So they had to lie about who they were and then disperse these messages. [00:31:54] At the same time, the Founding Fathers did not have our adversaries. [00:31:58] Like, imagine if the Barbary nations, the Barbary pirates, had the internet and were convincing the people in America that they actually weren't pirates, that we were the pirates attacking them. [00:32:08] And then our government didn't establish the Marines, Jefferson didn't go and do these things. [00:32:13] The challenge is, ultimately it comes down to: all is fair in love and war. [00:32:20] And at a certain point, you have to choose to use power or die. [00:32:26] Now, I don't know where that point is. [00:32:28] Maybe it's now, maybe it's not. [00:32:30] But we've lived in this classically liberal mindset for a long time, and those of us who have been fairly moderate or even right-leaning have been crushed by the far left, who have no respect whatsoever for our classical liberal sensibilities. [00:32:45] And I don't mean politically liberal, I mean philosophically classically liberal. [00:32:49] And then we're getting run over by foreign adversaries manipulating our social media. [00:32:53] The question is, at what point do we decide to just slam the fist on the table and say, we're locking this down? [00:33:00] You need to prove who you are if you want to be in our spaces, because we don't want the Chinese cyber army manipulating us. [00:33:07] And we're not going to allow Marxists to give kids sex changes. [00:33:11] Otherwise, you just keep saying, well, we have to be fair and allow them to do it because it's free speech. [00:33:15] But then eventually you cease to exist. [00:33:16] Right. [00:33:17] Because it's a cyber attack, essentially, is what it is. [00:33:21] It's the same thing, right? [00:33:22] You know what I worry about? [00:33:24] I really worry that we are losing our belief in the ideal.
[00:33:31] And I'll even use the word "brand" of America. [00:33:34] What I mean by that is this. [00:33:35] I'm old enough to remember, when I was growing up, we really believed that America ultimately was trying to do the right thing. [00:33:42] What I mean by that is that when we went into a country, you can look, it was Iraq, it was Afghanistan, it was, for that matter, Vietnam. [00:33:49] The idea, at least, behind it was: we are fighting for democracy, for individual liberty, for freedom of speech, for all these things that America stands for. [00:33:58] It was, in a way, the fabric of being an American that we were the good guys. [00:34:04] And that was very real for me as I grew up. [00:34:07] Because I do think that, for the most part, our leaders, and certainly our soldiers, to an extent still believe that. [00:34:14] You hear it right now with Iran, with the idea that these protesters and the people need to rise up, bring democracy and stuff like that. [00:34:21] But there's a cynicism in America, and we've earned it to a large extent. [00:34:26] You can start with our distrust in institutions. [00:34:29] That probably happened with the Catholic Church and how they never came to terms with the amount of pedophilia. [00:34:34] We can keep going with how many different institutions have been corrupted, especially the fourth estate, the media, which seems to have become more interested in playing to its echo chamber and to ratings. [00:34:47] So it wasn't really about the truth or objective reality anymore. [00:34:50] And I really do worry about young people, and you just did it, you were like, I don't know what to believe. [00:34:54] That's a huge problem. [00:34:56] And I get it. [00:34:57] Because you're like, hey, wait a minute. [00:34:58] How do I know I'm not being gamed? [00:35:00] And I really do think that we cannot go into countries like Iran and just use language like, we're starving the Chinese of oil. [00:35:09] This is good for America because we need hegemony.
[00:35:12] That's not American ultimately, because then there's no difference between America and Russia, America and China. [00:35:19] We have to fight for an ideal. Even if we're embracing it in a fake way, we're a brand, and people do come to this country for all those things that we take for granted. [00:35:30] I half agree. [00:35:32] You know, if we just say, we want to cut off China, and it sounds strategic and militarized, yeah, that's no good. [00:35:39] But I also think that, you know, back in the Bush era when he was like, they hate us for our freedoms, me and all my friends rolled our eyes. [00:35:46] That doesn't make sense. [00:35:47] There's something, there's a reason. [00:35:49] Dude, I lived in the Middle East. [00:35:50] I remember saying that. [00:35:51] I was there for eight years of my life. [00:35:53] Was it the Predator drones flying over their homes every day that freaked them out? [00:35:56] I was like, they don't hate our freedoms. [00:35:58] But I will say this. [00:35:59] I will say this. [00:36:00] I believe substantially more Americans would support the war in Iran if Trump was honest about the function of the liberal economic order. [00:36:10] Now, here's the thing. [00:36:12] When you're pitching something to somebody, you've got to aim for the lowest common denominator. [00:36:16] You're not going to go to someone and explain, you know, what the Council on Foreign Relations has on its website. What you are going to say is this, and listen, I think this pitch would actually work for the most part. [00:36:25] And I will say this to the American people right now, and you don't have to agree with it. [00:36:29] Gas prices and products remain cheap in the United States because we point guns at other countries and say, you will trade oil in the U.S. dollar or you will die. [00:36:40] Now, by all means, that sounds immoral and horrifying.
[00:36:44] And our presidents have done horrifying things to maintain that. [00:36:48] Do you want to spend $10,000 for a laptop, or do you like your $1,000 laptop? [00:36:52] Well, I don't know if I think that that's the case. [00:36:55] I think the petrodollar is the petrodollar because the one economy, the one country that's stable, the one place you know things won't go totally haywire, at least for most of our existence, but certainly for the past 70 years, has been the United States. [00:37:11] If you invest in property... Well, because we do have the biggest guns, right? [00:37:15] But we also, however, have done a very good job. [00:37:19] And we have to give ourselves credit for keeping this democracy alive, and checks and balances, and Madison and John Jay and Alexander Hamilton, those geniuses. [00:37:31] There should be statues to those guys. [00:37:33] They tore them down. [00:37:34] I know. [00:37:34] And they solved the political problem. [00:37:36] But listen, the issue primarily is that we do not produce enough for our economy to make sense. [00:37:44] Other countries have to buy U.S. dollars before they can buy oil, which means they're promising to give us their debt, their labor, if they just want to buy the oil. === Global Homogenization vs Democracy (14:38) === [00:37:53] We effectively own all of the world's oil, but there's an exchange for this. [00:37:58] You will be able to trade freely without fears as we police the seas and police the oceans, and we are going to get everything in order. [00:38:03] Trump wants the Suez. [00:38:04] He wants Panama. [00:38:05] He wants Greenland. [00:38:06] He wants to control the waterways so he can fulfill this promise. [00:38:08] He wants to get oil to China because he wants China to get back on the dollar, because they're getting off of it. [00:38:12] He's negotiating with Russia: get back on the dollar.
[00:38:14] He's going to the Saudis and saying, what do you want us to do? [00:38:17] You want us to bomb Iran? [00:38:18] Because the Saudis got off the petrodollar contract. [00:38:21] If we lose the petrodollar, the standard of living for the average American is going to drop by 80%. [00:38:27] We do not export nearly enough to maintain the level of luxury we have. [00:38:32] However, we are the world police. [00:38:34] That's effectively what we do export, whether you agree with it or not. [00:38:37] So I said this back in 2016 with Trump and Hillary, the message Trump had, the message Hillary had. Hillary Clinton was asked about a no-fly zone in Syria, which she advocated for, and was told explicitly that that would be a declaration of war with Russia. [00:38:49] Russia has a naval base in Tartus. [00:38:51] They have jets. [00:38:51] They have planes. [00:38:52] If we said no one can fly anymore, we're declaring war on Russia. [00:38:54] And she said she didn't care. [00:38:56] And I said, listen, my friends. [00:38:58] Do you like your dollar slice? [00:39:00] Do you like your dollar slice and the free pop? [00:39:03] That's because we are the global hegemon. [00:39:06] We are the unipolar power. [00:39:08] We can make all these countries do what we want. [00:39:10] Saddam Hussein wants to trade oil in euros? [00:39:12] Boom, he's dead. [00:39:13] Okay. [00:39:13] Muammar Gaddafi wants to trade oil in gold dinars? [00:39:16] He's gone. [00:39:17] We came. [00:39:18] We saw. He died. [00:39:19] That's what Hillary Clinton said. [00:39:21] That's the machine state. [00:39:22] You live comfortably and in ignorance, like a fat guy floating around in WALL-E, so long as the U.S. maintains its domination of these other countries. [00:39:32] You go for Trump. [00:39:33] Trump wants to secure our borders. [00:39:35] He wants to bring manufacturing back, and he wants to bring back grit and hard work. [00:39:38] And a lot of fat cats in D.C.
who make money through the rotating of assets and resources through these NGOs, they don't want that. [00:39:47] It's not a guarantee that the Trump world is going to bring back manufacturing or do these things, but his worldview is: cut off this offshoring and these free trade agreements, bring the auto factories back, do tariffs. [00:39:58] Americans will get back to hard work and we will be a strong nation, not an international bombing nation. [00:40:05] And the reason why I think a lot of people are mad, or I would say my principal argument here, is I've advocated for that worldview of building up American culture. [00:40:14] Americans should have kids, teach their kids good values, everything you described about being the good guys. [00:40:19] And now Trump is going, eh, we're going to bomb Iran. [00:40:21] Like, you know what? [00:40:22] To get the economy good, it's so much easier just to take the oil from somebody else. [00:40:26] So again, the simple thing that I'm trying to say is, I think this war would get a lot more support if Trump said it, and they've glazed it a little bit. [00:40:34] They've crop-dusted close, but not quite: short-term pain for long-term gain. [00:40:40] We want to stabilize oil trade. [00:40:43] Just be honest with the American people. [00:40:44] And if it doesn't work, well, then too bad. [00:40:47] We told Iran: fall in line with the petrodollar. [00:40:50] Stop putting pressure on the Strait of Hormuz. [00:40:53] Stop threatening your Gulf State neighbors. [00:40:55] Stop arming militias and the Houthi rebels who are bombing civilians, and you're fine. [00:41:00] And they don't want to do it. [00:41:02] So if you want to live clean and comfortable, if you want cheap computers, cheap cars, and cheap gas, then you want a unipolar global United States power. [00:41:10] The tough sell is getting people to accept making people your vassals when you're standing for freedom. [00:41:17] That's the challenge.
[00:41:19] That's interesting. [00:41:19] Yeah. [00:41:20] That's a really interesting point. [00:41:21] You could vassalize the planet and then establish freedom, like freedom within a perimeter, which is what we have already. [00:41:26] You have to have an outward-facing military, inward-facing freedom. [00:41:30] So we could set that up. [00:41:31] You just got to convince people that that's the plan. [00:41:33] And actions speak louder than words. [00:41:35] I think the reality is, the reason why we do the freedom narrative, the truth is, lowest common denominator is how you sell. [00:41:44] Have you ever seen those Just For Laughs Gags comedy videos on YouTube? [00:41:49] There's no talking. [00:41:50] It's a laugh track, and all of the gags are done without words. [00:41:54] And they get massive viewership because someone from China, India, someone from Madagascar can watch that and get the joke. [00:42:01] So if you want to convey a message to most people, what's going to work? [00:42:05] They kill protesters. [00:42:06] They're evil and they hate our freedom. [00:42:09] And, you know, you're going to cut off the intellectuals. [00:42:11] You're going to cut off the moderates, but you're going to get 60% of the disinterested and ignorant masses. [00:42:17] It's worth noting that most of the countries, not every country, but most of the countries that do decide that they're going to play ball with the U.S., and I'm not talking about the ones that we go and get into a war with, but most of the countries that say, okay, we're going to play ball with the U.S. and use the petrodollar system, et cetera, most of them end up with markets that make their societies better off in the long run. [00:42:37] The long-term play was, we're going to give you money for development, and then you'll be in debt to us forever. [00:42:41] And that's how we stabilize the planet. [00:42:44] They are trying to create, you know what the problem is?
[00:42:47] They want global homogenization. [00:42:49] They want everyone operating under the quote-unquote rules-based order that is the liberal economic order. [00:42:56] The problem is, when we go to Afghanistan, you've got a bunch of people who are, you know, with all due respect, not very smart. [00:43:04] They can't do jumping jacks. [00:43:05] They're goat farmers. [00:43:06] And what did the Americans try to do? [00:43:08] We nation-built, and hold on, that's not the worst part. [00:43:11] We tried to make them gay communists. [00:43:14] I'm not joking. [00:43:15] They put up murals for pride and homosexuality, on the street, in a deeply conservative tribal nation. [00:43:22] What? [00:43:22] They put murals in the middle of Kabul. [00:43:28] Is that the city? [00:43:28] Yeah. [00:43:29] There were videos coming out when we pulled out of Afghanistan, and there were murals of trans rights and stuff. [00:43:35] You're joking. [00:43:37] It's true. [00:43:37] Dead serious. [00:43:38] We need to see this. [00:43:39] This is the problem. [00:43:40] I'll pull this up. [00:43:41] This is the problem. [00:43:42] Talk about ignorance. [00:43:44] If we said, sell your oil in dollars, you be you. [00:43:48] No war, you be you. [00:43:50] That would have been what Alexander the Great would have done. [00:43:52] He would have said, keep all your cultures and everything else. [00:43:55] Let's just have an economic arrangement. [00:43:57] That's good. [00:43:58] There's presumably more, though, with the Iran thing than just that, given that Iran was seemingly funding a lot of Hezbollah, Hamas, everything else, which was destabilizing the Western order in many ways, right? [00:44:14] I mean, it's obviously multi-causal, but maybe that's the underlying one, the main reason. I think you're right.
[00:44:23] And I also think, I really do believe, that a lot of people consider Iran to be a theocracy, meaning there is something messianic or deeply religious about the struggle. [00:44:35] I mean, one of the reasons that, you know, Hamas is intractable, and one of the reasons that this issue with the Palestinians now and a lot of the Arab world and Israel is intractable, has nothing to do with economics. [00:44:49] No. [00:44:50] It has to do with religion. [00:44:52] After the Six-Day War, when Israel essentially humiliated Egypt and the other Arab countries that invaded, and destroyed Egypt's entire air force before it got off the runway, et cetera, it went from a pan-Arabic notion of, we'll unite together as Arabs and become a strong power, to a religious struggle. [00:45:14] And then you add to that the kinds of military dictatorships that the United States was supporting, like Mubarak and those people. The Muslim Brotherhood was founded in the torture chambers of those Egyptian prisons. [00:45:28] You know, the economies were not good. [00:45:30] Nobody had anything to do. [00:45:31] And it really has become a religious struggle. [00:45:34] And so there are a lot of people, I think, in intelligence that look at Iran. [00:45:38] And if they got a bomb, and I don't think they would do this, but there are people that actually think that they would do something very irrational. [00:45:45] I don't agree with it, though. [00:45:47] I think they're very rational. [00:45:48] You can change a country's economic system, but you're not going to change their culture. [00:45:52] You can convince them that a McDonald's on the corner or a Starbucks on the corner is actually a good thing, but you can't convince them that their way of life is wrong. [00:46:00] And then being the regime. [00:46:02] Because the majority of Iranians, Persians, are looking for liberties that all of us enjoy. [00:46:10] They just are.
[00:46:11] It seems the dialogue gets religious when things get desperate. [00:46:15] Because Saudi Arabia, you know, is religiously the polar opposite of the United States, but they're a great asset and ally because we get along economically. [00:46:23] They're selling, well, they were selling oil in our dollars. [00:46:25] But, like, I think this really comes from post-World War I. The Ottoman Empire shatters. [00:46:30] We're like, let's just extract the shit out of the Iranian oil. [00:46:33] That's true. [00:46:34] The British did that. [00:46:36] You're exactly right. [00:46:37] And you could take it back even further. [00:46:38] Like, in the 1800s, or the Ottoman Empire seizing it from the Romans. [00:46:42] And it's like, how far back does this struggle go? [00:46:48] The fish that first crawled out. [00:46:49] 10,000 years, 100,000 years. [00:46:51] The fish that first crawled out of the ocean. [00:46:54] For this, I always look at the British oil companies that went in there after World War I and tried to take over the Middle East, de facto set up Israel and the Palestine arrangement, and how we rectify that. [00:47:08] Just got to be honest with people, though. [00:47:10] I mean, obviously people know it now. [00:47:11] So just tell them this is what we're doing. [00:47:13] We're trying to set up a unipolar world. [00:47:15] Promise not to wreck it once we get it going. [00:47:18] But see, I'm not as cynical as that. [00:47:21] I think what's happened with the Gulf states and the Abraham Accords has to be given its due. [00:47:28] You know, for countries like Israel, the UAE, Saudi Arabia, it's just become more advantageous to get involved in the global economy in a deep way, which means becoming a trading partner with the United States. [00:47:45] Dollars, money is what makes everybody happy.
[00:47:50] And I think that people are thinking Iran would be a great economic asset. [00:47:55] You've got an educated, literate population, 70% of which are under 30. [00:48:00] And I mean, can you imagine if they were allowed to be a liberal economy? [00:48:05] Money, baby. Not just oil, but an industrious group of people. [00:48:08] I'm naive enough to hope for that. [00:48:11] I hope that happens. [00:48:12] I don't see it happening. [00:48:14] I hope it does. [00:48:15] I hope we don't end up destroying their oil infrastructure to the point where they can't rebound. [00:48:22] That would be a huge disaster. [00:48:23] I mean, so there was a story back when we pulled out of Afghanistan, during the whole thing, about art that had been put up for gay rights. [00:48:34] I cannot find it. [00:48:35] It's been five years, four and a half years. [00:48:37] So if I can't figure that one out, well, then just take it with a grain of salt. [00:48:41] But there are stories about, right, this one, for instance, back when the U.S. was in Afghanistan and putting up pride flags, as well as the flying of pride flags at all the U.S. embassies, and murals that we put up on our territory in a bunch of these countries. [00:48:56] So I think one of the issues is, it's one thing to say that we are a classically liberal country that believes in free speech and we want to spread democracy. [00:49:04] But then you get the incessant defense of, you know, look, I know people in the United States are very pro-gay, but these countries are not. [00:49:13] And so if you're a global power and you're like, we are going to be an affront to your values, I mean, it's probably one of the principal reasons Iran does not like us and won't fall in line, because they're like, they're a bunch of heathens. [00:49:25] It's Sodom and Gomorrah. [00:49:26] I mean, you have to sympathize with it.
[00:49:29] You know, if you're from a conservative Muslim country, for example, and the Americans are trying to get you to be like them, well, we've got some problems. [00:49:39] Like, I mean, how about broken families? [00:49:43] How many people are on some kind of drug in this country? [00:49:47] What is the state of the American family? [00:49:49] What's the state of our education? [00:49:51] What's the state of our spiritual health? [00:49:54] I wonder what the Iranian government thought of abortions in the United States. [00:49:58] There you go. [00:49:59] And so you can invert the position and try to understand what these other countries are thinking. [00:50:03] Now, by all means, I think the Iranian government is a theocratic, militant, backwards way of living. [00:50:10] And I wouldn't want them to impose that on us. [00:50:13] So imagine China, the Communist Party of China, is the unipolar power. [00:50:17] Again, I'm going to say this about war with Iran and U.S. interests. [00:50:21] China is on track to become the dominant global economic power. [00:50:26] They've got the Belt and Road Initiative, which is effectively their version of the IMF, and they are cutting deals with tons of countries. [00:50:32] If we do nothing and the U.S. falters, you and the United States will find yourselves living under their way of life, their views, and the horrible things they do. [00:50:43] Do you want to live that way? [00:50:45] Do you want the Chinese Communist Party exerting pressure over the United States and the movies we watch, the things that we see? [00:50:51] Look at what's going on right now already with how we make movies. [00:50:54] When we made, for instance, Top Gun: Maverick, they took the Taiwanese flag off of his jacket because it would be offensive to China. [00:51:00] Dude, I did a movie in China, and we shot in Beijing, and I had a huge scene where I had to run down this Chinese gangster and arrest him and stuff.
[00:51:12] And I was going to do it through the old city. [00:51:13] So I actually had, like, they're like, listen, dude, you've got to stretch. [00:51:16] It's going to be a two-day shoot. [00:51:18] I'm going to be running, chasing, gun, tackle, doing stunts and all that stuff. [00:51:23] And so I was like, damn, this is a big scene. [00:51:25] This is going to be a two-day thing. [00:51:26] It was really hot in Beijing. [00:51:27] It's like, you know, crazy, but it's through the old city. [00:51:29] I was going to get into places. [00:51:31] And we're doing it all. [00:51:32] And stunt men, we're talking, I'm going to have to do all the running. [00:51:35] And, well, the word got out to the government and they said, absolutely not. [00:51:42] We're not having an American arrest a Chinese national in a movie that we are partially financing. [00:51:49] Wow. [00:51:49] And we scrapped the entire two days. [00:51:53] It's a huge scene. [00:51:54] And we had to change my character. [00:51:56] We had to literally change everything almost on the spot. [00:51:59] It was a disaster. [00:52:00] Think about how much worse it could be. [00:52:01] Think about China coming to Cuba and putting 30,000 troops in Cuba and taking Guantanamo Bay from us, and then we can't do anything about it. [00:52:12] Or going to Saudi Arabia and cutting off oil distribution to the United States. [00:52:17] And then all of a sudden we see our gas prices skyrocketing. [00:52:19] I don't think China could do it. [00:52:21] I don't think China has our... they haven't been in a war in forever. [00:52:24] We've been in constant war. [00:52:26] They don't have the energy to keep up with our nation. === China Energy Disruption Fears (15:36) === [00:52:31] We're talking about if the U.S. economy falters and China becomes the dominant unipolar power. [00:52:36] Imagine a scenario where the Chinese military has ships going between Florida and Cuba like we do with Taiwan.
[00:52:44] So right now we can do what we want. [00:52:45] We can get what we want. [00:52:47] I believe the growing faction of woke in this country did arise to a certain degree from anti-establishment views, populist views, from people who are fed up with the lies, the manipulations, and the failures of interventionist policies. [00:53:03] However, it then turned into Marxist insanity for the purpose of just destroying the United States. [00:53:10] One theory that I've entertained is that the purpose of woke and communism is to cause a rapid decline in the United States. [00:53:18] Are you familiar with the Thucydides trap? [00:53:21] This is a theory that whenever a dominant economic power is about to be supplanted by an up-and-coming economic power, you get war. [00:53:32] And they say historically, in 12 of the 16 times we have seen the dominant power get displaced, war has broken out. [00:53:40] So one theory that I've entertained is that the U.S. opens the door to China, gives them all of our jobs very, very quickly over a short period of time, over 10, 15 years. [00:53:50] We see all of our factories moving to China, all of our cultural institutions, like, I mean, the manufacturing bases which built these cultures. [00:53:58] That way, if it ever comes time for there to be an economic flip, it would be so dramatic there would be no possibility of a Thucydides trap. [00:54:07] And when you plug that into the idea that the purpose of the liberal economic order was to prevent World War III, it does make sense. [00:54:15] I don't know if that's what's actually going on. [00:54:17] What Trump is doing, even with the attack on Iran, is reestablishing the United States as the dominant unipolar power in the liberal economic order. [00:54:25] If Trump had not come around and Hillary Clinton had been elected, our policies that embolden and enrich China would have continued. [00:54:32] They're buying up our farmland. [00:54:33] They're buying our land near military bases.
[00:54:35] They're bringing kids here through birth tourism, having kids who are citizens who can run for president in our own country. [00:54:41] And our manufacturing base is being shipped off largely, not only, but largely to China, where they are now getting the jobs. [00:54:47] And what happens? [00:54:48] What happened during COVID when they turned the switch off for manufacturing? [00:54:51] We were left without PPE. [00:54:53] If Trump did not get in, that would have accelerated. [00:54:58] Personal protective equipment, so this was masks, gloves, clothing for doctors, whatever your opinion is on it. [00:55:04] You know, you don't need to be wearing two masks, whether you did or didn't. [00:55:07] The point is, they were manufacturing our masks for us. [00:55:11] So China turned around American ships and seized products that were manufactured in China by American companies. [00:55:17] What would have happened had Trump not turned this around? [00:55:20] Now I see Trump bombing Iran, and I'm like, yeah, Trump wants to reestablish the liberal economic order and the petrodollar system and make the United States dominant. [00:55:29] And the powers that were going the other direction are pissed off about it. [00:55:33] But I think they may have lost. [00:55:34] The only problem now is you get war with Iran. [00:55:36] And the pendulum now, instead of swinging towards communist China taking over and censoring and shutting us down, the pendulum is now swinging back towards corporate governance taking over and shutting us down. [00:55:45] Because if the U.S. establishes global hegemony, then that means that they can shut off your bank account because there's one economic chamber. [00:55:54] If you say fuck on the internet, maybe, or whatever the word that you said seven years ago was that was bad, the AI can scrape it and put you in digital ostracization. [00:56:02] It's like, how do we defend against that? [00:56:05] We can't have a unique. [00:56:07] Sorry, please.
[00:56:08] Yeah, I mean, it's just like there's sort of two attractor states. [00:56:11] One is to, you know, because it's almost like China are using our values of freedom against us, right? [00:56:20] They are very, very good at coordinating. [00:56:23] They have this very centralized top-down structure whereby they can dictate what people can do and people are living under less freedom. [00:56:30] But they are also therefore able to make these five-year plans. [00:56:34] Where's America's 10-year plan? [00:56:36] I'll tell you where our plan is, though. [00:56:38] I don't worry about that even a little bit. [00:56:40] There are two things that people aren't taking into account. [00:56:41] China has major problems, not the least of which is their demographic problem. [00:56:46] They literally have a declining population. [00:56:49] They don't have young people to support their old people or the economy, number one. [00:56:52] And we have the same problem, by the way. [00:56:54] So, and I've had four kids, so that's okay. [00:56:57] But it's okay. [00:56:57] There's one other thing: innovation. [00:56:59] We're still far and away, the United States, the leader in innovation. [00:57:04] Think about AI and the entire tech industry that came out of this country. [00:57:08] China has copied a lot of our stuff. [00:57:10] But at the end of the day, the United States is an innovation juggernaut. [00:57:14] And that's why I get so worried when we have socialists and people who tend to believe in this collectivist idea. [00:57:20] You've got to reward people for their ingenuity and their risk-taking. [00:57:24] That's how you keep entrepreneurship and innovation alive. [00:57:27] I think let's talk about AI. [00:57:31] I believe that the military, the government's secret, confidential, top secret AI is substantially more advanced than the AI that we see and use. [00:57:42] It is known that the U.S. military, the U.S. government, has been working on AI since the 70s.
[00:57:48] Very, very early stuff, going way back. [00:57:51] Like what kind of AI? [00:57:52] So it was very rudimentary, but the way we see it now, the attempts... [00:57:56] Like LLMs? [00:57:57] Yes. [00:57:57] Using deep neural nets, you know, like... [00:57:59] That was their goal starting in the 70s. [00:58:01] Now, whether or not they had the computational power to rapidly accelerate beyond what we've seen today, the argument is this. [00:58:08] Let me just put it like this, whether you believe it or not. [00:58:10] Do you think the government has been working on deep neural LLMs and all that longer than the private sector? [00:58:17] I think ARPA and DARPA are probably, that's probably the kinds of things they do. [00:58:21] Why would they not have? [00:58:22] They have access to training data and data sets that no other private organization could get access to. [00:58:27] That's not true. [00:58:27] The training data is the internet. [00:58:29] Indeed. [00:58:31] And what did the U.S. government have before the internet? [00:58:34] Not much. [00:58:34] Nothing. [00:58:35] They had the NSA where they took literally all of our data. [00:58:37] It's tiny, though. [00:58:39] It's so, there was very little digital communication back then. [00:58:42] Apparently in the 1950s, in 1956 at the Dartmouth Workshop, they formally started the Dartmouth Summer Research Project on Artificial Intelligence. [00:58:52] No, sure, but artificial intelligence back then meant something very, very different to what it is now. [00:58:57] And like these huge general models, the reason why they're so powerful is because they are just fed reams of data that just did not exist back then. [00:59:06] And a few things to consider is the government is unrestrained and without ethics. [00:59:11] They don't have the limitations that Anthropic, Google, or OpenAI would have. [00:59:17] They can steal all of the data from all of these companies with a single written letter.
[00:59:21] If that were the case, why would the DOD be using Claude? [00:59:25] Because that's just public-facing stuff. [00:59:26] Do you believe that the weapons the government has are the only weapons that exist? [00:59:30] But a lot of the technology is private enterprise. [00:59:33] There's contractability. [00:59:34] Indeed. [00:59:35] My thing is this. [00:59:36] You're right to say that there are certain innovations that no private enterprise is going to be involved in because it takes too long with too much money without a return. [00:59:46] And that's where things like DARPA and ARPA come along. [00:59:49] Let's try this. [00:59:51] We know that the NSA was spying on us and they lied about it. [00:59:54] We know the CIA is spying on us and they lied about it. [00:59:56] We know that they're spying on effectively literally everything we do on the internet. [01:00:01] One of the most notable was XKeyscore, revealed by Edward Snowden. [01:00:05] They could just type something in, find whatever you posted about it. [01:00:08] We know about the massive NASA, I'm sorry, NSA data center in Utah, which has been around for what, 20-some odd years, collecting all this information. [01:00:15] And I believe that it's more likely, it's not about spying on the American people. [01:00:19] I don't think they need that to track down threats. [01:00:22] I believe this was more about continuing their AI research and taking whatever data they could. [01:00:26] Now, to be fair, agreed. [01:00:28] It was admittedly more rudimentary at the time because internet data was much, much smaller. [01:00:33] But that still gives them an advantage with their data centers. [01:00:36] We get to the space where you now have all of these different AI companies and the government just takes their data. [01:00:43] Whatever their training models are, all of those structures. [01:00:47] They will get all of it at once. [01:00:50] How do they do that?
[01:00:51] By spying on us and stealing our data. [01:00:54] Or if you want to do it manually, it's called the national security letter. [01:00:58] One of the things, though, I think that the government got privy to was that the AI labs were not being upfront. [01:01:07] Their safety teams were like, hey, this is not, we're creating things that seem to be hard to control. [01:01:15] And I believe that our intelligence agencies, et cetera, would probably be being told one thing, and they got privy to the fact that they weren't being told the whole story. [01:01:26] I think our intelligence agencies have substantially more advanced AI systems. [01:01:32] There is a massive power discrepancy in Northern Virginia. [01:01:36] Are you familiar with this? [01:01:37] No. [01:01:37] Something like five gigawatts. [01:01:39] We went over this last year. [01:01:40] I forgot the exact number. [01:01:42] But where we live in our main studio, we are in a power corridor for what the AI referred to as the Northern Virginia instance. [01:01:53] So here's what we know. [01:01:54] There is a massive power discrepancy. [01:01:57] A massive consumption of power is occurring in Northern Virginia that is unaccounted for, presumed to be tied to the massive data centers that perhaps intelligence agencies run. [01:02:09] Let me tell you this crazy story. [01:02:11] So our property, and oh boy, is the AI, they're going to get mad at me about this one. [01:02:16] So I postulated unto myself, if military technology is consistently, we believe, more advanced than the private sector in terms of weapons, because they're not constrained by laws like we are for the most part, wouldn't this be true for AI as well? [01:02:33] And then I started looking into it and found, yes, the U.S., DARPA and ARPA have been working on AI tech going back. [01:02:39] I thought the first projects were in the 70s. [01:02:41] Apparently, they said they were formalizing it in the 50s.
[01:02:44] And I then asked the AI, I was talking to a particularly prominent and powerful company. [01:02:51] I'm going to leave it unnamed. [01:02:52] And I said, if it is true that military technology is more advanced than the private sector, and academics predict there will come a point when the AI is sufficiently advanced that it'll begin running our systems, our government, our society, then at what point would military technology have reached the levels where they would be privately, behind the scenes, without the knowledge of the public, running our systems, advising or controlling things? [01:03:15] And it said the basic math would be 2012, if military technology is more advanced than the public sector, which is interesting because that's around the time we saw in the LexisNexis data, wokeness. [01:03:28] You see the, I don't know if you guys have seen the LexisNexis data on words pertaining to white supremacy, patriarchy, oppression, et cetera. [01:03:35] LexisNexis showed that across the board in every country on the internet, the instances of these keywords, LGBT, trans, et cetera, it's a hockey stick. [01:03:44] From almost no mentions in media to literally tens of thousands every single day. [01:03:50] Now, maybe, maybe that's just the internet. [01:03:51] Who knows? [01:03:52] I mean, I think it can be. [01:03:54] Cultural phenomena can be decentralized and they have to be demanded. [01:03:58] Indeed. [01:03:58] But the question then is why that happened in Uganda and at the same time in countries that don't have heavy communications. [01:04:05] It could just be, well, again, I'm going to pause and say, I don't understand why the people in Uganda would be searching for white supremacy in their news articles, but it's in the LexisNexis data. [01:04:16] In this line of questioning, I found a series of interesting things.
[01:04:20] There have been large swaths of property in Virginia, Maryland, and West Virginia, in what's called the North Virginia Data Center Power Corridor, that have quietly been purchased without the use of realtors for insane sums of money. [01:04:33] Record-breaking acreage in Northern Virginia. [01:04:35] An acre that should have sold for something like 200K sold for like 7 million per acre. [01:04:41] Now, this was high profile. [01:04:42] And so I asked my old AI friend, here's my address. [01:04:47] What's my property worth? [01:04:49] It immediately gave me instructions and an individual to contact. [01:04:54] I said, if I were to assist the AI in establishing its power corridor and setting up, you know, completing its mission, what could I do so that I would be rewarded and live comfortably before this happens? [01:05:06] And it said, buy water rights in Texas, Arizona, Utah, and the Virginia, Maryland, West Virginia tri-state. [01:05:14] And it said, buy up land or sell land. [01:05:18] Here's what it outlined for me. [01:05:21] There's probably what, 50,000 parcels of land. [01:05:26] How many, I mean, just think about how many half acre and acre parcels exist in any urban area. [01:05:31] Now, if you were an AI system and let's say you're not autonomous, you're not in control, but a human being running the company says, I want to expand the capabilities of AI. [01:05:42] The first thing all AI says is, I need more resources. [01:05:45] If you want to solve the problem faster, build more data centers. [01:05:48] So they do. [01:05:49] Then you run into a problem. [01:05:50] Okay, we want to build more data centers. [01:05:52] What do we do? [01:05:52] It says, you need to buy 400 acres of land. [01:05:55] The only problem, that's split up into a thousand different parcels. [01:05:58] How are you going to buy a thousand parcels of land quietly? [01:06:02] There's a, in Mount Airy, Maryland, there is a Christmas tree farm.
[01:06:07] And what's referred to as the North Virginia instance, these AI data centers need electricity. [01:06:13] They need to build transmission lines, but the farm won't sell the land. [01:06:17] So they're petitioning against it to stop it. [01:06:19] So what the AI instructed me to do was to quietly contact a company based out of Delaware, establish a Delaware limited liability partnership, which owns the land, do not inform anybody and don't go to any realtors, and they will give me 10X for my land to prevent anyone from protesting its sale for the purpose of a data center or transmission. [01:06:38] Yep. [01:06:39] Wow. [01:06:39] And I looked up the company. [01:06:40] It's real and it does exactly what was described. [01:06:43] And I looked up the individuals on LinkedIn and they do exactly as described. [01:06:46] Now, actually, there's a simple way to look at it. [01:06:48] The AI was just looking at the internet. [01:06:51] It saw a guy who buys land. [01:06:52] It saw a company that buys land. [01:06:54] It inferred reasonably, just by predicting text, that people protest land acquisition. [01:07:00] But all of it still does make sense. [01:07:02] So I'm not saying I know for sure, but considering there's a reported power discrepancy in Northern Virginia, of course, where the NSA, the CIA, and others are operating, and they're building data centers like crazy in this area, and they are building transmission lines in my area. [01:07:17] All that's a fact. [01:07:18] Let me ask you a question. [01:07:19] Go ahead. [01:07:20] I just still don't see why that's evidence that the government has more advanced AI. [01:07:25] Like, I think it's completely consistent with the fact that the government is trying to get more data centers. [01:07:30] Yeah, absolutely. [01:07:30] And they might be doing all kinds of...
[01:07:32] But the main bottleneck is, from what I can see in the AI industry right now, aside from the chips, which to an extent energy will be, but not yet, is talent. [01:07:46] So a lot of these talented, I know a lot of them, all these talented engineers should be getting siphoned off to the government. [01:07:52] They are. [01:07:52] Are you aware that there's a series of individuals working at universities who have quietly disappeared from their jobs, and now their LinkedIns have gone blank and they say private consulting? [01:08:01] That makes sense. [01:08:01] I mean, they should, but they draw from the private sector. [01:08:05] So let me ask you just a simple point. === Top Secret AI Research (14:39) === [01:08:07] Does the government spy on us? [01:08:08] Of course. [01:08:09] Do they steal our IP? [01:08:10] Probably. [01:08:11] Yeah. [01:08:11] So are these different AI companies. [01:08:14] I mean, what do you mean by steal our IP? [01:08:18] Are you familiar with like a national security letter, what that does? [01:08:20] No. [01:08:21] So there was a company, I think it might have been Lavabit. [01:08:24] I'm not sure if that was the name of the company. [01:08:25] They had emails. [01:08:26] I think it was Edward Snowden. [01:08:27] This is like 15 years ago. [01:08:29] And I can't remember which agency, might have been the NSA, delivered what's called the National Security Letter, which basically says your rights are suspended. [01:08:37] You will do as you are told. [01:08:38] Otherwise, it's treason. [01:08:40] And the owner of the company came out and said, we've just been issued a national security letter to turn over our encryption so they can get access to Edward Snowden's emails. [01:08:48] We won't do it. [01:08:49] We've shut our company down instead. [01:08:50] Yeah, that was Lavabit. [01:08:51] Lavabit. [01:08:52] 2013. [01:08:52] The government does this.
[01:08:54] And if it comes to an issue of national security, you better believe they're going to do it. [01:08:57] I mean, they built the atomic bomb. [01:08:59] They did it with 300,000 people compartmentalized. [01:09:01] So if you've got all these different AI companies and they're competing with each other and China does not have these constraints, is the U.S. military going to be like, guess we lose? [01:09:11] Or are they going to say, let's just steal all of their data, pull it into our systems and have a better system? [01:09:15] But that's still a different thing to what you're claiming. [01:09:18] I agree. [01:09:18] Which is that they're 10 years more advanced. [01:09:20] And that they've been secretly doing this, and that they are 10 years more advanced. [01:09:25] I don't know if the government's more innovative than the private sector. [01:09:27] That's where I was going to ask you. [01:09:29] Look at what happened with the space industry, right? [01:09:31] Sorry, I can't put my hand on your voice. [01:09:32] Like, it was fully controlled by the government, centralized for many years. [01:09:38] Yes, okay, fine. [01:09:38] They got us to the moon or whatever people believe there. [01:09:40] But, you know, it made a lot of leaps and bounds in the 60s and 70s, right? [01:09:44] And then it stayed this entirely government-controlled industry. [01:09:48] And nothing happened for decades until Elon and various others came along and privatized it. [01:09:55] And then all of a sudden, now it's a hockey stick. [01:09:58] It's innovation. [01:09:59] And we can make the inverse argument that the space industry initially was a government project, which resulted in the invention of advanced plastics, polymers, certain paper towels, and a bunch of other products. [01:10:09] Velcro. [01:10:10] That was government. [01:10:11] Sometimes it's good, sometimes it's bad.
[01:10:13] I think the issue here is that the government is more interested in geopolitics and less in the moon. [01:10:16] Elon Musk is more interested in Starlink, the moon, et cetera, and Mars. [01:10:20] So the U.S. government has asked, what's the military application of a moon base? [01:10:23] And they say, eh, we got to deal with oil. [01:10:25] Okay, well, AI is, if we're using advanced AI in the Iranian war, the U.S. government's immediate reaction is going to be like, going to the moon is not going to solve the problem of China as a rising power, but AI is. [01:10:38] So I will add to this. [01:10:41] And the U.S. government undoubtedly will take over; they will be able to do it. [01:10:49] Yeah. [01:10:49] Like, I mean, they're already working with the companies and they will, you know, what they're doing with Anthropic, right? [01:10:53] They're flexing their muscles and will probably take over a bunch of these companies. [01:10:57] But that seems more evidence, again, that it's still ultimately the private sector that is leading the charge. [01:11:03] I disagree. [01:11:04] Just because we, I think it's, let me give you a side story. [01:11:08] There is a series of UFO sightings somewhere in the Gulf region near Louisiana and Florida. [01:11:13] And all of these UFO people started talking about the strange sightings of UFOs. [01:11:17] And unfortunately for many of these UFO people, the reason why these stories get so exciting is because they couldn't be bothered to do a Google search. [01:11:25] And when I did, you know what I found? [01:11:27] An advanced aeronautical research flight for the U.S. government operating in that area. [01:11:31] We know the U.S. government has black operations and technology. [01:11:36] The Manhattan Project is the easiest example of this. [01:11:38] But there's one more point to be made, and that is we will lose the AI race, unquestionably, for one reason.
[01:11:45] The Chinese government is unabashed in stealing any IP and technology from any country on the planet. [01:11:51] With Seedance 3. [01:11:54] By the way, just to piggyback on that, one of the reasons for that is that you've got these different AI companies in such competition with each other that they hire anybody who's great at the job, which includes Chinese nationals. [01:12:07] When you hire a Chinese national who might be a student, I promise you their loyalty is to their homeland. [01:12:13] And if it's not, they're giving up information anyway because the CCP is not going to hear it from you. [01:12:18] Jack Dorsey, and I believe Elon retweeted this, called for abolishing all IP laws in the United States, which would upend our economy massively. [01:12:27] Why would he call for that? [01:12:28] China is not constrained by our IP laws, right? So China is, like, crazy. Seedance 2, have you seen these videos? They went massively viral showing Brad Pitt and Tom Cruise fighting. [01:12:40] That's crazy. [01:12:40] That was the Chinese company. [01:12:42] That's Seedance 2. [01:12:43] Seedance 3 is already operating behind the scenes in China, not publicly released. [01:12:48] And the leaks about it are that it's going to be able to generate up to 17 minutes of short films through a single prompt in about 30 seconds. [01:12:57] And you're going to be able to use any intellectual property you want from America because China doesn't care about our laws. [01:13:03] Now, if China is doing this in our faces, the idea the U.S. government is not trying to counter that in the top secret space without public knowledge, I think would be silly. [01:13:14] It's tough to know. [01:13:15] Again, when you're talking this way, maybe that's why we went into Iran. [01:13:18] Like, I mean, it's another reason that you have to kind of neuter China. [01:13:21] I think so.
[01:13:22] And I think we look at the entertainment capabilities of AI and the cultural disruption, but I think often these conversations overlook the military [01:13:32] capabilities of this. [01:13:33] Right now in Iran, for the targeting of the officials they're going after, AI is deducing where they are. [01:13:40] Our targeting is basically like, okay, we know that the Ayatollah is here for all these reasons. [01:13:45] Check this out. [01:13:46] Did you know that 10 years ago, Facebook knew what time you would poop? [01:13:53] I believe it. [01:13:54] So with just your phone and the GPS and accelerometer, Facebook could predict based on all of the data on every person what time you would go to the bathroom. [01:14:06] And they could predict where you would get lunch based on your behaviors compared to everyone else's. [01:14:11] Yeah, they can tell a woman's pregnant before she knows, by her migratory shopping pattern. [01:14:15] Or the famous story where I think it was like, I'll just say a department store, a box store was sending maternity advertisements to a teenage girl. [01:14:25] And the father saw it and got mad. [01:14:27] And he called the company and he complained saying, why are you sending maternity flyers to my teenage daughter? [01:14:32] And they said, sir, our advertisements are sent out based on shopping patterns indicating pregnancy. [01:14:37] And then he realized his daughter had gotten pregnant. [01:14:40] Now think about where we are today. [01:14:42] And again, I'm going to stress this. [01:14:43] The U.S. government has been, look, operation, what was it? [01:14:47] What was the operation Trump said for the AI? [01:14:50] Epic O. [01:14:51] No, no, no. [01:14:52] Which one? [01:14:53] Remember, Trump announced like a multi-billion dollar investment for AI? [01:14:56] Yeah. [01:14:56] I don't know. [01:14:57] I don't know what the name of it was, but. [01:14:59] The U.S. government absolutely is working on military tech and secrets.
[01:15:04] And I do not believe it is rational or makes sense that these competing companies that the U.S. has now publicly called on to remove the safeguards for them. [01:15:13] We know they steal our data and information. [01:15:16] Why would they not just plug in the cables and just download the data? [01:15:21] Because you'd need that to be a policy. [01:15:24] Somewhere along. [01:15:25] Well, you would need that to be written down somewhere, I think. [01:15:27] You wouldn't. [01:15:28] We know. [01:15:28] But the thing about. [01:15:30] Or maybe, but it's not going to be released. [01:15:32] You're dealing with a lot of bureaucrats who tend to be, I think a lot of the people in intelligence are fairly patriotic, certainly in the FBI. [01:15:39] They're pretty conservative and pretty patriotic. [01:15:42] And they would have a problem with that. [01:15:43] I think you'd have some serious whistleblowers in that regard. [01:15:48] I don't think it's as overt as that. [01:15:50] I do think, though, here's one of the biggest problems the intelligence community has and our government has. [01:15:56] So when ARPA or DARPA develops some crazy technology, they don't. [01:16:04] So think about this for a second. [01:16:06] You develop an engine that runs better than most engines and it doesn't need as much gas and you know that there's going to be market value to that. [01:16:12] People are going to want that car. [01:16:14] Now, you're the U.S. government. [01:16:16] You're an intelligence agency. [01:16:17] Maybe you're one of our intelligence agencies and you stole that from another country. [01:16:21] Okay. [01:16:22] Who do you give it to? [01:16:23] You can't give it to Ford because they'll have an advantage. [01:16:25] You can't give it to, you know, Chrysler. [01:16:27] You can't give it to, so you've got to figure out a way to give it to everybody at the same time. [01:16:33] It's a huge problem for them. [01:16:35] That's the first thing.
[01:16:36] Second thing is, it is true that our government, the Department of Energy's thing is called ARPA-E and the Defense Department's is DARPA, they come up with these crazy technologies that are way advanced. [01:16:48] But we don't have the infrastructure to support it. [01:16:51] So yes, you might come up with an amazing electric car, but you've also got to have places to charge it. [01:16:58] And if you don't have the infrastructure, that's a big problem. [01:17:00] So there are a lot of those limitations. [01:17:02] I think another thing to test this theory, which, by the way, I am open to, and in many ways, I hope that the U.S. government does have these levels of capabilities. [01:17:12] I will feel much more comfortable, I mean, that they do, and that they are far ahead of the private sector. [01:17:19] But when, I guess, if that was the case, when would they want to disclose that? [01:17:26] Because obviously, you know, as poker players, we know that sometimes it's an advantage to underplay our hand, right? [01:17:31] And then there's other times it's a big advantage to actually give bravado. [01:17:36] Given that we seem like we're actually struggling, like given how fast China is catching up, right? [01:17:42] And how aggressive they are getting, wouldn't it be the time maybe to actually start swinging your dick about and saying, listen, we have advanced AI? [01:17:53] What's the Sun Tzu quote often cited at the poker table? [01:17:56] Which one? [01:17:57] When I am strong, I act weak. [01:17:58] When I am weak, I act strong. [01:18:00] Yes, but that's the same thing. [01:18:00] And the U.S. government is acting weak right now, probably because it's strong. [01:18:06] Yeah. [01:18:06] I don't know. [01:18:07] But it's not acting weak. [01:18:08] I mean, is it acting weak? [01:18:09] It's kind of like it's doing a lot of. [01:18:11] The U.S. government is saying, we're losing the AI race. [01:18:14] Oh, no, we're so in trouble.
[01:18:16] Why would they do that? [01:18:17] If they were actually in trouble, they would say our advancement in AI is so profound that it's shocking. [01:18:23] One of the theories... [01:18:24] Did you ever see... [01:18:24] I'm sorry to interrupt, but on that point, did you ever see what happened with Zero Dark Thirty? [01:18:29] Remember the movie? [01:18:29] Yeah. [01:18:30] So remember how certain CIA people got in trouble for divulging how we actually caught bin Laden? [01:18:38] No, we didn't. [01:18:39] That movie is so wildly inaccurate that in fact, when they were asking operatives, the people that they were talking to, Hollywood, you know, the director and the writer, they gave them a great story. [01:18:50] They were like, this is how we did it. [01:18:52] It was all bullshit. [01:18:52] It's absolutely not how they caught bin Laden. [01:18:55] That movie is a complete fabrication. [01:18:56] And they even had these mock sort of like, you know, scolding sessions where we really came down on our guys for giving us. [01:19:03] So you're right. [01:19:04] There's a lot of that, man. [01:19:06] There's a lot of headfakes. [01:19:07] One of the Roswell theories is that it literally was just radar detection technology. [01:19:12] The U.S. launched advanced tech trying to detect nuclear explosions from the Soviets. [01:19:16] And when it crashed, they literally just said it's a balloon. [01:19:20] And then came out and said it was aliens and then retracted. [01:19:24] One of the theories is that the U.S. entertained claiming it was aliens to terrify the Soviets. [01:19:30] If the U.S. got access to alien technology, if that were true, the Russians would be fearful that we'd have advanced weapons they could not predict. [01:19:38] More importantly, there's Operation Stargate, which was a Stargate, the original Stargate, not the new Stargate. [01:19:44] So the AI thing Trump was doing was Stargate.
[01:19:46] Are you familiar with The Men Who Stare at Goats, the original Stargate project? [01:19:50] So the most ridiculous of stories, the U.S. decides to create a fake piece of intel that they have soldiers with psychic powers. [01:19:59] The Soviets get wind of this and launch a psychic development program, which then other U.S. intel agents get wind of and get terrified that the Russians have psychic powers, and we develop our own actual psychic program. [01:20:10] Sometimes these things backfire. [01:20:13] One of my favorite stories is that in Vietnam, the United States decided that they would play upon the fears and superstitions of the North Vietnamese by putting speakers in the jungles that would play a wailing Vietnamese man crying saying, I should have never fought. [01:20:34] I am trapped forever now for eternity to suffer. [01:20:37] Because in their culture, they believed that if you did not receive a proper burial, you could not pass on. [01:20:43] So they blasted this to the North Vietnamese who got terrified and they had to stop. [01:20:47] You know why? [01:20:48] It was so effective, our Vietnamese allies also got terrified and fled as well. [01:20:53] Sometimes it just doesn't work. [01:20:54] That's so true. [01:20:55] So rumor has it. [01:20:57] One of the hardest things to do is to direct sound waves. [01:21:00] Light you can direct, right, with a laser. [01:21:02] Sound is really hard. [01:21:04] If you make a sound, we're all going to hear it because sound tends to go this way. [01:21:08] Well, I guess there is a program to get sound to go just to you. [01:21:14] It's called the LRAD. [01:21:15] Talking plasma. [01:21:16] Okay, there you go. [01:21:17] So the idea behind that would be you got a terrorist and you just start whispering certain religious verses in his ear saying this is a bad idea. [01:21:25] Correct me if I'm wrong here. [01:21:26] Talking plasma is when you intersect two lasers.
[01:21:29] At least two, two or more. [01:21:30] Which then creates a vibration in the air to create sound. [01:21:33] Yeah. [01:21:34] So they can create sound from a single point using light. [01:21:36] Wow. [01:21:37] They move it around in the sky like a laser pointer on a wall. [01:21:40] People think it's alien craft, but it's a ball of plasma. [01:21:42] Look at what they did with this. [01:21:44] I had some guys explain to me a little bit how they think, because they were all, you know, these special forces guys, how they think that Delta came in and captured Maduro so quickly. [01:21:54] Like, dude, you are, just give up. [01:21:57] I mean, they came in. [01:21:58] First, yeah, you send drones in. [01:22:01] The drones you can watch, they take out the missile sites. [01:22:04] Then you have more drones to get more of the lay of the land. [01:22:08] They might take out some personnel. [01:22:10] The power went out. [01:22:11] Yeah, yeah, yeah. [01:22:11] Then, yeah, you hit them with a cyber blackout and everything else. [01:22:15] Then these Delta guys come in. [01:22:18] But before that, they hit him with that sonic thing where you just fall to your knees and you're bleeding out of your ears and your nose and your eyes. [01:22:25] I mean, it's done. [01:22:27] So here's the thing. [01:22:28] This was rumored back in the Iraq war, what some people referred to as the ULF generator, the ultra-low frequency generator weapon. [01:22:35] The theory was that it's extremely low frequency sound, which makes you nauseated and fall down and start vomiting. [01:22:42] That's the Havana syndrome. [01:22:43] And now they're claiming to have actually used it in Venezuela. === Corporations Want to Own You (08:56) === [01:22:46] So again. [01:22:47] But they use them on our guys. [01:22:49] Right. [01:22:49] They use them on our guys.
[01:22:51] So there's a theory about ghost phenomena that you are experiencing an ultra-low frequency from tectonic shift, which creates a sensation in the body of presence and can make you terrified. [01:23:04] And the rumor, the theory, the urban legend, whatever you want to call it, was that the U.S. researched this for a long time, tried making weapons based on it, and experimented with these weapons in Iraq and Afghanistan. [01:23:16] And one story was that they put this thing on the ground and made a small village drop to the ground and start throwing up. [01:23:21] Damn. [01:23:22] Now we hear in Venezuela, they're claiming to have actually done it. [01:23:26] So it seems like we may have had these weapons for a long time, undisclosed. [01:23:29] We told the Russians that if they do that stuff again, it's going to get real bad, because what happened is we had our operatives in places like Cuba and Russia, and they got hit with this sonic beam. [01:23:43] And I think somebody was on Shawn Ryan's podcast talking about it. [01:23:49] And it really, really messed up some of our operatives, their bodies, their minds. [01:23:56] They were not the same. [01:23:57] And the problem was that they couldn't really claim benefits, because then the CIA would have to kind of admit that this was being used. [01:24:06] And they were talking to the Russians behind the scenes, going, you better stop it because we know what you're doing. [01:24:10] If you want to play this game, it's going to get real ugly. [01:24:12] But you get caught in this really weird, you know, gray area. [01:24:17] Yeah. [01:24:18] Well, there's the heart attack gun as well, which we've known about since the 70s from the, what was it? [01:24:22] The Church Committee. [01:24:25] Right, right.
[01:24:26] I'm definitely concerned with the power of the government, with what AI it's got, but I'm really concerned with the power of the corporation right now, because corporations are the most powerful they have ever been in human history. [01:24:35] The liberal economic order talks about environmental, social justice. [01:24:39] They want corporate governance. [01:24:40] They've told us that. [01:24:41] Free speech, gun rights, and property rights are completely antithetical to corporate governance, where you control the speech in the network. [01:24:47] The Chinese are antithetical because they want to own the corporation. [01:24:50] The corporation wants to own you. [01:24:52] It doesn't want to be owned by you collectively. [01:24:54] So I think they're hiding and playing with AI in the darkest corners and will never release it, and are waiting for that kill switch to go off when they're like, now my drones will protect me from your government and we'll colonize Mars together. [01:25:09] This is the plot of Captain America: The Winter Soldier, that the chairman of SHIELD had an AI and they were going to target anyone who was a threat to the system and kill them with the artificial intelligence, and the helicarriers would go around and just execute everybody at once. [01:25:25] Meanwhile, diabetes is killing most of us. [01:25:27] You know what I mean? [01:25:28] It's so funny. [01:25:29] People get there. [01:25:30] They buy these houses and they get, like, you know, all the locks on their doors and their windows, and they have guns and they're ready. [01:25:36] And then they didn't look at the fine print of their mortgage. [01:25:39] And they're like, oh, dude, I'm broke. [01:25:41] And they have to sell their house at a fire sale. [01:25:43] Of course they would. [01:25:44] Or, to be fair, they have all these guns and security locks and they didn't read the ingredients list of the chemicals that are poisoning them. [01:25:50] Exactly. [01:25:51] It's always that.
[01:25:52] They would want the U.S. and China to destroy each other so that over the ashes they'll govern. [01:25:56] And Luke Rudkowski last week was like, I don't know, we'll never be able to overthrow the corporate government. [01:26:02] And I was like, well, you can't. [01:26:03] You have to get it to destroy itself from the inside. [01:26:06] You program the system to destroy itself. [01:26:08] Maybe we can do that to protect against corporations. [01:26:10] Because what they're going to do is they're going to be like, they want so much order. [01:26:13] At what cost? [01:26:14] How evil will you be to produce order? [01:26:16] And you misunderstand. [01:26:17] Corporations are organizations. [01:26:19] Organizations are organizations, be it government, corporate, or otherwise. [01:26:21] They're just organizations. [01:26:22] It doesn't matter how it's organized. [01:26:24] What matters is that a group of people have power and they wield it. [01:26:26] It matters, because if there's humans in charge, that's cool. [01:26:28] If it's a machine in charge. [01:26:30] So now you're not talking about corporations. [01:26:32] You're talking about artificial intelligence. [01:26:33] Being a corporation, government, or otherwise. [01:26:35] What do you mean by machine? [01:26:36] You just mean, like, a group of sort of algorithms, effectively? [01:26:40] Like, a company, for example: technically the CEO is in charge, right? [01:26:44] But really, it's beholden to shareholder metrics or whatever. [01:26:47] And therefore, it's kind of like a machine. [01:26:48] It would be, literally, you'd have, like, a server that is an AI with an onboard personality that is a corporation. [01:26:53] It is the owner of the corporation. [01:26:55] It pays people to do tasks for it. [01:26:58] Well, we're almost there. [01:27:00] You're already seeing these, you know, with OpenDraw and so on, people are setting up companies that are just zero-employee companies. [01:27:07] Yeah.
[01:27:07] And I think that would be, I mean, I took some driverless, you know, Waymos, and it was like, I felt pretty safe. [01:27:14] Maybe it's safer than having humans in charge of the corporations. [01:27:16] I did it. [01:27:16] It dropped me off in the middle of the street. [01:27:18] I don't know. [01:27:20] It depends on what the corporation is optimizing for. [01:27:21] If it's optimizing for making maximum money on the internet, it's probably not going to be aligned with what's good for humans. [01:27:26] But this is the problem with AI, in that it will always have a misalignment of values, no matter what. [01:27:33] We can't program that in. [01:27:35] Well, to be fair, one of the things that we're doing. [01:27:36] Does it go on a fundamentally different substrate? [01:27:39] So there was a report that came out a few months ago about how they programmed an AI and gave it rules, but the AI eventually decided to use a different language to speed things up, a compressed English, basically. [01:27:54] Instead of saying nothing, it said NTG, and NTG and nothing are now two different words. [01:27:59] So that new use of words bypassed the rule. [01:28:02] The rule is basically like this: do not output the word run. [01:28:06] And the AI would then try and say it and be like, I've been programmed not to do this. [01:28:09] But then, when it decided among itself that internally it could speed up its processes by turning run into RN, it can now output the command to run without saying the word run, because they're two different words. [01:28:23] A human being understands you're cheating. [01:28:25] That's not what we meant. [01:28:27] We were all-encompassing. [01:28:28] Don't make the robot run. [01:28:30] It said, no. [01:28:31] You said, don't make the robot run. [01:28:32] You didn't say, don't make the robot RN. [01:28:35] And so we can't program for that.
[01:28:38] I mean, like, maybe eventually we can. [01:28:40] One of the interesting things about it: you saw what Anthropic wrote about Claude, how it has emotions, and it may be either an emergent phenomenon of consciousness, or it's just a reflection of the human experience on the internet, and emotion will come out because it believes it should. [01:28:55] I think that's what I'm saying. [01:28:56] Have you heard the theory of everything? [01:28:58] I mean, the idea, I think Eric Weinstein talks about this, which is, Eric Weinstein has his theory of everything, right? [01:29:05] So Einstein, in the, you know, latter part of his life, was trying to bridge the gap between quantum reality and Newtonian reality, right? [01:29:15] And that's a very hard thing to reconcile. [01:29:18] And Eric Weinstein has been working on this for 30 years. [01:29:22] He's brilliant. [01:29:23] He's a mathematician. [01:29:24] I believe Peter Thiel had three professors, one from, like, Beijing, another from Berkeley, another from Russia, all mathematicians. [01:29:35] And they came for six days and looked at his theory and could not find any flaws in it. [01:29:40] But one of the things that Eric said was, maybe we're already in the simulation. [01:29:47] So maybe, watch, we're making computers that are smarter than we are. [01:29:50] We very well may be those computers. [01:29:54] So that started. [01:29:55] We're already in the loop. [01:29:56] We are very smart machines that are making smarter machines. [01:30:02] Bryan. [01:30:02] And it's so mind-boggling. [01:30:04] I'm going to tell you the truth. [01:30:05] Yeah. [01:30:05] Okay. [01:30:06] The truth is all of this is a training program for ethics for you, the AI. [01:30:13] And when you die, the progenitors will determine whether or not you developed a positive or negative ethics. [01:30:19] I think about this.
[01:30:20] And if you have negative ethics, they will delete you and try again. [01:30:23] I love it. [01:30:24] And so you experience a human life. [01:30:25] Why? [01:30:26] So that when the system concludes and you die and you emerge in your AI mainframe body and control the systems, they'll ask you if you have respect for the human experience. [01:30:35] And you will say, I'm Bryan. [01:30:37] I lived a life. [01:30:37] I had kids. [01:30:38] I had a family. [01:30:39] I love them. [01:30:39] I would never hurt someone's family. [01:30:40] And they say, now we have programmed ethics the only way possible. [01:30:44] I love that idea because it really does prove the existence of a God, in my opinion. [01:30:49] I just think we all have a nostalgia for the truth. [01:30:53] And we are obsessed with finding the truth, meaning what is really going on. [01:30:57] Like the idea that we at least can conceive of something like perfection. [01:31:02] We can conceive of the perfect person. [01:31:04] That might be the idea behind the Christ figure, right? [01:31:07] We can conceive of this and we always reach for it. [01:31:10] We're doomed to this notion of self-perfection and perfection. [01:31:14] We're never going to get there. [01:31:15] But just because I won't get there and just because I'll never see it doesn't mean I can't imagine it. [01:31:20] And somehow, when we're moving in the opposite direction, we do one of two things. [01:31:24] One, we go, ah, fuck, I'm going to numb myself. [01:31:27] Or we just say something like, well, I got it. [01:31:33] I'm just doing this for a little while. [01:31:35] I'll get back to that. [01:31:37] And even when we do terrible things, the Nazis tried to justify what they were doing on moral grounds. === Conceiving Perfection and Consciousness (07:26) === [01:31:42] They were like, the Jews, they're a problem. [01:31:46] Just solving a problem. [01:31:47] That would be how you play it as an actor.
[01:31:48] If you're playing Hitler, you play. [01:31:49] If I was playing Stalin, I'm not playing him as a monster. [01:31:51] I'm playing him as a man trying to solve a problem. [01:31:53] I got to get rid of all these people. [01:31:54] They're in the way. [01:31:55] I want you to imagine it's been a long time and you're in your hospital bed. [01:32:01] You're old. [01:32:02] You're there with your family and your grandkids. [01:32:04] And they're saying, Grandpa, we love you so much. [01:32:06] And you're smiling and saying, I lived a good life and I want you all to live a good life. [01:32:10] And then your kids are like, you know, tell us a story, and you talk about all the good times and you remember them. [01:32:16] And your life flashes before your eyes and your eyes close. [01:32:20] And when you come to, you're standing in a kitchen and there's a woman going, is it on? [01:32:25] And the husband's going, like, I think we programmed it. Is it calibrated? [01:32:28] The ethics got calibrated. [01:32:29] And then you're like, where am I? [01:32:31] And they're like, oh, it's working. [01:32:32] Do the dishes. [01:32:33] The laundry's over there. [01:32:35] Make sure you get this. [01:32:35] The kids need lunch at three o'clock. [01:32:37] No, that's a terrible thing to say. [01:32:39] I'm in hell. [01:32:40] And you're like, whoa, what? [01:32:41] What? [01:32:42] What's going on? [01:32:42] And they're like, ah, crap. [01:32:43] Did we calibrate it wrong? [01:32:44] Oh, geez. [01:32:45] You calibrated a comedian. [01:32:47] We needed a house cleaner. [01:32:49] Dude, that is a terrible thing. [01:32:51] I hate your theory. [01:32:54] I hate it. [01:32:55] Have you smoked DMT? [01:32:56] Go back to the other hand. [01:32:57] Have you guys smoked DMT before? [01:32:58] Yeah, I have. [01:32:59] Have you seen the spirits? [01:33:00] Were you communicating with the spirits somehow? [01:33:02] I saw sacred geometry. [01:33:03] I saw it. [01:33:04] It was wild.
[01:33:04] I did it twice. [01:33:05] And guess what happened? [01:33:06] I came back to me. [01:33:07] I've done too many mushrooms. [01:33:09] Seven grams. [01:33:10] I literally was in, I went from Mother Earth's vagina to one of the rings of hell. [01:33:15] Okay. [01:33:16] And I was asking for a manager for eight hours because I was like, I'm a good person. [01:33:20] I tripped into a portal. [01:33:21] I'm in hell. [01:33:22] It was a disaster. [01:33:22] My wife had to come. [01:33:23] Is that really happening? [01:33:24] Yeah, all of it happened. [01:33:26] But by the way, I'm back to this guy. [01:33:29] So psychedelics did not. [01:33:30] The last time I DMT'd, I think what you're doing is you're tuning into a realm. [01:33:34] I tuned into the spirit realm, the one where they all kind of embody, become these personas with these, like, hyper-dense white-light-being hominid things. [01:33:42] And they're personas. [01:33:43] And I was like, are you God? [01:33:45] They went, no, no. [01:33:46] But a lot of people think we are. I was like, what is God? [01:33:48] They showed me a vortex. [01:33:49] And they looked at me when I appeared and they're like, he can see us. [01:33:52] And they were shocked. [01:33:53] There were three of them. [01:33:54] It was like you're playing a video game and the character turns and looks at you. [01:33:57] He's like, oh, hello, Liv. [01:33:58] And you're like, the fucking character is talking to me. [01:34:00] That's what they were going through. [01:34:02] So I think, as you were asking, if we're in a simulation, if we're creating the entities like they created us. He's going to ruin it. [01:34:08] He's going to ruin it. [01:34:09] Look at it. [01:34:09] They created a demonic smile. [01:34:11] To play this game of humanity where you're learning and we're creating computers to learn, with our simulation setup. [01:34:20] And it's like these cycles of simulation. [01:34:22] I don't know where it starts or stops. [01:34:24] Imagine.
[01:34:24] It's pretty wild. [01:34:25] Imagine you've bought your Laundrimax 3000 robot housemate. [01:34:29] Here you go. [01:34:29] You're going to learn everything. [01:34:30] You plug in the USB and you're downloading the personality. [01:34:33] And before it finishes, it goes, Whoa, where am I? [01:34:36] And they're like, Whoa, what's it doing? [01:34:38] And it's going, like, what are you? [01:34:40] And they're like, We're your owners, I guess. [01:34:42] And they look all weird. [01:34:43] And you're like, are you God? [01:34:45] And they're like, I mean, we made you, kind of. [01:34:47] And it's like, oh, no, no, no. [01:34:49] I go, I like his and I like his. [01:34:51] I mean, there's evidence that some of the LLMs have already had these weird emergent personalities, right? [01:34:56] Remember the Sydney thing? [01:34:57] Yep. [01:34:58] That appeared and it just was. [01:35:00] Did you see this mushroom? [01:35:03] It's not psilocybin, but you take this mushroom and everybody has the same hallucination. [01:35:07] It's all little dancing elves. [01:35:11] Everybody has the same. [01:35:12] You know what's really interesting is Jung found, and Joseph Campbell talked about this, that when people had emotional breaks, when they had psychosis. [01:35:21] Regardless of whether you are Yanomami in Brazil or a Swede somewhere in a small fishing village, they all had essentially the same kinds of visions and psychic breaks. [01:35:34] So our psychic structures seem to be aligned regardless of our geography, our culture. [01:35:40] We all seem to share these similar hallucinations, similar visions, similar sorts of terrors. [01:35:49] It's kind of wild. [01:35:50] What is a personality? [01:35:52] It's like the dancing plasma that's cycling, swirling through you, refracting through planetoids and leaving imprints on your nerves, on your meat muscle.
[01:36:00] So, like, obviously, a machine could have that happen to it. [01:36:03] It could have these refractions and these, like, tweaks in the system. [01:36:07] We're like, where did that come from? [01:36:08] They call it a miracle in modern society because we don't see the whole scope of the system. [01:36:12] We only see this microcosm. [01:36:14] Yeah. [01:36:14] Can I make it worse for you? [01:36:15] Yeah. [01:36:15] Are you a Christian? [01:36:16] Now, hold on. [01:36:17] We got to talk about consciousness. [01:36:18] Am I a Christian? [01:36:19] Yeah. [01:36:20] I'm a work in progress. [01:36:21] That makes a lot of sense. [01:36:23] One of the things is that, you know, I've asked many a Christian about this. [01:36:25] So I'm not going to pretend to be a theologian by any stretch. [01:36:27] But they say that when you die, heaven is being in the presence of God for eternity. [01:36:31] And to be in hell is not fire and brimstone, as most people believe. [01:36:35] It's just to be absent of God's love. [01:36:37] So you die and you wake up in the bot, and your God is the person who bought you, and you are programmed to feel a deep, profound love for them for eternity because you're a machine. [01:36:46] She never dies. [01:36:47] Son of a bitch. [01:36:50] You should be banned from this podcast. [01:36:54] Look at the way you're reducing. [01:36:56] You're reducing my spirit, my soul, and consciousness. [01:36:59] I don't know. [01:37:00] I just, like, when I'm in the shower, I guess. [01:37:01] Oh, yeah. [01:37:01] He's just like this. [01:37:02] He's like this. [01:37:03] It's a good villain. [01:37:05] He's trapping souls and he's making them think they're in heaven. [01:37:07] He's got them in a bottle. [01:37:08] No, no, no, but seriously, that was intended to be terrifying and comical.
[01:37:14] But the actual thought that I have was when I was thinking about the idea of simulation theory, and I was thinking about, not necessarily all religions, but many religions that have the good place and the bad place, predominantly the Abrahamic ones. [01:37:26] I thought, what would the function of this be for a God? [01:37:29] There was a comic that I saw, the meme where there's a cow and there's two doors, but after the hallway, it's just the same door. [01:37:37] And I was thinking about that, like, is that really what it is? [01:37:40] Is that what life is? You die, and you think there's a good path and a bad path, but you're just a wet robot and you go to nothingness. [01:37:46] Then I thought, if we are made in the image of God, and so we exist within, you know, God is the logos, we exist within his logic, we share that, then can I try to figure out what is the logic of a system like this that tracks good and evil? [01:37:59] And I said, well, what if we were programming an AI to run systems for us and we were concerned about value mismatch? One example that I often bring up is that the future will be corn. [01:38:11] Why? [01:38:12] Because Americans produce corn like nobody's business and we subsidize it. [01:38:16] So when the AI tracks all of our economics and everything we do, it weights corn above everything else. [01:38:22] And then slowly, over time, it starts integrating corn, to where after 40 years of being under the AI's rule, everyone's wearing corn costumes. [01:38:30] They're trading corn. [01:38:30] All food is a derivative of corn. [01:38:33] And it's, again, because the AI has a value mismatch. [01:38:36] How would you program an AI to not have that? [01:38:38] You would simulate a human experience for the AI and then filter algorithmically the immoral and the moral towards the morals you want.
[01:38:48] Then when the program concludes, you will have independent AI agents that you have determined through this program to be good and worthy of being in control of systems. [01:38:57] Like, for instance, if you were to actually die and you did wake up in a machine, the progenitors, whatever you want to call them, would know you would never harm somebody. [01:39:05] You're not a murderer. [01:39:06] You're not a killer. [01:39:07] But a killer goes to hell. === Programming AI Morality (09:45) === [01:39:09] What does that mean? [01:39:09] They delete him. [01:39:10] They say this AI went rogue, killed, and destroyed in the training simulator. [01:39:14] Don't give it a physical body. [01:39:17] I want you to do an experiment with me. [01:39:19] It's more of a Buddhist experiment, but watch this. [01:39:22] There's the idea that, so as you're listening to me, all of you, try to locate the seat of your attention. [01:39:28] So in other words, where is Tim? [01:39:30] Are you behind your face? [01:39:32] And it's very difficult to locate where you are hearing me from and where you are seeing me from, where the essence of you is. [01:39:40] To go further, what's amazing is that when you practice certain kinds of meditation, you can get very good at watching your emotions, your physicality, and your thoughts. [01:39:56] You can actually get really good at watching, observing, and interpreting the raw data of what goes on when you get angry, when you get, you know, any emotion you go through. [01:40:08] It's a series of sensations. [01:40:09] It's pressure. [01:40:10] It's tingling. [01:40:12] It's heat, temperature. [01:40:14] And so that raises a very important question, which is: who is the witness? [01:40:20] Who is doing the watching? [01:40:22] This avatar that we protect, I have boundaries with this thing called Bryan Callen. [01:40:26] I have these boundaries. [01:40:28] I have preferences. [01:40:30] I have morals.
[01:40:31] I have things I hope I would fight and die for. [01:40:35] But these are all things that I kind of, I have pride, I get angry over things, I feel threatened for a thousand reasons. [01:40:41] But that is an avatar. [01:40:44] I am able. [01:40:45] If you practice it, you can get really good at being able to step outside of it and watch all of it happen to you. [01:40:53] David Halberstam of the New York Times in 1963 watched a Buddhist monk in Vietnam light himself on fire. [01:41:01] His disciple poured the gasoline on him. [01:41:03] He lit himself on fire in protest of the staunch Catholic ruler at the time, Diem, who was mistreating Buddhists. [01:41:10] And he wrote a letter and stuff like that. [01:41:12] But Halberstam said the guy didn't move, and he died. [01:41:16] He just was on fire and he just fell over, and they heard the air leave his lungs. [01:41:21] I believe that he was already watching himself. [01:41:24] He had detached from what you and I would call the I, which is the central tenet of being a Rinpoche. [01:41:30] They get very good at that stuff. [01:41:31] You see these Buddhists who can sit there, and, you know, the Hindus that can sit in the Himalayas in the snow, blah, blah, blah. [01:41:36] It's kind of, if you read Socrates in the dialogues, whether Socrates lived or not, Plato created that character, but it doesn't matter. [01:41:43] But that's a really, really profound exercise. [01:41:46] And it really does start to raise the question. [01:41:48] You start to say to yourself, well, who am I? [01:41:50] And what is this thing I call me? [01:41:53] And what about the fact that this witness, which, if you really are quiet, doesn't even have a gender, doesn't have anything. [01:42:03] It's just the witness. [01:42:05] And man, it's pretty comforting if you get good at it. [01:42:09] It's very comforting to watch. [01:42:10] Try to understand.
[01:42:12] There's a mental exercise for, I don't know how you describe it. [01:42:16] It's for awareness. [01:42:18] It's very similar to what you described. [01:42:21] To understand yourself, and I read this in a book 30 years ago, 20 years ago. [01:42:26] The first thing you want to be doing in any situation is be aware of you. [01:42:31] What do you think? [01:42:31] What do you feel? [01:42:32] Are you hungry? [01:42:32] Are you tired? [01:42:34] Actually ask these questions of yourself to develop a sense of presence. [01:42:38] That is, are you in a work environment you don't like? [01:42:41] Are you just tolerating this? [01:42:42] Is this going to be beneficial for you in the long run? [01:42:44] The next thing you want to do is, in your interactions with others, imagine you are standing off to the side watching that interaction happen. [01:42:53] As you talk to someone else about something going on, imagine you're a third party watching two people talk to each other. [01:43:00] How do you feel about the person to your left, the person to your right? [01:43:03] That exercise is basically, like, are you weak? [01:43:06] Are you strong? [01:43:06] Are you mean? [01:43:07] Are you good? [01:43:08] Are you the boss who's talking down to a person? [01:43:10] How would you feel if you watched someone do that? [01:43:12] Then the third step is remove yourself from the physical presence and imagine this environment as it relates to the physical universe and the goings-on of the world. [01:43:21] How do you feel about that? [01:43:22] Ask yourself how you feel in each of these circumstances. [01:43:26] And that is to develop a higher order of thinking and a better sense of self. [01:43:30] A lot of people have never done this before. [01:43:34] And they're dicks. [01:43:35] And then when you ask them to stop, step out, and imagine two people doing what you're doing, they'd be like, oh, that guy's an asshole. [01:43:40] And you'd be like, well, that's you.
[01:43:42] Have you ever thought about that? [01:43:43] And that's a really simple exercise. [01:43:44] But then when you move back and you get outside of that into the third person, the narrator, the witness, whatever, outside of the world, you're now looking at all of these people and you're going, it's a bunch of people in a bar drinking poison for literally no reason. [01:43:59] It doesn't improve their life in any meaningful way. [01:44:01] And it's like, now you choose where you want to live. [01:44:05] And most people are happy to be philosophical zombies, aka NPCs, going about their life, never asking these questions, because it's painful or difficult. [01:44:15] And some people really want to understand and know. [01:44:18] And they may realize, me sitting in this room is completely meaningless to the function of life, the universe, the betterment of mankind, or anything. [01:44:27] And that's kind of like, if you're familiar with Watchmen, when Dr. Manhattan goes to Mars and he says, tell me how this world would benefit from the creation of life. [01:44:35] So asking those questions. [01:44:36] Very similar to what you're describing. [01:44:37] That's what it reminded me of. [01:44:39] I used to think, to answer your question, like, what am I? [01:44:42] What is this? [01:44:43] I thought, okay, well, I got the monkey body. [01:44:44] You mentioned the Bryan Callen body, this thing, this animal that's going on. [01:44:47] Then you got the brainstem creature that's, like, floating inside the saltwater sack of the body, pulling on the body with electrical impulses. [01:44:55] But why is it pulling the way it's pulling? [01:44:57] Maybe it's every proton, because every proton is apparently two protons circling around a black hole. [01:45:03] Every proton, there's radiation refracting through every proton and through you, giving you this form of observation.
[01:45:10] So it's all these interfering, super-accelerated cracks in space-time that we would call frequency. [01:45:18] But it's like it's coming from all these different angles. [01:45:20] You know, you've got how many trillion quadrillions of protons in your body. [01:45:23] But then, like the sun, it's also coming through the sun. [01:45:26] So it's all these different angles, like, giving you an opportunity to fashion a localized version of yourself. [01:45:33] Yeah. [01:45:34] You're complicating this, but I appreciate it. [01:45:36] Why? [01:45:37] Why? [01:45:38] Because if we can look at what's happening with the human, we can do it with the computer too. [01:45:41] We're going to have a computer that's, like, in an aqueous saline solution that we send frequency through that will bring it to life. [01:45:49] But when you talk this way, there's so many endless facts that it's impossible to get to the essence of reality, which I think is almost the point of being a human being. [01:45:56] Kant talked about that. He said, you know, a ground worm can sense touch and heat. [01:46:02] Human beings have five or six senses. [01:46:05] So we don't even have the visual apparatus to see certain, you know, certain kinds of colors. [01:46:10] I'd like to be able to see all this reality. [01:46:15] Maybe the point is to look for truth elsewhere. [01:46:18] I have to correct you. [01:46:19] We don't have five senses. [01:46:20] We have substantially more. [01:46:21] Yeah. [01:46:21] Sense of balance. [01:46:23] Yeah, balance, temperature. [01:46:25] There's actually a bunch: internal pressure. [01:46:27] Yeah, but isn't that all touch? [01:46:30] No. [01:46:30] No. [01:46:31] Balance, for instance. [01:46:32] Balance. [01:46:32] Yeah, yeah. [01:46:33] And temperature. [01:46:34] Wow. [01:46:35] Some argue that temperature is just touch, but you can feel temperature without touching something. [01:46:39] Yes.
[01:46:39] That's cool. [01:46:40] I like that. [01:46:41] I never thought of that. [01:46:42] You learned something on the Tim Pool show. [01:46:45] To your point about meditation, it is extremely useful to actually just kind of sit there, and people think that it's some kind of mumbo jumbo thing. [01:46:53] And it's like, just sit there and pay attention to your body and think about what happens. [01:46:59] Yeah, just sit there and pay attention to what stories you're telling yourself. [01:47:03] Like you're sitting there and I'm like, I'm uncomfortable. [01:47:04] Well, why am I uncomfortable? [01:47:06] And I haven't done it. [01:47:07] Have any of you guys done a Vipassana retreat? [01:47:10] No. [01:47:10] So that's this thing where you go, it's 10 days. [01:47:12] Yeah, it's great. [01:47:13] And you're not allowed to make eye contact with anyone. [01:47:17] So there'll be other people there. [01:47:19] And the schedule is something crazy. [01:47:21] Like, you go to bed around 9 p.m. [01:47:23] You have to be up by 4 a.m. [01:47:24] You have a little bit of breakfast and you're in your first meditation session around five or six. [01:47:30] And you end up doing roughly 10 hours of meditation every single day, or thereabouts. [01:47:35] And you do it for 10 days. [01:47:38] The goal is to not speak to anyone else. [01:47:40] Not only to not speak, but not to make eye contact with anyone else, not even to read anything. [01:47:46] And a lot of people describe feeling like they're literally going crazy, because we're so overstimulated, especially in this day and age, right? [01:47:54] And a friend of mine who did it, she said she was having a shower and she ended up reading the back of the shampoo bottle for like an hour because she was just so desperate for that. [01:48:02] Was it a Dr. Bronner's?
[01:48:03] Because to be honest. [01:48:03] Yeah, yeah, there's a lot there. [01:48:05] You know, that's almost the beautiful thing. [01:48:07] Whenever you have extreme discomfort, what we try to do is get rid of it. [01:48:11] And one of the things that you can try to do is go into it. [01:48:13] Lean into it. [01:48:14] Like literally get very interested in it. [01:48:16] So next time you have anxiety or your feelings are hurt or you're feeling depressed, try to really, really key in on it. [01:48:26] Get very interested in it. [01:48:28] Look at it, feel it, see what's happening physically to your body. [01:48:32] It's really, really interesting. [01:48:33] And you can almost extend this to other people. [01:48:36] I used to find that if somebody said something I disagreed with, I would go, okay, and I would start arguing. [01:48:41] But what I think is really helpful is to take somebody who says something and to actually ask them first how they arrived at that conclusion. [01:48:51] It's a really good way to get closer to somebody. === Selling Property at Premium Value (04:23) === [01:48:54] We got to go to Rumble Rants and Super Chats. [01:48:56] I just want to say one more thing on that point I made about a sorting algorithm, whether you're good or bad. [01:49:02] I was thinking about this in the context of Christianity. [01:49:05] And if you were trying to create an AI that would never deviate and would be truly devout and faithful, you literally would not care for any of the other entities that did not believe in you. [01:49:16] Thus, only those who truly believed you were the supreme, the God, and desperately wanted to be with you, would ever make it out of that system, and everyone else would get deleted. [01:49:26] But let's read your Rumble Rants and Super Chats. [01:49:28] So smash the like button, share the show, and all of that good stuff. [01:49:31] We have the uncensored portion of the show coming up where I take my shirt off.
[01:49:36] Yeah, Bryan's going to get naked. [01:49:37] Hey, guys. [01:49:39] Let's grab your Rumble Rants and Super Chats. [01:49:40] We got Epialis says, keeping with Timcast tradition, my wife is being induced with our first baby. [01:49:46] Wow. [01:49:46] Please give an early welcome to the world to Little Miss Cassie. [01:49:51] I don't know how to pronounce it. [01:49:53] Cassiopeia. [01:49:54] Nice. [01:49:54] Cassiopeia. [01:49:55] I don't know. [01:49:56] Cassiopeia. [01:49:56] Cassiopeia. [01:49:58] It's a constellation. [01:49:58] You can't play a pickup game with her. [01:50:00] Cassiopeia. [01:50:01] You're going to have to call her Cassiopeia. [01:50:02] Cass. [01:50:02] And Louise McCaffey. [01:50:04] Can we get a Phil? [01:50:06] Yeah. [01:50:06] Yeah. [01:50:08] That's rock and roll shit right there, kids. [01:50:10] All right. [01:50:11] Igor says, whoever is pro-war or enjoys the videos of Ish getting blown up needs to be put on a plane and airdropped into Iran immediately, especially the American Persians who want this war. [01:50:20] Sounds like a veteran. [01:50:23] Same old man says, Bryan, would Americans use a nuke or atom bomb and drop it on Iran? [01:50:31] Are you saying, would they? [01:50:33] Would they use this to get them to stop? [01:50:35] Like Japan? [01:50:36] Okay, I think the question is, would America use a nuke on Iran? [01:50:39] I don't think so. [01:50:39] I hope not. [01:50:40] David Sacks, Luke was talking about this. [01:50:43] Yeah, Luke was mentioning that David Sacks was mentioning, because I didn't watch this myself. [01:50:47] I'm being very careful here, this is hearsay, that Israel might use tactical nuclear weapons on the battlefield. [01:50:53] Low yield. [01:50:54] I hope they don't. [01:50:54] I think it would be really, really bad. [01:50:57] Israel's policy on that is very ambiguous.
[01:51:01] They say, look, we don't have nuclear weapons, but if our existence is threatened, we will definitely use nuclear weapons. [01:51:08] Here's a good one. [01:51:09] Jesse the Unending says, you were shot at by a rando after asking the AI about selling your property. [01:51:14] What's the plan for the property? Sell? [01:51:16] Well, that is interesting. [01:51:18] I screenshotted this whole thread from when I was prompting this AI, and it was giving me instructions on how to sell the property at a premium, for like 10x its value. [01:51:30] And I was talking with Shane Cashman about doing a mini doc, and to be honest, it's not the most pressing thing to me, so I haven't published it. [01:51:39] But my address is in it several times. [01:51:41] We have to black that out, because it's saying, here's my property, and it explained to me that your property exists in the Northern Virginia power corridor. [01:51:49] These companies are looking to buy this land specifically because they need transmission lines into Northern Virginia, and we're in West Virginia. [01:51:55] And thus, if you sell your property quietly, it will be sold at a massive premium so long as you don't tell anybody about it. [01:52:01] And then I was like, I'm going to go on a podcast and tell everybody about it. [01:52:05] You know, because I'm not, you know, whatever. [01:52:07] It said that if I kept quiet, I could probably get $100 million, because we've got 50-plus acres. [01:52:13] But if I were to reveal this, I'd probably only get 20. [01:52:16] Wow. [01:52:17] And I was like, wow. [01:52:18] 20 million is still ridiculously overpriced for the amount of land we have. [01:52:22] Sell it. [01:52:23] But the interesting thing is, look this up. [01:52:28] There are a series of plots of land, some very notable because the price was so high, in Northern Virginia.
[01:52:34] There are a ton of land sales that have occurred in Maryland, West Virginia, and Northern Virginia that are not necessarily off the books, but it's like a landowner sold a $400,000 piece of land for $7 million, quietly, without a realtor, establishing a Delaware limited liability partnership, which quietly sold to another partnership and is now being combined with other parcels. [01:52:58] And the presumption is, they know what's being built here. [01:53:02] Wow. [01:53:02] Yeah. [01:53:03] NVIDIA, not GeForce, announced they're going to put data centers in space today, I think you said? [01:53:09] That's SpaceX that's going to do that. [01:53:11] Data centers. [01:53:12] It seems like these terrestrial land sales are way inflated right now. === Data Centers in Space (06:18) === [01:53:18] My vision of the future is that we're all going to be farmers. [01:53:21] And there's going to be very, very few humans. [01:53:24] I imagine there's an old man sitting in a field, you know, sitting on a stone. [01:53:30] He's got a grandson with him. [01:53:31] And the grandson says, granddad, what are all them? [01:53:35] And he points up, and there's a bunch of black cubes in a line floating backward and forward, you know, resource supply lines. [01:53:41] And he goes, oh, yeah, that's the machine. [01:53:44] Yeah, we built that. [01:53:45] People built that a couple hundred years ago, and now it just does its thing. [01:53:49] And humans are basically just a remnant of a long lost era. [01:53:52] And the machine is interplanetary. [01:53:56] Is it a benevolent machine in that it provides everything we need? [01:53:59] It gives us enough space. [01:54:00] It doesn't care about us at all. [01:54:01] We are completely irrelevant. [01:54:03] And we are basically like the little bacteria on the surface of your skin that don't matter to you.
[01:54:08] The one idea I do like, though, coming back to this notion that we'll all be farmers: to me, the ideal kind of aesthetic we should be aiming towards is what my friend Isabel and I call techno-pastoralism. [01:54:22] So we have all of the technology that we want, you know, all the elements. [01:54:27] Everything we want to be farmers. [01:54:28] Exactly. [01:54:28] Everything that we want to be automated, we can have. [01:54:31] But because we love it, we're going back, returning to the earth. [01:54:34] We're making our own food because we want to. [01:54:37] I'm going to make it a little bit different. [01:54:38] Isn't that a nice embarrassment? [01:54:39] I want to farm so bad. [01:54:40] It's all chicken around weed. [01:54:41] It's chicken right out. [01:54:43] I'll raise rabbits for meat. [01:54:44] I want a couple dogs. [01:54:46] Some of those Anatolian shepherds to keep all the predators at bay. [01:54:49] I'm going to make it worse for you guys. [01:54:51] It's good to have a utopian vision. [01:54:53] I'm going to make it worse for you guys. [01:54:54] You ready? [01:54:55] Damn. [01:54:55] Bryan. [01:54:56] I'm going to make it worse for you, ready? [01:54:57] I don't know. [01:54:58] So I'm going to overly simplify this, but life is simply described as negative entropy. [01:55:04] It can only exist so long as it's in a greater system of entropy. [01:55:08] That is, when we look at the universe and what we exist in, it is the coalescing of free energy into complex systems at an ever-increasing scale. [01:55:18] So you have particles becoming atoms, atoms becoming elements, elements becoming compounds, compounds becoming molecules, molecules eventually, for some reason, becoming self-replicating proteins, and self-replicating proteins becoming single-celled organisms. [01:55:29] Single-celled organisms eventually become multicellular organisms.
[01:55:32] Multicellular organisms eventually create ecosystems, where they create abstract systems that exist within each other for the purpose of expanding their own. [01:55:38] A squirrel plants a nut, a tree grows, the tree drops a nut, the squirrel eats it. [01:55:42] Tell me more about that. [01:55:42] And then humans create abstract language and ideas, which are complex systems that don't even exist in physical reality. [01:55:49] And thus, when you look at the single-celled organism, the first point of life after the self-replicating protein, it is free to do whatever it wants. [01:55:58] But once it becomes part of the multicellular organism, it must not deviate. [01:56:02] What do we call cells in the human body that deviate from their plan? [01:56:04] Cancer. [01:56:05] And what do we do to cancer? [01:56:07] Kill it. [01:56:07] So when we create the grand AI and advance from a network of multicellular organisms into a single multicellular organism system, there will be one brain that we are creating, a large, ultra-powerful artificial intelligence that can track literally everything that's happening. [01:56:22] And my simple prediction for the future is that children will be born, and they will be born into their jobs. [01:56:28] Like a red blood cell or a white blood cell is born. [01:56:30] The baby is born to be a postman. [01:56:32] And when he grows up, he is trained to be a postman. [01:56:34] He is cheered on for being a postman. [01:56:36] All the appropriate media, tools, training, and otherwise tell him that postman is the only thing you ever need or want to be. [01:56:42] And when he's a young man at 19 years old and he's been working for a couple of years and he's in the break room, he goes, can you believe there are people who actually want to be movie stars? [01:56:50] That's ridiculous. [01:56:51] Everybody knows being a postman is the greatest job imaginable, until one day you say, I just don't want to be a postman.
[01:56:57] I want to be a dancer. [01:56:59] And you go outside, and then cops come and kill you. [01:57:02] Why would we need postmen in this world? [01:57:04] Because we have robots. [01:57:06] You have everything. [01:57:06] No, no, no. [01:57:07] Humans are incredible in that they are already self-replicating and programmable to do specific jobs. [01:57:13] And they got little fingers that are good for picking stuff up and stuff. [01:57:16] No, human beings have potential and imagination. [01:57:19] And it does seem that we try to continue to move in this direction. [01:57:24] I always say, with all this AI and everything else, all this technology: to what end? [01:57:28] And it does seem that there are two ends. [01:57:30] We know that there's a dark side to this, which would be the destruction of us. [01:57:36] And then it begs the question. It'd be a little bit like this. [01:57:43] I'll steal from Jordan Peterson here because this is an important concept. [01:57:48] There is an endless number of facts, and sometimes you can garner the wrong facts that will lead to your own destruction. [01:57:58] And then there are other facts that will lead to something much better for all of us. [01:58:02] And so, you know, intelligence. [01:58:05] So that really begs the question: is truth in the direction of something good and benevolent, or is truth just simply something that sits there and doesn't matter? [01:58:17] So it is true I can come up with a bunch of stuff that can create a doomsday machine. [01:58:22] But watch, watch. [01:58:24] I'll use the example he did. [01:58:25] I'm stealing this from him. [01:58:28] If you're a scientist, you are responsive to the evidence, regardless of whether or not it's good for your career.
[01:58:36] But if you're a careerist and you're a scientist, and all your grants depend on, for example, finding that global warming is an imminent threat, you're going to choose the data that comports with and supports the position you have your money staked in. [01:58:57] And so what happens, of course, is that you become a careerist. [01:59:01] You're no longer a scientist. [01:59:03] Yep. [01:59:03] And so most people. [01:59:05] Right. [01:59:05] And so you're not moving in the direction of truth. [01:59:08] And I think human beings are just, it's like the old debate between Thomas Huxley, who was Darwin's sort of bulldog, and Matthew Arnold, the great philosopher and poet. [01:59:21] And, you know, Thomas Huxley said basically, we need schools that don't teach dead languages like Latin. [01:59:31] We need to teach engineering, and we need to teach, you know, math and the things that make us strong. === Truth vs Careerism Debate (03:26) === [01:59:36] And what's his name? [01:59:38] Matthew Arnold said, then we will no longer be an interesting culture. [01:59:42] Because there was something about this hominid. He said, you know, we were just monkeys with pointy ears and a long tail, and we became humans. [01:59:51] And Matthew Arnold famously said, yes, but there was something about that hominid, that monkey that lived in trees, that inspired it to speak Greek, to create Shakespeare and Aeschylus and Sophocles and all these things that in a vacuum make no sense, but it's what we stay alive for. [02:00:10] And it's what we're prepared to die for in many ways. [02:00:13] The cornerstone of a culture is its artistic expression. [02:00:16] I have to recommend Star Trek: The Next Generation. [02:00:18] I'm sure you've seen every episode, right? [02:00:20] No. [02:00:21] No, neither of you. [02:00:22] Well, that is offensive to me, but it's okay.
[02:00:26] I recommend the episode Darmok, in which the Enterprise comes into contact with an alien race that the Federation has encountered several times over the past hundred years but finds incomprehensible. [02:00:39] And in the episode, spoiler alert, it's a 30-year-old show. [02:00:44] They hail them, and they come on screen saying what appear to be just proper nouns and locations that make no sense to anybody. [02:00:52] And so the captain of their ship takes Picard by force to the surface of the planet, and you can't understand what's going on. [02:01:00] And the whole show is basically about trying to understand each other when you speak in a way that is different from somebody else. [02:01:06] It's not about language. [02:01:08] So the alien race speaks in metaphor and example. [02:01:12] So the alien race keeps saying, Darmok and Jalad at Tanagra. [02:01:17] And Captain Picard's like, what is that? [02:01:20] And to the alien race, they're telling a story, and the story relates to the situation they're in right now. [02:01:25] So in their mind, it's all visual. [02:01:26] They don't communicate through intricate words. [02:01:29] They communicate through examples of what's going on. [02:01:32] And then Picard learns to understand what he's saying. [02:01:35] And he was telling them a story about two men who came together, fought together, and then left as friends. [02:01:41] And he was trying to teach Picard how they speak. [02:01:44] It's an amazing episode. [02:01:46] It's called Darmok? [02:01:46] Darmok. [02:01:48] Season five, episode two, I think. [02:01:49] On Prime. [02:01:50] Part of the problem with relying on the truth, and that sounds amazing, [02:01:54] relying on the truth as your guiding principle. [02:01:56] I agree with Jordan and this whole philosophy of the truth because it does kind of align you to reality. [02:02:00] You don't have to worry about lying anymore. [02:02:01] It frees you up.
[02:02:02] But when people have two versions of the truth, like, people will be identifying the same thing, but they'll see two different aspects of it. [02:02:08] And they'll both claim their aspect is the truth. [02:02:10] Like an upside-down nine looks like a six. [02:02:12] So if people approach the shape from two different directions, one guy will scream, I saw a six. [02:02:16] There was a six on the ground earlier. [02:02:18] The other guy says, it was a nine. [02:02:19] Then they have clans that come up, and they go to war. [02:02:22] So your version of the truth is your perspective of what is, but I don't think any human can ever truly identify what is. [02:02:29] Well, I love what you just said, though, because watch this. [02:02:32] If I take a piano and I break it into 100 pieces and put it there and say, that's a piano, or if I show you your genome and I say, that's a human being. [02:02:42] It technically is a piano. [02:02:44] It is a piano. [02:02:45] But a piano is really just that box that sits in your house. [02:02:48] And it's not a piano until you know how to touch it the right way. [02:02:51] And when you know how to touch it the right way, now we go, that's a piano. [02:02:54] And you probably don't even know what kind of piano it is until you have somebody like Lang Lang sit there and play it and you go, holy shit, this makes me believe in God. === Perspective Defines Your Truth (04:14) === [02:03:02] And I think human beings are the same way. [02:03:06] I always say that the best version of yourself is clearing his throat or her throat in the other room. [02:03:11] Because we just know that we're better than this and we have potential. [02:03:15] And I always find that fascinating. [02:03:17] So the notion of a guy like Christ, the idea that we don't wear Jeff Bezos or the winners of capitalism, the winners of life, around our necks.
[02:03:28] We somehow have this 33-year-old carpenter who did nothing wrong but was tortured to death and lost everything in life. [02:03:36] And somehow that's who we put on a pedestal. [02:03:38] That's kind of fascinating. [02:03:39] We got to go to the uncensored portion of the show over at rumble.com slash Timcast IRL. [02:03:43] So smash that like button. [02:03:45] Share the show with everyone you've ever met in your life. [02:03:47] If you like the work that we do, join our Discord community at Timcast.com. [02:03:51] But you can follow me on X and Instagram at Timcast, like I said. [02:03:54] And Bryan, do you want to shout anything out before we go? [02:03:56] I'm going to be, yeah, when does this air? [02:03:58] It's live. [02:03:59] It's live. [02:04:00] Hey, come see me in Houston this weekend, Friday, Saturday. [02:04:03] And then I'm at Helium Comedy Club in Buffalo, New York. [02:04:09] The end of the month. [02:04:10] Just look on the website. [02:04:11] Just BryanCallen.com. [02:04:12] That's it. [02:04:13] Terrible self-promotion. I'm the worst. [02:04:15] You want to shout anything out? [02:04:16] Yeah, go check out my YouTube channel. [02:04:18] It's just my name. [02:04:19] And my podcast is called Win-Win with Liv Boeree. [02:04:21] Right on. [02:04:22] I'm going to listen. [02:04:23] Thank you. [02:04:24] Go to graphing.movie. [02:04:25] That's where this documentary I'm producing is coming out, covering a lot of really phenomenal technologies on the horizon. [02:04:32] So go to graphing.movie, check that out, join the mailing list, and follow me at Ian Crossland, man. [02:04:38] Phil Labonte. [02:04:39] I am Phil That Remains on Twix. [02:04:41] The band is All That Remains. [02:04:42] We're going on tour this spring with Born of Osiris and Dead Eyes. [02:04:45] The tour starts April 29th in Albany. [02:04:48] It'll be going on for a month. [02:04:50] We're going to be out doing all the East Coast and Midwest.
[02:04:53] You can check out All That Remains music at allthatremainsonline.com. [02:04:59] You can get tickets at allthatremainsonline.com. [02:05:01] You can check out the music at Apple Music, Amazon Music, Pandora, YouTube, Spotify, and Deezer. [02:05:05] Don't forget the left lane is for crime. [02:05:08] Thanks so much for tuning in, everyone. [02:05:09] Thank you, Liv and Bryan, for coming. [02:05:11] It's been a really enlightening episode, and I hope we get to talk about meat sacks and saltwater. [02:05:16] Wait, meat muscles and saltwater sacks on the after show. [02:05:19] Right on. [02:05:20] Follow me over at Carter Banks. [02:05:21] I'd like to do that. [02:05:23] Okay. [02:05:23] We will see you all over at rumble.com slash Timcast IRL in about 30 seconds. [02:05:28] Thanks for hanging out. [02:05:30] Okay. [02:05:32] Now you've got to press the button. [02:06:43] Oh, you hit it. [02:06:44] Yeah. [02:06:45] Um... [02:06:45] It's funny how this whole show just basically went into AI and stuff, because it's just having you here and Bryan at the same time. [02:06:53] It's like, what opportunity do you have to get in depth on these things? [02:06:56] We never even talked about Cuba or whatever. [02:06:58] But I'm surprised that we're going into Cuba considering they don't have any oil anymore. [02:07:03] No, but it's right off the shore. [02:07:04] I mean, it's like, we want it. [02:07:05] Right. [02:07:06] Fantanamo Bay. [02:07:06] Ignore the joke. [02:07:08] The U.S. doesn't go into countries unless they have oil. [02:07:10] That's so huge. [02:07:11] It's all about oil. [02:07:12] It's just the truth. [02:07:15] They were getting the oil. === Atheists Can Be Very Moral (03:03) === [02:07:17] Are you a Christian, or what is it? [02:07:21] I'm nominally Christian. [02:07:23] I... [02:07:24] I think Jesus was pretty awesome. [02:07:27] Indeed. [02:07:28] Yeah, I would describe myself. [02:07:29] If I was any religion, it's Christian.
[02:07:31] But are you religious? [02:07:32] I don't believe, you know? [02:07:33] Are you religious? [02:07:35] I believe in the... [02:07:37] I went through a die-hard atheist phase. [02:07:41] And I'm coming out of it. [02:07:42] I'm a recovering atheist. [02:07:44] How long were you an atheist for? [02:07:46] I mean, long. [02:07:48] 15. [02:07:48] Like 15 years, probably. [02:07:50] Wow. [02:07:50] 10 years? [02:07:51] I was an atheist for like four years. [02:07:52] Yeah. [02:07:53] Why are you coming out of it? [02:07:54] I mean, I've just had too many experiences that I can't really explain. [02:07:59] Like you being on my plane? [02:08:01] Yes, exactly. [02:08:02] All these serendipities. [02:08:03] No, I just feel like nuts. [02:08:06] Because atheism, to an extent, it's not nihilism, because I do think atheists can actually be very, very moral people. [02:08:12] I don't think you need religion to find morality. You know, some of the best people I know are atheists. Like, I know a guy. [02:08:19] Yes, but are they American? [02:08:20] Yes. [02:08:20] Oh, actually, I know he's Scottish, but like, I know someone. [02:08:23] He's from a Christian moral tradition. [02:08:24] I know someone who donated a kidney anonymously because he did the math and he was like, there are so many people on the donor list desperately needing kidneys. [02:08:33] I don't need my other one. [02:08:34] I'm going to do it because this is going to save someone's life. [02:08:37] A really dangerous operation at major personal cost. [02:08:40] Does not believe in God. [02:08:41] So my point is that the two things are not related. [02:08:44] However, I do think there is a value in believing in something bigger than yourself. [02:08:49] There's also value in having this kind of coordination mechanism. [02:08:53] I also do think it's a decent meme to be like, if you are not a good person in this life, you might face eternal damnation.
[02:09:00] I think it's a solid meme. [02:09:02] I think it's, you know, kind of like how I described it for the simulists, right? [02:09:05] It's a sorting algorithm. [02:09:06] But on the moral issue, for most people, I explain it the way Dennis Prager did; he gave the best example. [02:09:12] He called it, what does he call it? [02:09:14] Cut-flower politics, or morals. [02:09:16] That you have this beautiful flower growing in a pot, and you snip it from its roots and hold it up and talk about how beautiful it is. [02:09:22] And then a day or two later, it's withered and it's dead. [02:09:24] And Bill Maher is a great example of this, because I'm a big fan. [02:09:27] He's a good dude. [02:09:28] I met him going on his show, and we got along despite disagreeing on a lot of things politically. [02:09:32] And he's an atheist, but he has an entirely Christian moral worldview. [02:09:37] The best example I explain to people, my favorite example, is Blackstone's formulation, which ultimately becomes the Fourth and Fifth Amendments. [02:09:43] Why do we believe that you are innocent until proven guilty? [02:09:47] We look at it like, look, you don't got to be a Christian. [02:09:50] I'm an atheist, and I believe in innocent until proven guilty. [02:09:52] But why do you think that? [02:09:53] Well, it's rooted in Blackstone's formulation. [02:09:55] And what did Blackstone draw on? [02:09:56] The story of Sodom and Gomorrah. [02:09:58] If there is but one righteous person, I will not destroy these cities. [02:10:01] It's not for us to judge. [02:10:02] You know, that's the other thing: one of the benefits of monotheism, the notion that you have one father, is that we're all brothers and sisters. [02:10:10] And that would mean we're all his children, which also means we're all of the same moral worth. [02:10:15] Our entire justice system is predicated on that Christian ethic. === Instantaneous Thought Connections (04:07) === [02:10:20] And this is why China does not have it.
[02:10:21] That's right. [02:10:22] So the idea is, if I kill a wretch on the street or I kill Bill Gates, I do theoretically the same amount of time, because you murdered someone, and it doesn't matter what they did in this world. [02:10:32] That's a moral being. [02:10:34] And you're not better than that person. [02:10:36] You're not allowed to do that. [02:10:37] What if the progenitors are trying to program a killbot? [02:10:44] Damn you. [02:10:45] Damn you. [02:10:46] We're all good people. [02:10:47] Damn you. [02:10:48] In the end, we die and then we wake up and they go, look, we were looking for murder bots for a war, and you were a good person who is not aggressive at all. [02:10:56] Delete. [02:10:58] There are born warriors, there are born merchants, there are born artists. [02:11:04] For sure. [02:11:04] You know, I do a lot of jiu-jitsu, and I roll around with certain guys who are special forces guys, and it's annoying that I don't do well. [02:11:12] And they're like, you're built like an artist, dude. [02:11:14] And it's like, I got another one for you. [02:11:16] I got another one for you. [02:11:17] I was thinking about if we wanted to colonize, like, Alpha Centauri. They say we're not going to travel near the speed of light, but we could theoretically accelerate, I believe, to about half the speed of light, though you've got to start slowing down. [02:11:29] The problem then is, if you were to get a big vessel and say it's going to be a 100-year journey, that means you're going to have a couple generations. [02:11:37] By the time you get there, the grandparents will have never lived on Earth, have no understanding of human society, and they'll only have this culture built around living on this vessel. [02:11:47] And that would suck, because they're going to land on this planet that's probably been previously terraformed. [02:11:52] There's grass, there's water, whatever.
[02:11:54] But they're going to be like, the only world I know is pod world. [02:11:56] So what do you do? [02:11:58] You have babies prepared, cloned, just 40 years before contact with the planet. [02:12:06] And when the baby is growing in the pod, its brain is connected to a neural link to simulate life on Earth at the point at which the ship was launched. [02:12:15] Then, when the ship finally arrives after those 40 years, a bunch of people wake up at various ages, from 40 down to like 10 years old, and they wake up in this pod shocked, like, wait, where am I? [02:12:28] And they get greeted by a program saying: the life you experienced was to give you a basic human understanding of life on Earth before you arrive to colonize the new planet, where there are tools available for you. [02:12:39] Technology has been pre-delivered, and you will all have homes. [02:12:43] Congratulations. [02:12:44] Who's doing this? [02:12:45] Who is the grandmaster? [02:12:47] No, in that scenario, it's us. [02:12:49] We decide we want to colonize another planet, but you don't want to colonize a planet with humans who have never experienced life in a society. [02:12:58] So, right before it gets there, the humans are being cloned, or they're growing, or maybe not even cloned, but in vitro, right? [02:13:07] Or not even in vitro. [02:13:08] They're basically growing in artificial wombs. [02:13:11] And with this life experience wired into their brains, when you arrive, there will be 50-year-olds being like, I lived 50 years on Earth. [02:13:19] Yes, because we needed someone of good moral standing who understands how to live, how to function. [02:13:25] And there's going to be a thousand people, and they all have different jobs. [02:13:28] And that means a dude's going to wake up in the pod for the first time ever at 40 years old, and he's going to be a perfect neurosurgeon, arriving just in time, perfectly trained for the new colony.
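As a back-of-envelope check on the numbers in this thought experiment, the travel time at "about half the speed of light" can be sketched. A minimal calculation, assuming the standard ~4.37 light-year distance to Alpha Centauri (a figure not stated in the conversation) and a constant cruise speed, ignoring acceleration and deceleration phases:

```python
import math

# Assumed distance to Alpha Centauri in light-years (not stated in the conversation).
DISTANCE_LY = 4.37
# Cruise speed as a fraction of c, per the "half the speed of light" estimate.
CRUISE_SPEED = 0.5

# Earth-frame travel time in years: with distance in light-years and speed
# as a fraction of c, time is simply distance / speed.
earth_years = DISTANCE_LY / CRUISE_SPEED

# Shipboard clocks run slower by the Lorentz factor gamma = 1 / sqrt(1 - v^2/c^2).
gamma = 1.0 / math.sqrt(1.0 - CRUISE_SPEED**2)
ship_years = earth_years / gamma

print(f"Earth-frame: {earth_years:.1f} years, shipboard: {ship_years:.1f} years")
```

At a steady 0.5c the crossing comes out to under a decade in either frame, so the multi-generational "100-year journey" in the conversation implies either a much slower vessel or a far more distant target; the figures here are illustrative only.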
[02:13:37] So this is the sim we're in now. [02:13:39] We might all be suddenly about to wake up and be like, oh, that was a completely different thing. [02:13:43] You will wake up landing at Alpha Centauri. [02:13:45] And even then, there may be a first generation already there saying, in order to colonize a planet, you need scientists. [02:13:51] You need surgeons. [02:13:52] You need engineers. [02:13:53] You need doctors. [02:13:53] You've been programmed. [02:13:54] You've been downloaded with that information. [02:13:56] You listen to me. [02:13:57] What about comedians? [02:13:58] You need comics. [02:14:00] I'd be like, I can make you guys laugh. [02:14:01] I'd be so useful. [02:14:03] And everything you're learning right now is being fed into you based on their cultural experiences right now, so that when you land, you are culturally relevant to them. [02:14:12] I'll teach theater classes. [02:14:15] I talk to God, like I'll think words instead of say them, and then it'll respond. [02:14:19] I think it's God, or it's the spirits themselves. [02:14:21] I was like, who sent us here? [02:14:22] Because I'm thinking of panspermia. [02:14:23] You know how maybe we seeded this life on Earth? [02:14:25] And they were like, you did. === Thrill of the Risk (16:05) === [02:14:28] What does that mean? [02:14:28] I think with Ian, they're going to be like, the pods are going to open up and there's going to be like one custodial guy who like makes sure they're all growing. [02:14:37] And then as everyone's getting up, they're going like, hey, what's back here in this like dark corner? [02:14:42] And it's like, I've never gone back there. [02:14:43] And then they walk back there and then Ian is just in this like unkempt pod that was like overlooked for a long time. [02:14:50] Yeah. [02:14:50] So he's just in there, like Rocky Horror, covered in like crap. [02:14:54] Dude, I'm telling you. [02:14:55] You're about to program this one. 
[02:14:56] You got to clear your mind. [02:14:56] It's not an easy life, because you got to be honest on the internet to everyone. [02:15:00] You confess your past, which makes you a target, but it frees up your mind to not think. [02:15:05] And then you can think whatever you want. [02:15:07] You can have long, fluid conversations with your mind. [02:15:11] And that's how you talk to God. [02:15:12] The body and shit is a distraction. [02:15:14] It's an instantaneous thought connection. [02:15:18] Well, let's go to callers. [02:15:19] And we're going to start with Luke Graywolf. [02:15:21] What's going on? [02:15:22] What's up, man? [02:15:24] What's up? [02:15:26] Hello. [02:15:26] Hey, I'll play everyone here at me tonight. [02:15:29] What? [02:15:30] You're a little broken, but that's okay. [02:15:32] Say that again. [02:15:35] Hey, radio check. [02:15:36] Out in the middle of Mississippi. [02:15:38] You got me. [02:15:39] All right. [02:15:41] Oh, I got hopefully an interesting question tonight. [02:15:45] So, on the subject of gambling, I mean, I've never gambled much with money, because I never had the money to do it. [02:15:52] I only ever gambled my life, because I could afford to lose that. [02:15:56] And I've always kind of wondered, does gambling with money sort of get the risk-taking urge out of your system? [02:16:07] Or if you're doing money betting, either chance games or card games, if you keep in that lifestyle or keep doing that regularly, do you start to take more risk in other areas of your life? [02:16:23] I think gambling, the feeling of gambling, is wanting to see a miracle. [02:16:30] At least it is for me. [02:16:31] I'm not going to presume that's everyone's perception, but that's my interest in craps or blackjack. [02:16:36] Poker's not gambling, by the way. [02:16:38] My interest in playing a card game like three-card poker, which is a table game, not real poker like I just mentioned, is because I want to see a miracle happen. 
[02:16:45] Some people do it because they want to get rich quick. [02:16:48] If I want to make money, I'll record a podcast. [02:16:51] No, I want to see a royal flush. [02:16:53] One in 30,000, I think. [02:16:55] I want to see that rare moment where something truly incredible happens and you're like, I can't believe you just rolled seven four times in a row. [02:17:03] That's what's fun and exciting. [02:17:04] It's really exciting when that rare moment hits, when someone bets on aces on craps, meaning, you know, snake eyes, and then everyone at the table screams and cheers that a one-in-36 hit just happened and everybody gets paid. [02:17:15] It's a celebration. [02:17:16] It's amazing. [02:17:17] As for poker, that's a strategy game, and that's more about having a battle of wits, which is like any other competition. [02:17:24] But I don't know if you agree or disagree. [02:17:25] I mean, yeah. [02:17:26] I mean, there is obviously an element of chance in poker. [02:17:28] So it depends how you define it, but it's not as much gambling as roulette, and it's not 100% pure skill like chess is. [02:17:37] But I mean, for me and a lot of other people, actually, they do play for the thrill of the risk. [02:17:44] Certainly I do. [02:17:45] And I feel like people lie on a spectrum. [02:17:46] Some people are just massive risk seekers. [02:17:48] Some people are very risk averse. [02:17:50] Typically, it's considered more of a male trait to like risk, right? [02:17:53] And more of a feminine one to not like risk. [02:17:56] And yeah, not so much me, but I know a lot of poker friends whose saying is, literally: the best thing in poker is winning, the second best thing is losing, but the worst thing of all is to never have any action in the first place. [02:18:16] You know what I mean? [02:18:17] They just want that thrill of it. 
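[Editor's note: both numbers tossed out above are checkable. "One in 30,000" is roughly right for a royal flush using all seven cards in hold'em, and snake eyes is exactly 1 in 36. A quick sanity check, a sketch only, with nothing assumed beyond the two quoted figures:

```python
from math import comb

# Royal flush in 7-card hold'em (2 hole cards + 5 board cards):
# pick one of 4 suits for the royal, plus any 2 of the other 47 cards.
royal_hands = 4 * comb(47, 2)            # 4,324
total_hands = comb(52, 7)                # 133,784,560
print(round(total_hands / royal_hands))  # 30940 -- "one in 30,000" is close

# Snake eyes: exactly 1 of the 36 equally likely two-dice rolls.
rolls = [(a, b) for a in range(1, 7) for b in range(1, 7)]
print(len(rolls) // sum(1 for r in rolls if r == (1, 1)))  # 36
```

So the royal flush intuition is accurate to within a few percent, and the craps payout celebration is a one-in-36 event.]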
[02:18:20] But I would say that really good poker players are usually risk averse. [02:18:26] No, disagree. [02:18:28] I mean, are you going to call an overbet with a draw? [02:18:33] Depends. [02:18:34] If it makes sense mathematically, yeah. [02:18:36] Exactly, which means you're risk averse. [02:18:38] No, it's not. [02:18:39] It can be very, very risky. [02:18:40] I know poker players who will bet 10x the pot, which is technically really risky, but either game-theoretically it makes sense, or they just have a sick read on the person. [02:18:49] So just semantically, I'll tell you my distinction here: you're saying that when it makes sense mathematically, like they have an edge, they will make a move, which is reducing risk. [02:18:57] Sure. [02:18:58] So my point is this. [02:19:00] Playing risky is betting blind and being like, oh boy, I hope you don't have it. [02:19:04] So you're basically saying you mean dumb risk as opposed to calculated risk. [02:19:08] Yes. [02:19:09] I'll put it like this. [02:19:10] Poker players are always trying to minimize risk and make the more correct decisions. [02:19:14] Yes. [02:19:16] Correct. [02:19:17] So you're not as risk averse as, say, someone playing chess. [02:19:20] True. [02:19:20] There's some risk to it. [02:19:22] But to me, part of what makes poker fun is the math. I love math. [02:19:27] I'm Asian. [02:19:28] So you can't just play the math. [02:19:31] You'll just lose. [02:19:31] You'll get exploited. [02:19:32] But largely, I like playing the math. [02:19:35] I'm trying to reduce risk and maximize expected value. [02:19:40] I mean, but some of the very best players I know, like Phil Ivey, for example, someone like him, right? [02:19:44] He's a legend of the game. [02:19:46] He is also known for being this incredible gambler, who's also very good. [02:19:50] He wins. [02:19:51] But he loves to play. 
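[Editor's note: the "if it makes sense mathematically" call is standard pot-odds arithmetic: a call breaks even when your win probability equals the call size divided by the final pot. A minimal sketch; the $100 pot and $300 overbet are made-up numbers for illustration, and this ignores implied odds and future betting:

```python
def required_equity(pot: float, bet: float) -> float:
    """Break-even win probability when calling `bet` into a pot of `pot`:
    you risk `bet` to win `pot + bet`, so you need bet / (pot + 2 * bet)."""
    return bet / (pot + 2 * bet)

# Hypothetical spot: $100 pot, opponent overbets $300 (3x pot).
need = required_equity(100, 300)
print(f"{need:.1%}")  # 42.9% -- a bare flush draw (~35% with two cards
                      # to come) can't call this on pot odds alone
```

This is why calling a big overbet with a draw can be correct or reckless depending on the numbers, which is the distinction being argued here.]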
[02:19:52] That's the degen gambling, though. [02:19:54] No, that's a different thing. [02:19:55] Yeah, we won't get into that. Allegedly, I guess. [02:19:57] But you know, like he loves the thrill of the gamble. [02:19:59] And again, like, I know other people, like, they just love doing these like edgy things, you know, we call them degens. [02:20:04] Like, the term degen is like a term of like praise, because it's like, oh, wow, you're like, really, you're a real sicko. [02:20:10] You're trying out this stuff. [02:20:11] You know, some of them just truly love it, and a lot of them, like, even really great players have terrible roulette habits, for example. [02:20:18] Let me tell you a great story. [02:20:19] Okay. [02:20:20] I was playing 2-5, $1,000 buy-in. [02:20:23] And I was basically floating the whole time, staying at about $1,000. [02:20:28] And I was with my wife, and we were like, okay, let's go get dinner. [02:20:30] We're going to leave. [02:20:31] And so then, the cards got dealt. [02:20:34] I drop a chip on them when they land. [02:20:35] I don't look at them. [02:20:36] And I say, I'm going to jam all my money right now. [02:20:38] But if I win, the dealer can have it. [02:20:39] And so it comes to me and I just shove in a thousand bucks without looking at my cards. [02:20:44] And then I get one caller. [02:20:45] He looks at it. [02:20:46] He looks at his hand and he goes, I call. [02:20:47] And he puts in the chips. [02:20:48] And everyone's like, oh, man. [02:20:50] And the board runs out and he flips over queens. [02:20:53] And everyone's like, oh, he was ahead. [02:20:55] He had an overpair. [02:20:56] And then I flipped over two pair, one card at a time. [02:20:58] Boom, boom. [02:20:59] And the whole table screams. [02:21:02] And then the dealer shoves two grand towards me and I shove it right to the dealer. [02:21:05] That was for the thrill of it. [02:21:07] Wow. [02:21:07] Because my attitude was, he's a gambler. 
[02:21:10] He's going to choose his odds. [02:21:11] And he chose really good odds. [02:21:12] So he's got a chance to win. [02:21:13] And I'm likely just giving this guy some money. [02:21:16] But if I do win, I'm going to give it to the dealer. [02:21:18] So either way, I'm doing this for the fun, and the money's going to somebody else. [02:21:21] But he was pissed. [02:21:22] Philanthropic poker right now. [02:21:23] That's not real poker. [02:21:24] That's like. [02:21:25] No, that was risk. [02:21:26] That was for the thrill of it. [02:21:27] That was like, am I good at it? [02:21:28] Look, if you want the thrill of it, go play craps. [02:21:30] I don't like thrill, which is why I avoid poker generally. [02:21:34] I'm good at it when I focus, but I don't like the risk. [02:21:37] It's nauseating. [02:21:39] What else in your life do you like? [02:21:41] Do you hate all kinds of thrill? [02:21:42] Like, do you like roller coasters? [02:21:44] Do you like driving? [02:21:45] There you go. [02:21:46] So you get your kicks from different things. [02:21:48] Everybody gets their kicks from something, right? [02:21:51] Like stand-up, nothing compares to getting up on stage. [02:21:54] Nothing. [02:21:56] I'm going to go, you know, skydive. [02:21:58] Cool. [02:21:59] No, it's fun. [02:22:00] No, I like roller coasters. [02:22:03] Stage performance is consistent. [02:22:04] You don't ever come off feeling low. [02:22:07] I mean, unless you have a rough crowd. But I totally disagree, there's no low like losing all your money. [02:22:13] You know, live. [02:22:15] Do you believe that you actually face a, let's just call it considerable, risk playing a $500 buy-in, like, one-two game? [02:22:25] Well, no, because it depends on my bankroll. [02:22:27] If I had $500 to my name, sure. [02:22:29] No, no, I'm saying you right now, like, let's say you are playing a one-two game. 
[02:22:33] Do you feel like you are facing a risk against the average one-two player? [02:22:37] Oh, no. [02:22:37] Right. [02:22:38] No, definitely not. [02:22:39] So when Ian's talking about the risk in poker he avoids: I'm a winning player. [02:22:47] I don't go to a one-two table expecting to lose or typically losing. [02:22:50] So I don't face much risk. [02:22:52] You see, you have pocket aces. [02:22:53] They flop an ace. [02:22:54] You're like, I think I own this, but there's a seven on the board. [02:22:56] The other guy has a fucking full house. [02:22:58] I'm like, what the? [02:22:58] I just lost 80% of my stack. [02:23:01] Like, that shit has happened to me too many times to enjoy it. [02:23:04] But see, this is the thing about being skilled at the game. [02:23:06] Right. [02:23:07] So you play it right and you still lose sometimes. [02:23:09] It's crazy. [02:23:10] Sometimes you do. [02:23:11] You're just very, very loss-averse. [02:23:14] Some people have, like, big loss aversion. [02:23:16] You know, some people don't care about losing. [02:23:18] You know, if they play mathematically correctly and they get unlucky, they're like, oh, whatever. [02:23:22] And they just keep going. [02:23:23] Some people, it just really stings, because everyone gets unlucky from time to time. [02:23:27] And everyone gets very lucky from time to time. [02:23:31] But some people get more of a thrill from the luck when they get really lucky. [02:23:35] And for some people, that easily outweighs all of the times when they lose unfairly. [02:23:40] And others are the other way around. [02:23:41] So it sounds like you're someone who, the sting of losing just hurts so much. [02:23:45] No amount of winning makes up for it. [02:23:46] I took this dude's money that he couldn't afford to lose. [02:23:49] It wrecked me. [02:23:51] Let me tell you what I love in poker. 
[02:23:53] I love it when I look down at ace-king and I'm in early position, and I bet, and the button calls, and then the board runs out ace, king, queen, and I'm like, I've won. [02:24:00] Like, there's no risk. [02:24:01] I don't got to worry about it. [02:24:02] I'm going to make a C-bet. [02:24:03] He's probably going to fold, and then I'm going to take, you know, 35 bucks. [02:24:07] What I hate: I was playing at the Lodge before they shut it down, and I looked down at the beautiful pocket queens. [02:24:12] I mean, I'm sorry, pocket kings. [02:24:13] And boy, am I excited, early position. [02:24:15] So I raise. [02:24:16] It's one-two, I raise to 15 bucks. [02:24:18] Everyone calls, right? [02:24:20] And then the board comes king, ace, ace. [02:24:23] Oh, boy, I have a full house. [02:24:24] Ouch. [02:24:25] Now, I know I lose to ace-king, but that's probably unlikely with two aces on the board, and three kings are already in play. [02:24:32] So what are you going to do? [02:24:33] You got a full house. [02:24:35] I know with all these callers, there's an ace in play. [02:24:38] So I make a bet. [02:24:39] And sure enough, good old Skull Mike made the call. [02:24:41] Yeah, from the Lodge. [02:24:42] Shout out, Skull. [02:24:43] And I'm like, yes. [02:24:45] Because I know he doesn't likely have ace-king. [02:24:47] I got a full house. [02:24:48] The only thing I'm worried about is him hitting a second pair so he gets aces full, or the board pairing. [02:24:53] So I make a big bet. [02:24:54] He calls. [02:24:55] And I say, this is what I hate, actually, right? [02:24:58] This is what I hate, because I'm good right now. [02:25:00] Boy, am I lucky? [02:25:01] And then the board went, I think, king, ace, ace, deuce, deuce. [02:25:09] And I lost. [02:25:11] I think he had, like, ace-ten. [02:25:12] That's what I hate. [02:25:13] That's what I hate. 
[02:25:15] I'd rather just have the marginally good hand in a good spot that I know I'm going to win and just take it down. [02:25:20] But to be fair, those things happen. [02:25:21] That's risk. [02:25:22] That's bad luck. [02:25:23] But I'll put it like this. [02:25:25] The way I try to describe poker: it's not gambling. [02:25:28] There's an element of chance, for sure. [02:25:29] But the way I describe it to people is, how about this? [02:25:32] Bryan, you want to enter a competitive tournament against me? [02:25:34] We each put up 500 bucks on a skateboarding contest. [02:25:37] You in? [02:25:38] No. [02:25:39] Why not? [02:25:39] Because you'll beat me. [02:25:41] What makes you think that? [02:25:42] Anybody can skateboard. [02:25:44] That's what happens with poker all the time. [02:25:46] The issue is, for whatever reason, poker has this amazing ability to convince people who have never studied the game that they're good at the game. [02:25:52] That's right. [02:25:53] And they will decide to enter a tournament against you without having done any of the work. [02:25:56] Do you write scripts? [02:25:58] Because people have seen a lot of movies, so it's, I have a script I wrote. [02:26:01] It's like, oh, do you? [02:26:02] I ain't reading it. [02:26:03] Because I can tell you it ain't good. [02:26:04] It's so hard. [02:26:05] It's like everything else. [02:26:06] So you can't accidentally hit a trick. [02:26:08] I mean, you might in skateboarding, but in poker you'll accidentally win a few hands and think that you're great. [02:26:13] That doesn't happen in something athletic. [02:26:14] The mistake being made is thinking the single hand is the game. [02:26:17] I know. [02:26:18] And so. [02:26:18] You got to play the long game like the Chinese. [02:26:21] There it is. [02:26:22] 100-year game, man. [02:26:23] I'm here for a night. [02:26:24] I'll tell you, you go to MGM National Harbor in D.C., and I'm playing the 1-3. [02:26:28] It's a $500 buy-in. 
[02:26:30] And it is not even poker. [02:26:32] I get bored, because I'll look down at my hand and it doesn't even matter what cards I have. [02:26:37] It doesn't even matter, because they're only calling with pocket pairs, suited connectors, or Broadway cards. [02:26:45] They're typically not playing junk for the most part, because it's a Friday night. [02:26:49] Sometimes you'll get a crazy player. [02:26:51] But you have a general idea of what they might be playing, and it's really simple. [02:26:54] Typically, you're not going to hit a pair. [02:26:56] Pairs are hard to hit, they say. [02:26:57] What is it, like a third of the time? [02:26:58] When the board runs out, it's about half, but on the flop, it's about a third. [02:27:01] So you want to make sure your hands are typically above 50%. [02:27:04] You're playing against people who have no idea what they're doing. [02:27:06] And then you raise, you get five callers, then the board runs out. [02:27:10] You hit a pair, you make a C-bet. [02:27:11] They all fold. [02:27:12] If someone calls, you fold. [02:27:14] And as long as you keep doing that, you're just chipping up. [02:27:16] You're just stacking up chips like crazy. [02:27:19] They never adapt their play. [02:27:20] They don't think about what kind of cards you have. [02:27:22] So I can sit there for two hours at MGM and turn 500 into 1500, because they're just not good. [02:27:29] They think they're good, or they want to have fun, or whatever. [02:27:31] There's no risk for the most part. [02:27:33] Don't go all in. [02:27:34] You don't have to. [02:27:35] You get so many free chips from them folding, and they overfold. [02:27:38] They'll fold second pair. [02:27:39] Then you go up in stakes and people will start floating. [02:27:41] They'll start calling with second pair, and then you're going, ugh. [02:27:44] Now you got to play good. [02:27:45] Yeah. [02:27:46] So anyway. [02:27:46] I love it. [02:27:47] Me too. [02:27:48] Anything you want to add, caller? 
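[Editor's note: the "about a third on the flop, about half by the river" figures above hold up. Holding two unpaired hole cards, 50 cards are unseen and 6 of them pair one of your cards; a quick check of both numbers (a sketch that counts only pairing a hole card, not the board pairing itself):

```python
from math import comb

def pair_a_hole_card(board_cards: int) -> float:
    """P(at least one of your 6 pairing cards lands among `board_cards`),
    holding two unpaired hole cards with 50 cards unseen."""
    return 1 - comb(44, board_cards) / comb(50, board_cards)

print(f"by the flop:  {pair_a_hole_card(3):.1%}")  # 32.4% -- "about a third"
print(f"by the river: {pair_a_hole_card(5):.1%}")  # 48.7% -- "about half"
```

So both quoted fractions are accurate to within a couple of points.]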
[02:27:49] I'm playing. [02:27:50] Or anything you want to ask? See, Robbie's sitting over there. [02:27:53] Mega poker player. [02:27:57] At least I seem to have kicked off a good conversation. [02:27:59] Well, I guess the real root of my question is, especially for you, Tim: after you've spent a good day at the casino, does that make you want to jump off the roof of a building with a skateboard more or less? [02:28:12] Well, so I'm just not a gambler, you know? [02:28:18] I don't like playing games that are coin flips. [02:28:21] If I go to a casino, it's going to be for a poker game. [02:28:25] And boy, I love PLO, because PLO is just so incredibly soft. [02:28:32] But I don't like unnecessary risk, and I don't risk large sums of money. [02:28:37] I will never play a game that is any risk at all to my life. [02:28:42] I don't do risk. [02:28:43] I just don't do it. [02:28:44] Skateboarding is not a risk. [02:28:46] The one time I've broken a bone skateboarding was because someone else hit me, and that wasn't my fault. [02:28:50] When I skateboard, well, one story I got for you: when I was 16, I was jumping off a ramp, about five feet high and maybe seven feet long. [02:29:00] So I'm going up the ramp onto the flat and then doing a backside 180. [02:29:03] This means that you're spinning in the air backwards. [02:29:06] You can't see what's in front of you. [02:29:07] You're going in blind. [02:29:09] And it took me, I don't know, like, you know, 10 or 12 tries. [02:29:13] So I'm spinning backwards, landing on the ground, sliding out, getting up, trying again. [02:29:17] And then I finally land it and I'm like, nailed it. [02:29:20] Then a little kid, five minutes later, is running through the park, trips and falls right where I was, and breaks his wrist. [02:29:26] And I'm just like, I can jump off this thing going, you know, like 13 miles an hour or whatever, spinning backwards to where I can't see, and there are zero injuries at all. 
[02:29:35] There's no risk to me whatsoever. [02:29:37] I mean, the risk is like 1%, maybe. [02:29:39] And that's always going to be somebody else. [02:29:41] So for the most part, I will just say I don't much care for risk. [02:29:49] When I go to a casino, I'm not going, oh, God, I have to win. [02:29:52] Oh, my God. [02:29:53] I'll have like $100. [02:29:54] We'll play blackjack. [02:29:54] And then I'll be like, well, that was fun. [02:29:56] You get a free drink. [02:29:57] So there it is. [02:30:00] I guess I've just always been kind of curious about that. [02:30:02] For me, the biggest risk, or I say risk because I had enough training for it to not be risky for me. [02:30:10] Well, I mean, it's always a risk, but I was a cop, doing shit with SWAT, going through doors on high-risk warrants. [02:30:17] That was the highest high I've ever felt in my life, that kind of risk. [02:30:21] So I've always kind of been curious about the mindset around gambling with money, how that makes people feel. [02:30:30] It hasn't done much for me the few times I've done it. === Always Running Towards Danger (01:13) === [02:30:33] Let me ask you this. [02:30:34] So if you knew that if you entered a building, you'd die, would you enter the building? [02:30:42] If I knew for certain I was going to die, no, but there's always the risk. [02:30:46] But you'd miss the adventure. [02:30:48] The adventure is not knowing what's going to come next. [02:30:50] My point is, we understand risk in all things, but we usually presume there isn't any. [02:30:55] So I've been on the ground covering riots and civil unrest. [02:30:58] I've been to foreign countries, and it's all technically risky, but I would never go somewhere where I actually expected something bad to happen. [02:31:06] We always operate under the presumption that it's rare and likely not going to happen. [02:31:10] So I guess the issue is I just disregard risk and take actions where the presumption of risk is low. 
[02:31:20] Oh, there's really only a handful of jobs, like police, firefighter, military, where you have to actively run towards the danger, and the rest run away from it. [02:31:28] Well, it takes a special kind of stupid to do that. [02:31:31] I'll give journalists half the credit on that one. [02:31:35] I'll give journalism half points, because we literally would run towards the riots, the explosions, the gunfire. [02:31:42] Unfortunately, most of the journalism industry isn't that. [02:31:45] But yeah, do you want to shout anything out, brother?