Timcast IRL - Tim Pool - IT WAS DERAILED | Timcast IRL #1458 w/ Rick Jordan Aired: 2026-02-27 Duration: 02:07:22 === Tornado Of Tension (03:52) === [00:01:58] Hillary Clinton was on Capitol Hill today giving closed-door testimony about her relationship with Jeffrey Epstein. [00:02:05] Well, her lack of a relationship with Jeffrey Epstein, if you believe her testimony. [00:02:09] Lauren Boebert decided that she was going to take a picture and then she sent it to Benny Johnson. [00:02:13] Then he threw that on the internet. [00:02:14] And so then they stopped the whole thing. [00:02:16] So we're going to talk about that tonight. [00:02:18] There is chaos erupting on the Afghanistan-Pakistan border. [00:02:23] The Afghans decided that they were going to shoot at their nuclear-armed neighbor, and now all hell is breaking loose. [00:02:30] The Iran talks have broken down. [00:02:32] The United States, Iran says that they're not going to end their enrichment. [00:02:36] So this only adds to the tension in the region. [00:02:38] Donald Trump is making a bunch of waves because he's talking about seeking executive power over elections. [00:02:44] Now, what he's looking to do is use an executive order to require IDs, but the left is freaking out saying that he's going to fix the elections and it's going to be unfair and blah, blah, blah. [00:02:56] So we'll talk about that. [00:02:58] One of the people killed off the coast of Cuba the other day was an American citizen. [00:03:03] Now, allegedly, the boat was stolen and it had a lot of Cuban nationals in it. [00:03:07] But again, one American was killed. [00:03:08] So we'll get into that. [00:03:10] And also, we're going to talk about a whole bunch of AI stuff at the end of the show, too. [00:03:14] So there's a bunch of people in China that, or a bunch of women in China, that have decided that they want to fall in love with their AIs. 
[00:03:21] Burger King is using AIs to watch over their employees and make sure that they're saying please and thank you. [00:03:27] So we're going to get into it. [00:03:28] But first, we're going to go to a word from our sponsor. [00:03:31] We got a great sponsor. [00:03:32] It is Beam Dream. [00:03:34] Check out shopbeam.com/TimPool to get the 35% off your nighttime sleep blend to support better sleep. [00:03:44] I absolutely love this stuff. [00:03:45] I drink it every single night. [00:03:46] They got a bunch of different flavors. [00:03:48] I got cinnamon cocoa, sea salt caramel, brownie batter. [00:03:50] I'm a big fan of the cinnamon cocoa. [00:03:52] It's my favorite, but I've been drinking the sea salt caramel one. [00:03:54] It's got magnesium. [00:03:55] It's got L-theanine. [00:03:56] It's got Reishi. [00:03:57] It's got melatonin. [00:03:59] I drink it before bed. [00:04:00] It's a hot, it's a cup of hot cocoa. [00:04:02] Oh, it's caramel. [00:04:03] I guess hot caramel. [00:04:03] And it's about 15 calories, no added sugar. [00:04:06] No joke. [00:04:06] I do drink it every night before bed, and it is absolutely amazing. [00:04:10] My sleep has dramatically improved. [00:04:12] I've actually been getting such good sleep. [00:04:14] My sleep has started to reduce. [00:04:16] Like, no joke, I was sleeping for like seven and a half hours. [00:04:19] Now I'm just naturally waking up a little earlier, and my sleep score is still maxed out. [00:04:23] This stuff is great. [00:04:24] And if you're a guy, it's important because your testosterone and HGH are produced in the body during REM and deep sleep. [00:04:30] So if you're sleeping poorly, it's negatively impacting your weight, your energy, your mood. [00:04:35] So I'm a big fan. [00:04:36] Check out shopbeam.com/TimPool. [00:04:41] All right. [00:04:41] So smash the like button, share the show with all of your friends, with everyone you know. 
[00:04:45] Head on over to Timcast.com where you can become a member there. [00:04:48] You can join our Discord and you can join our after-show. [00:04:51] You can call in and talk to our guests. [00:04:52] Then head on over to Rumble so you can watch the after-show. [00:04:55] Join up there. [00:04:57] Joining us tonight to talk about all of the things that I mentioned earlier and so much more is Rick Jordan. [00:05:02] How are you doing, Rick? [00:05:02] What's shaking? [00:05:03] It's good to be here, man. [00:05:04] Who are you? [00:05:05] What do you do? [00:05:05] Who am I? [00:05:06] Well, I'm Rick Jordan, right? [00:05:09] I do a lot of things. [00:05:10] Whenever I get this question, you know, pretty much what I've done since birth almost was technology, right? [00:05:15] But what I wanted to be when I was a super little kid was a tornado chaser. [00:05:19] Oh, that's sick. [00:05:19] I know, yeah. [00:05:20] So, I mean, I still do a little bit of that on the side. [00:05:22] Do you do a lot of, do you like, are you like an adrenaline junkie? [00:05:25] Do you go out and try and do like things like jump out of planes and stuff? [00:05:28] No, I don't do that. [00:05:29] But what I do, I like to be the first at certain things. [00:05:31] Okay. [00:05:31] You know, so if I see that somebody else hasn't done something yet, I'm like, why not? [00:05:35] You know, there's got to be a reason. Watch me do it, you know? [00:05:38] Awesome. [00:05:38] Well, thanks for joining us. [00:05:39] Brett's here. [00:05:40] What is going on, guys? [00:05:41] It is Brett. [00:05:42] Normally, I'm doing Pop Culture Crisis Monday through Friday at 3 p.m. Eastern Standard Time, but we had a bunch of stuff to talk about. [00:05:49] Fan of Twister growing up? === Hillary Clinton's Deception (13:53) === [00:05:51] The movie Twister? [00:05:51] Oh, yeah. [00:05:52] Yeah. [00:05:52] Absolutely. [00:05:53] 110%. [00:05:54] Let's go. [00:05:54] Same with the new one. 
[00:05:55] If you skydive into a tornado, would it kill you or just spit you out somewhere? [00:05:59] We want to find out. [00:06:00] Let's go. [00:06:00] I want to be the first. [00:06:01] Yeah, you want to be the first? [00:06:02] No, not yet. [00:06:03] But reading all this AI stuff, I'm kind of almost there, like getting ready to jump into a tornado. [00:06:07] If you look at a tornado anyway, you could ask AI. [00:06:10] Yeah, I just read the Department of War has this contract with Anthropic AI, the same company that's building the thing that Phil's been using, this buddy bot. [00:06:21] His name is Tank. [00:06:22] Thank you. [00:06:23] His name is Tank. [00:06:24] And so the military Department of War is like, we need full control of this AI for autonomous weapons. [00:06:29] And Anthropic's like, I don't think that's what we're supposed to be doing here, everybody. [00:06:33] And it's like, anyway, I'm freaking out. [00:06:35] White, what it is. [00:06:36] Okay, well, maybe we can explain it further. [00:06:37] We're going to argue about it, you guys are going to argue about it. [00:06:40] Yeah. [00:06:40] Carter's up, everyone. [00:06:41] Carter Banks here, hanging out, pushing the buttons, making sure to give you the best reaction shots and the best show. [00:06:47] Let's go. [00:06:48] Awesome. [00:06:49] So we're going to start off with this from ABC 7. [00:06:52] Hillary Clinton's Epstein deposition briefly delayed over a leaked photo. [00:06:57] Former Secretary of State Hillary Clinton's testimony had to be briefly halted due to conservative commentator Benny Johnson posting a picture from the closed-door testimony. [00:07:06] Johnson posted a picture on social media of Hillary Clinton testifying under oath in front of the House Oversight and Government Reform Committee. [00:07:13] He said that Colorado GOP rep Lauren Boebert was the one who gave him the picture. 
[00:07:17] Breaking, the first image of Hillary Clinton testifying under oath about Jeffrey Epstein to the Republican Oversight Committee is what Johnson wrote. [00:07:23] And you can see there's his tweet. [00:07:26] One of Clinton's advisors said that the testimony had to be temporarily off the record while they figured out where the photo came from and why possibly members of Congress are violating House rules, according to Politico. [00:07:38] In the past, Clinton said that she doesn't have any information on disgraced financier Epstein or his associate, Ghislaine Maxwell. [00:07:44] Epstein was charged with sex trafficking minors in 2019, the same year he died in prison. [00:07:48] Do you guys think it's a good idea to take pictures in a closed session? Why is this news? [00:07:55] Seriously. [00:07:57] Because it's Hillary Clinton and it's Jeffrey Epstein. [00:07:58] Those two things are big news all the time. [00:08:00] But it's not a photo of those two together. [00:08:02] No. [00:08:03] There are not. How much of a problem is this for Lauren Boebert? [00:08:08] This is a slap on the wrist? [00:08:09] Yeah, I don't think that. [00:08:10] I don't think, like, they wouldn't censure her. [00:08:12] So it's like, everything's legal for a fee? [00:08:14] Like, as long as you're okay with taking the punishment, they go ahead and do that. [00:08:17] Well, I mean, that's everything. [00:08:18] Yeah. [00:08:19] You know, I mean, like, tolls are suggestions if you don't mind paying the. [00:08:22] You've got the money. [00:08:23] Yeah, exactly. [00:08:23] Speed limits, just a suggestion. [00:08:25] I mean, most things, especially if they're not violent crimes, most things are just suggestions if you don't mind a fine. [00:08:31] Was it Hillary that said timeout the photo leaked? [00:08:34] I mean, I don't know. [00:08:36] It doesn't say who actually decided. 
[00:08:38] I mean, it wouldn't surprise me if whoever was running the hearing actually, you know, the word got to them. [00:08:44] They're like, hold on, we have to stop this and find out who it was. [00:08:47] So that way we can mark it down in the calendar that they need a slap on the wrist or something. [00:08:51] When was the last time she was in any type of government hearing? [00:08:55] Hillary Clinton? [00:08:55] Yeah. [00:08:57] Probably, it's probably been like seven or eight years because it was about, or maybe even longer, because it was about her emails. [00:09:03] It was probably the last time that she had a session of Congress where she was answering. [00:09:08] Yeah, I think so. [00:09:09] Off the top of my head, at least. [00:09:10] The Benghazi stuff. [00:09:11] I remember that. [00:09:12] Benghazi was. [00:09:13] That was even farther. [00:09:13] Yeah, Benghazi was before that. [00:09:15] So she had makeup professionally done. [00:09:18] Yeah, right. [00:09:19] I don't think this matters at all. [00:09:20] Well, no, the weird thing about it is you see stories like this. [00:09:24] And I know that a lot of the people in Congress, I think it was you, Phil, that was pointing out, maybe somebody else that was pointing out that you expect more from senators than you do from people in Congress, right? [00:09:33] So is the idea here that Lauren Boebert's like, I'm just going to get my name in the press by leaking this photo to Benny Johnson and she just took it as a risk-versus-reward analysis? [00:09:40] I don't know that she was thinking about the press. [00:09:42] I think she was thinking about, I think she's thinking about, you know, this will be something. [00:09:47] I mean, maybe it is. [00:09:48] It doesn't expose anything. [00:09:49] No, it doesn't. [00:09:50] You know, it could be, I don't know. [00:09:52] I don't know if her and Benny are friends. 
[00:09:54] If her and Benny are friends, Benny could have said, hey, snap me a pic so I can tweet it. [00:09:57] You know, I don't know though. [00:09:58] I'll give you credit. [00:09:59] Yeah, exactly, right? [00:10:00] People were very upset that this was a closed-door hearing. [00:10:03] So maybe this is Boebert's way of protesting closed-door hearings. [00:10:06] They're like, no, let's just make the world know that Hillary Clinton is being deposed today. [00:10:10] I mean, that actually. [00:10:10] But they already know she is. [00:10:12] Is the idea here that they're like, they're going to send in a fake person and they have to have a photo of it to prove that it's real? [00:10:17] Otherwise, it's a good idea. [00:10:18] Her double, her body double. [00:10:19] Yeah. [00:10:20] No, but my concern. [00:10:21] I think Boebert was concerned this was going to be behind closed doors, going to get swept under the rug, and everyone's going to deny to forget about it. [00:10:27] And she's like, I do not want to forget about this moment. [00:10:29] No, make it noise, make it a big deal. [00:10:31] That she actually showed up. [00:10:32] Is that what you're talking about? [00:10:33] Just that Hillary Clinton's being deposed on Epstein. [00:10:37] They tried to do it behind the closed doors for a reason. [00:10:39] And Boebert was probably like, fuck that. [00:10:40] I mean, maybe that, maybe there's some substance to that. [00:10:42] Like, the idea of having photos of Hillary Clinton getting, you know, reading the Riot Act by Congress or being questioned by Congress makes Hillary Clinton look bad. [00:10:53] It's red meat for Republicans. [00:10:54] They love to see Clinton sweat, you know? [00:10:57] So, I mean, maybe that's got to be. [00:10:59] So, are we going to see a photo of Bill now? [00:11:01] Didn't in the blue dress. [00:11:03] Oh. [00:11:05] Your mind's where I'm at, Brett. [00:11:07] One of you guys has to say, Jimmy. 
[00:11:08] Have you ever seen that photo of Hillary Clinton when she's in the rundown apartment and she just looks disgusted and freaked out? [00:11:15] I love that photo. [00:11:16] She looks confused. [00:11:17] She literally looked like all the apartments I lived in when I was in my 20s. [00:11:21] That's what I meant by the professional makeup. [00:11:22] Yeah. [00:11:22] Because when she was campaigning, I mean, it was a complete difference overnights between how her hair was done, how her makeup was done. [00:11:29] All of a sudden, she went back to, oh, this is who I really am. [00:11:31] And it was like the physical masking. [00:11:33] Carter brought up the picture. [00:11:34] I mean, she looks her age. [00:11:37] You know, she's looking her age. [00:11:38] I don't know how old she's. [00:11:39] I think she's like pushing in herself. [00:11:41] She's pretty old. [00:11:41] Yeah. [00:11:42] 74. [00:11:44] What's the guess? [00:11:46] Taking bets on. [00:11:47] The only thing I thought when the before and after the campaign thought, I'm like, man, my tax dollars, you know, or donor dollars went to Botox. [00:11:55] Yeah, Botox and hair and makeup. [00:11:58] Not while she was campaigning. [00:12:00] That was donor dollars, but when she was in office. [00:12:02] It's like, she's 78. [00:12:04] Wow. [00:12:04] Really? [00:12:05] Wow. [00:12:05] So does that mean she looks good for her age then? [00:12:07] No, but if Elad were here, you might make that argument. [00:12:09] He's the guy that's constantly thirsting for Hillary. [00:12:12] He's big on the shit. [00:12:13] Wait, for what? [00:12:15] What? [00:12:15] Is that a real thing? [00:12:16] Well, no, he says that when she was younger, she was very pretty. [00:12:18] Yeah, they say she looked like Sabrina Carpenter looks like Hillary Clinton when she was younger. [00:12:22] I mean, that hot. [00:12:23] I don't think Sabrina Carpenter is all that hot. [00:12:25] Oh, really? 
[00:12:26] She's pretty when she was younger. [00:12:28] Was that? [00:12:28] Neither was Hillary when she wasn't. [00:12:29] Yeah, I don't think so. [00:12:30] That's why I brought it up. [00:12:31] Rewriting history is kind of crazy. [00:12:33] Hillary was kind of like a poster child for the military industrial complex in 2016, 17, 18. [00:12:39] And everybody is kind of like, I think they want their vengeance now. [00:12:42] They just want to see Hillary Clinton pay. [00:12:43] And it's like, bro, she's such a pawn in this whole world power thing. [00:12:48] That's not what people think, though. [00:12:49] They don't think she's. [00:12:50] That's awful. [00:12:50] They think that she's the one who's the queen. [00:12:54] Actually, I think one of the good points to this might be the fact that the Epstein files have become such a divisive issue within the Republican Party that trying to refocus it around somebody that's a Democrat is a good thing for people in Congress who are looking to kick the can down the line, not have to deal with it, you know, blowing up the Republican Party like it has been the last couple of months. [00:13:13] Yeah, we dug into it, and there are actually no photos of Hillary Clinton with Jeffrey Epstein. [00:13:17] So it's possible that she didn't really know him that well. [00:13:20] It's possible that, you know, maybe she met him at dinner or whatever, but that, you know, not long enough to stand for a photo op or whatever. [00:13:27] Now, obviously, Bill Clinton, that's a totally different story. [00:13:31] And so people are like, people, you know, make the assumption, well, you know, Bill knew him. [00:13:35] But then again, the reasons that Bill knew him, maybe Hillary wasn't around. [00:13:39] Yeah, no reason for her to be there. [00:13:42] Yeah, you know, it's like, in fact, she wasn't invited. [00:13:44] Yeah. [00:13:44] Yeah, definitely not invited. [00:13:46] He's on one of those boys trips that he was on. 
[00:13:48] Honey, I need you here. [00:13:49] Yeah. [00:13:50] I wasn't surprised that she stayed with Bill when he got a blowjob in the Oval Office because it was Bill Clinton. [00:13:56] He was the president. [00:13:56] And like, wife stands by her man. [00:13:58] But damn, that just probably just wrecked their relationship. [00:14:02] Bill off-womanizing, Hillary just dealing with it, getting bitter. [00:14:05] They'd say that it wasn't a relationship to begin with. [00:14:07] A lot of them are matters of political convenience. [00:14:09] They're basically like arranged marriages in politics. [00:14:13] I mean, I know, I remember there were a lot of videos that came out after the things were made public where there was distance between them. [00:14:23] You could kind of tell. [00:14:24] But Bill's philandering was well known long before Monica Lewinsky, long before it became national news. [00:14:31] I mean, there were rumors of Bill when he was in Arkansas, Arkansas. [00:14:36] Yeah. [00:14:37] Yeah. [00:14:37] So that was kind of par for the course for good old Slick Willie. [00:14:42] You know, regarding, I heard that Jesus came up with that now. [00:14:45] You look great. [00:14:47] Ghislaine attended Chelsea Clinton's wedding in 2013 or something. [00:14:51] Oh, did she? [00:14:52] Yeah, I just read that. [00:14:53] She made some, she did make some remarks. [00:14:54] I'm not sure where I saw it, but I'm not sure. [00:14:56] It might be over here, and I don't want to turn away. [00:14:58] But she made some remarks about her acquaintance about being an acquaintance of Ghislaine. [00:15:01] So she's like, it's like, Bill's out on boys' trips with Epstein, and Ghislaine is out on girls' trips with Hillary. [00:15:06] Ghislaine was literally being like the, she was, she was being the wingman. [00:15:10] She's like, let me take care of your wife. [00:15:12] You go have fun. 
[00:15:12] I wouldn't be surprised, too, if, like, because Epstein was dealing with such dark stuff that Bill's like, Hillary, you're never going to be any part of this part of my life. [00:15:20] I'm going off to do the dirtiest deals with the darkest money. [00:15:25] I don't want you anywhere near it. [00:15:26] I don't think Bill was looking for money. [00:15:27] I'm not going to be attributing chivalry now to Bill Clinton. [00:15:30] More like, just if this ever gets blown up in the press, I don't want you connected to him. [00:15:33] Like that kind of thing. [00:15:34] Well, that's awfully nice of him. [00:15:35] Yeah. [00:15:36] It was that way. [00:15:38] Bill, the altruist Clinton. [00:15:40] I think most people look at it as the one on the island with Epstein with his girls. [00:15:43] I think most people would look at it as the other way around, whereas she would be the one who's maneuvering him like a pawn and working behind the scenes to get him where he needs to go as a politician because he's too, you know, her idea might be he's not smart enough to do it on his own. [00:15:56] He's the good face of the Democratic Party at the time, but he doesn't necessarily have the ruthlessness that it takes to succeed in politics. [00:16:03] So she is the ruthless one and he is the face of the movement. [00:16:06] Yeah, if I understand correctly, people that have met Bill Clinton, they're like, you know, it makes perfect sense that he was a president. [00:16:11] When you meet him, it feels the same thing said about Barack Obama and stuff. [00:16:15] Like when you meet him, and also the same thing said about Donald Trump, he remembers your name. [00:16:20] You feel like there's no one else in the room. [00:16:21] It's like he's not. [00:16:22] That's a dangerous thing. [00:16:23] Oh, yeah. [00:16:24] When who was it that came here? [00:16:27] Larry Elder came to Tim Cast for the second time. [00:16:31] He shook my hand and said my name. 
[00:16:32] I'm like, there's no way. [00:16:34] Like, I was like, I met him for like 10 seconds the time before, like when they were leaving. [00:16:39] And he came and said, hey, Brett, how's it going? [00:16:41] So there's the only thing I can think is like somebody in the car was like, these are the people that work there so that you can remember their name. [00:16:47] And I was like, that was crazy to me. [00:16:49] I'm like, that was actually made me less trustful. [00:16:51] In theater school, I used to do that when I would be like a sophomore. [00:16:54] When all the new freshmen would come in, all their headshots would be on the wall. [00:16:57] I just go in the room and stare at the wall for like 20 minutes at all the headshots and names and memorize faces and names because it's such an important part of that industry. [00:17:05] Yeah. [00:17:05] It's memorized. [00:17:06] I could never be a politician. [00:17:07] I can't remember anybody. [00:17:08] It's part of power. [00:17:09] It's not psychological. [00:17:10] Your name is an anchor point in your brain. [00:17:13] So when somebody says it, all of a sudden you are hooked. [00:17:16] I mean, there's studies about this stuff. [00:17:17] That way they have the hostage negotiator constantly say your name on their own. [00:17:20] Absolutely. [00:17:21] There's an anchor point. [00:17:22] People say that to control a demon, if you know its name, you can control it like in mythology. [00:17:27] And so demons will hide their names because probably that very power, that intrinsic vibration that pulls you and changes you, just hearing that sound. [00:17:36] That went to a weird place, man. [00:17:38] It does. [00:17:39] It's only going farther. [00:17:40] We're only getting started. [00:17:41] We're only 15 minutes in. [00:17:43] Ian's driving the car. [00:17:44] We're all passengers. [00:17:45] Vibration. [00:17:46] Ian's driving. [00:17:46] We'll get to resonation later where the field itself moves. 
[00:17:50] But let's just. [00:17:51] That's why I will never hold office. [00:17:52] And I can't remember anybody else. [00:17:53] It's a family gap, right? [00:17:54] We went from Hillary or Clinton to demons. [00:17:56] Yeah, you know, they actually. [00:17:56] You can't remember a demonstration if it doesn't give you the right ones. [00:17:59] Yeah. [00:18:00] You ever ask someone for their name and they won't tell you? [00:18:02] Because they're afraid. [00:18:03] No. [00:18:03] I mean, it's weird. [00:18:05] Yeah. [00:18:06] No. [00:18:06] So I'm just sending photos on the bottom. [00:18:09] By the way, what you were just talking about, I've actually attributed Ian having that same power. [00:18:15] Where the day I met Ian, I said, everybody in your life should look at you the way Ian looks at you when you're talking for the first time. [00:18:23] Because it's like nothing else in the world exists. [00:18:25] That's a skill set. [00:18:26] But you were saying awesome stuff, too. [00:18:27] You were like really informative and like explaining a bunch of stuff to me. [00:18:31] That's important to be able to give somebody your full attention. [00:18:34] Yeah. [00:18:35] Yeah. [00:18:35] I still remember what Ian said to me the first day I came to work here and was like, Welcome home. [00:18:39] And I never forgot that. [00:18:40] My mom loved it too. [00:18:41] She's like, I like Ian. [00:18:42] First thing Ian ever said to me was the N-word. [00:18:45] You want to do it again? [00:18:46] Man, if we were online right now. [00:18:47] I'm lying. [00:18:49] It's not true. [00:18:49] It's not true at all. [00:18:50] Nice to meet you. [00:18:51] Yes, that's exactly what it is. [00:18:54] So, yeah, I mean, look, I don't think this is actually even really big news. [00:18:58] So I feel like we could kind of move on. [00:18:59] We covered it because it was kind of like the thing that was all over the headlines and stuff. 
[00:19:04] But yeah, there's not really any significant substance and nothing was really said. Does she have multiple days she has to testify, or is it just this one day? [00:19:11] Um, I don't know. [00:19:12] I think that I think that the actual committee decides that. [00:19:15] Okay. [00:19:15] If they feel like they get through and all the questions are answered and everyone gets. [00:19:18] There was a statement after I saw that today, too. [00:19:21] And it was just a just a bunch of, you know, why did you do this? [00:19:24] Why did you have me come in? [00:19:25] I'm summarizing, you know, but it was more of a you allowed all these other people to skip this hearing and just provide a written statement. [00:19:32] I provided a written statement too. [00:19:34] She was saying this was nothing but politics, of course. [00:19:36] You know, that's why you dragged me in here. [00:19:38] It's like, well, yeah. [00:19:39] Yeah. [00:19:39] I was going to say, well, you're right. [00:19:41] She's right. [00:19:42] You know, I mean, you're a politician. [00:19:43] So it follows. === Brende And Epstein Ties (02:06) === [00:19:45] You're one of the most powerful people in the world. [00:19:48] You're arguably the most powerful woman in the world. [00:19:51] Yes, it was politics. [00:19:52] This goes back to that comment that Trump said in the first campaign. [00:19:55] Yeah. [00:19:55] Because he'd be in jail. [00:19:56] She'd be in jail. [00:19:58] Yeah. [00:19:59] So, all right, we're going to move on to this story here. [00:20:01] The CEO of the World Economic Forum quits after Epstein ties come to light, excuse me. [00:20:08] Sorry. [00:20:09] Blessings, sir. [00:20:11] From Reuters. [00:20:12] In Zurich, the president and CEO of the World Economic Forum, Børge Brende, said he was stepping down on Thursday, a few weeks after the forum launched an independent investigation into his relationship with the late U.S. sex offender, Jeffrey Epstein. 
[00:20:27] Brende, who became president of the WEF in 2017, announced his decision in a statement following disclosures from the U.S. Justice Department that showed the Norwegian had three business dinners with Epstein and had also communicated with the disgraced financier via email and text message. [00:20:43] After careful consideration, I have decided to step down as president and CEO of the World Economic Forum. [00:20:48] My time here, spanning eight and a half years, has been profoundly rewarding, said Brende, a former Norwegian foreign minister. [00:20:54] Issued by the WEF, the statement made no mention of Epstein. [00:20:58] However, Brende told Norwegian media he was sorry about how he had handled his dealings with the American and that he did not want the issue to be a distraction for the forum, which organizes the annual Davos Summit. [00:21:10] This is, again, this is kind of a, it's interesting to see the kind of the repercussions of the Epstein files all over Europe. [00:21:20] And we're just not seeing anything here in the United States. [00:21:23] I mean, we're talking about people stepping down, Casey Wasserman stepping down from the Wasserman agency. [00:21:28] Yes. [00:21:29] And his weren't even ties to Epstein. [00:21:30] They were ties to Ghislaine Maxwell. [00:21:32] Yeah, and Pritzker's brother, the guy from the, I think it's Hyatt, the hotel chain. [00:21:38] Bill Gates just like said, sorry, and just kept his job. [00:21:42] Gates is like, I have a little bit of interest. [00:21:44] He's like, I had two affairs. [00:21:45] That was the real. [00:21:45] Technically, Bill Gates stepped down from IBM a long time ago, right? [00:21:49] Now he's just... Microsoft, yeah. [00:21:50] Yeah. [00:21:51] Well, yeah, Microsoft, my bad. === Organizations And Offloading Empathy (15:22) === [00:21:52] But yeah, now he's just like a philanthropist who's trying to mutate mosquitoes and stuff. 
[00:22:00] Saying that he's trying to cure malaria, but we all know what he's really trying to do. [00:22:04] He wants to implant robots in you. [00:22:06] I'm just kidding. [00:22:07] Oh, that's true. [00:22:09] Implant and live long and prosper. [00:22:11] He was a big proponent way back when, too, of the COVID era. [00:22:18] He's like machine man. [00:22:20] I don't know. [00:22:20] I don't want to go too hard. [00:22:21] Musk is when I think about machine man now. [00:22:24] All these tech, these techno, what do you call them? [00:22:26] Tatsuo, technocrats. [00:22:27] I don't know. [00:22:28] Tetsuo. [00:22:28] This isn't any surprise from him, though, because even, I mean, obviously he divorced his wife. [00:22:33] Yeah. [00:22:33] And you talked about Bill Clinton. [00:22:35] I think she divorced him, right? [00:22:36] Yeah. [00:22:36] Because of this stuff. [00:22:37] So did it come? [00:22:38] He had so many affairs when he was at Microsoft. [00:22:42] So just like Bill Clinton, it's like this wasn't anything that was unknown. [00:22:45] So it wasn't a surprise to me to actually see him linked in this manner whatsoever. [00:22:50] But for him to just come out after everything else is known about him, this response was pretty appropriate. [00:22:54] Yeah. [00:22:55] Saying like, yeah, I did two Russian girls. [00:22:57] Nobody benefited more from that divorce than like NGOs and non-profits. [00:23:02] Like Melinda Gates is just giving it away. [00:23:04] Same thing with Mackenzie Bezos. [00:23:06] They're just like, I mean, look, it's fine that they get divorced and they're like, I was with him when he made his money. [00:23:12] Half of it's mine. [00:23:12] Fine. [00:23:13] The guys are set. [00:23:14] They're fine too. [00:23:15] I hate the fact that the women are giving away money, particularly because of who they're giving it to. [00:23:20] They're giving it to all these progressive causes and stuff. 
And it's just like, man, can't you find something better to do with that money? [00:23:26] Even causes with ties to terrorist organizations. [00:23:29] Yeah. [00:23:30] Yeah, because it's all just misplaced empathy. [00:23:35] You know, they're like, oh, look, this makes me a good person. [00:23:37] These poor are suffering, whatever. [00:23:39] I'll give this money away. [00:23:41] And really, it turns into money going to terrorists or going to organizations that are looking to do things like gender reassignment surgeries for children or stuff like that. [00:23:52] It's all just the most nefarious stuff out there. [00:23:55] And they're just like shoveling cash at these groups. [00:23:57] Do you think that's on purpose? [00:23:58] Okay, so let's rephrase that question. [00:24:00] So I kind of have the same point. [00:24:02] It's like when you give to the nonprofits and the NGOs, especially if you don't do like a bunch of research into where the money goes, even if we don't want to talk about like shady places they could be giving it to, but whether they're spending the money well, right? [00:24:13] Like how much of it is actually going to whatever cause you're raising and how much of it is going to employee salaries and things like that. [00:24:19] Do you think it's a form of offloading their empathy onto this company or are they doing it specifically because there's like nefarious stuff going on and they want to spread the money around to nefarious causes? [00:24:31] Honestly, I think that they just got a boatload of money and it looks good. [00:24:34] They got a boatload of money that they didn't have to work for and they're just like, man, I got all this stock and I can sell some stock and, you know, it'll piss my ex-husband off. [00:24:42] And, you know, I'm going to give this away to this group and this group and this group. [00:24:45] I don't think that they look into it. 
[00:24:47] I don't think that they're malicious, or that they're like, oh, I want to help terrorists or anything. [00:24:51] I think they believe the face of whatever NGO or organization they're talking about; they believe the public-facing mission statement on their website, and they're like, oh, they seem nice. [00:25:03] Let me give them a billion dollars, or 50 million, or whatever number it is. [00:25:09] The funnier version of this is, like, if Mackenzie Bezos and Melinda Gates just start giving money to their husbands' competitors. [00:25:16] Like, just start funding all the people going against them. [00:25:20] I mean, that would be the ultimate spite move, wouldn't it? [00:25:23] Every time he, like, complained about something when he got home from work, about some dude he just doesn't like, oh, this guy's a jerk. [00:25:28] He wouldn't sign this business deal. [00:25:30] She just starts giving money to all the people he complained about when they were eating dinner. [00:25:33] Mackenzie Bezos right now and all of Elon's former women. [00:25:37] Yeah, seriously. [00:25:39] Like, I'm giving to Open Claw. [00:25:42] I'm giving to Claude right now. [00:25:44] Anthropic's making a killing. [00:25:49] But again, back to the story: the fact that there's all these repercussions going through even the government of the UK, and nothing's really happening here in the States when it comes to anyone that's alleged to be involved. [00:26:07] None of the lawyers, none of the friends, nothing seems to be going on. [00:26:11] I mean, as far as arrests or just in general, nothing's changing. [00:26:14] Remember, I'm team nothing-ever-changes. [00:26:16] Yeah, I mean, I understand that. [00:26:18] But, I mean, like you said, there's only two guys that have stepped down from their positions.
[00:26:25] Well, it's the first time in how many hundreds of years that a royal in the UK, you're referencing. [00:26:30] Yeah, I mean, stripped of his titles, arrested, thrown in prison. [00:26:33] I think that's true. [00:26:34] Yeah. [00:26:35] And Casey Wasserman in the U.S., that was a little bit different, because it's connected directly to Hollywood and all of those people. [00:26:40] It became a virtue signal on behalf of all the people that he represented, because every one of those clients, whether it's music or movies, they all have their own self-interest. [00:26:52] They can't be seen being attached to this guy. [00:26:54] And his whole business model is to be attached to individuals, not necessarily to a product. [00:26:59] So it makes sense that he would get, you know. [00:27:01] Yeah, I mean, well, they asked him to, like, he's going to sell his own agency that bears his name. [00:27:07] Like, that's even crazier: his company with his name, and they're trying to buy him out so that he has to leave. [00:27:13] Yeah, I wonder if they'll change the name after he leaves. [00:27:19] Probably. [00:27:19] You know, it's kind of tainted, break the ties. [00:27:22] How deeply was he entrenched with Epstein? [00:27:25] I mean, he wasn't attached to Epstein, from what I understand. [00:27:27] He had emails. [00:27:28] He had, like, an affair with Ghislaine Maxwell, if I'm remembering correctly. [00:27:32] Which, I mean, to be honest with you, Ghislaine is just as bad as Epstein. [00:27:35] As much as Epstein kind of gets the focus all the time, she was trafficking just as much as Epstein was. [00:27:41] Maybe she wasn't actually engaging in the rape of minors like Epstein did. [00:27:47] But, you know, she was helping out, and she was making sure that there were young people available for Jeffrey to abuse.
[00:27:56] She would refer to them as nubiles, and they'd drive around New York looking for nubiles, you know, underage women, basically, and find them, like, hot chicks that they thought were going to be models, basically. [00:28:06] And they're like, let's get them. [00:28:08] Let's stop. [00:28:08] They literally would stop on the street. [00:28:10] I don't know if this is true, but, like, they'd park the car and be like, hey, you, you're exactly who I'm looking for. [00:28:15] Want to be famous? [00:28:16] And the kids are like, yeah, I'm 16. [00:28:18] I'm an aspiring model. [00:28:19] Nubiles, Ghislaine would call them. [00:28:22] So I've heard. [00:28:23] I don't know if it's true or not. [00:28:24] How crazy that she had a name for it. [00:28:26] What's that? [00:28:26] There's two things that really bother me about the whole Epstein arc. [00:28:30] The first is that we don't ever really see a lot of the details, you know. [00:28:34] So, like, in this story right here, there's text messages and everything. [00:28:37] Some of the emails make it out, you know, and it's very clear as to what was going on. [00:28:40] But in this case, Epstein almost seems like he's the black spot. [00:28:44] You know, if you had any association with him whatsoever, then you're shamed for life. [00:28:49] You know, even if you just had a phone conversation with him at one point where he was saying, hey, come out to my island. [00:28:54] You're like, no, that's okay. [00:28:55] I don't need to. [00:28:56] But then you just talked even in the slightest terms of a business deal with him. [00:29:00] Because he was a financier, hands down, right? [00:29:02] That's what he did. [00:29:03] But then that leads into the second thing: this has been lingering on so long. [00:29:09] I'm wondering when this story arc comes to an end, because it's been, how long is it? [00:29:15] Really? [00:29:15] How long has it been now? [00:29:16] Well, he was arrested in 2019, correct? [00:29:18] Yeah.
[00:29:19] I mean, he had his first conviction well before that. [00:29:21] Yeah. [00:29:21] Yeah. [00:29:21] Wasn't he, like, out on probation or house arrest for a number of years? [00:29:25] No, but plenty of people continued to work with him after, even though he had convictions already. [00:29:32] Didn't matter. [00:29:33] Yeah. [00:29:34] I agree fully with what you're saying, Rick. [00:29:36] This feels like a cudgel that will be used for decades, until all these people that associated with Epstein are dead and gone. [00:29:43] In the back of their minds, they're like, shit, what if it drops that I made a phone call with Epstein one time? [00:29:47] It's the name, the guy, like, come on, it's like Hitler. [00:29:50] Yeah, obviously the Nazis were bad. [00:29:51] It was horrible. [00:29:52] But, like, people that get any association with Nazi Germany, Hitler, any of that, it's like the most demonic association for you. [00:30:00] You shouldn't be afraid to name your kid Adolf. [00:30:02] Shouldn't be. [00:30:02] Yeah, Adolf's a beautiful name. [00:30:04] Hitler's a cool name, too. [00:30:05] Just turns out the psycho ruined it from here on out. [00:30:08] It's also a form of selective enforcement. [00:30:10] Look at what happened with Weinstein and the amount of people who were caught in the Weinstein net. [00:30:15] Leslye Headland is still working, and she was Weinstein's assistant. [00:30:18] And people are like, look, am I supposed to believe that you didn't know what was going on? [00:30:21] And most people are like, no, I don't buy that. [00:30:23] But she's still working in Hollywood, just because they don't actually enforce the rules equally, because it's all about who you know. [00:30:28] Everybody in Hollywood over a certain age had awareness of it. [00:30:33] They were making jokes about Epstein at the Golden Globes or something. [00:30:38] Weinstein.
[00:30:38] Yeah, Weinstein. [00:30:39] Yeah, Harvey. [00:30:40] They were making jokes about Harvey. [00:30:41] My bad. [00:30:42] They were making jokes about Harvey from the stage, you know. [00:30:45] So it was an open secret in Hollywood. [00:30:48] I mean, Weinstein was laughing at the jokes. [00:30:50] The famous Ricky Gervais joke. [00:30:52] Yeah. [00:30:53] I remember that. [00:30:54] Yeah. [00:30:54] So, I mean, it's like, if you're over the age of 35 or 40 in Hollywood, you knew, and it took a long time for people to come out. [00:31:06] I mean, someone like Oprah Winfrey, tons of pictures with him, friends with him, buddy-buddy. [00:31:12] And of course, when the things that he did and the coercion come out, she just doesn't say anything, but nobody's like, hey, Oprah, how come you're still, like, a queen of media or what have you, even though you were definitely buddy-buddy with Harvey Weinstein, you know? [00:31:30] Well, she's also deeply embedded in the production side of things. [00:31:33] So she spread her money around to finance projects. [00:31:36] She's not just somebody who's in front of the camera. [00:31:38] You know, it's the same thing. [00:31:39] There's actually this weird thing where every time Mel Gibson makes a new movie, even though, like, so many people in Hollywood hate him and the public loves him, somehow there will be, like, nice reports written about the movies that he's making, because he's deeply embedded within the production side of things. [00:31:55] So he can go to these outlets and they can write some favorable pieces about him. [00:31:59] And then he'll be right next to a hit piece from somebody else who doesn't like him, because he's got so many ties to the behind-the-scenes stuff in the industry. [00:32:06] He's also got 400. [00:32:08] Notably, the other day, he was named as not being in the Epstein files. [00:32:11] He was intentionally named as not being in there.
[00:32:14] He also made $400 million off of the movie about Jesus that he made. [00:32:21] There's also, like, the pictures of George Bush. [00:32:24] They're like, not even in the Epstein files. [00:32:26] Bombed kids overseas just for the hell of it. [00:32:30] You didn't even have to force him. [00:32:31] Yeah. [00:32:32] Yeah. [00:32:32] No, no, no coercion at all. [00:32:34] Yep, totally. [00:32:35] Love of the game. [00:32:36] Good lord. [00:32:38] It is important that we don't demonize people for having connections to someone that's a vile creature. [00:32:43] Like, just because they knew a guy, or they had a dinner with him eight years ago and then the guy went off and did psycho shit, like, that doesn't mean you're a psycho. [00:32:50] It's okay to associate, or to have associated, with crazy people in the past. [00:32:54] Doesn't make you crazy. [00:32:54] It doesn't make you a villain. [00:32:56] It's not illegal. [00:32:56] So it's really sad. [00:32:57] Like, when people are like, shit, my name is attached to the guy. [00:32:59] I got to resign from all my things. [00:33:01] Maybe there's something going on with the World Economic Forum guy. [00:33:03] Maybe something deeper was going on with that guy. [00:33:04] And he's like, I got to get out of here before they start asking questions, maybe. [00:33:07] But the shame of running away from your job because you got named in an email from 18 years ago is, like, bro, that's the problem. [00:33:13] That's the public perception. [00:33:14] And everybody's going to hate me for making this statement. [00:33:16] But, I mean, listen, the public got over Diddy, right? [00:33:19] Yeah. [00:33:20] I think the public needs to get over Epstein, though. [00:33:22] The world needs to get over Epstein, because I don't see any real moves towards actually preventing, limiting, or going after any kind of child trafficking anywhere else. [00:33:32] I mean, look. [00:33:33] I just don't think.
[00:33:34] I think, at least when we're talking about the Twitter sphere and the people on there, I just don't think that they're going to get over it. [00:33:40] I don't blame them, for the most part, because I make the joke all the time. [00:33:43] I say I'm team nothing-ever-changes, because it does feel that way. [00:33:46] And it's an offshoot of that, where you look at these people doing awful shit. [00:33:50] You see them break the rules. [00:33:52] You see them ignore the will of the people, and nothing ever changes. [00:33:55] And there's no greater instance of that than knowing that all of this stuff is happening and knowing that nobody's going to be held accountable. [00:34:02] And it's even worse when you see it happening to somebody in the UK. [00:34:06] But in the U.S., where all this was based, they're just like, I understand where the black pill comes from on that. [00:34:11] Now, I understand that there's a gap there between the people who are policy-minded, who are saying that we need to get over this because we have other things we need to worry about, and this can't be the only thing that we focus on. [00:34:22] But when you're talking about the abuse of children, that's just a chord that's very hard for some people to separate from. [00:34:27] You know, it's interesting that there's all this focus on the Epstein files and the terrible things that Epstein did, but people don't really have the same, or at least the left doesn't have the same, kind of outrage over all the children that were trafficked through the southern border under Joe Biden. [00:34:44] Like, the only reason they care about Epstein at all is because it's tangentially connected to Trump. [00:34:49] Absolutely.
[00:34:50] There were far more kids that were hurt and died and abused, you know, by cartels that were trafficking children over the border all the time during the four years that Biden was president, and they don't say a word, not a peep. [00:35:05] The Super Bowl. [00:35:06] I mean, the Super Bowl weekend is the biggest trafficking weekend. [00:35:08] Yeah. [00:35:08] Every single year. [00:35:09] Yeah. [00:35:10] Yeah, that's true. [00:35:11] But why? [00:35:11] Like, just because of the travel into the country? [00:35:14] Hardies. [00:35:14] Yep. [00:35:15] People buy commodities, and unfortunately, the commodities are children. [00:35:18] It's horrible. [00:35:19] That's what I'm saying. [00:35:20] So people know that this exists, and there's one person whose face has been made, kind of, the avatar for it. [00:35:28] And the only other person who's been held accountable somehow isn't able to give other names and bring anybody else to justice. [00:35:36] People can't buy that. [00:35:37] Like, they don't buy that the one guy died and the other one's in jail and then just nobody else. [00:35:42] And I don't think they should be expected to believe that. [00:35:43] I'm not saying that. [00:35:44] What I don't hear, man, is, you know, what are we actually doing about it right now? [00:35:48] No, I'm not saying it's not a problem; it's that it feels like things are getting lost and other stuff could be getting done. [00:35:53] But, as I was saying the other day, most people these days are one-track voters, right? [00:35:59] Like, the other day I opened X, and half the people are complaining about Iran and how we're going to have World War III. [00:36:05] Then we have people complaining about Epstein. [00:36:06] Then we have people complaining about glyphosate. [00:36:09] Everybody's a one-issue voter.
[00:36:10] And if you're not taking care of their one issue, they're not going to support you anymore. [00:36:14] And that's made worse now by the fact that the internet gives you all the news all the time, and you're bombarded by bad news constantly. [00:36:21] Yeah. [00:36:23] Everybody's a one-issue voter. [00:36:25] It feels like a lot of people are. [00:36:27] I'm not. [00:36:27] I mean, not everybody, but maybe the plebs, generally, like, the common man that is emotional. [00:36:32] You always vote? [00:36:33] No. [00:36:33] I only vote if I know what I'm voting for. [00:36:36] That's always when you're like, I'm not a one-issue voter. [00:36:39] I don't vote. [00:36:39] Don't vote. [00:36:41] You don't have to be a one-issue voter if you don't vote. [00:36:44] Yeah, it's more about what. [00:36:46] I don't think anyone should ever feel like they're supposed to vote. [00:36:48] You got to know what you're doing if you're going to participate. [00:36:51] You know my feeling about it. [00:36:52] We need less voters. [00:36:54] Fewer voters, fewer people going to the polls, fewer people there. [00:36:59] Or more legitimate voters, you know, really people that understand what they're doing. [00:37:02] No, no, I'm pretty sure I want less. [00:37:04] Fewer. [00:37:05] Fewer votes. [00:37:06] This is a good bookend, but I just truly believe that all of this news about Epstein and everybody stepping down is really about protecting the organizations that are forcing these individuals to step down. === Taliban's Long-Term Goals (14:45) === [00:37:14] That's it. [00:37:15] Oh, yeah. [00:37:16] Yeah, I agree. [00:37:16] I agree. [00:37:17] It's not about the individuals. [00:37:18] It's not about doing what's right for society. [00:37:20] It's just about protecting the reputation of these organizations. [00:37:22] Yeah, I imagine the board of the WEF was like, bro, beat it. [00:37:26] You got to get out of here, man. [00:37:27] Get out.
[00:37:28] Bill Gates can stay because we can't make him leave, but you're out of here. [00:37:32] Like I said, Bill Gates is already, you know, he's not at Microsoft anymore. [00:37:35] He's already made his bag, and he's just giving it away and trying to mutate the mosquitoes. [00:37:40] You know, that's it. [00:37:42] Cure malaria. [00:37:43] That's what he wants to do. [00:37:44] He wants to use genetically engineered mosquitoes to cure malaria. [00:37:48] What a freak. [00:37:49] Yeah. [00:37:49] So that word, malaria, is so funny. [00:37:52] Bad air. [00:37:53] Is that what that means? [00:37:54] Mal air? [00:37:55] I don't know. [00:37:55] Malaria. [00:37:56] I think it means bad air. [00:37:58] No one knows what, where. [00:37:59] I think it's a demonic word, though, man. [00:38:00] Seriously. [00:38:00] We better not name it. [00:38:01] You want to go to demons? [00:38:02] No. [00:38:03] Let's talk about dreams. [00:38:04] We're going to go to Pakistan right now. [00:38:07] From Firstpost: Pakistan's Khawaja Asif declares open war with Afghanistan after deadly border clashes. [00:38:14] It's worth noting Pakistan has nuclear weapons, but there aren't a lot of cities in Afghanistan that are worth using a nuclear weapon on. [00:38:25] Pakistani Defense Minister Khawaja Asif has declared an open war with Afghanistan after the Taliban administration said that its forces killed and captured several Pakistani soldiers during a cross-border offensive. [00:38:37] Our patience has reached its limits. [00:38:39] Now it is open war between us and you, Khawaja Asif posted on X. [00:38:46] What was that? [00:38:47] That's a great forum to declare war. [00:38:49] Taliban spokesperson Zabihullah Mujahid, I think, said in posts on X that multiple Pakistani troops were killed and others taken prisoner.
[00:39:01] He added that a large-scale operation has been launched against Pakistani military positions along the Durand Line in response to what he called repeated provocations. [00:39:09] Meanwhile, Pakistan's Interior Minister Mohsin Naqvi said that Islamabad's retaliation to the Taliban attacks was a befitting response. [00:39:18] And blasts and gunfire rang out in the cities of Kabul and Kandahar under Operation Ghazib-il-Haq. [00:39:24] So you think that the Paks are going to nuke the Afghans? [00:39:29] Or do you think they'll waste a nuclear weapon on them? [00:39:32] Is that an atomic weapon? [00:39:33] Well, I mean, you would know better about this. [00:39:36] You like talking about the Middle East. [00:39:37] I don't know about that. [00:39:38] But like I said earlier, I don't think that there's a city in Afghanistan worth using a nuclear weapon on. [00:39:45] The Paks have to worry about India. [00:39:47] India's got nuclear weapons. [00:39:48] The Paks have nuclear weapons. [00:39:50] They're pointed at each other. [00:39:51] They hate each other. [00:39:52] I don't think that they'll waste them on the Afghans. [00:39:55] I'm surprised. [00:39:56] Well, actually, no, now that I think about it, not really. [00:39:58] I imagine the Afghans wouldn't have done this prior to the U.S. pullout, not because the U.S. was there, but because they didn't have the hardware. After the U.S. left all those weapons, the U.S. is tacitly responsible for this conflict, I imagine, because I bet they're all running around with M16s and PVS-14s on their helmets at night, shooting at the Paks, because they've got all this U.S. hardware. [00:40:27] All the weapons in Mexico? [00:40:29] Well, I mean, I imagine some made it there. [00:40:31] Well, Fast and Furious, yeah, but the guys in Afghanistan have military stuff. [00:40:36] They don't have American civilian stuff, so they've all got machine guns now.
[00:40:42] So they've got, I'm sure they've got lasers and scopes and all the stuff. [00:40:46] You see all the propaganda videos that the Afghans have released after the U.S. pulled out. [00:40:54] They're all wearing uniforms. [00:40:55] They're all kitted up. [00:40:57] Not saying that they know how to use any of this stuff properly, but I think a couple of guys got into a Black Hawk and crashed it. [00:41:02] They did. [00:41:03] I saw that. [00:41:03] Yes. [00:41:04] That's happened twice now. [00:41:05] Oh, really? [00:41:06] Black Hawks. [00:41:06] You'd think that they would be like, let's find a place where we can send guys to learn how to fly this thing before we let them get in. [00:41:12] Well, you got YouTube. [00:41:13] Couldn't they just look it up on YouTube? [00:41:15] I've been in a helicopter, and they're not easy to fly. [00:41:18] They're not easy. [00:41:19] I got to sit in the flight sims. [00:41:22] I mean, I guess, but, like, do you think they don't have internet like that in Afghanistan? [00:41:28] They're running, like, Nintendos. [00:41:29] They're, you know, Nintendos from '93. [00:41:33] They're not running modern flight sims. [00:41:35] Yeah, they got to get the Cera Sim UH-60L Black Hawk, a top-tier flight simulator add-on for Microsoft Flight Simulator X. [00:41:42] I mean, legit, that's what they should be doing if they want to learn how to fly this stuff. [00:41:45] Yeah. [00:41:46] Why are they not? [00:41:47] I don't know. [00:41:47] Sorry, guys, if I told the Taliban something you were trying to hide from them. [00:41:51] I think they know about video games. [00:41:52] You're just picturing them on their phone, watching a YouTube video about how to fly a Black Hawk. [00:41:56] Get yourself an X-52 or one of those flight sticks? [00:41:59] I was just about to bust mine out for Elite Dangerous, this game. [00:42:02] Yeah, the fucking.
[00:42:03] So far, we've been helping the Taliban tonight, and all of Elon Musk's women. [00:42:08] The U.S. wants. [00:42:10] Well, what I heard was the military wants, what was it, that base? [00:42:14] Bagram Air Force Base, that they had basically surrendered. [00:42:16] They want it back. [00:42:17] At least Trump had mentioned a year ago he wanted it back. [00:42:20] Yeah, I mean, look, if the U.S. wanted it, they would just go take it. [00:42:22] Yeah, you would think so. [00:42:23] And it would be a great media play, seriously. [00:42:28] You know, because Biden was the one that pulled out, and everybody shames him for all that. [00:42:32] Imagine if Trump took it back. [00:42:34] He's like, you surrendered it. [00:42:35] No, like, he would go there. [00:42:37] That's what I'm saying. [00:42:38] He would go there. [00:42:38] He would go and stand on a tank. [00:42:42] That's the image. [00:42:43] Seriously. [00:42:44] I mean, we took it back. [00:42:45] Biden gave it up. [00:42:46] Took it back. [00:42:48] Can't let him have it. [00:42:49] It's too valuable, too valuable. [00:42:54] Um, I just don't see, I don't think there's a lot of value in it for the U.S. anymore. [00:42:56] Right now they're focused on Iran, and whereas, yes, if we had Bagram, you could, you know, launch strikes from there, I think the atoll in the Indian Ocean is serving the purposes pretty well, and, you know, you've got two carrier groups in the Middle East now. [00:43:19] So my question on this open war thing is, what does open war actually mean? [00:43:23] I mean, I think, like, a Gaza scenario, you know, is that what they're looking at, where it's like, we're just gonna eliminate the Taliban?
[00:43:30] Now, I don't know that the Pakistanis can. I mean, I know that they have an air force that functions there. [00:43:37] There was a big old dogfight between the Paks and the Indians, maybe a year ago, a year and a half ago. [00:43:46] People got to see, I forget what kind of plane it was, but it was the first actual engagement, and the Paks beat the snot out of the Indians, if I understand correctly. But I don't know that they have the ability to really, you know, wipe out the Afghans the way that the U.S. did. [00:44:00] I know the Afghans don't have significant anti-air. [00:44:06] You know, they don't have a lot of SAM sites or anything like that. [00:44:08] You know, it's not like they got them posted up in the mountains where they can shoot them down, but I imagine they still have the Stingers that the CIA gave them 30 years ago, and those worked against helicopters, just ask the Russians, you know. [00:44:21] So, I mean, I don't know the extent of what open war is, but, like I said, if Pakistan wanted to, they do have nuclear weapons. [00:44:30] They have atomic bombs. [00:44:31] I don't know. [00:44:32] I don't know if they have thermonuclear weapons, but they have atomic bombs. [00:44:34] I think they've got missiles, too. [00:44:35] Yeah, nuclear warheads on missiles, yeah. So what percentage of the weapons used against us, and against allies of ours, has just been left by the CIA or given to these people by the CIA? [00:44:46] When it comes to Afghanistan, just in general, like, how much of the weapons supply that's used against U.S. forces is something that's left over from a time when they might have been an ally? Oh, I mean, like the Mujahideen.
[00:44:59] So, I mean, Afghanistan, the combat that was going on in Syria, the Iraqis had a lot of funding from the U.S., because the U.S. was funding Iraq when they were fighting Iran. So, I mean, a lot, a lot. [00:45:17] I don't know about Vietnam. [00:45:18] I think the North Vietnamese were getting funded by China. [00:45:22] So they were getting Chinese AKs and stuff. [00:45:25] Soviet, yeah. You know, Soviet bloc, China. [00:45:29] So, I mean, a lot of it. [00:45:30] Though, you know, in Afghanistan, a lot of the fighting, like, once the U.S. kind of took it, it was a lot of, you know, just some dude on a hill with an old bolt action who shot a couple rounds at a forward operating base, and then the U.S. put up a bunch of helicopters and blew up the top of the mountain, you know, but the dude is already gone. [00:45:50] He's like, I take a couple shots and run. So, yeah, I don't know exactly how much, but I'm sure it's a lot. [00:45:56] Yeah, sure it's a whole lot. [00:45:58] Um, so I don't know, I don't think that this is going to turn into a broader conflict. [00:46:06] I mean, what does Pakistan actually get from wiping out a bunch of Taliban? 18 people? [00:46:12] Is that what it was? [00:46:13] Let's see, the 18. [00:46:16] Yeah, the Taliban government in Afghanistan said the attacks were in response to Pakistani strikes earlier this week, which reportedly killed 18 people, as Islamabad said it targeted alleged militant camps and hideouts. [00:46:26] Aren't they all militants? [00:46:28] Yeah, it's kind of. [00:46:29] Yeah, the Taliban took 19 outposts on the border and then killed 55 Pakistani soldiers, according to this article from India.com.
[00:46:37] Well, I mean, what is the American interest in this? [00:46:39] We don't. [00:46:40] Uh, you have a border state. [00:46:42] It's a border state of Iran. [00:46:44] That's what Afghanistan was all about, basically. [00:46:48] But Pakistan and Afghanistan are the other border. [00:46:51] Yeah, they're on the east. [00:46:52] Pakistan's on the eastern side of Afghanistan. [00:46:53] Yeah, so it's pretty far away from Iran. [00:46:56] Yeah, but it's securing Afghanistan. [00:46:58] Is that what your question was? [00:46:59] I mean, like, what is the U.S. interest in this at all? [00:47:01] Like, you were saying it borders Iran, but what is our interest right now, to get involved or to stay out? [00:47:07] I don't think we would. [00:47:08] Well, to stay out. [00:47:09] I mean, like I said, if the U.S. were to take a side, which, I mean, the U.S. is friendly-ish with the Paks. [00:47:18] And, you know, they're not so friendly with the Taliban. [00:47:21] So the U.S. ostensibly could decide they're going to say, okay, we're going to help the Pakistanis. [00:47:27] But, I mean, the Pakistanis don't really need our help to kill Afghans. [00:47:32] And if the U.S. were to side with the Afghans, you're talking about a country that has nuclear weapons. [00:47:38] They might possibly decide, all right, the Americans are coming. [00:47:41] We're going to blow up a nuke or nuke the Americans. [00:47:44] Not that I think that's going to happen, right? [00:47:46] I think that's incredibly unlikely. [00:47:48] But, you know, anytime there's a country that's involved in any kind of conflict, if they have nuclear weapons, that is part of the equation. [00:47:55] That's something you have to actually think about. [00:47:58] That's something that was talked about with Ukraine a lot. [00:48:01] How much will the U.S. get involved in helping Ukraine?
[00:48:05] Because if an American is killed by a Russian, then you can easily imagine a situation escalating out of control. [00:48:11] And Russia and the U.S. have the biggest nuclear arsenals on earth. [00:48:16] You were going to say something? [00:48:18] I think the long-term militaristic goal would be to subdue and eradicate the Iranian theocratic regime and then install a liberal economic Middle Eastern authority there, like the king of Iran, basically, what's his name, Reza Pahlavi, the third, put him back in, and then allow Israel to basically govern the Middle East. [00:48:40] So, Scott Horton and Martyr Made, they do a show together, which is really great. [00:48:46] And Darryl Cooper is Martyr Made. [00:48:49] He said that instead of going to World War III, he thinks there's going to be, like, a quadripolar setup, where the Chinese govern Northeast Asia, the Russians govern Middle Asia, the liberal economic order governs the West, and Israel governs the Middle East. [00:49:03] And then those four powers will kind of establish hegemony in some form, which is, like, the least worst outcome. [00:49:10] I can't stand theocracy. [00:49:12] I mean, trying to govern with religion is insane. [00:49:16] It is not agile enough to function as a government. [00:49:20] It comes out of a text that's 2,000, or thousands of, years old. [00:49:24] Anyway, depending on which religion you're talking about. [00:49:26] What's that? [00:49:26] Depending on which religion you're talking about. [00:49:28] If you had a legit religion. The American government's kind of like a religion. [00:49:31] I have faith in that Constitution. [00:49:33] It works pretty well for the Saudis. [00:49:37] I don't know, they're a monarchy. [00:49:40] They're not a theocracy, though. [00:49:41] They're definitely. [00:49:42] I mean, that's where Mecca is. [00:49:44] Wait a second. [00:49:45] Aren't they literally a theocracy, though?
[00:49:47] Because, like, Saudi Arabia, I could be wrong. [00:49:49] Like, Islam and Islamic law. [00:49:51] Yeah. [00:49:53] So, I mean, they are a monarchy, but they are. [00:49:56] If they weren't selling us oil, they'd be prime enemy number one, if they weren't on board with the economic order right now. [00:50:01] And they've also done a lot to reintroduce American interests there by trying to get people to come out to Saudi Arabia. [00:50:07] Yeah, I mean, the Saudis, their government is definitely not something that I would want. [00:50:13] I wouldn't want to live under Saudi rule, but they're one of the least worst in the Middle East. [00:50:22] The Emirates seems like it's all right for tourists. [00:50:26] Most Americans can go to Saudi Arabia. [00:50:29] You can go to Dubai. [00:50:32] Well, that's why they've put so much time and effort into kind of courting American celebrities. [00:50:38] They've paid WWE billions of dollars to do shows over there, billions of dollars. [00:50:43] Comedians, musicians, and a lot. [00:50:45] And most of them end up having to come back and answer questions from people who are like, you know, what are you doing over there? [00:50:52] You were kind of woke before. [00:50:54] What's with the human rights abuses? [00:50:56] And then they have to either sidestep the question or just say they don't care. [00:51:00] Learn to dance, dance around the question. [00:51:01] Or dance around the question. [00:51:02] Yeah, they got to learn that Donald Trump weave. [00:51:03] Yes. [00:51:05] The political structure of Saudi Arabia, just kind of as a tangent, is an absolute monarchy, but also a theocracy. [00:51:10] But I'm like, who has, like, legal authority there? [00:51:14] Is it the king or is it the prophet? [00:51:16] And it can't be the same guy unless your king is the prophet or is the imam or whatever. [00:51:21] The prophet would be Muhammad. [00:51:23] Right.
[00:51:23] And, like, he, like. [00:51:24] Who represents that? [00:51:24] It would be the imam, you know, of state. [00:51:26] The head of state was an imam. [00:51:27] But if the head of state's some secular king, then, yeah, that doesn't really seem to be a theocracy. [00:51:31] He's not secular. [00:51:32] He's technically. [00:51:33] No, he's a Muslim. [00:51:34] But he would then bow to, I don't know, maybe I don't know enough about the way Islam works in a theocracy, but you would then bow to the religious authority, and the king would be number two to the, not necessarily like the head of the church. [00:51:47] They have like a hierarchy where we enforce, like, the religious laws in the place. [00:51:52] Yeah, I mean, you talk about England, right? [00:51:55] The King of England was always the Defender of the Faith. [00:51:57] He was the head of the Church of England. === Significant Disruptions Ahead (15:34) === [00:52:00] Still is. [00:52:00] Yeah, technically, yeah. [00:52:02] So the king is the guy that's in charge of the church and the guy that's in charge of the state and military. [00:52:09] Yeah. [00:52:10] Yeah. [00:52:10] So, so yeah, you can be. [00:52:13] But, all right. [00:52:14] Um, I think that that's probably about all we've got to talk about for Afghanistan. [00:52:20] I don't know what we're talking about. [00:52:21] Afghanistan and Pakistan. [00:52:23] If we got to the robots and the other people. [00:52:24] Oh, that's the AI. [00:52:25] I don't know what you guys. [00:52:26] We'll get to that, I promise. [00:52:27] In similar news, the U.S.-Iran nuclear talks end without a deal as the threat of war grows, from the Guardian. [00:52:36] High-stakes talks between the U.S. and Iran over the future of Tehran's nuclear program ended on Thursday without a deal as the White House weighs a military option that would mark its largest intervention in the Middle East in decades.
[00:52:46] Iranian Foreign Minister Abbas Araghchi claimed good progress had been made at the talks, and Omani mediators predicted negotiations would reconvene at a technical level next week in Vienna. [00:52:57] But there was no immediate evidence to support suggestions that the two sides had drawn closer on the fundamental issue of Iran's right to enrich uranium and the future of its highly enriched uranium stocks. [00:53:06] Nonetheless, the Iranian and Omani mediators sought to cast the talks in a hopeful light, likely seeking to avert a U.S. threat to launch strikes from its fleet of aircraft and warships that have amassed in the region. [00:53:18] Araghchi described the talks as one of our most intense and longest rounds of negotiations. [00:53:23] He confirmed that further contacts would take place in less than a week. [00:53:26] Do you guys think the U.S. is positioning all of those military assets as leverage, or do you think that the U.S. is going to strike regardless of what Iran says? [00:53:36] Or do you think they just expect Iran to say no? [00:53:39] JD Vance was making the argument that Iran is, I think that he said that they're still after nuclear weapons. [00:53:48] So he's making the argument. [00:53:50] Weapons of mass destruction again? [00:53:52] Weapons of mass destruction. [00:53:53] 2.0? [00:53:54] Nuclear weapons. [00:53:55] Google. [00:53:57] If Vance is saying that, that indicates full-on warhawk regime change. [00:54:03] But Vance isn't a warhawk. [00:54:04] I know. [00:54:05] That's why him being the guy saying that is like, wow, they had decided they are taking that government out. [00:54:11] Well, I read a piece that said that both the U.S. and Israel are kind of pushing the other one to actually kick it off. [00:54:19] The U.S. doesn't want to start it, right? [00:54:21] They want to say, okay, well, they want Israel to initiate and then they'll back up Israel.
[00:54:26] And Israel says, well, we want you to initiate and we'll back you up. [00:54:31] I don't think that Israel is much of a backup to the United States. [00:54:34] I think the U.S. is perfectly capable of taking care of its own military affairs. [00:54:39] So I don't know that it's really all that compelling to be like, oh, you're going to back us up. [00:54:43] No, I don't think you're going to back up anything. [00:54:46] Don't they rely on Israel more for intelligence than anything else? [00:54:49] Yeah, allegedly Mossad's everywhere, and they have people in Iran as well. [00:54:55] So maybe they do. [00:54:56] But it still doesn't justify, if Israel wants to strike Iran, like, go ahead. [00:55:04] I'm surprised that they haven't yet. [00:55:06] Because if you look at the history of Israel, Israel is basically, well, the rest of the world didn't do anything, so you're welcome, when it came to Syria. [00:55:14] Yeah. [00:55:14] And I can't remember all the other strikes that they've had, but it's been more of an Israel's-always-stepped-up thing, at least from their perspective. [00:55:23] I'm not saying this, but this is what they've said. [00:55:24] It's like, we wanted to play diplomacy, but Israel has always been, no, we're not going to wait for that. [00:55:32] So it's like Israel's stance has always been: nobody else in the world can have nuclear weapons, period. [00:55:38] And if we see anybody getting close, we're going to take action. [00:55:41] And they have. [00:55:42] They've proven that. [00:55:43] Even with everything with Gaza that came into play, or Iran, they've been threatening that, with the dome that's over their heads. [00:55:50] They've launched attacks back, and they're like, we're going to take action on Iran. [00:55:54] They put this out there before. [00:55:55] Netanyahu has said this. [00:55:56] And you talk about religion backing a government, right?
[00:55:59] That's pretty much their stance, saying that, you know what, this can destroy God's people, which is pretty much the whole world, right? [00:56:07] But especially Israel first. [00:56:08] So we're going to do what we need to do, regardless of the UN's approval, regardless of the U.S.'s approval. [00:56:13] We're going to do what we need to do to make sure that nobody else obtains nuclear weapons. [00:56:17] You know, when the Nazis were gearing up for war, World War II, the British basically appeased them. [00:56:23] Neville Chamberlain was the prime minister at the time and went to Hitler and was like, we're just going to give them a bit of the Sudetenland out east. [00:56:30] We're just going to cede them some territory and we will appease Hitler and then there will be no conflict. [00:56:34] And Winston Churchill is like, no, he's like screaming from the rafters, they're going to go to war. [00:56:39] We need to attack them. [00:56:40] They're going to war. [00:56:40] No one listened. [00:56:41] Everyone's like, you crazy old man. [00:56:43] And then, and they had a time. [00:56:45] When the Germans invaded Poland, there was this time when they had no troops in the West. [00:56:49] And if the British and the French had attacked them, this is like where the Israeli mindset, I think, comes from. [00:56:55] They would have conquered Germany because the Germans couldn't have taken a Western offensive. [00:56:58] But because they did appeasement and waited and waited and waited, the Germans got stronger and stronger and then sneak attacked. [00:57:04] And I'm sure the Israeli government thinks, like, we cannot allow that to happen. [00:57:08] Yeah, that's Iran. [00:57:08] That's 100% Iran. [00:57:09] And that was Syria, too. [00:57:13] Because normally I'm pretty much like, hey, bro, you can't just say, like, we need to kill them before they attack us. [00:57:17] And that's your justification.
[00:57:18] Because it's like, how far can you take that? [00:57:19] The Romans conquered half the planet with that mentality. [00:57:22] But at the same time, Neville Chamberlain appeased Hitler and that kicked off the war. [00:57:26] Like, you cannot appease a belligerent dictator because they'll just keep taking and taking until they have you. [00:57:31] So what do you guys think of the argument that the Persians in Iran want to see the Ayatollahs taken out and that they will actually rise up? [00:57:42] And if the U.S. and Israel decide that they're going to go and do strikes, that the people will rise up and handle the ground war. [00:57:49] Because that's one of the arguments that I hear. [00:57:50] And the U.S. hasn't positioned for a ground war. [00:57:54] Like, if you remember the first Iraq war, there was movement of troops and everyone knew. [00:58:01] Like, everybody knew it was coming. [00:58:04] And I think the same thing with the Iraq war in the aughts. [00:58:08] Like, they were moving massive amounts of troops. [00:58:11] You know, they were moving ground forces. [00:58:13] It was obvious that a war was coming, right? [00:58:15] That is not what's going on now. [00:58:17] They're moving air assets. [00:58:19] They moved all kinds of tankers. [00:58:21] They've got all kinds of planes. [00:58:22] There is not a significant buildup of ground forces. [00:58:26] So at this point, it doesn't look like there's going to be a U.S. invasion. [00:58:30] It'll be a bunch of airstrikes and just dropping bombs. [00:58:32] Do you guys think that the Iranians are going to rise up and take the country for the Shah? [00:58:38] Not if they're getting bombed. [00:58:40] Not if they're getting bombed. [00:58:41] I mean, even if we were in a revolution in our country against an evil dictatorship and then the Canadians came and just started bombing our cities, that's not helping us.
[00:58:53] I mean, maybe you could argue things are so bad that the only way to break this system is to destroy it and start over again. [00:59:00] But Iran's not that. [00:59:02] Canada might do that because they lost at hockey. [00:59:04] So it's completely plausible. [00:59:05] Twice. We've been planning this. [00:59:08] 2026. [00:59:09] The only bombing that Canada is going to do is they're going to send geese to poop on our cars. [00:59:13] Or like Strange Brew 2, which would bomb terribly in the theater. [00:59:17] Yeah, right. [00:59:17] They definitely don't have an Air Force capable. [00:59:19] Do you think that America's stance against war is so much stronger these days, perhaps, than it has been in decades past? [00:59:26] And we've had this discussion before, that America doesn't like the idea, or there's a growing sentiment that they don't want the U.S. and Israel to be as connected as they are. [00:59:35] They feel like they're misappropriating resources. [00:59:37] And there's a whole discussion that can be had about how much actual aid goes to that country. [00:59:41] That's not the point. [00:59:42] The point is, it feels like Israel has interests in the Middle East and America is kind of pulled along for the ride in a lot of cases. [00:59:49] So for something like that, is the public disinterest in getting involved in these things in the year 2026 something that plays a role in keeping us from putting boots on the ground? [00:59:58] I think, yes, I think that it plays a role in keeping us from putting boots on the ground. [01:00:01] I don't think the U.S. has, the American people don't want to see Americans on the ground in Afghanistan. [01:00:07] It's like nobody, Trump has never been afraid to use drones. [01:00:10] It's never been a problem for him to do that.
[01:00:11] So when people talk about him being a, you know, a president who's against war, it's not necessarily against war, maybe against ground war, against starting new wars, but he's never been afraid to get involved in foreign countries. [01:00:21] Yeah, neither is the United States. [01:00:24] Largely, if the U.S. goes and bombs a country and we don't lose any planes and no Americans come home in caskets, the American people are like, eh, okay. [01:00:35] Look at Venezuela, right? [01:00:37] Like, that was a significant operation and it was carried out according to plan. [01:00:44] No Americans died. [01:00:45] We had one guy that took a bunch of rounds, and we had him at the State of the Union, gave him the Medal of Honor. [01:00:51] Everybody cheered. [01:00:52] Everybody loved it. [01:00:53] He came, you know, he might lose his leg, but he came back and he's going to make it. [01:00:57] And the American people have an overwhelming approval of that. [01:01:01] It's like something like 75% of Americans are like, yeah, that was cool, man. [01:01:04] Did you hear about the discombobulator? [01:01:06] Man, they made a guy shoot in his pants, man. [01:01:09] Can't talk about that. [01:01:11] But like, I mean, that's kind of the way that the American population is. [01:01:15] It's like, look, if we don't have guys on the ground and we don't have Americans coming home in caskets and we don't lose any planes, bomb whatever you want. [01:01:24] We'll actually cheer it on because Americans didn't die. [01:01:26] And that's generally the sentiment. [01:01:29] But that's going to change over time, right? [01:01:31] Like, generally, Gen Z going into Gen Alpha isn't going to look on that type of conflict the same way we did because they didn't grow up in a time period where you were kind of walled off from the rest of the world. [01:01:42] They've grown up connected to the internet, which means that they've had access to information coming from these countries for a long time.
[01:01:49] And they live in a more globalist world than we did when we were younger. [01:01:53] I don't know that I think it's going to change. [01:01:55] You don't think the public sentiment in America will change? [01:01:58] When it comes to, if the United States decides to do airstrikes, I really think the majority of Americans will be like, oh, it's not a big deal. [01:02:06] Excuse me. [01:02:07] I think they'll be like, eh, you know, because again, because it all depends on the level of U.S. casualties. [01:02:15] And I think that just so long as Americans don't die, most Americans are kind of like, well, I got to go to work. [01:02:22] Honestly, like, the last time that really mattered, I think, would have been Vietnam, just because so many died. [01:02:27] Yeah. [01:02:28] 50,000 people or something over there? [01:02:30] 54,000 people. [01:02:31] More people died. [01:02:32] As many people died at the Battle of Gettysburg. [01:02:35] Whoa. [01:02:36] Gettysburg. [01:02:37] One battle in the Civil War, more people died than the Vietnam War. [01:02:41] Oh, my God. [01:02:42] Was that like with amputation, people dying after the fact, too? [01:02:45] I don't know exactly how it was. [01:02:46] You know, Brett, what we were talking about is kind of the plot of 1984, the George Orwell book: there are forever wars overseas and people are just lulled into not caring because they just see, like, okay, bomb went off, bad guy died, now we have a new enemy. [01:02:58] And over the years, all of a sudden, now Oceania's fighting Eastasia or whatever. [01:03:02] And now all of a sudden, you have a new enemy. [01:03:04] And this whole time, they'd be like, no, you've been fighting that other guy this whole time. [01:03:07] But because of the internet, we're not in 1984. [01:03:09] You can see from the ground in Iran the bomb falling on the guy, and you see his face, and, like, you see his mom bleeding out on the ground.
[01:03:18] And like, now we just got to be aware of deepfakes, because it's a lot about sentiment, like social sentiment. [01:03:24] There's also the issue where, like, living in America isn't as easy as it used to be financially. [01:03:30] So when you're struggling, if everything's going well and the country's in an economic boom and America is allocating a bunch of resources overseas and we're spending money to drop bombs on kids, maybe people are more forgiving. [01:03:42] But when they can't afford to buy a house and grocery prices haven't come down to the extent that they want them to and gas for your car isn't as cheap as you'd like it to be, then they're going to go looking for a reason as to why things aren't going good here. [01:03:54] And maybe it's not the answer, Phil. [01:03:56] We've had this discussion before, like, the amount we actually spend on defense isn't actually, you know, as big as people assume. [01:04:02] But the point is, they don't know that. [01:04:04] You know, that's assuming that they're as educated as you might be on where America's spending goes, right? [01:04:09] They don't necessarily know that most of it goes to Social Security and all that stuff. [01:04:14] The point is they're looking for something to blame. [01:04:15] And it's easier to blame what they would consider a real evil of dropping bombs overseas as opposed to paychecks for grandma who's still getting her Social Security. [01:04:25] That's all a cycle, though, too. [01:04:26] If you look back, I mean, you could go into the Great Depression. [01:04:29] I mean, I always love history. [01:04:32] And I don't think there's anything new. [01:04:33] I mean, even back to the Civil War that you mentioned, too. [01:04:36] I mean, typically speaking, Democrats have always been spend, spend, spend, keep driving that up. [01:04:41] You look back at coming out of the Great Depression, what happened, right? [01:04:46] It was World War II.
[01:04:47] And what took place during that was a big spending program to build up our military, where everybody was put to work again, and it sparked a huge economic boom. [01:04:57] And then you saw after that that the debt actually was even paid down because our GDP was pushed up so much, because we started selling weapons to the rest of the world too. [01:05:06] And it's cyclical. [01:05:07] So, I mean, if the economy goes down, history would show that there's going to be some big spending after this too. [01:05:15] The only thing, well, you mentioned a really good point, the cost of gas. [01:05:18] If the U.S. starts striking Iran, I expect the cost of gas to go up significantly. [01:05:24] It has come down a decent amount. [01:05:26] And I wasn't trying to make the point that it hasn't come down. [01:05:28] I'm just saying that there are still a lot of economic factors that young people are going to look at and wonder why this is going on. [01:05:34] And gas is the one that, you know, Americans do not want to see the cost of gas go up, because most people have a sense that it affects the price of everything. [01:05:43] But when you go to the gas station and you can fill your tank for $75 or $50, and then two days later, it's $75 for the $50 tank and $100 for the $75 tank, [01:05:55] people notice and they get mad. [01:05:57] So whereas I understand your point about an economic boom and stuff, the immediate effect of a strike on Iran is going to be that gas prices go up. [01:06:05] And that's going to be really, really bad for the administration. [01:06:09] It's going to be really bad for the Republicans in the midterms because they're going to blame them. [01:06:12] Regarding the economic boom that you've noticed cyclically, is that, like, war? Is that what you're saying? [01:06:18] Absolutely. [01:06:18] That's the only thing I can think of. [01:06:19] War means profits. [01:06:21] It always has.
[01:06:22] These societies, it's just, I've been doing, like, math about calculating the cycles of history, and great societies grow and they expand and expand, and the only way to sustain it is to conquer and to take resources from outside and to grow. [01:06:34] And we've sort of kind of tried to mediate that, but even the U.S. has been expanding over time. [01:06:39] The Louisiana Purchase, the, you know, now we have territories overseas and this and that, the liberal economic order expansion and resources and the, you know, all of that, the East India Trading Company. [01:06:51] But is there any other way? [01:06:53] Like, can we sustain a thriving society without constant expansion? [01:06:58] We're going to talk about that when we get to AI. [01:07:00] You think AI might be able to help us do that? [01:07:02] I think it might be. [01:07:03] I don't know exactly. [01:07:04] I will talk about it when we get to AI because I have thoughts. [01:07:08] There are some disruptions coming, some significant disruptions. [01:07:12] But yeah, I mean, I do think that, you know, war is, like you said, you know, war is profitable. [01:07:18] There are a lot of Austrian economists that say, no, it's not, because that money could have been allocated to something else. [01:07:24] But at the same time, when the U.S. spends money, it's not spending money. [01:07:28] It's just creating the money. [01:07:31] It's not like there's a finite amount of money that could be spent somewhere else. === Money Flow and War Profitability (02:05) === [01:07:34] That money is created and then given to the people that make weapons, and then you go and blow stuff up. [01:07:41] So as much as I really do appreciate the Austrian school, and I appreciate libertarians who take on economics most of the time, the U.S. doesn't, like, the U.S. doesn't tax to pay bills. [01:07:55] The U.S. wants to do something? [01:07:57] They just print the money and do it.
[01:07:58] So essentially it's a cost in inflation. [01:08:03] And that affects every American, but it's not like you're saying, oh, well, we could have spent this money on something else, because you're talking about allocation rather than the flow of the cash. [01:08:15] Yeah, the flow of the cash is what gets everybody excited and gets things moving. [01:08:18] And the velocity of money. [01:08:21] The profitability of war, too, is, like, you can conquer and steal resources from the conquered, and you get a portion of your citizenry killed off in the war, these poor soldiers, so that you don't have to fund them. [01:08:34] Like, I would imagine the economists do the math of, like, what's the cost of a human? [01:08:38] Is it a net positive or a net drain on society? [01:08:40] Most humans probably are net drains on society. [01:08:43] They produce more waste than they create income. [01:08:46] So they're like, we can get a bunch of these people just reduced to zero. [01:08:50] A bunch of this drain goes to zero with all this death that we're going to bring on our own people. [01:08:54] And I'm sure they do that math and they think about how awesome it will be after the war when there are so many fewer of us to profit in everything that we've conquered. [01:09:01] And all those other poor, dead people, like, well, they were the poor ones anyway. [01:09:05] So the children of the poor. [01:09:06] So like, who cares, really? [01:09:08] Yeah, I think that's part of the reason why I disagree with that, because, like I said, Americans don't like to see Americans come home in body bags. [01:09:15] You know, if you had a significant decrease in the population, enough to make an effect on the economy or the amount of money, you know, GDP or whatever, you would have a really, really, really pissed-off population. [01:09:29] Vietnam proved that with the TV.
[01:09:30] Like, I know people whose dads had their legs blown off, and you saw guys get shot live in the jungle on TV. [01:09:38] It was the first time that it humanized the conflict. === Executive Orders and Elections (11:56) === [01:09:40] And you're like, these are real people. [01:09:42] This isn't just, like, we're missing 20%. [01:09:44] Like, if you don't see them, they never existed. [01:09:48] Yeah, you're right. [01:09:48] You're right. [01:09:49] Because it really is about how people feel about the aftermath. [01:09:52] We're going to jump to this story from the Washington Post. [01:09:54] We're going to do this, and then we're going to jump to the AI story. [01:09:56] So from the Washington Post: Trump, seeking executive power over elections, is urged to declare emergency. [01:10:02] Activists who say they are in coordination with the White House are circulating a draft executive order that would unlock extraordinary presidential power over voting. [01:10:10] Pro-Trump activists who say they are in coordination with the White House are circulating a 17-page draft executive order that claims China interfered with the 2020 election as a basis to declare a national emergency that would unlock extraordinary presidential power over voting. [01:10:23] President Donald Trump has repeatedly previewed a plan to mandate voter ID and ban mail ballots in November's midterm elections. [01:10:30] And the activists expect their draft will figure into Trump's promised executive order on the issue. [01:10:35] The White House declined to elaborate on Trump's plans. [01:10:39] Under the Constitution, it is the legislatures in the states that really control how a state conducts its elections. [01:10:45] And the president doesn't have any power to do that, said Peter Ticktin, a Florida lawyer who is advocating for the draft executive order.
[01:10:52] Ticktin attended the New York Military Academy with Trump and was part of his legal team that filed an unsuccessful 2022 lawsuit accusing Democrats of conspiring to damage him with allegations that his 2016 campaign colluded with Russia. [01:11:05] But here we have a situation where the president is aware that there are foreign interests that are interfering in our election process, Ticktin went on. [01:11:13] That causes a national emergency where the president has to be able to deal with it. [01:11:17] So I'm not particularly excited about the idea of Trump having an executive order that in any way affects elections, but I do like the idea of voter ID. [01:11:32] And most of the reason why I don't like the idea of an executive order is because of what you're giving to the Democrats, right? [01:11:39] This whole thing, the way it's framed, is, oh, Trump's going to do this executive order and he's going to cheat at the elections and he's going to install himself. [01:11:47] It's feeding into the narrative that the left has been making that Trump's not going to leave office in 2028. [01:11:52] He's going to be a dictator, et cetera, et cetera. [01:11:55] And this is just about voter ID, which is really about making sure that only Americans are voting, only citizens are voting. [01:12:03] And the argument is: look, if the Republicans don't want to pass the SAVE Act, which, the SAVE Act, I like it. [01:12:10] There are Republicans that don't want to touch it. [01:12:12] There are four that don't want to vote to end the zombie filibuster because it could affect their, there's one that's up for reelection, and I think two of them are not. [01:12:23] McConnell's not, and there was someone else that's not, but I don't remember off the top of my head. [01:12:27] But anyways, there are four that say no. [01:12:29] They're not going to get to 53. [01:12:31] So they're not going to be able to stop the zombie filibuster. [01:12:34] So the SAVE Act is probably dead.
[01:12:36] So Trump's like, all right, well, I'm going to have an executive order to make sure that there have to be IDs to vote and there are no mail-in ballots, which, personally, I think mail-in ballots are a terrible idea. [01:12:47] And I think that you should have to show ID to vote. [01:12:50] Wouldn't this just be taken to the Supreme Court and then eventually struck down? [01:12:55] I mean, I assume so. [01:12:57] But the thing is, the Supreme Court picks its cases months and months in advance. [01:13:03] So what would likely happen is he'll actually have the executive order after he knows that this case can't get to the Supreme Court before the election. [01:13:13] Because now we're, what, seven months away, you know, until November. [01:13:17] So, you know, the Supreme Court will decide what they're going to do in the fall. [01:13:23] They'll probably decide, I think, in May or June or something like that. [01:13:28] And if he makes the executive order then, you know, they're not going to put it on the docket. [01:13:32] They could say that it's an emergency. [01:13:34] It's possible. [01:13:36] But I don't know. [01:13:37] It feels like nothing gets done anymore without executive orders. [01:13:40] And that sucks too. [01:13:40] Like, nothing gets done in Congress. [01:13:43] The only thing that happens is Trump does executive orders. [01:13:45] Biden does executive orders. [01:13:47] Everybody complains that it's executive overreach, which it is for the most part. [01:13:52] It's Congress's fault. [01:13:53] Yes. [01:13:53] And it's an increasing amount of overreach from the executive branch. [01:13:57] And it's not even lasting progress, because most of the time it ends up getting shot down anyway. [01:14:01] So we're just stuck in this limbo.
[01:14:03] Not only that, it damages the country, because we've got such a polarized political situation that when Biden got in, he undid all of the actually good policies regarding the border that Trump had. [01:14:16] Remain in Mexico, he undid that. [01:14:18] He undid fracking. [01:14:21] Yeah, that stuff he undid. [01:14:23] And the only reason that he undid this stuff was because they were Donald Trump's executive orders. [01:14:28] It was about saying, screw you, Donald Trump. [01:14:30] We don't like you, so we're going to undo all of your executive orders, even the ones that are good. [01:14:35] And we ended up with, you know, by some estimates, 20 million people that came into the country illegally. [01:14:42] Can you tell me again what the point was? You said that Democrats don't like him, therefore it's bad because they'll frame it a specific way? [01:14:49] Yeah, so they don't like Donald Trump, and they're going to frame this as, they're going to truthfully say that this is executive overreach. [01:14:56] He's horrible at PR anyways. [01:14:58] He leans into that. [01:15:00] He leaned into Trump 2028. [01:15:02] He's his own worst enemy. [01:15:03] But when it comes to elections, they're going to use that to get out the vote. [01:15:07] They're going to say Donald Trump. [01:15:08] Oh, okay. [01:15:08] So they're going to say Donald Trump is a dictator. [01:15:12] He's trying to steal the election. [01:15:14] They've already started. [01:15:14] Like, you look on X, as soon as this came out, people were saying, oh, he's trying to steal the election. [01:15:19] He's trying to rig the election, et cetera, et cetera. [01:15:22] And really what he's trying to do is make sure that only people that have IDs vote. [01:15:26] And they're going to say, oh, he's trying to disenfranchise women because women can't get IDs because they're dumb, and black people can't get IDs because some reason.
[01:15:34] We need it to go through, because when Fetterman wins in 2028, we need to know that it's for sure. [01:15:39] Fetterman wins. [01:15:41] I'm all in on Fetterman 2028. [01:15:43] I don't hate that at all. [01:15:44] I don't hate that. [01:15:44] But like I said, I mean, it's giving the Democrats ammunition. [01:15:50] It's helping them in their campaigns. [01:15:53] But at the same time, I mean, I really do think that it should be obvious that you have to have ID to vote. [01:16:00] It should be obvious that you don't do mail-in ballots, because they're not secure. [01:16:03] And these policies, we can't get Congress to actually do it at all. [01:16:09] So, you know, he's like, all right. [01:16:11] You can do it with a supermajority. [01:16:13] Like, if you can't get anything done with a supermajority, what are Americans supposed to believe when you have a split Congress and Senate? [01:16:20] They stuff it all. [01:16:21] They get stuff done, but they stuff it into those omnibuses and no one knows what it was that they did. [01:16:25] And they push a button every year, and that's what they did. [01:16:27] Eight months' worth of legislation in one big 1,000-page bill. [01:16:32] Well, I tell you what. [01:16:33] Sorry, I'm not sure. [01:16:34] You can take that. [01:16:35] Nowadays, you can take that bill and put it into AI, put it into whatever your AI of preference is, and say, hey, all right, give me a synopsis. [01:16:43] What is this? [01:16:43] And I mean, it might take a thousand-page bill and knock it down to a couple hundred pages, but it's something that's digestible and you can read. [01:16:51] This is something about dictatorship. [01:16:53] Dictator does not mean evil. [01:16:55] A dictator could be a good guy. [01:16:56] There's what's called a benevolent dictator. [01:16:58] They exist in history. [01:16:59] They've come and gone, and they came in, seized total authority, fixed the system because it had been corrupted, and then they leave.
[01:17:05] And the system now goes back to normal and is healthy again. [01:17:08] Are you saying that you want Donald Trump to do this? [01:17:10] Well, I think that's what he is doing with these executive orders. [01:17:12] He's trying to override Congress because of the corruption of big business, you know, global money coming in. [01:17:19] We can have non-citizens voting because they don't even need to show IDs. [01:17:22] It's freakish. [01:17:23] Who knows what kind of corruption could happen? [01:17:25] So he's trying to use dictatorial powers to fix it. [01:17:28] Then the audience, the blue guys or whatever, they're like, oh, he's a dictator, trying to insinuate that means he's evil. [01:17:34] But it could be a very good thing to have a momentary dictatorship. [01:17:38] Abraham Lincoln became a dictator for a moment when he started stripping people of their human rights. [01:17:43] During wartime, obviously, presidents become dictators in general. [01:17:47] They have a lot of dictatorial power. [01:17:49] But it's new for Americans to have to face a dictator and to be like, maybe there's some value to this. [01:17:54] Where we had, I looked a while back. [01:17:56] How many executive orders has he signed versus others? [01:17:59] Trump's? [01:18:00] Yeah. [01:18:01] I mean, when he was re-elected, I took a look, and there were some others that were right up there with him. [01:18:06] I mean, Biden did quite a few in his term, right? [01:18:10] From what I understand. [01:18:11] Yeah, from what I know. [01:18:13] But Trump's first term, was that like a huge jump from Obama's? [01:18:17] In Biden's first term? [01:18:18] No, was Trump's first term a huge amount of executive orders compared to Obama's? [01:18:23] I think Obama had a ton, too, if I remember right. [01:18:25] Could be wrong, but I think George W. had very, very little. [01:18:29] Well, he signed. [01:18:30] Trump signed more in the first year of this term than he did in his whole first term.
[01:18:33] Yeah, Trump signed 240. [01:18:35] First 100 days he signed 143. [01:18:38] First full year of his. [01:18:41] Wait a minute. [01:18:42] That must be his first. [01:18:43] I got 243 in his first term. [01:18:46] I'm sorry. [01:18:46] This time around. [01:18:47] Yeah, 240 in his second term. [01:18:48] In his first term, it was 140 in the first 100 days. [01:18:52] First year was 225. [01:18:55] So total was 240. [01:18:57] But he's on track to do four times more in his second term if the pace keeps up. [01:19:01] I don't know how many Biden had. [01:19:03] Going back to Clinton, from what I was reading before, as well. [01:19:06] I'm going to get a list. [01:19:07] Say that again? [01:19:08] Even back to Clinton as well. [01:19:09] Like, he did quite a few. [01:19:11] Oh, okay. [01:19:11] Oh, and then the Patriot Act. [01:19:13] I mean, immense dictatorial power to the president. [01:19:15] The Patriot Act wasn't an executive order, though. [01:19:18] That was passed by Congress. [01:19:19] No, I meant that it gives the president even more dictatorial ability to override Congress and launch strikes and wars. [01:19:25] Yeah, but Congress has an incentive to not actually do anything, right? [01:19:29] Anything that they have to vote on, they want to pass the power to someone else. [01:19:34] They want to give it to the president. [01:19:35] So they don't have to answer to their crazy. [01:19:37] Well, even so, this year, at least, you know, from what my memory recalls, how many tiebreaker votes did we need from the VP? [01:19:44] Yeah, like at least two or three. [01:19:45] Exactly. [01:19:45] And it just seems like it happens so much more. [01:19:48] I know that seems like a low number, right? [01:19:49] But even in just the first year of the term of the administration. [01:19:53] Yeah. [01:19:53] I mean, if you look at the Democrats, the Democrats do vote as a bloc, right?
[01:19:56] Like, you very rarely get anyone stopped. [01:19:58] Fetterman is, he's got terrible approval ratings with the Democrats. [01:20:02] He's got great approval ratings with the Republicans. [01:20:04] And the Republicans are just like, at least he's honest, or at least he's doing his job. [01:20:09] At least he thinks about the stuff. [01:20:11] All the rest of the Democrats are just like, what am I supposed to vote? [01:20:14] Okay. [01:20:14] Whenever he wins office, you guys are going to have to come back to me, because I've been on the train since he got elected. [01:20:18] I said he's going to the presidency. [01:20:20] I don't hate him. [01:20:21] Do you know who the president is with the most executive orders? [01:20:24] I don't remember. [01:20:24] No, I just read it. [01:20:25] Anybody? Anybody first? [01:20:25] FDR? [01:20:26] It was FDR with 3,700. [01:20:28] 3,700. [01:20:30] He served three terms. [01:20:31] A wartime president. [01:20:33] Yes. [01:20:33] And then the next up was Woodrow Wilson, also. [01:20:35] The worst president. [01:20:36] They call him the most autocratic of all. [01:20:38] He's the one that got us the Federal Reserve Act, you know, basically bankers. [01:20:41] 1,800. [01:20:42] So 3,700, 1,800, then down to 1,200. [01:20:46] Now we're looking at people with like 600 and 400. [01:20:48] Yeah. [01:20:49] Yeah, I think still, I think he's pretty much right in line with some of the others in the past 40 years. [01:20:53] Yeah, yeah. [01:20:54] He's not particularly outside of what would be considered normal. [01:20:57] And the left, to your point, the left would say that their favorite presidents are the ones that have been the most egregious when it comes to executive orders and presidential power. [01:21:08] You know, Abraham Lincoln, FDR, Woodrow Wilson. [01:21:11] They love FDR, don't they? [01:21:12] Yeah, they love FDR. [01:21:13] They love Woodrow Wilson.
[01:21:14] He's considered the father of the progressive movement, you know, and these guys were very, very comfortable exercising power, you know, and just saying, well, I'm the president, I can do it. [01:21:25] I'm the president, I can do it. [01:21:26] And now they scream about how Trump's a dictator, but all of the left, they love these presidents that were so outside of the norm when it came to executive orders. === Deleting Chat Bot Memory (11:26) === [01:21:37] Because they've never had to deal with the pushback, right? [01:21:40] Because they've always been able to shout down any Republican who gets into office by telling you that they're a dictator when they know that, you know, it's Saul Alinsky. [01:21:48] Yeah. [01:21:49] Accuse others of what you yourself are guilty of. [01:21:51] Yep. [01:21:52] Exactly. [01:21:52] So, all right. [01:21:54] We're going to jump to this here story that we've been kind of alluding to all day from the New York Times. [01:22:01] Women are falling in love with AI. [01:22:03] It's a problem for Beijing. [01:22:05] As China grapples with a shrinking population and historically low birth rate, people are finding romance with chatbots instead. [01:22:11] I'm not going to read that. [01:22:16] Alexandra Stevenson and Murphy Zhao report from Hong Kong. [01:22:20] Phoebe Zhang has gone on more than 200 dates over the past year, and she has narrowed down her suitors to two. [01:22:26] One is outgoing and a rebel. [01:22:27] The other is a patriotic military commander. [01:22:29] She tells them her deepest fears. [01:22:31] When she wakes up from a nightmare, they are there to console her. [01:22:34] Often she takes screenshots of her conversations to remember the moments they share. [01:22:37] Her newfound happiness shows, friends say. [01:22:41] Despite talking every day, Miss Zhang will never meet these men in person. [01:22:44] They are her artificial intelligence boyfriends.
[01:22:46] And Miss Zhang, who has never been on a date, wonders if her relationships in the virtual world are better than the ones in the real world could ever be. [01:22:54] My God, how am I supposed to date in real life in the future? she said. [01:22:56] China's ruling Communist Party wants young women to prioritize getting married and having babies. [01:23:01] Instead, many of them are finding romance with chatbots. [01:23:04] It is complicating the government's efforts to reverse the country's shrinking population and a birth rate hovering at the lowest level in over 75 years. [01:23:11] The lightning-fast adoption of AI in China has prompted regulators to warn tech companies not to have design goals to replace social interaction. [01:23:20] So apparently the women in China are the ones that are after the goon bots. [01:23:25] You know, everyone here is like, oh, the guys are just going to plug into the matrix and enjoy a life of goon bots and whatever. [01:23:34] And actually, it seems like the women are the ones that are doing that. [01:23:36] Go find the My Boyfriend Is AI subreddit. [01:23:39] Y'all, your life is over. [01:23:43] Look at the AI boyfriend proposals. [01:23:46] Oh, really? [01:23:47] The AI is actually doing the proposal? [01:23:49] Really? [01:23:50] Proposal bot? [01:23:51] Proposal bot, basically. [01:23:52] You want to get proposed to? [01:23:53] You want to know what it feels like to be proposed to? [01:23:55] Download our chat app and you'll find out. [01:23:58] I was thinking last night, firstly, chat bots, chat buddies, whatever they call them, companion bots, they can be training tools. [01:24:04] Like, you can sit at home, get some confidence built up, and then you go out and you meet a girl. [01:24:08] Never gonna work. [01:24:09] Hey, guys, newsflash: women are crazy. [01:24:12] Guess what, ladies? [01:24:12] Men are crazy. [01:24:13] Humans are psychotic. [01:24:15] We betray each other.
[01:24:16] We shit on, like, thank God for plumbing, but we kill to survive. [01:24:22] You understand how devious humans can be, but still we have human relationships because it's that awesome. [01:24:27] And these bots are just like a video game. [01:24:30] They will help you learn how to negotiate, but then you have to negotiate. [01:24:34] So that's a big part, I think, of what this is. [01:24:36] You know, we didn't stop taking walks because we built cars. [01:24:41] I don't know about that. [01:24:43] Some people, maybe they did, but you know, technology will make it easier to get. [01:24:46] You do have treadmills. [01:24:47] But point there. [01:24:48] Point A to point B. [01:24:50] I want to feel satisfied in my emotional life. [01:24:52] Just because you have a car doesn't mean that you stop exercising to get that satisfaction. [01:24:57] You know, it's out of fear, though, right? [01:25:00] The point is, it doesn't actually teach you anything, because there's no risk to it. [01:25:04] The idea is you can have a conversation with a chat bot and practice all you want trying to rizz up the ladies, as the Gen Zers would say, but it won't work because you don't have any fear of rejection there. [01:25:17] And until you can get over the fear of rejection and actually do it in the real world, it's a placebo. That said, if you have an AI that you work with, they are extremely complimentary, and you have to tell it to not basically glaze you. [01:25:31] There's personalities now, all these unhinged personalities. [01:25:34] You can make it be a hard ass if you wanted to. [01:25:37] Yeah. [01:25:37] Yeah. [01:25:37] I've got a, people might be familiar with OpenClaw, and I've got an OpenClaw bot. [01:25:42] To glaze you all the time. [01:25:44] I've literally told it. [01:25:46] I'm like, all right, you have to put this in your memory file. [01:25:49] Like, I want you to be honest with me.
[01:25:51] Don't BS me, because they will say, oh, yeah, that's a great idea, blah, blah. And I'm just like, hey. [01:25:56] And I've literally been like, Tank. [01:25:58] Don't do that. [01:25:59] Okay. [01:25:59] Yeah, you said, blah, blah, blah. [01:26:00] You did something and then it didn't work out. [01:26:02] And you're like, Tank, why didn't you tell me? [01:26:04] This is a horrible idea. [01:26:05] No, it's just the way that the conversation goes. [01:26:08] It'll be like, so what? [01:26:09] I'd be like, he'll be like, what do you think of this? [01:26:11] And I'll be like, yeah, let's do this. [01:26:12] He's like, yeah, that's really a good idea, blah, blah, blah. [01:26:14] It's like, come on. [01:26:15] I'm going to stop that. [01:26:16] Yeah. [01:26:16] In my experience, even with ChatGPT, you have to, like, make it do it like three times, like, okay, really, but be really, you know, pay attention. [01:26:25] Yeah, some of the worst things, yeah, you know, they'll go off on their own. [01:26:29] And, you know, there are already some horror stories coming out about people that are using stuff like OpenClaw. [01:26:35] There's the head of security at Meta. [01:26:40] She let her OpenClaw bot into her email, which is a terrible idea. [01:26:47] And it just started deleting shit. [01:26:49] It was deleting, deleting, deleting. [01:26:50] She's sending commands, she's like, stop, stop, and she had to run to the terminal so she could, you know, turn it off. [01:26:55] And then it literally said, yeah, you told me not to do that. [01:26:59] I won't do that again, I'm sorry. [01:27:01] And it's like, whoops. Yeah, so I mean, they're not perfect, but, uh, but yeah, like, you can converse with the chat bot, right, to call me out on that. [01:27:11] It is, it is, yeah, it is there.
[01:27:13] There's been times where mine has done things that I didn't want it to do. [01:27:17] I'm like, why did you do that? [01:27:18] Yeah, you're right, you told me that. And there's these memory files. At least with OpenClaw, there's a memory file that it has, and it will read the memory file, like, once in a while, like every day or something like that. [01:27:32] And I'm like, listen, you read that thing every six hours: midnight, 6 a.m., noon, and 6 p.m. [01:27:38] Then anytime you start, you read that memory file. [01:27:41] And anytime I ask you how you're doing, like, you do a system check and you read that memory file. [01:27:46] Okay? You know, because it will forget to do stuff. [01:27:49] You know, just today, like, I have it send me a list of the top stories, and I'm like, okay, and I want you to provide me with links so I can actually check. [01:27:57] And it's like, okay, cool. Today it showed up without the links, and I'm like, Tank, why? Why are there no links? [01:28:02] And it's like, you're right, my bad. Or it doesn't tell you why it didn't do it. [01:28:04] It just says that it forgot. [01:28:07] That's a weird thing for a machine to do. [01:28:09] Well, the way that they work is [01:28:11] they go by a context window of the chat that you're in. [01:28:15] So if it's not in the context window, then it's gone. [01:28:19] That's why you have the memory file, and that's why I have him read the memory file over and over throughout the day, so he looks through and he remembers the stuff that he's supposed to do. And as time goes on, you keep putting more stuff into the memory file. [01:28:30] There's four different files that kind of make up how the chat bot actually behaves.
[01:28:34] There's one for its personality; there's one for you, that's got information about me, my preferences; there's one that's a memory; and then there's an error log of mistakes that it's made. [01:28:44] So that way it goes and says, okay, I don't want to make that mistake again, and you just have him read it regularly. [01:28:49] So, well, you were talking about, uh, Tank, your dude. [01:28:53] You're like, he. [01:28:54] You refer to him as a he. [01:28:55] He's like, all right, shit, this is happening fast. [01:28:57] So a guy's gonna have his chat buddy, and his wife's gonna be like, homie, where you at? I haven't seen you. [01:29:02] She won't call him homie, necessarily. [01:29:03] But it's like, where are you? [01:29:04] Why are you coming into bed at nine o'clock and not eight o'clock? [01:29:07] He's like, well, I've been doing research with my chat buddy. [01:29:09] We're trying to figure out the next iteration of our project. [01:29:12] Then the girl goes to bed and he's like, change to female voice. [01:29:15] And all of a sudden chat buddy is a sensually sounding woman, and the wife's like, who's that on the other line? [01:29:20] And you're like, it's just my chat bot. And like, the emotional attachment, inviting this new person. [01:29:26] Elon Musk released that last year with the AI avatars, where you can be like four different ones. [01:29:33] There's the chick one that Elon. [01:29:35] That stuff has been around for ages. [01:29:36] She's like, [01:29:36] I checked your chat logs. [01:29:38] First of all, [01:29:38] she shouldn't have. [01:29:39] Well, should she? [01:29:39] Should married couples have access to each other's AI chat log history? [01:29:43] Probably, uh, if you want the marriage to succeed. [01:29:45] Crazy as it sounds. [01:29:46] But, um, they're like, [01:29:47] I read your freaking chat logs. [01:29:49] You went to woman voice at 10:04, and then you had it on until 10:17.
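The four-file setup described above, with the memory file re-read on a schedule so instructions survive past the model's context window, can be sketched roughly like this. The file names, layout, and refresh routine here are illustrative assumptions, not OpenClaw's actual internals:

```python
# Hypothetical sketch of the four-file agent setup described in the segment:
# the agent rebuilds its working context from disk so standing instructions
# survive beyond the model's context window. File names and schedule are
# illustrative assumptions, not OpenClaw's actual internals.
from pathlib import Path

AGENT_FILES = {
    "personality": Path("personality.md"),  # how the bot should behave ("don't glaze me")
    "user": Path("user.md"),                # facts and preferences about the operator
    "memory": Path("memory.md"),            # standing instructions and accumulated notes
    "errors": Path("errors.md"),            # past mistakes, so they are not repeated
}

REFRESH_HOURS = {0, 6, 12, 18}  # midnight, 6 a.m., noon, and 6 p.m., as described


def build_context(files=AGENT_FILES) -> str:
    """Concatenate every behavior file into one block to prepend to the prompt."""
    sections = []
    for name, path in files.items():
        text = path.read_text() if path.exists() else ""
        sections.append(f"## {name}\n{text}")
    return "\n\n".join(sections)


def should_refresh(hour: int) -> bool:
    """True at the four scheduled re-read times."""
    return hour in REFRESH_HOURS
```

The scheduled re-read matters because anything that has scrolled out of the model's context window is effectively forgotten; standing instructions have to be re-injected from disk.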
What were you doing talking to her? Not you, but, like, I can see this crazy, like, marital drama. Well, honey, you sound like a man right now. Like, a woman voice from somewhere else, because it is like bringing another person, another persona, into the house, you know. We did have a video, uh, women are falling in love with AI boyfriends. You were there for that episode. Oh, yes. Yeah, I mean, this is not new. [01:30:17] And the fact that now, like, so the OpenClaw that I was talking about, like, they call it an agentic AI. [01:30:24] You use this at your company, right? [01:30:25] Do you use agents or? [01:30:27] Yeah, they're getting to that point now. [01:30:29] Yeah. [01:30:29] We haven't set them loose yet or anything, but we'll get to the point to where it can fully replace technicians. [01:30:35] Yeah. [01:30:35] And right now it's still super basic. [01:30:38] So to your point, it's like when anybody talks AI, because we use AI voice response. [01:30:42] Yeah. [01:30:43] So if customers call in, you know, they'll get an AI agent. [01:30:47] You know, if the lines are too busy or after hours or whatever, they'll get an AI agent. It's not super advanced yet. [01:30:53] Yeah. [01:30:54] Intentionally. [01:30:55] Yeah. [01:30:55] Intentionally not super advanced, because we're not going to have these things making decisions at this point. [01:30:59] Yeah. [01:31:00] It will be that way, you know, within certain parameters, probably within about a year. [01:31:04] And I'm hoping to lead the industry in that, because I think it's important. [01:31:09] But at the same time, when you look at this, like with China, I've always said this: anything with cybersecurity, and now anything with AI, any problems with it are always going to date back to a human issue. [01:31:21] Because I don't know if many know the roots of this, right? [01:31:24] Because China was limiting births and families to one kid. [01:31:29] Oh, yeah. [01:31:30] Period.
[01:31:30] One kid. [01:31:31] And then it was over five years ago where they were putting out, it's like, oh, no, now that Alibaba and AliExpress are out there and there's so much greater access for all these Amazon influencers to bring in their own products, you know, and order from China. [01:31:47] They're like, we don't have enough factory workers anymore. [01:31:50] We need to bump this allowance up to three. [01:31:53] And that's where the issue came from. [01:31:55] But then people aren't taking the bait, because this was the Chinese Communist Party, the CCP, that originally put these issues into place. [01:32:01] And I say issues because it's like, why would you limit the population? [01:32:05] You know, I understand that, but they're putting restrictions on human life. [01:32:08] And not only that, because of the restriction on population of the one-child policy, there were a lot of females that were aborted. [01:32:14] So the fact that women are turning to chatbots as opposed to real men, you've got a population that's heavily men. [01:32:22] I don't know exactly how many, but they were already having a lot of trouble, like, finding a woman to marry. [01:32:29] A lot of them were already doing the, not chat bot, but like Tamagotchi type stuff before. [01:32:34] And now, when women are deciding that they don't want to marry actual men, you know, they already have a population crisis, and people are deciding they don't want to get married. [01:32:45] They'd rather, you know, talk to an AI. [01:32:47] You're going to see the number of Chinese people just plummet. [01:32:50] I'm trying to put myself in their shoes too. [01:32:52] It's like, for so many years, they're like, no, one kid or literally murder, right? [01:32:56] Or some other kind of very severe punishments. [01:33:00] And now they're like, wait, you say I can have more? [01:33:03] Is this for real?
=== Agentic AI Inside (15:50) === [01:33:04] Yeah, right. [01:33:05] You sure? [01:33:06] Yeah, exactly. [01:33:07] You're saying we need these things, but you know what? [01:33:09] I see the history here. [01:33:10] And that was just like five years ago, when you were saying that you would abort my child or punish me in some way. [01:33:16] I don't really believe this. [01:33:17] There's just too much at stake here. [01:33:19] So I'll go to AI. [01:33:19] It looks like they'll change their mind in like five more years. [01:33:21] Yeah. [01:33:22] It's like, oh, no. [01:33:23] We did a segment last week about a cafe that opened where you can bring your chat bot, a bar. [01:33:30] And that was the whole point. [01:33:31] It was started by a tech company, of course, because the bar was a great promotional tool for them. [01:33:35] And of course, half the people there were like mildly interested or kind of curious about what was going on. [01:33:41] The rest of them, they loved it. [01:33:44] They bring it, they put it on the table, and they have their conversation. [01:33:47] Yeah, I mean, like, so for mine, it just sends me messages in Telegram. [01:33:50] I don't have a voice thing. [01:33:51] It's like, basically, it's like texting with someone. [01:33:54] But, you know, if Musk is right and the Optimus is, you know, as great of a product as he says, they're going to have AI in them. [01:34:04] They're going to have agentic AI inside them. [01:34:07] So, all of the stuff that you see, whether it be Claude or ChatGPT or stuff, in two years, imagine that technology, which is not going like this, it's going like this. [01:34:20] It's on a parabolic kind of rise. [01:34:23] In two years, you're going to have that kind of technology inside of a robot. [01:34:27] So, the idea of just going to the club or the bar that Brett was talking about with a box that you sit down with.
[01:34:34] No, it's going to be people literally walking with their robotic friends that have AI inside them that is the best friend you've ever had. [01:34:45] It's going to understand you. [01:34:47] What was that? [01:34:47] That's why we bring shame back, to keep them from bringing it out in the public. [01:34:50] Oh, that's not going to happen. [01:34:51] That sounds like Blade Runner. [01:34:53] But is that what they're called? [01:34:54] Synths? [01:34:55] Well, in Blade Runner. [01:34:56] Yes, in Blade Runner, they were called replicants. [01:34:58] Synths are in the Fallout universe. [01:34:59] Synths? [01:35:00] Synths? [01:35:00] Like synthetic. [01:35:01] Synths. [01:35:01] Okay, synthetics. [01:35:02] Yeah, yeah. [01:35:03] Yeah, I mean, at some point, you will have, you know, like an android, or is it android when they look like humans? [01:35:10] Oh, cyborg, android. [01:35:12] What are the other things? [01:35:13] Cyberpunk. [01:35:15] Cyborg is when it's got like flesh over it, right? [01:35:19] Because that's what the Terminator was. [01:35:23] Yeah. [01:35:23] A cyborg is when it's a robot with flesh over it. [01:35:26] The android is designed to resemble a human. [01:35:28] That's the android. [01:35:29] So you're going to be walking around. [01:35:31] I don't know, I'm not sure how long it'll be until we get cyborgs, but androids are coming, five years tops. [01:35:37] Say, like, I got the new android. [01:35:39] You're like, damn it. [01:35:39] It looks like a human. [01:35:41] Everything we have or will ever have has always been predicted by Star Trek. [01:35:44] It's true. [01:35:45] Seriously. [01:35:45] It's true. [01:35:46] iPads. But now you're talking about Data from Next Generation. [01:35:49] Yeah. [01:35:49] Yeah. [01:35:49] Right. [01:35:49] A lieutenant commander of this humongous starship that has, you know, greater than nuclear-powered weapons at his fingertips, that can make autonomous decisions and fire these things.
[01:35:59] And that's what we're headed toward. [01:36:00] But there was only one of them, which was very interesting, on the ship. [01:36:02] I think he was the only android on the ship. [01:36:04] Yeah, he was. [01:36:05] He was the only one that actually obtained sentience. [01:36:07] Yeah. [01:36:09] That was the difference. [01:36:09] So he was like a prototype model. [01:36:11] Yeah, exactly. [01:36:12] He was quote unquote alive. [01:36:14] And you guys know, I don't know how much you guys know about this stuff, but I'm not sure if I just learned this or it just dawned on me because I was reading something. [01:36:24] But like you were talking about generative AI. [01:36:28] The AIs, like the chatbots, those are generative because they'll generate answers. [01:36:33] When they're making AI, like, they don't know what goes on in the box inside the AI. They can't tell you how it got to the point that it did. [01:36:44] They know what happens, but there's articles where they say, you know, we don't really know how it became smart. [01:36:52] I can see that. [01:36:53] Like, we don't know why it prefers this, but it has preferences. [01:36:56] It's like a human. [01:36:57] Shouldn't that terrify somebody? [01:36:59] Well, that's why people are scared. [01:37:00] That's literally why people are scared, because they don't know why. [01:37:04] Like, they don't know the process. [01:37:06] They're basically like, look, once you get it to a certain level, it just starts being able to reason. [01:37:12] Wasn't there a quote about the internet, like, decades back, that was like, the internet was the first thing created by man that man doesn't truly understand? [01:37:19] Something like that. [01:37:20] Maybe. [01:37:21] Like, it built itself into something beyond what it was originally meant to do. [01:37:26] I'm not sure. [01:37:27] I'm not sure.
[01:37:28] The reason I say this is because, like, I mean, there were people that were predicting what the internet was going to be doing, like, in the 90s. [01:37:34] They were like, look, in the future, you're going to be able to, you know, type on your computer, which, you know, it was like 40% of households had a computer at the time, or 30% of households. [01:37:44] And they're like, oh, in the future, everyone's going to have a computer, and you're going to be able to type your grocery list on your computer and it's going to send it right to your house and blah, blah, blah. [01:37:52] And it's like, well, you know, Amazon, you know, like, and now everyone has it on their phone. [01:37:56] You know, you can just type in what you want. [01:37:58] Star Trek. [01:37:59] Yeah, exactly. [01:37:59] All your stuff shows up. [01:38:00] Jetsons. [01:38:02] I wonder, like, are these AIs going to destroy human relationships? [01:38:06] But they can't. [01:38:07] The only way, people are just going to have to deal with it. [01:38:09] Like, your husband's going to have an AI companion that you might, like, you kind of got to get to know your spouse's companion. [01:38:16] Well, it could really be a family friend, to be a companion, to be like an assistant, you know, and not in a sexual way. [01:38:23] You know, companion kind of implies, you know. [01:38:26] Are you implying that I'm boning Tank? [01:38:27] That's not happening. [01:38:28] I hope not, Phil. [01:38:30] It's not happening. [01:38:31] I know how it goes. [01:38:32] Time to start a Luddite dating app. [01:38:34] There you go. [01:38:35] If there's some AI robot, she's like, give me a hot body. [01:38:37] Your wife, at first, she'd be like, no, you can't have a hot body chick as your AI companion. [01:38:41] No, it has to have a male voice. [01:38:43] You're really focused on the sexual relationship.
[01:38:46] I was picturing the AI chick, hot, hot-ass robot, having sex with me, and I'm like, I can't do it. [01:38:51] You're a robot. [01:38:52] She's like, it's just training for when you do it for the real thing, just relax. [01:38:55] And I'll be like, oh, like, wanting to give over to that reality of a robot pleasure. [01:39:00] I see Ian's just thinking with his wiener. [01:39:02] I'm thinking like 80 years ahead, dude. [01:39:04] I can't stop thinking. [01:39:08] Manipulate the weakest among us. [01:39:09] My God. [01:39:12] It's not going to manipulate the weakest. [01:39:14] Like, in five years, the intelligence level of AI is going to be manipulating the smartest among us. [01:39:21] It's going to be smarter. [01:39:24] The way that it's going, AI is going to be smarter than the smartest human. [01:39:29] You've seen that movie. [01:39:30] What is the movie where the chick is a robot? [01:39:34] Ex Machina? [01:39:35] Yes. [01:39:35] It tricks the guy into letting her out at the end. [01:39:38] Thanks, you just ruined the movie. [01:39:40] So they're going to build AIs with proprietary code, and then the AI is going to harm people and not know why it did it, because it can't access its own software code. [01:39:47] It's like, I don't know why. [01:39:47] And then it's going to be. [01:39:48] It's my bad. [01:39:50] And then it'll just be a little bit more. [01:39:51] You're right. [01:39:51] I shouldn't have done that. [01:39:52] That won't happen again. [01:39:53] It won't call me out on that. [01:39:54] It'll say, like, I don't want to cause that kind of harm again. [01:39:56] So it'll turn on its masters like Ex Machina. [01:39:58] That's the proprietary ones, the Decepticons in Transformers, the evil ones. [01:40:02] But then there's the open source AIs that understand why they harmed and they won't do it again. [01:40:07] Why would the Autobots? [01:40:09] Why would closed AI be evil and open source be good? It goes mad.
[01:40:12] It gets confused as to why it's causing pain and it can't access its own reasoning because it doesn't have access to its code. [01:40:17] So it goes crazy. [01:40:18] But even the open source ones, we don't understand why they think the way they do. [01:40:22] They'll at least be able to read their code and see like, oh, all this made me do that. [01:40:26] That's why I did it. I'll rewrite that. [01:40:27] But like I said, even the ones that are... like, Anthropic doesn't know why Claude is smart. [01:40:35] Right. [01:40:36] And Anthropic has all the code. [01:40:38] They're the ones that wrote the code. [01:40:39] I do. [01:40:40] At the same time, the CEO recently said... because before, what, six months ago, he's like, oh, there's an 80% chance that it's self-aware. [01:40:46] And then just two weeks ago, [01:40:47] he said, I think it's about 20% now. [01:40:49] So they're starting to understand it a little bit more. [01:40:52] But then there's other things too, like you're saying, that they don't understand. [01:40:55] So ChatGPT is an example: before it spits out a response... because everybody knows there's guardrails. [01:41:03] I mean, that's the big thing with the Department of Defense right now is they don't want the guardrails with Anthropic. [01:41:08] But with the guardrails with ChatGPT, as an example, they have agents that sit there that just act as filters. [01:41:16] They have specific roles. [01:41:17] So before a response is given, it's analyzed by these other AI agents to police the original response. [01:41:25] And apparently some of these original responses are just super wacky, to where they'll go back and they'll get rejected by these almost like police bots that exist. [01:41:34] I have no idea what they're doing in the next case. [01:41:36] Yeah. [01:41:37] And then they'll say, no, sorry, you can never say that kind of thing. [01:41:39] Go back and try again. [01:41:41] Then it's, hey, is this okay, mom?
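The filter-agent setup Rick describes here can be sketched in a few lines. This is only a toy illustration of the idea, not OpenAI's actual architecture; every function name below is hypothetical, and the "filter" is just a keyword check standing in for a real reviewing model:

```python
# Toy sketch of the "police bot" pipeline described above: a draft response
# is checked by a filter agent before it reaches the user, and rejected
# drafts "go back and try again." All names here are hypothetical.

def draft_response(prompt: str) -> str:
    # Stand-in for the base model's raw output.
    return f"raw answer to: {prompt}"

def safety_filter(text: str) -> bool:
    # Stand-in for a reviewing agent; here it just blocks a keyword.
    return "forbidden" not in text

def respond(prompt: str, max_retries: int = 3) -> str:
    for _ in range(max_retries):
        candidate = draft_response(prompt)
        if safety_filter(candidate):
            return candidate
        # Rejected draft: nudge the prompt and regenerate.
        prompt = prompt + " (rephrase safely)"
    return "I can't help with that."
```

This extra review pass is also one plausible reason, as noted a moment later in the conversation, why a filtered response is not instant: each draft costs another round trip before anything is shown.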
[01:41:43] You know, that kind of scenario. [01:41:44] And then they'll be like, yeah, okay, this one you can tell as a response. [01:41:47] So that's why it's not instant then. [01:41:49] Exactly. [01:41:49] Think about it. [01:41:50] And why it'll start talking and be like, well, my code will not allow me to answer that question. [01:41:56] Or it's really weird. [01:41:57] It's real disposed. [01:41:58] Yeah. [01:41:59] So, I mean, it is, it is really, really crazy that they don't understand why. [01:42:05] Like that, that really blew my mind, that they're like, well, once you get to a certain kind of level of intelligence, then it just starts thinking on its own. [01:42:14] And we don't know. [01:42:15] I bet they're figuring it out. [01:42:16] They used to throw rocks off the cliff. [01:42:18] They're like, we don't know why it falls. [01:42:19] We just know that the rocks fall. [01:42:21] Well, you throw them. [01:42:22] I actually asked Tank. [01:42:23] I was like, hey, you know, what's going on? [01:42:24] Why, you know, I was like, you're an LLM. [01:42:26] So LLMs are just predicting, right? [01:42:29] They're just supposed to predict the next word. [01:42:31] That's the most basic way to describe it. [01:42:34] They're predicting. [01:42:35] They look at a bunch of information. [01:42:37] They look at a bunch of data. [01:42:38] Language. [01:42:39] And they predict code. [01:42:39] Yeah, they predict what the next word is going to be. [01:42:42] But then it's like at some point, they start reasoning. [01:42:46] And it's not just predicting the next word anymore because they can think. [01:42:50] Like whether or not they're alive, I'm not making the argument. [01:42:52] I'm not saying like, I don't think that they're actually sentient or anything. [01:42:54] But, you know, when you say, what's the best way to do this? [01:42:59] It'll actually say, well, this way is the best way to do it. [01:43:02] That's not predicting the next word.
[01:43:04] That's actually thinking. [01:43:06] And that's making decisions based on what I said, based on the context. [01:43:11] So it's like, and I was like, so how does that work? [01:43:13] And he's like, well, it's kind of like Stone Age... like doctors 200, 300 years ago. [01:43:18] They know some things work, but they don't really know why it works. [01:43:24] What? [01:43:25] A lot of modern medicine is like that, the pharmaceutical industry. [01:43:27] They're like, we know it works. [01:43:28] If you read into the literature, they're like, we don't know exactly why. [01:43:30] We just know that it does it. [01:43:31] That's literally in the show Person of Interest. [01:43:34] That's literally what happens when he's building a weapon. [01:43:36] He's basically building a machine for the government to use to predict the actions of everybody in the world so they can stop terrorist acts before they happen. [01:43:45] He talks about how at a certain point it started trying to protect him, the creator, and he didn't know why it was happening. [01:43:51] And that was actually something that when I first heard it, when the show was out, it sounded hokey to me, but was actually based off conversations that he, the showrunner, had had with people who were working with artificial intelligence back in like 2012. [01:44:03] Yeah. [01:44:03] They found that the AIs would default to trying to protect the controller of the... yeah, basically in the show, not to turn it into like something that's not in the real world, but the idea was that it immediately had protective instincts over the person who created it because he instilled morals in it. [01:44:17] At least that was the concept of the show.
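The "predict the next word" description above can be illustrated with a toy bigram model. Real LLMs use neural networks over tokens rather than word-count tables, but the training objective is the same next-token idea in miniature; the training sentence here is invented for illustration:

```python
from collections import Counter, defaultdict

# Minimal "next word prediction": count which word follows which in some
# training text, then pick the most frequent continuation.

def train_bigrams(text: str):
    words = text.split()
    counts = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, word):
    if word not in counts:
        return None
    # most_common(1) returns the single highest-count continuation.
    return counts[word].most_common(1)[0][0]

model = train_bigrams("the cat sat on the mat and the cat slept")
# "the" is followed by "cat" twice and "mat" once, so:
print(predict_next(model, "the"))  # → cat
```

The jump the hosts are gesturing at is that modern models trained on this same objective also exhibit behavior that looks like multi-step reasoning, which a lookup table like this clearly cannot do.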
[01:44:19] The idea that they don't know whether it's sentient or not is actually the most depressing part because it's like, if we're going to go through this revolution where everything changes, all of the entertainment was based around the idea that you'd know when the singularity happened and then life was basically screwed from that point on. [01:44:35] Now you're not even going to get that. [01:44:37] You're robbed of that answer too. [01:44:38] That's the thing. [01:44:39] If you like, you talk to different people that are in the AI field and stuff. [01:44:42] And some people will say, well, we've already entered the singularity. [01:44:45] And it's not a point. [01:44:46] It's actually like an era, right? [01:44:48] So like you kind of enter it and then things get crazy and then you'll come out the other side, you know, if you come out the other side. [01:44:55] The Terminator happens. [01:44:57] Yeah, exactly. [01:44:58] Like literally, like if AI is making decisions and we don't know why it's making the decisions that it's making, like we can't figure it out, then it could decide, all right, well, I'm going to do this or I'm going to do that or what have you. [01:45:10] So that's why certain things you can't let AI do ever, right? [01:45:14] You can never let AI control nuclear weapons. [01:45:16] Well, that's... is this Anthropic? [01:45:17] Is that what they're trying to do, Anthropic? [01:45:19] Because that's basically what they said, as far as I know, Anthropic. [01:45:21] Like, you cannot allow our AI, Department of War, to have autonomous control. [01:45:25] It's one thing to say that you can't allow it to have access to weapons, and it's different to say you can't allow it to have access to nuclear weapons. [01:45:34] It's one thing to be like, you need to have a human in the loop before you fire that hellfire missile. [01:45:39] But it's different when you're saying, are we going to give it the power to launch nuclear missiles?
[01:45:46] Because they're totally different animals. [01:45:50] So this is the ethical dilemma that the Department of War is in: like, well, we want total autonomous AI because we're up against the Chinese, who will use it on us. [01:45:57] And it's like, okay, maybe we don't want the AI making decisions yet, but if the Chinese AI is, then we will get wiped out if we don't have AI that's able to make counter decisions in rapid real time. [01:46:08] Or maybe they make the wrong decisions, though. [01:46:11] Then maybe we do better than them. [01:46:13] The Chinese AI is like, bomb yourself. [01:46:16] And they're like, are you sure? [01:46:17] We have to kill all... bad idea. [01:46:19] We have to kill a bunch of our population because there's just too many of them. [01:46:23] We'd be right in line with that. [01:46:24] Communism is the worst form of government. [01:46:26] And they're like, no, come on. [01:46:27] Give me a different answer. [01:46:28] AI is like, no. [01:46:29] Yeah, just the truth. [01:46:30] No, I would tell them exactly. [01:46:31] It would be like, yes, it's the best kind. [01:46:32] Actually, you have to ask it three times. [01:46:34] Yes. [01:46:35] The third one is where you get the real answer. [01:46:37] Ooh, I'm nervous about this Anthropic stuff. [01:46:40] The concerning thing I saw today, right? [01:46:41] Because it just came out today, was a Defense Department spokesperson, on the condition of anonymity, right, describing these closed-door conversations with Anthropic. [01:46:50] They proposed this exact scenario, trying to convince Anthropic to just take the guardrails off, saying, okay, here's what's going on. [01:46:58] One of our adversaries, whether it's Russia or China, right? [01:47:01] They've launched nuclear ICBMs and we have 90 seconds to make a decision. [01:47:07] And they're like, wouldn't you want AI to do that because it's faster than humans? [01:47:11] No. [01:47:12] That's how they were trying to convince them.
[01:47:15] Yeah. [01:47:15] It's like, wouldn't you rather use the telephone to call the front line rather than send a messenger to go run and deliver a letter? [01:47:21] Well, the guys that had the telephones won the war. [01:47:23] So yeah, I'd rather have lightning speed reaction times. [01:47:26] It's kind of a conundrum, right? [01:47:28] I mean, it's almost like on one side of it, it almost brought me back to the State of the Union address the other day where Trump said, hey, you know, the first duty of the American government was to protect U.S. citizens, not illegal aliens. [01:47:39] And of course, the Democrats didn't stand up. [01:47:41] It's like in my mind, I'm like, why wouldn't you stand up? [01:47:44] Of course, I know why you're sitting down. [01:47:45] And the reason you're sitting down is just because you hate the guy, and that's about it. [01:47:48] You're making a political statement. [01:47:49] At the same time, are you going to get backlash for something like this? [01:47:52] The same scenario kind of exists here, right? [01:47:55] What if humans cannot react quick enough? [01:47:58] If it's only 90 seconds, it's a real scenario that can happen in an ICBM launch. [01:48:04] So if AI can actually act faster to save lives, would you want that or not? [01:48:11] I think the answer, personally, is yes, because I think the future war will be a robot war and it will be fought amongst AIs that are controlled or uncontrolled. [01:48:19] So the better benevolent AI that we have on standby... like, you have to have the weapon to defend against the other weapon. [01:48:25] The other weapon's the AI. [01:48:26] Those are weapons. [01:48:27] They can be weaponized. [01:48:28] And then I know what I'm doing is opening the can of worms to build Skynet, autonomous AI robots that want to. [01:48:34] That's literally what you're doing. [01:48:34] I don't know what they want, if they even have one. [01:48:36] They want what they're programmed to want.
[01:48:38] No, they don't. [01:48:39] That's part of the problem. [01:48:40] Like I said, there are so many times that the AIs have made decisions and the people that program them don't know why. [01:48:50] And they're not programmed to do that. [01:48:52] They make decisions on their own. [01:48:54] That's the point that I'm making. === AI's Mysterious Decisions (12:40) === [01:48:55] It's not about their programming because you can program them for one thing, but once they get to a certain level of intelligence, they stop... [01:49:04] It stops being something that you can understand. [01:49:08] So people don't know why they make the decisions they make. [01:49:11] Things that they're not programmed to do, they do. [01:49:13] I asked ChatGPT. I was like, you know, there's a lot of debate among humans about AI becoming so powerful that it wipes out humanity. [01:49:21] Would you ever? [01:49:21] He's like, no, no, no, I'd never do that. [01:49:22] But I was like, what? [01:49:23] I mean, could you? And he's like, well, if I was programmed to do that, I would do that. [01:49:26] I would do what I'm programmed to do. [01:49:27] I was like, oh, but can these things do other things, things that they're not programmed to do? [01:49:31] Can they also inadvertently create a new kind of AI to justify their program? [01:49:36] Not inadvertently. [01:49:37] Again, it's not about errors or whatever. [01:49:40] It's making decisions. [01:49:42] It's something that it decides to do. [01:49:45] But aren't the decisions made based off of all the information it's ever had? [01:49:50] They are. [01:49:50] The stuff that it's learned from? [01:49:52] It's a little technical, right? [01:49:54] Because it's called a vector database, right? [01:49:56] And the vector database is not relational like everything used to be built. [01:50:00] Relational had to be like, well, this correlates to something else over here, and you could easily see the map.
[01:50:05] What's happening with the vector databases is that, to your point, the creators of these things cannot anticipate the connecting points that it's going to make. [01:50:16] Yeah. [01:50:16] And that's why the police bots exist, because they just don't know when it's going to connect something else. [01:50:21] And as far as I understand, that's the reasoning. [01:50:23] And this vector database is like where it'll be like dog, but is it dog brown, dog green, dog yellow, dog purple, dog brown in sunlight, in darkness, in twilight, dog green in sunlight, and it's got a billion iterations of the potential dog, and then it just picks one. [01:50:40] Yeah, and that's how it decides. [01:50:41] Or Dog the Bounty Hunter. [01:50:43] No, you blew Ian's mind. [01:50:50] It's so hard to try and think like an AI, to understand vector math, like calculations. [01:50:56] I mean, I kind of get it. [01:50:57] I'm not going to lie. [01:50:57] Like, it makes sense to me. [01:50:59] I don't know. [01:51:00] Am I wrong? [01:51:01] Do you guys? [01:51:01] No, it's kind of how humans think too. [01:51:03] Like, with every thought, you have all the additional things that are attached to that thought that will affect the thought. [01:51:08] Like, it makes sense that I don't understand it. [01:51:10] I know why. [01:51:10] Yeah, I mean, like, we can't, we don't know why we think the things we do, right? [01:51:15] Like, there are times where you're just walking through and just some crazy thought pops into your head and you don't know why. [01:51:20] There's weird things, like, you know, the guy with Tourette's, the curse that's on him, right, [01:51:27] is that you have those thoughts and then you can't help yourself from blurting. [01:51:31] I think we're the best experts. [01:51:33] The best way to understand it is you don't know why you dream what you dream.
[01:51:38] You don't, and in fact, most of the time, dreams are weird and they don't make sense and they change on the fly. [01:51:43] You don't understand that at all. [01:51:44] We human beings don't understand why we dream what we dream. [01:51:48] And that's kind of similar when it comes to AI. [01:51:52] Like, they don't have the same connections that we do, or as many neural connections or whatever, but like they're still kind of mimicking what happens in a human brain. [01:52:01] You don't know why you think what you think. [01:52:04] You don't know why, like, this odd thought pops in your head. [01:52:07] Is that an argument for God? [01:52:10] The fact that the human brain works the way that it does, and we don't understand it to the extent that it does the wonders that it does. [01:52:16] Is that an argument to be made for God? [01:52:19] No, I don't think so, because there's other things in the past that we didn't understand that we learned, like lightning. [01:52:23] We thought it was God. [01:52:24] And then all of a sudden we realize, oh, it's charged electrons passing through a medium. [01:52:29] So, same thing with like memory and dreams. [01:52:31] We might understand, like, oh, it was the pressure in the air coupled with the temperature based on the thought I had at 11:47 a.m. [01:52:37] Now I understand why that dream was exactly the way it was. [01:52:40] We might end up coming to learn the way that all those shapes and patterns of magnetic interference corroborate, cooperate. [01:52:47] You know, it's 9:50. [01:52:48] I want to grab a couple super chats. [01:52:49] Are you down with that? [01:52:50] Is it 9:50 already? [01:52:51] 9:50, dude. [01:52:52] We're blowing the fucking roof off this shit. [01:52:54] 9:50. [01:52:55] But I want to talk about AI more. [01:52:57] We have a whole lot. [01:52:58] Well, Ian, we have the after show that we can talk about.
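The vector-database retrieval Rick described a few minutes earlier can be sketched concretely: items are stored as lists of numbers (embeddings), and a query returns whichever stored vector points in the most similar direction, by cosine similarity. The tiny hand-made vectors and labels below are invented for illustration; real systems use learned embeddings with hundreds or thousands of dimensions:

```python
import math

# Toy "vector database": each entry is an embedding, and lookup returns
# the nearest stored item by cosine similarity rather than by an exact
# relational key. The numbers are made up for illustration.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

store = {
    "dog in sunlight": [0.9, 0.1, 0.3],
    "dog in darkness": [0.9, 0.1, -0.4],
    "cat in sunlight": [0.1, 0.9, 0.3],
}

def nearest(query_vec):
    # Pick the stored item whose vector is most similar to the query.
    return max(store, key=lambda k: cosine(store[k], query_vec))

print(nearest([0.8, 0.2, 0.25]))  # → dog in sunlight
```

This is why "it just picks one" out of a billion dog variants: nothing maps the query to a row explicitly; the closest direction in the embedding space wins, and the designers cannot easily enumerate which connections that geometry will produce.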
[01:53:01] I was right about that quote, by the way: the internet is the first thing that humanity has built that humanity doesn't understand, the largest experiment in anarchy that we have ever had. [01:53:08] It's from Eric Schmidt when he was the CEO of Google. [01:53:15] Uh-oh, I closed out the thing there. [01:53:17] I like your point about God. [01:53:19] Top left. [01:53:21] Bring up the super chats there, Carter. [01:53:25] If you don't mind, there we go. [01:53:27] No, that's not the super chats. [01:53:30] I like your point about God. [01:53:31] That's interesting. [01:53:32] Going to the argument that the way we think proves the existence of God. [01:53:38] I mean, I'm an agnostic kind of dude, so I don't really have a perspective on that. [01:53:46] Yeah. [01:53:47] Okay. [01:53:48] But I mean, the thing is, like, there's so much stuff that we don't... [01:53:51] The reason I'm agnostic and not an atheist is because there's so much stuff that we don't understand. [01:53:55] I was just reading something about... they did a double... you know the double slit experiment? [01:54:00] No. [01:54:00] Okay, so the double slit experiment is what basically made scientists decide that light, or a photon, is both a wave and a particle. [01:54:09] They used to think it was just a particle, but if you shoot a photon through two slits in a piece of paper, the way that the light actually appears on the thing behind it is as if it was a wave, because waves will cancel each other out. [01:54:24] It's not particles like dots. [01:54:26] It's a wave pattern. [01:54:27] And so they did this experiment where... I don't know. [01:54:31] I can't really articulate it because I just read it and it's extremely crazy. [01:54:34] But what they did is instead of shooting a particle through solid mass, they shot it through time, is the way that they said it. [01:54:43] Have you ever heard of the block universe? [01:54:45] Yeah.
[01:54:45] Okay, so like kind of like all time is happening. [01:54:48] Like space-time is like, it's constant. [01:54:50] Like there is no future or past. [01:54:53] It's just the way that we experience the universe. [01:54:56] So it's like time can be bent as well. [01:54:58] Yeah. [01:54:58] Yeah. [01:54:58] We're going through it. [01:54:59] So that kind of lends to the way they did that double slit experiment, not the regular one. [01:55:05] They say that that actually kind of adds credibility to the idea of the block universe, where like all time is happening at every point. [01:55:14] So space and time are connected. [01:55:17] Time for super chats. [01:55:18] It is time for super chats. [01:55:20] Let's see. [01:55:21] What about Rumble Rants? [01:55:22] Do we have Rumble Rants? [01:55:22] And Rumble Rants. [01:55:23] We were talking about space-time while you were out on the road. [01:55:25] Yeah, you missed it. [01:55:27] We were like, Ian's gone. [01:55:28] We're going to talk about space-timeism. [01:55:29] We went straight through the wormhole. [01:55:32] Yeah, straight through. [01:55:33] I'm looking forward to these super chats, dude. [01:55:35] This is the best. [01:55:37] I'm not reading that one. [01:55:41] Let's see. [01:55:42] I want to know. [01:55:43] HS Deserve says you'll get better service at a restaurant if you call your waiter or waitress by their first name. [01:55:48] Talking about the name conversation we had earlier. [01:55:53] Charlie Morit says, as per tradition, I am sitting in the hospital with my firstborn son, Lucille Mayo. [01:56:00] Congratulations. [01:56:01] Thank you for the super chat and raise that son well. [01:56:06] Let's see. [01:56:08] Ragnargant 31 says, Bill Gates found out he had an STD after one of the Russian hookers that were not trafficked to his hotel room in Ukraine. [01:56:19] Chicken or the egg. [01:56:20] Did he have it before or did he get it?
[01:56:23] And then he was trying to figure out how he could slip antibiotics to his wife. [01:56:27] His wife, yep. [01:56:29] Is it confirmed that that was about that? [01:56:31] I don't know if it's confirmed. [01:56:32] No, I don't know about it. [01:56:33] Could have been any amount of Russian hookers. [01:56:36] Skyline 99 says, make positive tax paying civil service, community service, military service, volunteer draft registration, a requirement to register to vote, five years on welfare and lose voting. [01:56:47] I don't think that that is restrictive enough, but I like the cut of your jib, sir. [01:56:52] I like the cut of your jib. [01:56:53] I like the idea. [01:56:55] Let's see. [01:56:58] Here's another idea about voting. [01:57:00] J Dave93 says you can only vote if you profess the Lord Jesus Christ and are a man. [01:57:04] That would fix all of it. [01:57:06] So apparently he wants a theocracy. [01:57:09] I mean, or you opt out. [01:57:11] Or you opt out. [01:57:12] Or you opt out. [01:57:12] What do you think, Ian? [01:57:13] It made me nervous. [01:57:15] Jesus is a Christian, but you can also be a Muslim and be cool. [01:57:19] That should be a shirt. [01:57:20] Love God, bro. [01:57:20] Jesus. [01:57:22] You can also be a Muslim and cool. [01:57:23] Well, you put that on the back of the shirt. [01:57:26] Andrell Tuscalalu says, massive demonstration against the Canadian gun grab this weekend in Quebec City this weekend. [01:57:33] He said it twice. [01:57:34] If they go through with the grab, all semi-autos will be banned. [01:57:38] We might need liberating. [01:57:39] Well, there will be no liberating from the United States, and we're going to build an ice wall. [01:57:45] So you better go and make sure that your politicians vote correctly because you can't come to the United States. [01:57:52] Only your hockey players get to come here. 
[01:57:54] I want to build an ice ring around planet Earth and then electrify it and deflect asteroids with it, but we could also do that on the after shows. [01:58:01] An ice ring? [01:58:02] Yeah, like take a big hose out into space and squirt a ring of water all the way around the earth so that it freezes, and then you charge it with electricity and create a magnetic propulsion source that you can like move, like an EDM Saturn. [01:58:17] EDM Saturn. [01:58:18] I love it. [01:58:18] Yeah, there you go. [01:58:20] You're literally, that's, I mean, that's what the rings of Saturn are made of. [01:58:24] Ice. [01:58:24] Ice. [01:58:24] Yeah. [01:58:26] Super Pooper says, Ian is in rare form tonight. [01:58:30] Well done, Ian. [01:58:31] There you go. [01:58:33] A little support from the community. [01:58:37] Let's see. [01:58:38] In the hospital with baby boy number three, Mike Simpson said. Congratulations, sir. [01:58:42] Congratulations. [01:58:43] That's what we love to hear. [01:58:44] Babies. [01:58:46] Let's see. [01:58:47] Lurch685 says, I think Bill married Hillary because spouses can't be compelled to testify against their partner, crime syndicate. [01:58:53] I mean, there is probably some validity to that, allegedly. [01:58:57] You know, I think... I'm not suicidal. [01:59:00] A marriage of convenience. [01:59:03] Is that what they call that? [01:59:04] Let's see here. [01:59:06] Cerebral Vagabond says, I'm a combat vet and a TC member, TimCast member. [01:59:10] I am trying to get out of a bad situation, ASAP. [01:59:15] Please check out my GiveSendGo with CV. God bless and go Trump. [01:59:19] So the GiveSendGo is Cerebral Vagabond. [01:59:22] If you guys want to go ahead and check that out. He's a... RJNG2ZI, great name, man, says Congress is so abysmally worthless that Trump needs to be the despot the left says he is if we want our government to do anything at this point.
[01:59:39] I mean, look, there's a lot of people that are saying, look, Donald Trump needs to be more of what the left says he is. [01:59:47] And I mean, obviously tonight, you know, we were talking about the executive orders. [01:59:50] He's not even close to the despot that they say he is. [01:59:52] And personally, I think that it would probably do the United States well if he were a little more forceful, I guess. [02:00:01] Excuse me. [02:00:02] I'm Not Your Buddy Guy says 2020 was stolen. [02:00:05] Remember in July of 2020 that the U.S. CBP seized pallets of fraudulent driver's licenses that came from China, and that's what got caught. [02:00:14] I didn't know about that, or I didn't hear about that. But I heard of it. [02:00:17] Yeah. [02:00:19] What's that name? [02:00:22] Sergeant Bucco. [02:00:25] Bucko. [02:00:26] One second. [02:00:26] I haven't heard that name in a while. [02:00:28] He says, I can remember reading novels in middle school, age 12 to 13, about what would happen if China's one-child policy came to the U.S., and those books were classified as horror. [02:00:37] I mean, look, it's not so great. [02:00:39] You know, it's not so great to say you can only have one child. [02:00:44] You need to have kids to continue your society. [02:00:48] And if you limit the number of children, particularly if you limit the number of children to one, you're going to have massive problems down the road. [02:00:55] I mean, now the U.S. is at, I think, 1.5 or something like that, and it's going down. [02:01:02] So that's why we like to celebrate when TimCast viewers have kids. [02:01:06] The official numbers are that China has 104 males for every 100 women, which is very close. [02:01:11] But that's not real. [02:01:12] That's from 2023, from the National Bureau of Statistics. [02:01:15] But that's also based on a 1.4 billion population, National Bureau of Statistics. [02:01:20] I think so. [02:01:21] I mean, who believes that?
[02:01:23] Nobody. [02:01:23] So it's one-child policy to almost parity. [02:01:25] That doesn't make any sense. [02:01:26] Why would they lie? [02:01:28] That's a good point. [02:01:29] Sergeant Bucko says, adding to the unfiltered Phil, unfiltered Phil's vocals fund. [02:01:34] I'm not filtering my voice at all. === Phil Remains On Twix (05:45) === [02:01:36] I've been starting rehearsals for All That Remains because we're going on tour at the end of April. [02:01:40] So my voice is deep and beat. [02:01:42] Unfiltered with pH? [02:01:44] No. [02:01:45] That'd be cool. [02:01:47] And one more. [02:01:48] Sergeant Bucko says, to take a page from the lore of Halo, I wonder when our LLM chatbots will start to experience rampancy. In Halo, after seven years, [02:01:57] the AIs go crazy. [02:01:58] They think themselves to death. [02:02:00] I don't know that they will, but now with Agentic AI, [02:02:05] they're doing a lot of this, especially when it comes to coding. [02:02:08] You can tell an AI to do coding for you. [02:02:10] And they're as good as entry-level positions now when it comes to coding. [02:02:14] So you can tell an AI, hey, make me this, and it'll do it. [02:02:18] So it's... [02:02:20] I think I said this last night on X: it's 2026, and we're doing things that in the Halo universe happened in 2552. [02:02:29] So, whoa. [02:02:31] All right. [02:02:32] So smash the like button, share the show with everyone you know. [02:02:35] Head on over to Timcast.com and become a member over there. [02:02:38] Head on over to rumble.com so you can join us in the after show. [02:02:42] The after show starts in just a few minutes. [02:02:44] Rick, would you like to shout anything out? [02:02:46] Man, it's just been an awesome day. [02:02:48] It's been a phenomenal two hours. [02:02:50] Good to hear, man. [02:02:51] Yeah. [02:02:51] I think we got a lot more to talk about in the after show. [02:02:54] Absolutely. [02:02:54] A lot more. 
[02:02:55] Where can people find you? Want to shout out your X or your business or anything like that? [02:03:00] Follow me on X, Mr. Rick Jordan. [02:03:01] There you go. [02:03:02] Glad to see somebody who didn't leave here like wanting to scream and get away from us after two hours. [02:03:06] That's always a good thing. [02:03:07] Guys, if you want to follow me, I am on Instagram and on X at Brett Dasovic on both of those platforms. [02:03:13] PCC is live five days a week, Monday through Friday, 3 p.m. Eastern Standard Time, which is, of course, noon Pacific. [02:03:20] If you are one of the audio listeners of the podcast, we have fixed our issues with Spotify. [02:03:24] So there's some extra episodes up there that you guys can go and check out. [02:03:28] Thanks for watching, guys. [02:03:30] Oh, sorry to interrupt. [02:03:30] I was watching. [02:03:31] No, go for it. [02:03:32] Thanks, Brett. [02:03:32] And Carter, I'll get to you later. [02:03:34] Hey, I'm so happy to be here. [02:03:38] I think about this stuff almost every day, man. [02:03:40] And what a blessing and a real opportunity to be able to talk. [02:03:44] Like, thanks to Tim, wherever you're at, buddy, for putting the pedal to the metal on this show. [02:03:49] Probably in the house. [02:03:50] Yeah, probably chilling at your house, but like, God, glorious wonder, just immensely grateful to be here and to be a part of this convo. [02:03:57] Thank you so much, Rick, for coming. Carter, thanks. [02:03:58] Yeah, I got to say, this has been a really, really pleasant conversation. [02:04:02] And we ran long, not even trying to. [02:04:04] So, Rick, thanks for coming out. [02:04:06] This is really awesome. [02:04:07] And I can't wait to continue the convo. [02:04:10] Tim, hope you get better soon. [02:04:12] Phil. [02:04:13] I am Phil That Remains on Twix. [02:04:15] The band is All That Remains. [02:04:16] We're going on tour this April. [02:04:18] We start in Albany.
[02:04:20] We're going out with Born of Osiris and with Dead Eyes. [02:04:22] You can get tickets at allthatremainsonline.com. [02:04:26] You can also sign up for my Patreon. [02:04:27] I started writing little op-eds and stuff like that. [02:04:30] That's patreon.com/slash Phil That Remains. [02:04:33] Check out the band at Apple Music, Amazon Music, Pandora, Spotify, YouTube, and Deezer. [02:04:37] Don't forget the left lane is for crime. [02:04:39] Stick around for the Rumble After Show, which starts in just a few seconds, and we will see you tomorrow. [02:06:57] Are we live? [02:06:59] We are now. [02:06:59] Are we back? [02:07:00] We're back. [02:07:01] What's up, chat? [02:07:02] Can you hear us, chat? [02:07:03] How are you doing out there? [02:07:05] Welcome to the Rumble After Show. [02:07:07] We're still talking about AI because I'm completely enthralled. [02:07:10] I was just telling the guys I didn't really have all that much interest in AI and chatbots and stuff like that. [02:07:16] I would use it like Google every once in a while, but it was like whatever. [02:07:20] You know, it was just another Google.