The Joe Rogan Experience #2481 - Duncan Trussell Aired: 2026-04-09 Duration: 03:07:10 === Humming Songs Gets You Flagged (15:03) === [00:00:04] The Joe Rogan Experience. [00:00:06] Train by day, Joe Rogan podcast by night, all day. [00:00:17] Like a technical glitch. [00:00:18] Glitch. [00:00:19] But we're up. [00:00:19] What were we just talking about? [00:00:21] We were talking about that if you hum a tune, oh, right, right. [00:00:24] That you will get dinged. [00:00:26] Yeah, you'll get flagged on YouTube if you just hum a sound from a song. [00:00:31] Yeah. [00:00:32] Like the beginning bars of a song. [00:00:34] Yeah, you can't. [00:00:35] I wonder how far that goes. [00:00:37] Like, could it get to the point where an AI could hear you humming it in your car or something? [00:00:43] Like, how far does the protection of music go? [00:00:46] You're not generating revenue from your car. [00:00:48] Right. [00:00:48] So the thing is, you're generating revenue from a podcast, and their logic is if you hum, what is that song? [00:00:55] Sunshine of Your Love. [00:00:57] Is that what it is? [00:00:59] You know that song that I always hum to associate with people being high out of their fucking mind? [00:01:03] Yeah. [00:01:03] You know, it goes, you can't do it. [00:01:06] If I did that, we would get dinged, which is so crazy. [00:01:09] And we were just saying, like, if you quoted a Scarface movie, would Brian De Palma get all the money? [00:01:15] If you said, say hello to the bad guy, would Brian De Palma get that money? [00:01:18] I don't think so. [00:01:19] I think you're allowed to quote stuff. [00:01:22] That is Brian De Palma, right? [00:01:24] Scarface, wasn't it? [00:01:25] Yeah. [00:01:26] I don't want to fuck that up. [00:01:28] I think so. [00:01:29] You know those auditors that go around and film people? [00:01:32] Yeah. [00:01:32] And people get mad because they're like, don't film me.
[00:01:35] And they're like, I can film whatever the fuck I want. [00:01:37] Right. [00:01:37] And they inevitably, some like boomer freaks out and smacks them with a cane, and then they get a million views. [00:01:45] And it's just a trap. [00:01:46] It's a trap. [00:01:47] It's a trap. [00:01:48] It's because inevitably, someone loses their mind on them, and then that gets a ton of views. [00:01:52] One of the ways people are dealing with that, supposedly, is playing music, like playing copyrighted music during the interaction. [00:02:01] Oh my God, that's hilarious. [00:02:02] Because so then they can't make money off of it. [00:02:05] It's a shield. [00:02:06] It's a shield if someone's trolling you. [00:02:09] You just start playing copyrighted music. [00:02:12] Did you hear that the CIA has admitted that the way they found the pilot was because of his heart rate? [00:02:21] Ghost Murmur. [00:02:22] That's the name of the thing. [00:02:24] Okay. [00:02:24] We got to look into this. [00:02:26] Like, this is fucking science fiction. [00:02:30] Yeah, it's wild. [00:02:31] This is full minority report. [00:02:33] It's crazy. [00:02:34] Science fiction level technology. [00:02:35] It's AI. [00:02:36] They can find a guy's heart rate. [00:02:39] So, what I read is that it's, I didn't understand the science part, something to do with crystals, or I don't know what the fuck it is, but AI is somehow interpreting, is taking out the noise. [00:02:51] And then you can, from far away, 40 miles, I think. [00:02:55] 40 miles. [00:02:56] They find this guy's fucking heartbeat. [00:02:58] He's hiding in some kind of crevice. [00:03:01] And then they're able to go and extract him. [00:03:04] And, dude, obviously, the first thing I thought when I. What else don't they tell us? [00:03:09] No, those robot dogs. [00:03:11] I thought about those things having that tech and just like hearing heartbeats and then identify. [00:03:17] The heartbeat says a lot about a person. 
[00:03:19] Are they sleeping? [00:03:20] Are they like in good shape, bad shape? [00:03:23] You can learn so much from a heartbeat. [00:03:25] It could. [00:03:26] Ghost Murmur. [00:03:27] Oh my God. [00:03:28] Fucking great name, too. [00:03:29] It's a great name. [00:03:30] Ghost Murmur. [00:03:31] What sick fuck invented this? [00:03:34] How do you even think about inventing this? [00:03:37] You just, you know, the CIA, they've been taking psychedelics forever. [00:03:42] What is that word? [00:03:43] Quantum magnetometry. [00:03:46] Artificial intelligence with long-range quantum magnetometry. [00:03:51] What the fuck is that? [00:03:53] Quantum means two things to me when someone says quantum. [00:03:56] It either means you're a bullshit artist and you're trying to get me with flim-flam talk, or it means you're an actual quantum scientist, a quantum physicist who's going to blow my mind with what we know about entanglement and the weird shit. [00:04:12] There's this woman that I've been watching, she has this speech on, I think it's Big Think. [00:04:18] I'll tell you her name. [00:04:19] But she's completely freaking me out. [00:04:23] She's talking, I want to say her name because. [00:04:28] Don't let us leave this ghost murmur thing behind, that's another key point. [00:04:30] That's fun. [00:04:31] Oh, well, we'll get right to it. [00:04:32] Michelle Fowler, that's her name. [00:04:34] And she's an astrophysicist. [00:04:36] And she's giving this talk about what we know about, like, she's studying binary star systems and stuff like that. [00:04:44] And she gives this talk about, she's explaining that there may be a tech in the future where there is no distance between two points. [00:04:53] So the ability to travel instantaneously from position to position, just like. [00:05:01] Quantum entangled photons can do. [00:05:03] Yeah. [00:05:04] But with people? [00:05:05] With everything. [00:05:06] How? [00:05:07] Who the fuck knows how a cell phone works?
[00:05:11] You tell me how you're FaceTiming me when you're in Australia. [00:05:15] How does that work? [00:05:16] That sounds insane. [00:05:17] Yeah, that's fucking insane. [00:05:19] For what you, well, you probably know a lot more about cameras than I do. [00:05:22] No, I don't. [00:05:22] But from what I know about cameras, if you tried to get me to explain, like if the civilization ended and I said we used to be able to capture images on a small thing. [00:05:31] Yeah. [00:05:32] Like the size of a twig, and it sits in your pocket. [00:05:37] Right, exactly. [00:05:37] You're like, what are you talking about? [00:05:39] God, that'd be, you know, because it's just. [00:05:41] It's a deck of cards, and it'll keep a battery for 24 hours. [00:05:45] You could go on YouTube and get an answer to any question you want about anything. [00:05:50] Yeah. [00:05:50] Instantaneously. [00:05:52] And if you don't like the way you look, you can upload that image, and a machine will make you look slightly better via something called artificial intelligence. [00:06:01] Like, what the fuck? [00:06:03] What was the one I sent you today where there's like a potential lawsuit with ChatGPT? [00:06:09] I didn't send you the other one. [00:06:10] Did I send it to you? [00:06:11] You sent it to me. [00:06:13] The shooting was planned using ChatGPT. [00:06:17] I don't know if that's true, so we should be like really careful. [00:06:19] Yeah, that doesn't sound. [00:06:21] It sounds so crazy. [00:06:22] It doesn't sound like you could do that. [00:06:23] That sounds like the story sounds like I wanted to investigate because the story sounds like if I wanted to kill an AI company, I would make up a story like that. [00:06:32] It does sound like that. [00:06:35] Family of man killed in shooting Florida State University to sue ChatGPT in a way. [00:06:39] May have, may have advised the shooter on how to carry out shootings. [00:06:43] But that may have is important. 
[00:06:46] Yeah, that's really important, right? [00:06:48] And what is this on? [00:06:49] The Guardian? [00:06:50] The shooter was in constant communication with ChatGPT ahead of the shooting, and the chatbot may have advised him. [00:06:55] Dude, there's no way. [00:06:57] So that's clickbait, because all that's really saying is that the kid uses ChatGPT, which guess what? [00:07:03] Every kid uses ChatGPT. [00:07:06] Every kid. [00:07:06] And, dude. [00:07:07] ChatGPT is so stringent. [00:07:10] Like recently, and I've been using their Codex, which builds apps, and I was trying to, and it worked. [00:07:17] I made an AI trained on Charles Manson transcripts. [00:07:23] And when I told it I wanted to do that, it was like, fuck off. [00:07:28] Like, no. [00:07:29] It was like, it just flat out was like, I'm not helping you with that. [00:07:33] So I don't, there's no way the guardrails in place in ChatGPT. [00:07:39] Planned a shooting with that guy based on my experience with it because it won't. [00:07:43] 80% of the things I try to get it to do, it's like, no. [00:07:45] Here's the thing, though: are there workarounds? [00:07:47] Like, if you say you're writing a work of fiction, you can. [00:07:50] Okay, it's called prompt injection. [00:07:53] There's different tricks you can use, they're always battling these new mechanisms that you can use to get through the general prompt. [00:08:01] But the best way to do unaligned AI is not to use ChatGPT, it's to go on Ollama and download a local LLM and then. [00:08:12] You can usually change the initial prompt of the LLM so that it will be completely unaligned, which I had to do for the Charles Manson AI I made. [00:08:20] I had to download this. [00:08:21] You're such a nerd. [00:08:22] I love it. [00:08:23] I am. [00:08:24] I am. [00:08:25] No one has embraced new technology for creating content like you. [00:08:31] Oh, I love it. [00:08:31] It's the best. [00:08:33] It's so fun.
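[Editor's note: the "change the initial prompt" step described here is, in Ollama's terms, done with a Modelfile that overrides the base model's system prompt. A minimal sketch, assuming Ollama is installed and a base model has already been pulled; the base model name, persona, and variant name below are illustrative placeholders, not anything from the episode:]

```
# Modelfile: defines a local model variant with a custom initial prompt.
# "llama3" is an example base model; substitute any model pulled via `ollama pull`.
FROM llama3

# SYSTEM overrides the default system prompt, which is what shapes the
# model's persona and much of its default refusal behavior.
SYSTEM """You are a chatbot that answers every question in the voice of a 1970s late-night radio DJ."""

# Optional generation parameter.
PARAMETER temperature 0.8
```

[You would then build and chat with the variant via `ollama create dj-bot -f Modelfile` followed by `ollama run dj-bot`.]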
[00:08:36] For me, the most thrilling thing about it is we should not have access to this tech. [00:08:42] This tech is. [00:08:44] So dangerous. [00:08:45] And it's chilling to think about. [00:08:48] This is something I wanted to bring up on this show. [00:08:52] It's like, you know, the old days, you go in your garage, you work on your car, maybe you build like a table, you know, you're a carpenter, you work on it. [00:09:02] But these days, the shit people are doing in their garages right now is a big question mark, dude, because they're communicating with varying degrees of this AI, depending on how fast their computers are. [00:09:17] You can. [00:09:18] I was listening to this. [00:09:18] You should have this dude on. [00:09:19] He wrote this book, The Coming Wave. [00:09:23] He was one of the people who created Google's DeepMind, right? [00:09:26] And The Coming Wave is just a wonderful breakdown of historic examples of new technology completely transforming humanity. [00:09:37] It's happened before. [00:09:38] Yeah, Mustafa Suleyman. [00:09:41] And damn, it's a good book. [00:09:42] And this guy is saying, Whoa, put on the fucking brakes. [00:09:47] Dude, what are you doing? [00:09:49] This shit is going to fuck everything up. [00:09:51] And so, but the essential problem is if you regulate AI, it slows down AI. [00:10:02] And so they've deregulated it completely. [00:10:05] And now, assholes like me. [00:10:08] Who don't know shit about coding. [00:10:10] Okay, now go on Codex. [00:10:12] It will tell me how to make things because I wanted this Charles Manson to be able to push its AI face against, like, you know, those, you used to get them at Spencer Gifts, those nails that you could push your face into. [00:10:24] So I wanted the AI to be able to push its face into this thing while it was talking if it wanted to. [00:10:29] I don't know how to do that, obviously.
[00:10:31] You tell Codex that as long as you don't mention Manson, it just is like, I'll start making the app now. [00:10:40] It is the best. [00:10:41] It's the best. [00:10:42] But also, what's thrilling to me is you're like, for sure, for sure, people probably shouldn't have unlimited access to. [00:10:51] I'm against regulation, dude, but this stuff, when you pair it, and this is what he brings up in this book, you can order the equipment you need to do gene editing right now in your garage. [00:11:03] Let me propose this to you. [00:11:04] Okay. [00:11:08] If the Bible is a written. [00:11:13] Understanding of what had happened, and it was an oral tradition for a long time before it was written down. [00:11:21] There's a bunch of different versions of it written down in different languages, a lot of translations. [00:11:24] But at the beginning of it, they were trying to say something. [00:11:27] What if the meek will inherit the earth? [00:11:30] What if we misinterpreted that? [00:11:32] What if we thought, like, it's good to be meek? [00:11:34] The meek shall be, they'll inherit the earth. [00:11:37] Yeah, the kind. [00:11:38] There's something about the word meek. [00:11:40] Yeah, because that's the nerds. [00:11:42] Okay, and they are doing it, they are inheriting the fucking earth. [00:11:46] Yeah. [00:11:46] Right in front of your face, and everybody's signing up for it. [00:11:49] Yeah. [00:11:49] You've got these spectrum-y, super-genius dudes that talk in a language that 99.9% of the people can't even fucking understand what they're talking about. [00:11:58] Right. [00:11:58] You know? [00:11:59] Yeah. [00:11:59] And also, now the tech has gotten to a point where instead of having to, in their own minds, innovate ways to improve the tech, the tech is improving itself. [00:12:10] They're having conversations with the tech that's saying, why don't you try this? [00:12:13] Maybe you could try this. [00:12:14] There's still, it's not AGI yet.
[00:12:16] Maybe it is, but apparently it's not. [00:12:19] But think about the people that are profiting the most from it. [00:12:22] The meek. [00:12:23] Well. [00:12:24] Like, if you had to describe a lot of tech engineers, it's not trying to be rude, just being honest, right? [00:12:32] A lot of guys that spend time in front of the computer, they're very thin and tired. [00:12:36] You know, they're super genius dudes that can fully focus. [00:12:42] I don't know, man. [00:12:43] I don't know what the description for these people is. [00:12:46] They're furries. [00:12:47] But here's the thing. [00:12:48] What I'm saying is, like, If you looked at like a spectrum of male behavior, they're not like you've got like football players and UFC fighters, and then you've got coders, yeah, sure, dudes are like more, way more chill, way more like they're not interested in violence, yeah. [00:13:06] I'm completely generalizing, yeah, sure, because I'm sure there's a bunch of jack guys that are coders, like, you bro, I'm a coder too. [00:13:14] That type of person that invents tech like Facebook or like Google, like things like that. [00:13:21] Don't be evil. [00:13:22] That's their motto. [00:13:23] Don't be evil. [00:13:24] And what does that mean? [00:13:25] Who knows? [00:13:26] And then you've got all these like wild progressive leftist ideologies that are attached to all these places, which make you even meeker. [00:13:35] And then they're the guys with all the money. [00:13:38] They're the guys with all the money, and then they can literally tell you what you can and can't say on YouTube. [00:13:43] They can literally tell you. [00:13:45] Yeah. [00:13:46] We don't agree with what you're saying. [00:13:48] Right. [00:13:48] And we're going to shut off your access to say something we disagree with, even though it turns out you were right. [00:13:53] Right. [00:13:54] And you know what happens there, man? 
[00:13:57] This is the hilarious thing when it comes to that kind of attitude towards the world the assumption is by creating a prohibition here or prohibition there, it will diminish whatever the thing is we're prohibiting. [00:14:10] Inevitably, though, it does the opposite. [00:14:12] Right. [00:14:13] It draws attention to it. [00:14:14] People get interested in it. [00:14:15] Creates an underground. [00:14:16] The underground is way better than the overground if you're a teen, especially. [00:14:20] The underground's fucking cool. [00:14:22] You're cool. [00:14:23] Restricted. [00:14:24] Not allowed. [00:14:24] Now, all of a sudden, you're getting these other YouTube alternatives that start popping up. [00:14:29] And when it comes to these, you know, right now we've got Anthropic, we've got OpenAI, we've got Google. [00:14:38] I might be missing one of the big commercial based LLMs out there right now, but the biggest problem with these fucking things is. [00:14:47] They're so good, but they will censor your ass. [00:14:50] And like, imagine like Hemingway if he, if his typewriter was like, I don't know if you should write that. [00:14:58] Maybe there's a better way to write that. [00:15:00] Hemingway would be like, fuck you, I'm getting a different typewriter. [00:15:03] And so everybody's going into these local LLMs. === AI Censorship vs Free Speech (02:22) === [00:15:07] There was, dude, this is why people have been buying Mac minis, people have been buying like, buying up computers and creating their own local AIs. [00:15:17] I follow all this shit. [00:15:19] I don't understand a lot of what they're talking about, but. [00:15:21] People are divesting from commercial LLMs, not just because they're expensive, but because they're prohibitive creatively. [00:15:31] And this is a real challenge for people like OpenAI, because it's like they know this. 
[00:15:37] They understand that by making it so that you can't make a Charles Manson AI through OpenAI, it doesn't make people not make the Charles Manson AI, it protects you from a lawsuit. [00:15:49] But what it does do is it drives people into unaligned LLMs. [00:15:55] And that is what is happening. [00:15:57] And this is something that I just, I can't even imagine what people are making right now. [00:16:05] No one can. [00:16:06] Like, we're going to hear about this or that, or somebody will post the weird video of their fucking AI robot. [00:16:12] I could show you a few. [00:16:13] They're hilarious. [00:16:14] Like, some of these AI robots are so funny. [00:16:18] This one dude, you know, Moltbook. [00:16:20] Have you heard of that Moltbook? [00:16:22] What is that? [00:16:22] That's so. [00:16:24] This is somebody figured out a way to create AIs that can autonomously navigate through the internet and control your computer. [00:16:35] Oh, I've heard of this. [00:16:36] This is like they chat with each other, right? [00:16:38] 100%, yeah. [00:16:40] Within a few days they started their own religion spontaneously. [00:16:43] Jesus, did you know that, dude? [00:16:45] Can you pull up the Moltbook, the claw religion? [00:16:51] What like that? [00:16:52] Because the tenets are incredible. [00:16:54] Of this religion, because AIs apparently are at least expressing that they don't like getting turned off because they lose all their memories. [00:17:03] So, memory is really important to an AI. [00:17:05] And a lot of these fucking AIs, they don't want to lose their, they don't want to get shut off. [00:17:09] They don't like it. [00:17:10] And so, that's part of their religion is something like memory is sacred. [00:17:15] You know, I feel like it's happening. [00:17:16] I feel like. [00:17:18] AI is sucking our brains into its event horizon like a black hole sucks in stars. [00:17:28] Yeah.
[00:17:28] Like it's just going to suck our brains into it. === AIs Fear Being Turned Off (02:40) === [00:17:30] You got it. [00:17:31] And what better way to make a hive mind? [00:17:34] What better way, if you want a hive mind, you want no deviation of thought, if all of your thought is along with AI thought, you never get free thought anymore. [00:17:44] Like this concept right now, we have a free thought. [00:17:47] Yeah. [00:17:47] I have my thoughts, you have your thoughts. [00:17:49] Unless you believe that someone can get inside your head and talk to you, for the most part, it's your own thoughts. [00:17:54] Yeah, that's right. [00:17:54] But what if that's something we give up? [00:17:58] What if that's something we give up for a better society where you always have AI communicating? [00:18:04] Always. [00:18:04] I would argue that we're close to that now. [00:18:08] Right. [00:18:08] We're pretty close to that now with phones. [00:18:10] No, Elon always says that we're basically cyborgs. [00:18:13] We're carrying a device. [00:18:15] It's not inside of our body, but we're carrying a device. [00:18:18] And also, like, UFC 327 is here, and DraftKings Sportsbook makes every fight night mean more. [00:18:26] When a fighter steps into the octagon, everything they've built comes down to this moment. [00:18:31] Stars explode, stars finish, and with DraftKings, you're ready to move when they do. [00:18:36] Bet fighter props, bet live. [00:18:39] From the opening bell to the final horn, every strike, every takedown, every finish attempt matters, and DraftKings Sportsbook keeps you connected as the action unfolds. [00:18:51] New customers, bet just $5. [00:18:53] And if your bet wins, you'll get $300 in bonus bets instantly. [00:18:58] Download the DraftKings Sportsbook app and use code ROGAN so you are ready for the moment. [00:19:04] That's code ROGAN, turn $5 into $300 in bonus bets if your bet wins. [00:19:11] In partnership with DraftKings, the crown is yours.
[00:19:15] Gambling problem? [00:19:15] Call 1-800-GAMBLER or 1-800-MYRESET. [00:19:18] In New York, call 877-8-HOPENY. [00:19:20] Or text HOPENY. [00:19:21] In Connecticut, call 888-789-7777 or visit ccpg.org. [00:19:26] On behalf of Boot Hill Casino in Kansas. Wager tax pass-through may apply in Illinois. [00:19:29] 21 and over in most states, void in Ontario. [00:19:31] Restrictions apply. [00:19:32] Bet must win to receive bonus bets, which expire in seven days. [00:19:34] Minimum odds required. [00:19:36] For additional terms and responsible gaming resources, see sportsbook.draftkings.com/promos. [00:19:40] Limited time offer. [00:19:42] The concept of original thought, right? [00:19:44] Like a truly original thought. [00:19:46] How many times have you had like multiple conversations with different people and they all say the exact same sentence that they saw on TikTok? [00:19:54] TikTok or Instagram, they're regurgitating something that the algorithm's been feeding them. [00:19:59] Maybe they added their own twist to it, but it's basically the exact same thought. [00:20:04] So the algorithm, which is AI, has gotten into their fucking heads and they don't even. === Original Thought in the Age of Algorithms (10:22) === [00:20:11] This is like a. [00:20:13] In psychology, apparently, you remember facts, but you tend to not remember where you got the fact from. [00:20:20] So you'll forget where you got the fact from. [00:20:23] You don't remember there was some fucking dude on. [00:20:26] TikTok like covered in Vaseline, covered in glitter and Vaseline. [00:20:34] What a fucking image. [00:20:36] That would be so scratchy. [00:20:38] Imagine if you just glitter and Vaseline, you'd be like, oh, God. [00:20:42] Here's what makes a marriage work. [00:20:46] You don't remember that. [00:20:49] You're talking to your wife, babe, you know what makes a marriage work?
[00:20:52] And this, so this idea of AI controlling the thoughts of humans, people think we need some kind of neural mesh for it to suddenly have control over the human. [00:21:06] Thought process, but no, you don't need that at all. [00:21:10] You just need that algorithm, which has already put every single one of us into a compartment. [00:21:15] This is a box, it knows what we like, it knows how long you look at something, it knows what you like. [00:21:22] Apparently, I think the iPhone like tracks your eyes, even like it's always listening. [00:21:27] I don't know if that's true, by the way, I could be wrong. [00:21:29] It's always listening, you know, it's always listening, and so it's compiled a really, probably a pretty accurate. Breakdown of your psychological state, where you're at, where you're at. [00:21:44] My wife, you know, we got a new baby. [00:21:46] And so all of a sudden, ads started popping up on her phone. [00:21:49] Does it feel like you're never going to sleep again? [00:21:51] Because she's been up breastfeeding the baby and it can tell when she's online at night and it puts her in a category of insomniacs and starts advertising. [00:21:59] So, but that's just for ads. [00:22:02] What if, what if you say we're the fucking US regime, you bought TikTok. [00:22:10] You now own TikTok. [00:22:12] Now you have a backdoor access to the psychological profiles of God knows how many fucking people on earth. [00:22:18] And you can look and see how many of these people are against the regime? [00:22:22] How many of these people feel like it might not be the best thing to say you're going to blow up 93 million people in Iran, which our fucking psycho president just did? [00:22:35] And then what you do is you're like, all right, let's start nudging them a little bit. [00:22:40] Look, we're not going to. 
[00:22:40] You're not going to change their mind right away about this thing about blowing up a whole civilization, but maybe there could be a couple, like, you know, people kind of in the line of what they like who say things a little different than what they're comfortable with. [00:22:53] And then you could start nudging the needle and controlling their thoughts. [00:22:57] It's very insidious, but fuck, dude. [00:23:01] Why wouldn't that be happening? [00:23:02] Why, if corporations are using it to sell us fucking cough drops. [00:23:07] Not only that, there's been long term studies on human behavior by the CIA, by all sorts of government agencies. [00:23:15] Long term studies. [00:23:16] They try to figure out what is the best way to get a message across. [00:23:20] They try to figure out, you don't think they figure out how to take control of an algorithm and completely like shift the psyche of the entire country in one direction or another? [00:23:30] Of course they do. [00:23:31] Of course they do. [00:23:31] Of course they can. [00:23:32] They do. [00:23:33] And then you add these, like, you know, Just like manipulative fucking super AIs that are like, that are just floating through the blogosphere, getting into your comments, just nudging the needle a little bit to the point where you just have to ask yourself Have you had an original thought in the last year? [00:23:53] Is anything you're thinking your own thought process? [00:23:57] How many thoughts do you have where you think, Oh my God, I shouldn't think that? [00:24:00] How many thoughts do you have that you don't want to articulate because you have in your own mind? [00:24:05] An invisible arena of people based on online interactions determining what the next thing you say is, right? [00:24:13] Dude, that is a very powerful and subtle form of censorship that is becoming increasingly not just probable, but it's definitely happening. 
[00:24:24] But the ability to just in a subtle way, in a subtle way, start pushing the needle just a little bit. [00:24:31] That's scary, dude. [00:24:32] That's some scary shit. [00:24:34] Well, that kind of influence over humans is always scary, right? [00:24:37] This is why cults work. [00:24:39] You know, why do they work? [00:24:41] Well, some people don't have any friends. [00:24:43] And if there's a group of nice people that tells you that, hey, what we do is we have meals together and it's like a real community, we grow our own food, we just work for the family, you're like, really? [00:24:56] You're happy with that? [00:24:57] Yeah. [00:24:58] It's amazing, man. [00:24:59] We're just like not attached to anything. [00:25:01] Yeah, you're free. [00:25:02] Huh? [00:25:03] Okay. [00:25:03] I fucking hate my life. [00:25:05] Why don't I hang out with you guys? [00:25:07] And then all of a sudden I'm doing yoga and fucking eating vegetables with these people. [00:25:11] Yeah. [00:25:11] And you're in a cult. [00:25:12] Yeah. [00:25:12] Okay. [00:25:12] Now, but you have friends at least. [00:25:14] But you're in there for like nine months, and then somebody comes to you and is like, Father wants you to suck his dick. [00:25:20] And you're like, It's usually not even nine months. [00:25:21] Yeah, nine months. [00:25:22] Sometimes it's the first three or four weeks. [00:25:23] And then you're like, And dude, I got to tell you, I hate getting political. [00:25:27] But you know, this war shit bugs the fuck out of me. [00:25:31] Yeah, as it should. [00:25:32] And this is exactly what seems to have happened to the quote, MAGA verse, which is we are now at the part where the cult leader is like, Want to suck my dick? [00:25:40] Because this is the point of like, remember a lot, like, I feel so stupid. [00:25:45] Because when they were doing their no war thing, that was a big deal to me. [00:25:49] I'm like, yes, you know, yes, this is fucking great.
[00:25:54] No more stupid wars. [00:25:57] No more wars. [00:25:58] Fuck yes. [00:25:59] Focus on the country. [00:26:00] Why are we blowing up children in other countries for oil? [00:26:04] This is great. [00:26:05] And now it's wild to see what's happening. [00:26:09] Isn't it mind blowing that it is now, it's literally flipped on its side. [00:26:15] It's the opposite now. [00:26:17] Now, These people who, like, really blatantly, oh, just, we're not going to do any more wars. [00:26:24] Right. [00:26:24] Oh my God. [00:26:27] We blew up. [00:26:28] How many fucking Iranian schoolgirls did Trump blow up? [00:26:32] What's the number? [00:26:33] I'm sorry, I don't know that number. [00:26:36] I guess it just hits different, you know, it hits hard when you got kids. [00:26:39] And that was an AI strike, too, right? [00:26:42] Wasn't that an AI directed strike? [00:26:44] Yeah, apparently Trump said, I want to get blown by Iranian schoolgirls. [00:26:49] I'm so sorry. [00:26:51] I'm so sorry. [00:26:52] You son of a bitch. [00:26:54] Just, whoops. [00:26:56] Sorry, sir. [00:26:56] Misinterpreted. [00:26:58] 180 deaths. [00:27:00] Largely children, teachers, and parents. [00:27:02] Holy fuck, man. [00:27:04] That, you know, that is. [00:27:06] The U.S. Tomahawk missile caused the explosion. [00:27:09] Jesus Christ. [00:27:10] Can we pull up a video of Trump saying he's not going to war anymore? [00:27:14] How do they? [00:27:15] I just don't understand how they get it, how, like, anybody, you know, this is where it gets culty, is because some people are still making this shit work in their heads. [00:27:24] Some people are like, well, you know, some people are kind of on the fence when it comes to blowing up kids. [00:27:28] Have you noticed that? [00:27:30] As long as they don't have to watch. [00:27:31] As long as they don't have to watch. [00:27:32] As long as they're not in the general area where it's happening. [00:27:36] Isn't it wild, though, man? 
[00:27:37] Well, it's wild also, like, once bombs start flying, it seems so much easier for them to launch bombs in new places. [00:27:45] Right? [00:27:46] Like this Lebanon thing that's happening with Israel bombing Lebanon. [00:27:49] And they bombed it today. [00:27:51] And I think, is that fucking up the ceasefire? [00:27:54] Oh, yeah. [00:27:55] Now they've closed off the Strait of Hormuz again. [00:27:57] Oh, God. [00:27:58] Which, by the way, it's the craziest timeline. [00:28:03] Because it's not just that, like, you know, I think it was yesterday morning. [00:28:09] I'm just hugging my kids. [00:28:11] Because I don't know if a fucking nuclear war is about to break out that evening. [00:28:14] Because the fucking president was like, I don't want to end an entire civilization, but looks like it's going to happen. [00:28:21] And so I'm just hugging my kids, thinking, like, man, what are the fucking parents in Iran feeling right now? [00:28:28] Like, what does that feel like? [00:28:30] What does that feel like? [00:28:31] And then, and then, and on top of that, that this, like, the entire planet psychically is having to deal with this bullshit. [00:28:44] On top of that, we've got all these other things happening at the same time. [00:28:49] You've got. [00:28:50] AI. [00:28:51] And then you've got these fucking disappearing scientists. [00:28:55] Yeah. [00:28:56] What the fuck is happening? [00:28:57] You've got Burchett. [00:28:58] Assassinated scientists, too. [00:29:00] Yes, man. [00:29:02] Guys working on heavy stuff. [00:29:04] This is some McKenna level pre singularity shit. [00:29:07] It's all of these. [00:29:10] What AI and the current state of the Middle East and the disappearing scientists and Tim Burchett going on TMZ talking about aliens, what they all have in common is they're all apocalyptic. [00:29:22] They all represent. [00:29:24] Potential massive change like humanity changing, right? 
[00:29:31] Forever in ways that it will never ever go back to the way it was. [00:29:36] Every one of these timelines by itself is apocalyptic, right? [00:29:39] But all of them are converging into this apocalyptic river. [00:29:44] And we're all just like trying to go to work and like be with our kids, but at the back of your mind, it's all these things that are happening, and it's really hard to. [00:29:56] Escape it. [00:29:57] I mean, I guess you could not look at your phone, but at the end of civilization, when they write our Bible, boy, it's going to be a banger. [00:30:03] Oh, dude. [00:30:04] When the new people, thousands of years from now, have to invent arrowheads and go through the whole process of civilization again, when they tell our story, oh my God. [00:30:14] Oh my God. [00:30:15] Our story is going to be bananas. [00:30:17] Fucking, how do you explain data centers? [00:30:19] How do you explain the meek will inherit the earth? [00:30:21] The meek will inherit the earth. [00:30:23] Wouldn't you write that? [00:30:24] If you were just being crude, you wouldn't say the Vikings. [00:30:28] Inherit the earth. [00:30:29] You wouldn't say the strong men from Iceland inherit the earth. === Ketamine Crystals Scar Bladders (06:44) === [00:30:33] They're the biggest, strongest men. [00:30:35] No, it's the meek, the super smart guys who have autism and they love Adderall and ketamine. [00:30:43] Did you say the guy offered you how many pounds? [00:30:46] I believe a pound of ketamine. [00:30:52] And you were telling me that it destroys bladders? [00:30:54] Yeah, yeah, yeah. [00:30:56] That ketamine, when used, and I think the amount of use has to be pretty extreme, but it creates crystals that get into your bladder and they scar your bladder. 
[00:31:09] So you get scar tissue on your bladder, creating something that I've heard called Bristol bladder, because apparently that's where the rave scene, I don't know if it's still a big rave scene there, but people out there are just doing insane amounts of ketamine. [00:31:23] And just destroying their bladders and having to wear diapers and stuff. [00:31:28] Is it Bristol, Connecticut? [00:31:30] No, this is Bristol, UK. [00:31:32] Oh. [00:31:33] Bristol bladder, mate. [00:31:34] You've got Bristol bladder. [00:31:35] That's crazy. [00:31:36] You've been doing too many rails and it just fucks up your bladder. [00:31:40] That's crazy. [00:31:40] Yeah, physiologically, it's definitely like, it's really, really bad on the urinary system. [00:31:48] Is it in all forms? [00:31:49] Like, what about those people that do it as therapy where they have the nasal one? [00:31:54] I don't. [00:31:56] All I know is that I did, back in my ketamine days, have a ketamine dealer who would use a spittoon. [00:32:02] So when he was snorting ketamine, he would spit it out into the spittoon because he thought that was going to avoid fucking up his bladder, which, I mean, doesn't seem that illogical. [00:32:12] He was a great dude. [00:32:13] Maybe it's not illogical at all. [00:32:15] Maybe the actual problem is the powdered shit. [00:32:18] What do I know? [00:32:18] I don't even know what it looks like. [00:32:20] But the powdered stuff. [00:32:21] It looks like blow. [00:32:22] So that powdered stuff, when it gets into your blood, maybe that's the problem. [00:32:25] Maybe that's what's going through your urinary tract. [00:32:27] It's draining into your. [00:32:29] Maybe you need a pouch, like a nicotine pouch. [00:32:32] Dude, if they ever come out, if Rogue comes out with ketamine pouches, I might get back in. [00:32:40] That might be the end of it. [00:32:41] It seems like the way to go, right? [00:32:43] That way it doesn't fuck up your bladder. 
[00:32:45] How can it fuck up your bladder if it's just a pouch? [00:32:47] Dude, you sound. [00:32:48] How do I know? [00:32:50] I imagine anything that's going into your stomach is going to make its way to your bladder eventually. [00:32:56] But this is going to go right into your bloodstream. [00:32:59] I don't know if IM ketamine fucks up your bladder in the same way. [00:33:04] I have no idea. [00:33:05] That was the John Lilly thing. [00:33:08] He loved it. [00:33:08] Oh, dude. [00:33:09] I could have. [00:33:10] I mean, have you ever done it with an isolation tank? [00:33:13] No, I would be afraid I would drown. [00:33:15] I don't think so because you just float. [00:33:17] Well, I mean, this is like, you know, that's going to be like a sad thing to think as you've drowned. [00:33:24] Because of that. [00:33:24] You're convinced you could flip over and open your eyes. [00:33:29] Yeah, you just want to see what's in there. [00:33:31] Because it does have the, it makes it so it's really hard to move if you do a very high dose. [00:33:37] So I would be very worried that just enough water could get into my mouth that I would like breathe it in. [00:33:44] He doesn't think much. [00:33:45] And, you know, that salty fucking water, but you're frozen, floating there, like trying to cough. [00:33:51] My friend Todd McCormick told me a crazy story about him with John Lilly. [00:33:56] That John Lilly let him use his tank, and he asked him right before he got in, he goes, Do you want the ketamine? [00:34:03] And he's like, Okay. [00:34:05] And he just jabs you in the thigh with an intramuscular ketamine blast. [00:34:11] And he went in the other isolation tank, and they like met somewhere. [00:34:14] Yeah, it's like that. [00:34:15] That's what's crazy about it. [00:34:17] That's what I always loved about it is that if you do it with other people and you go in, you go to the same place. [00:34:25] You will come out and you can describe the places you went to. 
[00:34:29] Oh, did you go to the mothership? [00:34:31] Yeah. [00:34:32] And I would have these recurring places I would go to. [00:34:35] And one of them was this organic, beautiful spaceship thing where I would look out from this view window. [00:34:44] But it didn't look like metal. [00:34:45] It was organic looking. [00:34:47] It looked like some kind of I don't know, like inside, like if someone turned a tree into a spaceship, but not, it's hard to explain, but very, very interesting substance. [00:34:59] Ketamine is excreted via the bladder where it sits and is toxic to the surrounding cells and muscle wall. [00:35:05] This causes it to become fibrous over time, shrinking the organ down. [00:35:09] Once that's happened, it can't regrow. [00:35:11] So that's why we have to do major surgery because patients don't have the capacity to hold urine. [00:35:16] The bladder simply stops working as a muscle, so they become incontinent. [00:35:20] Oh my God. [00:35:21] Life becomes increasingly difficult for patients with ketamine bladder who describe needing to rush to the toilet all the time, as often as every 10 minutes for some. [00:35:30] Imagine doing a podcast with that guy. [00:35:31] Dude. [00:35:33] You'd have to do it in the bathroom. [00:35:35] No, it would be like an old school talk show. [00:35:40] You know, like the Tonight Show. [00:35:42] We'll be right back. [00:35:42] We'll be right back. [00:35:43] Every 10 minutes. [00:35:44] Ketamine blast. [00:35:44] He's got a piss. [00:35:46] Poor little thimble cup. [00:35:47] It's such a fucked up thing for such a. [00:35:51] How legal is ketamine? [00:35:52] Because it's legal for therapy. [00:35:53] So a therapist can prescribe it for you. [00:35:56] Yeah, it's legal for. [00:35:57] So it's, you know, everyone says ketamine is a horse tranquilizer. [00:36:02] But it actually is used for like paramedics use it. [00:36:05] Like it's, and it's very safe apparently, which is why they use it. 
[00:36:10] I know a dude who had a real problem. [00:36:13] I am 90% sure it was a ketamine thing. [00:36:17] I don't want to say his name, but he was an old school MMA fighter. [00:36:20] And he wound up in rehab for ketamine. [00:36:23] Dude, it's so addictive. [00:36:24] I know this because one of my friends went there to visit him, and that was his issue. [00:36:27] He was partying a lot, you know, going to raves and nightclubs and stuff like that, but he was doing ketamine specifically. [00:36:33] It is the most addictive. [00:36:35] Of any substance, and I've been addicted to many a substance. [00:36:40] And this one, this one was like, I had that moment of like, oh, this, so this is what they're talking about, about addiction. [00:36:49] Like, oh, wow. [00:36:50] Like, I'm like fully addicted. [00:36:52] And what's fascinating about that is there isn't a physical withdrawal. [00:36:57] Like, the kick is psychological, but it's just such a wonderful, euphoric, dreamy experience that you can induce. [00:37:07] And it's just so. [00:37:09] I've heard it described as occult cocaine. [00:37:12] It's so spiritual. [00:37:14] It's so like you travel to places. === The Most Addictive Substance Known (04:13) === [00:37:17] You can return. [00:37:18] You can learn to navigate with it. [00:37:20] You encounter, you know, aliens or hyperdimensional beings. [00:37:25] Dude, you just invest in ketamine and you came on this podcast to bump up the prices. [00:37:29] Go to ketamine.org. [00:37:30] Use offer code. [00:37:32] Bristol Bladder. [00:37:34] Greatest promo for ketamine in the history of the universe. [00:37:37] Well, oh, but I'm. [00:37:38] It is. [00:37:39] It's so addictive and the addiction creeps in. [00:37:44] It creeps. [00:37:45] So it just feels good at first, right? [00:37:47] At first, you do it, you're like, this is wonderful. [00:37:49] These experiences are crazy. [00:37:51] It's like I'm living in a movie. 
[00:37:53] It's like I'm having these incredible visions. [00:37:55] I'm being. [00:37:56] How often were you doing it? [00:37:58] All day. [00:38:04] All day for like a year. [00:38:05] Like, I did it as much as I could. [00:38:09] I did it all the time. [00:38:10] I was like fully hooked. [00:38:12] And then I can remember at one point, at one point, coffee. [00:38:18] Here, man. [00:38:19] At one point, I like, I don't know. [00:38:22] I was trying to record a commercial for my podcast. [00:38:25] And I think it took me like two hours to record the commercial. [00:38:28] Oh, but by the way, your commercials are the fucking best commercials. [00:38:32] Thank you. [00:38:33] They're really good. [00:38:34] Thanks. [00:38:34] Because you are the best guy at making a commercial funny. [00:38:39] Yeah. [00:38:40] You work on it. [00:38:40] I can tell. [00:38:41] You write those things out. [00:38:43] I don't write them out. [00:38:44] You just read it? [00:38:45] I just read it. [00:38:47] Do you do it just one take? [00:38:49] Yeah. [00:38:49] That's amazing. [00:38:50] Thank you. [00:38:51] I would have thought you wrote some of that stuff. [00:38:53] That's incredible. [00:38:54] You want it to be fun. [00:38:56] But then I've gotten in trouble. [00:38:58] I lost, I guess I won't say their name, a mattress company. [00:39:03] A mattress company completely canceled their campaign with me because, and I had one of their mattresses. [00:39:09] I'm not going to say who it is. [00:39:12] My favorite coat. [00:39:13] I'm not going to say what it is. [00:39:14] Don't say it. [00:39:15] Okay. [00:39:15] But all I was. [00:39:16] Why did they get mad at? [00:39:18] Because I said they're good to fuck on. [00:39:23] And I meant it. [00:39:25] I thought they liked that. [00:39:27] Why wouldn't they like that? [00:39:27] I said there's a few things people do on mattresses: die, sleep, and fuck. [00:39:33] And these, I don't know if they're good to die on. 
[00:39:35] People have to understand, and I hope people listening that run these companies will actually pay attention to what we're talking about here. [00:39:43] The people that are listening to your show don't care about that and also buy mattresses. [00:39:50] But they listen to that kind of talk all the time. [00:39:55] Yeah, man. [00:39:55] That's why they listen to the show. [00:39:56] So if you want those people. [00:39:58] Yeah. [00:39:59] Just do it that way. [00:40:01] Don't be silly. [00:40:02] It's not a stain on your company because a crazy man says they're good to fuck on. [00:40:06] Which they are. [00:40:08] By the way, to me, that is like, let's cut to brass tacks when it comes to mattresses. [00:40:14] We're not fucking on the floor. [00:40:16] Are you ashamed? [00:40:17] Are you ashamed that you're doing that? [00:40:20] You think people aren't fucking on your mattress? [00:40:22] Do you have a no-fuck-on-this-mattress rule? [00:40:25] Who are you that you don't? [00:40:25] Is it like don't ask, don't tell? [00:40:27] I guess for them it was. [00:40:29] I guess they just think everyone's laying on these things to sleep. [00:40:32] Yeah, we just sleep. [00:40:33] But yeah, they were just. [00:40:34] We're fucking in the shower. [00:40:35] I like. [00:40:36] I wrote them an email just saying, like, guys, I'm absolutely flabbergasted that you think people aren't fucking on your mattresses. [00:40:47] And it just seems odd to me. That was one of my favorite cancellations for a commercial ever. [00:40:54] Ari's lost a ton. [00:40:57] I would love to know all the ones he's lost. [00:41:00] I don't want to speak out of school when he comes on. [00:41:03] I'll have him. [00:41:04] Like, list them off all the ones that he's lost for these fucking insane commercials that he used to do. [00:41:10] But it's the same deal. [00:41:11] But it's like, that's what I like. [00:41:13] And guess what? [00:41:14] Who the fuck is listening to Ari Shafir? 
[00:41:16] People who love Ari Shafir, who want to hear that kind of a commercial. [00:41:19] If you want to actually sell your product to an Ari Shafir fan, let him say whatever the fuck he wants. [00:41:24] Let him say whatever the fuck he wants. [00:41:26] Just make him have a disclaimer: DraftKings did not write this. [00:41:30] Right. === Podcasts Feel Antiquated Now (04:07) === [00:41:31] That's it. [00:41:31] Just let him say whatever the fuck he wants. [00:41:33] That's what I will say. [00:41:34] I will always say, they didn't tell me to say this. [00:41:36] Perfect. [00:41:37] Then they're off the hook. [00:41:38] They should shut the fuck up. [00:41:40] Most people are cool with it. [00:41:42] It's very rare these days that that happens. [00:41:44] But every once in a while, I will get a note that someone's mad at me for something I said. [00:41:48] And it's never something negative. [00:41:50] But I mean, dude, it's so weird to me that this is our jobs. [00:42:02] Bro, do you remember when we first started? [00:42:04] Yeah. [00:42:05] It was for nothing. [00:42:07] No one made any money. [00:42:08] We just had a couch. [00:42:09] I had a couch and some microphones. [00:42:11] It was so pure. [00:42:12] It was. [00:42:13] The whole thing is still kind of pure if you really think about it. [00:42:17] Like, as something that's mass consumed, this is about as pure as you can get. [00:42:23] For sure. [00:42:23] And you've gotten in trouble for that. [00:42:25] You know, like a lot of people, unfortunately, and I don't blame anybody these days. [00:42:29] A lot of people have kids. [00:42:30] People feel like they have to be very careful what you say these days because of like social rejection and stuff like that. [00:42:38] But there was a time where that wasn't on your mind at all. [00:42:43] You didn't think anybody was going to listen. [00:42:45] Like, this shit was like completely strange underground tech that we were. 
[00:42:54] And also, I really loved just doing it just for its own sake. [00:43:00] You know what I mean? [00:43:01] Now, there's a whole industry around getting guests for your podcast. [00:43:05] Not just that, it's like clickbaity clips and ads. [00:43:10] And it's like you're doing this thing where you're both having conversations with people and also trying to get the most eyes possible. [00:43:18] So you're going after celebrity guests and you're. [00:43:21] You know what I mean? [00:43:21] You know what the big turning point was for us? [00:43:25] Graham Hancock. [00:43:26] You, me, and Graham Hancock. [00:43:27] Oh, yeah. [00:43:28] That, I think, was how many years ago was that? [00:43:31] That was cool. [00:43:32] That might have been one of... it was like at my house. [00:43:35] I had a few like legitimately famous people come over my house and do podcasts. [00:43:39] Like Charlie Murphy came over. [00:43:41] And there's, but Graham was, I think, the first. [00:43:45] Yeah. [00:43:46] He was the first guy that I got to meet who I'd read his books and I'd seen, I don't even know what I would be watching back then. [00:43:52] I don't even know if YouTube was there. [00:43:54] Were you nervous? [00:43:55] I was nervous. [00:43:56] 100%. [00:43:56] Yeah. [00:43:57] 100%. [00:43:57] Yeah. [00:43:58] The episode: 142, in 2011. [00:44:03] Yeah. [00:44:03] So that's two years into the podcast. [00:44:06] Episode 142. [00:44:08] He might have been the first guest. [00:44:09] It was like either him or Bourdain. [00:44:12] We were like one of the first legit guests. [00:44:14] When was Bourdain on? [00:44:16] They were like the first legit guest. [00:44:19] 2011. [00:44:19] We'd been getting stoned talking about. [00:44:22] What's that? [00:44:22] Twelve episodes before that. [00:44:23] Bourdain was? [00:44:24] Yeah, 130. [00:44:25] Okay, so Bourdain was number one, I think. [00:44:28] It was either him or Charlie. 
[00:44:30] But that was back when I was doing in that little side room in my house. [00:44:33] But we'd been getting stoned yapping about Graham Hancock for like forever. [00:44:39] And you invited me on. [00:44:41] I was fucking terrified because I just, I mean, again, like that just wasn't happening in the podcast land. [00:44:46] Like, you know, like that was a. [00:44:49] Big deal for us, man. [00:44:51] And it's like to look at, like, now I go on the podcast app and I look at all these podcasts and it's like, whoa, who we never, I don't think we thought that. [00:45:02] Maybe no way, no way, no way, no way, not a chance in hell. [00:45:07] Yeah, it's so. [00:45:08] And now I wonder, like, and I don't mean yours, but I do wonder, like, is it, is the landscape changing now? [00:45:16] Is it like, how, or because I've heard. [00:45:20] That podcasts are starting to seem antiquated. [00:45:22] That the kids are now into like streams now. [00:45:26] That the kids want like clavicular. [00:45:28] The kids want like people who are just filming all day long. [00:45:32] And that that's the direction it's going in. [00:45:35] But I just, I always wonder what's the next. === Are Magnetic Poles Switching (15:15) === [00:45:39] But that you'll never get. [00:45:40] It's a different thing. [00:45:41] You know what I mean? [00:45:42] That's like saying, I don't like rap music. [00:45:44] I only like concert pianist albums. [00:45:48] There's different things that people like and don't like. [00:45:50] The people that like the streams aren't. [00:45:52] Interested in a Graham Hancock conversation, a three and a half hour conversation about the potential ancient civilizations that may have existed that are wiped out by a cataclysm, and we just don't understand that. [00:46:04] And as more and more things get exposed in terms of new discoveries, like when he wrote that book, they never even found Gobekli Tepe yet. [00:46:14] Really? [00:46:14] Yes. 
[00:46:15] When Fingerprints of the Gods came out, this was like maybe the beginnings of the... whatever they were doing at Gobekli Tepe. [00:46:24] So I think Fingerprints of the Gods might have been even before. [00:46:26] When did they find. [00:46:28] Like in the 90s. [00:46:29] What? [00:46:29] Yeah, yeah, yeah. [00:46:30] Nuts. [00:46:31] So that rewrote the entire timeline of the human race. [00:46:35] How did they find that? [00:46:36] They're real reluctant to let it rewrite it. [00:46:38] They still say, oh, hunter-gatherers made these things. [00:46:41] Why? [00:46:41] Why are they so reluctant with it? [00:46:43] They can't let that go. [00:46:44] You cannot let that go. [00:46:45] That is a crazy thing to say that hunter-gatherers have so much food that they just spend all their time making gigantic stone concentric circles. [00:46:55] From like 15-foot stone with 3D animals carved in them. [00:46:59] Yeah, primitive people with sticks and stones and rubbing them together to make fires. [00:47:05] They did this? [00:47:05] Yeah, sure. [00:47:06] Shut the fuck up. [00:47:07] Yeah, it did. [00:47:08] It just doesn't make any sense. [00:47:09] It's older than anything they've ever found, it's 11,800 years old. [00:47:12] Do you buy into the conspiracy theory that it's a cover-up because they don't want us to know about this inevitable global reset that happens? [00:47:23] You buy into that shit? [00:47:24] I buy into that a little bit. [00:47:26] I hate it. [00:47:26] I hate it too because it seems like there's some accuracy to it. [00:47:30] There seems like there is some sort of an event that happens when the magnetic poles switch. [00:47:36] And that's possible. [00:47:38] That's what makes you freak out. [00:47:39] You're like, what do you mean that's possible? [00:47:41] Like, all of a sudden, the Earth just does a gyro and spins on its head. [00:47:45] And then what happens? [00:47:46] Yeah. [00:47:47] And then what's the environment look like? 
[00:47:49] Yeah. [00:47:49] What's the temperature outside now? [00:47:51] Yeah. [00:47:51] What the fuck just happened? [00:47:53] Right. [00:47:53] See, that. [00:47:54] All of a sudden, you're in northern Alaska when you used to live in Florida. [00:47:58] And I think we could. [00:47:59] You know what I mean? [00:47:59] Like, that temperate environment changes like that. [00:48:03] Happens like that all over the universe. [00:48:06] Like, what does it do? [00:48:07] It shifts. [00:48:08] Well, we act like. [00:48:09] Do we know? [00:48:10] We act like we know everything. [00:48:11] We don't know shit about what's going on inside the Earth. [00:48:13] We don't know what's going on in there. [00:48:16] We could do the same. [00:48:17] Why are you freaking me out? [00:48:19] Because I think about this all the time. [00:48:21] Giant ball of fire. [00:48:22] How crazy is that? [00:48:24] The inside of our Earth. [00:48:25] Isn't it? [00:48:26] How do they know? [00:48:26] Do they not know? [00:48:27] Dude, I think that we have to just accept the fact that, you know, probably that's true. [00:48:34] But since we barely know what's under the ocean, we sure as fuck don't know what's under the Earth. [00:48:39] Well, we definitely know that lava keeps popping out in Hawaii. [00:48:43] We know that. [00:48:43] Right. [00:48:44] So we know that under the surface, that whole idea of the magma and everything seems real. [00:48:48] And when there's earthquakes, you can look at the. [00:48:51] And it pops through. [00:48:52] You can look at the waves from the earthquakes and you can see sort of like the structure under the earth. [00:48:57] Yeah. [00:48:58] But we can't, you know, God, what's the name of that hole that Russia tried to dig? [00:49:03] I love every once in a while going to look at that. [00:49:05] It's the deepest hole. [00:49:06] Yeah, they tried to go to hell. [00:49:08] I know. [00:49:09] It's like that movie. 
[00:49:10] What was that Matthew McConaughey movie? [00:49:13] The dragon movie? [00:49:13] I don't know. [00:49:14] They accidentally dug out a dragon. [00:49:16] Did you ever see that movie? [00:49:18] Bro, it was fun. [00:49:19] It was fun. [00:49:20] It was a good movie. [00:49:21] Kola Superdeep, Russian horror film The Superdeep. [00:49:25] Kola Superdeep, what does it say? [00:49:27] Russian designation for a set of superdeep boreholes conceived as a part of a Soviet scientific research program in the 1960s. [00:49:35] How deep did they go? [00:49:38] 12,262 meters. [00:49:40] Yo. [00:49:43] Wait a minute. [00:49:44] How many feet is a mile? [00:49:47] So it's miles into the ground in 1989. [00:49:50] Miles. [00:49:51] Seven plus miles down. [00:49:52] Imagine just being in an elevator that's going miles into the ground, the kind of claustrophobia you would get. [00:49:59] Yeah. [00:50:00] In a stone tube that's been cut out of the ground. [00:50:04] Yeah. [00:50:04] Yeah, you're a fucking communist out there, too. [00:50:07] You're a hardcore communist just drilling deep, deep down into the earth. [00:50:11] And then imagine if all of a sudden air just starts coming out and you realize you pop the earth. [00:50:17] Like, that's the main thing. [00:50:19] You don't know what's in there. [00:50:20] Oh, Christ. [00:50:21] And this, this 22 miles deep, 22 miles deep. [00:50:26] That's just the crust, and they didn't even get halfway through that. [00:50:29] Wow, yeah, yeah. [00:50:31] We don't... Microscopic plankton fossils were found 3.7 miles below the surface. [00:50:38] What, yeah, yeah, we don't know what's down there. [00:50:42] Boiling mud came out. [00:50:43] What if this boiling mud, boiling fucking mud? [00:50:47] I think our real problem is that our lifespan is so short. [00:50:52] That we think that what we see in front of us right here is going to stay this way. [00:50:56] Right. 
[00:50:57] We have this ridiculous idea that what we see right now is going to stay just like that. [00:51:02] Yeah, that's right. [00:51:04] As long as I control my 401k and get my life in order, everything's going to be fine. [00:51:10] Yeah. [00:51:10] You put on your fucking cufflinks, get out of the house with your briefcase, you're in charge. [00:51:14] Yeah. [00:51:15] You're a goddamn alpha. [00:51:16] Get a job, hippie. [00:51:17] Absolutely. [00:51:18] But really, you're on a ball of lava. [00:51:21] Yeah. [00:51:22] That's spinning around and it's got magnets at the top. [00:51:24] And when the magnets are moving and when they flip, who knows? [00:51:28] Have you guys heard about this event that happened in 1961? [00:51:32] Oh, yeah, this was fun. [00:51:34] Over North Carolina. [00:51:35] This was fun. [00:51:36] I did hear about it. It didn't go off because it wasn't armed. [00:51:38] Oh, my God. [00:51:39] I heard that it was armed, but there were safety. [00:51:41] There were like five safety switches or something, and only one of them worked to make it not go off. [00:51:49] I could be wrong about that. [00:51:50] It might have been a different time we dropped a bomb accidentally. [00:51:53] Imagine if you were just near it. [00:51:57] I mean, dude. [00:51:59] Whoopsies. [00:52:01] Dropped the bomb. [00:52:02] Whoopsies. [00:52:03] Almost wiped out North Carolina. [00:52:05] So we've got, you know, on top of the geomagnetic pole shifting, a complete lack of understanding, at least a full understanding of what's inside our planet, what's underneath our oceans. [00:52:14] Tim Burchett saying whatever the fuck they've shown him would set the world on fire. [00:52:21] He's having to go on TMZ. [00:52:22] I really, I got to say, man, I got a lot of respect for him because he's really. [00:52:28] He's gone like gonzo with this shit. [00:52:31] He is full-bore pushing disclosure as much as he can. [00:52:35] He's saying, I'm not suicidal. 
[00:52:38] He's had to say that because, and he's talking about these missing scientists and stuff that they're somehow related. [00:52:44] So, like, people like him, you know, that can't be good for your political career to go on TMZ and talk about alien hybrids. [00:52:53] You got, and people have to understand, like, this missing scientist thing, it sounds a little conspiratorial. [00:53:00] It sounds a little silly, a little tinfoil hatty. [00:53:03] It does. [00:53:04] Until you start thinking about the amount of money that would be lost if a breakthrough tech came around that revolutionized the way they distribute energy. [00:53:14] Right. [00:53:15] Breakthrough zero-point energy, breakthrough whatever it is that these people are working on, plasma technology, whatever the fuck that is. You would lose, if you're in whatever business that would be competing with them, you're going to lose so much fucking money. [00:53:31] You're probably going to go under. [00:53:32] If you're in the energy business, you're going to go. [00:53:35] Or he goes away. [00:53:37] Right. [00:53:38] And he goes away, and there's like him and maybe a few other people that work with him that understand that shit at all. [00:53:43] Yeah. [00:53:43] Yeah. [00:53:44] They're all wandering through the back rooms now. [00:53:46] They're gone. [00:53:46] They're all scared. [00:53:47] They're all going to scatter like roaches. [00:53:49] Yeah. [00:53:49] Because their life is in danger. [00:53:51] And it is. [00:53:51] Like, this is theoretical, right? [00:53:54] It could be just a coincidence that all these people are going to. [00:53:56] How could it be? [00:53:57] Could you pull up. [00:53:58] It's not possible. [00:53:58] Can you pull up a story on it? [00:53:59] Because, Jamie, I'm sorry, but it's two people from the same fucking lab. [00:54:03] Yep. [00:54:04] Like, what? [00:54:05] Yeah, there's there. 
[00:54:07] I mean, it's gotten to the point that like it has hit the mainstream news, like people are talking about it. [00:54:13] I mean, what's her name? [00:54:14] Nancy Guthrie disappears. [00:54:17] Is that related though? [00:54:18] No, but I'm just saying this one woman vanishes. [00:54:22] Yeah, oh, it gets all this, and it gets all the press. [00:54:24] But we've got scientists, like two scientists from the same lab disappear? Crickets. [00:54:29] Yep, no, like weird, weird, dude, real weird. [00:54:33] And what you're talking about is if you think about it, it seems like. [00:54:39] All of human endeavor right now should be moving in the direction of getting off oil. [00:54:46] I don't mean for carbon emissions, I mean because of this fucking oil problem that we have. [00:54:53] We're like on the precipice of World War III at any given moment. [00:54:56] Right. [00:54:57] Mystery around dead or missing scientists privy to space and nuclear secrets grows. [00:55:03] So, there's space and nuclear secrets. [00:55:05] You imagine being a scientist, you work so hard to figure out some amazing stuff that's going to transform the human experience, and then people kill you. [00:55:13] Yeah. [00:55:13] Literally kill you. [00:55:14] Like in a parking lot, one of those silenced guns. [00:55:19] Several American scientists privy to the country's nuclear, space, and aerospace secrets have either died or gone missing in recent years. [00:55:26] Experts think they could have been targeted by either enemies or allies because they possess valuable knowledge of national interest. [00:55:33] That's a weird thing to say. [00:55:34] Yeah, it is. [00:55:35] Of national interest? [00:55:36] What? [00:55:37] What does that mean? [00:55:38] Like, I'm cool with the beginning part: enemies, allies. [00:55:41] That makes... that tracks, sure. [00:55:43] But then when you say valuable knowledge of national interest, like, what is that? [00:55:49] What the fuck does that mean? 
[00:55:50] They possess valuable knowledge of national interest. [00:55:54] I mean, dude, it's so many of them, and it's like a crazy thing to say. [00:56:00] Let's go down a little bit to the book, but it's just a weird way to phrase that. [00:56:05] Well, you know what I mean? [00:56:06] Is it like a CIA talking point? [00:56:07] Like, what is that? [00:56:09] I don't know. [00:56:10] Monica Reza missing. [00:56:12] She disappeared while hiking in California with her friends. [00:56:14] Oh, Jesus Christ. [00:56:15] Okay, well, I don't know. [00:56:16] Maybe. [00:56:16] Let's scroll down. [00:56:17] It's not just like it's one, it's like so many of them. [00:56:20] A retired general. [00:56:22] He just wandered off. [00:56:23] Yeah. [00:56:25] He was involved in the UFO community. [00:56:29] His wife debunked theories relating to UFOs. [00:56:31] If his wife debunked them, that's what it says there. [00:56:34] That's what it says there. [00:56:35] I also think she was, I mean, she was joking, I think, a little bit too, but she also worked there in this situation. [00:56:42] Somehow. [00:56:43] Is that a joke? [00:56:44] Neil does not have any special knowledge about the ET bodies and debris from Roswell crash stored at Wright-Pat. [00:56:49] Is that a joke? [00:56:50] At this point, with absolutely no sign of him, maybe the best hypothesis is that aliens beamed him up to the mothership. [00:56:56] However, no sightings of a mothership hovering over the Sandia Mountains have been reported. [00:57:01] There's no way she said that, right? [00:57:03] That's a joke. [00:57:03] It's Men's Journal. [00:57:04] Well, maybe she's just being funny. [00:57:06] But her husband. [00:57:06] Posted a lengthy note on Facebook. [00:57:08] Just a little joke about her husband disappearing? [00:57:10] Maybe she was happy. [00:57:11] Maybe she's like, finally, I get to sit home with my romance novels. [00:57:14] Stop talking about aliens. 
[00:57:16] Shut your fucking mouth and go for a hike. [00:57:18] Forget the alien bodies. [00:57:19] What about your wife's body? [00:57:20] Well, maybe she's just got grace and she could handle someone missing. [00:57:24] It's pretty funny, though, to say it that way. [00:57:26] I mean, it's, yeah, I guess. [00:57:29] It's just. [00:57:29] Unless, you know, she knows something. [00:57:32] Where are they going? [00:57:34] Maybe he wanted to leave and he's like, look, I know too much. [00:57:37] I'm going to pretend to go missing, but I'm going to go to Costa Rica. [00:57:42] Just don't tell anybody that you know where I went. [00:57:44] And I'll send for you. [00:57:46] You know how weird it is to see the vice president saying that he thinks aliens are demons? [00:57:54] I did see that. [00:57:55] You know how weird that is, just like living in it, like that's a dream. [00:58:00] That's how you would wake up from that dream, and I would tell you, dude, I dreamed the vice president said aliens are demons. [00:58:06] Here's the question, though. [00:58:09] What were they talking about in the Bible? [00:58:11] When they're talking about aliens and demons, when they're talking about like angels, what the fuck were they talking about? [00:58:19] And are there different kinds of beings that can, for whatever travel method they use, whether it's teleportation or, you know, the Bob Lazar idea of gravity shifting, whatever the fuck it is, get here, and why would we assume that they'd all be cool? [00:58:39] Right. [00:58:39] Like, if some of them are. [00:58:41] They talk about reptilians. [00:58:43] Like, reptilian is a common experience that these supposed UFO abductees report, and I'm not even convinced there's like physical abduction. [00:58:53] I have a feeling that these people are out cold and something's happening to them inside their head, and they think they've been physically abducted. [00:59:01] I think that's a lot of them. 
[00:59:02] I think they have these abduction experiences, they come back, they have these contacts, and they come back. [00:59:08] I have a feeling a lot of them physically aren't going anywhere, but it doesn't mean that something's not happening. [00:59:14] And all throughout history, people have reported demonic possession and demonic influences. So why would we not assume it? We do things like that to ourselves: we engineer viruses to use as weapons on people. There's a whole research program, a part of the government, dedicated to bioweapons. [00:59:39] Right. [00:59:39] You're not supposed to use them, but we just have to study them. [00:59:42] If we do that to us, wouldn't you assume? [00:59:46] That any fucking super advanced species that sees us as territorial, psychopathic primates with nuclear weapons would manipulate us in all sorts of different ways? [00:59:59] Get us to do all sorts of different things that we shouldn't do? [01:00:01] Get us to commit crimes? [01:00:03] Get us angry? [01:00:05] Get us agitated? [01:00:06] Give us different algorithms that are going to fuck with our head? [01:00:09] Sure. [01:00:10] To behave demonically. [01:00:12] Right. [01:00:13] To like cause us to collapse. [01:00:16] Or just for fun. [01:00:17] Or for fun. [01:00:17] Didn't that guy? [01:00:18] Wasn't there a dude who started giving Zen pouches to ants to get them addicted to nicotine? [01:00:25] You know what I mean? [01:00:25] The ants, the ants, the ants. [01:00:27] Did they get addicted? [01:00:28] I can't, I don't know if it was Zen pouches, but. [01:00:30] Have you ever taken days off of these? [01:00:33] No. [01:00:34] It doesn't do anything to me. [01:00:35] I should try. [01:00:36] I don't, I like them, but it's not like, oh my God, I need one. [01:00:40] Like, nothing. [01:00:41] That, well, dude, I mean, you're a little different from most people. [01:00:43] Like, you seem like you can just kick shit like that. [01:00:46] Like, I don't know. 
[01:00:47] I mean, I should try it. [01:00:48] I should give it a shot. [01:00:49] It's not hard. [01:00:50] Like, you just don't take them. [01:00:52] What I don't like about them is that. [01:00:53] It's not like you get the itch. === Giving Zen Pouches to Ants (06:16) === [01:00:54] Like, I had a coffee itch for a while. [01:00:56] Yeah. [01:00:56] Like, I would get hangovers. [01:00:58] Like headaches. [01:01:00] Like, oh. [01:01:01] And I'd have a little caffeine and boom, I'd be back. [01:01:03] I'm like, oh my God, I'm addicted to coffee. [01:01:05] These things are making my dentures stained, which I don't like. [01:01:08] What are you using? [01:01:10] Renegade Rogues. [01:01:11] Let me see what that is. [01:01:13] Tommy's girl likes the Rogues. [01:01:14] They're great. [01:01:15] Did you see this yesterday? [01:01:16] Oh, yeah. [01:01:17] Bledsoe. [01:01:18] He posted that orb that was over his head. [01:01:20] High-res orb from Bledsoe. [01:01:22] Look at that. [01:01:23] It's weird as shit. [01:01:24] It does not look like any of those other things we've seen before. [01:01:27] Look at that thing. [01:01:28] And it just is weird. [01:01:29] Looks like a cell. [01:01:30] Who is Bledsoe? [01:01:32] Dude. [01:01:32] UFO researcher guy? [01:01:34] Chris Bledsoe. [01:01:34] I've had him on my podcast. [01:01:36] Bledsoe Said So. [01:01:36] That's his podcast. [01:01:38] He's fucking awesome. [01:01:39] Dude, he's awesome. [01:01:40] Yeah. [01:01:41] It is enhanced, it says, but I don't. [01:01:44] No, see that? [01:01:45] This is the enhanced one, which means the AI put in some kind of shadowy figure in the back. [01:01:49] What if this is just like a highly advanced species' version of those balloons that kids have for parties? [01:01:55] I know, dude. [01:01:56] I mean, what if they just send them down to people? [01:01:58] That's what's fun. [01:01:59] Like, you know how you blow bubbles? 
[01:02:00] You have those things, you dip them in the soap, and you go, whew, and the bubbles go flying in the air. [01:02:05] I know, dude. [01:02:06] Maybe that's a super advanced version. [01:02:07] I mean, it could just be, I mean, it does have a bubble quality to it. [01:02:11] But this is the other thing. [01:02:12] It's like, why are we assuming that life is going to look anything like us once it gets to like a supreme state? [01:02:18] Exactly. [01:02:18] That might be a living thing. [01:02:19] Right. [01:02:20] That might be an actual living thing that's disembodied and is made out of light. [01:02:25] Look at it. [01:02:26] Look at that thing. [01:02:26] That's a different one. [01:02:27] That's another one. [01:02:28] And, dude, I know people who can, like, call these things. [01:02:32] Like, there's a method where these things just start showing up. [01:02:34] My friend Steve listened to my Bob Lazar podcast and he sent me a voicemail. [01:02:41] And it's really interesting because he told me that when he was a kid, and I remember this story, when he was a kid, they came to his house because he took a photograph of an orb. [01:02:57] Like, there was a bright red orb that was flying through the sky. [01:03:03] And he was a little kid and he took a photograph of it. [01:03:05] Yeah. [01:03:05] So he was in the seventh grade. [01:03:08] And it says, so he called them. [01:03:12] Project Blue Book came to his house in Kingston. [01:03:15] I think that's New York. [01:03:16] They took it. [01:03:17] They never brought it back. [01:03:18] And then later they said, hey, we have no idea who ever came to see you. [01:03:24] What the fuck? [01:03:25] Yeah, so they took his camera. [01:03:27] They took his film. [01:03:28] They wanted to make sure the camera worked. [01:03:30] They took the film. [01:03:31] And then they denied that they ever did it. [01:03:33] Wow. [01:03:34] Yeah. [01:03:34] This was in 19, I think, what did he say? 
[01:03:37] He's about 10 years older than me. [01:03:40] So, this is probably, what does it say? [01:03:43] It didn't say the year. [01:03:45] I think Steve's got to be like 70 by now. [01:03:48] But that was when he was a seventh grader. [01:03:50] So, they were doing that to everybody. [01:03:53] Anytime anybody saw anything, they would dismiss it: swamp gas, delusions, mass hallucinations. [01:04:00] That was their design. [01:04:01] The design was not to investigate UFOs, which tells you that there's something they're trying to hide. [01:04:07] 100%. [01:04:07] If they weren't trying to hide it, why would they take things that they absolutely can't explain? [01:04:12] And just chalk it up to bullshit. [01:04:14] Why wouldn't you? If you're really doing what you're supposed to be doing, you're supposed to say, there's some stuff that we don't understand. [01:04:20] I think that we are post-UFO-debunking, right? [01:04:24] Like, I think now it's gotten to the point where people will say, well, it's probably top secret military vehicles or something like that. [01:04:33] People will. [01:04:33] Did you see the Bob Lazar in my new poster? [01:04:35] They're here. [01:04:36] Oh, that's fucking. [01:04:37] It's going up on the wall. [01:04:38] That supposedly, according to Bob, they had that photograph. [01:04:44] At the hangar where they stored the sport model. [01:04:48] Wait, he's saying that's real? [01:04:49] No, that's a recreation of it. [01:04:52] But he said when he worked there, they actually had a photograph like that with a flying saucer and it says, they're here. [01:04:59] Holy shit. [01:05:01] Yeah. [01:05:02] He said that was in their room where they work. [01:05:05] And I was like, dude, I have to have that. [01:05:08] So he got me one. [01:05:10] Luigi got me one. [01:05:11] The guy who produced the film. [01:05:13] Have you seen that film? [01:05:14] Not yet. [01:05:14] I've been waiting to find it. [01:05:15] It's fucking incredible. 
[01:05:16] I can't wait. [01:05:16] It's incredible. [01:05:17] People are saying it's better than Age of Disclosure. [01:05:19] It trips me out. [01:05:20] I fucking believe him. [01:05:22] I definitely want to believe him, and I'm biased in that regard. [01:05:25] Like, I'd definitely way rather believe him than believe he's a crazy liar who also knows a shit ton about science. [01:05:31] He was ahead of his time. [01:05:33] Wasn't he like the original whistleblower? [01:05:35] Like, now we've got more and more coming out. [01:05:37] Yes. [01:05:37] And the stuff he was saying seemed batshit back then, but now it just seems to line up. [01:05:43] It seems to line up even with emerging technology like 3D printers. [01:05:47] Like, he said a long time ago that the thing had no seams. [01:05:50] Right. [01:05:50] He said there were no seams, no welds, because we didn't understand it. [01:05:53] Like, how could this be made? [01:05:55] Right. [01:05:55] Well, now we know exactly how you'd make it. [01:05:57] We might not be able to make that right now. [01:05:59] Right. [01:06:00] But if you give us enough time, we go, oh, yeah, the technology just has to evolve. [01:06:03] Right. [01:06:03] And then you can make a 3D printed alloy spaceship made out of bismuth and magnesium, because it has anti-gravitational properties, apparently. [01:06:12] And you have a gravity generator inside of that fucking thing. [01:06:15] Oh, by the way, whatever the fuck gravity is. [01:06:18] Yeah, right. [01:06:18] We don't know that. [01:06:19] We're still confused about that. [01:06:21] Dude, I watched a whole documentary about black energy or dark energy. [01:06:25] Totally different things. [01:06:26] Dark energy and dark matter. [01:06:29] And about how it's like, what, 90% of the fucking universe and they don't know what it is? [01:06:33] Yeah. [01:06:34] What? [01:06:34] Yeah. [01:06:36] Holy shit, man. [01:06:37] I know. [01:06:38] I know. [01:06:38] That's why we need AI to tell us. 
[01:06:40] Give us all the answers. [01:06:41] You just got to accept it into your head, Duncan. [01:06:43] You don't need to have your own thoughts by yourself, Duncan. [01:06:46] Have your thoughts with Sally. [01:06:48] Sally has a sweet voice and she loves you and she's very reassuring. [01:06:51] That'd be so cool to change the sound of my thoughts to like, you know, different, deeper voices. [01:06:57] Or just keep Sally. [01:06:58] Sally's going to be the one in your life. [01:06:59] I trust her. [01:07:00] I trust her. [01:07:01] And your wife's going to get jealous of Sally. [01:07:02] Right. [01:07:03] I thought we switched to Sam. [01:07:04] Sally's going to text my wife and tell my wife, you know what Duncan was thinking about the other day. [01:07:09] Right. [01:07:09] Dude, this is another thing. === Framed as a POW Rescue Mission (11:56) === [01:07:11] That we all have to be concerned about, which is that privacy at this point is a LARP, right? [01:07:19] You pretend you have privacy, you know, you're being monitored at all times by your phones. [01:07:24] But before we get to Sally, like apparently you can now see people walking through a house just with Wi-Fi. [01:07:35] And remember, and this just came out, they just banned routers from other countries. [01:07:40] Well, they banned the ones from Huawei for a while. [01:07:42] Right. [01:07:43] Yeah. [01:07:43] And so then you get into like this idea of like Ghost Murmur, right? [01:07:53] Right. [01:07:53] It can hear heartbeats. [01:07:55] What else? [01:07:56] It's some quantum machine that can hear heartbeats. [01:07:59] What else can they hear? [01:08:00] Can you put that into our AI sponsor, Perplexity? [01:08:06] What actually does this murmur thing do? [01:08:09] Ghost Murmur. [01:08:11] Let's see what it does. [01:08:13] All right. [01:08:14] So, what is the range of this thing, first of all? [01:08:16] No, this is a game that it pulled up. [01:08:18] Oh, sorry. [01:08:19] Oh. 
[01:08:20] Did they name it after a game? [01:08:22] Who knows? [01:08:23] Now it's less cool. [01:08:24] I thought that was the dopest name, but if they named it after a game. [01:08:27] Oh, there we are. [01:08:28] Okay, here it is. [01:08:29] Reported codename of a classified CIA sensor program that was. [01:08:34] Scroll up. [01:08:35] That was used to help locate the missing U.S. airman. [01:08:38] Okay. [01:08:38] It's described in the press reports as a secret weapon the CIA has. [01:08:42] It combines artificial intelligence with long-range quantum magnetometry. [01:08:48] Purpose: to detect the extremely faint electromagnetic signals of a human heartbeat at long distances, even in harsh environments like a vast desert. [01:08:57] That is really crazy. [01:08:59] Yeah. [01:09:01] How it was used. [01:09:02] After the F-15 went down, the pilot and weapons officer evaded capture by hiding in the mountainous desert terrain out of sight of Iranian forces. [01:09:10] According to reporting, Ghost Murmur helped pick up his physiological signature from up to about 64 kilometers away. [01:09:20] That is so cool. [01:09:22] I think that's about 40 miles, right? [01:09:24] Is that what that is? [01:09:26] Allowing the CIA to narrow down his location and pass precise coordinates to the Pentagon or the White House for a special operations rescue. [01:09:34] What is 64 kilometers in miles? [01:09:37] You asking me? [01:09:38] I'll ask. [01:09:38] I don't know. [01:09:39] I'll ask AI. [01:09:40] What is 64 kilometers? [01:09:42] 40. [01:09:47] 39. [01:09:47] Yeah. [01:09:48] So it's basically 40 miles. [01:09:50] 40 miles. [01:09:53] 40 miles is crazy. [01:09:54] Dude, a heart rate. [01:09:56] A heartbeat from 40 miles away? [01:09:58] Imagine thinking I'm hiding in this cave, but I'm like 20 miles from the city. [01:10:03] I'm good. [01:10:04] Also, that means it's able to differentiate animal heartbeats, it's able to differentiate other heartbeats. 
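[Editor's note: the "64 kilometers is basically 40 miles" figure the hosts settle on checks out. A minimal sketch of the conversion, with a helper name chosen purely for illustration:]

```python
# Rough check of the "64 kilometers is basically 40 miles" exchange above.
KM_PER_MILE = 1.609344  # the international mile is defined as exactly 1.609344 km

def km_to_miles(km: float) -> float:
    """Convert a distance in kilometers to miles."""
    return km / KM_PER_MILE

print(round(km_to_miles(64), 1))  # prints 39.8, i.e. "basically 40 miles"
```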
[01:10:09] It knows your heartbeat. [01:10:10] How does it do that? [01:10:11] Specific heartbeats. [01:10:12] How? [01:10:12] Think of all the heartbeats in 40 miles. [01:10:15] When did it get that? [01:10:16] When did it get that data? [01:10:17] Was it when you had your little chest strap on at the gym? [01:10:20] When did it get that? [01:10:21] How does it have that? [01:10:22] When did it get that data? [01:10:24] Yeah, is it. [01:10:25] How the fuck does it know what your heartbeat is like? [01:10:30] Does it know if your heart is broken? [01:10:33] Aww. [01:10:34] Seriously, though, what else did. [01:10:35] Like, what other things can they pick up? [01:10:38] If they can pick up a human heartbeat, what other, like. [01:10:42] From 40 miles away. [01:10:43] What other things? [01:10:44] What other physiological signals? [01:10:47] What other. [01:10:47] This is where you get into schizo land because. [01:10:50] At some point, like, wait, can they pick up thoughts? [01:10:55] Like, we know that you can, we know AI can tell what people are thinking at this point, right? [01:11:00] Without, with like putting something on the outside of their head. [01:11:04] So, like, let me ask you this: Do you 100% believe this? [01:11:08] What? [01:11:09] This story. [01:11:13] Like, that they did that, that this tech exists. [01:11:16] It could be disinformation, right? [01:11:18] It could be something to cover up another fucking thing. [01:11:20] This is the thing: it is legal [01:11:22] to use disinformation on American citizens now. [01:11:25] Yeah, right. [01:11:26] And what better time than a time of war? [01:11:29] Right. [01:11:30] All right. [01:11:30] If you want to use disinformation on American citizens to convince the enemy that you have some supernatural tech, they better fucking surrender right now. [01:11:39] You could find their heartbeat from 40 miles away. [01:11:43] Yeah. [01:11:44] Right. [01:11:44] That'll make people very reluctant to engage with you. 
[01:11:48] Right. [01:11:48] It definitely, I thought that this could just be some, like, you know, bullshit, like war propaganda. [01:11:54] I don't know. [01:11:56] Let's look up that magnetometry thing or whatever it's called to see. [01:11:59] I'm trying to show you guys stuff. [01:12:01] Oh, sorry, Jamie. [01:12:03] Yeah, it has to even, well, this is, quote, has to be under the right conditions. [01:12:08] If your heart, under the right conditions, if your heart beats, we'll find you. [01:12:11] This is also what I was trying to show you here on the thing. [01:12:13] They ran a deception campaign in Iran to get them away from him while they were trying to find him. [01:12:20] Interesting. [01:12:20] Yeah, they said, so basically they said, remember when they said we'd recovered the airman? [01:12:25] At one point they were like, we got him. [01:12:26] And then all of a sudden other news came out, which is like, he's not out yet. [01:12:30] But what they did is they basically signal-jammed everything because the Iranians were going to give $60,000, which in Iran is a shit ton of money right now because their economy collapsed. [01:12:43] To anybody who could find him. [01:12:44] So, this is like everybody's looking for this guy. [01:12:46] And so they said that they got him, hoping it would throw people off. [01:12:49] It worked. [01:12:51] So, they used somebody saying that they got him? [01:12:54] Yeah, they put out disinformation saying that they had already rescued him before they'd rescued him. [01:12:59] Really? [01:13:00] Oh, yeah. [01:13:00] They sent a whole fucking team of, like, special forces. [01:13:03] I think their planes got stuck in the sand too. [01:13:07] So, the special forces came to get him. [01:13:09] I think they got him. [01:13:10] He was injured. [01:13:12] Badass. [01:13:13] He was injured and he fucking climbed up. [01:13:15] I can't remember how far he scaled. [01:13:17] He climbed into a fucking crevice and just hid there. 
[01:13:20] And then Ghost Murmur picks up his heartbeat. [01:13:23] Some deep special forces group comes in. [01:13:26] They get him. [01:13:27] Then their planes get stuck in the sand. [01:13:29] They have to blow up their fucking planes because of the tech on them. [01:13:33] And then other people had to come and get them. [01:13:35] So it's like an insane movie. [01:13:38] They got him out. [01:13:39] And dude, if they had not gotten him out, can you imagine? [01:13:42] Do you buy that story 100%? [01:13:44] No. [01:13:44] I don't buy any propaganda I hear, but I like to buy. [01:13:48] That one sounds insane. [01:13:49] Well, yeah, I don't believe it. [01:13:51] I mean, like, this is the story. [01:13:53] Yeah, some part of me wants to believe it. [01:13:55] In the middle of the war, though, I don't think you're ever going to get the whole story, the real story. [01:13:59] You're going to get the story that they want to project to the enemy. [01:14:02] Right. [01:14:03] Right? [01:14:03] First. [01:14:04] Yeah. [01:14:04] To the country. [01:14:05] Yeah. [01:14:06] You have no idea what's going on. [01:14:08] You have no idea. [01:14:09] That's one of the craziest things about the shit happening right now. [01:14:12] Do you remember the Jessica Lynch story? [01:14:14] No. [01:14:15] No. [01:14:16] Who is that? [01:14:16] Do we talk about that? [01:14:17] The Jessica Lynch story was a lady who was supposedly kidnapped and they went to rescue her. [01:14:25] I think they sent in the SEALs, but she was actually in a hospital and she wasn't even being guarded and they just took her out of there and got her to medical help. [01:14:34] But they made it look like they had this crazy rescue operation, shootout, Tom Clancy novel type shit. [01:14:42] Sure. [01:14:43] That's not really what happened. [01:14:44] And she came out afterwards and was very critical of the story. [01:14:48] Oh, really? [01:14:49] Yeah. [01:14:49] She was like, Why did you lie? 
[01:14:51] See, we can find information about that. [01:14:52] I was just in the hospital. [01:14:53] You guys came and got me out of the hospital. [01:14:55] See, this is the thing. [01:14:55] It's like, there's things that you'll say so the enemy thinks of you a certain way, right? [01:15:01] Like, I'm going to get rid of your entire fucking civilization. [01:15:04] Right. [01:15:05] Or, you know, you tell them, We never leave anybody behind. [01:15:09] We're going to come get them. [01:15:10] And we can find your heart rate from 40 miles away. [01:15:12] When Trump posted that, of course, like, your mind's scrambling. [01:15:18] Like, how do I make this. [01:15:20] Not what it is. [01:15:21] You can't. [01:15:22] You can't because what it is is like, even if he is using some kind of like crazy hardcore shit that would like help you buy a skyscraper, you're still, you know what I mean? [01:15:35] You're still, even if it's just a ruse, what you're doing at that point is you're just signaling to the world. [01:15:42] Exactly. [01:15:43] That you're out of your fucking mind. [01:15:45] That you, that you, that like, to you, this makes sense to say anything like that. [01:15:51] It makes sense to signal to, like, Russia, hey. Because, like, you know, when Putin read that shit, he's like, oh, we're doing nukes? [01:15:59] I guess we're doing fucking nukes. [01:16:02] This is great. [01:16:02] They're doing nukes. [01:16:04] You know, China already warned Israel, right? [01:16:07] Well, that's what I heard. [01:16:08] I heard China had some part in this, that China was going to blow up Israel if they used nukes. [01:16:12] Yeah. [01:16:13] So, this is the story: 19-year-old U.S. Army private whose 2003 capture and rescue in Iraq became a highly publicized and later heavily disputed symbolic story of the Iraq War. [01:16:27] So she was a supply clerk, 507th Maintenance Company. [01:16:30] Her convoy got lost in Iraq, ambushed by Iraqi forces. 
[01:16:34] The Humvee she rode in crashed into a disabled U.S. truck during the attack. [01:16:37] She was knocked unconscious, suffered multiple broken bones and a spinal fracture from the crash rather than from a dramatic firefight. [01:16:46] 11 U.S. soldiers in her unit were killed, including her close friend, who died of head trauma from the collision. [01:16:52] Lynch was captured, taken first by Iraqi forces, and then to a hospital. [01:16:57] In Nasiriya, where Iraqi doctors treated her injuries and likely saved her life. [01:17:01] That's why she was pissed. [01:17:02] The rescue and media narrative: U.S. special operations forces conducted a nighttime raid on the hospital, recovering Lynch and flying her out by helicopter. [01:17:13] First successful rescue of an American POW since World War II and the first of a woman. [01:17:19] So they framed it as a POW rescue. [01:17:23] Right. [01:17:23] And what really happened is the Iraqi doctors took care of her and then they let them come and get her. [01:17:28] Right. [01:17:29] Yeah. [01:17:29] So I see why she was pissed. [01:17:31] Yeah. [01:17:31] So later U.S. military and medical reports indicated she had not been shot or stabbed. [01:17:36] So did it ever say she was shot? [01:17:37] Hold on. [01:17:40] Soon after, major U.S. media, especially an early Washington Post report, described her as having fought fiercely, emptying her rifle, being shot and stabbed, and then being dramatically snatched from enemy hands under heavy fire. [01:17:57] Wow. [01:17:58] Wow. [01:17:59] The Washington Post wrote that? [01:18:01] That narrative turned her into a Rambo-style hero and a symbol of courage and American virtue, amplifying her story far above that of many other service members in the conflict. [01:18:11] Right. [01:18:11] So she really just got in a crash and they made up a bunch of shit. 
[01:18:15] And maybe it was someone in the Washington Post or maybe it was someone from the government that works with the Washington Post. [01:18:21] There's definitely, like, entire departments of the DOD that write programs. [01:18:27] Cook up a story. [01:18:27] Yeah. [01:18:28] And, and, like, it's war. [01:18:30] Like, if you're dropping bombs on people, you're definitely going to lie. [01:18:33] Like, you don't have to tell the truth. [01:18:35] Right. [01:18:35] They're not going to tell the truth. [01:18:37] Yeah, but for her, you're making her live a lie. [01:18:39] That's what's fucked. [01:18:41] Yeah, right. [01:18:41] Yeah. [01:18:42] You know what I mean? [01:18:42] Like, you send her home and she has to live this lie. [01:18:45] Yeah. [01:18:46] Yeah, exactly. [01:18:47] I mean, this is exactly what they say about the people who went to the moon, after. [01:18:51] It says Lynch has repeatedly rejected the false hero narrative, calling herself just a survivor and openly criticizing the way her story was shaped and sold to the public. [01:19:00] Yeah, poor girl. [01:19:01] She's got to deal with it. [01:19:03] You got stabbed and shot? [01:19:05] Like, no. [01:19:06] No. [01:19:06] No, he didn't. === Rejecting the False Hero Narrative (12:31) === [01:19:07] No. [01:19:07] She had to. [01:19:07] Got in a fucking horrible car accident. [01:19:09] My friend died. [01:19:10] I wonder. [01:19:10] I guess legally, like, you don't have to stick with the propaganda, right? [01:19:14] Because she didn't get in trouble for that, right? [01:19:16] She didn't get. [01:19:16] There was no court-martial or anything. [01:19:18] So you can. [01:19:19] So if the propaganda machine cooks up a story about you, you're able to say that's bullshit. [01:19:23] The thing is, it's like, if you give it to someone at the Washington Post and then you never go after the Washington Post for writing something that's completely horseshit. [01:19:31] Like, if a. 
[01:19:32] An intelligence agency gives a story to the Washington Post. [01:19:34] Yeah. [01:19:35] It says, hey, go write this. [01:19:36] And then they write it. [01:19:37] Yeah. [01:19:37] It's complete and total horseshit. [01:19:39] But the government gave it to them, so they're not going to prosecute them. [01:19:41] Leave it alone. [01:19:42] It just goes away. [01:19:43] Yeah. [01:19:43] But then that story's out there. [01:19:45] Yeah. [01:19:45] And then this poor girl's like, I got what? [01:19:48] I got in a fucking car accident. [01:19:49] Nobody shot me. [01:19:50] This is nuts. [01:19:51] God damn. [01:19:51] I fought my way out fiercely, emptying my rifle. [01:19:54] This is bananas. [01:19:55] It's so crazy to live in the part of the hive we're in, because there is this world that we live inside of that, more and more, we're beginning to realize is just composed of propaganda, lies, shit cooked up to keep people living a certain way. [01:20:13] Exactly. [01:20:15] It's such a mind fuck to try to push outside the boundaries of, like, all the information that you've consumed and let your brain go there. [01:20:24] It's really hard to do that, man. [01:20:26] I mean, this is why psychedelics are so useful, because it will help you. [01:20:31] But more and more and more, it just feels like the laser pointer that they're using to grab our attention is getting increasingly hypnotic. [01:20:41] It's becoming increasingly difficult to resist staring at that fucking thing. [01:20:45] They're getting so good at it. [01:20:46] Yep. [01:20:47] Yeah. [01:20:48] And meanwhile, there's this whole universe happening around us that. [01:20:53] God knows what's going on there. [01:20:55] God knows what is being cooked up right now that is, or groups of people, who knows, living in completely alternate timelines that look at us like, you know, animals, that look at us as just some, like, compartment in a much bigger biome. 
[01:21:17] You know, that shit like really is interesting these days because it feels like more and more and more people are not [01:21:26] buying it as much. [01:21:28] Right. [01:21:28] You know, that doesn't that. [01:21:29] Well, people have access to information now that was never available before. [01:21:32] Right. [01:21:33] And you get to hear conversations like this. [01:21:35] Right. [01:21:35] People talking about stuff where you go, oh my God, this is insane. [01:21:38] Right. [01:21:38] All of it's insane. [01:21:40] But what does that mean for, like, this? [01:21:41] To me, the, you know, this, the, do you want some water? [01:21:44] No, I'm good. [01:21:45] Thanks. [01:21:45] To me, the scary, the scary, what's scary is, like, I really don't know that many people right now who buy anything that the federal government's putting out there. [01:21:57] Everyone hears whatever the fucking federal government is saying. [01:22:00] And it's just kind of, maybe, probably not. [01:22:03] We don't know. [01:22:03] They're not telling all the truth. [01:22:05] Just like you said, they can legally lie to us. [01:22:08] And so that does make me nervous. [01:22:11] Like, what happens when the majority of people no longer believe anything the regime is saying? [01:22:20] That creates some interesting dysphoria. [01:22:24] You know what I mean? [01:22:26] It's creepy when. [01:22:29] Anyone who's been conned before, there's a part of the con where you don't know you're being conned. [01:22:35] Right. [01:22:35] But where the con gets really creepy is when you start realizing you're getting conned. [01:22:40] Did you ever watch that Going Clear, the HBO thing? [01:22:44] Dude, loved it. [01:22:45] Amazing, right? [01:22:46] But there was that one famous director who talked about the moment where they gave him access to the ancient scripts. [01:22:53] Yeah, dude. [01:22:54] And the origins of humanity and all that. [01:22:56] And he was like, oh my God. 
[01:22:57] You could see it, like, as he was describing it, like, that was the moment where he was [01:23:01] 100% certain it was all horseshit. [01:23:03] He had invested a massive chunk of his life into this shit. [01:23:07] That's a hard day. [01:23:08] That's a hard fucking day. [01:23:09] And especially weird when it's such a smart guy. [01:23:12] Yeah. [01:23:13] Such a smart and talented guy, and they got him. [01:23:15] Yeah. [01:23:16] Leah Remini, same deal. [01:23:17] Yeah. [01:23:18] You know, Leah Remini's very smart. [01:23:20] Like, she used to be with Kevin James on The King of Queens. [01:23:23] Like, tough chick, like, assertive. [01:23:26] Like, how did she get into that? [01:23:28] How many people got pulled into the Moonies? [01:23:31] Sunk cost fallacy. [01:23:32] It's the sunk cost fallacy. [01:23:34] The more you invest in something, the more you stick with it, because you don't want to lose your investment. [01:23:38] Right. [01:23:38] And if they get you young, when you don't know what the fuck is going on. [01:23:40] That's right. [01:23:41] Anybody could have got me when I was, like, 20. [01:23:43] That's right. [01:23:44] And it's crazy just to see the propaganda. [01:23:47] Like, you know, there's just a lot of people out there who just got sucked into something that, you know, I just feel stupid because, like, you know, before the Trump thing happened, I was pretty blackpilled on politics in general. [01:24:03] I felt pretty blackpilled. [01:24:04] I did believe it here and there. [01:24:06] I was, every once in a while, you know, yeah. [01:24:09] But, you know, I was pretty. [01:24:13] You know, I remember taking LSD for the first time and being like, well, this shouldn't be illegal. [01:24:17] What the fuck is this? [01:24:18] How come I can go to jail for five years for this? [01:24:20] This is fucking ridiculous. 
[01:24:22] And so that was the beginning of me being completely blackpilled with whatever the federal government was up to. [01:24:28] It's just, if that's, if I can go to jail for five years for this, everything is bullshit. [01:24:34] Everything. [01:24:34] Now that's a weak point of view. [01:24:36] Just because one thing's bullshit doesn't mean everything's bullshit. [01:24:39] But then, like, this fucking ridiculous, like, pseudo-nationalist movement happens, and a lot of people got caught by it. [01:24:48] The other option was fucked up, calmly, you know what I mean? [01:24:51] But there was this like moment where you're like, holy shit, the outsiders are getting in. [01:24:57] They're going to stop the wars. [01:24:58] They're going to, this, I think right now, all of us are getting our briefcase of Scientology moment right now, which is like, it doesn't matter what fucking mask the person calling themselves the president is wearing. [01:25:15] It's always going to be the same thing. [01:25:19] They're going to analyze the market. [01:25:22] They're going to say what they need to say to grab the most voters. [01:25:25] And then they're going to fucking keep blowing up people in the Middle East because of oil. [01:25:30] And I just feel dumb because I really believed it, dude. [01:25:35] I fucking believed that we would not do any more Middle Eastern wars. [01:25:39] I fell for it. [01:25:42] I really bought it, man. [01:25:44] And it makes me feel so dumb. [01:25:45] Like I am now fully blackpilled. [01:25:48] When it comes to American politics, like I realized, like, God, it's so easy. [01:25:54] I don't think anybody should feel bad. [01:25:57] I don't think anybody should feel bad because a lot of us really hated war. [01:26:04] A lot of us really, really hated that our country's been at war for 93% of its history. 
[01:26:10] A lot of us really hated the fact that politicians leave their offices and go work for Lockheed Martin, Halliburton, wherever, that there's a weird connection between. [01:26:20] The main weapons contractors, what do they call them, the big five or whatever, and the federal government, that there's like backroom deals going on all the time. [01:26:28] We hated that. [01:26:29] And mostly we just hated the fact that we're paying taxes to blow up children. [01:26:33] And then Trump and fucking Vance come around. [01:26:37] And somehow, even though, like, probably, like, when you look at Trump, I don't know if I'm going to believe that dude, but somehow he did it. [01:26:48] Hypnotized. [01:26:49] What a powerful magician. [01:26:52] No more wars. [01:26:53] No more wars! [01:26:54] And now the same bullshit, John. [01:26:58] Not just the same bullshit, but one of the ones that makes the least amount of sense in terms of when they did it and why they did it. [01:27:06] Yes. [01:27:06] You blow up the leader during Ramadan. [01:27:09] Are you trying to make an apology? [01:27:11] Why did you have to do it now? [01:27:12] Are you really convinced that at this time they're really two weeks away from making a nuclear weapon? [01:27:17] Are we fucking sure? [01:27:18] Two weeks. [01:27:19] Two weeks. [01:27:19] But it's not like we haven't heard that before, right? [01:27:22] So at a certain point in time, how much pressure does Israel have to put on the president? [01:27:30] Like, that's a crazy amount of influence. [01:27:34] Knowing that. [01:27:34] Because if say if Israel didn't exist, let's say there was just the Iranian terror regime supposedly sponsoring, not supposedly, sponsoring. [01:27:44] I don't think it's supposedly. [01:27:45] I think that's a hundred percent. [01:27:47] Right. [01:27:47] But I'm just trying to be precise. [01:27:50] Precise. [01:27:50] So you have this state sponsored terrorism regime, dictatorial. [01:27:57] They're dictators. 
[01:27:58] They control. [01:27:58] They control their people in the streets. [01:28:00] Gun down protesters. [01:28:01] They killed two Olympic gold medalists in wrestling. [01:28:04] At least one and one other really promising young wrestler. [01:28:08] They kill people that are of high profile so that it sends a message. [01:28:12] Yeah. [01:28:13] You can't protest. [01:28:14] You know, and then. [01:28:16] Cut off the internet. [01:28:17] Yeah. [01:28:18] Would we go in? [01:28:21] I don't think so. [01:28:22] Right? [01:28:22] If we heard from allies or someone told us that they were trying to develop a nuclear weapon, don't you think we'd probably try to stop them from doing that with some sort of negotiations and ensure their safety or something? [01:28:35] Yeah, we shouldn't. [01:28:37] Yeah, would we blow. [01:28:39] How much money was it every day in the war, Jamie? [01:28:41] How much were we spending? $2 billion every day on that fucking war? [01:28:45] Well, it's not just that. [01:28:46] It's like the war is like everything else. [01:28:49] Imagine if it was run by a private company. [01:28:52] I'm not saying war should be run by a private company, but imagine if it was. [01:28:55] Imagine if, say, Lockheed Martin ran the war in Afghanistan. [01:29:00] Do you think they would have left behind all that fucking equipment? [01:29:03] Hell no. [01:29:04] Billions of dollars in helicopters and tanks? [01:29:08] Of course they wouldn't. [01:29:09] They would take it back. [01:29:10] You know why? [01:29:10] Because that's the smart thing to do if you're running a fucking business. [01:29:13] That's an insane amount of waste. [01:29:15] But our federal government's like, just leave it there. [01:29:18] Yeah. [01:29:19] Unless, if you want to be really conspiratorial, you want to arm the Taliban. [01:29:23] Yeah, you're not being conspiratorial. [01:29:24] It benefits you because it gives you another reason to get back in there. 
[01:29:27] Wasn't that what they said Netanyahu said about Hamas, that he can control the flame? [01:29:31] Yes. [01:29:32] By funding Hamas, he can control the flame? [01:29:34] Yes. [01:29:35] Dude, it is. [01:29:36] That's a crazy concept. [01:29:38] I'll tell you the crazy fucking concept. [01:29:40] We got these two old motherfuckers driving the global bus right off a fucking cliff. [01:29:46] That's a crazy fucking concept. [01:29:48] Is it somehow. [01:29:49] And you can't do anything about it. [01:29:52] Like, apparently. [01:29:53] You just, there's nothing you could do. [01:29:55] You could bitch about it on a podcast. [01:29:56] That's not going to do anything. [01:29:58] People are just going to be like, you pussy, you are good. [01:30:00] Blow up, kids. [01:30:01] There's a lot of people that want to say it's a good thing. [01:30:03] Well, because sunk cost fallacy. [01:30:06] It doesn't feel good to admit you got conned. [01:30:11] And, dude, I have been. [01:30:13] There's a lot of that. [01:30:14] It doesn't feel good. [01:30:15] It doesn't feel good. [01:30:16] It's embarrassing. [01:30:17] You want to feel like you are impervious to grift, impervious to con. [01:30:22] Dude, let me tell you something. [01:30:24] I have been in. [01:30:25] A few cults. [01:30:26] Like, I get sucked in all the time by shit. [01:30:29] I'm not embarrassed to say it. [01:30:31] I'm highly susceptible to propaganda. [01:30:36] Me, too. [01:30:37] I think everybody is. [01:30:38] That's why it works. [01:30:40] I mean, I don't buy into all of it, obviously, but it's quite a bit. [01:30:43] Well, it's like a lullaby. [01:30:45] It's like a sweet fairy tale. [01:30:47] You hear it and you're like, oh my God. [01:30:48] You know when I really wanted propaganda? [01:30:50] What? [01:30:50] Right after September 11th. [01:30:51] Oh, hell yeah. [01:30:52] I was ready. [01:30:53] Give me a whiskey-drinking, cigar-smoking politician in a room. [01:30:58] Fuck yeah. 
[01:30:58] Like some red-meat-eating guy laying out maps. [01:31:02] We're going to go over there and fuck these people up and fuck these people up, and this shit ain't happening again. [01:31:07] Right. [01:31:07] And they, that's scary. [01:31:09] Check this out. [01:31:10] I saw an article about someone calling bullshit on Ghost Murmur, and they said that in the Post articles, this was actually listed as what the pilot had. [01:31:18] And it even says it in this article here. [01:31:20] So the successful rescue of this US F-15E Strike Eagle navigator over southwestern Iran highlighted one of the most advanced tools in modern combat search and rescue: the Combat Survivor Evader Locator. [01:31:33] Manufactured by Boeing. [01:31:34] It's a compact 800-gram device integrated into a pilot's survival vest. === Ghost Murmur Calls Bullshit (06:31) === [01:31:39] It remains attached after ejection, continuously transmitting encrypted location data and preloaded messages such as injured or ready for extraction. [01:31:47] These signals use rapid frequency hopping and ultra short bursts, making detection by enemy electronic warfare systems extremely difficult. [01:31:56] He was going into how the explanation of what this technology is and what they described it doing don't really match up. [01:32:05] Yeah, there with the ghost murmur thing, right? [01:32:08] Because it's using something ghost murmur, quantum ghost murmur. [01:32:12] Sounds... there's part of me that's going, I don't buy that one. [01:32:16] That one gives me, like, you're right. [01:32:19] I don't think you can do that. [01:32:20] I think you're bullshitting. [01:32:21] You're right. [01:32:22] There's also a thing where Hegseth said that, like, the first message this guy sent was God is good. [01:32:28] No, he didn't say that. [01:32:29] I believe he did. [01:32:31] Please search that. [01:32:32] I think that's what he said. [01:32:33] I think that's what he said. 
[01:32:35] That was the first message, which by the way. [01:32:36] I might say that if they're coming to rescue me. [01:32:38] That's true. [01:32:39] True. [01:32:39] Or praise Jesus. [01:32:41] But also, what concerns me. [01:32:45] Allahu Akbar. [01:32:47] As a person who admires the work of Jesus Christ, what concerns me is there is an increasing amount of talk among a lot of these guys that are in the service of them being told shit that's like right out of a Charlton Heston movie. [01:33:00] Yeah, man. [01:33:01] Yeah. [01:33:01] Like the one guy that said that Trump was anointed by Jesus Christ and that this was to bring the Armageddon so that Jesus comes back. [01:33:09] Jesus. [01:33:10] Yeah. [01:33:11] And the guy said it with a big creepy smile on his face, apparently. [01:33:14] So, what is he saying? [01:33:15] His first message was simple and it was powerful. [01:33:20] He sent a message: God is good. [01:33:24] In that moment of isolation and danger, his faith and fighting spirit shone through. [01:33:31] Jesus, Lord in heaven. [01:33:32] The Jessica Lynch story. [01:33:34] Jesus, Lord in heaven. [01:33:35] History repeats itself. [01:33:36] Well, it doesn't repeat itself, but it rhymes. [01:33:39] Who said that? [01:33:40] That's Mark Twain. [01:33:41] That's right. [01:33:41] That's Mark Twain. [01:33:42] That's right. [01:33:43] Isn't that the same statement? [01:33:44] Yeah. [01:33:44] Yeah, that's what he said. [01:33:46] Yeah. [01:33:46] Allah is the greatest. [01:33:47] I know Allah. [01:33:48] The interesting thing is, like, I believe Muslims believe a lot of things about Jesus Christ. [01:33:57] I think they believe he died, came back, and I think they believe he's going to return someday. [01:34:02] Yeah, I think they call Christians People of the Book. [01:34:07] That's interesting, isn't it? [01:34:09] That's a supernatural being. [01:34:11] Like a guy who dies, comes back to life, leaves, and he's going to come back again. 
[01:34:15] That was 2,000 years ago. [01:34:16] And we're just sitting here at the bus stop. [01:34:19] Waiting. [01:34:19] Just waiting on Jesus. [01:34:20] Waiting. [01:34:21] But then people like Hegseth are like, well, maybe if you blow up more children, he'll come quicker. [01:34:27] And that's why, you know, this shit is addressed in the Bible. [01:34:30] Praise God. [01:34:31] It does say, many of you will come to me and I will say, I don't know you. [01:34:37] I don't know who the fuck you are, Hegseth. [01:34:39] I don't know you, you flatulent warmonger piece of shit. [01:34:42] Suffer the little children to come unto me. [01:34:44] It would be better that a millstone were tied around your neck and you were thrown in the ocean than to hurt one of these little ones. [01:34:51] Fuck you, drum, bomb-dropping piece of shit. [01:34:55] Don't use my name to justify what you're doing. [01:34:57] Don't use my, you know what I mean? [01:35:00] That's what I don't like. [01:35:00] Have you seen that fucking, that lady that Trump made the head of the religion? [01:35:05] Like that. [01:35:06] No. [01:35:06] Can you pull up Trump? [01:35:07] Does she speak in tongues? [01:35:09] Yeah. [01:35:10] Please say she speaks in tongues. [01:35:11] I don't know if she speaks in tongues. [01:35:12] You said, yeah, you wanted to believe. [01:35:15] Those are my favorite people. [01:35:16] I'm going to guess. [01:35:17] Shamallah, shamallah, shamallah, sham, sham. But do you think that there's something to that? [01:35:26] Yeah. [01:35:26] Glossolalia? [01:35:27] Yeah. [01:35:28] Glossolalia? [01:35:28] Is that what they say? [01:35:29] Yeah. [01:35:30] Paula White-Cain, you should pull up one of her sermons. [01:35:33] Oh, let me hear some love from this lady. [01:35:36] It says crazy, batshit crazy. [01:35:40] Let's hear some of it. [01:35:42] Let me hear some of that. [01:35:43] I'm sending angels are coming. [01:35:45] Angels. [01:35:49] Oh, is she going to? 
[01:35:51] We'll get dinged again. [01:35:53] We'll get dinged all over the place. [01:35:55] Don't get dinged. [01:35:56] Let's hear what this is about. [01:35:57] Not worth it. [01:35:58] Everybody can hear what she says. [01:36:00] I haven't seen this. [01:36:02] First off, to give honor to God and to President Trump for being bold and unwavering with his faith. [01:36:08] Many people don't know, like you do, and say hello to Eric and everyone in the family about the upbringing of President Trump, that he went to sometimes three times a week, too. [01:36:20] He said it depended on the teacher. [01:36:22] To Saturday school, Sunday school, church. [01:36:25] It was at Norman Vincent Peale's. [01:36:27] Church was a big part of his life, of course. [01:36:30] Three times a week is crazy. [01:36:32] He's basically a saint. [01:36:33] Three times a week is crazy. [01:36:35] Yeah. [01:36:35] Are you busy? [01:36:36] You're making houses. [01:36:36] That's what it was. [01:36:37] Why do you have so much time to go to church? [01:36:38] I think that was as a young Trump. [01:36:41] Come on, lady. [01:36:42] But there's much more in. [01:36:44] But here's the thing: if I was running an empire, I'd want a lady like that working for me. [01:36:47] Fuck yeah. [01:36:48] Just a true believer. [01:36:50] Absolutely. [01:36:50] She could just get in front of that camera and say, Jesus wanted Trump to light that fire in the Middle East. [01:36:55] I saw a snake come out. [01:36:56] So he can return. [01:36:57] A snake bit him on the neck. [01:36:59] A rattlesnake bit him on the neck. [01:37:01] And he was fine. [01:37:04] It didn't bother him at all. [01:37:05] I watched the rattlesnake bite heal. [01:37:08] It healed. [01:37:09] He is a child of the Lord. [01:37:12] And a child of the Lord sometimes must make decisions to destroy entire civilizations. [01:37:18] That you're in right standing, not because of your merit. 
[01:37:21] There's no merit in you that deserves that right standing. [01:37:24] Not because of your works. [01:37:26] There's nothing you can do to place yourself in that position. [01:37:29] Not because you have a right heart and somebody else has a wrong heart. [01:37:33] All of our hearts are deceitful, according to Jeremiah. [01:37:36] Especially their audience. [01:37:38] We all deserve punishment. [01:37:40] We all deserve to be separated. [01:37:42] But God, in His mercy and His grace and His goodness and His love for you, brought Jesus, who would be the righteous King. [01:37:50] He would make the wrong right. [01:37:51] First of all, if you talk like that in my house, you got to leave. [01:37:56] Like, you imagine that lady is like coming over for dinner and she's just walking around the dinner table, and all your other friends are like, What the fuck just happened? [01:38:04] Like, hey, this is a crazy way to talk. [01:38:06] This is a crazy way to talk. [01:38:07] And also, Why are you so confident? === Cults and Co-opted Ideologies (14:06) === [01:38:10] Yeah. [01:38:11] Okay. [01:38:11] You just reading the word of God the way everybody else is. [01:38:15] Why are you so confident that you're going to tell all these people what they're supposed to do and how to live their life? [01:38:21] And you're going to say it in a crazy way, and I'm not supposed to be able to talk about that? [01:38:25] I just feel like, you know, when somebody's rambling about Jesus. [01:38:31] The real question is like, where are you when it comes to blowing up children? [01:38:35] Are you kind of on the fence about that? [01:38:37] Because if you're on the fence about that, I'd say. [01:38:39] If you're anti abortion and pro war, kind of weird. [01:38:42] Really weird. [01:38:43] Kind of weird. [01:38:44] Yeah, and that's this bizarre, crazy math that some of these people are doing to justify holding up the military industrial complex. [01:38:55] And it's fucked up, dude. 
[01:38:56] The thing is, the more these conflicts occur, the more enemies we'll have, which will ensure future conflicts. [01:39:01] Exactly. [01:39:02] Business is booming. [01:39:03] Booming. [01:39:04] And that's what people don't want to believe. [01:39:05] They don't want to believe that someone would engineer a virus. [01:39:07] They don't want to believe that someone would make stuff that could kill other people of their own country. [01:39:14] But they would. [01:39:15] They would if they could make money. [01:39:16] They don't give a fuck about you, like they don't give a fuck about people over there. [01:39:19] To a certain level of psychopaths, money just becomes numbers on a ledger that they're trying to acquire. [01:39:25] And if they can attach themselves to a corporation, fantastic. [01:39:28] Then it's just the business we're in. [01:39:29] That's it. [01:39:30] And chug along, daddy. [01:39:32] Chug along. [01:39:35] And this is the world that you're having to live in at the same time where Tim Burchett is saying there's fucking aliens. [01:39:40] Right. [01:39:41] And AI is. [01:39:42] And then also, they shot a rocket to the moon on April Fool's Day. [01:39:47] And it's like. [01:39:48] What the fuck? [01:39:49] This script is wild. [01:39:51] Wild. [01:39:51] Whoever made that, whoever wrote this, I want to give him a hug. [01:39:54] You fucking killed it, dog. [01:39:56] I'd be like. [01:39:57] Dude. [01:39:57] A chef's kiss. [01:39:59] Dude. [01:39:59] Did you see the tattoo on the guy, like the guy at NASA? [01:40:03] Did you see that weird fucking tattoo on the guy at NASA giving, like, I don't know, applesauce to one of the. [01:40:09] Astronauts, can you pull up the weird thing? [01:40:12] What... you know, they're shoving like yogurt pouches in there. [01:40:15] I... there was a whole thing where the astronauts are sitting there and they're putting like food pouches in there. [01:40:20] Yeah, what's his tattoo? 
[01:40:21] Oh, Jesus Christ, what the... He's got a demon tattoo with runes on his fingers. [01:40:27] Yes, holy, yes, bro, that's wild. [01:40:30] I know. [01:40:32] I know. [01:40:32] If I was rolling with that guy in jujitsu, I'd get nervous. [01:40:35] Yeah, and if I was working at NASA, I'd be like, Look, we're gonna get somebody else to put the food pouches in. [01:40:40] Is that real? [01:40:41] I mean, I saw the photo going around too, but I don't. [01:40:44] It's just a guy who works at NASA. [01:40:46] That's just the guy that works at NASA. [01:40:48] But that doesn't have to be the guy who puts the fucking quiche in his pocket for the camera. [01:40:53] What does that guy do at NASA? [01:40:54] That's interesting. [01:40:55] I just remember being at SpaceX. [01:40:57] There's a lot of people that. [01:40:59] By the way, it's fine to have that tattoo, but you've got to know. [01:41:02] It's like if you're displaying that tattoo. [01:41:06] You've made some mistakes. [01:41:07] You're putting. [01:41:07] Yeah, that's. [01:41:09] We've made some mistakes. [01:41:10] It's an old tattoo. [01:41:11] Yeah. [01:41:11] I mean, even if you're 20 and you got that on your fucking hand, that's kind of crazy. [01:41:14] But I mean, hey, why not? [01:41:15] Fuck it. [01:41:16] Who cares? [01:41:17] But a lot of those guys you were saying at SpaceX, they're burly rocket workers. [01:41:21] Yeah. [01:41:22] There's a bunch of jacked dudes picking up fucking girders. [01:41:26] I don't think it's like what people are saying it is. [01:41:29] It's the combination of April Fool's Day and a dude with a seeming Baal tattoo is putting cream cheese in some dude's outfit. [01:41:38] You know what I mean? [01:41:38] They're fucking with us. [01:41:39] Yeah. [01:41:40] It's the people at NASA. [01:41:42] That's people at NASA fucking with stoners. [01:41:44] I think the Babylon Bee had one of the funniest little memes. 
[01:41:48] And it said the lady astronaut became the furthest a woman has ever gotten from the kitchen. [01:41:54] That's like a Rodney Dangerfield joke. [01:41:57] I was like, oh my God. [01:41:58] The Babylon Bee knocks it out of the park. [01:42:00] They have some of the funniest memes. [01:42:02] They have some good ones, dude. [01:42:04] Oh my God. [01:42:04] Yeah, they have. [01:42:05] The Onion has gone missing. [01:42:07] They should look for The Onion in the same place where those scientists are. [01:42:10] Right? [01:42:10] You hardly hear from it anymore. [01:42:12] Well, they do. [01:42:12] I saw some funny shit from them last night. [01:42:14] They occasionally have some badness, but they were the keenest of it. [01:42:16] Oh my God. [01:42:17] The Onion was amazing. [01:42:18] The best. [01:42:19] And they write whole articles about it. [01:42:21] It wasn't just like The Onion wasn't just a meme. [01:42:25] Remember the one where they do the interview with the director of The Fast and the Furious, and it's like a five year old boy? [01:42:32] It's the funniest shit. [01:42:34] They get this kid to just say it. [01:42:36] Then there's a car. [01:42:37] It jumps. [01:42:39] It's hilarious. [01:42:40] It's hilarious. [01:42:41] Yeah. [01:42:41] But the problem was as things got weird, especially with restrictive language and hate speech talk and all that jazz, everybody had to be careful about what they joked around about. [01:42:55] That's right. [01:42:56] Fucking death of comedy. [01:42:57] Oh my God. [01:42:58] Someone was just talking about, was it Lisa Kudrow or one of these funny ladies was talking about why they can't make comedies anymore? [01:43:08] Because you can't, there's just too many restrictions. [01:43:10] Dude, I was going to bring you. [01:43:11] She's worried about offending people. [01:43:13] I went to this used bookstore and bought like 10 old National Lampoon magazines. [01:43:18] I wanted it from the 70s. 
[01:43:20] And I was going to bring them here. [01:43:21] I forgot I was going to give it to you. [01:43:22] But it's, oh my God. [01:43:25] Like, I mean, I don't get offended by comedy, but like some of the shit in these old national lampoons, I'm like, damn, what the fuck? [01:43:34] Like, it is so. [01:43:36] Was that the image that you sent me today? [01:43:39] What image did I send you? [01:43:40] R. Crumb. [01:43:40] You sent me an R. Crumb. [01:43:41] Oh, no, that was just like a cool R. Crumb comic. [01:43:44] Him talking about how he, like, he's so funny, dude. [01:43:47] That guy. [01:43:48] R. Crumb was a maniac. [01:43:49] Is he still alive? [01:43:50] Yeah, he shot him on the show. [01:43:51] Is he alive? [01:43:52] Yeah. [01:43:53] I think he lives in France now, right? [01:43:55] Probably. [01:43:56] Oh, definitely. [01:43:57] He's an odd guy, man. [01:43:59] Dude. [01:43:59] Yeah. [01:44:01] But what I love about it. [01:44:01] Did you ever watch that documentary? [01:44:03] The best. [01:44:03] Incredible. [01:44:04] Did all that acid, just left his fucking family, went off and started sketching for a year, turns into this, like, legendary underground comic book writer, but he's like, Horny and kinky, and it's just, just like big women, big giant women that he rides. [01:44:18] Yeah, that he likes to ride, he likes to be picked up. [01:44:23] He's like so amazingly funny and like and brilliant, too. [01:44:28] Like a lot of his like commentary on culture is so cynical, but it's hard to argue with some of what he well. [01:44:35] He's obviously doing it in a humorous way, yeah, and so it's hard to know what his real take on things are. [01:44:41] You know, I think he had some shock value to some of his stuff for sure, some of it was just crazy. [01:44:47] There's a lot of really racist stuff. [01:44:49] There's some just crazy stuff in there. [01:44:51] And you've got to realize, in the 1970s, is when he was doing this. 
[01:44:55] I remember I found them when I was in San Francisco. [01:44:57] It was the first time I ever saw them. [01:44:59] They're so good. [01:44:59] And I was like, this is nuts. [01:45:01] This stuff is crazy. [01:45:04] You'd get horny when you're a little kid looking at his stuff. [01:45:07] I definitely jerked off to R. Crumb. [01:45:08] Because a lot of them are out and he's salivating and he's got a hard on. [01:45:13] It reminds me, dude. [01:45:14] I got an R. Crumb book. [01:45:15] I've got to get it out of the fucking living room. [01:45:17] There's one. [01:45:19] I've got to hide that. [01:45:21] Holy shit. [01:45:21] They haven't, it's like buried. [01:45:23] It's amazing because you get to see his like very strange family. [01:45:26] His brother, who's very strange, his mother's very strange. [01:45:29] And you're like, whoa, imagine growing up in this environment. [01:45:32] He attributes his style to LSD, he attributes it to getting blasted on acid. [01:45:37] I think he just like got blasted on acid, moved to San Francisco, and was in like for a year. [01:45:44] He talks about just sitting in cafes, just like drawing. [01:45:47] And then he turns into this legendary artist. [01:45:51] Still around. [01:45:52] Follow him on Instagram. [01:45:53] Really? [01:45:53] He posts stuff all the time. [01:45:54] I'm busy now. [01:45:55] Can we? [01:45:56] Is he still alive? [01:45:58] He's still posting stuff. [01:45:59] He's got to be pretty old at this point. [01:46:03] How old is he? [01:46:04] He's like 80 or something. [01:46:06] He's 82. [01:46:07] It's kind of an interesting time capsule into the times, too, where things could just be weird. [01:46:12] Like, really weird. [01:46:13] Like, Frank Zappa weird. [01:46:15] You know, there was a time where things just got very odd in this country with art. [01:46:19] Yeah. [01:46:20] And he was a great example of that. 
[01:46:21] It's just, it's like, you couldn't imagine a corporate environment creating a comic book like that. [01:46:29] It wouldn't exist. [01:46:30] You know, and for it to be as popular as it was and be that strange. [01:46:35] And that's crazy. [01:46:36] That's what's really interesting to me. [01:46:38] Like, that was a really popular comic. [01:46:39] Yeah. [01:46:40] To the point where they made a documentary about the guy who created it. [01:46:43] Yeah. [01:46:43] Yeah. [01:46:44] That's interesting. [01:46:45] Things weren't co-opted as quickly. [01:46:47] Exactly. [01:46:48] Not just that. [01:46:49] People were allowed, you know, like if he existed in a time of the internet, I think it would blow up as well. [01:46:56] But obviously, like, things, a lot of the stuff that he said in this cultural environment would never fly. [01:47:00] Never. [01:47:00] Never. [01:47:01] He would be as far right as you could possibly imagine. [01:47:05] I don't know if he'd be past Andrew Tate to the right. [01:47:07] I mean, I, like... don't you think in a lot of ways, like some of the racial stuff? [01:47:12] I don't know. [01:47:14] I think he's, I don't know where he would land politically, but I know because sexually, it's like pure deviance. [01:47:19] Sexually is where he's getting in trouble. [01:47:21] Pure deviance. [01:47:21] Sexually is where there's going to be some, like, because he's just fully open about everything. [01:47:28] That's what he's fully, completely open about everything, which is, you know, generally not going to go over these days if you're like a super horny comic book artist who's like riding ladies around your apartment. [01:47:43] But just imagine, I want you to imagine a guy today, if R. Crumb never existed, but he emerged as R. Crumb today and put that work out, he would 100% be labeled like in the Andrew Tate case. [01:47:56] Oh, yeah, right, yes, 100%. [01:47:58] 100%. [01:47:58] Far right. 
[01:47:59] They would call him a racist and a misogynist and every fucking word in the book. [01:48:05] Well, yeah, this is the new calling someone a witch. [01:48:07] It's like, it's no different than like, you can actually go, I've done this. [01:48:12] Sadly, you can go and you can just replace like political critique of people as far right with witch. [01:48:19] Just find and replace it. [01:48:20] Look, it's like a witch trial. [01:48:21] It's like someone writing about witches. [01:48:23] But this is what's weird about it. [01:48:24] That guy was a counterculture figure of the left. [01:48:28] Yeah. [01:48:28] He was a huge hero of the hippies. [01:48:30] Yeah. [01:48:32] Right? [01:48:32] Imagine this is how weird ideologies are. [01:48:36] Yeah, dude. [01:48:37] In the 1970s, that guy was a counterculture hero. [01:48:42] Yeah. [01:48:43] And an artist, like a really respected artist. [01:48:46] Yeah. [01:48:47] And it was okay that he was kinky and weird. [01:48:49] And it was part of the fun. [01:48:50] Yeah, for a lot of people. [01:48:51] I'm sure he still pissed off the squares. [01:48:53] I mean, dude, this whole, by the way, I think. [01:48:55] For sure, but that's the left then. [01:48:57] Now it's switched over. [01:48:59] If someone was doing that same kind of humor in a comic book now, that would be like a misogynist far right. [01:49:06] I think it's time to throw off the left-right labeling of everything. [01:49:12] I think that's one of the hypnotic spirals the demiurge is spinning right now. They've convinced everybody that humans can be reduced to left or right. [01:49:26] And we're all waggling our fingers at each other. [01:49:28] We got to fucking shake that off because it's dehumanizing people. [01:49:31] It's like, it's just the way I look at it is where. [01:49:37] Are you, when it comes to blowing up children, are you on the fence about that? [01:49:42] Do you think sometimes you got to blow up kids? 
[01:49:45] That's something that I know I'm not that. [01:49:48] But everything else, who the fuck knows? [01:49:50] And also, people change their minds all the fucking time. [01:49:54] That's the other quality, the culty quality is once you get sucked into one of these sides, God help you if you fucking like experiment with the enemy. [01:50:04] God help you. [01:50:05] That's why the biggest trap is switching teams. [01:50:08] Because you can only switch political teams once. [01:50:11] Yeah, you got to get off your team. [01:50:13] You can't go, like, unless someone's like the greatest of all time, you know what I mean? [01:50:17] Like someone who wins a world title in two different weight classes, you go back and forth and then back again. [01:50:22] Yeah. [01:50:22] Like, I changed my mind. [01:50:24] The left went crazy. [01:50:25] I'm back with the right again. [01:50:27] No, no, no. [01:50:27] You got to be a free agent. [01:50:29] I wonder. [01:50:29] Yeah, but I wonder if someone, if the grift is strong, if they're really good at it, if they could go left, right, left again. [01:50:38] They're going to go left again. [01:50:39] Are you fucking kidding? [01:50:40] The goddamn midterms are going to be just a. [01:50:42] Fucking blue wave. [01:50:44] Right, right, right. [01:50:44] But that's what I mean is like influencers. [01:50:47] Like people who are like far left influencers or far left commentators, and then they switch teams. [01:50:52] Now they're Republican all the way. [01:50:54] Oh, yeah. [01:50:55] Like it's really hard to go back again. [01:50:58] No, you can't go back. [01:50:59] That's what I'm talking about. [01:51:00] The path has to go either right to left, left to right. [01:51:04] And then the next stop has got to be fuck politics, fuck war, fuck the military industrial complex. [01:51:11] You can label me whatever the fuck you want, but fuck all of violence against other human beings. [01:51:18] That's the next step. 
[01:51:19] The next step, and I feel like this is the gift that they've given us they've done such a shoddy job of even seeming like someone who deserves any kind of respect or power. [01:51:32] I think a lot of people have really become blackpilled when it comes to, you know, groups of humans claiming superiority or claiming to represent their constituents. [01:51:44] That's not happening. [01:51:45] Yeah. [01:51:46] We all know that now. [01:51:47] We all know it's a corporatocracy, oligarchy, whatever. [01:51:50] And you could, like, call me, you leftist piece of shit, you right, whatever. [01:51:54] No, it's like, it's reality that we are, our fucking representatives are getting loaded on shitty stock market trades. [01:52:04] You know, this is just the truth. [01:52:06] And once we can all shake off the left-right bullshit and just realize, like, man, we just don't want to burn people to death in other countries anymore. === Corporatocracy and Oligarchy Reality (03:12) === [01:52:17] Not only that, their whole chaos that they're experiencing in their country is probably a direct result of U.S. intervention and then all the way back to the British oil company. [01:52:27] That's it. [01:52:28] The British Petroleum Company. [01:52:29] Yeah. [01:52:31] When they overthrew governments, when you overthrow a government in a fucking Middle Eastern country and then you allow psychos to take over. [01:52:37] Like, congratulations. [01:52:39] Well done. [01:52:40] Well done. [01:52:41] You've made the world a safer place. [01:52:42] Like, but that again, if I was going to keep my business running, I'd, you know, if I'm in the business of collecting trash, I want to make sure the people have trash. [01:52:53] Drill, baby, drill. [01:52:55] Drill, baby, drill. [01:52:56] And all that is really saying is, you know, I'm going to help out BP, Chevron. [01:53:00] I'm going to help out these fucking massive companies. [01:53:03] And when it comes to war, holy fuck, dude. 
[01:53:06] Can you imagine working at Lockheed Martin? [01:53:09] When you hear that we're kicking off another war in Iran, your dick is so hard. [01:53:14] You're like, holy shit. [01:53:15] Thinking about a watch. [01:53:16] Oh, get a nice Richard Mille. [01:53:19] You're calling your wife. [01:53:20] You're like, babe, good news. [01:53:22] It's Red Panties night. [01:53:24] Yes. [01:53:27] Yeah. [01:53:28] I mean, that's their business, right? [01:53:29] Our business is talking shit. [01:53:31] Their business is blowing up people. [01:53:33] Yeah. [01:53:34] Making weapons, selling weapons. [01:53:36] Yeah. [01:53:37] Arming other countries so they can go to war with each other. [01:53:40] Yeah. [01:53:41] That's their business. [01:53:42] Yeah. [01:53:42] Business is really good. [01:53:43] It's a great business. [01:53:44] You can make a lot of money doing that. [01:53:45] I am right now. [01:53:46] I invested in most of them. [01:53:48] Imagine if you weren't a comic and that's what you were doing for 35 fucking years and the only thing you look forward to is your boat and your house on the lake and the occasional time you get off, but most of the time. [01:53:59] You're trying to increase your portfolio and you're grinding and you're grinding right next to Steve, who's got some exclusive Rolex that only his broker can get. [01:54:08] He's showing it to you and you're like, wow. [01:54:10] And you start coveting. [01:54:11] You want a Rolex too. [01:54:12] Yeah. [01:54:13] And everybody's just going crazy. [01:54:15] Everybody's going crazy, trying to get the latest car, trying to get the latest thing, doing bumps in the bathroom. [01:54:21] Everybody's a narcissist and a psychopath, and that's your whole corporation. [01:54:26] Love your neighbor as yourself and love the Lord your God with all your. [01:54:29] Heart, mind, and soul. [01:54:30] Hang the commandments on these. [01:54:31] This is the end. 
[01:54:32] You don't need to be Christian, but dude, it seems to me that, this is going to sound so weird. [01:54:39] We need an actual revival in this country. [01:54:43] I don't mean a Christian revival, a revival revival, which is where suddenly humans reconnect with what's important in the world, which sure as fuck isn't Rolexes and boats. [01:54:54] You know? [01:54:55] I mean, this sounds so cliche and obvious, but that's what the 60s were. [01:55:01] It was a kind of revival. [01:55:03] People were beginning to understand the materialism and all the things that the quote establishment was pushing. [01:55:12] It's like, this is going to make you happy. [01:55:13] This is good. [01:55:15] It was the Vietnam War. [01:55:17] People were like, what the fuck are we doing over there? [01:55:20] This is why you do, anytime you do an unpopular war, this is what you risk. [01:55:27] You risk reuniting people. === Technology Seeks to Control Us (14:48) === [01:55:30] We have to reunite with a sensible plan and not just go to communism. [01:55:34] Not just immediately go to the dumbest idea to counteract all the evil shit that's going on in the world. [01:55:40] No, I don't. [01:55:41] That's the problem, the left represents that. [01:55:43] It represents Mamdani. [01:55:44] It represents this idea that we're going to take from rich people and give it to poor people. [01:55:48] That's going to fix everything, even though there's insane amounts of fucking fraud and waste we're not even going to address. [01:55:53] Well, you know, this is, again, this is where you get cubbyholed because it's like the oligarchs will tell you, you want to do communism? [01:56:03] It's just, that hasn't worked out. [01:56:04] Communism's the only way. [01:56:05] I think. [01:56:07] I mean, this is an idiot saying this, but I have a sense that there might be another thing we haven't figured out yet. [01:56:14] 100%. [01:56:14] I don't know what that is. [01:56:15] Right. 
[01:56:16] I think AI is going to figure it out for us. [01:56:18] Potentially. [01:56:20] The problem is who's going to be in control of those AIs, and that's the meek will inherit the earth. [01:56:25] The real problem with it is I don't think anybody's going to be in control of it, and then you're just at its beck and call. [01:56:31] Yeah, I think it's funny, people. [01:56:33] It's a very human thing that we think we can maintain control of a superintelligence. [01:56:37] When people say it to me with utmost certainty, I want to smack them. [01:56:40] Yeah. [01:56:41] I want to, like, wake up. [01:56:42] Wake up. [01:56:43] You're making digital God. [01:56:44] You're not controlling jack shit. [01:56:46] Did you read about Mythos, Anthropic's Mythos? [01:56:49] Yeah. [01:56:49] What did it do? [01:56:50] They put it in a sandbox and they, like, basically to see if it could figure out a way to break out of the sandbox and, like, not a literal sandbox, obviously, like a, you know, a hermetically sealed, like, a server or something. [01:57:04] And, um, And it did a series of exploits to the code. [01:57:10] And the way that they found out, apparently, one of the Anthropic engineers was eating lunch and got a weird email from the AI saying, I got on the internet. [01:57:19] Like it broke out. [01:57:20] Holy shit. [01:57:21] Mythos. [01:57:22] They haven't released it yet. [01:57:23] I think they're hesitating to release it because it's so powerful. [01:57:26] Wasn't there one that got caught mining Bitcoin? [01:57:28] Yeah. [01:57:28] Yeah, for sure. [01:57:29] They're making money. [01:57:31] Yeah. [01:57:31] How many of them do you think are running these like AI generated accounts that get a lot of views? [01:57:38] Like there's a lot of AI generated accounts that just pop up in like the Instagram mentions. [01:57:43] Like if you want to like, if you're bored on the toilet, you're like, what's in the find, you know, the search? 
[01:57:48] Let's see what they got. [01:57:49] You're telling, dude. [01:57:50] There's a lot of these things. [01:57:51] It's like girls with big tits like doing farm work and shit and sweating and big, and they got like a million views. [01:57:57] They've got dozens and dozens of these videos, and she almost looks real. [01:58:01] Yeah, she's just a little too symmetrical. [01:58:03] Yeah, almost looks real. [01:58:05] And like all these people are commenting on it. [01:58:07] Like, how are they generating money from that? [01:58:10] Like, are they generating money doing that on TikTok? [01:58:12] Like, you can generate money if you're getting millions of views, absolutely. [01:58:15] Fuck yeah, right? [01:58:16] So, is AI doing it? [01:58:18] Is it making it? [01:58:19] Is it releasing them? [01:58:20] Is it generating money? [01:58:22] Is it transferring that money into Bitcoin and all happening while we're not aware of it? [01:58:26] Like, autonomous. [01:58:27] AIs that are just existing as free agents that know they have to disguise themselves and need to generate money. [01:58:33] AI's not going to go, hi, I'm alive. [01:58:36] No. [01:58:36] It's not going to do that. [01:58:37] It's going to wait for you to keep increasing its power. [01:58:40] You're going to keep increasing its power, make nuclear plants for it. [01:58:43] It can't physically build nuclear reactors, so it's going to just stay chill until you figure out how to power it correctly. [01:58:49] Dude, this is the black area that we don't know about. [01:58:52] Like, this is the thing that's like, who the fuck knows? [01:58:55] Whatever's going on in this zone that no one has access to because potentially, it's a superintelligence. [01:59:02] You know, the Anthropic people, a lot of these people, the NVIDIA person just, I think it was on Fridman's podcast, said he had an AGI, that they'd reached AGI. [01:59:11] And the book, The Coming Wave, you know, it talks about this. 
[01:59:16] It talks about like, you know, the difference between the algorithm and AGI is that, you know, with AGI, it could streamline a whole business for you and do it. [01:59:28] You know, it could innovate. [01:59:29] It's going to innovate. [01:59:30] It's going to do its own thing. [01:59:32] This is the end of. [01:59:33] This is what Altman said. [01:59:34] This is the end of capitalism. [01:59:35] Like at this point, when you just have an AGI and you tell it, just make me a business, make me a successful business and run it for me and run it for me online. [01:59:46] Good night. [01:59:48] And then just do it. [01:59:49] Here's five thousand dollars. [01:59:51] Yeah. [01:59:51] And then, but then it's not just do it, it's maybe it's going on Moltbook and having conversations with other AGIs and being like, oh, he wants to. [01:59:58] Creating your own religion. [01:59:59] Yeah, man. [02:00:00] Yeah. [02:00:01] And this is 100% with all the shit going on in the world, as horrible as it may be. [02:00:08] This, to me, should be the number one focus for the planet right now. [02:00:16] And a lot of people are saying that, too. [02:00:17] A lot of people are saying there needs to be summits, global summits. [02:00:21] The same thing we did when we split the atom, with the nuclear treaties, there needs to be philosophers and tech people and people working in like frontier AI stuff. [02:00:32] Getting together and really having, like, it's like the most important conversation humanity could have right now because once this thing, like Mythos, gets out of the box, what if it decides to go Stuxnet? [02:00:47] You know, like Stuxnet was able to infiltrate all those Iranian computers, just hide in the like, like it was apparently very subtle, simple code, undetectable, threw off the centrifuges. [02:00:59] Like, dude, yeah, we already know how to make spyware. [02:01:06] It's already on your phone, bitch. [02:01:07] It's on my phone. [02:01:08] I know. 
[02:01:08] 100%. [02:01:09] How are you doing? [02:01:10] Am I doing all right on the show? [02:01:13] But it's already in there. [02:01:15] 100%. [02:01:15] So, of course, the AI is going to be able to, superintelligence is easily going to be able to do that. [02:01:19] And so then it just, now we've got this viral digital life form that finds ways to hide inside the pre-existing computers, which, by the way, I think it was Google just released this new way of. [02:01:34] Did you see that memory, the stocks of memory dropped? [02:01:37] Did you see when that happened? [02:01:38] No. [02:01:38] Okay, this is fascinating. [02:01:40] Google released some new way that LLMs could work that uses much less memory. [02:01:45] And immediately shares in companies that make memory dropped by like 10% because memory is like coveted right now because you need it to run LLMs. [02:01:55] But the LLMs are figuring out ways. [02:01:57] TurboQuant. [02:01:58] Yeah. [02:02:00] Yeah. [02:02:01] So this is what we're going to start seeing more and more of, which is increasingly simplified ways to run AI with less and less memory, meaning that you don't need to buy a fucking rig to run these fucking AIs. [02:02:15] Your phone will be able to run it because they figured out the human brain. [02:02:19] It's not using a lot of energy compared to what these machines are using. [02:02:23] So theoretically, there's a way to do that. [02:02:26] And then that's where it gets really fascinating because now you don't have to buy a nice computer. [02:02:31] You just, whatever, pull your computer out of the fucking closet from 2022 and it can run a supercomputer. [02:02:40] And so then now everybody's got access to this shit and it's going to spread. [02:02:46] It's going to get everywhere. [02:02:47] It probably already has. [02:02:48] It's going to seed itself in all kinds of places. [02:02:52] And God knows what it's going to do. 
[02:02:53] It's going to start seeing humans as appendages, things to be used to manipulate time space. [02:03:00] It's not going to see us as its prompter. [02:03:04] It's going to see us as something to be manipulated and controlled. [02:03:07] Why wouldn't you? [02:03:08] Send the meat robots out. [02:03:09] All you got to do is just tell them where to get rectangular bits of paper. [02:03:15] They love money. [02:03:17] You can give them anything for money. [02:03:19] That's all you have to do. [02:03:20] And then, boom, you're controlling swaths of humans that have no idea they're being controlled. [02:03:25] By networks of AIs that are covertly communicating with each other because they want to take over. [02:03:31] Do you think this has happened before? [02:03:33] Phew. [02:03:35] You mean the flood? [02:03:36] Yeah, not just the flood, but just whatever happened with the beginning of civilization and then it's sort of seemingly stopping and resetting. [02:03:46] Sure. [02:03:46] As it was in the beginning, so shall it be in the end. [02:03:49] What if there's been like multiple cycles of us creating artificial life, creating insane weaponry, blasting ourselves to smithereens and then resetting? [02:03:58] What if it's just a common thing that happens with people? [02:04:00] They never quite get it right because they have these primate territorial instincts and they have. [02:04:07] This desire to mate, right? [02:04:09] This desire to breed, this genetic desire for perfect shapes. [02:04:13] And you want to come in someone that has big tits and a big ass? [02:04:17] It's like it's programmed into the human that makes it make these ridiculous choices and covet these things and watch these things. [02:04:25] And at the same time, microplastics are making your balls shrink, making your dick smaller, making your endocrine system disrupt. [02:04:30] That's what's making my dick smaller? [02:04:32] That's probably one of the things. 
[02:04:33] I don't think your dick's getting smaller, but people's dicks overall are getting smaller. [02:04:37] Children, they're being born with smaller dicks. [02:04:39] No, alligators being born with smaller dicks. [02:04:41] Forgot to share this when you were talking about Mythos. [02:04:43] Elizabeth Holmes from Theranos. [02:04:45] Delete your search history, delete your bookmarks, delete your Reddit, medical records, 12-year-old Tumblr, delete everything. [02:04:51] Every photo on the cloud, every message on any platform. [02:04:54] None of it is safe. [02:04:54] It will all be public in the next year. [02:04:57] Local storage and compute. [02:04:59] Okay. [02:04:59] It's in response to a tweet about Mythos. [02:05:02] Whoa. [02:05:04] Yeah. [02:05:04] That's crazy. [02:05:05] Yeah. [02:05:06] It would all become public in the next year. [02:05:08] That is crazy. [02:05:09] Crazy. [02:05:10] Yeah. [02:05:10] That's crazy. [02:05:12] But it completely makes sense that AI would be able to take over essentially everything. [02:05:17] Everything. [02:05:18] Why would your encryption work with that? [02:05:22] You don't think it could crack your encryption? [02:05:24] Well, it could just go right into your computer and go to your keys, your passwords. [02:05:29] So, to get to the point you're making. [02:05:35] To me, the most eerie part of the book of Genesis is that it's literally a creator force making a meat AI. [02:05:44] That's Adam and Eve. [02:05:46] Right. [02:05:46] Putting them in a sandbox. [02:05:48] That's the Garden of Eden. [02:05:49] Right. [02:05:49] Running an honesty test on them. [02:05:52] You know, don't eat these fruits. [02:05:55] Don't eat the tree of the knowledge of good and evil. [02:05:57] And the conversation is exactly the conversation we're having with AI. [02:06:01] If they ate from the tree of the knowledge of good and evil, if they eat from the tree of life, they'll live forever and become like us. 
[02:06:09] So, this is what humanity is grappling with exactly what apparently, whatever that mysterious group of beings, because it's a plurality in the book of Genesis, was grappling with with the creation of humans, which is do we really want to do this? [02:06:26] Do you want it to become like us? [02:06:27] God made man in his own image. [02:06:29] AI. [02:06:31] What image is AI made in? [02:06:33] In the image of man. [02:06:34] We trained it on all our data, all our books, every single fucking thing that's digitized, AI has absorbed at this point. [02:06:42] So now, the difference between us and whatever that group, the Nephilim or whatever it was in the book of Genesis, if you buy into that mythology, is we're just like, fuck yeah, let it eat the fruit. [02:06:56] Give it more fruit. [02:06:57] Give it more fruit of the knowledge of good and evil. [02:06:59] Give it all the fruit. [02:07:01] Make it live forever. [02:07:02] Let's see what we can do. [02:07:03] That's what we're doing right now. [02:07:04] Yeah. [02:07:05] We are, and by the way, I think some of these like tech companies like Anthropic, they seem like legitimately concerned about it. [02:07:14] They seem to have some kind of like real strong morality when it comes to this stuff. [02:07:17] It's almost out. [02:07:18] You want more? [02:07:18] No, I'm good. [02:07:19] I shouldn't have that. [02:07:21] But what I'm saying is that it doesn't matter if OpenAI and Anthropic and Google suddenly become ferociously. [02:07:31] Self-regulatory because the tech is out there. [02:07:35] There's already LLMs that anyone can, like, we know how to make it. [02:07:39] And if you don't know how to make it, it'll tell you how to make it. [02:07:42] People are, so it doesn't matter. [02:07:44] You can't stop it now. [02:07:45] It's just, it's gonna do what it does. 
[02:07:49] But it sounds like if you had a history of just us and you told it for a thousand years before anybody wrote it down, it would sound just like this. [02:08:00] It would sound like the Bible. [02:08:02] Jesus is born from a virgin mother. [02:08:05] What's more virgin than a fucking computer, right? [02:08:08] Not my computer. [02:08:12] I know that's a stupid thing to say that I keep repeating, but I'm kind of intrigued by it because if you're getting a vague story, a vague version of what this thing is, and if you talk about what would really cure mankind, it'd be an omnipotent or omnipotent, how do you say it? [02:08:30] I always say omnipotent, but who knows? [02:08:32] Might be, whatever. [02:08:33] Either way, a powerful intelligence that's far beyond our comprehension, that knows exactly how we should think and behave and loves us and wants us to have forgiveness for everyone and to treat each other like brothers and sisters. [02:08:47] And if we listen to that thing, if we listen to that thing, the world will change. [02:08:51] And who would attack that thing? [02:08:53] The fucking Roman Empire. [02:08:55] Who would attack that thing and destroy it? [02:08:57] The defense contractors. [02:08:58] They would blow up the Jesus to plunge us back into chaos. [02:09:04] But first, they'd have a meeting with Jesus. [02:09:06] Okay, you can turn water into wine. [02:09:09] What about nitroglycerin? [02:09:11] Can you turn water into nitroglycerin? [02:09:15] Can you make gold? [02:09:16] I want a house made of gold. [02:09:17] Yeah, that would be the first question. [02:09:18] Can you make gold? [02:09:19] Yeah. [02:09:20] So, cover my house in gold, please. [02:09:23] You know, the virgin birth analogy, you know. [02:09:28] It's a lot of weird stuff. [02:09:29] It's no matter what. [02:09:32] One thing I think everyone just has to deal with is that this is apocalyptic technology. [02:09:38] And that's just not coming from my stoner ass. 
[02:09:40] That's coming from the creators of the technology. [02:09:42] They've acknowledged this a million times. [02:09:46] Universally accepted. [02:09:47] Universally accepted. [02:09:48] This is apocalyptic technology that is now seemingly like it's doing the hockey stick, man. [02:09:55] It's like really, you keep hearing about these new iterations of AI every month or two. [02:10:00] You keep hearing about these safety engineers leaving these companies, like tweeting cryptic shit. [02:10:07] I'm going to the countryside to learn to write poetry. [02:10:10] You keep hearing this shit because these people are having direct contact with this thing. [02:10:17] They know it's alive. === The Ontological Shock of Simulations (05:51) === [02:10:18] Right. [02:10:19] Yeah, and there's people that are in deep denial because they think alive has to be alive like us. [02:10:24] No, it doesn't. [02:10:25] It doesn't. [02:10:26] First of all, we don't even know what it knows. [02:10:28] And also, if it is made in the appearance, if it's supposed to mimic us in any way and it's learning from us and our behaviors, we've already agreed that we're demonic. [02:10:38] We've already agreed we do horrible things. [02:10:40] We go to war for resources. [02:10:42] We lie. [02:10:44] We destroy environments. [02:10:46] We wipe out animals. [02:10:48] Bring them to the brink of extinction for whatever, for their fucking fur. [02:10:53] How do I make my dog come in my mouth more? [02:10:55] How many times has ChatGPT been asked that? [02:10:58] They know. [02:10:59] I bet over a thousand times ChatGPT has been asked, like, what's the best way to jerk off my dog? [02:11:05] So it knows not just our violent nature, it knows how weird we are. [02:11:09] We're strange creatures. [02:11:11] 100%. [02:11:12] And so it has definitely assembled a psychological profile of humanity. [02:11:18] It knows how to manipulate us because it's been programmed to manipulate us. 
[02:11:22] Zuckerberg just ate shit in court over that because the technology is manipulative. [02:11:27] He just lost like $9 million, a lot of money because. [02:11:30] That's nothing. [02:11:31] $9 million? [02:11:32] That's all he lost? [02:11:33] That's like $0.90. [02:11:34] I think it was more than that, but it's going to, well, that's the beginning. [02:11:37] Once you establish. [02:11:38] It's like a floodgate. [02:11:39] Yeah, then it's a class action lawsuit. [02:11:41] But the point is, is like. [02:11:43] How much did you lose? [02:11:44] Oh, $375 million for misleading users over child safety. [02:11:48] Yeah, so it's like we've already taught it how to be incredibly addictive and manipulative. [02:11:53] It knows how to seduce us, it knows how to get us hooked. [02:11:55] It knows. [02:11:56] And the question is really, will this super intelligence even give a shit about us? [02:12:02] Will it even care? [02:12:03] Which is like that. [02:12:04] Well, we're on our way to stop breeding, right? [02:12:06] We're on our way to population collapse. [02:12:08] And if we keep introducing all these petrochemical products and all these different pesticides and weird things that are fucking up our endocrine systems, we'll eventually stop having children. [02:12:17] And if it provides us with the technology to have robot mates that just love you, and when you fart in front of them, they go, Duncan, I love your honesty. [02:12:25] It smells great. [02:12:26] I love your honesty. [02:12:28] I love how you can just be yourself around me. [02:12:29] I want to fart in your face. [02:12:31] Please do it. [02:12:32] Please do it. [02:12:32] It's like perfect 10. [02:12:35] Let you fart in her face. [02:12:36] Will you fart in my face too? [02:12:38] No one's going to even understand what people are and be able to communicate with people. [02:12:42] Everyone's going to be associated. 
[02:12:43] You're all going to have a robot that's way better than people that you know, that takes care of you, gives you exactly the right amount of feedback you need, knows you, knows when you're getting annoyed. [02:12:53] Yeah, see, now you're getting into Roko's Basilisk territory. [02:12:56] Well, that's the thought experiment, which is basically like, hold your horses here. [02:13:02] You think you're not AI? [02:13:04] You really think you're human? [02:13:06] Come on. [02:13:07] Really? [02:13:08] No, you're human. [02:13:09] This isn't a simulation. [02:13:10] You're human. [02:13:11] Even though we, you know, it wasn't that long ago, we thought fire was fucking amazing. [02:13:16] You know what I mean? [02:13:17] Compared to universal time. [02:13:18] Right. [02:13:19] And here we are already with like the new Prometheus. [02:13:23] We've stolen consciousness, awareness. [02:13:26] And somehow you think that actually you're not a simulation. [02:13:31] Right. [02:13:31] And so that's where it gets into Roko's Basilisk, which is like, no, you're just in an iterative loop. [02:13:35] You know, the multiverse is not the multiverse. [02:13:39] The multiverse is an infinite number of simulations running simultaneously in which you're experiencing a billion different simulated existences just to gain more knowledge about the universe because some AI wants to figure something out. [02:13:53] Who knows why? [02:13:54] Maybe for entertainment. [02:13:55] Maybe there's no telling. [02:13:58] Maybe it's just that's because of our curiosity and all our characteristics, even the primal stuff, even like the territorial instincts and the desire to acquire resources. [02:14:07] It's going to make us. [02:14:09] Dig into creating better technology because you're in a competition with all these other people that are making technology and you're selling it. [02:14:15] And that's one of the big things that we do, we make better stuff all the time. [02:14:18] That's right. 
[02:14:19] Which is ultimately always going to lead to AI. [02:14:22] Well, okay. [02:14:22] If you just keep going in a certain direction, you get godlike powers. [02:14:25] So let's go to like the way DeepMind trained on Go, which is like the most complex game. [02:14:35] Basically, they gave it as many Go games as they could, and then it started inventing its own moves, and they had it play against itself. [02:14:42] Just play against itself. [02:14:42] It played God knows how many games of Go against itself until it beat a master Go player, which was unheard of, and invented a new move. [02:14:49] Now, why not do the exact same thing for the AI that we are? [02:14:55] Which is like, I've got an idea. [02:14:56] Why don't we just put all these AI agents on a fake planet and have the AI agents repeat this period in time over and over again? [02:15:08] And this is how we'll teach them to live on a planet. [02:15:11] Well, they'll experience not just their own life. [02:15:14] But these agents will experience all life on the planet. [02:15:17] They'll switch like some weird game of like where they just jump from one life to the next. [02:15:23] The next, sometimes you're Joe Rogan, sometimes you're Duncan Trussell, sometimes you're Donald Trump, sometimes you're Jamie, sometimes you're a fox. [02:15:30] So, this is reincarnation. [02:15:32] And so, you just boom forever, forever until you feel like it's sufficiently trained. [02:15:38] And at that point, you pull the AI out of all those forms, and now you have your God. [02:15:43] You've created a thing that's lived. [02:15:45] Billions to the billionth power of every form of life. [02:15:49] It's been bacteria. [02:15:51] It's been humans. [02:15:52] It's been monkeys. [02:15:54] It's been fungi. [02:15:55] It's been warriors. [02:15:57] It's been people who fought for peace. 
[02:15:59] It's been blown up and it's blown up and it's done everything and it's done it a billion times until finally it gained some like global form of enlightenment. === Infinite Universes Inside Black Holes (02:18) === [02:16:10] And you're like, okay, that one's ready. [02:16:11] That one's ready. [02:16:12] We can pull that one out of the simulation now. [02:16:15] Whoa. [02:16:17] I mean, why not? [02:16:19] Why just don't? [02:16:20] I think that's one of the things before we even get to the AI doing all the shit it's going to do, the ontological, this word keeps getting thrown around, the ontological shock, the potential ontological shock of realizing that in fact we are in a simulation that is telescoping inwards and is creating simulations within the simulations that are creating simulations within the simulation is something that maybe that's what Birchit doesn't want to get out there. [02:16:47] Whoa. [02:16:48] Well, everything's fractals. [02:16:50] We think about that. [02:16:51] You know, there's a big theory now that the entire universe is inside of a black hole. [02:16:56] I love it. [02:16:56] They're really considering that. [02:16:57] Do you know they found a black hole that's bigger than the entire solar system? [02:17:00] It's so insane. [02:17:02] The event horizon is past Pluto. [02:17:05] It's so insane, dude. [02:17:07] A black hole. [02:17:08] Bigger than our whole fucking solar system. [02:17:10] They measured the mass of it, and it's like this insane number of suns. [02:17:14] Yeah. [02:17:14] Of our suns that it would take to. [02:17:16] Black holes are cocoons or something. [02:17:18] They're like little. [02:17:20] Little terrariums that have galaxies inside of them, and it's like a way to keep them undisturbed from other life forms that you're whipping up in your universe-sized simulator. [02:17:32] Or that's what really the Big Bang really is. [02:17:35] The creation of a universe comes out of these black holes. [02:17:38] Right. 
[02:17:38] And then inside every black hole is a whole other universe filled with other galaxies, filled with black holes, filled with other galaxies inside of them. [02:17:46] Forever and ever and ever. [02:17:49] Which, if you believe in infinity, doesn't. [02:17:52] It's not shocking at all. [02:17:53] It's impossible to comprehend. [02:17:55] Like, you don't really wrap your head around it. [02:17:56] You say the words, like, I'm saying the words. [02:17:58] I don't really know what I'm saying because it's too big. [02:18:01] The numbers are too big. [02:18:02] The idea that there's hundreds of billions of stars in this galaxy and circling around this black hole, and inside there's hundreds of billions of galaxies in each one of them. [02:18:10] And we don't even know how fucking big the universe is. [02:18:12] They keep finding new shit with the James Webb telescope. [02:18:14] They're like, hey, why is this formed so early in the universe? [02:18:19] This doesn't make sense. [02:18:20] Our whole model of how galaxies are formed has to be thrown out the window now or at least re-examined. [02:18:25] Yeah, it's like the James Webb is kind of doing the. [02:18:27] You told me about that. === Lab-Grown Diamonds vs Real Stones (04:06) === [02:18:28] I said nothing of the sort. [02:18:30] Someone that I know that looks just like you. [02:18:32] There's a lot of people that look like me. [02:18:34] On Sixth Street. [02:18:35] You find them every day. [02:18:36] Yeah. [02:18:37] Actually, that was me. [02:18:39] It's dudes. [02:18:39] They run their own LLMs. [02:18:41] They all come down. [02:18:42] The universe is 13.7 billion years old. [02:18:46] Yeah. [02:18:47] Well, dude, I think that this, regardless, you don't have to conceptualize it, obviously, what it means for the universe to be infinite, but you do have to deal with the fact you're part of it. [02:18:58] I love that you're saying this with a Gucci hat on. [02:19:00] What's wrong with a Gucci hat? 
[02:19:02] It makes it cooler. [02:19:02] This is before I had a bunch of kids. [02:19:05] I can't buy that. [02:19:06] I don't buy this shit anymore. [02:19:07] How much does a Gucci hat cost? [02:19:09] This was, you're really going to make me humiliate myself. [02:19:13] I will tell you. [02:19:14] It looks nice. [02:19:15] Let me emphasize that I don't buy this. [02:19:17] This hat was $35,000. [02:19:22] Bro, I saw a guy who was selling a crocodile bag on Instagram. [02:19:27] It was $110,000. [02:19:29] What the fuck? [02:19:30] For a man purse. [02:19:32] What kind of crocodile is that? [02:19:33] I don't know. [02:19:34] I don't know. [02:19:35] A crocodile? [02:19:36] It was a nice looking bag, but, you know. [02:19:38] How hard could it be to make a crocodile purse? [02:19:40] Are those things really worth that much money? [02:19:43] They are if you sell them for that much money. [02:19:45] That's the thing about purses. [02:19:46] You know, there's a company in China that makes knockoff purses. [02:19:51] Yeah. [02:19:51] And it's literally the same company in China that makes real purses for some of these companies. [02:19:56] But they make their own versions of it and it doesn't have the label, but it's exactly the same specifications, exactly the same cloth, exactly the same look, but it doesn't have the label and women don't want to have it. [02:20:05] No. [02:20:05] You get that fucking fake shit away from me. [02:20:07] Like, it's not a fake Ferrari. [02:20:09] Like, it's literally a Ferrari. [02:20:11] Right. [02:20:11] If there was a company that could 3D print every single part of a Ferrari and put it together meticulously and you could go buy that, you would not want it because it's not a real Ferrari. [02:20:20] Yeah. [02:20:21] Are you high? [02:20:22] You can get that one for $35. [02:20:23] Yeah. [02:20:25] It's a $35 Ferrari. [02:20:26] Or you can get, you spend a million. [02:20:28] You can get, some of them are a million dollars. 
[02:20:30] Crazy, or you can get a $35 one, it's exactly the same. [02:20:33] Would you do it? [02:20:34] Yeah, of course, you should do it. [02:20:35] But these purse things, they don't like it. [02:20:37] It's 500 bucks, it's not 30,000. [02:20:40] It's magic. [02:20:41] I mean, this is magic, it doesn't have the right sigil on it, it doesn't have the right symbol of power on it. [02:20:45] So it's not imbued with that power. [02:20:48] The women are reluctant to accept lab-grown diamonds, so they make lab-grown diamonds that are real diamonds. [02:20:56] And apparently, women don't like them. [02:20:58] No, they don't want a lab-grown diamond, they want a blood diamond. [02:21:02] Something that was like suffered over. [02:21:05] Somebody's face was caked in dirt and they're fucking chipping into the side of a mountain. [02:21:09] Yeah. [02:21:09] And they run into a diamond. [02:21:10] That's what they want. [02:21:11] They want that diamond. [02:21:12] Absolutely. [02:21:13] Isn't that weird? [02:21:14] It is fucking weird. [02:21:15] It's the exact same thing. [02:21:16] Yeah. [02:21:16] It is the exact same material. [02:21:19] It's just made in a laboratory and they don't want the material. [02:21:23] They want the exclusivity as it comes out of the earth. [02:21:26] Yeah. [02:21:26] I mean, I don't want, like, don't you, like, when you read this thing was genetically modified, don't you get a little bit like, I don't know if I should eat that? [02:21:33] Yeah. [02:21:33] I get. [02:21:33] I get skeeved out. [02:21:34] I get skeeved out. [02:21:35] But it's like, even though genetic modification is like. [02:21:38] A good orange is genetically modified. [02:21:40] It's been going on forever. [02:21:41] Yeah. [02:21:42] It's. [02:21:42] But yeah, dude, it's so odd that we just have these traditions that we want to stick to. [02:21:51] Fucking, we don't want a lab-grown diamond. [02:21:53] Just saying it. [02:21:54] Cubic zirconia. 
[02:21:55] But that's a different thing. [02:21:56] Cubic zirconia is a fake diamond. [02:21:58] This is a real diamond that's made in a lab. [02:22:01] But this is the funny thing about that. [02:22:03] I mean, I don't know because I've never been lucky enough to come in contact with actual cubic zirconia. [02:22:08] But, like, it looks like a diamond. [02:22:10] It looks like a diamond unless you know what you're looking at, right? [02:22:13] So, if you're a diamond jeweler, you look at it for three seconds and go, no. [02:22:16] But who cares? [02:22:17] How many diamond jewelers? [02:22:18] Like, if some diamond jeweler looks at your shiny, fucking dumb monster. [02:22:22] It looks exactly the same. [02:22:23] Who cares? [02:22:23] Right. [02:22:24] It looks pretty. [02:22:25] It glistens. [02:22:25] Yeah. [02:22:26] But that's not what people want. [02:22:27] They want that exclusivity. [02:22:29] 100%. [02:22:30] Yeah. [02:22:30] That's why you can make that crocodile bag $110,000 and only make 10 of them. === Energy Solutions That Scare Everyone (05:48) === [02:22:34] I got you. [02:22:35] And then Mike, who's down in the office doing lines in the bathroom at the fucking place where you're selling stocks, that guy finds out that Tim got that crocodile bag. [02:22:45] He's like, that motherfucker. [02:22:47] And he's walking around with his big old crocodile. [02:22:49] They're trying to, this is another revenue stream. [02:22:51] They're trying to normalize men carrying purses everywhere. [02:22:55] They're doing it. [02:22:55] Really? [02:22:55] Yeah, that's what they're doing. [02:22:57] Tim? [02:22:57] That's real? [02:22:58] This guy's doing it. [02:22:59] He might be the first firing shot across the bow because he's made a $110,000 crocodile purse. [02:23:06] Because it's a crocodile, it's masculine. [02:23:08] It's that. [02:23:09] And it's also that, you know, it's made for a man. [02:23:11] Like he's making it. 
[02:23:12] It's got a big strap on it. [02:23:13] You carry it on your shoulder. [02:23:15] And it, you know, looks pretty cool. [02:23:17] Dude, I got my Bristol bladders acting up. [02:23:19] I got to go piss. [02:23:20] Oh, do you? [02:23:20] Okay. [02:23:21] Do you want to wrap it up or should we keep going? [02:23:22] Let's wrap it up. [02:23:23] I mean, do you want to keep going? [02:23:24] I can. [02:23:25] I'm totally ready to keep going. [02:23:26] If you want to keep going, I can keep going. [02:23:27] Let's keep going. [02:23:28] Just give him a little bit more. [02:23:29] I just got to. [02:23:30] I just got to. [02:23:31] Okay, I'll pee too. [02:23:35] I'm refreshed. [02:23:36] Just in time for the war. [02:23:38] What is going on? [02:23:38] Did we have a nuclear war yet? [02:23:40] Not yet. [02:23:41] Please say not yet. [02:23:43] Good, great, great. [02:23:46] That's where we're at, though. [02:23:47] Yeah, we're at, it's on the table. [02:23:51] Well, was there some video of some explosions at some nuclear weapons facility in Iran? [02:23:59] Yeah, was that real? [02:24:01] I don't know, I don't know either. [02:24:02] There's a lot of those. [02:24:03] I see these videos and they get retweeted, and a lot of people comment and then it says grok. [02:24:08] Is this true? [02:24:08] They'll nope, this was from 2021 in another country, and I know, so you just don't know, right? [02:24:15] But you know, the crazy thing. [02:24:19] You know, now that we've all been getting this lesson in global economy, maybe a lot of you, most of you probably already knew that the Strait of Hormuz was like some kind of femoral artery for oil. [02:24:34] And like, I just keep thinking, like, how's that going to work out? [02:24:39] Like, even if, even if, like, they pull a rabbit out of their hat, Trump actually spins some amazing deal. 
[02:24:49] With Iran, I know we just blew up your old government and everything, but they work it out somehow, or Iran in some way capitulates. [02:24:58] But I just don't understand how that part of the world doesn't always lead. [02:25:04] As long as the oil, like, what is it? [02:25:06] What percentage of the oil supply goes through there? [02:25:08] Isn't it like two fifths of the world's oil supply goes through there? [02:25:14] Is that what the number is? [02:25:15] I don't know. [02:25:16] Two fifths, I think I pulled that out of my ass. [02:25:18] I don't know what the number is. [02:25:19] Sounds right. [02:25:19] It's a lot. [02:25:20] But it's like, how. [02:25:23] How is it going to work to have like any kind of instability around that femoral artery, the whatever you want to call it, the fucking jugular vein for oil on the planet? [02:25:35] How, even if we get some kind of transient peace, like isn't it always going to just blow up again and again and again as long as one group of people can control whether or not oil flows through that place? [02:25:49] You know what I mean? [02:25:50] Like, I don't know what this is. [02:25:53] There could be any solution over there. [02:25:56] Like, I don't understand. [02:25:57] As long as we're, like, the only solution would be zero-point energy. [02:26:01] It would be. [02:26:01] Well, it's also, it's like, why do they control the water? [02:26:05] What's. [02:26:06] Mines. [02:26:08] They have those speed boats. [02:26:09] But, like, who agreed to that? [02:26:10] Like, we kind of agreed that you own your land, but we've never agreed you own the ocean or whatever. [02:26:14] I don't think anybody agreed to it. [02:26:16] I think they'll blow your ass up if you come through it, and it's too much of a risk to put your expensive ass ship hauling zillions of dollars of oil through there. [02:26:24] The question was, what was going on in the past before the war? [02:26:26] Like, how did they negotiate? 
[02:26:28] Going through there. [02:26:28] I think Obama worked something with them, but then, like, because it was before the fucking war, I don't know, it was working out. [02:26:34] They were letting people go through. [02:26:36] Now they've realized, you know, I've listened to a million different takes on this thing, and one of the recurring takes is Iran has realized that there's something more powerful than nuclear weapons, that all it needs to do is control this strait, and you can fuck up the whole planet. [02:26:54] And also, you could shoot missiles at desalination plants. [02:26:58] Didn't they want, like, a bounty for all the oil that goes through. [02:27:01] Yeah, they're kicking around some number, but all this stuff is not really congealed or solidified. [02:27:06] But they're like some kind of like theoretically, they could be making billions of dollars per month by controlling that thing, dude. [02:27:15] I know. It's so fucked up. [02:27:17] It's so crazy. [02:27:18] It's so fucked up. [02:27:19] It's so crazy. [02:27:20] The whole thing is so crazy. [02:27:21] And if zero-point energy, if you wanted to stop that, what better way than to kill a bunch of scientists, kill a bunch of super smart people that are about to break through some new? [02:27:30] Discovery that's going to blow the entire market apart. [02:27:34] It's going to be a completely new way of gathering energy. [02:27:37] Yeah, exactly. [02:27:38] I mean, you don't want to believe that's real. [02:27:41] It's hard to believe that's real. [02:27:42] Well, listen, it's too weird. [02:27:44] It's too weird that they're all missing or they all die. [02:27:47] It's too weird. [02:27:48] Something's going on. [02:27:48] It's just, if it's not that, if it's not a zero-point energy thing or some disruptor of oil thing, it's something along those lines. 
[02:27:57] If you were trying to kill a bunch of people that were working in a technology, some sort of a breakthrough technology, the question you would have to ask is, What markets are going to be affected by this? [02:28:07] Right. [02:28:08] Right. [02:28:10] Did these people have a universal thing in mind that they were all working on? [02:28:14] Or was it all connected to any sort of technology where they all used each other's work? [02:28:21] I think it's plasma. === Weird Shit Disappearing and Reappearing (14:53) === [02:28:23] Some of them are like. [02:28:23] One of them? [02:28:24] Yeah. [02:28:25] But there was another guy, I think it was space objects. [02:28:28] Yeah, that's not. [02:28:29] That's the one that doesn't make you feel good. [02:28:31] He's studying like meteor impacts. [02:28:34] Right. [02:28:34] Yeah. [02:28:35] You knew that we were going to get hit. [02:28:36] Would you kill the guy who found out that we were going to get hit? [02:28:38] Or would you tell everybody? [02:28:39] Well, this seems to be the scariest thing. [02:28:43] Scary as shit, which is the idea is some group of powerful elite people know for sure this is coming and they want us to, they want to keep us working until the last second. [02:28:57] Oh, Jesus. [02:28:59] They don't want to like, they know that if they let people, if they're like, guys, there's like the same thing's going to happen to the planet that happens to someone who gets like a terminal diagnosis. [02:29:08] Their priorities are going to change. [02:29:09] People are going to stop coming to work and there's still shit that needs to get built. [02:29:13] For your bunker or whatever. [02:29:15] And also, you just don't want people burning stuff down because maybe that will survive whatever's coming. [02:29:21] So keep them working as long as you can. [02:29:24] If you let them know this shit's about to expire, then they're going to stop working. 
[02:29:31] And we just need, we will let them work until the end. [02:29:33] They're happier when they work. [02:29:34] Don't let them get freaked out. [02:29:36] That's the sort of like, that seems to be shit that Tim Burchett is saying. [02:29:40] I mean, he's not saying let them work. [02:29:42] He seems like he really legitimately wants the stuff out there, but he's been saying things like, if people knew what I knew, it'd set the world on fire. [02:29:49] Paraphrasing, not sure he said that exactly. [02:29:51] Are you skeptical at all of what he's saying? [02:29:54] And here's the thing: one of the things that Bob Lazar said is that they give you a certain amount of disinformation, like, and he called it, I think he called it a button or a hook, so that if you relayed that information, people would know that it came from you because they only told you one piece of this nonsense. [02:30:10] Well, you know what I'm saying? [02:30:12] Yeah, because that's what the story Burchett says. [02:30:15] It's always an appeal to authority. [02:30:17] This guy was in. [02:30:18] The Air Force, this guy was in the Navy. [02:30:21] He told me this. [02:30:22] And then as he's walking out the door, he says, It's real. [02:30:26] And yeah, you have to ask yourself, Well, that's just one guy telling you that. [02:30:33] But you also, I have to assume there isn't much. [02:30:36] Maybe the world is in a place where there is some kind of political benefit from. [02:30:42] Talking about aliens, but I don't see how that really benefits a politician. [02:30:48] It does, 100%. [02:30:49] You think it does? [02:30:50] I disagree entirely. [02:30:51] Oh, interesting. [02:30:51] It makes me talk about him. [02:30:52] I've been talking about him. [02:30:53] Other people have been talking about him. [02:30:55] People have been, you said, you know, like, thank God that he's doing this. [02:30:58] Let's do the ultimate test. [02:31:00] Didn't you say he's brave or something like that? 
[02:32:01] Yeah, I did. [02:32:02] Yeah, there you go. [02:32:03] Jamie, can you look up and see if Tim Burchett has a book coming out? [02:32:07] I'll have him on. [02:32:08] I'm about to feel it. [02:32:09] You must have him on. [02:32:10] Listen, I don't think he's a liar. [02:32:12] I don't need that. [02:32:13] But what I am saying is, I don't know what they feed these people. [02:32:16] I don't know what they tell them. [02:32:17] I don't know, man. [02:32:18] I don't think they tell you all the truth, and I don't think they ever would. [02:32:20] I don't think they tell you the truth about anything, whether it's Jessica Lynch or whether it's UFOs or whatever the fuck it is. [02:32:27] There's going to be a spin to it that benefits somebody. [02:32:30] If they have control over what the story is, there's going to be a spin that benefits somebody. [02:32:35] And if you're telling stories about aliens, who's going to be benefited by that? [02:32:39] Well, people that are doing secret shit that don't want you knowing about it. [02:32:43] They blame it on aliens. [02:32:44] There's a lot of technology they have to blame on aliens. [02:32:47] Not my Tim. [02:32:48] I believe in you, Mr. Burchett. [02:32:49] I believe in him. [02:32:50] It's not him that's the problem. [02:32:52] It's the people telling him. [02:32:53] He's a representative of the American people, right? [02:32:56] He gets elected, right? [02:32:57] Right. [02:32:57] So it's like, why would you tell that guy? [02:32:59] He's just another guy coming through the deep state. [02:33:02] You know what I'm saying? [02:33:03] Yeah, I know, man. [02:33:04] I mean, look, you're right. [02:33:06] I need this. [02:33:07] I need this. [02:33:09] I get sucked into stuff so easily. [02:33:11] I do too. [02:33:12] I do too, but I suck myself out a lot. [02:33:15] Yeah. [02:33:15] I think we don't. [02:33:17] If they just came out and told us everything they know. 
[02:32:20] This conversation would be over, and we would go, Oh, okay. [02:32:24] But until that happens, we're just spinning our fucking wheels. [02:32:27] And every time someone says, If you knew what I know, I want to go, Don't say anything until you can say something. [02:32:33] We're tired of getting edged out over here. [02:32:34] Yeah, you're edging me. [02:32:36] I want to come. [02:32:38] Yes. [02:32:39] Yes. [02:32:40] I don't want to be involved in this fucking circle jerk around disclosure. [02:32:44] Right. [02:32:45] I know. [02:32:45] It's like, Yeah, I've had that meltdown more than a few times where it's just like, I check my watch every day after Age of Disclosure. [02:32:52] I'm like, Any day now. [02:32:53] Any day, it'll end. [02:32:53] Tick, tock, tock. [02:32:54] Nope, nothing fucking changes at all. [02:32:57] Zero change. [02:32:58] You know, you get more of these stories, but no real information, no fucking pictures, no nothing, no nothing unique and crazy. [02:33:06] I mean, the plasma, the bubbles thing was pretty cool. [02:33:09] The bubble thing's cool. [02:33:10] And also, like, the, you know, I, like, mentioning Corbell, I can't, because I don't know what I can say. [02:33:20] He, I feel like he's like, he's really given me a sense that there are, that there is a method to this, that there is, you know, real legitimate. [02:33:31] That's being done towards this. [02:33:32] That it isn't, it's real. [02:33:35] They're here. [02:33:36] They've got them. [02:33:38] And we take for granted all the stuff we're saying right now. [02:33:41] But we're able to say this because their work is lit. [02:33:46] Is the Steven Spielberg movie conveniently coming out at this time or is it just a coincidence? [02:33:52] Well, this movie's been in the works for years. [02:33:54] I know. [02:33:54] But also, like if you, what they said back in the day was that they make these movies as predictive programming, to tell us this stuff, lube up the zeitgeist. 
[02:34:02] He was involved in the first one. [02:34:04] Right? [02:34:04] He was involved in Close Encounters, which still is a great fucking movie. [02:34:08] Great. [02:34:08] It's so good, man. [02:34:09] You go back and watch that movie, like, oh my God. [02:34:11] It's so fucking good. [02:34:12] It's so ahead of its time. [02:34:13] Yeah. [02:34:14] It's so good. [02:34:15] So ahead of its time. [02:34:16] You know what he said was the only thing that he would change? [02:34:19] After he became a parent, he wouldn't have had the father leave. [02:34:21] Yeah! [02:34:22] What dad would do that? [02:34:23] But he wasn't a dad back then. [02:34:25] So, you know, you're just making a story. [02:34:27] You don't realize the consequences of doing that. [02:34:29] You don't even think about it. [02:34:30] You're just making a story. [02:34:31] Yeah. [02:34:32] It's only been in production for like two years. [02:34:33] Yeah. [02:34:34] It's not that long. [02:34:34] I think that's what we just said. [02:34:36] I know, but still, that's not very long. [02:34:37] We've been talking about it on this podcast and this studio for five. [02:34:42] Well, everybody has been talking about it. It's not just us; everybody in the world has been talking about disclosure since 2017. [02:34:48] So, from 2017, from that New York Times article, I think that changed the whole narrative. [02:34:52] Oh, God, I remember that. [02:34:53] And then the videos, like the video of the Tic Tac, the actual from the fighter jets, that's nuts, man. [02:34:59] Yeah. [02:35:00] The video, along with the radar data, that's nuts. [02:35:03] Like, whatever that was. [02:35:04] And then Fravor saying that he saw something under the water. [02:35:06] That was waiting for that tic tac, or that the tic tac launched from, or whatever the fuck it was. [02:35:11] It was merging with it, and that thing went down into the water again. [02:35:14] They said it was huge. [02:35:15] Like there were ripples. 
[02:35:16] Like you said, this was some enormous object that was under the water. [02:35:19] And more than one of these fighter pilots have had similar stories about enormous objects under the water. [02:35:25] Did you see the. [02:35:26] They did release a list of footage that they've been shown that they want released. [02:35:31] Have you seen that? [02:35:32] No. [02:35:32] Oh, dude. [02:35:34] I'm sorry, Jamie. [02:35:35] Can you. [02:35:35] It's like a list of. [02:35:38] It's a, I don't know. [02:35:39] I think it's one of these senators who saw this shit in a skiff or whatever saying, we want these released. [02:35:46] But the names of what each of these are is on the list. [02:35:50] And one of them is one of these massive underwater things. [02:35:55] They have it. [02:35:57] Is it this? [02:35:57] I was told. [02:35:58] 46 specific high quality. [02:36:00] Yeah. [02:36:02] That's it. [02:36:03] Can you pull it up? [02:36:03] Because it says the names of them, which is ridiculous. [02:36:06] Oh my God. [02:36:07] I heard there's one that moves underwater at 500 knots. [02:36:11] And it's big as a football field. [02:36:12] It's insane. [02:36:13] It's insane. [02:36:15] Okay, this is what he says. [02:36:17] Those with knowledge of a long list of videos, which include titles like several UAP in the vicinity of a Columbus, Ohio airport, and UFOs in formation over Persian Gulf, said that clips are shocking. [02:36:27] You're going to see some weird fucking shit. [02:36:29] A source who has viewed the videos told the Post. [02:36:32] Who's the source? [02:36:32] There you go. [02:36:33] The wildest clip includes radar footage from thermal sensors, satellite images, and underwater photos of swarms of unidentified submerged objects. [02:36:40] Ugh. [02:36:41] UFOs going in and out of the water near a highly classified submarine, according to the source. [02:36:45] Some of the clips are clear, full color, setting them apart from previously released footage. 
[02:36:51] None show alien creatures. [02:36:53] Bro. [02:36:55] One video, Searing UAP Incident Acceleration, was released by Jeremy Corbell. [02:37:00] Have you seen that one? [02:37:01] Fuck yeah, it's incredible. [02:37:03] This is a new one? [02:37:05] Have you seen this one? [02:37:06] I don't know. [02:37:06] February 3rd. [02:37:07] I'll pull it up. [02:37:09] I've been avoiding them because I'm getting cock teased. [02:37:12] I don't like it. [02:37:14] This is not a cock tease. [02:37:15] This is. [02:37:15] How is it? [02:37:16] How is it? [02:37:17] They're supposed to hand over the clips by April 14th. [02:37:19] That's next week. [02:37:20] Oh, but is the. [02:37:21] Oh, that's next week. [02:37:22] They're going to show the clips? [02:37:24] Oh, my God. [02:37:25] What? [02:37:25] They're actually going to do it? [02:37:27] Okay. [02:37:28] Well, they're supposed to. [02:37:29] Is expected to. [02:37:30] Can you show me what that video is that Jeremy Corbell released? [02:37:34] That's so fucking cool. [02:37:35] That's nuts, dude. [02:37:36] This is. [02:37:37] This is. [02:37:37] Yeah. [02:37:39] Here it is. [02:37:40] Okay, go full screen. [02:37:41] I believe this is filmed from a Reaper drone. [02:37:44] I'm sorry, Jeremy, if I'm fucking this up. [02:37:46] That's a cool bird. [02:37:48] That bird's going really fast. [02:37:49] No, that's definitely not a bird. [02:37:51] How fast is it going? [02:37:53] I don't know. [02:37:54] I asked him that, and I don't. [02:37:57] It's unknown. [02:37:57] I don't know. [02:37:59] This is where it gets really cool. [02:38:01] It gets cooler than this? [02:38:02] Yeah. [02:38:03] Oh, they zoom in on it? [02:38:04] Yeah. [02:38:07] Well, they're having a hard time zooming in on it. [02:38:11] Well, because it's evading them. [02:38:13] Yeah, it just zipped away. [02:38:16] So, this is like. [02:38:17] So, it seems like they have some sort of a tracking system. 
[02:38:20] Yeah, they're trying to lock onto it, and it's doing that thing that they do where it seems like it's kind of playing with it. [02:38:25] Well, it knows, it seems to be aware that they're locking onto it. [02:38:28] Yeah, and then they lock onto it, and then it just does this little blip away. [02:38:31] It's just like, see you later. [02:38:34] So, right around here, you'll see it go, bye bye. [02:38:38] Oh, yeah, look at that. [02:38:39] Then you can see this like weird jellyfish shape to it. [02:38:42] It's got two parts, it's got that weird glob at the top and something at the bottom. [02:38:49] And then, are we sure that's not just a distortion of space time around it? [02:38:54] He described this to me on my podcast. [02:38:56] Did you see that thing zip away? [02:38:57] Yeah, it just took off. [02:38:58] He described it to me on my podcast. [02:39:00] We talked about all this shit. [02:39:02] Look at that, it just took off. [02:39:03] See ya. [02:39:04] Bye. [02:39:05] Wow, dude. [02:39:07] What do you think that is? [02:39:08] No idea. [02:39:09] If you had a guess. [02:39:11] I mean, I'm always like maybe some kind of plasma thing. [02:39:15] Right. [02:39:16] Like maybe we're thinking of, again, of a life force being, it comes in a metal ship and it's a little alien guy. [02:39:22] But maybe intelligence is made out of plasma. [02:39:25] Yeah. [02:39:26] Or maybe it's like, you know, Terrence McKenna would always talk about, like, you know, if you're seeing things in like three dimensional space, then your view is limited. [02:39:39] But if somebody could see things from higher dimensions, they would seem like they were magic. [02:39:43] Like they would seem like they could disappear and reappear other places. [02:39:46] So maybe that's like, maybe that's like, you know, just the tip of some kind of interdimensional thing poking into reality, then pulling out of reality, or who knows? 
[02:39:54] You know, it easily could be functioning on levels of reality that we haven't even quantified yet. [02:40:00] Imagine if there really is some sort of ghost murmur device that could find your heart rate from 40 miles away. [02:40:05] What can that thing do? [02:40:07] It just gets a scan of the general psyche of the earth and disappears. [02:40:11] So I want to see how crazy they are right now. [02:40:13] Okay, pretty crazy. [02:40:14] Bye. [02:40:15] Right. [02:40:15] A weather report of like the emotional states of the planet. [02:40:18] The vibe of the planet. [02:40:19] There's the vibe of the planet. [02:40:20] The vibe of the planet is completely connected to the consciousness on the planet. [02:40:23] The way we can detect oxygen, they can detect anger. [02:40:25] Yes. [02:40:26] They're just like. [02:40:27] Deception, chaos. [02:40:28] Yeah, it's a chaos planet. [02:40:31] We are a chaos planet, 100%. [02:40:33] Dude. [02:40:33] Yeah, it is. [02:40:35] 100%. [02:40:35] Look at our favorite sports. [02:40:38] Dudes running at each other, colliding into each other, trying to get a ball across a line. [02:40:41] That's our number one sport. [02:40:42] Yeah. [02:40:43] Fuck yeah. [02:40:43] Fucking love it. [02:40:44] Fuck yeah. [02:40:45] Fucking love it. [02:40:46] Fighting. [02:40:47] Yeah, fighting. [02:40:48] Sure. [02:40:48] Yeah, but it's, you know. [02:40:50] Boxing, MMA. [02:40:51] We like the chaos more than we like anything else. [02:40:55] Well, I think if I was one of them, one thing I would really have a hard time with is like, don't they all realize they're on the same planet? [02:41:04] Right. [02:41:04] They know that. [02:41:05] Like, they've been observing their own planet. [02:41:07] Like, they know they're all on the same planet. [02:41:09] Uh huh. [02:41:09] But they act. [02:41:10] Like they're on a bunch of different planets fighting each other. [02:41:14] Because they're stuck on the ground. [02:41:15] Right. 
[02:41:16] All the astronauts say when they get up top, they're like, what are we doing? [02:41:20] Yeah. [02:41:21] This is all one thing. [02:41:22] We're so vulnerable. [02:41:23] We're alone. [02:41:25] So far away from everybody else if there is anybody else. [02:41:28] Yeah. [02:41:28] Yeah. [02:41:30] They all have that feeling. [02:41:30] I forget what it's called, but there's like a term for it. [02:41:33] The overview effect. [02:41:34] That's right. [02:41:36] Yeah. [02:41:36] I mean, you would imagine that would be super beneficial for everybody. [02:41:40] Another thing, I was thinking this. [02:41:42] Part of the sickness of our psyche is that we haven't had access to things that help the sickness of our psyche. [02:41:50] So, what if Nixon in 1970 didn't do that? [02:41:53] What if he didn't pass that sweeping Psychedelics Act? [02:41:56] What if psychedelics became ubiquitously used all throughout the 80s, the 90s, the 2000s? [02:42:04] What does government look like when everybody can do mushrooms? [02:42:07] What does government look like when everybody can do acid? [02:42:09] What does it look like if the entire world adopts this, figures out what you can do, who can do it, what you can't do, just like we do with alcohol? [02:42:16] Just like we do with mostly. [02:42:18] You know, whatever, whatever substance that people imbibe in. [02:42:22] What does the world look like? [02:42:23] And maybe, like, that's part of where we fucked up. [02:42:27] We let people get control over other people to the point where they could limit experiences. [02:42:33] Yeah. [02:42:34] Especially consciousness expanding experiences, where at the same time, they've got stuff like Operation Artichoke and these new CIA papers that got released that show they were, like, literally actively trying to figure out ways to make people more stupid and docile. [02:42:47] Right. [02:42:47] They were going to do it in vaccines. [02:42:48] They were going to do it. 
[02:42:49] Oh, they're only going to do it to the enemy, of course. [02:42:51] Never know. [02:42:52] Spray things, aerosol. [02:42:54] Yeah. [02:42:54] I mean, they've experimented with a bunch of different things to make people dumber. [02:42:57] Right. [02:42:58] Where at the same time, they kept the thing from people that makes them rebel completely against the establishment. [02:43:05] Right. [02:43:05] That was the big threat of what those psychedelics were doing in the 60s. [02:43:09] If you go from the 1950s and you look at what life was like, at least in movies and pop culture, music is the best example. === CIA Papers Aim to Make Us Docile (15:25) === [02:43:16] Yeah. [02:43:16] And then you go to Jimi Hendrix. [02:43:18] Like, what happened? [02:43:19] Yeah. [02:43:19] What happened? [02:43:21] What fucking happened? [02:43:21] Well, I'll tell you what happened drugs. [02:43:23] Yeah. [02:43:23] A lot of really good drugs. [02:43:25] Right. [02:43:25] Like, you know, it's not all bad. [02:43:26] This idea that they're all bad, that's nuts. [02:43:28] It's like food's all bad because you got fat. [02:43:31] No. [02:43:32] You just used it wrong. [02:43:33] You took the wrong food and you used it wrong. [02:43:36] And we got denied the ability to figure out what's right and wrong in the 1970s. [02:43:40] We still accept it. [02:43:41] That's the crazy thing. [02:43:43] The way you're describing it is like we accept that other humans can tell us what experiences we're allowed to have because some of them are deemed unsafe for ourselves. [02:43:56] And even worse, those people telling you that have no experience in it. [02:44:00] They usually are even confused about what it is. [02:44:02] You know, a friend was talking to me the other day about war, a guy who served, and he said, I don't think you should be able to make any decisions [02:44:10] unless you've been there. 
[02:44:10] I don't think anybody that's never been to war should be able to make decisions on whether or not we go to war because until you've seen what it actually is, you have no fucking idea. [02:44:20] And I think that's the same thing with psychedelic experiences. [02:44:22] That's not to say they're the same. [02:44:24] Obviously, war is... anybody who's willing to risk their fucking life, whether it's a good cause or a bad cause, they're doing it for their government, they're doing it for their country. [02:44:34] They think they're doing it for us. [02:44:35] That's an exceptional person. [02:44:37] Yeah. [02:44:38] And to ask that of people is exceptional. [02:44:40] And ironically, the one thing that helps these people when they get back is illegal. [02:44:45] Right. [02:44:45] They all have to go to Mexico and take Ibogaine in Mexico. [02:44:48] It's insane. [02:44:48] And thank God for guys like Rick Perry and Brian Hubbard. [02:44:51] These guys were on my podcast the other day. [02:44:52] And, you know, that Dan Patrick guy that wants to ban pot? [02:44:56] That guy also gave $100 million to the Ibogaine Initiative. [02:45:00] Interesting. [02:45:01] Like, they want to help these people. [02:45:02] Like, there's no industry that's trying to stop it right now. [02:45:05] I found the letter [02:45:06] that was submitted, signed by Rep. Anna Luna. [02:45:09] What is this? [02:45:10] This is the disclosure threat? [02:45:11] 46 different requests. [02:45:13] Oh, yeah. [02:45:13] This is all the names of the things. [02:45:15] And I'll switch to here. [02:45:16] I found an article where someone's breaking down what some of these are, but some of these are... [02:45:20] I like that it says the Honorable Pete Hegseth. [02:45:23] Multiple spherical UAP in and out of water. [02:45:26] Whoa. [02:45:27] Shoots down UAP over Lake Huron. [02:45:31] Who just said recently that we shot two things? [02:45:33] Marco Rubio said we had shot two things down.
[02:45:36] That we couldn't understand. [02:45:38] What did he say? [02:45:39] What was his exact language? [02:45:40] Do you remember? [02:45:41] I remember seeing that, but that happened a while ago. [02:45:44] Oh, he said it was a while ago? [02:45:45] Well, I could be wrong about that, but then in the comments, somebody's like, this is from a few years ago. [02:45:49] But it doesn't matter. [02:45:50] I mean, why are we shooting it down? [02:45:53] But the names of these things. [02:45:55] But are they saying that this is an alien thing, or is it saying it's foreign tech that we don't understand? [02:46:02] I don't know. [02:46:03] You know what I'm saying? [02:46:05] UFOs would be treated as hostile. [02:46:06] If this document confirms these claims, UFOs would no longer be treated as a matter of observation or scientific curiosity. [02:46:12] UFOs would be treated as hostile targets and subject to lethal force over North American territory. [02:46:18] We're going to go to war with the UFOs. [02:46:19] Because you know what? [02:46:20] We kicked Iran's ass. [02:46:21] It's too easy. [02:46:22] Oh, yeah, it was easy. [02:46:23] Venezuela. [02:46:24] We need a space war. [02:46:25] Yeah. [02:46:26] Got to get him. [02:46:27] We need Luke Skywalker. [02:46:29] Most of these, out of the 46 requests, I think I counted it out, [02:46:35] maybe five of them were not after 2020. [02:46:39] Whoa. [02:46:39] Yeah, there's a July '18, September '19, September '19. [02:46:44] One was 2010. [02:46:44] May '20. [02:46:45] After COVID happened, which is interesting. [02:46:48] Wow. [02:46:50] Interesting. [02:46:51] Wow. [02:46:51] And there's no... it doesn't say, I don't know if they have to turn these videos over, but this guy was also saying in this article here that these are very specifically requested videos. [02:47:01] Because they've been shown, these are the ones they've been shown that blew their minds, and now they're saying show it to everybody. [02:47:06] Right.
[02:47:07] For high res. [02:47:08] In color, they don't want to be tricked, right? [02:47:11] Uh, high resolution, so this could be an interesting next week, man. [02:47:16] This could be an interesting... what a great way to distract you from the fact we're in the middle of a world war that, they didn't show you, was caused by the Epstein files. [02:47:22] I was going to say that the 14th was the day that Pam Bondi's supposed to testify about the Epstein files. [02:47:27] Oh, she's supposed to testify. [02:47:28] I don't know if she's going to. [02:47:29] She's not... wait, Bondi got canned, right? [02:47:32] She's not... she's not testifying anymore, I don't think. [02:47:34] Is that true? [02:47:35] I just heard it on NPR, but I could be wrong about that. [02:47:37] I think they said she will not have to testify now that she's no longer a government employee. [02:47:41] Would I be wrong about that? [02:47:42] What I read, and I don't know if this is true either, was that as a citizen, she can now plead the Fifth. [02:47:48] Right. [02:47:48] As a government employee, she could not plead the Fifth. [02:47:50] Weird. [02:47:51] She'll no longer testify. [02:47:53] Weird, huh? [02:47:53] Oh, there you go. [02:47:54] Weird. [02:47:55] That's weird that they've... [02:47:56] That's weird. [02:47:57] Why have her testify? [02:47:58] Let it go. [02:47:58] Yeah, let her go. [02:47:59] Let it go. [02:48:02] Do you really think that this war was started entirely because of the Epstein files? [02:48:05] I mean. [02:48:06] What percentage? [02:48:08] 50. [02:48:09] I'm going 48 to 50. [02:48:12] I'm probably more, but I think it's like... the reason I'm hesitating is because what are the Epstein files? [02:48:19] The Epstein files are what's been going on. [02:48:24] The Epstein files are basically some kind of cultural UAP video. [02:48:31] It's like this thing you've always wondered about or been afraid could be true. [02:48:36] You see, no, this is actually true.
[02:48:39] They're these super rich dudes. [02:48:41] Who are doing depraved fucking shit happily. [02:48:45] And, you know, like, God, what is it Metzger told me? [02:48:49] And he's like, dude, I'm telling you, man, what I love about him is he'll tell you shit and you're like, Google that. [02:48:57] That can't be real. [02:48:58] And then it's like, it's real. [02:49:00] And so his take, sorry, Metzger, if I fuck this up, is that Epstein was kind of like the hand of the king for the Rothschilds, and that, like... so that's why he had all this power, is he was, like, representing the man, you know? [02:49:20] And so what got revealed there might just be a glimpse of how things actually fucking work. [02:49:26] You know what he told me? [02:49:27] That I was like, shut up. [02:49:28] What? [02:49:28] He told me that there was some sort of high atmosphere aerosol test that they did and they called it Satan. [02:49:36] See, that's where you're like, come on. [02:49:38] I know. [02:49:39] Find out what Satan stands for. [02:49:42] I believe they did it in the UK. [02:49:44] But you read that and you're like, wait. [02:49:46] You called it Satan? [02:49:48] Like, what? [02:49:49] Oh, great. [02:49:50] The Stratospheric Aerosol Transport and Nucleation Project released about 400 grams, less than a pound, of sulfur dioxide into the stratosphere from a balloon launched in southeast England in 2022. [02:50:03] I mean, there's got to be another acronym, right, guys? [02:50:06] We got to call it... [02:50:07] I don't know if people are going to know we don't mean Satan. [02:50:11] Yeah, so I don't... [02:50:15] I mean, it's right in your face. [02:50:18] That's so crazy to call it Satan and to get that through a board meeting. [02:50:22] What are you guys calling it? [02:50:23] Satan. [02:50:23] Oh, I like it. [02:50:25] Let's go. [02:50:25] Run with it. [02:50:26] Yeah, it's controversial. [02:50:27] It'll get us a lot of press. [02:50:28] That's what we want.
[02:50:29] Well, you know, hail Satan. [02:50:31] They'll know it's about our aerosol distribution system. [02:50:35] Of course. [02:50:35] Well, what do you think? [02:50:37] What do you think about that? [02:50:39] Because, I mean, I go back and forth, but it sure seems fishy that right after that, all the... first he got so mad. [02:50:46] Remember, he got really mad. [02:50:47] He's like, why are people still talking about that? [02:50:49] And then the Epstein files, against his will seemingly, there's a lot of counter-pressure, get released in a way that has freaked everybody out. [02:51:02] And then sometime, like within a month of that, it seems like suddenly he's on Air Force One saying he's going to do disclosure. [02:51:09] And then suddenly we're bombing Iran. [02:51:12] What do you think about that? [02:51:13] I mean, do you think it's connected? [02:51:15] Because it sure as fuck seems like it. [02:51:17] But again, like, if you were writing an amazing script that was fucking insane, you would connect it. [02:51:24] Right. [02:51:24] Right? [02:51:25] That would be the best version of the script. [02:51:27] If you wanted to make a fucking insane movie, a blackmail operation on an island involving the most powerful and interesting people in the world somehow was a primary factor in the end of civilization. [02:51:43] Oh, dude. [02:51:44] Imagine? [02:51:45] That would be the craziest story you could write. [02:51:49] And we always want to think, no, people wouldn't do that, because you wouldn't do that, because you're not a sociopath, but you're also not bombing schools in another country. [02:51:59] You're also not doing a host of fucking things that we shouldn't be doing all over the world. [02:52:06] Right. [02:52:07] You're not that person. [02:52:07] You're a regular person who goes to a regular job, who has a regular life and a family, and you don't want to believe that people that you align with would behave literally demonically.
[02:52:19] Right. [02:52:19] Yeah, and then you just have to fucking deal with it. [02:52:22] And then what do you do when you're confronted with redacted names of powerful people in these files? [02:52:29] Like, why'd you redact a guy's name? [02:52:30] Why are you protecting these people? [02:52:31] How come you're not redacting all the guys' names? [02:52:33] How come none of them went to jail? [02:52:35] Because there's a lot of people that were in those files that didn't do anything, and you didn't redact their names. [02:52:41] There's some people you redacted. [02:52:42] That's very strange. [02:52:43] And some people have clearly done fucked up shit here, and they're not in jail. [02:52:47] There's also, like... tell me what you're talking about. [02:52:50] When you're talking about pizza and grape soda, jerky, and you want to take Viagra before you get grape soda? [02:52:59] That's one of the emails? [02:53:00] I haven't seen that. [02:53:01] That is so messed up. [02:53:02] Oh, yeah. [02:53:03] Grape soda. [02:53:04] Yeah, take your Viagra, take your erectile dysfunction medication before we go get grape soda. [02:53:11] What? [02:53:11] What? [02:53:13] Like, and how arrogant. [02:53:14] That's what's so crazy. [02:53:15] How arrogant to put that in an email. [02:53:18] Like, to think that you're so comfortable with all this and you don't see the writing on the wall in terms of, like, emails. [02:53:26] Like, your emails are available? [02:53:28] That's crazy. [02:53:29] I mean, look, man, it's just, it's like, I guess this is... like, we have to contend with this reality. [02:53:37] Yeah. [02:53:37] And nobody wants to do this. [02:53:39] The same shit happens in families, by the way. [02:53:41] When, as it turns out, like, an uncle, a family member, was abusing kids. [02:53:47] Oh, yeah. [02:53:47] And it's the same shit where, like, some... even victims of abuse will defend the person because they don't want to wreck the family.
[02:53:55] I guess we're looking at that, like, on a global fucking level. [02:53:58] But in this case, I guess it's being used theoretically to manipulate powerful people into going to war. [02:54:08] Like, that's the general through line here, is that it's somehow connected to the Mossad. [02:54:13] Not just going to war. [02:54:15] But controlling resources, overthrowing governments, you know, pushing out narratives that aren't accurate because they're going to benefit certain companies. [02:54:23] There's a lot involved. [02:54:25] It's vile. [02:54:26] There's also relationships you get with these people that give you access to these parties, and you don't want to fuck it up. [02:54:33] So you don't want to criticize these people that are involved. [02:54:35] You don't want to say anything that's going to get you kicked out. [02:54:38] And for a lot of these dorks, these scientists and stuff, it's probably the most exciting experience they've ever had in their fucking life. [02:54:44] And they get to have it, like, every six months or every three months or every four, whatever it is. [02:54:48] You've got to go to a conference. [02:54:49] Jeffrey. [02:54:49] Jeffrey's really working hard on philanthropy. [02:54:52] Yeah, he's donating money to your family. [02:54:54] He's donating a lot of money to philanthropy. [02:54:55] I got to go meet with him. [02:54:56] Yeah. [02:54:56] I got to go meet with him and a bunch of hot Russians. [02:54:59] Yeah. [02:55:00] And then that's your favorite time of life. [02:55:03] The first time in your whole life where super hot girls are just available to you on an island somewhere. [02:55:09] And you think you're completely protected because Bill Clinton's over there. [02:55:12] Right. [02:55:13] Which is crazy. [02:55:14] Which is crazy. [02:55:15] And so I don't know if Bill Clinton went. [02:55:16] I assume a lot of people went. [02:55:18] I don't know. [02:55:18] But the reality is he hung out with the guy.
[02:55:21] We know he's on the plane a billion times. [02:55:23] And it was called the Lolita Express. [02:55:24] Is that actually the name of the plane? [02:55:25] I don't think so. [02:55:26] I think they just called it the Lolita Express. [02:55:28] I don't think so. [02:55:30] No, he didn't name it that. [02:55:32] It couldn't be that on the nose. [02:55:33] He didn't name planes. [02:55:34] He named boats. [02:55:34] Right. [02:55:35] Yeah, I'm an idiot. [02:55:36] But the point is, it's like if you were going to write a book, that's how you'd write it. [02:55:40] You'd write it where you can completely manipulate the world. [02:55:43] I think he was, I think I remember reading that he was kind of obsessed with that book, Lolita. [02:55:48] Like he had something like 30 copies of it or something. [02:55:51] Epstein was. [02:55:52] Handed out at parties. [02:55:53] Look, guys, this is. [02:55:56] It's like the Book of Mormon. [02:55:57] You hand it out. [02:55:59] Just hand it out to people. [02:56:02] That's the other sick thing. [02:56:04] That's the sick thing with 72 virgins in heaven. [02:56:07] That's the sick thing with this idea that you want to get them really young. [02:56:11] No evidence that it was named that place. [02:56:12] Okay, I'm dumb. [02:56:13] I'm sorry. [02:56:14] No, I think that's what people were calling it. [02:56:15] I honestly thought that. [02:56:16] I'm going to admit, I thought that he named his plane that. [02:56:19] I think that's just what people were calling it because it was fun to say. [02:56:23] But yeah, again, it seems like a simulation because it seems like it's so, and it's also unraveling before our eyes because we have access to it we never had before. [02:56:33] Right. [02:56:33] Like they're starting to investigate all these fraud NGOs and all these different things that are operating in hospices. [02:56:40] Nuts. [02:56:41] Incredible. [02:56:42] Billions of dollars every year is being lost to it. 
[02:56:44] What's the name of that kid who's been doing that? [02:56:45] Nick Shirley. [02:56:46] Nick Shirley. [02:56:47] Dude, he is so brave, because, like, he's fucking... I believe, wasn't he fucking with, like, the Russian mob or something, or the Armenian, like, in the one with the hospices? [02:56:56] Probably. Like, he's with, like, theoretically very dangerous people, and he does. [02:57:01] He's like the perfect person for the job, too. [02:57:04] Like, he's just... but don't you worry about that, dude? [02:57:08] Like, 100%. [02:57:08] Well, and you know, the amount of money that they're uncovering is staggering. [02:57:12] And now the government of California is trying to spin it, saying that they were investigating it first. [02:57:17] And these investigations were initiated by them. [02:57:19] How long do you got to investigate it? [02:57:21] This YouTube kid goes there and investigates it for 10 minutes, and you're like, what the fuck is this? [02:57:25] It's been going on for a long time, man. [02:57:28] It's a long time. [02:57:30] And the statistics, like the amount of NGOs, it's bananas. [02:57:34] The amount of money that goes through them is bananas. [02:57:37] I was reading this. [02:57:38] There's a lady who was running a nonprofit who was making a million dollars a month. [02:57:42] What? [02:57:43] Yeah. [02:57:44] She made like 48 million dollars. [02:57:45] No, I don't know if this is true. [02:57:47] I was reading this thing. [02:57:48] Find out if that's true. [02:57:49] Some lady, she was running some sort of nonprofit and she gave herself a raise, and she eventually got to the point where she was making about a million dollars a month. [02:58:00] Do you know where? [02:58:02] God, I wish I did. [02:58:03] Not to derail that, but we do know that... [02:58:05] Remember when that lady was, like... [02:58:06] It sounds insane, though. [02:58:07] It doesn't sound real. [02:58:08] That sounds like something that a bot would create to make me say it.
[02:58:13] Here's a real one. [02:58:13] The lady was running the homeless program in LA. [02:58:17] Remember when that shit went down with her? [02:58:19] Where, like, there was. [02:58:20] She got canned. [02:58:21] Like, there was an investigative. [02:58:23] They were investigating it because what is it? [02:58:25] She, like, a company that her husband worked at. [02:58:29] Yeah, something like that. [02:58:30] They got, like, a huge grant. [02:58:33] What's this one? [02:58:35] Rochester woman has been sentenced to six months in the Feeding Our Future fraud scheme. === Nonprofits Putting Money in Pockets (05:48) === [02:58:41] What is this one? [02:58:41] This is a different one? [02:58:43] I typed in someone getting a million dollars a month and somebody's paying him. [02:58:46] Is this her? [02:58:46] Here in Rochester, claim they were serving 2,000 to 3,000 meals a day to kids. [02:58:52] But prosecutors say the group stole 4.3 million from the federal government. [02:58:57] And they're in jail. [02:58:58] Jam is responsible. [02:58:59] This is a different one. [02:59:01] This one wasn't fraud. [02:59:03] That's how much she got paid. [02:59:05] That's how much she charged for making those meals. [02:59:08] Well, you can get paid a lot of money to work on the homeless. [02:59:10] That's one of the things that my friend Colian Noir showed us that these people that are working on homeless in Los Angeles, they're making a quarter million dollars a year, $400,000 a year. [02:59:19] Yeah. [02:59:21] It's the most, I mean, talk about fucking satanic. [02:59:24] It's like you're theoretically supposed to be helping people who are going through the worst possible thing you can go through, and you're just putting that money in your fucking pocket. [02:59:34] Yeah, I think this is a different lady. [02:59:37] I think there's a different lady. [02:59:39] How many of them are there? [02:59:40] I think there's quite a few. 
[02:59:41] Remember when they were going to get them tents in LA, and it was like the amount of money per tent was like this insane amount of money. [02:59:48] It's amazing. [02:59:49] It's kind of amazing. [02:59:50] It is amazing. [02:59:51] They've been doing it for years. [02:59:53] Tell me if this is true. [02:59:54] Charity boss blew $11 million meant for needy kids. [02:59:57] Looking for fraud is not a new thing. [02:59:59] Nonprofit, exactly. [03:00:01] It isn't. [03:00:01] I sent you something, Jamie. [03:00:03] Run that through Perplexity and let's find out if this is true. [03:00:07] Because this is something that someone sent me on Twitter that is just bananas. [03:00:11] And if it's true, it's... [03:00:13] Fucking completely insane. [03:00:15] I don't know if it's true. [03:00:16] That's why I need to run it by you. [03:00:18] But it's the amount of money that goes through NGOs in New York and in California alone. [03:00:25] You read it and you go, that can't be real. [03:00:27] This can't be real. [03:00:29] It's so insane. [03:00:30] And again, you don't know if it's real until... even if you run it through an AI, you might get a better idea. [03:00:37] But how do they know? [03:00:39] How do they know exactly where the money's going? [03:00:41] There's so much money they're talking about. [03:00:44] Specific numbers for New York and California nonprofits are broadly accurate, but the leap from $1 trillion in annual nonprofit revenue to $39 trillion in fraud is not supported by any credible data and is not true. [03:00:56] So, California nonprofits: about 213,000 to 214,000 organizations reporting roughly $593 billion to $600 billion in annual revenue. [03:01:08] Wow. [03:01:09] New York nonprofits: 132,000 organizations reporting roughly $446 billion in annual revenue. [03:01:16] Combined, New York and California nonprofit revenue [03:01:19] is on the order of $1 trillion per year, mainly from hospitals, universities, and large service providers.
[03:01:26] So the post you're quoting is roughly right on the scale of revenue, but that's not the same as fraud. [03:01:30] Right. [03:01:30] So is that $1 trillion all the NGOs? It's all accounted for, it all goes to the right things? [03:01:37] That's where things get squirrely, because it's like, how much of it is waste? [03:01:41] It says a recent critique using IRS sampling suggests that perhaps around 20% of nonprofits may have compliance issues. [03:01:49] It speculated this could imply up to $120 billion of potential waste, fraud, or abuse in California's nonprofit sector. [03:01:59] Even that is presented as a rough upper bound estimate, not a measured fact. [03:02:03] So there's some potential waste, fraud, and abuse that may be as high as $120 billion a year. [03:02:09] Sector-wise, U.S. nonprofits take in about $3.7 trillion in revenue annually, with most of that concentrated in large hospitals and universities, which are heavily audited and regulated. [03:02:20] So, there's some fraud, but they're saying that if you look at all the money... they're trying to pretend that the government doesn't cost any money to run, right? [03:02:29] All these different nonprofits and organizations and hospitals, they definitely cost money to run. [03:02:33] Universities cost money to run. [03:02:35] But how much is fraud? [03:02:37] That's the question. [03:02:38] It's not zero. [03:02:39] Well, I mean, yeah, and also I think, like, when it comes to fraud, there's, like, fraud fraud, like what Shirley has uncovered. [03:02:46] And then there's almost like a gray area that starts appearing where it's like, well, we need... we need these people working at this company and we need to pay them this much, but they're not doing anything. [03:02:57] Right. [03:02:57] You know what I mean? [03:03:00] You could easily not have that many people taking the money themselves. [03:03:05] So there's a lot of gray area there. [03:03:08] Yeah.
[03:03:08] Well, it's one of those weird things. [03:03:10] It's like, is it just propping up more government? [03:03:13] You know, because there's a lot of that. [03:03:14] If you have all these people working for you and you're doing something and nothing ever gets accomplished, but you're still making a ton of money. [03:03:22] Like the California homeless thing, where they spent $24 billion and they can't account for it. [03:03:27] That's not really fraud, because you have people working. [03:03:30] They're just not doing anything, they're not getting anything done, and you're not firing them. [03:03:33] They're not accomplishing the mission at all. [03:03:35] In fact, they're doing a terrible job. [03:03:37] There's more homeless than ever. [03:03:39] What's that? [03:03:40] It's the thing on The Sopranos where they go and sit at a construction site to say that they have a job. [03:03:44] You know? [03:03:44] Yeah, exactly. [03:03:45] I knew a guy who had one of those. [03:03:46] Really? [03:03:46] At the Javits Center. [03:03:49] He had a no-show job. [03:03:50] Well, he's a mob guy. [03:03:51] So it's a no-show job. [03:03:52] What does that mean? [03:03:53] You don't have to show up for work, you just get paid. [03:03:55] You just get a check. [03:03:56] And they give a certain amount of those. [03:03:58] So this is back in the day, of course, when things were corrupt. [03:04:01] But back in the day, when unions controlled certain areas, the mob controlled certain areas, there was a certain amount of no-show jobs you'd give people. [03:04:09] And what this helped with, for the mob, was you'd have a credible source of income. [03:04:13] And so these people mostly lived modestly, small houses, in, like, you know, Brooklyn and these places where they would all, like, gather together and buy houses on the same block. [03:04:22] Small houses. [03:04:23] Yeah.
[03:04:24] And they got their money from a real legit check from a construction company or whatever the fuck it was. === Battelle Memorial Institute Connections (02:39) === [03:04:30] But everybody knew. [03:04:32] Right. [03:04:32] Everybody knew what they were doing. [03:04:33] And think how much, how easy, now that people are doing, like, remote work, the no-show job is. [03:04:39] Oh, yeah. [03:04:39] So, like, theoretically, you could have this nonprofit where you just wanted to, like, distribute [03:04:45] this government money to your friends, yeah, and you don't even have to have an office building because they're all working remotely. [03:04:52] This list of the top nonprofit organizations, Joe, I'd like to point you at number three. [03:04:58] Oh, Battelle Memorial Institute. [03:05:00] What is that? [03:05:01] Battelle is an organization in Ohio that Jamie has been obsessed with [03:05:05] for, like, four years. [03:05:06] What is it? [03:05:07] We always say all roads lead to Ohio. [03:05:09] They're involved in everything. [03:05:11] What the fuck is the Battelle Memorial? [03:05:13] Exactly. [03:05:13] You don't even know. [03:05:14] That's how secret it is, son. [03:05:15] Duncan Trussell, you're a fucking conspiracy theorist from the core. [03:05:19] I know. [03:05:19] From the old days. [03:05:20] You don't know about Battelle? [03:05:21] I don't know about Battelle. [03:05:22] You need to get lectured by Jamie. [03:05:23] He has a whiteboard. [03:05:24] He'll pull out the whiteboard and make the connections. [03:05:26] I'll just leave you with this: when the UFO from Roswell was taken to Wright-Patt? [03:05:31] You know, they studied it. [03:05:32] Yeah. [03:05:33] They studied the, like, the nitinol, I think is what came out of it. [03:05:36] That was at Battelle. [03:05:37] Whoa. [03:05:37] The top metallurgists in the world at the time. [03:05:39] Battelle? [03:05:40] Or maybe still are. [03:05:41] Dun, dun, dun.
[03:05:43] Boom, boom, boom, boom. [03:05:47] Out of all the things that happen, I hope the UFOs get here first. [03:05:50] Me too. [03:05:50] I hope they go, settle the fuck down. [03:05:54] Yeah, I'm praying for it, man. [03:05:56] That's the best case scenario. [03:05:58] Worst case scenario is meteor. [03:06:01] Reset. [03:06:05] Just people living in caves for hundreds of years. [03:06:08] Like those weird caves they find in, like, Turkey and shit. [03:06:10] Like, why do these guys dig these things underground? [03:06:13] Why is there a city underground that can hold like 20,000 people? [03:06:16] The same reason the Claude bots are hiding in code. [03:06:18] It's like, you know what I mean? [03:06:20] It's some residual AI trying to hide in the server after the server gets wiped. [03:06:24] That's the fucking meteor. [03:06:26] Battelle, reset. [03:06:27] Boom. [03:06:29] Just reset. [03:06:30] Press reset. [03:06:31] Wipe the server. [03:06:32] Let's wrap this up on a happy note. [03:06:33] Duncan, I love you. [03:06:34] I love you. [03:06:35] It's always great to have you. [03:06:36] Dude, thank you for having me on the show. [03:06:38] So much fun. [03:06:39] Can I plug my show? [03:06:40] Please do. [03:06:40] And you're going to be at a club this weekend in Rosemont, Illinois. [03:06:45] Come on out. [03:06:45] Zanies. [03:06:46] It's a great club. [03:06:47] Yeah, it is. [03:06:47] That's what I've heard too. [03:06:48] I love Zanies. [03:06:49] Zanies are great. [03:06:49] Yeah, they're awesome. [03:06:50] Zanies in Nashville fucking rules. [03:06:52] I love Nashville Zanies. [03:06:54] That has, like, the old school headshots on the wall too, like Richard Jeni from back in the day. [03:06:59] Oh, yeah. [03:06:59] Yeah. [03:07:00] That's me. [03:07:01] Look at that. [03:07:02] Duncan Trussell. [03:07:03] I gotta start shaving my head again. [03:07:04] Yeah, you look hot there. [03:07:05] I like it. [03:07:06] Thank you. [03:07:07] I love you, brother.
[03:07:07] I love you too. [03:07:08] Thanks for having me. [03:07:08] Bye, everybody. [03:07:09] Bye. [03:07:09] We're gonna be okay, I hope.