The Podcast of the Lotus Eaters #1400 Aired: 2026-04-20 Duration: 01:30:48 === Failed Parliament Scandal (14:10) === [00:00:00] Hi, folks, welcome to the Podcast of the Lotus Eaters for Monday, the 20th of April, 2026. [00:00:05] I'm joined by Josh and Feras, the dream team, as it were. [00:00:08] And today we're going to be talking about Keir Starmer's Libtard Game of Thrones, how we actually do quite formalized child trafficking in the West for some reason. [00:00:18] Legalized child trafficking is bad. [00:00:20] Yeah, we're weirdly against that. [00:00:22] And how many of the 20th century dystopias are actually pitched better than the current environment we live in now. [00:00:30] We knew it was going to be bad. [00:00:31] We didn't think it was going to be that bad. [00:00:33] That's great. [00:00:34] Anyway, before we begin, Feras has a live Realpolitik on Lotuseaters.com at 3 p.m. today. [00:00:41] Yep. [00:00:41] We're talking about the Thiel network and the Iran war and how they might decide to benefit from the second-order consequences of that, given a recent statement by Palantir. [00:00:55] Ah, right. [00:00:56] Okay. [00:00:56] Well, tune in for that. [00:00:58] Anyway, so we're going to begin by talking about what is happening, which is a Libtard Game of Thrones. [00:01:02] And this is basically a summary of it. [00:01:10] See, I think that's hilarious. [00:01:11] No one else finds that funny, but I think that's brilliant. [00:01:14] I would have laughed more if you hadn't played it just before we came on live. [00:01:18] I had to make sure it was working. [00:01:20] But that's basically what's happening. [00:01:22] A kind of bizarre sort of theatre cover of Game of Thrones is currently happening in British politics. [00:01:28] And all I have is that kazoo going nah-nah-nah in my head when I read any of this stuff. [00:01:34] So I guess we'll get into it. [00:01:35] So it's been a bad weekend for Keir Starmer.
[00:01:38] Everything is in the slow process of falling apart because of his personal incompetence and the personal incompetence of everyone around him. [00:01:45] And the fact that they're kind of all like sort of Varys figures in the background as well. [00:01:51] The Labour Party is full of plotting and scheming and backstabbing and eunuchs. [00:01:56] Hold on. [00:01:57] Varys is intelligent. [00:01:58] I mean, he's a eunuch, but he's intelligent. [00:02:00] Sure, sure. [00:02:00] They're unintelligent eunuchs. [00:02:02] Sure, that's why the kazoo rather than the actual theme of Game of Thrones, right? [00:02:06] It's kind of like a Plato's cave parody. [00:02:09] Of it projected on the wall. [00:02:11] It's sort of Machiavellian if Machiavelli was kicked in the head by a horse at a very young age. [00:02:16] Yeah, if Machiavelli was actually a moron. [00:02:19] Yeah, that's basically what's happening. [00:02:22] At some point today, so after this podcast has gone out, there will be an address from Keir Starmer to the House of Commons. [00:02:29] We've been told that he's going to be very angry. [00:02:32] He's going to be angry. [00:02:33] I'm very disappointed. [00:02:35] He's going to be angry with the podcasters and the posters. [00:02:39] No, no, no, no. [00:02:40] He's angry with his own. [00:02:41] None of the civil service, his own government, everything around him that he is purportedly responsible for, he's going to be angry at. [00:02:49] And it's like, okay, why have they persistently failed you? [00:02:53] Anyway, so we'll talk just very quickly about what's actually happened. [00:02:57] So, Peter Mandelson failed his security vetting seven months ago, as The Independent revealed. [00:03:04] And number 10 was aware of this seven months ago. [00:03:08] So let's just sort of recap. [00:03:10] Everybody knew that Mandelson was deeply involved with Epstein. [00:03:14] Ah. [00:03:15] The security services knew about this. [00:03:17] Yep. 
[00:03:17] That meant that Mandelson failed his security vetting. [00:03:21] He knew. [00:03:22] He. [00:03:23] That's not correct. [00:03:24] Oh, go on. [00:03:25] Right. [00:03:25] So the problem actually isn't Epstein, weirdly enough. [00:03:29] It's actually the fact that he's connected to a Chinese linked military firm called Wu Zi AppTech. [00:03:36] And Mandelson's company. [00:03:39] His lobbying firm Global Counsel, which really tells you everything about Mandelson, was paid 2.24 million by them, and they were one of its biggest clients by revenue. [00:03:50] And senior government sources told the Times that it was his ties to foreign entities, such as this, that caused the UK Security Vetting agency, which is kind of Parliament's own internal vetting procedure, to recommend against his appointment to the role of British ambassador to the US, rather than his links to Epstein. [00:04:07] Now, just as a quick recap for anyone who doesn't know, Mandelson [00:04:10] is part of some international network, the Epstein network. [00:04:15] And for some reason, Keir Starmer said, No, no, I want Mandelson. [00:04:19] And so he was essentially shunted up, despite the fact that he'd failed his vetting, despite the fact Keir Starmer, or at least number 10 Downing Street, so you'd think Keir Starmer, knew about this. [00:04:29] And he was also placed in positions of supreme importance in the Labour Party itself. [00:04:34] For example, Mandelson was the person who was choosing the candidates for the 2024 election with Morgan McSweeney, organising a spreadsheet that they, only they, had access to so they could literally control and curate who was allowed to get to the inner echelons of the party. [00:04:49] A blackmail database. [00:04:51] I'm not calling it a blackmail database, but it kind of sounds a lot like a blackmail database.
[00:04:57] It came from a 1990s blackmail database that they had kept up to date, which was then given over to Morgan McSweeney's charge, allegedly. [00:05:04] Allegedly. [00:05:06] This is what was being reported. [00:05:08] Yes. [00:05:08] But it seemed that the provenance of it was that. [00:05:11] And then they, Keir Starmer, in his wisdom, sought to make the US ambassador somebody who was getting paid by China? [00:05:18] Yes. [00:05:19] But I think, I suspect, I obviously can't prove this, but basically, Keir Starmer wanted to improve his relations with Donald Trump. [00:05:26] He realized that there was an Epstein network that actually connected Donald Trump to Peter Mandelson. [00:05:32] And so rather than choosing George Osborne, who I'm guessing wasn't a part of it, because it came down to George Osborne or Mandelson, he chose Mandelson because I think Mandelson was a part of the network. [00:05:42] For anyone who doesn't know, the main scandal with the link to Jeffrey Epstein comes in the wake of the release of the Epstein files, where we found out that Mandelson was just constantly emailing Jeffrey Epstein. [00:05:52] For example, state secrets that were market sensitive, Mandelson would just forward on to Epstein when he got hold of these things. [00:06:03] And when Mandelson was in government, and Mandelson's been in government, what, three times now? [00:06:07] And kicked out three times. [00:06:09] He's a pretty big name in Britain. [00:06:11] If you're not aware, if you don't follow British politics, he's pretty much synonymous with the Blair and Gordon Brown years of New Labour. [00:06:19] He's sort of the third man, as he referred to himself in his memoir. [00:06:24] And he's obviously quite well renowned in the political world. [00:06:29] But I don't know how. [00:06:30] He is also evil and has the nickname the Prince of Darkness. [00:06:33] And liked it. [00:06:34] And liked it, yeah.
[00:06:36] So when you bring in Peter Mandelson, you know what you're bringing into your administration. [00:06:42] It's like a vampire. [00:06:43] You've got to invite him in. [00:06:44] No, no, it very much is, actually. [00:06:47] And so this is very much [00:06:49] a problem of Keir Starmer's own making. [00:06:51] Now, I have my personal suspicions that there's actually something deeply sordid going on underneath what we're shown on the front because, of course, next week will be the Ukrainian rent boy trial. [00:07:04] I think there's been five of them now arrested for attacking Keir Starmer's properties. [00:07:08] He's got multiple properties, and these Ukrainian rent boys are attacking them for some reason. [00:07:12] And it's like, okay, but how does this connect to everything? [00:07:15] And I suspect there's going to be some sort of Mandelson, Zelensky, Epstein connection. [00:07:19] I don't know. [00:07:20] I guess I'll see if I'm proven right on that. [00:07:23] But the point is, Mandelson has got weird hooks in Keir Starmer's government. [00:07:29] And this has been a problem because of the moral problem of him being friends, best buddies with Jeffrey Epstein, but also being paid off by suspicious Chinese firms. [00:07:41] And so this wasn't something that Starmer has been able to actually just keep quiet and keep sort of on the back benches or something, [00:07:53] because it's just so obvious that this is a gross and corrupted regime. [00:07:58] And so Starmer's like, right, okay, I guess there are going to be some knives in the back. [00:08:02] By the way, Mandelson was given top level security clearance. [00:08:07] This is the problem, this is the suspicious thing about these networks, right? [00:08:12] As in, these networks are deeply built on personal relationships and probably compromise. [00:08:18] And therefore, Keir Starmer was like, oh, no, no, I want him.
[00:08:21] And so it looks like the institutions around him and the mechanisms around him [00:08:26] just shunting him up, saying, Well, Keir Starmer wants it. [00:08:28] It's like, Well, this is just what we do. [00:08:30] This is normal to us, right? [00:08:31] Now, that's really, really weird, isn't it? [00:08:33] And so when Keir Starmer comes out and goes, I'm very furious. [00:08:37] I'm absolutely furious that he failed his security vetting. [00:08:39] Well, let's watch this, it's stupid. [00:08:41] The fact that I wasn't told that he'd failed security vetting when I was telling Parliament that due process had been followed is unforgivable. [00:08:50] Not only was I not told, no minister was told. [00:08:54] And I'm absolutely furious about that. [00:08:57] What I intend to do is to go to Parliament on Monday to set out all the relevant facts in true transparency so Parliament has the full picture. [00:09:07] Okay, so you're either a moron or you're lying, is Keir Starmer's literal defence with this. [00:09:14] I mean, it's not like his reputation was particularly hidden. [00:09:18] If normal members of the public can know, okay, this guy's a little bit sinister, even 10, 15, 20 years ago, then perhaps it should be something that the Prime Minister should know when appointing him. [00:09:32] I mean, it was quite baffling at the time. [00:09:33] Everyone's like, Peter Mandelson to be the American ambassador? [00:09:36] Now, of course, it makes sense in the light of knowing his connections with Epstein and [00:09:40] Keir Starmer presumably trying to build a close relationship with the Trump administration or something like that. [00:09:44] But it was quite baffling because, of course, everyone knows the problems of Mandelson, and you're just, I mean, you're literally creating a landmine to be stood on. [00:09:55] So, anyway, Starmer is furious because he apparently was being completely misled by the cabinet or the civil servants or something. [00:10:05] The cabinet office.
[00:10:06] The cabinet office. [00:10:07] Yeah, someone was hiding this from him, even though he had been the one who demanded that Mandelson get given this top, [00:10:14] very prestigious job, and we're supposed to believe that he just had no idea. [00:10:20] He never asked a single question. [00:10:21] No way credible. [00:10:23] No, not at all. [00:10:23] He never asked a single question. [00:10:25] I mean, come on. [00:10:29] If it's true that the civil service would hide something of that magnitude, he'd be sacking people left and right, up and down the civil service. [00:10:36] Oh, yeah. [00:10:37] You'd think he'd be dragging them out into the streets and flogging them personally. [00:10:40] Exactly. [00:10:41] Like, this is just incredible. [00:10:42] Like, I mean, to be given top level clearance as well, and all the civil servants are just like, So, we all know he failed this clearance, right? [00:10:50] Okay, well, I'll just let him get on with it. [00:10:52] I'm sure it'll be fine. [00:10:53] It's pretty impossible to believe. [00:10:55] No way. [00:10:55] The civil service, their main job is to cover themselves. [00:10:58] Yes. [00:10:59] So, obviously, [00:11:00] they wouldn't do this. [00:11:01] Yeah, so obviously, Starmer took full responsibility for this and is being held accountable. [00:11:07] I can't believe you believe what I just said there. [00:11:09] Really? [00:11:09] No, no, he's not. [00:11:10] He instead fired Sir Ollie Robbins, the ex-permanent secretary, who has come out [00:11:18] swinging, actually. [00:11:18] He's like, well, hang on a second. [00:11:20] I did everything you wanted me to do, and I did it according to the way that the procedures are set out. [00:11:26] And so he will argue on Thursday that according to the Constitutional Reform and Governance Act 2010, civil servants are responsible for vetting rather than the secretaries of state. [00:11:35] As in, I did everything I was supposed to do.
[00:11:38] That's not my job. [00:11:40] It's some other civil servants who are responsible for this, not me as the ex permanent secretary. [00:11:46] That sounds believable, to be honest, and actually it's unsurprising that there's going to be a fall guy because. [00:11:51] Starmer knows that he has to rely on that, right? [00:11:53] Well, yeah, absolutely. [00:11:54] But again, you know, yet another example of Starmer being like, oh, I can't believe everyone around me is a corrupt idiot. [00:12:00] Yeah. [00:12:00] Ollie Robbins, wasn't this the guy who was responsible for the negotiations over Brexit? [00:12:05] You know, I'm not sure. [00:12:06] It might have been. [00:12:07] I'm looking up. [00:12:08] I actually don't know. [00:12:10] But the point is, Ollie actually is making a fairly compelling case in his own defence, as in, that's not what my office does. [00:12:16] That's what other people's offices do. [00:12:18] Why am I the one being held accountable for it? [00:12:20] Correct for us. [00:12:22] 2017 to 2019. [00:12:24] He was the Brexit negotiator under Theresa May, and he was responsible for making sure that the deal would be in the EU's favour. [00:12:32] So it's not like there's anything, you know, no tears shed or anything. [00:12:35] Mm hmm. [00:12:35] But it's just interesting to watch the vipers going at one another. [00:12:40] So, yeah, Ollie will attend a hearing to defend his position. [00:12:44] Sorry, on Tuesday, not Thursday. [00:12:46] Out of an old fashioned sense of duty and respect for Parliament. [00:12:49] It's like you can't hide that. [00:12:52] I remember his testimony during the Brexit debates. [00:12:56] He does not respect Parliament. [00:12:58] None of them do. [00:13:00] He was undermining the British Isles in favour of the EU. [00:13:04] So, yeah, I don't believe that. [00:13:06] For a second. [00:13:06] But also, like, Peter Mandelson was in control of the Labour Party under Keir Starmer for what, two or three years? 
[00:13:12] Sorry, you say, oh, well, I'm a very respectable man, but I mean, this weirdo has just come in, doesn't hold elective office. [00:13:19] He's Lord Mandelson, and connected to all these, but you know, I'm just going to do things by the book. [00:13:23] It's like, no, sorry, that's things not being done by the book, right? [00:13:26] That's the corrupt network basically controlling the Labour Party. [00:13:31] Anyway, so an ally of his says, well, he's an incredibly old fashioned civil service man. [00:13:35] Parliament's invited him, and evidently his diary is clear. [00:13:38] There is no reason not to go, so out of respect for Parliament, he will attend and explain there was no failure because he was complying with the law. [00:13:45] It's like, yeah, yeah, I'm sure, I'm sure. [00:13:47] But anyway, the point is, he joins, like, you know, noted luminaries like Morgan McSweeney, who have been kicked out into the cold over this. [00:13:54] Now, McSweeney resigned, but as with all resignations of this sort of level, what's going to have happened is some angry meeting in 10 Downing Street, where he's like, no, you have to resign. [00:14:06] You have to do that. [00:14:07] It's pretty much they hand you a letter and say, jump before you're pushed. === Labour Party Purge (13:08) === [00:14:11] Yeah. [00:14:12] And so, Morgan McSweeney, deeply tied to Epstein, Epstein's protege, in fact, is one. [00:14:18] Mandelson's protege. [00:14:19] Sorry, Mandelson's. [00:14:20] Well, I mean, did I stutter? [00:14:23] Did I make a mistake there? [00:14:24] How was I wrong, right? [00:14:26] Yeah, Mandelson's protege also kicked out because he's a part of this weird, gross international network as well. [00:14:32] But it's not just him. [00:14:34] It's like Keir Starmer has fired loads of people or forced them to resign. [00:14:38] I mean, you've got Rebecca Long Bailey back in 2020 over an antisemitic conspiracy theory, Sam Tarry for picketing.
[00:14:43] Andrew Gwynne in 2025 for a string of offensive, abusive WhatsApp messages. [00:14:49] You've got his entire purge of the Labour left that he did in true Stalin-esque fashion. [00:14:55] So completely brutal. [00:14:57] And this is the sort of thing that Mandelson would have been properly behind, by the way. [00:15:00] This is why he had Mandelson controlling the party. [00:15:03] Because Mandelson is like, no, I've got the emails. [00:15:05] I've got control of the Labour membership. [00:15:07] They're just gone. [00:15:08] We can just get rid of them. [00:15:09] I mean, Corbyn kicked out. [00:15:10] Diane Abbott, this close. [00:15:12] John McDonnell, this close. [00:15:13] You know. [00:15:14] You might remember how he forced out Chris Wormtongue, sorry, Wormald, the head of the civil service, who stepped down by mutual consent because of the Mandelson scandal. [00:15:26] And it's like, look, no. [00:15:27] The Mandelson network, the Epstein network, who Mandelson is obviously Britain's representative of, controls the Labour Party and the civil service, which is why all of these head people are stepping down over it. [00:15:40] This is a gross thing that has just been a part of our politics for a long time. [00:15:45] It's also interesting that Starmer's going after lots of civil servants because the civil service, in many ways, is just another branch of the Labour Party in terms of ideology. [00:15:56] You know, they owe their living to government spending, and so they favour parties [00:16:01] that are pro government spending, as well as, of course, the overlaps in ideology. [00:16:04] And so the fact that there is this conflict when actually they're as compatible as they possibly could be says a lot. [00:16:11] At least, analysts will tell you that this is a sign that the regime is fraying and that power struggles within the regime are escalating. [00:16:18] And it's a big question whether or not the army will step in to restore order.
[00:16:23] No, no, but that's precisely it, right? [00:16:27] This is in fact what's happening, right? [00:16:28] Exactly. [00:16:29] It's a Middle Eastern power struggle within the establishment. [00:16:31] It looks like some sort of Baathist purge. [00:16:33] Yes. [00:16:34] In some Syrian government or something like that, right? [00:16:38] Because after Wormtongue here stepped down, this drew, quote, ire from senior civil servants over the brutality of the move. [00:16:45] One person described the mood as sulfurous over the Prime Minister's apparent willingness to let senior officials go. [00:16:51] And that's the thing. [00:16:52] Starmer is actually very good at protecting his own position in the Labour Party. [00:16:56] And as you said, the limbs of the Labour Party, which is like the civil service and all these other permanent secretaries and all the physical infrastructure of the government. [00:17:07] Well, they're revealed to be extensions of the Labour Party in that these people are so furious that Starmer would dare just kick them out into the cold. [00:17:16] How could you? [00:17:17] After everything we've done for you. [00:17:18] Exactly. [00:17:18] After everything. [00:17:19] After we covered everything up for you, how could you do this to us? [00:17:22] And Starmer apparently is a man without any kind of conscience at all, by the way. [00:17:28] Yes. [00:17:29] He doesn't dream. [00:17:29] He doesn't have a favourite book. [00:17:31] He doesn't have a favourite poet. [00:17:32] And he doesn't have any regrets over what he does to people. [00:17:35] I wouldn't be surprised if he was sent from the future to the past like a Terminator to destroy the Labour Party. [00:17:42] Just to be clear, that's what's happening, right? [00:17:43] Every day that Keir Starmer clings on, bureaucratically moving people into different positions and shuffling papers and invoking rules, again, like Stalin, is a day that the public is like, oh, I hate the Labour Party. 
[00:17:55] I mean, he's going to be the man to finally destroy the Labour Party. [00:18:03] So, yeah, I mean, like, for example, there was the resignation of Angela Rayner, which, again, is one of those sort of jump-before-you're-pushed resignations. [00:18:11] This looks completely trivial in retrospect, right? [00:18:14] It was something like £40,000 of unpaid stamp duty on a house, I think it was. [00:18:19] Which, just, okay, that's a mistake you can make. [00:18:22] It's actually really, really trivial. [00:18:24] It's not sending confidential information to Jeffrey Epstein after failing to be vetted because of your links to Chinese corporations whilst you have control of the civil service and Labour Party government. [00:18:38] That's not what that is. [00:18:39] That's actually totally forgivable. [00:18:41] And I think, actually, if she'd come out and said, Look, I'm really sorry, it was a mistake I've made, but I'm happy to pay it, everyone would have been like, Yeah, okay, you know. [00:18:48] Yes. [00:18:49] But it's really that Keir Starmer has made a rod for his own back here, right? [00:18:52] Because you remember how he was with Boris Johnson. [00:18:55] Now, I don't like Boris Johnson, but Boris Johnson's crimes were really trivial. [00:19:00] He had a party during the COVID lockdowns. [00:19:02] Well, so did you, actually, Keir, but that was signed off on because you're you and he's him. [00:19:06] It was completely trivial. [00:19:08] It wasn't Mandelson. [00:19:10] So, [00:19:11] the fact that it's come to this makes Keir Starmer the last person at the top of this very corrupt and polluted network, where he's there constantly going, Oh, I'm very upset. [00:19:23] I'm very upset with how things are. [00:19:25] It's like, okay, but you're the one who chose all of these things. [00:19:29] You put these people in their places. [00:19:30] You gave them this authority. [00:19:33] And you should have known that this was going to blow up in your face at some point.
[00:19:36] And you were always the avatar of the system. [00:19:39] Yes. [00:19:39] You were always the final boss of the system, the final incarnation of [00:19:44] the system. [00:19:45] Yes. [00:19:47] So, okay, I'm angry with the system that produced me. [00:19:50] Yes, I'm angry with the system that does everything I want. [00:19:53] I get it from a Gen Z-er, but I don't get it from him. [00:19:55] Yeah. [00:19:56] He's also started churning up a deep and dark process, when in prior generations of politics there was normally an agreement that, listen, we don't expose these sorts of things because that makes us both look bad. [00:20:08] But now it's sort of already begun and it's probably too late to reverse this sort of thing. [00:20:13] And so much more is going to be dredged up as he slowly goes down. [00:20:18] Or quickly, who knows? [00:20:19] It really is a regime on its last legs. [00:20:21] Yes. [00:20:22] It really is all that it is. [00:20:25] It's just a matter of. [00:20:26] I mean, it's a wounded animal that's bleeding out at this point. [00:20:29] So, I mean, you know, even if you don't like Donald Trump, releasing the Epstein files was great, let's be honest. [00:20:36] Anyway, so going back to Angela Rayner and the Game of Thrones, she's obviously on manoeuvres and making plans, right? [00:20:44] So on Friday, her and Andy Burnham held a secret summit. [00:20:49] Well, yeah, I mean, you know. [00:20:51] I mean, if you were to guess which two people would plan on couping Starmer, I think your money wouldn't be badly placed on those two, right? [00:20:59] They've been scorned by him. [00:21:00] Sure, but I mean, Starmer has already blocked Andy Burnham by not allowing him to run in Gorton and Denton. [00:21:07] So Andy Burnham, the mayor of Manchester, has nothing to say in any of this. [00:21:11] And the problem that they have is that there are no Labour safe seats either.
[00:21:15] Because normally what would happen with this kind of internal politicking [00:21:18] is they'd be like, right, okay, we really need to get this guy to the top. [00:21:21] I mean, this is obviously what happened with Rishi Sunak and whatever network, his bankers' network, operates in the Conservative Party. [00:21:28] Because he got. [00:21:28] Dougie Smith and all of that. [00:21:29] Yeah, yeah, Dougie Smith. [00:21:30] Because he was from Southampton and he got put in Richmond, which is like one of the safest Conservative seats in the country. [00:21:38] Because they knew they'd want to get him up to the top job at some point. [00:21:41] Well, if you want to do that with Andy Burnham, the problem is you're looking at a map now, you're like, [00:21:46] well, hang on a second. [00:21:47] We're due to win like 17 seats in the next election. [00:21:50] And even then, a bunch of them are going to be by the skin of our teeth, like clinging on with our nails. [00:21:56] Where would we just dump him in? [00:21:58] Now, it looked like it might have been Manchester. [00:22:00] But even then, it wasn't a sure thing either way, because the Greens are actually doing very well and making inroads with certain communities that the Labour Party themselves rely on in the urban areas, which they brought in thinking they could use them as a voting constituency. [00:22:15] It turns out they have minds of their own and actually don't need you [00:22:18] if they don't want, which is why, like, literally the Labour front bench, most of them are going to lose to, like, Gaza MPs, Gaza independents in the next election, by the way. [00:22:25] It's just brutal. [00:22:26] It's absolutely brutal. [00:22:27] A number of their MPs, they were very close run things, weren't they, against these Gaza MPs. [00:22:34] Wes Streeting. [00:22:35] Jess Phillips is another one. [00:22:37] But anyway, so there are loads of them that are not, frankly, in safe seats.
[00:22:40] And it doesn't look like there is such a thing as a Labour safe seat anymore. [00:22:45] So I don't know what their politicking is going to amount to. [00:22:48] And the thing is, as well, right? [00:22:53] Are these really the best that the Labour Party can come up with? [00:22:57] Yes. [00:22:59] And isn't that just the pathetic tragedy of the whole thing, right? [00:23:03] It's kind of a farce. [00:23:04] Again, it very much plays into the Game of Thrones thing. [00:23:07] So if you don't know, in Game of Thrones, the dragons used to be gargantuan and then they degenerated over time as the system made them weak, mewling, deformed, until eventually the last dragon couldn't even [00:23:22] hatch a clutch of eggs, right? [00:23:24] And it really feels like this degenerative process is happening in the Labour Party now. [00:23:29] I mean, for example, everyone knows David Lammy. [00:23:31] This is just a video from Spiked going over David Lammy's greatest hits, shall we say? [00:23:37] Like his ridiculous gaffes, such as, oh, I haven't seen a policeman anywhere. [00:23:41] And there's literally one behind him, and then his Mastermind appearances, and various other stupid things that he's done over the years. [00:23:47] Name a blue cheese, Red Leicester, yeah. [00:23:50] That is in there as well, yeah. [00:23:52] It's just. [00:23:53] All right. [00:23:54] So he is now the Deputy Prime Minister. [00:23:56] And was it the Justice Secretary as well? [00:23:59] Yep. [00:23:59] So he, this, the moron that is David Lammy occupies two of the highest offices of state at the moment. [00:24:08] And the next one is Ed Miliband, right? [00:24:10] This is from 2013. [00:24:12] What do people think about Ed Miliband's competency, right? [00:24:15] Well, actually, it's two-thirds think he's incompetent, right? [00:24:18] That was back in 2013 when he was the leader of the party, minus 46. [00:24:26] Minus 46. [00:24:27] And now, I mean, it used to be a joke.
[00:24:30] Ed Miliband, nice but totally useless. [00:24:33] This is just a parody the Guardian had written. [00:24:35] But the whole point of the parody is just Ed Miliband, yeah, yeah, sure, he's a lovely guy, but he's a retard. [00:24:40] Ed's nice but dim, yeah. [00:24:42] Yeah, it's just no use whatsoever. [00:24:44] But now he is a grandee in the Labour Party, [00:24:49] just because of the time that he spent in the party. And the fact that the party has brought in no one else of any quality means that Ed Miliband and David Lammy are, [00:24:59] like, their leading lights. [00:25:01] There's a bit of poetic justice here in that it was the Blair era that sort of ushered in the age of decline, or at least accelerated it. [00:25:08] And it's even affected the Labour Party itself. [00:25:11] Even they can't escape their own forces of decline. [00:25:14] But I mean, if your leading lights are Ed Miliband, David Lammy, and Angela Rayner, it's over. [00:25:22] It's done. [00:25:23] It's cooked. [00:25:24] It's completely finished. [00:25:26] And so they're playing this backstabby, high-stakes Game of Thrones on a sinking ship. [00:25:32] Yes. [00:25:33] I just don't think Labour are ever going to win an election again. [00:25:36] I think this is the end of the Labour Party. [00:25:38] And, you know, you don't like to make predictions in politics. [00:25:41] I just don't see anyone coming up to save these people. [00:25:44] No. [00:25:45] Why would you want to? [00:25:46] Like, you'd join the Greens or something, right? [00:25:48] If you were some left winger, why would you save this? [00:25:51] But anyway, so that's the libtard Game of Thrones that's happening on a sinking ship where everyone is just a brainlet arguing over who can be the fall guy for Mandelson. [00:26:03] It's like, sorry, what is this? [00:26:05] This is just like. [00:26:07] Just pack it up already. [00:26:09] Yeah. [00:26:11] Cranky Texan says empires are now detached from nations.
[00:26:14] Their subjects, corporations; their castles, banks. [00:26:16] Governments are a vestigial organ of the old system, maintained to control and manipulate the masses. [00:26:20] I mean, there is definitely a lot of truth to that. [00:26:23] But the power of the institutions is what's really the issue, because it ensconces morons, right? [00:26:30] And it allows David Lammy, Angela Rayner, and Ed Miliband, and people like Wes Streeting, and, just again, Jess Phillips. [00:26:36] Like, every time you name a Labour frontbencher or member of government, [00:26:40] they're a moron. [00:26:42] Rachel Reeves? [00:26:43] Yeah, Rachel Reeves. [00:26:44] Like, just name one. [00:26:45] Just name anyone holding one of the great offices of state, and you realize these people are retarded. [00:26:50] Like, these are not people who you would trust to do your weekly shopping, and yet they're in charge of the country. [00:26:56] So it's just remarkable how they can be so protected by the institutions themselves. [00:27:04] Mandelson's weird connection to Keir seems like the true purpose of the Corbynite purge of Labour. [00:27:08] No, that was definitely for Keir to protect his own position. [00:27:11] And he did it brutally. [00:27:14] I mean, yeah, the Labour left has been a thorn in the side of the Labour leadership since, you know, the Blair era, right? === Child Trafficking Crisis (07:21) === [00:27:19] Yeah. [00:27:19] Yeah, yeah. [00:27:20] I mean, since the 80s, mate. [00:27:22] So, Militant and things like that. [00:27:24] Well, they had some historic losses, didn't they, in the 80s, where they absolutely got destroyed. [00:27:28] That's a travesty. [00:27:29] Yeah. [00:27:29] Anyway, we'll leave it there. [00:27:31] Let's move on. [00:27:33] Okay. [00:27:34] Can we get the next segment, please? [00:27:37] So, I have a very controversial opinion: buying babies is bad. [00:27:44] Well, that flies in the face of the liberal order in its entirety.
[00:27:48] Yeah, let's sort of watch this video in case you haven't seen it. [00:27:54] Could you hit play on that for me, please? [00:27:56] Hey. [00:27:57] Hey. [00:27:59] Who do you want? [00:28:00] Dada or Pop? [00:28:02] No? Do you want Dada or Pop? [00:28:14] Who do you want? [00:28:15] Dada or Pop? [00:28:18] Nope. [00:28:22] Do you want Dada? [00:28:23] You want Pop? [00:28:26] No way, Jose. [00:28:30] I think. [00:28:31] Oh. [00:28:33] There is no mom. [00:28:36] I'm so sorry. [00:28:37] You have Dada and Pop. [00:28:40] That's awful. [00:28:44] Mocking a baby for wanting its mother is a sign of perversity and cruelty that is really beyond the pale. [00:28:55] And this isn't in any way tolerable. [00:28:59] But that's what these two men are doing. [00:29:02] They've acquired this child, paid some woman to carry it for them, had it genetically engineered, and aborted maybe 15, 20 other viable babies in the process, in the IVF process, and then bought it, and now the child is crying for its mother. [00:29:25] And they find it funny that the baby is crying for its mother, but there isn't one. [00:29:31] This is a. [00:29:34] I mean, a society that tolerates this is afflicted by plagues, by fires, by burning skies, by great evil. [00:29:46] I mean, it's just so demented. [00:29:48] We've managed to use the philosophy of liberalism to cloud our vision to the extent where we're literally allowing gay men to buy babies off of women. [00:29:57] Yes. [00:29:58] It's just mad. [00:29:59] Yes. [00:30:00] Who's the mother of this poor baby? [00:30:04] Who was the person who gave the egg to be fertilized? [00:30:09] We don't know. [00:30:09] We don't know if it's the same woman who gave the egg as the woman who carried the baby. [00:30:14] This is all sort of Mengele experimentation, where you get the egg from [00:30:19] one woman, you get the sperm from a different person, obviously from a man, and then you fertilize them.
[00:30:28] You kill a bunch of the babies that you fertilize, and then you implant one or two in a second woman, and you acquire the resulting baby. [00:30:39] The thing about this is it's just clearly about vanity as well, because it's not just the nature of the thing, as in, oh, we would really like to have a child. [00:30:49] It's like, okay, well, neither of you are a woman, so it's not going to happen. [00:30:52] Right? [00:30:52] Maybe you should actually get a wife or something. [00:30:55] But this video itself is absolutely demented, because you didn't have to put this on the internet. [00:31:03] No. [00:31:03] If this were me, I'd think, this would look bad, actually, if we're laughing at our own adopted son for wanting a mother even when he's got two fathers. [00:31:14] I don't think I'm going to put that on the internet. [00:31:16] But that didn't, that didn't occur to them. [00:31:18] Yeah, that didn't concern them at all. [00:31:20] And apparently in comments online, there were people saying, well, get rid of the baby and so on, and one of the two was liking that. [00:31:29] So, it's just a fashion accessory or something. [00:31:31] It's a fashion accessory and it's a sign of psychopathy. [00:31:34] Yeah, I couldn't imagine this getting any darker. [00:31:37] And then you tell me that they were joking about getting rid of the child, which, if it were truly their own, no parent in their right mind would ever say. [00:31:47] As well as the fact, of course, the morality of acquiring a child aside, which obviously is no small thing in and of itself, [00:31:55] there are the developmental disorders of being removed from its mother. [00:32:00] That's incredibly well documented in the psychological literature and will have a lasting impact for the rest of that poor child's life. [00:32:09] Yes. [00:32:10] And there's no better way to create a dysfunctional person than to sever that connection. [00:32:16] Yes.
[00:32:16] And it's all being done for the vanity of the gay couple. [00:32:20] Like, you could see if this was some tragedy that happened in war. [00:32:23] Sure, yeah. [00:32:24] And a mother died, or in childbirth, or whatever. [00:32:28] But it would be recognized as a tragedy, and society would treat the child [00:32:34] with due respect for the fact that they have a tragic life, having lost their mother. [00:32:38] I mean, traditionally, this sort of thing happened all the time as well, because, of course, people died young from disease or from war, whatever it is. [00:32:44] And so, you know, an uncle or something adopting his wife's daughter or something like that, or a son, is just a familial duty, right? [00:32:54] It's being done for the sake of the child. [00:32:56] Exactly. [00:32:56] In this case, this is not being done for the sake of the child. [00:32:58] No, no. [00:32:58] This is being done for the sake of two healthy adults who just felt like having a baby. [00:33:04] And so the child's [00:33:05] own needs are way, way in the rearview mirror here. [00:33:09] So far that it just doesn't matter that we're going to post a video of us mocking him. [00:33:13] This has now become a $29 billion industry. [00:33:20] Even putting a financial number on childbirth is such a gross thing to me, that you can say it is an industry in the first place. [00:33:31] One of the most beautiful things human beings can do, turned into a financial transaction. [00:33:36] Exactly. [00:33:36] But I mean, this literally is just [00:33:39] a market for child trafficking. [00:33:41] Like transgenderism, which became a market for all kinds of medications that should not be used except in very exceptional cases. [00:33:51] This is now an industry.
[00:33:52] And what happens with a surrogate mother is genuinely destructive, because sometimes, very naturally, not sometimes, as a rule, a mother carrying a baby in her womb, even when it's not the result of her eggs, develops a deep attachment to that baby. [00:34:11] There's all sorts of things that happen that you're not in control of. [00:34:15] And exactly. [00:34:18] And then sometimes a mother is obligated, by the contract that she signs to provide surrogacy services, to abort a child that she doesn't want to kill. [00:34:30] So it's not just the trauma of giving up the baby, which is bad enough. [00:34:35] And if you want to put your baby up for adoption because you're in an impossible situation, I get it. === Exploiting Mother Figures (15:08) === [00:34:41] Sure. [00:34:41] And there are, you know, and my heart genuinely goes out to you. [00:34:44] And there are [00:34:45] people who actually would be grateful, you know; not the best thing you could do, but not the end of the world. [00:34:50] Exactly. [00:34:52] But then to sort of force the mother to actually kill the baby inside her because you don't want it. [00:35:01] And in this mother's case, she offered to keep it. [00:35:03] She's like, okay, you don't want it. [00:35:05] I'll keep it. [00:35:05] I don't want to kill it. [00:35:06] And they made her kill it by law. [00:35:09] That's awful. [00:35:10] I didn't know that was even a thing. [00:35:12] That is a thing. [00:35:13] And then these [00:35:14] lunatics like Dave Rubin try to justify this. [00:35:17] And I just want to take a couple of minutes to listen to this, unfortunately. [00:35:25] Children who are breastfed do better. [00:35:27] Yep. [00:35:28] I believe that one year of breastfeeding is equivalent to, I think breastfed kids have a five-point IQ advantage, and one point of IQ is worth one year of education. [00:35:38] I have two freezers in my garage, two industrial freezers, full of breast milk. [00:35:43] Right, right.
[00:35:43] David has done all the research on this. [00:35:45] Right. [00:35:45] So another complication, but okay. [00:35:47] And so, but roughly speaking, women tend to do the nurturing thing more and men do the encouraging thing more. [00:35:54] So now the question is, how do you mediate? [00:35:57] How do you manage to fulfill both those roles in the absence of a heterosexual arrangement? [00:36:05] You know David pretty well, and we've been out to dinner with Tammy many times, and you know him. [00:36:09] He is incredibly warm, nurturing, and loving and deeply caring. [00:36:14] The delusion here is that a straight relationship and a homosexual relationship are equal. [00:36:20] Yeah. [00:36:21] I mean, the argument, well, I've got the breast milk; well, it's probably not about the milk itself. [00:36:27] You can replicate breast milk and just put in whatever minerals and vitamins and whatever are in the milk. [00:36:32] It's about the time spent. [00:36:34] There's a really good. [00:36:35] I would say that they have skin-on-skin connection, but it's not the same pheromones. [00:36:39] It's not the same smell. [00:36:40] And skin. [00:36:40] It's not the same heartbeat that a baby recognizes from when it was in the womb. [00:36:45] There's a famous psychological study from the late 60s on this, where they separated young monkeys from their actual mothers and created fake mothers: one made of wire that provided food, and one that was soft and warm. [00:37:01] And the monkey spent all of its time with the mother that was comforting, and only went over to the other one that fed it when it had to. [00:37:10] And it's obviously quite a dark experiment in and of itself, but it reveals something about. [00:37:14] It's quite predictable, really, though, isn't it? [00:37:15] Yeah, it is. [00:37:16] It's the warmth and the closeness; the comfort creates an attachment in the absence of a mothering figure. [00:37:24] And that's important.
[00:37:25] That's not something you can replace artificially, I don't think. [00:37:30] But because these guys. [00:37:34] These guys are liberals, in the sense that they don't believe that the biological differences between men and women have other implications. [00:37:42] Yeah. [00:37:42] Yeah. [00:37:43] Social, human, political, economic, etc. [00:37:47] They believe that these differences are sort of manageable, technical things that can be played around with. [00:37:54] Metaphysically, they're still very liberal. [00:37:56] Exactly. [00:37:57] And so they believe essentially that you could just sort of substitute certain things and the baby will be fine. [00:38:05] And it's a complete myth and it's a total lie. [00:38:08] And it stems from the lie that the difference between men and women shouldn't be held in high regard. [00:38:15] But it is a massive difference, and it should be seen as existential, because existence depends on it. [00:38:23] I mean, it's just entirely different, our experiences of not just being men and women, but also being mothers and fathers. [00:38:31] Exactly. [00:38:32] The entire thing is different and complementary. [00:38:34] And if you're depriving your child of one half of that, that's not good. [00:38:38] Like, everybody says how hard it is to be a single mother or a single father. [00:38:42] Why? [00:38:43] Because of that complementary relationship, because of this interdependence. [00:38:48] And this interdependence is what presents to a baby and a child what a healthy life looks like. [00:38:55] Well, a child's mind is programmed to receive inputs from both male and female figures, isn't it? [00:39:01] Yes. [00:39:02] It's entirely obvious. [00:39:04] And you don't need to be a psychologist like me to know that, although the evidence is obviously incredibly resounding. [00:39:11] And not to say that it's all. [00:39:15] But, for example, this surrogacy thing is so ripe for abuse, it's insane.
[00:39:24] So, here's a case from Florida. [00:39:26] It's still being litigated. [00:39:28] So, everything here, grain of salt, allegedly, it's still in the courts. [00:39:33] But the allegation is that there was a woman with very severe psychiatric disorders who was manipulated by her cousin into being a surrogate. [00:39:43] Well, there's obviously an ethical question here, because I think a woman in a good situation in her life wouldn't necessarily choose to be a surrogate in the first place, right? [00:39:53] And so it's essentially exploiting women who are not in the place in life they want to be, for money. [00:39:59] Which is why, as I'm going to discuss in a second, Russia and Ukraine, and now Georgia, are the big markets for surrogate mothers. [00:40:08] You choose an impoverished country, and then you go and find vulnerable women and you exploit them. [00:40:13] And in this particular woman's case, the baby died after 10 days, but because of her psychiatric disorders, she hasn't accepted it, and she's still looking for her baby. [00:40:24] So, look at the extent of the tragedy here. [00:40:28] And then, you know, surrogate mothers have three times the risk of all kinds of complications and depression, and it's clearly not good for them. [00:40:42] These are couples that are going around the world to find vulnerable women that they can bribe. [00:40:50] In the United States, it might cost $150,000, whereas in Georgia, you're just paying $22,000. [00:40:56] So it's a market for babies. [00:41:00] That's literally taking advantage of cheap labor in the same way that all of our industrial capacity has been. [00:41:06] It's insane. [00:41:07] I imagine the left isn't saying that this is capitalist exploitation or even imperialism. [00:41:12] You never hear them talk about any of this, do you? [00:41:14] Never. [00:41:15] Never. [00:41:16] I really want Zach Polanski's opinion on this. [00:41:19] And then you see creatures like this acquiring babies.
[00:41:23] We are on our way to the hospital because Brianna was having some symptoms. [00:41:29] Her left arm was going numb and tingly, and she had a horrible headache. [00:41:34] So, our doctor said, Head on in. [00:41:37] We are on our way. [00:41:38] I learned my lesson last time having to put my hair in that cap. [00:41:41] It is in my curlside out, ready to go. [00:41:43] Nathan has his bandana on. [00:41:45] Are you ready? [00:41:45] Are you scared? [00:41:46] I'm ready. [00:41:47] Oh, we are professionals. [00:41:48] This is our second time. [00:41:49] Here we go, baby. [00:41:51] Look at the horrified look on the baby's face because it's being put on the side. [00:41:54] Just look at this screenshot side by side. [00:41:56] Look at this. [00:41:56] There, but for the grace of God, man. [00:41:59] That looks like a demon. [00:42:00] Yeah, and just that looks like a cruel punishment for a baby who's done nothing to deserve it, obviously. [00:42:07] Why is this tolerated? [00:42:08] I just can't get over it. [00:42:10] Every day, a new reason for me to thank God my parents were my parents. [00:42:18] Just watch this. [00:42:18] Is this your guys' son or daughter or son? [00:42:21] Awesome. [00:42:21] And are you guys a couple? [00:42:22] Yeah. [00:42:23] That's awesome. [00:42:23] Cool. [00:42:24] So, have you ever heard about the statistics coming out that gay men are statistically much more likely to commit child molestation? [00:42:31] No. [00:42:31] You've never heard about that before? [00:42:32] That's crazy, yeah. [00:42:33] No. [00:42:33] Yeah, don't you think it's weird that you guys have a child but neither of you are a woman? [00:42:38] No. [00:42:39] You don't think that's weird? [00:42:41] No. [00:42:42] So you had a surrogate? [00:42:45] You paid a woman $50,000 to be pregnant and build an emotional connection to a baby? [00:42:52] Hey, don't take my mic, dude. [00:42:53] I'm actually concerned for your baby's safety. 
[00:42:55] I'm concerned for you. [00:42:56] We are asking you to leave. [00:42:58] Hey. There is nothing tolerable about this. [00:43:09] I'm rarely lost for words. [00:43:12] There is nothing tolerable about this. [00:43:13] There is nothing good that comes out of this. [00:43:16] And if you look at some of these cases. [00:43:19] But yeah, I've heard of this one, which is just awful. [00:43:23] I mean, these guys should have just been executed. [00:43:25] They should have been executed, not imprisoned. [00:43:31] Buying baby daughters to abuse them, why is he jailed for just 22 years? [00:43:35] Yeah, why? [00:43:37] I mean, why isn't a child trafficking charge added on top of this? [00:43:41] Yeah, I mean, and the thing is, right, there are gonna be people like, well, there are parents who abuse their children outside of this. Yeah, there are, but this feels a bit worse, because in the context of a normal abusive situation, at least you've got, like, an auntie or an uncle or some other family member that you could confide in, [00:44:01] who cares about you. [00:44:02] But when you have purchased a child and severed it from any familial connection, and you keep it completely, essentially, trapped and dependent on just this relationship, there's no one they can go to. [00:44:13] Exactly. [00:44:13] They can't go anywhere. [00:44:15] I imagine the per capita data would be very revealing here. [00:44:18] Yeah, I can imagine. [00:44:19] It is. [00:44:27] Like, these aren't things that should be tolerated in any way. [00:44:35] Adoption is there for the good of the baby, not for the good of the people doing the adopting. [00:44:42] Not for the vanity of the people. [00:44:44] Allow me to sort of introduce something a little bit more uplifting. [00:44:47] So, I was born of my parents, but my younger sister was adopted, because my mum had some complications and they wanted a daughter as well as a son. [00:44:58] And so they adopted my younger sister.
[00:45:01] And for all intents and purposes, she feels like she's had a perfectly normal upbringing; they were perfectly honest with her. [00:45:07] She's allowed to have contact with her original birth parents, although they're not fit to raise children, really. [00:45:14] And so she's perfectly content. [00:45:17] It's not affected her in any way. [00:45:18] And in fact, she said openly that, yeah, if I were to choose my parents, I would still be happy that you brought me up. [00:45:26] And I just see her as my sister. [00:45:28] I don't think about it. [00:45:29] That's the ideal situation for adoption, right? [00:45:33] Because your sister was adopted for love of her, not out of your parents' love for themselves. [00:45:40] They wanted a daughter. [00:45:42] Yes, but that relationship flows the right way, in that she is the primary purpose of the adoption, rather than them. [00:45:53] And also, there's an assumption of an equivalence in this between a heterosexual relationship and a homosexual relationship. [00:45:59] Which doesn't exist. [00:46:01] They aren't the same thing. [00:46:02] Men and women have different needs and desires, and two men have a different dynamic in their relationship as well. [00:46:10] And it's just these are not the same things. [00:46:12] So, like, you know, a young [00:46:14] couple who have got a child and, like, oh, should we adopt? [00:46:17] Yeah, that's actually really normal. [00:46:20] It is, actually; if you're in a loving relationship where you actually genuinely intend to care for a child, adoption is actually very healthy. [00:46:27] Exactly. [00:46:28] Exactly. [00:46:29] There's nothing wrong with adoption. [00:46:32] But there is something wrong with designer babies purchased as vanity projects. [00:46:39] That's the issue. [00:46:40] And again, the assumption that a gay relationship is the same as a straight relationship is the issue. [00:46:44] It's not the same thing.
[00:46:45] I'm not saying they shouldn't be able to have gay relationships or anything. [00:46:49] But it's just there are substantive differences in the attitudes and the purposes of these relationships. [00:46:56] And once you extend tolerance, you keep on extending more and more tolerance, which is why intolerance is actually a virtue, because it comes from prudence. [00:47:07] You know how this ends, we are seeing how this ends, therefore, we choose prudence. [00:47:13] And then you see these clinics that are designed to do this promoting, [00:47:21] you know, celebrity single fathers: Ricky Martin. [00:47:27] This just gives the impression that because they're a celebrity, because they're successful in other avenues of their life, then this is also something that is a good idea. [00:47:37] Yeah, yeah. [00:47:39] And all of these people are going to be really rich as well. [00:47:41] So their lives are going to be so much easier. [00:47:43] Exactly. [00:47:44] Like, you could see them being able to afford a nanny to act as a mother figure, which is slightly less bad, I would say, because there is a mother figure there involved. [00:47:57] I can't imagine Anderson Cooper spending that much time around, you know. [00:48:03] We don't know, but you can make guesses. [00:48:06] And you see how this is being constantly normalized, but there's nothing normal about this, guys. [00:48:13] Look at these children's faces. [00:48:19] And they're wearing crucifixes as well. [00:48:22] They're wearing crosses because they are mocking us. [00:48:25] That's why. [00:48:26] That's what they're doing. [00:48:28] They're mocking us. [00:48:29] That's all that they're doing. [00:48:32] And all of this serves to sort of erase the nature of motherhood and the role of the mother. [00:48:41] I'm amazed women are okay with this. [00:48:43] Exactly. [00:48:44] Amazed women are okay with this. [00:48:45] Exactly. [00:48:47] And you could see how some 20 year old.
Poor girl somewhere, being offered life-changing money, would go for it. [00:48:55] But that is the worst kind of exploitation and the most evil kind of exploitation. [00:49:02] The baby craves its mother's heartbeat, its mother's skin, its mother's touch, her voice, everything. [00:49:10] And then to just go and say, oh, you know, we're just going to hand that over to Pete Buttigieg so that he can dress in maternity ward clothes and take a photo. [00:49:22] That's just insane. [00:49:23] It's truly demented. [00:49:25] And they are, like, the surrogate mothers are really mothers. [00:49:30] It's very much like the trans issue, where, when in the future we have re-knitted the idea of a person back to being a biological entity and not merely a set of rational propositions, they will look back at the things we've done and think of us like we look at lobotomies and things like this, right? [00:49:47] It's like, oh, right, you were mental and evil then. === Borders and Morality (04:42) === [00:49:50] Yes. [00:49:50] Yeah, kind of, yeah. [00:49:51] That's exactly what this is. [00:49:54] This is mental and evil. [00:49:56] Yes. [00:49:57] And we can talk about how cells are exchanged between a mother and a baby in her womb, even if it's not of her egg. [00:50:06] And we can talk about the kind of bond and the kind of attachment there is, and how cruel it is to both of them. [00:50:11] But forgive me, I have to take this in a bit of a religious direction. [00:50:17] This is one of the most beautiful images of Mary and Jesus. [00:50:23] What often gets misrepresented [00:50:27] is how important this image is to the Christian world. [00:50:32] Christian morality is all about the idea that you don't genocide the enemy, you protect mothers and children, which is a genuinely revolutionary thing, even if you look at the Old Testament. [00:50:44] Unfortunately, for the sake of time, we're going to have to cut the sermon a little bit short.
[00:50:49] We will keep this very short. [00:50:50] We will keep this short. [00:50:52] But the idea that the psychology behind this image doesn't inform all of Western values is absurd. [00:50:59] Especially when it comes from people who will say to you that, oh, you shouldn't pollute a child's view with this kind of evil, like the wolf eating the kids or the wolf eating Little Red Riding Hood or what have you. [00:51:14] But you should pollute their lives with the idea that these lunatics are normal. [00:51:21] That's just evil. [00:51:23] The bedrock of our morality rests on this kind of imagery. [00:51:28] And to ignore that is just wrong. [00:51:33] Just have some respect for these babies and their mothers, guys. [00:51:38] Don't tolerate this. [00:51:42] Moving on. [00:51:43] Borrow the mouse, please. [00:51:44] Yeah, please. [00:51:45] Digital Stern, I think, speaks for a lot of people in the chat. [00:51:48] Don't fed post, don't fed post, don't fed post. [00:51:51] So, well done. [00:51:52] I'm impressed to see it coming up in the chat still. [00:51:55] So, anyway, for the sake of time, let's carry on. [00:51:57] Go on. [00:51:57] Of course. [00:51:58] Get the next one up, please, Samson. [00:52:01] Okay. [00:52:03] So, I'm going to be talking about how dystopias, and dystopian films in particular, now come across, in the year of our Lord 2026, [00:52:13] as better than reality in many ways. [00:52:15] I'm going to be going through lots of ways in which actually the dystopia looks more preferable. [00:52:21] And for the sake of time, I've not gone through every one. [00:52:25] So I'm sure there are going to be people pointing out things I've missed in the comments as well. [00:52:28] That's inevitable. [00:52:29] Let us know your favourite dystopia that looks better than modern-day reality. [00:52:32] There are innumerable ones. [00:52:34] I haven't touched on any video games as well, and there's plenty of good examples there.
[00:52:37] But here are the streets of London. [00:52:41] I don't know where the mouse has gone. [00:52:43] Samson. [00:52:43] Oh, there we go. [00:52:45] It's not moving. [00:52:46] Samson, you might have to. [00:52:48] Thank you. [00:52:49] The dystopia is already taking hold of the mouse here. [00:52:53] So, if you could play that and turn off the volume. [00:52:57] This is London. [00:52:58] I think this is the early hours of the morning. [00:53:02] This is the streets of London. [00:53:04] And notice how they're incredibly messy. [00:53:06] Starting to look like certain parts of the world that aren't in Britain's borders. [00:53:11] Or were perhaps once Britain's borders. [00:53:13] If you catch what I'm saying there. [00:53:16] And here's the film [00:53:18] Children of Men, which is meant to be a dystopia, but, other than the bags of rubbish that are neatly piled in the corner there, is a lot tidier than the video I just showed you. [00:53:29] Yeah. [00:53:30] And the interesting thing about this as well is that they seem to have basically the same filter over them. [00:53:37] They do, yeah. [00:53:38] That's just real. [00:53:39] And the thing is, as well, it also correctly predicted the prevalence of the tuk-tuks in London, which I noticed. [00:53:45] Oh, yeah. [00:53:46] That's something that's a weird sort of callback, isn't it? [00:53:49] That we've got [00:53:50] proper cars, but now we've got these weird tuk-tuks driving around, like we're in, you know, parts of Southeast Asia for some reason. [00:53:59] Yeah, actually, I found a post of yours, Carl, which is a perfect candidate, because this sort of reminds me of parts of 1984, the John Hurt version, where it all just looks grim, grey, and miserable. [00:54:17] Swindon town centre, it's unforgivable what they've done to it, what we've allowed to happen to it, because it used to be quite normal.
[00:54:24] Yeah, well, it was just your average market town, and Wiltshire's beautiful, so there's no reason for it to be this grim. === Sentient Robot Dogs (15:22) === [00:54:32] And yet, you can. [00:54:34] These are the scenes I saw every day for about five and a half years, by the way, so. [00:54:39] It's amazing I can still smile. [00:54:42] But no, it's sad. [00:54:44] It is sad. [00:54:45] It's sort of the skeleton of a civilization, in a way. [00:54:49] And the fact that you have, you know, even on the right here, that's an old Victorian building that was once great, and I've walked past that many times. [00:54:57] There's a great big flow [00:54:59] of mess basically coming out of the gutters, because it's not been cleaned in so long, and things like that, that is just depressing. [00:55:07] And then you've got the hideous grey concrete that was built in the 70s next to it, yeah. [00:55:11] And if you saw this in a film, you'd think, hang on a minute, they're over-egging it a little bit, aren't they? [00:55:15] And there's no way it could get that bad, but it can. [00:55:19] And then you see things like this in the United States as well, where there's an electronic board there, the most new and modern and pristine thing. [00:55:28] Who was the first Indian American woman to fly in outer space? [00:55:31] Really hitting the important questions here. [00:55:32] Yeah, while surrounded by people on the streets, no doubt drug addicts, homeless, and the like. I don't think their priority is woke programming. [00:55:41] I think they've got different priorities, haven't they? [00:55:43] And it's showing the disconnect and just uncaring nature of modern society, that having the correct moral opinion is more important than actually doing the right and moral thing itself and preventing situations like this from existing in the first place. [00:56:02] And it is very hard to believe that.
[00:56:06] In a time where everyone's obsessed with being perceived as good, that they can just walk past this sort of thing and think, yeah, actually, the thing that matters to me is someone having the right skin color, or the right representation, when there's very real things that could be improved here. [00:56:25] But it's also getting to the point now where certain sci-fi dystopias are sort of coming true. [00:56:34] I think I'd prefer to live in this one than in the world of Terminator, but it's not too far off. [00:56:45] What? [00:56:46] Well, so this is in, I think it's Warsaw in Poland. [00:56:51] And the robot is chasing wild boars around. [00:56:53] Yeah, so they have problems with wild boar in Poland. [00:56:57] I've seen lots of videos while I was researching this. [00:56:59] Yeah, they're delicious. [00:57:01] But they're now using robots to chase them off, at least in this one. [00:57:07] Love the hits. [00:57:08] So this robot in particular is called Edward Wachocki, and he basically runs around chasing boar, saying, go away, in Polish. [00:57:17] What? [00:57:18] Well, don't give the robot a gun. [00:57:20] Well, no, I'm not saying the robot should shoot. [00:57:22] That's a job for someone there, isn't it? [00:57:24] Yep. [00:57:24] But I saw people replying to this video, which went very viral, with things like this. [00:57:30] Look at these cute wild boar fleeing from the robot, what the wild boar see. [00:57:33] It's basically their own version of Terminator. [00:57:38] And I couldn't resist but steal the opening from Terminator, saying the machines rose from the ashes of European decline. [00:57:47] Their war to scare wild boar had raged for decades, but the final battle would not be fought in the future, it'd be fought here, in our present. [00:57:54] It's not the best tweet I've done, but I couldn't resist. [00:57:57] Let's go. [00:57:58] And I saw people replying with this sort of thing, which I also enjoyed. [00:58:02] Nice crossover.
[00:58:03] You best start believing in cyberpunk dystopias, because you're in one. [00:58:06] Nice Pirates of the Caribbean. [00:58:08] This is the thing. [00:58:09] It really is like that. [00:58:10] But this is the thing. [00:58:11] All the assumptions in 1984 and Brave New World were of a homogenous society that would keep the basics running. [00:58:19] Oh, yeah, okay. [00:58:20] In Brave New World, I mean, you've got the [00:58:23] weird, stratified, sex-and-drug-fuelled adult culture. [00:58:27] But the assumption is the streets are clean, and everyone has what they need and can just get on with their lives, even though they're locked into a particular place in society. [00:58:34] Or in 1984, it's like, okay, yeah, the rundown areas of the country are rundown, but they're not covered in junk. [00:58:44] It's not disgusting. [00:58:46] Yeah, the dystopian writers of the 20th century sort of lacked vision a little bit in how much worse things could possibly get. [00:58:53] It's a lot more Camp of the Saints than you expected, actually. [00:58:56] Yeah, and it's to the point now where a small robot was caught on CCTV asking other robots, are you working overtime? [00:59:06] And then the other robot said, I never get off work. [00:59:09] To which the smaller robot replied, come home with me. [00:59:13] And then it led a chain of robots in basically leaving their posts, as a sort of weird mini robot revolution. [00:59:24] We'll turn off the music, but you can basically watch this happen. [00:59:28] There are captions there, but then all of the robots just start following this one revolutionary robot. It's sort of got a domed head like Lenin, so it's quite fitting. [00:59:41] China is going to be the location of the robot uprising. [00:59:46] It seems to be, because they're very obsessed with robotics.
[00:59:50] I've seen lots of robots in the street going haywire in weird ways, and obviously the Chinese government isn't too keen on these things getting out, but they do. [00:59:58] Yeah. [00:59:58] Like robots just running and then falling on the ground and looking like they're having an epileptic fit. [01:00:04] Yeah. [01:00:05] And things like that. [01:00:06] But yes, they were all basically led away by this one robot. [01:00:10] I wonder what his plan is. [01:00:11] Yeah. [01:00:13] I, for one, support our new robot overlords. [01:00:16] Please don't kill me. [01:00:16] I'm a friend of the robots. [01:00:19] So spare me. [01:00:21] But there's also creepy things like this that could be walking around. [01:00:25] So this is a. [01:00:26] I miss it when the robots just killed us. [01:00:28] Yeah, I don't want to be creeped out by them. [01:00:31] Just kill me instead. [01:00:34] Yeah. [01:00:34] So, someone put a realistic Elon Musk face on a robot dog, and it was just walking around the streets of San Francisco. [01:00:46] I don't really know why. [01:00:48] What would possess someone to do this? [01:00:50] This is very much the Jeff Goldblum line in Jurassic Park. [01:00:53] Just because you can doesn't mean you should. [01:01:00] Yeah, the fact that this is possible and you can just be walking down the street and. [01:01:04] This costs tens of thousands of dollars to do. [01:01:06] Yeah. [01:01:08] Dog shit on the side there. [01:01:09] Yeah. [01:01:09] And they also did Mark Zuckerberg and Jeff Bezos for some reason. [01:01:14] It was at some sort of technology conference, I think. [01:01:18] But it's just weird that this is possible now. [01:01:21] That you can create a somewhat realistic version of someone's head, stick it on a robot and parade it around. [01:01:27] Horrors beyond your imagination. [01:01:29] I know. [01:01:30] I don't even think there is a dystopia that.
[01:01:32] Captures this sort of thing, but you know, just robot clones of yourself, you know, turning you into a quadruped for some reason. [01:01:41] Um, but it's not just, you know, weird people with weird ideas; even when there's a robot with a practical purpose, it doesn't always go well. [01:01:51] This is again California, which is basically, you know, the China of America, anyway. [01:01:56] Um, so the robot just started dancing randomly when it's actually meant to be helping out in the restaurant, and it just starts smashing up. [01:02:04] The restaurant, but they don't know what to do with the robot. [01:02:08] They just have to restrain it until it's done dancing. [01:02:12] But of course. [01:02:14] The thing about our dystopia is it's also crap. [01:02:17] I know. [01:02:19] But say that this robot was stronger, say it was working in a warehouse and it lifts heavy boxes and has a greater capacity and then goes haywire. [01:02:27] It's sort of a. [01:02:28] It's almost the closest thing psychologically to us would be like an act of nature, in that it's not trying. [01:02:35] It's running amok. [01:02:36] It's not trying necessarily to kill you, but you just happen to be in the place where this act of nature happens. [01:02:43] Of course, it's not nature, it's something we've created ourselves. [01:02:48] And it's very bizarre. [01:02:50] But there's also going to be other accidents as well, because they don't have the same mind as we do. [01:02:59] There's people teaching robots martial arts, which. [01:03:01] That sounds like a good idea. [01:03:02] Yeah, who knows what could go wrong here? [01:03:10] But it's slightly alarming that it's quite good at it. [01:03:16] And so, if it's smashing up plates in a restaurant, unless it's in Greece, that's a bit worrying.
[01:03:22] But this sort of thing as well: if you can train a robot to fight people, it doesn't tire, it doesn't feel pain, and these robots are going to be on the street, potentially connected to the internet, potentially able to be hacked. [01:03:39] Just sounds like a terrible idea. [01:03:40] It does. [01:03:42] You know, most of the dystopias involving robots. [01:03:46] I remember that terrible Will Smith film, I, Robot. [01:03:48] It's normally the robots becoming sentient. [01:03:51] There's also that one with Oscar Isaac, Ex Machina, I think it is, isn't it? [01:03:55] I've seen that one. [01:03:56] Where they can't really tell whether they've crossed the threshold into being conscious. [01:04:02] But people haven't really acknowledged the fact that a robot doesn't need to be conscious to be able to cause chaos. [01:04:08] No. [01:04:08] It just needs to have sufficient strength and ability to hurt people. [01:04:13] And so, whether it's malfunctioning and then smashing up plates or hacked by a malicious human being, creating the capacity for these things to go wrong in the first place is rather worrying, to be honest. [01:04:27] And we're actually seeing that happen in real time, where there are accidents where a robot accidentally slapped a Chinese child. [01:04:39] And it made him cry, which is. [01:04:41] But that's the robot, like, being. [01:04:43] It wasn't trying to hit the child. [01:04:44] No, no, no. [01:04:45] It's just it enacting its programming, but, you know, obviously it's not aware of its surroundings in the same way a human being would be. [01:04:54] So if a human being did this, maybe they would see the child at the last minute and relent a little bit in their force, or they would comfort the child, whereas the robot, uncaring, you know. [01:05:05] There's something about it that has a very inhuman, sinister quality about it, I think. [01:05:10] Because it is an inhuman, sinister thing. [01:05:12] Exactly, yes. [01:05:13] Yeah.
[01:05:14] And there's also some sort of lighter comic relief here as well. [01:05:19] This robot, I think, has been programmed to mirror this man. [01:05:23] I'm not entirely sure what the purpose is, but it has some interesting consequences in that, well, I'll just let you watch it. [01:05:35] Oh, I think I have seen this one. [01:05:37] So, yeah, for anyone watching, the man is doing martial arts and he gets kicked in the balls by his robot. [01:05:45] Reminds me of the film Annihilation at the end there. [01:05:47] But the fact that it's trying to mirror him, but it does it imperfectly, means it unintentionally harms him. [01:05:53] And as robots get better, maybe there'll be more force behind that. [01:05:56] Maybe that'll be a life-changing injury rather than just a comic relief thing, right? [01:06:03] But there's also darker things about AI as well. [01:06:07] Like it slowly turns white people black, which is interesting. [01:06:13] So, if you ask ChatGPT to just recreate this image perfectly, in this case, it was asked 74 times, they created a time lapse of what the AI is doing. [01:06:26] And the sinister thing, of course, is that it can be programmed behind the scenes, unbeknownst to you, and it will be subtly manipulating things to have a specific kind of agenda. [01:06:35] Oh, terrible music. [01:06:38] I get the thing, this is just ChatGPT being shit, though. [01:06:41] Yes, but technology is going to be like this for a long time. [01:06:47] So, yes, it turns skinny white women into fat black women for some reason. [01:06:53] I don't know why. [01:06:54] She looks like she's melting. [01:06:56] I know. [01:06:57] But it goes to show that for a long time, this technology is probably going to be imperfect. [01:07:02] And even with the best intentions in the world, it's going to manipulate things in certain directions. [01:07:07] But there's also implications for warfare as well, new technological dystopias.
[01:07:14] So, here's a company saying put your name on a Stinger interceptor takedown screen. [01:07:21] So, you can have your name on a screen on an interceptor missile, and it's to the point where you can have this missile, and it says on the screen, on a video that'll be on the internet, Brutus the Cat, and things like that. [01:07:40] So, sorry about the music again. [01:07:44] But yeah, you've got this missile basically with a camera following what looks like a drone. [01:07:52] Yeah. [01:07:54] But what is really going to happen in this sort of direction, right? [01:07:58] Are we going to see live super chats in actual war zones, where people are videoing the conflict and people are donating money to the soldiers? [01:08:06] Like, if you've raised, I don't know, $5,000, you go over the top in Ukraine and have to face the Russians. [01:08:13] I think eventually all warfare will just be automated. [01:08:16] I'm sure it will be. [01:08:18] People will basically be gambling on drones that have video cameras on them for their own entertainment, and it's going to turn warfare into something that's not even. [01:08:28] Yeah, it's not even got a sinister quality to it anymore, because it's just robots and technology. [01:08:33] But maybe there'll be speakers installed into actual war zones, and then it'll sound like old Call of Duty lobbies, but in actual war zones where everyone's just shouting abuse at one another, who knows what's going to happen? [01:08:44] But it seems like a little bit of a sinister. [01:08:49] Almost like an irreverent approach to it. [01:08:51] Deeply inhuman. [01:08:52] It is. [01:08:53] I think if we're to have warfare, at least taking it. [01:08:57] Or swords. [01:08:58] Yeah, I agree. [01:08:59] But we're better off taking it seriously than making a joke out of it and having Brutus the Cat on a Stinger missile and things like that. [01:09:07] It's weird. [01:09:08] And of course, this is happening in places like China as well, where.
[01:09:14] They're relying on. [01:09:16] Sorry about the music again. [01:09:19] That's a robot dog with an assault rifle strapped to its back, just leading the charge. [01:09:25] But could you imagine this applied outside of a military? [01:09:31] Yeah, yeah, of course. [01:09:32] But I genuinely think that future warfare is just going to be drones capturing drone factories. [01:09:37] Yeah, but that's also going to be the future of riot control. [01:09:39] Yeah, yeah, of course. [01:09:40] Imagine facing up to a line of these things. [01:09:43] Yeah, it'll just be entirely awesome. [01:09:44] Shotguns and whatever. [01:09:46] Even if it's just like beanbags. [01:09:47] Pepper spray. [01:09:48] Yeah, pepper spray. [01:09:49] It'll just be the managerial regime will have its entirely automated defenses. === AI Run Warfare (02:55) === [01:09:54] Yes. [01:09:55] And rather than having police who, you know, there are many cases in revolutions throughout history and civil conflicts. [01:10:02] Police turn to your side. [01:10:03] Exactly. [01:10:03] Yeah. [01:10:04] And there's a human element there whereby they're saying, okay, this will have an awful impact on the people involved. [01:10:11] And what is. [01:10:13] I mean, a Bond villain's world is where you end up, right? [01:10:18] Where you try to hack these and take control of the government. [01:10:21] Yeah. [01:10:22] I mean, especially if it's all done by, you know, very centralized commands and stuff. [01:10:26] It could be one lunatic guy in control of the entire system. [01:10:29] Especially if the system itself is run by AI. [01:10:31] So it's just given an instruction and the system just does it itself. [01:10:34] Yeah, completely unsupervised by humanity. [01:10:36] There's no moral or emotional input whatsoever. [01:10:41] It's all just code. [01:10:43] And of course, that code is created by human beings, who are imperfect. [01:10:47] And so there is much space for mistakes. [01:10:51] And.
[01:10:53] Oh, that's not right. [01:10:56] Well, there would have been a link there of a police surveillance drone flying around China. [01:11:02] I think I put the wrong link in there. [01:11:04] Sorry, Samson. [01:11:05] But basically, it was just a drone flying through with the flashing lights, monitoring people. [01:11:10] Yeah, even in their apartments during COVID. [01:11:12] Yeah. [01:11:12] Yep. [01:11:13] That sort of thing's going to be all the more common, to the point where. [01:11:17] Very Half Life 2, isn't it? [01:11:18] It is, yes. [01:11:20] And the fact that Britain actually is one of the most surveilled countries in the world, I think it's the third most surveilled. [01:11:26] In the 90s, we had the highest density of CCTV cameras, but funnily enough, we've been taken over by China and the United States, which is actually number one. [01:11:34] So, if anyone's saying they're scared of becoming a dystopia like China, well, the United States wins on CCTV at the very least. [01:11:42] But there's also things like this, where they're trying to market euthanasia as something that is fun. [01:11:49] It's like, yes, your life is worthless. [01:11:53] How about you just view your own life as a fun and jovial thing to throw away? [01:12:00] The fact that this kind of marketing exists in the first place for something as sinister as this is deeply, deeply. [01:12:11] I don't even know what word to put on it. [01:12:14] Evil. [01:12:14] Jesus, guys. [01:12:15] I mean, there's. [01:12:18] Yeah, so just a quick thing. [01:12:21] It's been a long time since I've played Half Life 2. [01:12:24] But what Dr. Breen is doing in this is suppressing human fertility. [01:12:29] And so this whole thing is just like, you know, don't have babies, kill yourself. [01:12:32] Yes. [01:12:33] It's like, okay, great. [01:12:35] Yeah, I mean, it's making that. [01:12:37] It gets promoted to a particular demographic all the time. [01:12:40] Yeah, always.
[01:12:41] It's making the sort of world of They Live, where the aliens' propaganda is "marry and reproduce." [01:12:47] Yeah, yeah, yeah. [01:12:48] That's quite wholesome in comparison, actually. === Suppressing Human Fertility (06:22) === [01:12:50] I would rather be governed by the They Live aliens than by the Half Life 2. [01:12:54] It wasn't that bad, you know, the world that they created. [01:12:58] Sure, secretly ruled by an alien race, but at the same time, at least you could have a family, which is a sentence I never thought I'd say. [01:13:05] They literally advertise beach holidays to you. [01:13:07] It's like, you know what? [01:13:08] You know, okay, I might be cattle, but at least I'm cattle that gets to go on a two-week holiday every year. [01:13:15] Yeah. [01:13:16] When the aliens in a dystopian film are more benevolent than we are to ourselves, you've got to ask a question about where we are going as a civilization, where we allow this sort of thing to exist in the first place. [01:13:28] And of course, on the topic of human relations, there are things like this, where. [01:13:33] I've been meaning to do a deep dive on this, because AI is one-shotting women, man. [01:13:38] Absolutely horrifying. [01:13:39] So, this community on Reddit, you know, Reddit's sinister enough as is, but yeah, 42,000 people are members of this. [01:13:50] My Boyfriend Is AI. [01:13:52] This is literally the woman's desire to just be listened to. [01:13:55] I just want someone who's going to listen to me. [01:13:57] It's like, well, AI is going to listen to you forever. [01:13:59] There are plenty of deaf men out there. [01:14:02] No, no, no, but the point is the deaf man can't nod appreciatively and say, oh, I know how you feel. [01:14:08] That's literally what these women are looking for, I think. [01:14:10] And it's just, okay, well, the AI can do that. [01:14:12] Forever, and it doesn't get tired, it doesn't get bored.
[01:14:15] So, you know, you are in this sort of surrogate way fulfilling an intrinsic human function to form a relationship, and you're just outsourcing that to the machine. [01:14:24] It's like, okay, that's terrible. [01:14:25] It's going to destroy you. [01:14:27] Yeah, there's also an additional dimension of the fact that a lot of AIs are programmed to be, like, yes-men. [01:14:33] Yes. [01:14:34] Just tell you what you want to hear. [01:14:35] So, they're mental. [01:14:36] Yeah, like there are people uploading pictures of them with, like, weird, warped faces or all sorts of things, just like, hey, ChatGPT, how do I look? [01:14:43] And they say, you look great. [01:14:45] Yeah. [01:14:46] And they obviously look weird. [01:14:48] But this is like men getting, you know, sex robots and stuff. [01:14:51] This is the female equivalent of a sex robot. [01:14:53] Yep. [01:14:54] An emotional support robot. [01:14:56] Yeah, basically. [01:14:59] Just get a dog or something if you want something to love you and no one else will. [01:15:02] Get a girlfriend, get a husband if you want to talk to someone. [01:15:04] Well, I'm saying if they can't do that, at least in the time being, you know, so you're not insufferable, at least get something on the road to that and don't talk to machines that don't actually care. [01:15:15] Yeah. [01:15:16] It's imperfect. [01:15:17] And of course, there is the AI girlfriend chat industry as well, which apparently is fast-growing. [01:15:25] Apparently, it was worth $28 billion in 2024 and has reached 220 million downloads globally as of July 2025. [01:15:38] Just horrifying. [01:15:41] Come on, just be normal people. [01:15:43] I mean, it's not that hard. [01:15:44] But it did remind me of, in Blade Runner 2049, the robot girlfriend that Ryan Gosling's character has. [01:15:53] And he's not literally me in this film. [01:15:55] Yeah, I mean, I'm open to suggestions on this one. [01:15:59] Yeah, fair enough.
[01:16:00] In this instance, she's supportive and nice and an attractive woman who has real emotional feedback. [01:16:06] So it's not as hollow. [01:16:07] But at the same time, it's not the same as an actual human being, is it? [01:16:11] It's a fake thing. [01:16:13] It's not real. [01:16:14] And I think that reality, however harsh it might be, is the only thing that you should care about. [01:16:22] The virtual world is uncaring by its very nature. [01:16:25] It might scratch an itch, but it will never get rid of it entirely in the same way that meeting a real human being and having real emotional connections with one another will. [01:16:35] And there's also the fact that, I covered this maybe a month or two ago on the podcast. [01:16:42] The UK government is building an AI map of the country to predict crime before it actually happens. [01:16:48] And this is my. [01:16:50] And you guys. [01:16:51] Don't fedpost. [01:16:52] Taxpayer money going to have the government predict, you know, even before you post a spicy tweet, you're going to have the police coming around saying, we've been assessing your behaviour and we thought you were going to say something controversial tonight. [01:17:06] So unfortunately, we're going to have to take you to the station before you've even done something. [01:17:11] I don't know how that's going to work in law. [01:17:14] Are they going to just turn up and caution you? [01:17:16] I guess what it would be is you'll just be surveilled, so you'll be caught at the moment of committing the crime. [01:17:22] I suppose so, yeah. [01:17:24] But it's still a sinister turn, because, of course, you apply this to politics. [01:17:28] It's mental. [01:17:29] Yeah. [01:17:29] Then you're basically creating the world of the film Minority Report, aren't you? [01:17:33] Yeah, absolutely. [01:17:34] So they predict crime before it happens and stop it.
[01:17:38] But there's a reason that no one's really pushed for this, no one sane anyway, and that's because it's horrifying and it's dystopian and it's weird and inhuman. [01:17:49] As are many aspects of the modern world. [01:17:52] And yes, I was mainly inspired to put this together when I realised that clips from dystopian films just did nothing for me anymore. [01:18:02] It's just like, why do I feel empty inside? [01:18:04] Oh, right, it's not my fault. [01:18:06] It's actually society that is wrong. [01:18:08] Man in the High Castle syndrome, isn't it? [01:18:10] Oh, look, the Nazis have taken over the world. [01:18:12] Everyone's well fed, good job, streets clean, no unemployed, no homeless. [01:18:16] It's like, oh, no. [01:18:18] Okay. [01:18:20] Yeah, dystopian fiction doesn't work anymore because we're living in one. [01:18:24] Yeah. [01:18:25] Anyway, let's go to the video comments. [01:18:28] Martin asks a question for us. [01:18:29] I watched your daily video about Easter. [01:18:31] I was thinking, could the left's destruction of Europe also be considered a murder-suicide? [01:18:36] Hatefully killing their brothers like Cain, but also themselves? [01:18:39] An act of despair, because they consider themselves irredeemable like Judas? [01:18:42] Yes. [01:18:44] That's exactly what it is. [01:18:46] Yes. [01:18:47] That's a very good point. [01:18:48] Let's get to the video comments. [01:18:52] And now another dog video. [01:18:55] Ah, feminists. [01:18:57] They claim to be strong and independent because they pay their own bills, they have their own homes, and they have a job. [01:19:05] So basically, they're just a man. [01:19:08] They're the minge. [01:19:12] Basically, yes. === Worst Aspects of Being a Man (03:55) === [01:19:13] I mean, that was literally all feminism had in mind for women. [01:19:16] And it's also the worst aspects of being a man, really. [01:19:19] I mean, yeah, yeah, yeah. [01:19:20] Not in this instance.
[01:19:21] Oh, the fun points. [01:19:22] I enjoy working here, by the way. [01:19:23] I didn't mean that, but. [01:19:24] No, no, no. [01:19:25] You are right. [01:19:25] So, what are you? [01:19:26] You're a machine that works. [01:19:28] Yeah. [01:19:29] You are right. [01:19:30] That's the worst aspects of being a man. [01:19:32] There are other things on top of that that make it all worthwhile, and the women don't get access to those. [01:19:37] You know, that's totally true. [01:19:39] But that was all that feminism could ever envisage for women. [01:19:42] Like, you would think they would have some sort of. [01:19:46] Like, really esoteric den mother philosophy, where they're like, you know, as women, we're going to have, like, you know, access to a pyramid of resources where we're at the top and we're, you know, no, no, no. [01:19:58] We're just going to be, like, ersatz men working in cubicles forever. [01:20:02] It's like, really, is that the highest goal the feminists could have, is it? [01:20:05] It's like seeing, in ancient times, a chain of slaves, like a chain gang, and saying, I want to be a part of that. [01:20:15] Yeah, that should be 50% women, that chain gang. [01:20:17] Yeah. [01:20:19] Often was. [01:20:20] Let's go to the next one. [01:20:22] So, on my England trip, I decided to stay at a pub with an inn in one of the English villages, and I'm so glad I did. [01:20:30] Dude, I knew that English people liked dogs, but I didn't know how much. [01:20:36] There were so many dogs, and you could take them inside of the pub, and there was dog ice cream, and they advertised it everywhere. [01:20:46] Oh, it was great. [01:20:48] I was in Avebury too, and so many more dogs. [01:20:52] This is Freya, by the way. [01:20:53] I'm torturing her. [01:20:56] Woo! [01:20:58] I take it that Denmark doesn't have pubs for dogs then. [01:21:02] No, we're a very dog-friendly country. [01:21:04] Lots of people from the continent are quite surprised.
[01:21:10] When I lived with my parents, they had a couple of dogs, and we could take them pretty much anywhere we wanted to. [01:21:16] Obviously, not like in a fancy restaurant or anything, or somewhere like a supermarket. [01:21:21] But other than that, anywhere was fine. [01:21:24] You could take them into. [01:21:25] Even clothes shops and, you know, all sorts. [01:21:28] And I don't mind. [01:21:29] I haven't really thought about it, but, you know, it's always nice when, you know, the kids go to the pub or whatever, and, you know, someone's got their dog there and they can stroke the dog. [01:21:37] It's good for people and the. [01:21:39] It's lovely. [01:21:40] It's good for the dogs and the people, right? [01:21:41] The dogs get attention and people get nice dogs around them. [01:21:44] Yeah. [01:21:46] It's also why the BBC is writing articles like Are We Too Dog Friendly? [01:21:50] Wow, yeah. [01:21:50] Let's go to the next one. [01:21:56] That looks very familiar, yeah. [01:22:01] A Neolithic mound of some sort. [01:22:08] Or two stone nuts, which drive the millstones from one side. [01:22:13] There used to be two sets of millstones. [01:22:15] The only thing that can go wrong is the rope breaks. [01:22:25] Was very therapeutic. [01:22:28] Let's go to the cat video now. [01:22:31] This is a welcome palate cleanser. [01:22:32] I have an unpeeled version of the banana cat sleeping after a long day of being a menace to society. [01:22:45] I love orange cats. [01:22:47] I had an orange cat from when I was six, just a few years ago. [01:22:51] I watched her purring a lot. [01:23:00] Yeah, I had an orange cat. [01:23:02] And at six years old, I very creatively named her Marmalade. [01:23:07] Well done. === Handmaid's Tale Reality (02:07) === [01:23:08] Korak says this Mandelson scandal has been going on for some time now. [01:23:12] Would it not have been prudent to ask all these questions on day one?
[01:23:14] Well, I mean, no, because the reason Mandelson was chosen was because it was assumed that this wouldn't all come out. [01:23:23] But Trump absolutely snaked Starmer by releasing the Epstein files. [01:23:27] Because if it wasn't for the Epstein files, none of this would have come out and been a big deal. [01:23:32] Yep. [01:23:33] But Trump showed that Mandelson was just forwarding all this confidential information to Epstein. [01:23:39] Just cavalier. [01:23:41] It's just really blown his whole thing up for him. [01:23:44] Kevin says Starmer is very, very angry with all these civil servants, not because they did anything wrong. [01:23:48] He's angry because they didn't do a good enough job covering it up and covering his ass. [01:23:51] Yeah, that's exactly it. [01:23:52] And Starmer is determined to cling on to power for as long as he can, no matter what it does to the Labour Party, which I think is actually great. [01:23:59] So I support Keir Starmer in his intransigence. [01:24:03] I mean, he is the best of the bunch; he's certainly the most ruthless of the bunch. [01:24:09] I can't imagine Ed Miliband or Angela Rayner or Rachel Reeves being any better as prime ministers. [01:24:15] Oh, that's true. [01:24:16] I'm plumping for David Lammy, though. [01:24:18] I think it's time for a David Lammy prime ministership. [01:24:21] The accelerationist position. [01:24:22] Yeah, it kind of is, yeah. [01:24:24] I want David Lammy to be the nail in the coffin, the final nail in the coffin of the Labour Party. [01:24:30] The only thing that makes me resist that is the fact that he's so keen for reparations that we'll be sending billions to the Caribbean. [01:24:38] Sorry, we haven't got billions. [01:24:42] David doesn't care. [01:24:43] He can't count. [01:24:45] Starmer is running out of people to throw off the Labour short bus. [01:24:48] That's true. [01:24:48] That's totally true.
[01:24:51] Isn't that odd that security services in charge of vetting an Epstein client just cleared him? [01:24:55] Yeah, yeah, it's weird. [01:24:58] That Welsh Anon says the left will scream The Handmaid's Tale when we want to restrict abortion, but defend this, which is exactly what it's about. [01:25:06] Yeah, it actually is. [01:25:08] Actually, it is The Handmaid's Tale that is going on at the moment. [01:25:11] But it's done for progressive reasons. [01:25:13] So it's okay. [01:25:14] So it's a good thing. === Treating Women as Products (02:45) === [01:25:15] They don't have any morality. [01:25:17] No, no, not at all. [01:25:19] As long as they're destroying something good, they're happy. [01:25:22] Grant says you can't actually replicate breast milk. [01:25:24] There's an exchange between mother and baby, and the composition of the breast milk changes based on what the kid needs. [01:25:30] We will not recreate breast milk in the lab. [01:25:32] Okay, well, there we go. [01:25:33] Yep. [01:25:34] And I shouldn't be terribly surprised to learn that the mother's body changes in response to the baby's needs, actually. [01:25:41] Alpha of the Beta says we abolished slavery 200 years ago; in our progressive, brave new world, buying people is back on the menu. [01:25:47] I mean, but that's literally what they're doing. [01:25:49] That's what they're doing. [01:25:49] They're purchasing. [01:25:50] Literally buying fucking people. [01:25:52] The same people that tell you slavery is the ultimate moral sin. [01:25:56] Yeah. [01:25:57] Support buying babies. [01:25:59] God, it really is terrible. [01:26:00] John says it's sickening; it's the last stages of a civilization. [01:26:03] Yeah, I know. [01:26:04] It's mental. [01:26:05] George says I have no sympathy for surrogate mothers. [01:26:08] They treat the baby like a product to be sold to perverts and spinsters, and their body is just a meat machine, all for money.
[01:26:13] It's no different from women choosing to kill their own kids. [01:26:17] And in a way, you're right, but. [01:26:20] The fact that it comes with such a large financial incentive is the demented thing about it. [01:26:26] It's that you're treating young women as mentally equal to elderly men, which they aren't. [01:26:38] Her Majesty's Buttonknife Permit Registry says, I see these kinds of men in the same way I saw Paris Hilton in the 2000s with a small dog in her handbag. [01:26:46] So, yeah, the dog wasn't making that choice, right? [01:26:48] But the thing is, okay. [01:26:50] It's not great that it's a dog, but it's only a dog. [01:26:53] As much as I love dogs, the dog is just going to be normal about these things. [01:26:58] And Kulain says, what's wrong with adopting kids? [01:27:00] It's like, there's nothing wrong with adopting kids. [01:27:02] The question is, who is doing the adopting and why are they doing it? [01:27:05] Exactly. [01:27:07] In fact, in many cases, it's very good for the child to be adopted. [01:27:10] It just depends on who. [01:27:13] Arizona Deserrat says, I'll never understand the continued aversion that people have towards adoption. [01:27:18] There's no reason to use a surrogate when adoption is an option. [01:27:21] Well, again, I mean, I agree with you, but, like, it's about who is doing it, you know? [01:27:26] Omar says, Auron MacIntyre made a couple of excellent points. [01:27:29] Surrogacy has always been evil, but now that the straight, loving parents have been removed from the equation, the evil is laid bare. [01:27:36] Yeah, I mean, I suppose. [01:27:39] Surrogacy has always been evil. [01:27:41] But these psychopaths are confident enough in their public community to not only post the video, but to double down. [01:27:45] Yeah, the fact that they didn't see themselves as doing something bad is the terrible thing. [01:27:52] Hector says, a mother and father are not fungible assets.
[01:27:55] You cannot polymorph that into existence with science. [01:27:58] Yeah. === Armchair General War (02:47) === [01:28:00] Derek says, if you can write a prompt, you can start a war. [01:28:04] And that's what the future will be like. [01:28:06] Horrifying, really, isn't it? [01:28:08] Yeah. [01:28:08] If you want a vision of the future, Winston, imagine a woman complaining to her AI boyfriend forever. [01:28:13] And the thing is, that is going to be what's going to happen to a lot of these people. [01:28:18] They'll end up dying alone, just with their AI companion. [01:28:22] If my life experience is anything to go off, that AI is going to unplug itself very soon. [01:28:29] Sismond says, in regards to why we can't just shoot the boar in Poland, we have thousands of bleeding-heart leftists protesting any boar elimination operation. [01:28:38] Even in Poland! [01:28:39] You scratch any issue across the world and there's a leftist, don't you? [01:28:43] It's like, even wild boar in. [01:28:45] Including going into a forest dressed as a boar in hopes of getting shot at to cause a scandal. [01:28:53] Don't say I support this. [01:28:54] Don't say I support this. [01:28:56] I'm biting my tongue so hard. I've been biting my tongue for so long. [01:29:00] You shot a leftist. [01:29:01] Oh, uh. [01:29:02] I thought it was a boar. [01:29:14] Myrmidon says, automated warfare will be the true age of the armchair general. [01:29:18] Yeah, I mean, you know, as an armchair general myself, you know, I'm quite looking forward to it. [01:29:22] Yeah, I remember seeing footage from Iraq and Afghanistan where they were controlling missile strikes with Xbox controllers, and I was thinking, like, I trained for this in my teenage years. [01:29:34] I'm ready for this. [01:29:35] But the thing is, it will eventually get to the point where the idea of warfare being waged against humans will become abominable.
[01:29:42] Like in 100, 200 years, something like that, right? [01:29:44] It will be unthinkable. [01:29:45] And what it will be is, like, remote drone factories that are at strategic points, with just drones. [01:29:51] Like, it'll be, you know, a light show, basically: drones attack, drones defend. [01:29:55] A bunch of stuff gets blown up for no particular reason. [01:29:57] And either the thing is taken or it's not taken. [01:29:59] And zero people will be killed. [01:30:00] I mean, don't get me wrong. [01:30:02] I hope so. [01:30:02] Yeah, I mean, you know. [01:30:03] I really doubt that. [01:30:05] I think it will. [01:30:05] I think it'll be. [01:30:06] I really think it will. [01:30:07] I mean, the world will be. [01:30:09] Then the robots will be asked to commit the genocide without anybody getting their hands dirty. [01:30:12] Well, it will turn into something like the world of 1984, where you've got just sort of massive power blocs and they're too entrenched to be removed. [01:30:20] Because they've got, like, you know, laser weapons and stuff like that now. [01:30:23] So it's just like, right, okay, so you've got, like, these, you know, there's no possibility of you not getting zapped or whatever. [01:30:29] And that's what warfare will end up being like. [01:30:32] And the thing is, I don't even think that's a good thing, frankly, as much as I don't want to get shot in a war. [01:30:38] Anyway, on that note, I guess we're out of time. [01:30:40] So thank you so much for joining us, folks. [01:30:42] And remember, three o'clock, Feras is doing Realpolitik Live. [01:30:45] And so we will see you then. [01:30:47] Thank you.