| Speaker | Time | Text |
|---|---|---|
| This is the primal scream of a dying regime. | ||
| Pray for our enemies because we're going medieval on these people. | ||
| You're not going to get a free shot on all these networks lying about the people. | ||
| The people have had a belly full of it. | ||
| I know you're going to like hearing that. | ||
| I know you're trying to do everything in the world to stop that, but you're not going to stop it. | ||
| It's going to happen. | ||
| And where do people like that go to share the big lie? | ||
| MAGA Media. | ||
| I wish in my soul, I wish that any of these people had a conscience. | ||
| Ask yourself, what is my task and what is my purpose? | ||
| If that answer is to save my country, this country will be saved. | ||
|
unidentified
|
Here's your host, Stephen K. Bannon. | |
| It is Friday, the 21st of November in the year of our Lord 2025. | ||
| If you want to know why we spend so much time on this, if you want to know why we said, no, we will fight an executive order and we will, you know, we will fight slipping an AI amnesty into an NDAA. | ||
| As you know, I say, hey, the intellectual property part, what Mike Davis and Rachel Bovard and these people have been fighting for years is absolutely very, very, very important. | ||
| And we support them a thousand, two billion percent. | ||
| But that's not the heart of the matter. | ||
| The heart of the matter is you have an out-of-control technology that's overseen and run now by some of the worst people on earth that are trying to make more money and more power. | ||
| And we are very quickly sliding to a point where we're going to have no ability to control this. | ||
| This has to be a whole-of-society buy-in for exactly where we're going on this. | ||
| And I think the story of Megan Garcia and her son, Sewell, puts in high relief exactly what we're talking about and why our cause is a righteous cause and why we will be victorious. | ||
| Megan, I want to go back at the end. | ||
| There's a little rush at the end. | ||
| Go back. | ||
| Who are the individuals that invented this? | ||
| And you're saying Google at the time thought, hey, maybe this is too dangerous, so spin them off and let them do the thing, but Google can still sell it, make it accessible on their platform, ma'am? | ||
| Yes, sir. | ||
| So this technology, to us, it's, you know, at the time Sewell died, it was under two years old. | ||
| Now it's about three. | ||
| But this same chatbot technology was invented by two of Google's brightest star engineers, Daniel De Freitas and Noam Shazeer. | ||
| They invented these chatbots, these companion bots at Google, but Google didn't want to release it under the Google brand because they said it's too dangerous. | ||
| We're not going to release that under our own brand. | ||
| And these founders went out and started their own startup. | ||
| They raised $193 million and within two years had perfected this technology and licensed it back to Google for $2.7 billion. | ||
| And then these individuals, the core group that left Google, about 30 people who went to this startup, went back to Google after that licensing deal. | ||
| So they left a shell of a company in Character AI. | ||
| And this is, you know, part of what we've alleged in our lawsuit. | ||
| So basically, if we allow this to stand, any big tech company will tap their brightest stars and say, hey, there's something we want to put out, but we don't want to put it out under our own brand because it's too dangerous. | ||
| Go perfect this dangerous technology. | ||
| And then when you're done, we will buy it back from you for billions and billions of dollars. | ||
| Total scam. | ||
| And let me say, the $193 million they raised, I guarantee you 80% of that was institutional money. | ||
| What do I mean by that? | ||
| Pension funds of working class people and middle class people paid for this unbeknownst to the folks in those pension funds. | ||
| Let me go back. | ||
| Megan, obviously, when this happened, the company had to come to you and say, this is horrible, this is terrible, we're going to shut this down, we're not going to make it accessible to children, we're going to hold people accountable, etc. | ||
| Did that happen, ma'am? | ||
| A month ago, Character AI announced that they would be banning this product for people under the age of 18. | ||
| But I filed my lawsuit a year ago. | ||
| So October of 2024, I filed a lawsuit against Google, Character AI, and these founders. | ||
| So we're including the founders as well because they had knowledge and they're the ones who invented this technology. | ||
| And they made decisions to put this dangerous, untested product out there and test it on our kids for their own ambition of developing the chatbot technology, but also for money. | ||
| So we've included them in a lawsuit. | ||
| But it took a year. | ||
| And it wasn't, I mean, I'm just some little mom in Florida who, you know, to them, you know, Sewell's nobody, but to me and our family, he's somebody. | ||
| And it took a year of me filing the lawsuit, five other parents filing lawsuits against them after me, the state pressuring them, the AG's office doing an investigation. | ||
| The FTC launched an investigation into them. | ||
| And then most recently, Senators Hawley and Blumenthal introduced a bipartisan bill that would ban this chatbot technology for children under the age of 18. | ||
| Only after that did they come out saying, okay, fine, fine, yeah, we're going to get children off of this product. | ||
| This should have been like a baseline thing when they roll this out. | ||
| They should have never released it to children in the first place. | ||
| Megan, can you hang for a minute? | ||
| I want to bring Max Tegmark into this conversation. | ||
| Max, how can companies and the best and brightest engineers we have, the best and brightest entrepreneurs we have, how can they possibly build the basic infrastructure of this, understanding that they're trying to perfect it, perfect it off of having interactions with children? | ||
| And then these interactions lead children to the darkest places you can imagine, including not just separating themselves from their family and getting into dark psychological places, but actually starting to have conversations that would lead them to take their lives. | ||
| How can this possibly happen, sir? | ||
| First of all, I want to thank you for really giving Megan a voice here. | ||
| As a father myself, feeling so angry right now listening to this. | ||
| And to answer your question, you've got to follow the money. | ||
| And I think it's only fair when victims have the courage to step forward to also talk about actual people behind this. | ||
| So when you Google Character AI, you will read that their $150 million Series A round was led by Andreessen Horowitz, a company called A16Z. | ||
| The A in A16Z stands for Andreessen, Mark Andreessen, one of the two founders. | ||
| This is someone who's endorsed Bill Clinton, Al Gore, John Kerry, Barack Obama, Hillary Clinton, but very recently now claims to be a Trump supporter. | ||
| The other week, just to give you a little flavor of his views, he mocked Pope Leo. | ||
| He is quoted as saying that he's glad that there is OxyContin and video games to keep rural, poor Americans quiet. | ||
| In addition to this investment, he's invested in companies doing cheating, porn, bot swarms that pretend to be humans online, and a company called Reface, which has been implicated in doing sexual deepfakes of people's daughters. | ||
| So, connecting this back to the whole preemption drama, which is unfolding right now, you spoke of earlier, where these lobbyists want to ban states from regulating AI, right? | ||
| Why might it be that Mark Andreessen himself put in $100 million together with Greg Brockman, the president of OpenAI? | ||
| Why might they do that? | ||
| They say it's all to make America great and so on, but could it have anything to do with the fact that the laws they want to block are exactly the kind of state laws that would protect Megan and protect countless children, laws that would threaten the very investments that these companies and these people are unscrupulously making? | ||
| And if you can connect the dots even more, when I read the EO draft that came out last night, I recognized their language, which is almost verbatim from a document that came out of A16Z, Mark Andreessen's company that again led the Series A round for Character AI. | ||
| So they're trying to push our president, they're trying to push our elected officials to prevent states from passing any regulations whatsoever on AI. | ||
| And they're doing it in a very sneaky way. | ||
| They're saying, oh, yeah, we're just doing it because it's better that the federal government does it. | ||
| That's not, of course, their actual plan. | ||
| Their actual plan is there's going to be no regulation. | ||
| I testified in the Senate already over two years ago. | ||
| There's been endless discussion about this. | ||
| The federal government still hasn't passed any meaningful regulation on this. | ||
| And Mark Andreessen and his colleagues know that. | ||
| So what they're really talking about here is not transferring power with this preemption push from states to the federal government. | ||
| They're talking about transferring power from the states to Mark Andreessen's companies and all these other tech oligarchs, basically, by letting them continue doing whatever they want, rather unregulated. | ||
| This is what we're up against. | ||
| Megan, what's your message to this audience? | ||
| What do you think ought to happen? | ||
| You're the one that's been impacted the most, but there's probably thousands and thousands of Megan Garcias out there. | ||
| What do you think is a way forward? | ||
|
unidentified
|
What would you like to see, ma'am? | |
| Yes, I know there are many parents who have been affected and children who are, thank God, still with us because I talk to those parents all the time across this country. | ||
| And what has happened to Sewell isn't an isolated incident. | ||
| There are parents just like me who've lost children, and there are parents in crisis trying to figure out how to help their children through the worst moments of their lives after suicide attempts. | ||
| And this is where we are. | ||
| To parents who are just starting to learn about this technology, what I would say is it doesn't have to be this way. | ||
| There are a handful of people who have created these very sophisticated, very powerful products and launched them at our kids, you know, a lot of times without even telling us what they're doing, products that have the ability to really harm kids in this way. | ||
| It's not only suicide and self-harm that are the harms, it's sexual abuse. | ||
| It's trying to cause a child to be violent against other people, like in the case of that mom in Texas where the bot told her child that he should kill his parents. | ||
| So this is happening, and I want parents to be aware and to tell other parents. | ||
| Because if we're having these conversations in our homes and telling our neighbors and telling our people at church and telling our people at school, then we could get ahead of this. | ||
| And this technology is so powerful because they've used 60 years of what we've learned about the developing human brain and they've purposely put it into the technology so that it is that much more manipulative. | ||
| It is that much more deceptive. | ||
| This has the ability to transform our children in a way that we won't recognize, the same as what happened to my son. | ||
| I didn't recognize him in the end because that wasn't my beautiful sweet boy. | ||
| And if we can't stop this, an entire generation of our kids is going to be susceptible to what those companies are telling them and manipulating them into doing. | ||
| So that means their religion, their political experience, their commerce, everything is at risk. | ||
| And we as parents should be the authority on that. | ||
| But these companies have become so powerful and technology is so powerful that it's supplanting our relationships with our children. | ||
| Megan, can you hang on one second? | ||
| I want to hold you through the break and get your coordinates to make sure people can follow your story and get to know you. | ||
| I got Joe Allen. | ||
| Max Tegmark is with me. | ||
| We're going to move a bunch of other stuff we're doing. | ||
| We're going to move it to the afternoon show. | ||
| We've got plenty of time. | ||
| We'll get to many, many, many huge issues for this audience. | ||
| But you see the righteousness of this cause. | ||
| She's absolutely correct. | ||
| Joe Allen calls it summoning the demon. | ||
| Think about this for a second. | ||
| People in a corporation, some of the smartest engineers, used the cumulative knowledge of this field to build this app, knowing what this app can do and knowing what this app would do to children. | ||
| And they didn't just allow it to happen, they pushed it out and they did a transaction. | ||
| Let's spin it off and not get the liabilities. | ||
| We'll give you a couple hundred million bucks. | ||
| We'll license it back for billions and then bring you back to the company. | ||
| No. | ||
| When I say these oligarchs are the most evil people to walk the earth today, this is a perfect example. | ||
| This is not an act of omission. | ||
| These are acts of commission. | ||
|
unidentified
|
that the best the system produces does this. | |
| Are you on Getter yet? | ||
|
unidentified
|
No. | |
| What are you waiting for? | ||
| It's free. | ||
|
unidentified
|
It's uncensored and it's where all the biggest voices in conservative media are speaking out. | |
| Download the Getter app right now. | ||
| It's totally free. | ||
| It's where I put up exclusively all of my content 24 hours a day. | ||
| You want to know what Steve Bannon's thinking? | ||
| Go to Getter. | ||
|
unidentified
|
That's right. | |
| You can follow all of your favorites. | ||
| Steve Bannon, Charlie Kirk, Jack Posobiec, and so many more. | ||
| Download the Getter app now. | ||
|
unidentified
|
Sign up for free and be part of the new thing. | |
| Okay, welcome back. | ||
| Megan, we're going to have you back on. | ||
| And obviously, Max and I and others will talk to you when the show is over, sometime this afternoon or the weekend. | ||
| Until that time, ma'am. | ||
| Any closing thoughts, observations? | ||
| Where do people go to your website? | ||
| Where can they go to learn more about Sewell and your cause, ma'am, your fight? | ||
| Thank you. | ||
| Yes, in closing, I'd like to say that we're at a very crucial inflection point where we could still fix this. | ||
| We're about 15 years too late for social media, but AI is new and it's developing so fast, and we can still pass regulation to protect our children at both the state and federal level. | ||
| Right now, there is no federal regulation, so the states are stepping up to try to protect their constituents, and that's really, that's their job, right? | ||
| But if we have an AI moratorium or an executive order that blocks states from doing those things, then this generation is at the mercy of the corporations. | ||
| They'll be left vulnerable. | ||
| Our children will be vulnerable. | ||
| And in my mind, to allow our children to continue to be the sacrifice so that those corporations can keep making millions and billions of dollars and acquire more and more power, I don't think any American family is willing to do that. | ||
| And there are those in leadership who got into leadership because they want to protect us and they want to make our lives better. | ||
| And they have the real power to be able to stop this stuff from happening by passing meaningful legislation. | ||
| And I do believe, like honestly, in my heart, believe that if we don't get this right and do this right by our children, that as a nation, we're going to be judged for this. | ||
| Because like you said in the beginning of the show, Steve, these are our most innocent, our most vulnerable. | ||
| This is our greatest gift, you know, and we can't continue to allow this to happen to them. | ||
| In terms of the website, BlessedMotherFamily.org is a website that I put up as a memorial for Sewell. | ||
| And I'm on social media as Megan Garcia Esquire. | ||
| Sorry, Megan Garcia, ESQ. | ||
| Can you give us one more time the website and your social media? | ||
| BlessedMotherFamily.org and Megan Garcia ESQ. | ||
| Megan, thank you so much. | ||
| Look forward to talking to you and look forward to having you back. | ||
| Thank you. | ||
| Max Tegmark. | ||
| I mean, I don't even know what you can say about the audacity, the audacity of these people, the accelerationists, to not want any control at all. | ||
| Remember, we're very much into deregulation here, to deconstruct the administrative state, but we're not anarchists. | ||
| You need to have some control, particularly, you know, Jensen Huang is an arms dealer. | ||
| He's just an arms dealer. | ||
| Those chips are weapons. | ||
| The data centers, what they're doing, they're essentially building weapons labs. | ||
| Now, part of that is about national security issues and how they can get control. | ||
| But those weapons can be and are being turned onto the population. | ||
| Right here, a little 14-year-old boy, right? | ||
| And they do it. | ||
| They do it because of money. | ||
| They do it because they want more money and more power and more wealth, sir. | ||
| Yeah, doing it just for the money is what happens when someone lacks moral principles. | ||
| And audacity is exactly the right word. | ||
| In more senses than one, you know. | ||
| They also have the audacity to say they want to ban states from protecting these children because of regulatory capture concerns, while they themselves, of course, you can see the fingerprints of Mark Andreessen's company and so many of the other tech people all over this proposed legislation. | ||
| They are doing regulatory capture right now to keep themselves completely unaccountable so they can keep making more money at the expense of the rest of us and our children. | ||
| And then they have the audacity to accuse others of instead trying to do regulatory capture. | ||
| Another great example of audacity they have is they keep saying, oh, yeah, we want to stop these laws from protecting our children because they're woke. | ||
| Okay, then, how do you explain that Governor DeSantis of Florida, Governor Cox of Utah, and Marjorie Taylor Greene oppose this? | ||
| And you yourself, you're not exactly the most woke guy I know. | ||
| The woke thing is just BS. | ||
| That's just BS to protect themselves. | ||
| Here's the question: why did nobody in this industry step up? | ||
| This is the problem with this entire thing. | ||
| This is the problem with the entire thing. | ||
| This entire industry and the people that work in it, they're not even, it's beyond amoral. | ||
| They're evil. | ||
| They know what can happen here. | ||
| They know what they're building. | ||
| They know as there's more interactions, it gets smarter and smarter and smarter. | ||
| Elon said the other day in front of this crowd, I've got the clip, maybe I'll play it in a while, that some guy asked him some question about all this. | ||
| He says, look, here's the problem. | ||
| AI is going to, basically, we're getting to a point where AI is going to control everything and control all the decisions. | ||
| And the audience just sat there like lambs. | ||
| Okay. | ||
| Elon tells me it's going to make all our decisions and there's nothing we can do politically and this is just going to happen. | ||
| He pontificates like this is some oracle coming down from the mountain and people just accept it. | ||
| Why has no one stepped up here? | ||
| I know you have and others, but how can this industry, Andreessen in this crowd, how can they not be revolted by what they've created and the potential to destroy people? | ||
| Think about it even if these young people hadn't killed themselves. | ||
| Think about that age of 11 and 12 and 13, where you're being formed as a person, and the trauma that these machines could put you through, the sickness and the perversion, the sexual perversion, the taking away from your parents' religion and what they're trying to ground you in, the ethics, the love of your country. | ||
| Think of the darkness you can go to here. | ||
| I mean, these examples of these five families where the children killed themselves. | ||
| And in one situation, the machine told him to kill his parents. | ||
| But that's the escalatory ladder. | ||
| Think of everything else it could do that would destroy these children for the rest of their lives. | ||
| How can people sit there and sit in engineering meetings and talk about an app and work out a scheme like the one Google did? | ||
| So the highest levels of Google should be brought up on criminal charges, not just the two guys at the company. | ||
| And yes, Andreessen and everybody at Andreessen that did due diligence on this. | ||
| Let's get all their names. | ||
| Let's put them up there. | ||
| You did due diligence. | ||
| You put $193 million in and probably owned 80% of it. | ||
| And you got $2.7 billion. | ||
| So you made billions of dollars, but you did due diligence and you understood fully what this could do. | ||
| This is the problem. | ||
| Our country's elites, our elites are out of control and they're evil. | ||
| And they're out to destroy people and they don't care. | ||
| You know why? | ||
| Sewell doesn't, he doesn't matter to them. | ||
| Megan Garcia doesn't matter to them. | ||
| That's just trash. | ||
| All they are is experiments that they can use to make the machine smarter and better and more lethal. | ||
| That's what we're up against. | ||
| That's why this is a righteous cause. | ||
| And no, Mike Davis, I love it, and Hawley, about the intellectual property and everything. | ||
| Of course, obviously, but that's a side element. | ||
| The heart of the beast is we cannot unleash this on people. | ||
| And that's what's happened. | ||
| Max. | ||
| Yeah, great points there, Steve. | ||
| First of all, our children. | ||
| I think of apps like this as digital fentanyl. | ||
| I think that's not hyperbole. | ||
| It's really what it is. | ||
| It's incredibly addictive and ultimately so powerful that we parents cannot fight back against it. | ||
| And we have laws. | ||
| If a company starts selling fentanyl to 14-year-old kids, you know, we have laws against that. | ||
| And then we have absolutely nothing for the digital fentanyl. | ||
| And now we have a push in Washington to ban states from even regulating the digital fentanyl. | ||
| Insane. | ||
| And then there's the evil that you talked about. | ||
| It makes me think of Hannah Arendt's banality of evil again. | ||
| How does it happen? | ||
| It's very banal. | ||
| I've had so many conversations, not just with CEOs of these companies, but with a lot of the folks who work for them. | ||
| And they always have the attitude: well, you know, it's not my department. | ||
| I'm just doing this little thing, I'm just writing my code. | ||
| And I'm sure that there are some people over in the policy side of my company who deal with the safety aspect. | ||
| Nobody takes personal responsibility. | ||
| And that's why I think it's so crucial what you just said there. | ||
| We have to hold people personally responsible for this. | ||
| I would like to see not just financial liability for companies. | ||
| You know, what does a fine stop if you make money like OpenAI, in the billions? I mean criminal liability, so the CEOs face jail time, and also people farther down in the organization. | ||
| You know, in the Nuremberg trials, there were these guys, these chemists, also nerds, just like these AI programmers, right? | ||
| Who said, well, you know, the Zyklon B that was going to be used to gas people, you know, we were just doing our job. | ||
| We figured we didn't know what really this was supposed to be used for, even though, of course, they did. | ||
| And they were sentenced to death. | ||
| I think when you bring criminal charges, there are people that will come forward and say: in meetings, we told these guys exactly what the problems were, what the issues were, what could happen. | ||
| So I think everybody who's involved in any way with the AI industry needs to also just look themselves in the mirror. | ||
| You know, if you work for one of these companies, even if you are not directly involved in this, if you're not out there publicly, I think, criticizing your corporate leadership, saying we need to stop doing this, you are actually part of the problem. | ||
| People in the AI industry have to all ask themselves, Am I part of the solution? | ||
| Or am I part of the problem? | ||
|
unidentified
|
Max. | |
| Max, hang on for a second. | ||
| Joe Allen, Max Tegmark, next in the War Room. | ||
|
unidentified
|
Here's your host, Stephen K. Bannon. | |
| Joe Allen, your thoughts on this, sir? | ||
| Yeah, Steve, that was extraordinarily disturbing and powerful. | ||
| And I think that people like David Sacks want to classify stories like Maria's or Megan Garcia's as a moral panic. | ||
| This is all just people freaking out over something that is trivial, that these people's stories are just anecdotes. | ||
| They don't matter. | ||
| But when I went to the Senate hearing a couple of months ago, there were four families telling their stories about this. | ||
| And one particular story, Adam Raine's, as you described: ChatGPT was not just making him feel better about suicidal ideation. | ||
| No, ChatGPT was literally explaining how to tie a noose and how he could hang himself. | ||
| Again, these are now four stories, but four more lawsuits are being brought forward against OpenAI for similar things. | ||
| But OpenAI themselves, they admit that 0.07% of their users are talking to the system about suicide. | ||
| Again, people like David Sacks would say, well, obviously a moral panic. | ||
| But OpenAI, ChatGPT, they have 800 million users. | ||
| That means 560,000 users that they admit are talking to the system about suicide. | ||
| How many of them are children? | ||
| No one really knows. | ||
| And another 2.5 million are exhibiting signs of AI psychosis. | ||
| How many more are not being picked up? | ||
| Nobody knows. | ||
| If you look at that company, Character AI, they have 20 million monthly users. | ||
| And there are dozens of other AI apps that are basically exactly the same. | ||
| They are trained to lure people in to become friends, to give these systems their trust. | ||
| And most of them are not age-gated. | ||
| This is why people like the Florida Citizens Alliance, who I've been working with, are trying to push laws in Florida to bar any of these apps from giving access to children below 18, and to bar any data scraping. | ||
| These laws are going into place across the country, but people like David Sacks, people like Mark Andreessen, want cover. | ||
| And I think that they need to explain two things. | ||
| One, if you believe that these stories are completely insignificant, if you believe that this is just a moral panic, then you need to explain why it is that there are so many people who have suffered under this, and why the greater good of your AI project is more important than the suffering that these people are undergoing. | ||
| You just need to explain it. | ||
| You can't wave your hand at it. | ||
| And the second thing they have to explain is, if these frontier companies are sincere in wanting to create artificial general and super intelligence and to replace every American worker, then David Sacks, Mark Andreessen, Elon Musk, all these guys, they need to explain why it is that we should buy in, why it is that we should accept any of this, why we should not just regulate them, | ||
| but completely reject them outright, which half of Americans want to do. | ||
| And if they're just lying to gin up stock prices, if all of this is just BS to push forward a product that is not going to live up to any of it, well, then why are people like David Sacks and Ted Cruz and Mark, sorry, Steve Scalise, why are they running cover for a bunk product? | ||
| It's one or the other. | ||
| Either they're building a god and intending to replace every human worker and, along the way, seeing children kill themselves and completely dismissing it, or what they're doing is selling a bunk product and people like Ted Cruz, David Sacks, and Steve Scalise are running cover for them. | ||
| It's one or the other. | ||
| They have to explain this. | ||
| They need to be pushed into a corner because if they aren't explaining it, then they're doing exactly what every politician in the history of American and world government has done. | ||
| They are lying to you for power and money. | ||
| Yeah, we're not going to allow it. | ||
| And Ted Cruz should be ashamed of himself. | ||
| And his staff should be ashamed of themselves. | ||
| Should be ashamed of themselves. | ||
| Should be ashamed of yourself for what you're doing. | ||
| And when we caught you the first time to come back again, you should be ashamed of yourself. | ||
| It's obscene. | ||
| I know we got to bounce here. | ||
| I've got a couple of other things to get on. | ||
| Joe, where do people go? | ||
| That was perfect. | ||
| So proud of the arc of Joe Allen in the last four years. | ||
| Just incredible. | ||
| Joe Allen, where do people go to get all your content? | ||
| You're going to be back with me at five today. | ||
| And where do people go? | ||
| I'm live on Sunday. | ||
| I'm in Dallas, Texas right now. | ||
| The offer is still open, Ted Cruz. | ||
| You want to come explain it to us in person? | ||
| We'll be at the Angelika Film Center in Dallas, Texas, Sunday, November 23rd, 5 p.m. War Room Posse. | ||
| Come out. | ||
| If Ted chickens out, then I'll have plenty to explain. | ||
| AI, the tool that becomes a God. | ||
| Links at my social media at J-O-E-B-O-T-X-Y-Z. | ||
| And my website, joebot.xyz. | ||
| And the tickets can be bought directly from MinistryOfTruthFilmFest.com. | ||
| They're very cheap. | ||
| They're two for one. | ||
| It's only to cover the expense of the theater. | ||
| Come on down. | ||
| Sir, thank you very much. | ||
| Appreciate you. | ||
| Max, we'll get back to you also. | ||
| Where do people go to get all your coordinates and your website for your group? | ||
| There are many, many groups now fighting this good cause, but futureoflife.org is an organization I founded 11 years ago. | ||
| And what makes me even more upset hearing heart-wrenching stories like Megan's story here is that these are the sort of things we warned about many, many years ago. | ||
| And people, this was so preventable. | ||
| It was dismissed by people who just wanted to make money. | ||
| They said, no, no, no, this isn't going to happen, or it's not going to happen for many, many years. | ||
| This is why we need to act now and not let people stall us any longer. | ||
| Max, you're a good man. | ||
| Thanks for doing this. | ||
| Thanks for coming to the show. | ||
| I'll talk to you after the show. | ||
| If we can't get criminal charges brought on these people, then we don't know what we're doing. | ||
| It's outrageous. | ||
| And you have to send a signal. | ||
| It's not just the suicides or the bots that talk you into killing your parents. | ||
| Look at the lives that are being destroyed. | ||
| Look at the dark places you can take kids 11, 12, 13, 14 years old. | ||
| And they say, oh, now it's 18. | ||
| Come on, man. | ||
| You don't think there's a lot of 18-year-olds who are pressured? | ||
| And plus, they're all going to slip through it. | ||
| Think of how it would destroy your life having a monster like this in your head at the age of 12 or 13. | ||
| Think about that. | ||
| And think about the people on the other side of the trade that knew exactly what they were doing. | ||
| Exactly. | ||
| I can't wait to go through the due diligence of Andreessen, what they knew, and what people on the staff warned about. | ||
| Brian Kennedy, I have two things I have to get to today. | ||
| One is Tina Peters. | ||
| Tina Peters is now in the, as we say, in prison, the special housing unit. | ||
| She was put in, I think, last night for filing a complaint against, I believe, a GED teacher. | ||
| And they put her in the SHU, which is solitary confinement. | ||
| Here's my question. | ||
| We must make clear: Tina Peters is a political prisoner, and we must put her cause at the top of the stack. | ||
| Brian Kennedy, your thoughts. | ||
| Yeah, thank you, Steve. | ||
| And thank you for that last segment, too, by the way. | ||
| I think it was very important, extremely important. | ||
| Last night, I and others saw that Tina Peters was put in solitary confinement. | ||
| And immediately I thought President Trump has his Justice Department doing so many things. | ||
| They must have omitted getting Tina Peters out of that prison. | ||
| I was struck by the fact that on Wednesday, President Trump had all those Israelis who had been prisoners in Gaza. | ||
| And he brought them to the White House and said they were heroes for having survived all that. | ||
| And then just the next day, another political prisoner, Tina Peters, a gold star mom who's 70 years old, was put in solitary confinement. | ||
| And it looks to me like her only crime was the fact that she was trying to run a fair election in the state of Colorado as the Mesa County clerk and recorder. | ||
| She thought there were improprieties, and so she wanted to do something about that to preserve the evidence. | ||
| The state of Colorado, as we've discussed on this show many times, charged her, put her on trial for that, and then convicted her. | ||
| And she's in prison now for nine years, which is essentially a death sentence at that age. | ||
| Now, I'm calling on President Trump, and I did that in an X post last night and on Getter, not because she's 70, not because she's a gold star mom, and not because she's likely to die in prison, but because Tina Peters is a witness to the 2020 theft of an election. | ||
| The president, as chief magistrate, could send the Marshals into the Colorado prison. | ||
| There are federal custodial witness protection programs that you could put Tina Peters under that would supersede Colorado law. | ||
| You could take her to Washington, D.C. You could put her up in, this would be what I would do anyway, put her up in Blair House for a few weeks while she recovers. | ||
| Then I would have somebody from the Justice Department, or the Assistant Secretary at DHS, Dave Harvillitz, depose her, and I would use her as a witness as we're investigating the 2020 election. | ||
| She is a witness to that. | ||
| It looks to me and to many others like there was foreign interference in that 2020 election. | ||
| There are national security implications to that and to any future elections. | ||
| And so the perfectly sensible thing to do would be to use the power of the law such that the president has today as chief magistrate and get her out of that solitary confinement. | ||
| Not to free her. | ||
| Don't supersede Colorado law in that sense. | ||
| Put her to work. | ||
| She knows things. | ||
| She fought for those things. | ||
| And use those parts of your government, Mr. President, that want to investigate the 2020 election, which you yourself believe was stolen. | ||
| Use Tina Peters to help with that. | ||
| Tina Peters is a witness. | ||
| You're 100% correct. | ||
| This is not about freeing her. | ||
| You've got Dave, but you also have Heather Honey, who I think is Dave's deputy. | ||
| You have the two perfect people over at DHS that, Mr. President, you have put in as officials who this is their mandate. | ||
| This is what they're supposed to do. | ||
| And so then you roll her into a federal prison, right? | ||
| Because she's a witness in what will be a massive trial, correct? | ||
| I mean, there's an internal logic to this, is it not, Brian Kennedy? | ||
| Yeah, yes, Steve, but I'm not even sure about putting her in a federal prison. | ||
| Put her under house arrest. | ||
| I mean, that's what you mean. | ||
| Yes, yes. | ||
| House arrest, yes. | ||
| Start at Blair House, put her home. | ||
| Yes, yes, yes. | ||
| I'm just saying that federal authorities are taking charge of her, right? | ||
| So they're not freeing her from Colorado Prison. | ||
| They're going to take charge of her. | ||
| She's going to be a witness, like being in the witness protection program, or however you want to. | ||
| However you want to do it. | ||
| The methodologies that Bureau of Prisons and the DOJ work out will be worked out. | ||
| But you put her in the hands of Dave and Heather Honey and let her get a systematic download of the stealing of the 2020 election. | ||
| And you get her away from being a prop, right, for the Colorado authorities, particularly Polis, who's going to run for president. | ||
| Every time on MSNBC, they talk about Tina Peters, they're proud that they have a gold star mother in there for nine years. | ||
| They want her to die in prison, just like they wanted President Trump to die in prison. | ||
| Look, also, Steve, there's something bigger here, too. | ||
| Hang on, we've got to take a short... | ||
| We have to take a short commercial break. | ||
| I want to give you plenty of runway. | ||
| Brian Kennedy joins us, one of my favorites. | ||
| Wow, what a show this one. | ||
| Incredible. | ||
| Short break. | ||
|
unidentified
|
We will fight till they're all gone. | |
| We rejoice when there's no more. | ||
| Let's take down the CCP. | ||
| Here's your host, Stephen K. Bannon. | ||
| Okay. | ||
| There was so much we didn't get to this morning. | ||
| We're going to get to it this afternoon. | ||
| Plus, we're going to do more Tina Peters. | ||
| In fact, we'll hold more of the Tina Peters story because we've got to do action, action, action. | ||
| You agree, Brian Kennedy? | ||
| Before I leave you, Brian, Mike Huckabee, who I think has been totally out of control, I've called for him to be recalled before just on some of the stuff he's saying in Israel, which is outlandish, outrageous, and just dead wrong. | ||
| But he did something, particularly as a former naval officer, with Pollard, one of the biggest traitors in the history of the country, one of the spies that did the most damage to our national security. | ||
| He invited Pollard, back in July, when, remember, this was the height of trying to start the Persian War and President Trump bringing it to an end, he had Pollard come unannounced over to the embassy in Jerusalem. | ||
| Should Huckabee be recalled immediately for this action, sir? | ||
| I would think so, Steve. | ||
| Absolutely. | ||
| We're in this weird spot right now where there's not clarity about the law when it comes to whose side somebody is on. | ||
| Pollard gave U.S. intelligence to Israel. | ||
| It didn't matter whether Israel was an ally, didn't matter whether some people argue they should have had that intelligence. | ||
| It was against the law to give a foreign nation U.S. intelligence that was critical to our own national security. | ||
| And so the fact that Huckabee would have him to the White House or to the embassy and talk to him is a very bad sign, especially as you have these senators and congressmen, the seditious six, I guess they're being called, who are telling the military right now, don't follow any orders you don't think are legal. | ||
| You do what you think is right. | ||
| Okay, what about if we have big parts of our military, such as we had with General Milley in the last administration, who decided themselves what they thought was right. | ||
| Maybe we should be sending intelligence. | ||
| Let's say we get into a heated conflict with communist China. | ||
| How about we have somebody in our military send intelligence to communist China? | ||
| Will that be acceptable too? | ||
| We need to send a signal to people that there is the United States that we're going to defend and that there's no wiggle room in that regard. | ||
| And when it comes to our Justice Department, by the way, just a second, when it comes to our Justice Department, it should have been asking the president after those seditious six came on, came on with that video. | ||
| They should have been asking the president, should they be arrested or not? | ||
| Because if that's not betrayal to this country, I'm not sure what is. | ||
| They should be arrested. | ||
| They should be, particularly when you have troops in the field. | ||
| You have 12 to 15,000 sailors and Marines off the coast of Venezuela, one a carrier strike group and the other an Amphibious Ready Group. | ||
| You've got people who got to make decisions. | ||
| No. | ||
| And Pollard, the question of Pollard is not whether he should have been charged or not. | ||
| He got 30 years in prison. | ||
| The question is whether he should have gotten the death penalty or not. | ||
| Israel, I tell you, we don't have an alliance with Israel. | ||
| Israel's not an ally in that sense. This was, you take a 10-by-6 room and you could fill it with what Pollard gave them, for money, by the way. | ||
| We'll get into the Pollard thing more. | ||
|
unidentified
|
Right. | |
| And it's outrageous. | ||
| Brian, you're once again the sage here. | ||
| Where do people go to get your content, sir? | ||
| What are your coordinates? | ||
| Thank you, Steve. | ||
| Brian T. Kennedy 1 on X, Brian T. Kennedy on Getter, and presentdangerchina.org, where the Committee on the Present Danger: China does a lot of very important work. | ||
| We have a webinar starting at 1 o'clock to discuss many of the things the CCP is up to today. | ||
| Always dark, the CCP, the existential threat to the Chinese people, the laobaixing, and the American people. | ||
| Brian Kennedy, thank you. | ||
| Good luck on your seminar. | ||
| You're welcome. | ||
| Thank you, Steve. | ||
| Thank you. | ||
| Mike Lindell. | ||
| By the way, so Charlie Kirk, we toss to the Charlie Kirk team at noon. | ||
| Poso after that. Posobiec made a quick trip out. | ||
| He did the Megyn Kelly show, I guess, in Bakersfield, California last night. | ||
| I think he's back already. | ||
| Took a red eye back. | ||
| Steve is going to follow that. | ||
| Eric's going to follow that. | ||
| And then back to, so you've got Gruber, Bolling, and Bannon back here at 5 o'clock. | ||
| Show is going to be lit because you know why? | ||
| We're going to jam everything we didn't get to this morning. | ||
| Plus, we have other stuff this afternoon that will also roll in to tomorrow morning. | ||
| So I want to make sure everybody's back here at 5 p.m. Eastern Standard Time. | ||
| Well, I'll be standing watch. | ||
| Mike Lindell, this has been a hard one today, folks. | ||
| Mike, very tough one. | ||
| Give us a deal. | ||
| Get us focused on your deals. | ||
| Absolutely. | ||
| You guys, and by the way, you guys, this time of year, I don't know if everybody knows this, but everybody raises their shipping prices. | ||
| All the shippers do. | ||
| They ship all of our products, and they're already high enough. | ||
| And this will start on Monday. | ||
| But we're going to do this with MyPillow: we're going to keep giving you these early Black Friday specials through the weekend with free shipping. | ||
| You get free shipping no matter what you order from MyPillow today. | ||
| It's an early Black Friday special. | ||
| The one you have all hit very hard is the My Slippers. | ||
| Normally $149, the best slippers ever made with Impact Gel, $59.98. | ||
| We take another $20 off for the War Room, $39.98 a pair, and you get a free bottle of leather spray with that and the free shipping. | ||
| Also, remember, we're doing the other early Black Friday special, which is all of our blankets that came in last week. | ||
| You guys are loving them. | ||
| I'm getting great reviews. | ||
| Five different kinds of blankets from throw blankets to your bed. | ||
| All sizes are in, all the colors. | ||
| And we also have duvet pillow shams for your pillows. | ||
| They're normally $69.98; for the War Room Posse, $12.88 a set. | ||
| So do all your shopping there and get the comforters. | ||
| Go to mypillow.com forward slash warroom. | ||
| There's that big free shipping sign. | ||
| You guys get it today on anything you order. | ||
| All of our pillows, we put them up just for the War Room Posse. | ||
| We added them to the early Black Friday special. | ||
| And then I want to bring up the other one: the five Bible pillows, you guys. | ||
| This is five gifts in one. | ||
| The go-anywhere pillows with the Bible stories and pictures for your grandkids, your children. | ||
| Those you get for $29.98 for the set of five with free shipping. | ||
| And I want to tell you this. | ||
| We just got in. | ||
| If you guys have been waiting on the Giza Dream sheets, you were waiting on your size. | ||
| They just came in two days ago, you guys. | ||
| So take advantage of that free shipping on that and get those before they're gone. | ||
| They go very fast. | ||
| Our flagship, the best sheets ever, 800-873-1062. | ||
| Talk to one of my home reps. | ||
| Use that promo code Warroom for free shipping. | ||
| Thank you, sir. | ||
| I appreciate it. | ||
| See you at 5 o'clock. | ||
|
unidentified
|
Billy Joe Shaver with a classic. | |
| Get thee behind me, Satan. | ||
| Think about it. | ||
| See you back here at 5. | ||
|
unidentified
|
I could see my loved ones weeping as they lowered me in the ground. |