All Episodes
Sept. 18, 2025 - Bannon's War Room
50:05
Episode 4788: If Anyone Builds It, Everyone Dies
Participants
Main voices
nate soares - 07:37
peter navarro - 08:09
steve bannon - 16:11
Appearances
eliezer yudkowsky - 04:53
frank turek - 04:49
joe allen - 02:22
shemane nugent - 02:47
Clips
mike lindell - 00:49

steve bannon
Do we have time before we get our new guest up or can we play the Charlie Kirk Doctor?
unidentified
I want to.
Okay, I want to play.
steve bannon
This was very moving.
It came out last night.
It was going to be the end of our cold open.
unidentified
But as you saw, we cut in the cold open.
steve bannon
So let's go ahead and play this.
unidentified
I want everybody to hear this.
steve bannon
Let's go ahead and play it.
frank turek
Now, here's what Erica wants me to relate on Sunday.
unidentified
This is going to be the hard part, but maybe also the comforting part.
Charlie Kirk was literally like a son to me.
frank turek
I have three sons.
He was like my fourth son.
unidentified
My three sons are a little bit older than Charlie.
He was like my fourth son.
So when he was hit, if your son got hit, what would you do?
What would you do?
I got in the car because if there was any way I could save him, I had to do something.
I couldn't just, we just take him.
You guys got it.
So they got him into the side of the car.
It was an SUV.
It was the SUV, we took over.
And I'm on one side, and there's actually some video.
There's somebody who's taking video of this.
frank turek
I'm on one side of the car, the right side, and they're getting Charlie in.
unidentified
So I run over to the other side, but the guy was dragging him in.
They're still blocking that entrance.
So at that point, I run around to the back.
frank turek
I pop the top, the back gate open, and I jump in the back.
unidentified
The car lurches forward.
Apparently, somebody jumped in the car.
frank turek
So the car lurches forward.
unidentified
So I almost fall out of the car or the SUV.
frank turek
Then I grab the thing and close it.
unidentified
And there's five of us in the car now.
Justin is driving.
frank turek
Dan is up front with the GPS.
unidentified
Rick has got him.
frank turek
Rick's on my left.
unidentified
And Brian is there.
And I'm coming over the back seat.
And Charlie's laid out in front, just right in front of me.
And Charlie's so tall, we can't close the door.
We drove four miles, I don't know, it's four something miles all the way to the hospital with the door open.
To this day, I don't know how Brian stayed in the car because we're just go, go, go, go, go.
You know what we're trying to do?
We're trying to stop the bleeding.
frank turek
You saw it.
unidentified
And I'm yelling, come on, Charlie, come on, come on.
Meanwhile, my phone is still on.
My son and daughter-in-law are here in this whole thing.
And his security team, again, Justin, Dan, Brian, and Rick, they love Charlie, but they're much cooler than I.
I mean, they're just carrying out, they're calmly, but they're swiftly doing exactly what they were trained to do.
Rick starts praying out loud.
frank turek
I'm praying out loud.
unidentified
We're yelling, come on, let's go, let's go, let's go.
My son's hearing all this, and we're doing the best we can to navigate traffic.
frank turek
It's not a highway.
We're on surface streets.
unidentified
And suddenly there's an ambulance coming toward us.
And there was conversation in the car.
Should we stop?
frank turek
We're like, no, no, just keep going.
Just keep going.
The doctor later said that was the right thing to do.
Ambulance goes by us.
unidentified
We're still heading to the hospital trying to get there.
At one point, somebody says, let's get there in one piece because we're just, we're cutting through intersections, you know, just beeping the horn.
This is not an emergency vehicle.
frank turek
There's no, there's no lights.
There's none of this.
unidentified
And I go, we got to start CPR.
So I try and start that.
Now, Charlie wasn't there.
frank turek
His eyes were fixed.
unidentified
He wasn't looking at me.
frank turek
He was looking past me right into eternity.
unidentified
He was with Jesus already.
He was killed instantly and felt absolutely no pain.
frank turek
That's what I was told later.
unidentified
But of course, we had to try.
And by the way, there was just nothing, nothing any of us could do about it.
We were giving him CPR, but nothing was happening.
It wasn't like if we had better first aid or we had better medical facilities or we were faster to the hospital, we could have saved him.
frank turek
We couldn't.
unidentified
So if that's any comfort at all, Charlie didn't suffer.
He was gone.
He was with Jesus, absent from the body, present with the Lord.
That's where he was.
Now, it is true when we got to the hospital and they started working on him right away.
They did get a pulse back.
And so Rick and I were just, everyone's praying.
We're just praying for a miracle.
We had a small sliver of hope.
And the doctor later said that we got a pulse because Charlie was a very healthy man, but the shot was catastrophic.
frank turek
So 20 or 30 minutes later, the surgeon came out and said he was dead.
unidentified
Thursday, 18th September, Year of the Lord, 2025.
We had a break for the press conference at Chequers, and Dr. Peter Navarro had to take a meeting over in the West Wing.
He's going to rejoin us approximately 11:30 to go through that.
Charlie Kirk, this is why, huge announcement overnight: the President of the United States is going to designate Antifa and affiliated organizations a major terrorist organization.
And this will now expand from what the authorities out in Utah were doing, which is a murder, just a typical murder case, to something that's going to really get to the bottom of it.
And I think you see, and you know my opinion on this ridiculous set of text messages they're trying to foist on us, which is absurd to do.
We'll break that more down this afternoon.
Alex Bruesewitz is going to join us.
I think he's over at Turning Point right now; he joins this afternoon.
I want to bring them on because of what happened at this press conference. There was this massive deal, excuse me, before the press conference earlier today. Some of the audience might not have caught it, but there was a huge announcement of this transaction for nuclear power, $400 billion, and all this technology we're going to provide.
And it's all down to about artificial intelligence.
I want to bring on the authors of a book that I think is a must-read for every person in this nation.
steve bannon
If anyone builds it, everyone dies.
unidentified
It's about artificial intelligence and two of the individuals that have been there from the beginning and know both the benefits and huge upside of artificial intelligence, but also the potential downside.
And is there enough risk mitigation?
steve bannon
Eliezer Yudkowsky and Nate Soares, Yud and Nate.
unidentified
Thank you for joining us here in the war room.
steve bannon
I appreciate it.
unidentified
First question, Yud, I think it was you.
Weren't you the guy just like a year or so ago?
And Joe Allen, I was going through some stuff with Joe Allen.
Weren't you the guy that said, hey, if this thing gets so out of control, I will be the first to go bomb the data centers?
Was that you?
No.
I was saying that we needed an international treaty, and claiming that the international treaty needed to be willing to be backed up by force, including on non-signatory nations.
eliezer yudkowsky
This wasn't about individuals trying to stop a data center.
unidentified
I don't think that's going to be all that effective.
You take out one, there's others.
Ban it in your own country, it moves to others.
eliezer yudkowsky
This is an international treaty situation.
unidentified
It's not the kind of product which just kills the voluntary customers or even people standing next to the voluntary customers.
This is something that endangers people on the other side of the planet.
So I was trying to be plain.
eliezer yudkowsky
This is the sort of treaty that needs to be backed up by force.
unidentified
It has to be.
So, yeah, that is a, it's kind of a brilliant concept, right?
And you're saying that even non-signatories would have to be taken out.
steve bannon
What is it?
unidentified
Make your case.
steve bannon
What is it about this technology that potentially could be so dangerous to humanity?
unidentified
Because all we hear, all we get, we get glazed every day.
We get glazed at how this is so wonderful and this is so tremendous.
And you've got Larry Fink, you got Steve Schwartzman, you have the head of Apple.
They're all over there cheering this on because of artificial intelligence.
You know, Oracle's now on fire because of artificial intelligence.
All we hear is the upside.
Why are you, as somebody who knows this and has been there since the beginning, so concerned about it that you think there has to be a treaty, that potentially nations of the world would have to take it upon themselves, or in unison, to go take out the data centers, sir?
There's a limit to how far you can push this technology, how much benefit you can get out of it before it stops being a tool.
We're already starting to verge on that.
eliezer yudkowsky
We're already starting to see the artificial intelligences that are being built today doing minor amounts of damage that their builders didn't intend. The people who build these, you know, they don't craft them.
unidentified
They don't like put them together bit by bit.
eliezer yudkowsky
They grow them.
unidentified
It's like a farmer raising an AI.
And at some point, it stops being a tool.
eliezer yudkowsky
At some point, it gets smarter than you, able to invent new technologies we don't have.
unidentified
And at that point, I think that the current minor incidents of loss of control are going to turn into catastrophic end of the world type incidents.
eliezer yudkowsky
It's not a new idea, really.
I think it's just actually going to happen that way.
unidentified
Explain to me, I want to make sure the audience understands this.
You guys say AI is grown, not crafted.
steve bannon
It's not like things we've known before.
Nate, maybe one of you guys take that on.
unidentified
Just explain to our audience why this is fundamentally different than computer programs and neural networks and other things that have been crafted in the past.
Yeah, so when an AI cheats on a programming problem, which we're starting to see a little bit of that these days, this is a case where the AI has, in some sense, it's been trained to succeed at certain types of tasks and does it in ways.
What am I trying to say?
Sorry.
There's no line in the code that is make this AI cheat on the programming problems.
When the AIs do things nobody wants, nobody asks for, there's no line in the code that a programmer can go in and fix and say, whoops, we did that wrong.
nate soares
These AIs, we gather a huge amount of computing power and we gather a huge amount of data and we shape the computing power to be better at predicting the data.
unidentified
Humans understand the process that does the shaping.
Humans don't understand what comes out of the shaping process.
These things are, and the result, the result that comes out of the shaping process, it does all sorts of things that we aren't asking for these days.
nate soares
You know, we've seen them threaten reporters.
unidentified
We've seen them cheat on programming problems.
nate soares
These are small now.
unidentified
They're cute now.
But they're indications that we are starting to get AIs that have something a little bit like drives, something a little bit like goals that we didn't intend and didn't ask for.
We keep making them smarter and they have goals we didn't want, that's going to end really quite poorly.
Right now, the drive, and correct me if we're wrong, the four horsemen or five horsemen of the apocalypse, the companies driving this, all of them are committed to artificial general intelligence.
And correct me if I'm wrong, I believe you guys' thesis is that, since we don't really understand artificial intelligence that well, hurtling towards superintelligence or AGI will be uncontrollable and lead to a catastrophe and potentially the end of humankind.
Is that basically your central argument?
That's right.
It's not that the AI will hate us.
It's not that it'll have malice.
It's that we're just growing these things.
nate soares
There is no established science for how to make smarter-than-human machines that aren't dangerous.
unidentified
If we keep pushing on making AI smarter and smarter, while not having any ability to direct them to do good things, the default outcome is just these AIs get to the point where they can invent their own technology, to the point where they can build their own infrastructure, and then we die as a side effect.
How are you guys being received?
I hear these voices now.
We're trying to get many of them organized and get a bigger platform.
But if you look at the business press, if you look at just the general news, if you look at what's coming out of Washington, it's all this super cheerleading.
You just saw it today at Chequers.
I mean, I don't know if you saw the earlier announcement about the nuclear power plants, but it was all about AI, AI, AI, and Starmer is just sitting there with his pom-poms out.
We got a couple of minutes in this segment.
I'd like to hold you through it.
Why is it people like you guys who are very involved and know this and have been there since the beginning, why are these voices not getting a bigger platform right now?
So I think a lot of people don't understand the difference between chatbots that we have today and where the technology is going.
The explicitly stated goal of these companies, as you said, is to create smarter-than-human AI, to create AI that can outperform humans at any mental task.
Chatbots are not what they set out to make.
nate soares
They are a stepping stone.
unidentified
Five years ago, the computers could not hold a conversation.
Today they can.
I think a lot of people in Washington think that that's all AI can do and all it will be able to do, just because they haven't seen what's coming down the line.
And it can be hard to anticipate what's coming down the line.
I am hopeful that as people notice that we keep making these machines smarter, as they notice what these companies are racing towards, I'm hopeful that they'll realize we need to stop rushing towards this cliff edge.
Do you think we got about 60 seconds here, Nate?
steve bannon
Do you think that that's being presented on Capitol Hill or anywhere in the media right now?
unidentified
That what you're saying needs to be done is being done?
Not very well, but I'm very glad we're having this conversation, and I'm hoping that the book makes a big splash because a lot of people, you know, a lot more people are worried about these dangers than you might think.
And we've spoken to some people who say they're worried and say they can't talk about it because they would sound a little too crazy.
And then we see polls saying that lots of the population is worried.
nate soares
So I think this is a message that has its moment to break out.
unidentified
And I'm hoping the world wakes up to this danger.
Guys, can you hang on for one second?
steve bannon
We'll just take a short commercial break.
Nate and Yod are with us.
unidentified
Their book, If Anyone Builds It, Everyone Dies.
steve bannon
It is an absolute must-read.
You have two people that are experts and could have benefited economically, as some of these folks are.
There is something to unite this country, and you can unite this country around questions about the oligarchs and big tech and exactly what's going down here.
unidentified
Who benefits from it?
Who's driving it?
Does it have enough regulation?
Does it have enough control?
Is it putting the country's interest and human interest before corporate interest in the making of money?
Short commercial break.
Back with Nate and Yud in a moment.
Are you on Getter yet?
No.
What are you waiting for?
frank turek
It's free.
unidentified
It's uncensored.
And it's where all the biggest voices in conservative media are speaking out.
Download the Getter app right now.
steve bannon
It's totally free.
It's where I put up exclusively all of my content 24 hours a day.
You want to know what Steve Bannon's thinking?
unidentified
Go to Getter.
That's right.
You can follow all of your favorites.
Steve Bannon, Charlie Kirk, Jack Posobiec, and so many more.
Download the Getter app now.
Sign up for free and be part of the new thing.
So, guys, Yud and Nate join us.
The book, If Anyone Builds It, Everyone Dies.
It is a warning and a deeply thought-through warning to humanity and to the citizens of this republic.
steve bannon
Guys, we've been accused a lot on the war room of being Luddites, that we just don't like technology.
unidentified
We don't like the oligarchs.
We don't like the tech bros.
We're populist nationalists.
We just don't like them, right?
And so we want to stop them.
But I keep telling people, I say, hey, in this regard, all I'm doing is putting out the information of some of the smartest guys that were there at the beginning of the building of artificial intelligence.
steve bannon
Now, can folks make the argument against you guys that you just have become proto-Luddites because of your journey?
unidentified
You know, there are many technologies that I am quite bullish on.
nate soares
I personally think America should be building more nuclear power.
unidentified
I personally think we should be building supersonic jets.
It's different when a technology risks the lives of everybody on the planet.
And this is not a very controversial point these days.
You have the Nobel Prize-winning founder of the field saying he thinks this is very dangerous.
nate soares
You have people inside these labs saying, yes, it's very dangerous, but we're going to rush ahead anyway, even at the risk of the entire world.
unidentified
This is a crazy situation.
If a bridge was going to collapse with very high probability, we wouldn't say, well, we need to leave it up for the sake of technology.
When NASA launches rockets into space, it accepts a one in 270 risk of that rocket blowing up.
And those are volunteer astronauts on the crew.
nate soares
That's the sort of risk they're willing to accept.
unidentified
With AI, we are dealing with much, much higher dangers than that.
We're dealing with a technology that just kills you.
We're trying to build smarter-than-human machines while having, well, having no idea of how to point them in a good direction.
To imagine that this is Luddism, we're not saying, oh, this is going to have some bad effect on the everyday worker losing a job.
AIs may have those effects, the chatbots may have those effects, but a superintelligence, it kills everybody.
Then you'll have no jobs.
You'll also have full employment.
nate soares
But that's just the default course this technology takes.
unidentified
It's not that humanity cannot progress technologically.
It's that we shouldn't race over a cliff edge in the name of technology.
nate soares
We need to find some saner way forward towards the higher tech future.
unidentified
Okay, so Yud, to Nate's point, you know, we had this thing in England today where they're full on, all in on nuclear power, because we need it.
You know, all the climate change guys, forget that.
We need this because we've got to have artificial intelligence.
steve bannon
You have this concept of a treaty like maybe the Geneva Treaty after the wars to have nations of the world be prepared to take action here.
unidentified
However, all you hear, we're the leader of the anti-CCP movement in this country, have been for years.
I'm sanctioned fully by the Chinese Communist Party and can't have any association with any Chinese company or people because we represent the laobaixing, the little guy in China.
What they're holding up to us is that if we follow Yud and Nate and what they want to do in the War Room, then the CCP, particularly after DeepSeek, what they call the Sputnik moment, the Chinese Communist Party is going to build this technology, and then we're going to have the most evil dictatorship in the world have control of the most evil technology ever created.
And we're going to be at their beck and call.
steve bannon
Your response, sir.
unidentified
We have not advocated that the United States or the United Kingdom, for that matter, try to unilaterally relinquish artificial intelligence.
Our position is that you need an international treaty.
You need verification.
The Chinese government has at least overtly seemed on some occasions to present a posture of being willing to acknowledge that AI is a worldwide matter and that there might be cause for coordination among nations to prevent the Earth itself from being wiped out.
And this is not a new situation in politics.
eliezer yudkowsky
We have had countries that hated each other work together to prevent global thermonuclear war, which is not in the interest of any country.
unidentified
If you look back at history, in the 1950s, a lot of people thought humanity wasn't going to make it, or at least civilization wasn't going to make it, that we were going to have a nuclear war.
And that wasn't them enjoying being pessimistic.
eliezer yudkowsky
There was this immense historical momentum.
unidentified
They'd seen World War I, World War II.
They thought that what was going to happen is that every country is going to build its own nuclear fleet and eventually there would be a spark and the world would go up in flames.
And that didn't happen.
And the reason it didn't happen is because of some pretty serious efforts put forth by countries that in many cases hated each other's guts to at least work together and not all dying in a fire.
And that's what we need to reproduce today.
Nate, before I let you guys bounce, can you explain the title of the book?
It's pretty grabbing, but it's scary.
If anyone builds it, everyone dies.
What do you guys mean by that?
I mean, what we mean is that if humanity builds smarter than human AI using anything remotely like the current technology and anything remotely like the current lack of understanding, then every human being will die.
Doesn't matter if you're in a bunker.
Super intelligence can transform the world more than that.
And this title also goes back to the point about China.
It's not that great dictators would be able to control this great power if they made it.
If you make a super intelligence, you don't have that super intelligence.
You have just created an entity that has the planet.
Artificial intelligence would be able to radically transform the world.
And if we don't know how to make it do good things, we shouldn't do it at all.
nate soares
And we are nowhere near being able to point superintelligence in a good direction.
unidentified
So humanity needs to back off from this challenge.
Joe Allen, any comments?
steve bannon
All I can say is the forces of capital, the forces of politics, human avarice and greed, and also the need for power makes this, you know, we fight big pharma.
unidentified
The fights we have every day are massive against long odds.
We've won more than we've lost.
steve bannon
But I tell people, this is the hardest one I've ever seen because of what's happened.
And I said at the time, when we had the thing at Davos, when ChatGPT came out, I said, you wait till venture capital and Wall Street gets involved.
unidentified
Any thoughts, Joe Allen?
You know, Steve, it would be a very different thing if Eliezer Yudkowsky and Nate Soares were making these accusations, or at the very least issuing these warnings, in a vacuum.
If the tech companies, for instance, were just simply saying, we're building tools, these guys are accusing us of building gods, they're crazy, it'd be a very different situation.
But that's not the situation.
joe allen
Every one of them, even the most moderate, like Demis Hassabis, but certainly Sam Altman, Elon Musk, even Dario Amodei, they all are talking about the creation of artificial general intelligence and artificial superintelligence.
unidentified
And so when we first started covering this, when you first brought me on four and a half years ago, we hit a lot of the points that Yudkowsky was making.
We would show videos and try to explain to the audience, and they, by and large, didn't really grasp the reality of it because it wasn't as much of a reality four and a half years ago.
In just that short amount of time, we've seen the creation of systems that can competently produce computer code.
We saw at the very beginning, GPT was not supposed to be online.
Very quickly, that ended.
Basically, all the warnings that Yudkowsky gave early on when AI was hitting the headlines, those are coming to pass.
My question for Yudkowsky would be this, and for Soares: you live in and among the most techno-saturated culture in the country, San Francisco.
Can you give us some insight into the mentality of the people who are willing to barrel ahead no matter what and create these systems, even if that means the end of the human race?
So some of them have just come out and said, you know, I had a choice between being a bystander and being a participant, and I preferred being a participant.
That is in some sense enough to explain why these people are barreling ahead, although I think in real life, there's a bunch of other explanations too, like the people who started these companies back in 2015 are the sort of people who are able to convince themselves it would be okay to gamble with the whole civilization like this.
You know, we've seen comments like, back in 2015, I believe Sam Altman said something like, AI might kill us all, but there'll be good companies along the way.
Or I think he maybe even said artificial general intelligence will probably kill us all, but there will be great companies made along the way.
I don't know the exact quote, but that mentality, it's not someone taking seriously what they're doing.
It's not someone treating with gravity what they're doing.
This wouldn't be an issue if they couldn't also make greater and greater intelligences.
But in this world where we're just growing intelligences, where people who don't know what they're doing and are the most optimistic people that were foolish enough to start the companies can just grow these AIs to be smarter and smarter, that doesn't lead anywhere good.
Yud, any comments?
You know, when Geoffrey Hinton, now Nobel laureate Geoffrey Hinton, sort of woke up and noticed that it was starting to be real, he quit Google and started speaking out about these issues more openly.
Who knows how much money he was turning down by doing that, but that's what he did.
And that's one kind of person that you have on the playing field.
And then you've also got the people who were selected and filtered for being the sort of people who would, back when OpenAI started, go over to Elon Musk and say, you know how we can solve the problem of these things we can't control?
eliezer yudkowsky
We can put them in everyone's house.
unidentified
We can give everyone their own copy.
eliezer yudkowsky
And this was never valid reasoning.
unidentified
This was always kind of moon logic, but they sure got Elon's money and then took it and ran off.
And that's just the kind of people we're dealing with here.
Guys, can you hang on for one second?
I just want to hold you through the break because I want to give people access to how to get this book, where to get it.
Your writing, social media, all of it.
Yud and Nate, heroes.
Very hard work they're doing.
Very, very, very, very hard.
Short commercial break.
steve bannon
We'll be back in a moment.
unidentified
Here's your host, Stephen K. Bannon.
Nate and Yud.
By the way, just before I let you guys go and give your coordinates and tell people how to buy this amazing book, purchase this book, which we're going to break down and spend more time on, folks, in the days ahead.
There's a movie called Mountainhead, which basically has actors playing Elon Musk, Steve Jobs, I think Zuckerberg, and Altman.
And it's actually kind of dark to begin with, but it turns very dark when they identify one of them as a decelerationist.
Are you guys, Nate, you first?
Are you a decelerationist about this?
I would decelerate AI.
I would decelerate any technology that could wipe us all out and prevent us from learning from mistakes.
Every other technology, I think we need to go full steam ahead and are sometimes hobbling ourselves.
But AI in particular, any technology that kills everybody and leaves no survivors.
You can't rush ahead on that.
Yud, are you a decelerationist?
I've got libertarian sympathies.
eliezer yudkowsky
If a product only kills the voluntary customers and maybe the person who sold it, you know, that's kind of between them.
unidentified
Yeah, I might have sympathy, but not to the point where I try to take over their lives about it.
If a product kills people standing next to the customer, it's a regional matter.
Different cities, different states can make different rules about it.
If a product kills people on the other side of the planet, that's everybody's problem.
And, you know, yeah, you don't have to agree with me on this part to want humanity to not die here, but I would happen to go full steam ahead on nuclear power.
Yeah, it's just a special case here.
eliezer yudkowsky
Artificial intelligence, you know, gain of function research on viruses might be another thing.
But, you know, it does actually differ by the technology.
unidentified
There's not this one switch that's set to accel or decel.
Yud, by the way, what's your social media?
steve bannon
I might add, we were the first show, in January of 2020, to say that what the University of North Carolina was doing on gain-of-function was a danger to humanity, and we were laughed at by the mainstream media as being conspiracy theorists.
unidentified
Yud, what are your coordinates?
What's your social media?
steve bannon
How do people follow you, your thinking, your writing?
unidentified
ESYudkowsky on Twitter.
Thank you, sir.
Nate, where do people go to get you?
Yeah, I'm S-O-8-R-E-S on Twitter.
Thank you, guys.
Actually, we're very honored to have you on.
Look forward to having you back and break down this book even more.
Everybody ought to go get it.
If anyone builds it, everyone dies.
A landmark book everyone should get and read.
steve bannon
Thank you guys for joining us in the war room.
unidentified
Joe Allen's going to stick with us.
steve bannon
I'm going to go back to Joe in a moment.
unidentified
Shemane, you're an ambassador.
First off, you're one of the ambassadors of Turning Point.
steve bannon
Give me your thoughts of what's evolved this week.
unidentified
And, you know, we've got Charlie's funeral or celebration of life on Sunday.
You knew him extremely well.
And you do the faith show here on Real America's Voice, but you're an ambassador at Turning Point.
Thoughts about this tragic week?
It's been horrific for so many people.
It's been a turning point for not just America, but for the world.
And we saw Charlie as the last good guy.
And for this to happen to him is just devastating to so many people.
shemane nugent
And so many people are wondering, what do we do?
unidentified
What do we do with all this anger and sadness?
And I say silence is not an option at this point.
We must move forward.
We must carry that torch that Charlie gave us.
So that's what I'm trying to do with my Faith in Freedom show right here on Real America's Voice.
And I appreciate that.
I think I am the oldest turning point ambassador.
I have to be.
I don't know anybody older.
But hang on, that's just biological.
That's chronological or biological or chronological age.
It's certainly not biological age.
steve bannon
You've got more energy.
unidentified
You've got more energy than 20 young people.
How do you do it?
I mean, I see where you're going on.
steve bannon
You're an ambassador.
unidentified
Hey, I'm just calling it.
I'm just calling balls and strikes here.
Well, it's true.
steve bannon
Tell me what keeps you so young.
unidentified
Besides your husband, who I know is young, he's young at heart, or it's like parenting a young child.
Besides Ted, what keeps you young?
Steve, that's a whole nother podcast, okay?
About Ted and trying to stay young.
But I think you're right.
There's a study recently about epigenetics, which is the science that shows your DNA is not your destiny.
None of us eat right, so we all take supplements, right?
shemane nugent
Most of us do.
unidentified
But there's so many different fruit and vegetable supplements on the market.
And if you study their ingredients, which I have, I'm a label reader.
shemane nugent
It's just common produce with limited nutritional value.
unidentified
There's a product called Field of Greens, and it's different.
shemane nugent
And they wanted to prove that it was different by doing a university study where each fruit and vegetable in Field of Greens is medically selected for health benefits.
There's heart health, lungs, kidney, liver, healthy metabolism, and healthy weight.
unidentified
And in this study, Steve, they wanted to see how diet, exercise, and lifestyle changed your real age.
And this is fascinating to me, but some of the participants, they ate their normal diets, including fast food.
They didn't exercise any more, and they didn't stop drinking.
All they did is add Field of Greens to their daily routine.
And the results were remarkable.
60% of participants showed a measurable reduction in their biological aging markers after just 30 days.
shemane nugent
One scoop of Field of Greens slows the body's aging process at the cellular level.
unidentified
And I think this was what helps me because I've worked out all my life.
I'll be honest, I don't eat right all the time.
So just by taking one scoop of Field of Greens, I can see that aging slow down.
Fieldofgreens.com, promo code Bannon, get 20% off.
steve bannon
It'll ship out today.
unidentified
We hit it every day here at the war room.
Not only does it have all the benefits from this Texas A&M study, but I also get an energy boost, Shemane, just every day.
steve bannon
So that's where we take it.
unidentified
Want to thank you for all you do and particularly being an ambassador and helping the folks over at Turning Point, particularly in this very difficult phase for the movement, for the company, for Erica, the kids, everybody.
steve bannon
So I really want to thank you for joining us today.
unidentified
We have to.
shemane nugent
It's an Esther 4:14 moment.
unidentified
If we remain silent, relief and deliverance is going to come from someplace else.
We were born, Steve, for such a time as this.
Shemane Nugent, thank you very much.
Wisdom and energy, all in one.
Thank you, ma'am.
peter navarro
Appreciate you, Dr. Navarro.
unidentified
You were our co-host.
You were the contributor.
You were the president.
You've been with the president now for 10 years.
steve bannon
You're his economic advisor.
But tell me about, you wrote this piece that's pretty moving.
And I think it's so tied to your book about you went to prison so that we don't have to.
unidentified
Talk to me about Charlie Kirk.
Steve, I really want people to understand the legacy of Charlie Kirk historically.
peter navarro
He could have been president.
unidentified
He certainly could have been a governor.
But already, at 31 years old, he's the greatest political organizer in the last 50 years.
peter navarro
And if you compare him to the two who were there at the top before Charlie Kirk, Ralph Reed on the right, David Axelrod on the left.
unidentified
What Ralph Reed did with the Christian Coalition is mobilize the Christian right to get out and actually vote.
peter navarro
He was responsible for the Gingrich Revolution in 1994, as well as the Bush win in 2000.
unidentified
And then Axelrod on the left, he was able to mobilize a natural Democrat constituency, blacks, Hispanics, and young people, used micro-targeting, some advanced kind of techniques at the time, and basically won the race for Obama in 2008.
The reason why Charlie is head and shoulders above each one of them is he had a much heavier lift, Steve.
To mobilize the youth in support of MAGA and Trump and MAGA candidates in Congress, he had to first bring them over to our side.
And that was a heavy lift.
And he did it.
When I first met him, Charlie, back in 2016, young kid, thinking he was going to go out there and change the viewpoint of the youth of America.
I thought he was Don Quixote.
I'll be honest with you, Steve.
He proved me wrong.
He proved the world wrong.
And people need to understand father, husband, patriot, just a wonderful human being that's here.
But in terms of pure historical significance, he will go down in history as the greatest political organizer in the last 50 years.
And I don't think anybody's ever going to do again what he did because it's relatively easy to mobilize.
It's very difficult to persuade people over to your side and then mobilize, Steve.
Hang on for a second.
I want to, because you got your PhD at Harvard, then you went back, you taught in the university system.
When I first saw Charlie, I think Breitbart's the first guy to give him a paid gig, but some of the people around Breitbart were what financed him at the very beginning when he was going after student governments.
And I think many people who thought Charlie was just a ball of fire thought it was the longest odds, and you thought so too, because the universities, as we know now, are so based around this kind of radical philosophy.
And the kids are formed all the way from kindergarten all the way up.
So that is, to me, the greatness of Charlie Kirk, that he was able to go in and just do this when so many people said, hey, look, this guy's great.
He's fantastic, but this is Don Quixote.
You're tilting at windmills here.
It just can't happen.
And you knew it better than anybody because you were inside the belly of the beast.
Well, it was brutal.
I spent 25 years at the University of California, Irvine.
peter navarro
And if there's a system that ever was woke, that certainly is it.
unidentified
But what Charlie understood, he didn't start at Harvard and Cornell.
He understood that most of the universities of this country are in flyover country.
And he just rolled his sleeves up.
He was tireless.
peter navarro
He went out there and Socratically, I mean, when I taught in the classroom, I was a big fan of the Socratic method.
unidentified
You can't tell people things.
You've got to have them come to their own conclusions.
And that's how Charlie was able to bring people home to MAGA.
peter navarro
And very keen intellect.
unidentified
Fast forward, it's like when I got out of prison, you know, the day I got out of prison, July 17th, I went to the Republican National Convention, gave the speech.
I went to prison so you won't have to.
peter navarro
The title of the book is actually a tagline from the speech.
unidentified
It means like a wake-up call.
peter navarro
But I mentioned this in the context of Charlie because I didn't even know this.
unidentified
But two days earlier, I saw a clip was shown.
I was on the set shortly after he got killed.
And he was given a speech on my birthday, July 15th, two days before me.
And he said, I visited college campuses so you won't have to.
And it, I mean, the way I just, somehow it struck a warm chord in me.
And I felt like we were fellow travelers.
peter navarro
And, you know, I'm on the campaign trail getting out of prison with the boss, my fiancée Pixie in the book.
unidentified
I call her Bonnie.
You know her well, Steve.
peter navarro
She's been on your show.
unidentified
We'd see Charlie everywhere, right?
Everywhere we'd go.
We went to Georgia, North Carolina.
We were in Pennsylvania.
He was always there.
And then during the transition, he was essentially Sergio Gor's co-pilot there, putting all the personnel together in the administration.
And look, the boss, Charlie was like a son to Donald John Trump, as well as as a key advisor, one of his most trusted advisors.
peter navarro
So he'll be missed.
unidentified
I'm going to try to hitch a ride out on Air Force One on Sunday and be there.
I'm sure you'll be coming.
Hang on one second.
Yeah.
Hang on a second.
We're doing wall-to-wall coverage of it.
I want to hold you through the break because I want to talk about the Balkan for a second.
Back in a moment.
We will fight till they're all gone.
We rejoice when there's no more.
Let's take down the CCP.
Here's your host, Stephen K. Bannon.
Okay.
By the way, gold, retract a little bit today.
steve bannon
It's not the price of gold.
unidentified
It's the process of how you get to the value of gold.
Make sure you take your phone out right now and text Bannon, B-A-N-N-O-N, to 989898 to get the ultimate guide, which happens to be free, on investing in gold and precious metals in the age of Trump.
So go check it out.
steve bannon
We had a rate cut last night, only 25 basis points.
unidentified
President Trump wants more. Steve Miran, the Council of Economic Advisers chair, is now, I guess, the interim governor.
He voted against it, favoring a 50 basis point cut.
Go find out why gold has been a hedge for times of financial turbulence in mankind's history.
Joe Allen, I know you got to bolt.
Just give your coordinates.
You can't be on tonight because you're going to be at one of the conferences.
steve bannon
I'm going to get you back on, hopefully tomorrow.
unidentified
We had a historic interview today on the book, If Anyone Builds It, Everyone Dies.
steve bannon
Very uplifting.
unidentified
Where do people go to get your writing, sir?
If the audience wants to hear the in-depth interview I did with Nate Soares a couple of weeks ago, it's right at the top of my social media at J-O-E-B-O-T XYZ.
Also an article about the hearing two days ago with Josh Hawley and the parents of children who were lured into suicide by AI.
joe allen
That's also up at the top of my social media at J-O-E-B-O-T X-Y-Z, or joebot.xyz.
unidentified
Thank you.
We are backed up on a lot of stuff because the Charlie Kirk situation was obviously a priority, including designating Antifa a terrorist group so we can get to the bottom of all of it and not just have this dealt with by Utah officials as a single murder.
It's much deeper than that, the assassination of Charlie Kirk.
Joe Allen, thank you so much.
Josh Hawley was supposed to be here today, because we had the press conference.
And right there on the screen, you see the president of the United States getting ready to leave to go to the airfield to take Air Force One back.
He'll arrive later tonight.
Of course, we'll be covering all of that.
Peter Navarro, one of President Trump's closest advisors, and I think arguably his longest-serving advisor, I think the only one who's been there from the very, very, very beginning that's still there.
Why should, in a world of all this information and everything going on, as big a hero as you are to this movement, as highly respected as you are by President Trump, because you're kind of the architect with him of the reorganization of the world's trading patterns, why should people buy a book about you and your days in prison, sir?
It's not about me, Steve.
And I would ask the posse to go right now to Amazon.
I went to prison so you won't have to.
peter navarro
The book is really the best analysis of how the left is going after all of this.
unidentified
If they can come for me, they can come for Steve Bannon and put him in prison.
If they can try to put Trump in prison now, they shot Charlie Kirk.
peter navarro
They can do this to you.
And the book takes you inside prison.
unidentified
And I went to prison so you won't have to.
But the broad scope here, Steve, is really an analysis about the asymmetry, the disturbing asymmetry between how the left is waging war on this.
You mentioned everybody I serve with, Steve, including you, has been a target of the left.
At a minimum, they've spent millions of dollars in legal fees, whether it's Mike Flynn or America's mayor Rudy Giuliani.
They took the bar cards of Jeff Clark and John Eastman.
peter navarro
And on the other end, Steve, you and I went to prison.
unidentified
And everybody who put us there, every single person was a Democrat except Liz Cheney.
And that's the exception that proves the rule.
peter navarro
I mean, think about that.
unidentified
How can the Democrats seize power and use that to put us in prison and call us fascists?
How dare they, Steve?
peter navarro
So I went to prison so you won't have to.
unidentified
It's a story about how we must wake up to what's happening.
But it's also a lawfare story.
Look, if you want to find out what it's like to go into prison for a misdemeanor and wind up spending four months with 200 felons, this is the book.
And, you know, Bonnie, my fiancée, it's also a story about how we were able to cope and deal with that.
peter navarro
The message there is simply that she did the time with me.
And that's what happens when people are unjustly targeted by the left.
It's not just you.
unidentified
We can take it, Steve.
You and I are soldiers.
We can take it.
peter navarro
But when they go after our families, that's where you draw the line.
unidentified
So it's a big book, a big story.
And go ahead.
If they steal the election of 2028, trust me, this audience, they're going to be coming for you.
steve bannon
You see now, we're in a different place than we were even back then.
unidentified
This is getting more and more intense every day.
That's why Navarro's book has got to be read, because it's actually for you and about you.
Ain't about Navarro, not about me, not about President Trump.
There are different characters in the story, but the book is about you.
And understand, like I said, there's nothing, there's no compromise here.
There's no unity here.
steve bannon
One side's going to win and one side's going to lose.
unidentified
And if they steal it in 28, they're coming for you.
Peter, where do people go to get your writings?
Where do they go to get, particularly your great piece on Charlie Kirk?
Where do they go to get your book?
Sure.
The book, I went to prison so you won't have to.
It's on Amazon.
It came out two days ago.
Please drive this thing up to bestsellers so we get the message out.
peter navarro
It's our best defense about them targeting us.
I went to prison so you won't have to on Amazon.
unidentified
The piece about Charlie is very close to my heart.
It's on the Washington Times op-ed site today.
I put it out on X at RealPNavarro.
It'll be up on my sub stack on Sunday as we celebrate Charlie on that sacred Sunday that we're about to have.
And you can always go to my Substack, peternavarro.substack.com.
But Steve, I really appreciate what the war room does.
I appreciate being able to come talk about I went to prison so you won't have to.
And C-SPAN is running a long, hour-long interview on Sunday at 8 p.m.
We're going to talk about it on Sunday night.
We'll talk about it with Peter Navarro.
Peter's going to be with us.
Yeah.
You're going to be on Saturday and also you're going to be on Sunday.
steve bannon
We're doing wall-to-wall coverage live from the stadium.
unidentified
We'll give more details.
Peter Navarro, thank you.
steve bannon
Mike Lindell, it's been a long, tough morning for the war room.
They need a deal, sir.
unidentified
What do you got for us?
Best deal everybody.
We've got the three-in-one special.
mike lindell
I'm sitting here back in Minnesota at my factory.
unidentified
All the towels came in.
Remember, they actually work.
The six-piece towel sets, $39.98.
They're normally $69.98.
And then we have the pillows, the Giza Dream pillows, the Giza covers.
All of the sheets are on sale, $29.88.
Once they're gone, they're gone.
These are the percale sheets.
You guys go to mypillow.com forward slash war room, and then you're going to see all the big-ticket items at the website, free shipping on the beds, the mattress toppers, 100% made in the USA.
And people, remember, we have a 10-year warranty and 60-day money-back guarantee, everybody.
Mypillow.com, promo code War.
I'll see you back here at 5.
Megyn Kelly. We toss to the Charlie Kirk Show, two hours of populist nationalism, hosted today by Megyn Kelly.
We'll see you back here at 5 p.m.