All Episodes
Sept. 18, 2025 - Bannon's War Room
50:04
Episode 4788: If Anyone Builds It, Everyone Dies
Participants
Main voices
nate soares
07:37
peter navarro
08:09
steve bannon
16:11
Appearances
eliezer yudkowsky
04:53
frank turek
04:49
joe allen
02:22
shemane nugent
02:47
Clips
mike lindell
00:49

steve bannon
Do we have time before we get our new guest up, or can we play the Charlie Kirk uh doctor?
I want to uh okay.
I want to play.
This was very moving.
It came out last night.
It was going to be the end of our cold open.
But as you saw, we cut it; we cut it in the cold open.
So let's go ahead and play this.
Uh, I want everybody to hear this.
Let's go ahead and play it.
frank turek
Now, here's what Erica wants me to relate on Sunday.
This is going to be the hard part, but maybe also the comforting part.
Charlie Kirk was literally like a son to me.
I have three sons.
He was like my fourth son.
My three sons are a little bit older than Charlie.
He was like my fourth son.
So when he was hit.
If your son got hit, what would you do?
What would you do?
I got in the car.
Because if there was any way I could save him, I had to do something.
I couldn't just take him.
You guys got it.
So they got him into the side of the car.
It was an SUV.
It was the SUV we took over.
And uh I'm on one side, and there's actually some video there.
Somebody's taking video of this.
I'm on one side of the car, the right side, and they're getting Charlie in.
So I run over to the other side, but the guy was dragging him in.
They're now blocking that entrance.
So at that point, I run around to the back.
I pop the top, uh the back gate open, and I jump in the back.
The car lurches forward.
Apparently, somebody jumped in the car.
So the car lurches forward, so I almost fall out of the car with the SUV.
Then I grab the thing and close it.
And there's five of us in the car now.
Justin is driving.
Dan is up front with the with the GPS.
Rick has got him.
Rick's on my left, and Brian is there, and I'm coming over the back seat, and Charlie's laid out in front, just right in front of me.
And Charlie's so tall, we can't we can't close the door.
We drove four miles, some, I don't know, it's four something miles, all the way to the hospital with the door open.
To this day, I don't know how Brian stayed in the car.
Because we're just go, go, go, go, go.
We're you know, we're trying to do, we're trying to stop the bleeding.
You saw it.
And I'm yelling, come on, Charlie, come on, come on.
Meanwhile, my phone is still on.
My son and daughter-in-law are hearing this whole thing.
And a security team, again, Justin, Dan, Brian, and Rick, they love Charlie.
But they're much cooler than I. I mean, they're just carrying out, calmly but swiftly, exactly what they were trained to do.
Rick starts praying out loud.
I'm praying out loud.
We're yelling, come on, let's go, let's go, let's go.
My son's hearing all this.
And we're we're doing the best we can to navigate traffic.
This is not a highway.
We're on surface streets.
And suddenly there's an ambulance coming toward us.
And there was conversation in the car.
Should we stop?
Or like, no, no, just keep going.
Just keep going.
The doctor later said that was the right thing to do.
Ambulance goes by us.
We're still heading to the hospital trying to get there.
At one point, somebody says, let's get there in one piece, because we're just we're cutting through intersections, you know, we're just beeping the horn.
This is this is not an emergency vehicle.
There's no just no lights.
There's none of this.
And I go, we got to start CPR.
So I try and start that.
Now Charlie wasn't there.
His eyes were fixed.
He wasn't looking at me.
He was looking past me right into eternity.
He was with Jesus already.
unidentified
He was killed instantly.
frank turek
And felt absolutely no pain.
That's what I was told later.
But of course, we had to try.
And by the way, there was just nothing, nothing any of us could do about it.
We were giving him CPR, but nothing was happening.
It wasn't like if we had better first aid or we had better medical facilities, or we were faster to the hospital, we could have saved him.
We couldn't.
So if if that's any comfort at all, Charlie didn't suffer.
He was gone.
He was with Jesus, absent from the body, present with the Lord.
That's where he was.
Now, it is true, when we got to the hospital, they started working on him right away.
They did get a pulse back.
And so Rick and I were just everyone's praying.
We're just praying for a miracle.
We had a we had a small sliver of hope.
And the doctor later said that.
We got a pulse because Charlie was a very healthy man.
But the shot was catastrophic.
So 20 or 30 minutes later, the surgeon came out and said he was dead.
steve bannon
Thursday, 18th September, year of our Lord 2025.
We had a break for the press conference, um, at Chequers.
And Dr. Peter Navarro had to take a meeting over in the West Wing.
He's going to rejoin us at approximately 11:30 to go through that.
Charlie Kirk, this is why, huge announcement overnight: the President of the United States is going to designate Antifa and affiliated organizations a, um, major terrorist organization.
And this will now expand from what the authorities out in Utah were doing, which is a murder case, just a typical murder case, to something, uh, that's going to really get to the bottom of it.
And I think you see, and you know, my opinion on this ridiculous set of text messages that they're trying to foist on us, which is absurd, uh, to do.
We'll break that more down this afternoon.
Alex Bruesewitz is going to join us.
I think he's over at Turning Point right now; he'll join this afternoon.
Uh, I want to bring someone on because of what happened, excuse me, before the press conference earlier today. Some of the audience might not have caught it, but there was a huge announcement of this transaction for nuclear power, you know, 400 billion dollars and all this technology we're gonna provide, and it all comes down to artificial intelligence.
I want to bring on the authors of a of a book that I think is a must read for every person in this nation.
If Anyone Builds It, Everyone Dies.
It's about artificial intelligence, and two of the individuals that have been there from the beginning and know both the benefits and huge upside of artificial intelligence, but also the potential downside, and is there enough risk mitigation?
Eliezer Yudkowsky and Nate Soares, Yud and Nate.
Thank you for joining us here in the war room.
I appreciate it.
unidentified
First question, Yud, I think I think it was you.
steve bannon
Weren't you the guy, just like a year or so ago, and Joe Allen, uh, I was going through some stuff with Joe Allen.
Weren't you the guy that said, hey, if this thing gets so out of control, I will be the first to go bomb the data centers?
Was that you?
eliezer yudkowsky
No.
Uh, I was saying that we needed an international treaty, and I was claiming that the international treaty needed to be willing to be backed up by force, including on non-signatory nations.
This wasn't about individuals trying to stop a data center.
I don't think that's you know gonna be all that effective.
Uh you take out one, there's others.
Yeah, you manage it in your own country, it moves to others.
This is an international treaty situation.
You know, it's not it's not the kind of product which just kills the voluntary customers, or even people standing next to the voluntary customers.
This is something that endangers people on the other side of the planet.
So I'm trying to be plain.
This is the sort of treaty that needs to be backed up by force if it has to be.
steve bannon
So, yeah, that is, uh, it's kind of a brilliant concept, right?
And you're saying that even in non-signatory nations they'd have to be taken out.
What is it?
Make your case.
What is it about this technology that potentially could be so dangerous to humanity?
Because all we hear, all we get we get glazed every day.
We get glazed at how this is so wonderful and this is so tremendous, and you got you've got Larry Fink, and you got Steve Schwartzman, uh, you have the uh the head of Apple.
They're all over there cheering this on because of artificial intelligence.
Uh, you know, Oracle's now on fire because of artificial intelligence.
All we hear is the upside.
Why are you, somebody who knows this and has been there since the beginning, so concerned about it that you think there has to be a treaty, that potentially nations of the world would have to take it upon themselves, or in unison, to go take out the data centers, sir?
eliezer yudkowsky
There's a limit to how far you can push this technology, how much benefit you can get out of it before it stops being a tool.
We're already starting to verge on that.
We're already starting to see the artificial intelligences that are being built today doing minor amounts of damage that their builders didn't intend. The people building them don't, you know, craft them; they don't, like, put them together bit by bit.
They grow them.
It's like a farmer raising an AI.
And at some point it stops being a tool.
At some point it gets smarter than you, able to invent new technologies we don't have.
And at that point, I think that the current minor incidents of loss of control are going to turn into uh catastrophic end-of-the-world type incidents.
It's not a new idea, really.
I think it's actually gonna happen that way.
steve bannon
Explain to me, I want to make sure the audience understands this.
You guys say AI is grown, not crafted.
It's not like things we've known before.
Nate, maybe but one of you guys take that on.
Tell, just explain to our audience why this is fundamentally different than computer programs and neural networks and other things that have been crafted in the past.
nate soares
Yeah, so when an AI, uh, cheats on a programming problem, which we're starting to see a little bit of these days, this is a case where the AI has, uh, in some sense, it's been trained to succeed at certain types of tasks, and it does it in ways, uh, what am I trying to say?
Sorry.
Uh, no, there's no line in the code that says make this AI cheat on the programming problems.
When the AIs do things nobody wants, nobody asked for, there's no line in the code that a programmer can go in and fix and say, whoops, we did that wrong.
These AIs, we gather a huge amount of computing power and we gather a huge amount of data and we shape the computing power to be better at predicting the data.
Humans understand the process that does the shaping.
Humans don't understand what comes out of the shaping process.
And the result, the result that comes out of the shaping process, it does all sorts of things that we aren't asking for these days.
You know, we've seen them uh threaten reporters.
We've seen them cheat on programming problems.
These are small now, they're cute now, uh, but they're indications that we are starting to get AIs that have something a little bit like drives, something a little bit like goals that we didn't intend and didn't ask for.
If we keep making them smarter and they have goals we didn't want, that's going to end really quite poorly.
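Nate's description of training, gathering compute and data and shaping the one to predict the other, can be illustrated with a toy sketch (illustrative only, not any lab's actual code): a tiny next-token predictor whose behavior emerges from gradient descent on data rather than from any explicit line a programmer wrote.

```python
import numpy as np

# Toy version of "growing" rather than crafting an AI: a next-token
# model whose behavior is shaped by gradient descent on data. No line
# below specifies what the trained model ends up doing.
rng = np.random.default_rng(0)
vocab = 5
data = rng.integers(0, vocab, size=1000)   # stand-in for a training corpus
x, y = data[:-1], data[1:]                 # task: predict each next token
W = np.zeros((vocab, vocab))               # all learned "knowledge" lands here

losses = []
for step in range(200):
    logits = W[x]                                              # model outputs
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    losses.append(-np.log(probs[np.arange(len(y)), y]).mean()) # prediction loss
    grad = probs.copy()
    grad[np.arange(len(y)), y] -= 1        # d(cross-entropy)/d(logits)
    gW = np.zeros_like(W)
    np.add.at(gW, x, grad)                 # accumulate per-token gradients
    W -= 0.1 * gW / len(y)                 # shape the weights to predict better

print(round(losses[0], 3), round(losses[-1], 3))  # loss falls as W is shaped
```

The point of the sketch is the asymmetry Nate describes: we fully understand the shaping process (the loop), but what the process produces (the contents of `W`) is whatever best predicted the data, not anything anyone specified line by line.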
steve bannon
Right now, the drive, and correct me if I'm wrong, the four horsemen or five horsemen of the apocalypse, the companies driving this, all of them are committed to artificial general intelligence.
And correct me if I'm wrong, I believe your thesis is that, since we don't really understand artificial intelligence that well, uh, hurtling towards superintelligence or AGI, uh, will be uncontrollable and lead to a catastrophe and potentially the end of humankind.
Is that basically your central argument?
nate soares
That's right.
It's not that the AI will hate us.
It's not that it'll have malice.
It's that we're we're just growing these things.
There is no established science for how to make smarter than human machines that aren't dangerous.
If we keep pushing on making AI smarter and smarter while not having any ability to direct them to do good things, the default outcome is just these AIs get to the point where they can invent their own technology, to the point where they can build their own infrastructure, and then we die as a side effect.
steve bannon
How are you guys being received?
I hear these voices now.
We're trying to get many of them organized and get a bigger platform.
But if you look at the business press, if you look at just the general news, if you look at what's coming out of Washington, it's all this super cheerleading.
You just saw it today at Chequers.
I mean, I don't know if you saw the earlier announcement about the nuclear power plants, but it was all about AI, AI, AI, and Starmer just sitting there with his pom-poms out.
We've got a couple of minutes in this segment.
I'd like to hold you through it.
Why is it that people like you guys, who are very involved and know this and have been there since the beginning, why are these voices not getting a bigger platform right now?
nate soares
So I think a lot of people don't understand the difference between chatbots that we have today and where the technology is going.
The explicitly stated goal of these companies, as you said, is to create smarter than human AI, to create AI that can outperform humans at any mental task.
Chatbots are not what they set out to make.
They are a stepping stone.
They, you know, five years ago, the computers could not hold a conversation.
Today they can.
I think a lot of people in Washington think that that's all AI can do and all it will be able to do just because you know they haven't seen what's coming down the line.
And it can be hard to anticipate what's coming down the line.
I am hopeful that as people notice that we keep making these machines smarter, as they notice what these companies are racing towards, I'm hopeful that they'll realize we need to stop rushing towards this cliff edge.
steve bannon
We got about, uh, sixty seconds here, Nate.
Do you think that's being presented on Capitol Hill or anywhere in the media right now?
That what you're saying needs to be done is being done?
nate soares
Not very well, but I'm I'm very glad we're having this conversation, and I'm hoping that the book makes a big splash because a lot of people, you know, a lot more people are worried about these dangers than you might think.
And we've we've spoken to some people who say they're worried and say they can't talk about it because they would sound a little too crazy.
Uh and then we see polls saying that lots of the population is worried.
So I think this is a a message that uh has its moment to break out, and I'm hoping the world uh wakes up to this danger.
unidentified
This is all the family.
steve bannon
Guys, can you hang on for one second?
We'll just take a short commercial break.
Nate and Yud are with us, uh, their book, If Anyone Builds It, Everyone Dies.
It is an absolute must-read.
You have two people that are experts and could have benefited, uh, economically, as some of these folks are.
There's something to unite this country, and you can unite this country around questions about the oligarchs and big tech and exactly what's going down here.
Who benefits from it?
Who's driving it?
And does it have enough regulation?
Does it have enough control?
And is it putting the country's interest and human interest before corporate interest?
unidentified
American man.
steve bannon
In the making of money.
Short commercial break.
Back with Nate and Yud in a moment.
unidentified
I got American faith.
You know America's voice family, are you on getter yet?
No.
What are you waiting for?
frank turek
It's free.
unidentified
It's uncensored, and it's where all the biggest voices in conservative media are speaking out.
steve bannon
Download the getter app right now.
It's totally free.
It's where I put up exclusively all of my content 24 hours a day.
You want to know what Steve Bannon's thinking, go to Getter.
unidentified
That's right.
You can follow all of your favorite Steve Bannon, Charlie Purchet the Soldi, and so many more.
Download the Getter app now, sign up for free, and be part of the new thing.
steve bannon
So guys, Yud and Nate uh join us.
Uh, the book, If Anyone Builds It, Everyone Dies, it is a warning, and a deeply, uh, thought-through warning, uh, to humanity and to the citizens of this republic.
Guys, um, we've been accused a lot on the War Room of being Luddites, uh, that we just don't like technology, we don't like the oligarchs, we don't like the tech bros, we're populist nationalists, we just don't like them, right?
And so we want to stop them.
Uh but I keep telling people, I say, hey, in this regard, all I'm doing is putting out the information of some of the smartest guys that were there at the beginning of uh of the building of artificial intelligence.
Now, can folks make the argument against you guys that you've just become proto-Luddites, uh, because of your journey?
nate soares
You know, there are many technologies that I am quite bullish on.
I personally think America should be building more nuclear power.
Uh I personally think we should be building supersonic jets.
Uh, it's different when a technology risks the lives of everybody on the planet.
And this is not a very controversial point these days.
You have the Nobel Prize-winning founder of the field saying he thinks this is very dangerous.
You have people inside these labs saying, yes, it's very dangerous, but we're going to rush ahead anyway, even at the risk of the entire world.
This is a crazy situation.
If a bridge was going to, uh, collapse with very high probability, we wouldn't say, well, we need to leave it up for the sake of technology.
You know, when NASA launches rockets into space, it accepts a one-in-270 risk of that rocket blowing up.
And those are volunteer astronauts on the crew.
That's the sort of risk they're willing to accept.
With AI, we are dealing with much, much higher dangers than that.
We're dealing with a technology that just kills you.
We're trying to build smarter-than-human machines while having no idea of how to point them in a good direction.
To imagine that this is, um, Luddism, you know, we're not saying, oh, this is going to have some bad effect on, you know, the everyday worker in losing a job.
You know, the chatbots may have those effects, but, uh, a superintelligence, it kills everybody.
Then you'll have no jobs, but you'll also have no unemployment.
But that's just the default course this technology takes.
It's not that humanity cannot progress technologically; it's that we shouldn't race over a cliff edge in the name of technology.
We need to find some saner way forward towards the higher tech future.
steve bannon
Okay, so, yeah, to Nate's point, you know, we had this thing in England today where they're full on, uh, all in on nuclear power because we need it, you know; all the climate change guys forget that, because we've got to have artificial intelligence.
You have this concept of a treaty, like maybe the Geneva treaties after the wars, to have nations of the world be prepared to take action here.
However, all you hear, we're the leader of the anti-CCP movement in this country, have been for years.
I'm sanctioned fully by the Chinese Communist Party and can't have any association with any Chinese company or people, because we represent Lao Baixing, the little guy in China.
What they're holding up to us is that if we follow Yud and Nate and what they want to do in the War Room, then the Chinese Communist Party, particularly after DeepSeek, what they call the Sputnik moment, is going to build this technology, and then we're gonna have the most evil dictatorship in the world have control of the most evil technology ever created.
Uh, and we're gonna be at their beck and call.
Your response, sir.
eliezer yudkowsky
We have not advocated that the United States or the United Kingdom for that matter try to unilaterally relinquish artificial intelligence.
Our position is that you need an international treaty, you need verification.
Um, the Chinese government has, at least overtly, on some occasions, presented a posture of being willing to acknowledge that AI is a worldwide matter, um, and that there might be cause for coordination among nations to prevent the Earth itself from being wiped out.
And this is not a new situation in politics.
We have had countries that hated each other work together to prevent global thermonuclear war, which is not in the interest of any country.
If you look back at history, in the 1950s, a lot of people thought humanity wasn't going to make it, or at least civilization wasn't going to make it, that we were going to have a nuclear war.
That wasn't them enjoying being pessimistic.
There was this immense historical momentum they'd seen.
World War One, World War II.
They thought that what was going to happen is that every country would build its own nuclear fleet, and eventually there would be a spark and the world would go up in flames.
And that didn't happen.
And the reason it didn't happen is because of some pretty serious efforts put forth by countries that in many cases hated each other's guts to at least work together and not all die in a fire.
And that's what we need to reproduce today.
steve bannon
Nate, uh, before I let you guys bounce, uh, can you explain the title of the book?
It's pretty grabbing, but it's scary.
If Anyone Builds It, Everyone Dies.
What do you guys mean by that?
nate soares
I mean, what we mean is that if humanity builds smarter than human AI using anything remotely like the current technology and anything remotely like the current lack of understanding, then every human being will die.
Doesn't matter if you're in a bunker.
Superintelligence can transform the world more than that.
And this title also goes back to the point about China.
It's it's not that great dictators would be able to control this great power if they made it.
If you make a superintelligence, you don't have that superintelligence.
You have just created an entity that has the planet.
Artificial intelligence would be able to radically transform the world.
And if we don't know how to make it do good things, we shouldn't do it at all.
And we are nowhere near close to being able to point superintelligence in a good direction.
So humanity needs to back off from this challenge.
steve bannon
Joe Allen, any comments?
All I can say is, the forces of, uh, capital, the forces of, uh, politics, human avarice and greed, and also the need for power, uh, make this, you know, we fight Big Pharma, the fights we have every day are, uh, massive, against long odds, and we've won more than we've lost.
But I tell people this is the hardest one I've ever seen because of what's happened.
And I said at that time, when we had the thing at Davos when ChatGPT came out, I said, you wait until venture capital and Wall Street get involved.
Any thoughts, Joe Allen?
joe allen
You know, Steve, it would be a very different thing if Eliezer Yudkowsky and Nate Soares were making these accusations, or at the very least issuing these warnings, in a vacuum.
If the tech companies, for instance, were just simply saying we're building tools, these guys are accusing us of building gods, they're crazy.
It'd be a very different situation, but that's not the situation.
Every one of them, even the most moderate, like Demis Hassabis, but certainly Sam Altman, Elon Musk, uh, even Dario Amodei, they all are talking about the creation of artificial general and artificial superintelligence.
And so when we first started covering this, when you first brought me on four and a half years ago, we hit a lot of the points that Yudkowsky was making.
We would show videos and try to explain to the audience, and they by and large didn't really grasp the reality of it because it wasn't as much of a reality four and a half years ago.
In just that short amount of time, we've seen the creation of systems that can competently produce computer code.
We saw at the very beginning that GPT was not supposed to be online very quickly.
Basically, all the warnings that Yudkowsky gave early on, when AI was hitting the headlines, uh, those are coming to pass.
My question for Yudkowsky would be this, and for Soares: you live in and among the most techno-saturated culture in the country, San Francisco.
Can you give us some insight into the mentality of the people who are willing to barrel ahead no matter what and create these systems, uh, even if that means the end of the human race?
nate soares
So some of them have just come out and said, you know, I had a choice between being a bystander and being a participant, and I preferred being a participant.
That is in some sense enough to explain why these people are barreling ahead.
Although I think in real life there's a bunch of other explanations too.
Like the people who started these companies back in 2015 are the sort of people who are able to convince themselves it would be okay to gamble with the whole civilization like this.
You know, we've seen comments like, uh, back in 2015, I believe, uh, Sam Altman said something like, AI might kill us all, but there'll be good companies along the way.
Um, or I think he maybe even said artificial general intelligence will probably kill us all, but there will be great companies made along the way.
I don't know the exact quote.
But that mentality, it's not someone taking seriously what they're doing.
It's not someone treating with gravity what they're doing.
This wouldn't be an issue if they couldn't also make greater and greater intelligences.
But we're in a world where we're just growing intelligences, where the people who don't know what they're doing, the most optimistic people, the ones foolish enough to start these companies, can just grow these AIs to be smarter and smarter.
That doesn't lead anywhere good.
eliezer yudkowsky
Yeah, uh, you know, when Geoffrey Hinton, the now Nobel laureate Geoffrey Hinton, sort of woke up and noticed that it was starting to be real.
He uh he quit Google and uh started speaking out about these issues more openly.
You know, who knows how much money he was turning down by doing that, but that's what he did.
And you know, that that's one kind of person that you have on the playing field.
And then you've also got the, you know, the people who are selected and filtered for being the sort of people who would, you know, back when OpenAI started, go over to Elon Musk and say, you know how we can solve the problem of these things we can't control?
We can, like, put them in everyone's house.
We can give everyone their own copy.
Yeah.
And this was never valid reasoning.
This was, you know, this was always kind of, uh, moon logic, but they sure got Elon's money and then, you know, took it and ran off.
And that's just the kind of people we're dealing with here.
steve bannon
Uh guys, can you hang on for one second?
I just want to hold you through the break because I want to give people access to how to get this book, where to get it, your writing, social media, all of it.
Yud and Nate.
unidentified
Um heroes.
steve bannon
Very hard work they're doing.
unidentified
Very, very, very, very hard.
steve bannon
Short commercial break.
We're back in a moment.
unidentified
So, your host, Stephen K. Uh, Nate and Yud.
steve bannon
Uh, by the way, just before I let you guys go, give your coordinates and tell people how to, uh, buy this amazing book, purchase this book, uh, which we're gonna break down and spend more time on, folks, in the, uh, days ahead.
Um, there's a movie called Mountainhead, uh, which basically has actors playing Elon Musk, Steve Jobs, I think Zuckerberg, and, um, Altman.
It's actually kind of dark to begin with, but it turns very dark when they identify one of them as a decelerationist.
Are you guys Nate?
You first, are you a decelerationist about this?
nate soares
I would decelerate AI.
I would decelerate any technology that could wipe us all out and prevent us from learning from mistakes.
Every other technology, I think we need to go full steam ahead, and we are sometimes hobbling ourselves.
But AI in particular is a technology that kills everybody and leaves no survivors.
You can't rush ahead on that.
steve bannon
Uh Yud, are you a decelerationist?
eliezer yudkowsky
I've got libertarian sympathies.
If a product only kills the voluntary customers and maybe the person who sold it, you know, that's kind of between them.
Yeah, I might have sympathy, but not to the point where I try to take over their lives about it.
If uh if a product kills people standing next to the customer, it's a regional matter, different cities, different states can make different rules about it.
If a product kills people on the other side of the planet, that's everybody's problem.
And, you know, uh, you don't have to agree with me about this part to want humanity to not die here, but I would happen to, you know, go full steam ahead on nuclear power.
Um, yeah, it's just the special case here.
Artificial intelligence, uh, you know, gain of function research on viruses might be another thing.
But, you know, it does actually differ by the technology.
There's not this one switch that's set to accel or decel.
steve bannon
Uh Yud, uh by the way, what's your social media?
I might add, we were the first show, in January of 2020, to say that what the University of North Carolina was doing on gain of function was a danger to humanity, and we were laughed at by the mainstream media as being conspiracy theorists.
Yud, what are your coordinates?
What's your social media?
How do people follow you, your thinking, your writing?
eliezer yudkowsky
ESYudkowsky on Twitter.
unidentified
Thank you.
steve bannon
Thank you, sir.
Uh Nate, uh, where do people go to get you?
nate soares
Yeah, I'm, uh, So8res, S-O-8-R-E-S, on Twitter.
steve bannon
Thank you, guys.
Actually, we're very honored to have you on.
Look forward to having you back and break down this book even more.
Everybody ought to go get it.
If Anyone Builds It, Everyone Dies.
Uh, a landmark book everyone should get and read.
Thank you guys for joining us in the war room.
Joe Allen's gonna stick with us.
I'm gonna go back to Joe in a moment.
Uh, Shemane, you're an ambassador.
First off, you're one of the ambassadors of turning point.
Give me your thoughts on what's evolved this week.
And, you know, we got Charlie's funeral, or celebration of life, on Sunday.
You knew him extremely well.
Uh, and you do the faith, uh, show here on Real America's Voice, but, uh, you're an ambassador at Turning Point.
Your thoughts about this uh tragic week.
shemane nugent
It's been horrific for so many people.
It's been a turning point for not just America, but for the world.
And we saw Charlie as the last good guy.
And for this to happen to him is just devastating to so many people.
And so many people are wondering what do we do?
What do we do with all this anger and sadness?
And I say silence is not an option at this point.
We must move forward.
We must carry that torch that Charlie gave us.
So that's what I'm trying to do with my Faith and Freedom show right here on Real America's Voice.
And I appreciate that.
I think I am the oldest Turning Point ambassador.
I have to be.
I don't know anybody older.
steve bannon
But hang on, that's just chronological age.
It's certainly not biological age.
You've got more energy.
You've got more energy than 20 young people.
How do you do it?
I mean, I see where you're going with that.
You're an ambassador.
Hey, I'm just calling it like I see it, I'm just calling balls and strikes here.
shemane nugent
Well, it's true.
unidentified
Tell me about it.
steve bannon
Tell me what keeps you so young.
Well, besides your husband, who I know is young, he's young at heart; it's like parenting a young child.
Uh, besides Ted, what keeps you young?
shemane nugent
Steve, that's a whole nother podcast, okay.
About Ted and trying to stay young, but I think you're right.
There's a study recently about epigenetics, which is the science that shows your DNA is not your destiny.
None of us eat right, so we all take supplements, right?
Most of us do, but there's so many different fruit and vegetable supplements on the market.
And if you study their ingredients, which I have, I'm a label reader.
It's just common produce with limited nutritional value.
There's a product called Field of Greens, and it's different.
And they wanted to prove that it was different by doing a university study where each fruit and vegetable in Field of Greens is medically selected for health benefits.
There's heart health, lungs, kidney, liver, healthy metabolism, and healthy weight.
And in this study, Steve, they wanted to see how diet, exercise, and lifestyle changed your real age.
And this is fascinating to me.
But some of the participants ate their normal diets, they included fast food, they didn't exercise any more, and they didn't stop drinking.
All they did is add Field of Greens to their daily routine, and the results were remarkable.
60% of participants showed a measurable reduction in their biological aging markers after just 30 days.
One scoop of Field of Greens slows the body's aging process at the cellular level.
And I think this was what helps me because I've worked out all my life.
I'll be honest, I don't eat right all the time.
So just by taking one scoop of Field of Greens, I can see that aging slow down.
steve bannon
FieldOfGreens.com, promo code BAN, and get 20% off.
It'll ship out today.
We hit it every day here at the uh war room.
Now, it's not just the uh Texas A&M study, but also I get an energy boost, Shemane, uh just every day.
So that's where we take it.
Want to thank you for all you do, particularly being an ambassador and helping the folks over at um Turning Point, especially in this very difficult phase for the movement, for the company, for Erika, the kids, everybody.
So I really want to thank you for joining us today.
shemane nugent
Um, we have to.
It's an Esther 4:14 moment.
If we remain silent, relief and deliverance is going to come from someplace else.
We were born, Steve, for such a time as this.
steve bannon
Thank you very much.
Wisdom and energy all in one.
Thank you, ma'am.
peter navarro
Appreciate you.
steve bannon
Dr. Navarro, you were our co-host, uh, you're a contributor, you've been with the president now for 10 years.
You're his economic advisor.
But tell me about this piece you wrote that's pretty moving.
I think it's so tied to your book, I Went to Prison So You Won't Have To.
Talk to me about Charlie Kirk.
peter navarro
Steve, I really want uh people to understand the legacy of Charlie Kirk historically.
He could have been president, uh, a senator, a governor, but already at 31 years old, he's the greatest political organizer in the last 50 years.
And if you compare him to the two who were there at the top before Charlie Kirk, Ralph Reed on the right, David Axelrod on the left.
What Ralph Reed did with the Christian coalition is mobilize the Christian right to get out and actually vote.
He was responsible for the Gingrich Revolution in 1994, uh, as well as the Bush win in 2000.
And then Axelrod on the left, uh, he was able to mobilize natural Democrat constituencies, blacks, Hispanics, and young people, um, used micro-targeting, some advanced techniques at the time, uh, and basically won the race for Obama in 2008.
The reason why Charlie is head and shoulders above each one of them is he had a much heavier lift, Steve: to mobilize the youth in support of Trump and MAGA candidates in Congress, he had to first bring them over to our side.
And that was a heavy lift.
And he did it.
When I first met Charlie, back in 2016, a young kid, thinking he was going to go out there and change the viewpoint of the youth of America.
I thought he was Don Quixote.
I'll be honest with you, Steve.
He proved me wrong, he proved the world wrong.
And um, people need to understand: father, husband, patriot, just a wonderful human being.
But but in terms of pure historical significance, he will go down in history as the greatest political organizer in the last 50 years.
And I don't think anybody's ever gonna do again what he did, because it's relatively easy to mobilize.
It's very difficult to persuade people over to your side and then mobilize, Steve.
steve bannon
Hang on for a second.
Uh, I want to, because you got your PhD at Harvard, then you went back, you taught in the university system.
Uh, when I first saw Charlie, and I think Breitbart's the first guy to give him a paid gig, but some of the people around Breitbart were the ones who financed him at the very beginning when he was going off to student governments.
unidentified
Yeah.
steve bannon
And I think many people who thought Charlie was just a ball of fire thought it was the longest odds, because you see, and you thought so too, because the universities, as we know now, are so based around this kind of radical philosophy.
And the kids are formed all the way from kindergarten all the way up.
So that, to me, is the greatness of Charlie Kirk, that he was able to go in and just do this when so many people said, hey, look, this guy's great, he's fantastic, but this is Don Quixote.
You're tilting at windmills here, it just can't happen.
And you knew it better than anybody, because you were inside the belly of the beast.
peter navarro
Well, brutal.
I spent 25 years at the University of California, Irvine.
And uh, if there's a system that ever was woke, that certainly is it.
But what Charlie understood, you know, he didn't start at Harvard and Cornell.
He understood that most of the universities of this country are in flyover country.
And he just rolled his sleeves up, he was tireless.
He went out there and worked Socratically. I mean, when I taught in the classroom, I was a big fan of the Socratic method.
You can't tell people things.
You've got to have them come to their own conclusions.
And that's how Charlie was able to bring people home to MAGA.
And um, very keen intellect.
I mean, fast forward, it's like uh when I got out of prison, you know, the day I got out of prison, um, July 17th, I went to the Republican National Convention, gave the speech.
I went to prison so you won't have to.
The title of the book is actually a tagline from the speech.
It means like a wake-up call.
But I mention this in the context of Charlie because I didn't even know this, but two days earlier a clip uh was shown.
I was on the set of a show uh shortly after he got killed.
And he um, he was giving a speech on my birthday, July 15th, two days before mine, and he said, I visited college campuses so you won't have to.
And, I mean, the way it just somehow struck a warm chord um in me.
And I I felt like we were fellow travelers.
And you know, I'm on the campaign trail, getting out of prison, with the boss, my fiancee, Pixie; in the book I call her Bonnie.
Um, you know her well, Steve.
She's been on your show.
Uh we'd see Charlie everywhere, right?
Everywhere we'd go, we went to Georgia, North Carolina, we were in Pennsylvania.
Um, he was always there.
And then during the transition, um, he was essentially uh Sergio Gor's uh co-pilot there, putting all the personnel together in the administration.
And look, uh, Charlie was like a son to the boss, to uh Donald John Trump, as well as uh a key advisor, one of his most trusted advisors.
So uh he'll be missed.
Um, I'm gonna try to hitch a ride out on Air Force One on Sunday and be there.
I'm sure you'll be covering.
steve bannon
Hang on one second.
Yeah.
Yeah, hang on a second.
We're doing wall-to-wall coverage of it.
I want to hold you to the break because I want to talk about the book for a second.
Back in a moment.
unidentified
We will fight till they're all gone.
We rejoice when there's no more.
Let's take down the CCP. Here's your host, Stephen K. Bannon.
steve bannon
Okay.
Um thank you.
By the way, gold retracted a little bit today.
It's not the price of gold, it's the process of how you get to the value of gold.
Make sure you take your phone out right now and text BANNON, B-A-N-N-O-N, to 989898, um, to get the ultimate guide, which happens to be free: investing in gold and precious metals in the age of Trump.
So go check it out.
We had a rate cut last night, only 25 basis points.
President Trump wants more.
His uh Stephen Miran, the uh Council of Economic Advisers chair, is now, I guess, the interim governor; he voted against it, wanting a 50 basis point cut.
Go find out why gold has been a hedge in times of financial turbulence throughout mankind's history.
Um, Joe Allen, I know you gotta bolt.
Uh, just give your coordinates.
You can't be on tonight because you're gonna be at one of the conferences.
I'm gonna get you back on, hopefully tomorrow.
We had a historic interview today on the book.
Uh, If Anyone Builds It, Everyone Dies.
Very uplifting.
Uh, where do people go to get your writing, sir?
joe allen
If the audience wants to hear uh the in-depth interview I did with Nate Soares a couple of weeks ago, it's right at the top of my social media at J-O-E-B-O-T XYZ.
Also an article about the hearing two days ago with Josh Hawley and the parents of children who were lured into suicide by AI.
That's also up at the top of my social media at J-O-E-B-O-T-XYZ or JoeBot.xyz.
steve bannon
Thanks. We are backed up on a lot of stuff because the Charlie Kirk situation was obviously a priority, including designating Antifa a terrorist group, so we can get to the bottom of all of it and not just have this dealt with by Utah officials as a single murder.
It's much deeper than that, uh, the assassination of Charlie Kirk.
Joe Allen, thank you so much.
Uh, Josh Hawley was supposed to be here today, because we had the press conference, and right there on the screen you see the president of the United States getting ready to leave, uh, to go to the airfield to take Air Force One back.
He'll arrive later tonight.
Of course, we'll be covering all of that.
Peter Navarro, uh, one of President Trump's closest advisors, and I think arguably his longest-serving advisor; I think you're the only one that's been there from the very, very, very beginning that's still there.
Uh, why should, in a world of all this information, everything going on, as big a hero as you are to this movement, as highly respected as you are by President Trump, because you're kind of the architect with him of the reorganization of the world's trading patterns.
Uh, why should people buy a book about you and your days in prison, sir?
peter navarro
It's not about me, Steve.
And I would ask the posse to go right now to Amazon.
I Went to Prison So You Won't Have To.
The book is really the best analysis of how the left is going after all of us.
If they can come for me, they can come for Steve Bannon, put him in prison.
If they can try to put Trump in prison, now they shot Charlie Kirk, they can do this to you.
And I'll take you inside prison in I Went to Prison So You Won't Have To.
But the broad scope here, Steve, is really an analysis about the asymmetry, the disturbing asymmetry between how the left is waging war on this.
Everybody I served with, Steve, including you, has been a target of the left.
At a minimum, they've spent millions of dollars in legal fees, whether it's uh Mike Flynn or America's Mayor Rudy Giuliani.
Uh, they took the bar cards of Jeff Clark, John Eastman.
And on the other end, Steve, you and I went to prison, and everybody who put us there, every single person, was a Democrat except Liz Cheney, and that's the exception that proves the rule.
I mean, think about that.
How can the Democrats seize power and use that to put us in prison and call us fascists?
How dare they, Steve?
So I went to prison so you won't have to.
It's a story about how we must wake up to what's happening.
Um, but it's also the lawfare story.
But look, if you want to find out what it's like to go into prison for a misdemeanor and wind up spending four months with 200 felons, um, this is the book.
Uh and you know, Bonnie, my fiancee.
Um, it's also a story about how we were able to cope and deal with that.
The message there is simply that she did the time with me.
And that's what happens when people are unjustly targeted by the left.
It's not just you.
We can take it, Steve.
You and I are soldiers, we can take it.
But when they go after our families, yeah, that's where you draw the line.
So it's a big book, a big story.
And go ahead.
steve bannon
If they steal the election in 2028, trust me, this audience, they're gonna be coming for you.
You see, now we're in a different place than we were even back then.
This is getting more and more intense every day.
That's why Navarro's book has got to be read, because it's uh actually for you and about you.
It ain't about Navarro, not about me, not about President Trump.
They're different characters in the story, but the book is about you, understand.
Uh, like I said, there's nothing, there's no compromise here.
There's no unity here.
One side's gonna win and one side's gonna lose.
And if they steal it in 28, they're coming for you.
Peter, where do people go to get your writings?
Where do they go to get particularly your great piece on Charlie Kirk?
Where do they go to get your book?
peter navarro
Sure.
The book, uh, I Went to Prison So You Won't Have To.
It's on Amazon.
Uh it came out uh two days ago.
Please drive this thing up the bestseller list so we get the message out.
It's our best defense against them targeting us.
I Went to Prison So You Won't Have To, on Amazon.
The piece about Charlie is very close to my heart.
Um, it's on the Washington Times op-ed site today.
I put it out on X at RealPNavarro.
Uh, it'll be up on my Substack on Sunday as we celebrate um, uh, Charlie on that sacred Sunday that we're about to have.
Um, and you can always go to my Substack, peternavarro.substack.com.
But Steve, I really appreciate what the war room does.
I appreciate uh being able to come talk about I Went to Prison So You Won't Have To.
And um, C-SPAN is running a long, hour-long interview on Sunday at 8 p.m.
steve bannon
We're gonna talk about it on Sunday night.
peter navarro
Yeah, you know we'll talk about Peter.
steve bannon
Peter's gonna be with us, yeah.
You're gonna be on Saturday, and also you're gonna be on Sunday.
We're doing wall-to-wall coverage live from the stadium.
We'll give more details like Peter Navarro.
Thank you.
Mike Lindell.
It's been a long, tough morning for the war room.
They need a deal, sir.
What do you got for us?
mike lindell
These are a steal, everybody.
We've got the three in one special.
I'm sitting here back in Minnesota at my factory.
All the towels came in.
Remember, they actually work: the six-piece towel sets, $39.98.
They're normally $69.98.
And then we have the pillows, the Giza Dream pillows.
Um, the covers.
All of the sheets are on sale, $29.88.
Once they're gone, they're gone.
These are the percale sheets.
You guys go to MyPillow.com forward slash WarRoom, and then you're gonna see all of the um, you're gonna see all the big-ticket items at the website, free shipping on the beds, the mattress toppers, 100% made in the USA.
And people remember we have a 10-year warranty and a 60-day money-back guarantee, everybody.
steve bannon
Two hours of populist nationalism, hosted today by Megyn Kelly.