This is the primal scream of a dying regime.
Pray for our enemies.

unidentified:
Because we're going medieval on these people.
I got a free shot on all these networks lying about the people.

unidentified:
The people have had a belly full of it.
I know you don't like hearing that.
I know you've tried to do everything in the world to stop that, but you're not going to stop it.
It's going to happen.
And where do people like that go to share the big lie?

unidentified:
MAGA Media.
I wish, in my soul, I wish that any of these people had a conscience.

unidentified:
Ask yourself, what is my task and what is my purpose?
If that answer is to save my country, this country will be saved.

unidentified:
War Room.
Here's your host, Stephen K. Bannon.
Monday, 7 April, Year of our Lord 2025.
Of course, it didn't turn into Black Monday today.
Not because Jim Cramer and the insane people over at MSNBC and the New York Times didn't try to make it happen, but it didn't.
Also, futures trading is already up.
Asian markets are about ready to open.
We'll get on top of all that here momentarily as we go through another day of this evolving trade war against the Chinese Communist Party.
It's quite evident what is happening here, and we'll break it all down for you a little later in the show.
I want to bring back Rosemary Gibson.
So Rosemary, in the middle of this trade war, you're one of the first investigative reporters to come out, now seven years ago, with: hey, look folks, I don't know if anybody realizes this, but all the generic drugs that people take, which is a huge percentage of drugs that people take, and 100% of the active pharmaceutical ingredients, are all manufactured in China, and we have no control of that supply chain.
Now we're losing the economics of it, but more importantly, it's a strategic asset.
And we warn people about this all the time, because the CCP has their back up against the wall right now with President Trump.
The tariff rate is 104%, I think.
And President Trump's not backing off.
It was very clear when Scott Bessent came out early and said, hey, we're setting up a process to talk.
And just a while ago on Larry Kudlow, we've got that clip and we'll play it later, Scott Bessent said there's 75 major trading partners that are now trying to get on his schedule to come in and talk about the trade relationships, and whether the reciprocity thing is too high, or we miscalculated or didn't calculate correctly the non-tariff trade barriers, or, you know, currency manipulation, all that.
But China's not one of them. In fact, President Trump's canceled any further discussions with China, given the fact that they tried to retaliate. And he told them flat out: you try to retaliate, I'm going nuclear. They tried to retaliate, and he went nuclear.
You're claiming, you're saying that generic drugs coming through here, because either the FDA is overwhelmed or they don't have levels of competence, that these generic drugs are essentially contaminated.
And a pretty high percentage of them are contaminated by different things.
I don't want to put words in your mouth, but is that what your investigation is showing?
unidentified:
Well, this is what the Department of Defense testing program that started in November of 2023 has found so far, Steve.
And they launched this generic testing program.
Kaiser Health Plan was the first one to start a testing program.
They have 13 million people.
So they knew that something was not right, that for our regulatory folks at FDA, it's too much for them.
They don't have the enforcement power against the corporate control by Congress and the Wall Street folks.
So Kaiser Permanente started testing drugs, and then the DOD followed suit, again, because they realized the FDA, and the FDA said this, that they do not have information on the quality and safety of the U.S. medicine supply.
That was written in 2015.
It's in black and white.
And so the DOD started this generic drug testing program for the medicines that are important to them, and according to that independent testing, 13% of what they've tested so far are not as pure as they should be.
They have carcinogens.
They have arsenic in some of the samples.
They have lead.
And plus, they weren't made right, so they don't work right.
And so you don't get the protection or the therapeutic value.
And this is so far from Six Sigma quality.
Any manufacturer with a 13% defect rate is just really bad.
And by the way, it's not just China, Steve, it's India and other trading partners.
And you know, we can put the onus on China or India, or the FDA not doing its job.
Steve, what we need to look at is this.
There are six companies, U.S. companies, and they source 90% of the generic drugs from around the world.
They're the ones that bring them into U.S. commerce.
Just as RFK convened the leaders and CEOs of food industry businesses, we need to have a meeting at the White House with these six CEOs.
No complaining, no hand-wringing.
What are you going to do about it?
They've outsourced their quality role to the FDA.
They can't do it.
So this needs to change.
And we have to implement the executive order that President Trump signed in August 2020 to bring back domestic manufacturing, direct the DOD and HHS to give priority to domestic manufacturers of the finished drug, the active ingredients, and the components to make them.
And those components to make the APIs, that's where China has the global chokehold.
And we absolutely must ramp up independent testing.
We've got to find a way to brief the new DOD leadership.
They've got to, because they supply the White House pharmacy, they supply the crash cart. We have to have those products tested.
There could be some retaliation in lots of different ways, and we have to protect our people.

But hang on, hang on, hang on.
At 0.13% it would be extraordinarily high. If it was 0.0001%, I could say, hey, you know, maybe we're better.
At 13%? What are you talking about?
This stuff is poison.
They're just shipping in poison.
13% of what they're sending has carcinogens, arsenic, or lead in it.
It just shows you we're kidding ourselves.
What does it take?
Because I know you and Peter worked on this situation, I think, south of my hometown.
I think it was in Petersburg or Hopewell, to try to have the initial kind of test facility to do this.
I think on the executive order.
Is that being successful?
Do we even have the possibility of saying, hey, look, guess what?
We're not going to take it from India.
We're not going to take it from South Africa.
We're going to take it from Europe.
We're definitely not going to take it from China.
We're going to give you one year, and we want 100% of generics and API manufactured in the United States of America by people called Americans.
Ma'am?
unidentified:
It's going to take a lot more than a year, Steve, but you know what?
We have to get started.
We need 50 of these places around the country, but you know the most important thing that we need, Steve, is customers.
Instead of the DOD sending our taxpayer money to China or India to make our drugs, let's keep that money at home.
And by the way, for some of the lower-quality generics, the DOD was paying more money for them than for generic drugs that actually performed better and had no problems.
So cost and quality are not correlated at all.
So for those who want to say, oh, it's going to break the bank, that may not be true.
So I think we have an opportunity here.
We have a burning platform, because it's not just a supply chain issue.
This is about the quality and safety.
And if we don't intervene now, Steve, if you have chaos and disorder and you do nothing, what happens?
It gets worse and worse and worse.
So we have to, with President Trump's leadership, get in here now and get this going.

Rosemary, I know we're going to follow up on this.
Where do people go in the interim?
Social media or your website or whatever?
Hopefully we have another book in you about this, because this is pretty stunning.
And I know we're going to push this out and make some news today on this.
But where do people go to follow up before we have you back on the show?
I have to put my head back together, then have you back on in a couple days.

unidentified:
This is mind-blowing.
It's a long story.
We're trying to compress it into a real short period of time, but on X, Rosemary 100.
A lot of smart people out there, Steve.
They know their drugs are not working right, and now we're getting the data to show it.

There you go.
Okay. We'll follow up with you, Rosemary.
Thank you for coming back on.
You changed the world last time in the early days of the pandemic.
Let's change it again.

unidentified:
Let's do it.

Wow. Remember when Rosemary Gibson came on?
Shocked the world.
Got President Trump's attention.
Eventually he signed that executive order.
I think we're going to have to dust that baby off.
Get Navarro on this.
This is part of the whole trade war.
I've always been very uncomfortable when the Chinese Communist Party is in charge of anything.
Not that they would do it on purpose.
But at 13%? Are you kidding me?
At .00013 it's too high.
Okay, let's talk about, can we put the, do we have a cut on Netanyahu?
Let's go back and, do we, can we play the President's cut again?
I want to play the President's cut as I look for this. Yeah, I've got this.
So it was, I think, a big disappointment for Bibi today.
Let's go ahead and play what happened in the Oval, and then I'll try to describe it to you.

unidentified:
Let's go ahead and let it rip.

Thank you, Mr. President.
I want to ask you about Iran, because this is the first time we hear that the U.S. is having direct contact with the Iranians.
Is it possible to give us some more information at what level the U.S. is represented?

We're dealing with the Iranians.
We have a very big meeting on Saturday, and we're dealing with them directly.
You know, a lot of people say, oh, maybe you're going through surrogates, or you're not dealing directly, you're dealing through other countries.
No, we're dealing with them directly, and maybe a deal's going to be made.
That'd be great.
It'd be really great for Iran, I can tell you that.
But hopefully, we're not going to have to get into that.
We're meeting, very importantly, on Saturday at almost the highest level, and we'll see how it works out.
Please.

Okay, given what's going on today and kind of the historic nature of this day on the geoeconomic and geopolitical side, I think this has kind of gotten buried.
And there was a reason that they called off the press conference in the East Room.
And they said, oh, it's not that. We're just going to have a press availability.
There's a big difference between a press availability and a formal press conference.
Blacktivist and Ryan Grim put out later that they agreed with us.
I just want to go through, because I think it's very important for everybody in the audience to understand this.
The meeting with Netanyahu and Trump was a major disappointment from Netanyahu's perspective, based on this guy's reading of Netanyahu's brief statement in the Oval Office.
Here's what transpired.
Number one, Netanyahu's attempt to have the tariffs lifted was unsuccessful.
Number two, his efforts to prevent the U.S. from selling the F-35 fighter jets to Turkey failed.
Additionally, he was unable to convince Trump to pressure Erdogan into abetting his goal of building military bases in Syria.
Now, I think that's a big defeat for all of us, because I think there's no way we should be arming Turkey, given Erdogan and his overall strategy for the new caliphate.
Be that as it may, that's Netanyahu's opinion of what happened with President Trump.
Number three, Netanyahu was also painfully unsuccessful in getting Trump to reveal his plans regarding Iran and Persia.
On Gaza, Trump wants the war to end and is no longer advocating for the ethnic cleansing of the region.
This guy calls it ethnic cleansing.
We would call it continuing fighting.
Conclusion: a surprisingly bad day.
I have to agree with that.
I think the core of this, they said it was because of tariffs.
The core of this is Netanyahu wanted to come once more, just like a month ago, and pitch a military plan for Persia. That just cannot happen.
We can't get into another military conflict in the Middle East, and there are ways, with the leverage we have, and we have tons of leverage, to bring economic warfare, just like we're bringing it to the Chinese Communist Party right now, to bear even more against the Persians. And that is what needs to take place.
I think it was kind of a, maybe not a surprise, but a surprise President Trump was saying it publicly, that at a very high level, we're very far down the road with direct dealings with the Persians, and there's a meeting on Saturday at a very high level.
And he went out of his way and said it was a direct Persia-to-the-United-States meeting.
Obviously, we've been a big advocate here in the War Room.
This is one of the things we'll get taken care of under the Russian rapprochement.
So let's see if that happens.
President Trump went out of his way to say, hey, no other countries were involved here, no intermediaries. I think he meant to say Israel and Russia, and this was going directly, so we'll see what transpires out of that.
But maybe I would hold off on bombing the Houthis and let the Brits, French, and Italians get down there.
And they keep the sea lanes open to the Suez Canal for a change.
Okay, now, do we have the short version or the long version of this?
Bessent? Yeah, no, not Bessent.
We're going to go right now to artificial intelligence.
So let's go ahead and play the version we got up.
We're going to have Max Tegmark from MIT and our own Joe Allen join us in a second.
unidentified:
OpenAI will do great work.
We are trying to sort of solve general intelligence.
We're trying to build the algorithm for a program that is truly smarter than a human in every way, and then figure out how to have that maximally benefit humanity.
And that's why we're a nonprofit.
We don't ever want to be making decisions to benefit shareholders.
You know, the only people we want to be accountable to is sort of humanity as a whole.

I think I've often said that my chance that something goes really quite catastrophically wrong on the scale of human civilization might be somewhere between 10 and 25 percent.

In 10 years, how is life going to be different because of AI for just a normal person?

I think in 10 years, based on the current rate of improvement, AI will be smarter than the smartest human.
There will be ultimately billions of humanoid robots.
All cars will be self-driving.

Now, if AI will be smarter than any person, how many jobs go away because of that?

Goods and services will become close to free.
The challenge will be fulfillment. How do you derive fulfillment and meaning in life?

How real is the prospect of killer robots annihilating humanity?

20% likely.
Maybe 10%.

On what time frame?

5 to 10 years.
You can look at it like the glass is 80-90% full.
Meaning, like, 80% likely we'll have extreme prosperity for all.

Okay, Max and Joe Allen join us.
Max, you've been one of the leaders in trying to alert people to the fact that we're heading down a path in artificial intelligence that is, quite frankly, not only not regulated, but really not looked at by any outside sources.
We're kind of taking everybody's word for it.
In that clip we just played right there, you have people saying, I don't know, 20% that robots could kill humanity.
I mean, these numbers are astronomical.
And we've got so much going on.
We've got, you know, President Trump just came out and endorsed the big, beautiful Senate bill, which I think has got some issues to it, and I don't think is going to have support in the House.
That's a whole other fight we're going to get into tomorrow.
You've got the trade war going on.
You have all this geopolitical activity going on throughout the world right now.
I think, unfortunately, this has been put a little bit on the back burner.
And I keep saying this ought to be not just on the front burner, this ought to be on the front burner with the heat turned all the way up.
So take a second and describe to the audience exactly where we stand with this whole race on artificial intelligence, the couple or three people that are really making this happen, and how little oversight there is, sir.
unidentified:
Yeah, it's truly insane what's going on right now.
And it's so easy to miss the forest for all the trees because of all the other things going on.
What's basically happened is scientists have always been curious about how stuff works, and we figured out how muscles work, and we built machines that were much stronger than us, which gave us the Industrial Revolution.
People then shifted to working more with their brains.
Now, people are trying to do the ultimate replacement, where you replace not just the muscles, but also our brain work, with machines that can just outthink us in every way.
And if that happens, we have nowhere to go.
First of all, I think it's very naive to just trust that some oligarchs are going to be very compassionate about taking care of people, when they don't need us anymore, if we can't get jobs.
Second, there's this even bigger risk, that it's not clear that anyone is able to control machines that are just vastly smarter than us.
Just walk down to the zoo and ask, who's in the cages?
Is it the tigers or the humans?
And ask yourself why.
It's because the smarter species tends to control.
And the sad fact is that we're much closer now to building smarter-than-human machines.
Many of the leaders of the companies and many of the top scientists think it's going to happen within one to five years, depending on who you ask.
I had drinks with one of the leaders the other week.
He thinks it's going to be in March next year.
And we're much closer to building this than we are to figuring out how to control it.

Okay, hang on one second, hang on one second.
It's one thing to talk about the cultural or socioeconomic impact this can have of cutting through and laying off all the low-level programmers, administrative, clerical, managerial.
Is that good or bad?
How do we maximize it?
Are we innovative or just looking for efficiencies?
That's a whole line of argument.
I want to go back to the one that's in front of us, which to me should be the most important one we've got to talk about.
And that's the control issue.
When you say that, explain to people what you mean. I guess it's artificial general intelligence, where a computer, where these thinking machines, are actually smarter than a human brain.
And you say it's one to five years.
You know, a couple of years ago, people would tell us it was 10 years.
It was 20 years.
It was 25 years.
Now you've met informally with one of the leaders in the industry, who says, yeah, I think it'll be March of next year.
That's, the last time I looked, 11 or 12 months.
Right. So that should mean a red flag of, like, what are we talking about?
So specifically, what are you talking about when you say there's a problem? Could you actually be specific?
unidentified:
Yeah, so if you look back a bit here, you know, AI has been seriously overhyped from the 1950s until about four years ago, falling far behind its promises, and people kind of got used to that, didn't take it seriously.
But as recently as five, six years ago, most of my colleagues also still predicted, therefore, that we were decades and decades away from building something that could master language and common knowledge at the level of ChatGPT.
And they were all wrong, because there have been some enormous breakthroughs in the last five years, and things are now going much faster than we thought.
So the kind of AI that people call AGI, artificial general intelligence, I think we should actually call replacement AI, because the real purpose of the investors and companies that are building it is to replace all humans on the job market.
Some of the companies even admit this on their websites.
And then, as you said, if that happens, there's a separate question of whether they'll also just replace us altogether on the planet and get rid of us.
And I think we're on track for this technology coming.
It's not here now, but it will be during Trump's presidency.
So if someone stops it, it's going to be Trump and his administration.
And the reason that this is so insane is because we're effectively treating the AI industry differently from any other powerful industry. AI is the only industry that can do whatever it wants without any safety standards.
You talked earlier on the show here about the FDA and how you have to have safety standards before you can sell medicines, to make sure they don't have 13% toxins in them, right?
And we have that even for sandwiches.
If there's a sandwich shop in San Francisco across the street from one of these companies, they can't even sell one sandwich until the health inspector has checked out their kitchen, right?
Yet it's completely legal now, if some company wants to build smarter-than-human machines that they have no idea how to control, to just sell them.
That needs to change.
So the good news is that although the problem is very serious, in my professional opinion it's also super easy to solve.
Just stop dilly-dallying and making special exceptions for this particular industry and their lobbyists, and just treat them like anyone else.
There should be some sort of FDA for AI, and if they can't convince independent experts who don't have a conflict of interest that this stuff can be controlled: come back when you can, buddy.

Well, it's not controlled.
It's not regulated for a reason.
They don't want to regulate.
But I want to just go back to make sure the audience understands, because I love replacing AGI with replacement AI, because that's what they're trying to do.
Just go back.
Something that has been overhyped for decade after decade after decade.
You said the language model was the differentiation.
That's what was released, I guess, with ChatGPT.
At Davos two years ago, I think it was.
Two or three years ago.
Why did that catch people by surprise?
And why was that such a huge leap that all of a sudden folks go, wow, this thing really could replace humans?
Why was that such a big leap for the industry, for how you actually build these?
And what exactly happened?

unidentified:
Traditional AI systems, the ones that always underperformed, had the intelligence programmed into them by humans. Like, the machine that beat Garry Kasparov at chess, you know, back when I was a kid, was programmed by people who knew how to play chess.
The modern approach is to instead make machines that just grow intelligence by gobbling up lots of data.
And the reason people underestimated how well that was going to work, I think, is because of a simple mistake.
Imagine if we had this conversation in the year 1900.
How long will it be until we can have flying machines?
And you said to me, you know, hey Max, we're clearly decades away, because we don't even understand quite how birds fly and can't build mechanical birds.
That would have been wrong, because there turned out to be a much easier way to fly: airplanes.
And I think a lot of my colleagues, even very smart ones, similarly thought we would never figure out how to make thinking machines that could outthink us until after we figured out how our brain works.
And we're nowhere near figuring out how the brain works.
Turns out there is an easier way to build thinking machines, and that's what the industry is doing.

Hang on one second, Max.
Joe Allen and Max are with us.
We're going to take a short commercial break.
Make sure that you go.
Financial turbulence is going to continue tomorrow and the next day and the next day and the next day.
President Trump is on a monumental, historic reordering of the global trading patterns in the world.
This is going to cause a lot of what we call perturbations.
So make sure that you check out Birch Gold.
Take your phone out right now.
Bannon. B-A-N-N-O-N.
Text it to 989898.
Get the ultimate guide for investing in gold and precious metals in the era of Trump.
What Birch Gold is trying to do is teach you the patterns of what causes gold to both rise and fall.
Go check it out today.
Make sure you make a personal contact and relationship with Philip Patrick and the team over there.
Short commercial break.

unidentified:
But I'm American-made, I got American, baby, in America's heart.

Hello, War Room Posse.
Today we're going to have another huge War Room exclusive at wholesale prices.
We're going to start with our MyTowels with that proprietary technology.
The bath towels came in, the big bath sheets are in.
The kitchen towels are in, with all the accessories, as low as $9.99.
So go to MyPillow.com, scroll down until you see Steve, give him a click, and there it is.
The kitchen and towel $9.99 sale.
Right next to that, you have the spring sheet sale, where you save up to 50%.
And there's the MyCrosses, the most requested product, I think, ever for MyPillow.
You save 30%.
And there's the premium MyPillows, $18.98 for the Queens and $19.98 for the Kings.
That's a War Room exclusive.
Help keep my employees going, you guys, and help yourself get the best sleep ever and the best products ever.
Or call 800-873-1062.
800-873-1062. Promo code WARROOM.
The most sought-after promo code ever.
Folks, I hate to say that I called this one early, when Jonathan Karl, early this morning on Good Morning America with George Stephanopoulos, said that there was a senior conservative member of the legal community that was going to place a challenge to President Trump.
It's just been filed. A headline in the Guardian: right-wing group backed by Charles Koch, and wait for it, Leonard Leo, sues to stop Trump's tariffs.
New Civil Liberties Alliance says the President's invocation of emergency powers to impose tariffs is unlawful.
First paragraph: a libertarian group backed by Leonard Leo and Charles Koch has mounted a legal challenge against Donald Trump's tariff regime, in a sign of spreading right-wing opposition to a policy that has sent international markets plummeting.
This will be another fight.
We told you they were going to go all the way to federal court to try to slow down President Trump.
The Kochs and the libertarians are taking the same tactics as the progressive left-wing neo-Marxists.
So we'll get on to that tonight.
And for sure, tomorrow morning we'll have a big breakdown in the morning show.
Joe Allen joins us, our editor for all things Singularity.
Joe, I think Max is on to something here, big time.
This is not getting enough coverage.
You've got these four guys who all have messiah complexes.
They're running wild.
Your thoughts, sir?

Well, first off, Steve, I'd just like to really extend my gratitude to Max for coming on.
A lot of people would maybe be nervous in his position to appear on a conservative show.
But for the audience, just to jog memories, people who've been listening to the War Room for the last four years will remember that in the first year, I recommended Max Tegmark's book many times, Life 3.0.
And that's important because it really does give a profound framework for understanding what the impact and the import of artificial intelligence is.
It is in many ways, even if you just think of it as an imaginative exercise, an invasion of a new species.
It's very similar to the sort of replacement labor and replacement populations of mass immigration, in the sense that it threatens jobs, it threatens national identity.
I mean, what do you do when the digital space is 50% bots?
It's earth-shattering.
And I also want to remind the audience, and some will probably remember, that two years ago or so, the Future of Life Institute, of which Max is a co-founder, published an open letter to pause AI.
They wanted to stop development of AI roughly at the level of GPT-4.
Of course, we've now blown past that.
Even many of the signatories, including Elon Musk, are attempting to blow past that.
But it's really important, too, to look at what they're doing at the Future of Life Institute. And what Max Tegmark is doing individually is trying to find some convincing way to persuade politicians that this is, in fact, real.
And that if we don't at least begin the conversation on how to regulate this, on how to minimize the negative impacts, even if it doesn't become superhuman artificial intelligence, then we're going to be caught with our pants down.
And the kind of nightmarish scenarios will absolutely be much more likely to unfold without any sort of action from either the public at large or, preferably, the government itself.
And I'd really like to hear more from Max about the kinds of concrete proposals that he has for governmental policy.
I spoke to him before, and he makes a very strong case for a kind of limited regulation.

Yeah. Well, no, let me turn it over to Max.
First off, give us an update.
Since you sent this letter, and you're kind of a leader in this field, you sent this warning shot.
What's been the result of just that?
Did that really get people's attention?
Did people dismiss it?
Did people in the industry say, hey, don't ever mention Max's name again, because he's kind of, you know, gone off the deep end?
We've got to pursue this because the Chinese are pursuing it, for whatever reason they give.
What's the status since you put the warning shot across people's bow?
unidentified
|
It had a huge effect. | |
there was a strong pent-up anxiety across society where people felt afraid of voicing their concerns with this for fear out of being branded clueless or crazy And when people saw that Professor Yoshua Bengio, who's now the most cited AI researchers in the world, and many others had signed this, it made it socially safe for everyone to really speak up. | ||
And that led in very short order to a statement which was actually signed by all the top CEOs of the companies, even saying, hey, you know, AI could drive us all extinct. | ||
And then there started being political discussion. | ||
Unfortunately, what's happened since then is the lobbyists from big tech have mobilized very successfully. | ||
There are now more lobbyists in Washington, D.C. from big tech than from any other industry. | ||
They've more or less memory-holed the fact that the CEOs warned of the very technology that they're now rushing to build. | ||
So switching gears to the solution here, which is actually quite straightforward. | ||
It's important to remember, of course, we want to figure out how to cure cancer and reduce road deaths, et cetera, et cetera, if we can use AI for things like this. | ||
And we can have all of that without building replacement AI. | ||
There is a massive amount of research, actually, on how you can take AI systems that are tools, which I like to define as things that we can control and that work for us. | ||
You don't want an uncontrollable car, for example. | ||
That's why your cars today are tools. | ||
We can have amazing tools that help us cure diseases and make us more prosperous and strong, et cetera, without ever taking that extra step of making things that can just replace all humans and that we don't know how to control. | ||
So that's where you want to draw the line. | ||
If you simply treat AI like we would treat car manufacturers, airplane manufacturers, drug manufacturers, and everywhere else, saying, hey, before you sell your stuff, show us why this is not something we could lose control over that just wholesale replaces us. | ||
Then what we'll get is a golden age of innovation to build all these tools. | ||
And we will not be in this crazy rush where we worry if people are going to figure out how to control them in time. | ||
There's total scientific consensus on this, even though it's very little known among the broader public. | ||
But hang on for a second. | ||
This industry's had two shocking moments just in the last couple of years, despite all the tens of billions of dollars and the smart people on the finance side and venture capital side, and obviously all the smart individuals on the scientific and technological and programming side, and at things like the weapons labs, the national labs. | ||
The ChatGPT moment at Davos shocked the world and kind of shocked the industry. | ||
Then, bro, you just had one a couple of months ago that was orders of magnitude bigger than that shock, which was DeepSeek, which came from the Chinese. | ||
So just given the nature and structure of this industry, which continues to have things that come out of nowhere to the people who are already the best in the world and the people that are putting their money in to make sure they get a return on capital, and given that in the last three years you've had two massive, orders-of-magnitude, out-of-nowhere shocks, how could we ever have anything that just had some sort of light touch? | ||
Oh, can you please tell us what you're working on? | ||
You would have to, particularly if it's on replacement, if replacement carries some sort of probability that in replacing human beings, maybe it makes human beings extinct. The reality is you just can't have some sort of light hand on regulation. | ||
You actually have to drill down into these companies, into what the hell is going on, because if history shows us anything, this does not move incrementally. | ||
It moves by massive leaps and bounds. | ||
Am I wrong in that? | ||
unidentified
|
No, you're exactly right, of course. | |
But, you know, if AI is treated like other industries, like pharma, you know, there'll be different tiers. | ||
If someone is launching a new lollipop, very light touch regulation. | ||
If someone's trying to make a new kind of fentanyl, of course, there will be very, very serious scrutiny of that. | ||
There should be. | ||
And in the same way, if someone is just launching a slightly better self-driving car, you know, I think that should be very easy to get approved. | ||
Not a lot of red tape so we can save lives on the road. | ||
But if someone says, I want to build replacement AI, of course, the government should come in and say, hey buddies, what are you doing here? | ||
And the default is, it's your responsibility to convince us in the government that this is safe, not the other way around. | ||
When you met with this person socially and they said, oh, I think we'll get to replacement AI as soon as next March, what's your best guess? Is it sometime in the next year or two that this will actually be dropped on us and will have to be dealt with, if we don't deal with it now? | ||
unidentified
|
I'm very humble about forecasting. | |
It's really hard, but the fact is, indisputably, that the leading players in all the companies are all making predictions between one year and five years from now. | ||
And you might think they're just overhyping it for their investors, but the whistleblowers who have left these companies in disgust are also saying very similar things. | ||
And so are academic colleagues. | ||
And when I look at the research myself as well, you know, it's clear that cracking language, getting things like ChatGPT, that was an enormous hurdle that people thought might have taken 50 years. | ||
From now on out, it's mostly engineering, in my opinion. | ||
And five years or one year, it doesn't really matter. | ||
Either way, government has to step up. | ||
The good news is that the current administration is actually working out a new action plan that's supposed to come out middle of this year. | ||
And I think that's a really great opportunity to step up and say to all these companies in the Bay Area that we are not going to treat you with any less scrutiny than we treat all of the other companies. | ||
And Trump himself has even spoken about how there's this real risk that we could lose control over these things, which we do not want. | ||
Hang on. | ||
I want to go back. | ||
In the time we've got, I want to go back. | ||
A lot of people saw Oppenheimer. | ||
Go back. | ||
You said ChatGPT, when we talked the other week, you said it was a Fermi moment. | ||
Explain who Fermi was, when his experiment under Stagg Field in Chicago took place, and what it showed the world. | ||
Up to that point, whether you could build an atomic weapon was all theoretical. | ||
From that moment on, you said, hey, it just became the engineering, the mechanics of it. | ||
You believe in AI we're at the same point. | ||
Walk us through what Fermi did and why that brought everything together. | ||
unidentified
|
Exactly. The metaphor you mentioned is very good. | |
For a long time, the world's best physicists thought maybe it would take 100 years or 50 years or whatever to figure out how to get nuclear energy out of atoms. | ||
And then Enrico Fermi built the first ever self-sustaining nuclear reactor in Chicago under a football stadium around 1942. | ||
And most people at that point, if they'd even heard about it, totally dismissed it. | ||
But the physicists totally freaked out and realized that from here on out to the bomb, it's just engineering. | ||
Maybe it'll take one year, maybe it'll take five years. | ||
In fact, it took three years. | ||
And it's very analogous. | ||
Alan Turing, who's the intellectual godfather of the field of AI, said in 1951 that, you know, if we ever build machines that can outthink us in every way, then we'll have robots, basically, that are much smarter than us, that can build robot factories, that can build more robots in the billions. | ||
They will quickly take control of Earth. | ||
But he said, don't worry about that because it's far, far away. | ||
But I'll give you a canary in the coal mine so you know when to pay attention. | ||
That's the Turing test: when machines can master language and knowledge at the human level, that's when you're close. | ||
That's when it's all engineering from then on. | ||
And that's when you have to pay attention. | ||
And this is the moment that we've had now with ChatGPT and its sequels. | ||
We've been waiting. | ||
I wish, for the sake of my children, that this would have taken us a lot longer to get to this point, but here we are. | ||
And now is the time to act. | ||
Max, hang on for a second. | ||
Joe, we're going to wrap up here with that. | ||
We're going to have Max back on. | ||
It's been amazing. | ||
Any closing thoughts on this? | ||
Once again, just extending gratitude to Max for coming on and imparting his wisdom to the audience. | ||
I believe that many are well prepared to hear it. | ||
My real parting thought, though, is that what Max Tegmark here is describing is a nightmare scenario in which machines become more intelligent than human beings, and we could argue all day long about what intelligence is, what real thinking is, whether these machines could truly replace every facet of humanity. | ||
I would like to make a statement that is perhaps even darker. | ||
Even if AI doesn't reach that level, not for the next 10 years, 20 years, where it's at right now and where these companies are positioned to deploy it all over the economy, in education, in medicine, I think that there is already a real threat. | ||
That AI is, A, going to replace some jobs, but, B, going to make a lot of jobs kind of uninhabitable for people who don't want to be dehumanized. | ||
Teachers who have to use chatbots to teach their students. | ||
Doctors who constantly have to rely on the benchmarks of machines. | ||
So I just want to emphasize that while AGI is still just in the realm of possibility, the narrow AIs that we have right now, I think, could do real damage before any kind of regulation could actually be pushed through. | ||
And so just to reiterate what we've been saying for a long time, it's going to be up to those people listening, to regular citizens, to make informed choices as to whether or not they want to turn their children over to chatbots for education or turn their own bodies over to doctors who are really reliant on AI as their means of diagnosing disease, and so on and so forth. | ||
But anyway, thank you very much, Max, and thank you, Steve. | ||
Yeah, no, we're going to get organized in this. | ||
Hang on one second, Joe. | ||
Max, we've had the Turing moment, as you said. | ||
Where do people get you? | ||
I want to get access to your writings, access to your book, access to your social media. | ||
Where do they go, Max? | ||
unidentified
|
I'm on Twitter. | |
Tegmark is my Twitter handle. | ||
I often piss off tech bros by things I post there. | ||
And if someone is interested in reaching out to me personally, my email is in plain sight also. | ||
It's just tegmark at mit.edu. | ||
And I just want to maybe end with a little appeal here. | ||
There are so many different things that people argue about and fight about right now on this planet, but in this battle, as was eloquently said there by Joe, it's really team human versus team machine. | ||
And we have to all ask ourselves, which side are we on? | ||
And I would encourage anyone who listens to this, if someone comes up and tells them, oh, I want to go work for this tech company and build AGI, give them a hard time. | ||
Ask them, what team are they on, actually? | ||
Why is this supposed to be good for team human, that we just build our own replacements? | ||
We are in charge of this planet. | ||
And let's use that fact to build tools that improve the lives for all of us, not to just throw away the keys to... | ||
Max, honored to have you on here. | ||
We'll check in after the show and then look forward to having you back, brother. | ||
Fantastic discussion. | ||
Appreciate you. | ||
Incredible. Joe Allen, where do people get you? | ||
I know you're on special assignment for us. | ||
People miss you coming on every couple of days. | ||
Where do folks go? | ||
You can find my writings at joebot.xyz, social media at joebot.xyz. | ||
Thank you very much, Steve. | ||
Joe Allen, on a canceled Black Monday, great way to end the show. | ||
Hey, look, folks. | ||
If not you, who? | ||
This is why people come on the War Room. | ||
They want access to a group of fighters. | ||
Are you on Team Human? | ||
Are you not? | ||
Are we going to be taken over by thinking machines? | ||
Are we going to build our own replacements? | ||
I don't think so. | ||
But it's not just going to happen. | ||
We have to make it happen. | ||
We're going to leave with the right stuff. | ||
You know why? | ||
You got it. | ||
President Trump has it. | ||
The MAGA movement has it. |