Max Tegmark on AI Superintelligence: We're Building an 'Alien Species' That Would Replace Us | Time
Why is it, you know, that when you go down to the zoo here in Washington, it's the tigers that are in the cages, not the humans?
Why is that?
Is it because we are stronger than the tigers or have sharper teeth?
Of course not.
It's because we're smarter, right?
And it's kind of natural that the smarter species will dominate the planet.
And that was the basic point Turing made.
It still stands.
People who don't see this, I think, make the mistake of thinking of AI as just another technology, like the steam engine or the internet, you know, whereas Turing himself was clearly thinking about it as a new species.
That might sound very weird, because your laptop isn't a species, of course. But if you think of a species as something that can make copies of itself, have its own goals, and do things, then AI fits. Of course, they can also build robot factories and build new robots.
They can build better robots.
They can build still better robots, etc., and then gradually get more and more advanced than us.
That checks every box of being a new species.
Before that point, we have to ask ourselves: is that really what we want to do?
You know, we have this incredible gift.
We're placed here as stewards of this earth, you know, with an ability to use technology for all sorts of wonderful stuff, and I love tech as much as anyone.
But don't we want to aim for something more ambitious than just replacing ourselves?
So Enrico Fermi, the famous physicist, built the first-ever nuclear reactor in 1942, actually under a football stadium in Chicago.
It totally freaked out the world's top physicists when they learned about this.
Not because the reactor was dangerous in any way, it wasn't, you know, but because they realized, oh my, this was the last big hurdle to building the nuclear bomb.
That was like the Turing test for nuclear bombs.
Now they knew.
Maybe it'll take three years, four years, two years.
But it's going to happen.
It's just a matter of engineering now.
In fact, it took three years: the Trinity test, the first bomb, was in July of 1945.
It's exactly the same with the Turing test. When that was passed, now, with things like GPT-4, it became clear that from here on out, building things we could actually lose control over is just engineering. There's no fundamental obstacle anymore.
And instead, the question is: are we going to want to do this or not?
Can I inject some optimism into this? We're just feeling a little gloomy about the direction we're going here.
We still have not actually had a nuclear war between the superpowers because there was a political will not to have one.
So we didn't have one, you know, even though there were some close calls.
It's absolutely not a foregone conclusion either that we're just going to hand over the keys to our future to some stupid alien machines because, frankly, almost nobody wants it.
There's actually a very small number of tech people who have gone on YouTube and places like that and said that they want humanity to be replaced by machines, because the machines are somehow smarter and that's better.
I think of that as digital eugenics.
I can't even imagine what these people are actually thinking.
It's insane.
Isn't it by definition, maybe?
I don't know.
Yeah, I think they're sort of salivating over some sort of digital master race.
I'm not a psychologist.
I'm not the one who can get into their minds exactly.
I believe in free speech.
I believe they should have the right to want what they want.