This is a free preview of a paid episode. To hear more, visit radixjournal.substack.com

Richard begins by talking about why he's come to respect Ye-ism and why it's different from other third-party or populist campaigns. The conversation then moves into the territory of Artificial Intelligence. In the free selection, Richard strongly argues that transcendent AI is a Hollywood metaphor and, in fact, imp…
We have actually used language for a shorter amount of time than we would imagine.
And, you know, dogs can understand words.
I'm not sure a dog quite has a grammar, but they understand a sound wave as indicating something.
A dog knows its name.
If he hears the word walk or...
Dinner.
And he perks up and is going to look at you and be like, oh, walk.
Yeah, that sounds great.
Yeah, let's do it.
So he understands language to some extent, but there's no actual grammar or logic exactly.
Now, you know, we're homo sapiens.
So, you know, wise man, the rational animal or something like that.
That is actually extremely incorrect.
I actually just saw something about this today.
There was an experiment in Germany showing that fish have a sense of numbers.
They have a sense of a larger and a smaller number.
And they can actually engage in a sort of addition to some extent.
And they don't have a frontal brain anywhere close to the extent that we have one.
We might even overestimate the head as the seat of reason or something.
We have reason in our spinal cord.
And I've used this metaphor quite a bit, so I apologize if people are getting bored of it, but there is literally no time to think if you are standing at home plate and someone is throwing 70, 80, 90, 100 miles per hour.
You cannot think in that split second when you determine what pitch it is.
Is it a curve?
Is it a fastball?
Is it a changeup?
Is it inside?
Is it outside?
Is this the pitch I want to hit?
You have absolutely no time to think that.
And yet you do.
Could some of these baseball players explain to you how a curveball curves? They can't.
But they just do it and they know it in their bones.
Maybe kind of literally in their bones.
An outfielder hears a crack off the bat.
The sound of the crack gives him information.
He sees the ball, maybe even kind of peripherally to some degree, and he sees it travel like 50 feet, and he estimates exactly where he should run to.
And he hops to the exact spot, opens up his glove, and in a lackadaisical manner catches it.
There is reason, mathematics, rationality in our spinal cord.
And we kind of don't grasp this.
And your thinking consciousness, when you're using language, it is kind of almost like a late stage of this.
And there also have been experiments, I think I've mentioned these to other people, and I'll mention two.
One of which is that your muscles will engage before you think to pick up your coffee.
Now, does that mean that we're all predetermined?
No, it does not mean that at all.
What it means is that you are telling yourself in your mind using language, I want coffee, as a kind of post-facto rationalization of what you are instinctively doing.
Another thing.
So there's an experiment where people are blindfolded and told to pick up objects, and they're told, we want you to judge the texture of these objects.
And so you'll pick one up that will be like furry and you'll pick up another one that will be slick.
And then they'll ask which object was heavier, and people will get it right.
That might sound dumb or obvious.
No, it's not dumb or obvious.
What that means is that not only are they engaged in reason, they're engaging in judgment unconsciously.
Your thinking, your language, comes as a kind of aftereffect.
And we developed language, and it's obviously immensely powerful, but we kind of shouldn't overestimate it.
We act in certain ways that are...
Amazing and miraculous long before our language mind even gets its pants on in the morning.
And why I say all of this is that AI is nothing if not language.
I mean, it is computer code.
And ultimately, if you want to just reduce it to like the most basic thing, it is still language.
It is a binary, a one or a zero.
Machine code.
The most basic kind of thing.
It is pure language.
What I am saying is that there's a kind of overestimation of language in these people who worry about like AI transcending itself or taking over the world or something like that.
It kind of can't think on some level.
Now, language matters.
Language affects us.
Language, our internal monologue is our way of rationalizing behavior.
It's also a kind of superego.
You could say it directs us and so on.
But it also shouldn't be overestimated.
We're biology.
We have instincts.
There were men around before language.
We got on.
We built campfires and hunted shit and had sex and reproduced and raised children and did a lot of stuff.
You know?
None of that is remotely available to the computer.
The computer is pure code.
So this notion that it's anything other than some logical thing and that it's going to be able to think for itself or overcome itself, I just find ridiculous.