Speaker | Time | Text |
---|---|---|
Four, three, two, one. | ||
Hello, Lex. | ||
Hey. | ||
We're here, man. | ||
What's going on? | ||
We're here. | ||
Thanks for doing this. | ||
You brought notes. | ||
You're seriously prepared. | ||
When you're jumping out of a plane, it's best to bring a parachute. | ||
This is my parachute. | ||
I understand. | ||
Yeah. | ||
How long have you been working in artificial intelligence? | ||
My whole life, I think. | ||
Really? | ||
So when I was a kid, I wanted to become a psychiatrist. | ||
I wanted to understand the human mind. | ||
I think the human mind is the most beautiful mystery that our entire civilization has taken on exploring through science. | ||
I think you look up at the stars and you look at the universe out there. | ||
You had Neil deGrasse Tyson here. | ||
It's an amazing, beautiful, scientific journey that we're taking on in exploring the stars, but the mind to me is a bigger mystery and more fascinating, and it's been the thing I've been fascinated by from the very beginning of my life. I think all of human civilization has been wondering: you know, what is inside this thing? | ||
The hundred trillion connections that are just firing all the time, somehow making the magic happen to where you and I can look at each other, make words, all the fear, love, life, death that happens is all because of this thing in here. | ||
And understanding why is fascinating. | ||
And what I understood early on is that one of the best ways, for me at least, to understand the human mind is to try to build it. | ||
And that's what artificial intelligence is. | ||
It's not enough, from a psychology perspective, from a psychiatry perspective, to study it, to investigate from the outside. | ||
The best way to understand is to do. | ||
So you mean almost like reverse engineering a brain? | ||
There's some stuff, exactly, reverse engineering the brain. | ||
There's some stuff that you can't understand until you try to do it. | ||
You can hypothesize your... | ||
I mean, we're both martial artists from various directions. | ||
You can hypothesize about what is the best martial art. | ||
But until you get in the ring, like what the UFC did... | ||
And test ideas, that's when you first realize that the touch of death I've seen some YouTube videos on, that perhaps you cannot kill a person with a single touch, or with your mind, or telepathy, and that there are certain things that do work. | ||
Wrestling works. | ||
Punching works. | ||
Okay, can we make it better? | ||
Can we create something like a touch of death? | ||
Can we figure out how to turn the hips, how to deliver a punch in a way that does do a significant amount of damage? | ||
And then you, at that moment, when you start to try to do it, and you face some of the people that are trying to do the same thing, that's the scientific process. | ||
And as you try, you actually begin to understand what intelligence is. | ||
And you begin to also understand how little we understand. | ||
It's like Richard Feynman, who I'm dressed after today. | ||
Are you? | ||
He's a physicist. | ||
I'm not sure if you're sure. | ||
Yeah, he always used to wear this exact thing, so I feel pretty badass wearing it. | ||
If you think you know astrophysics, you don't know astrophysics. | ||
That's right. | ||
Well, he said it about quantum physics, right? | ||
Quantum physics, that's right. | ||
So he was a quantum physicist. | ||
I remember hearing him talk about that understanding the nature of the universe, of reality, could be like an onion. | ||
We don't know. | ||
But it could be like an onion to where you think you know you're studying a layer of an onion and then you peel it away and there's more. | ||
And you keep doing it and there's an infinite number of layers. | ||
With intelligence, there's the same kind of component to where we think we know. | ||
We got it. | ||
We figured it out. | ||
We figured out how to beat the human world champion at chess. | ||
We solved intelligence. | ||
And then we tried the next thing. | ||
Wait a minute. | ||
Go is really difficult to solve as a game. | ||
And then you say, okay. | ||
I came up when the game of Go was impossible for artificial intelligence systems to beat, and it has now recently been beaten. | ||
Within the last five years, right? | ||
The last five years. | ||
There's a lot of technical, fascinating things of why that victory is interesting and important for artificial intelligence. | ||
It requires creativity, correct? | ||
It does not. | ||
It just exhibits creativity. | ||
So the technical aspects of why AlphaGo from Google DeepMind, they were the designers and builders of the system that was the victor, they did a few very interesting technical things where, essentially, you develop a neural network. | ||
This is a type of artificial intelligence system that looks at a board of Go, which has a lot of elements on it, black and white pieces, and is able to tell you: how good is this situation? | ||
And how can I make it better? | ||
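To make that concrete, here is a minimal sketch, assuming PyTorch, of what such a position-evaluation ("value") network could look like. The architecture, layer sizes, and two-plane board encoding are illustrative stand-ins, not AlphaGo's actual design:

```python
# Minimal sketch of a Go "value network" (illustrative, not AlphaGo's design):
# it maps a 19x19 board position to a single score for "how good is this situation?"
import torch
import torch.nn as nn

class ValueNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 32, kernel_size=3, padding=1),  # 2 input planes: black stones, white stones
            nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 19 * 19, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
            nn.Tanh(),  # score in [-1, 1], from losing to winning
        )

    def forward(self, board):  # board: (batch, 2, 19, 19)
        return self.head(self.features(board))

net = ValueNet()
empty_board = torch.zeros(1, 2, 19, 19)
print(net(empty_board))  # an untrained guess at "how good is this situation?"
```

In the real systems, a network like this is paired with a search over candidate moves, which is one way to read "how can I make it better?"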
That idea, so chess players can do this. | ||
I'm not actually that familiar with the game of Go. | ||
I'm Russian, so chess to us is romanticized. | ||
It's a beautiful game. | ||
I think that you look at a board and all your previous experiences, all the things you've developed over tons of years of practice and thinking, you get this instinct of what is the right path to follow. | ||
And that's exactly what the neural network is doing. | ||
And some of the paths it has come up with are surprising to other world champions. | ||
So in that sense, it says, well, this thing is exhibiting creativity because it's coming up with solutions that are something that's outside the box, thinking from the perspective of the human. | ||
How do you differentiate between requires creativity and exhibits creativity? | ||
I think, one, because we don't really understand what creativity is. | ||
So, it's almost... | ||
It's on the level of concepts such as consciousness. | ||
For example, the question which there's a lot of thinking about whether creating something intelligent requires consciousness, requires for us to be actual living beings aware of our own existence. | ||
In the same way, does doing something like building an autonomous vehicle, that's the area I work in, does that require creativity? | ||
Does that even require something like consciousness and self-awareness? | ||
I mean, I'm sure in LA, there's some degree of creativity required to navigate traffic. | ||
And in that sense, you start to think, are there solutions that are outside of the box an AI system needs to create? | ||
Once you start to build it, you realize that to us humans, certain things appear creative, certain things don't. | ||
Certain things we take for granted. | ||
Certain things we find beautiful. | ||
And certain things we're like, yeah, yeah, that's boring. | ||
Well, there's creativity in different levels, right? | ||
There's creativity like to write The Stand, the Stephen King novel. | ||
That requires creativity. | ||
There's something about his... | ||
He's creating these stories. | ||
He's giving voices to these characters. | ||
He's developing these scenarios and these dramatic sequences in the book that's going to get you really sucked in. | ||
That's almost undeniable creativity, right? | ||
Is it? | ||
He's imagining a world. | ||
Isn't it always set in New Hampshire, Massachusetts? | ||
A lot of it's Maine. | ||
Maine, that's right. | ||
So he's imagining a world and imagining the emotion of different levels surrounding that world. | ||
Yeah, that's creative. | ||
Although there are a few really good books, including his own, that talk about writing. | ||
Yeah, he's got a great book on writing. | ||
It's actually called On Writing. | ||
If there's anyone who can write a book on writing, it should be Stephen King. | ||
I think Steven Pressfield. | ||
I hope I'm not saying the wrong thing. | ||
The War of Art. | ||
The War of Art. | ||
Beautiful book. | ||
And I would say, from my recollection, they don't necessarily talk about creativity very much. | ||
That it's really hard work, putting in the hours of every day of just grinding it out. | ||
Well, Pressfield talks about the muse. | ||
Pressfield speaks of it almost in like a strange, mystical sort of connection to the unknown. | ||
I'm not even exactly sure if he believes in the muse, but I think I could put words in his mouth. I have met him. | ||
He's a great guy. | ||
He was on the podcast once. | ||
I think the way he treats it is that if you decide the muse is real and you show up every day and you write as if the muse is real, you get the benefits of the muse being real. | ||
That's right. | ||
Whether or not there's actually a muse that's giving you these wonderful ideas. | ||
And what is the muse? | ||
So, I think of artificial intelligence the same way. | ||
There's a quote by Pamela McCorduck from a 1979 book that I really like. | ||
She talks about the history of artificial intelligence. | ||
AI began with an ancient wish to forge the gods. | ||
And to me, gods, broadly speaking, or religions, represent, kind of like the muse, the limits of possibility, the limits of our imagination. | ||
So it's this thing that we don't quite understand. | ||
That is the Muse. | ||
That is God. | ||
Us chimps are very narrow in our ability to perceive and understand the world, and there's clearly a much bigger, beautiful, mysterious world out there, and God or the Muse represents that world. | ||
And for many people, I think throughout history, and especially in the past hundred years, artificial intelligence has come to represent that a little bit. | ||
It's the thing which we don't understand, which we're both terrified of and crave: creating this thing that is greater than us, that is able to understand the world better than us. | ||
In that sense, artificial intelligence is the desire to create the muse, this other, this imaginary thing. | ||
And I think one of the beautiful things, if you talk about everybody from Elon Musk to Sam Harris to all the people thinking about this, is that there is a mix of fear of that unknown, of creating that unknown, and an excitement for it. | ||
Because there's something in human nature that desires creating that. | ||
Because like I said, creating is how you understand. | ||
Did you initially study biology? | ||
Did you study the actual development of the mind or what is known about the evolution of the human mind? | ||
Of the human mind, yeah. | ||
So my path is different, and it's the same for a lot of computer scientists and roboticists. | ||
We ignore biology, neuroscience, the physiology and anatomy of our own bodies. | ||
And there's a lot of beliefs now that you should really study biology, you should study neuroscience, you should study our own brain, the actual chemistry, what's happening, what is actually, how are the neurons interconnected, all the different kinds of systems in there. | ||
So that is a little bit of a blind spot, or it's a big blind spot. | ||
But the problem is, I started with more philosophy, almost. | ||
It's where, if you think about Sam Harris, in the last couple of years he has started kind of thinking about artificial intelligence. | ||
And he has a background in neuroscience, but he's also a philosopher. | ||
And I started there by reading Camus and Nietzsche or Dostoevsky, thinking what is... | ||
What is intelligence? | ||
What is human morality? | ||
Will? | ||
So all of these concepts give you the context for which you can start studying these problems. | ||
And then I said, there's a magic that happens when you build a robot. | ||
It drives around. | ||
I mean, you're a father. | ||
I'd like to be, but I'm not yet. | ||
There's a creation aspect that's wonderful, that's incredible. | ||
For me, I don't have any children at the moment, but the act of creating a robot where you programmed it and it moves around and it senses the world is a magical moment. | ||
Did you see Alien Covenant? | ||
Is that a sci-fi movie? | ||
Yeah. | ||
No. | ||
Have you ever seen any of the alien films? | ||
So I grew up in the Soviet Union where we didn't watch too many movies. | ||
So I need to catch up. | ||
We should catch up on that one in particular because a lot of it has to do with artificial intelligence. | ||
There's actually a battle between, spoiler alert, two different but identical artificially intelligent synthetic beings that are there to aid the people on the ship. | ||
One of them is very creative and one of them is not. | ||
And the one that is not has to save them from the one that is. | ||
Spoiler alert. | ||
I won't tell you who wins. | ||
But there's a really fascinating scene at the very beginning of the movie where the creator of this artificially intelligent being is discussing its existence with the being itself. | ||
And the being is trying to figure out who made him. | ||
And it's this really fascinating moment and this being winds up being a bit of a problem because it possesses creativity and it has the ability to think for itself and they found it to be a problem. | ||
So they made a different version of it which was not able to create. | ||
And the one that was not able to create was much more of a servant. | ||
And there's this battle between these two. | ||
I think you would find it quite fascinating. | ||
It's a really good movie. | ||
Yeah, the same kind of theme carries through Ex Machina and 2001: A Space Odyssey. | ||
You've seen Ex Machina? | ||
Yeah, I've seen it. | ||
So because of your... | ||
I've listened to your podcast, and because of it, I've watched it a second time. | ||
Because the first time I watched it, I had a Neil deGrasse Tyson moment where it was... | ||
You said there's cut the... | ||
Cut the shit. | ||
Cut the shit moments? | ||
Yes. | ||
For me, the movie opening is... | ||
Everything about it was... | ||
I was rolling my eyes. | ||
Why were you rolling your eyes? | ||
What was the cut the shit moment? | ||
So, that's a general bad tendency that I'd like to talk about amongst people who are scientists that are actually trying to do stuff. | ||
They're trying to build the thing. | ||
It's very tempting to roll your eyes and tune out in a lot of aspects of artificial intelligence discussion and so on. | ||
For me, there are real reasons to roll your eyes, and there's just... well, let me just describe it. | ||
So this person in Ex Machina, no spoiler alerts, is in the middle of what, like a Jurassic Park type situation where he's like in the middle of land that he owns? | ||
Yeah, we don't really know where it is. | ||
It's not established, but you have to fly over glaciers and you get to this place and there's rivers and he has this fantastic compound and inside this compound he appears to be working alone. | ||
Right. | ||
And he's like doing curls, I think, like dumbbells and drinking heavily. | ||
So everything I know about science, everything I know about engineering is it doesn't happen alone. | ||
So the situation of a compound without hundreds of engineers there working on this is not feasible. | ||
It's not possible. | ||
And the other moments like that were the technical ones, the discussion about how it's technically done. | ||
They threw in a few bits of jargon to spice stuff up that don't make any sense. | ||
Well, that's where I am... | ||
Blissfully ignorant. | ||
So I watch it and I go, this movie's awesome! | ||
And you're like, ah, I know too much. | ||
Yeah, I know too much. | ||
But that's a stupid way to think for me. | ||
So once you suspend disbelief, say, okay, well, right, those are not important details. | ||
Yeah, but it is important. | ||
I mean, they could have gone to you or someone who really has knowledge in it and cleaned up those small aspects and still kept the theme of the story. | ||
That's right. | ||
They could have, but they would make a different movie. | ||
But slightly different. | ||
I don't know if it's possible to make. | ||
So you look at 2001: A Space Odyssey. | ||
I don't know if you've seen that movie. | ||
That's the kind of movie, if you talk to scientists, you'll start making. | ||
Because you can't actually use jargon that makes sense because we don't know how to build a lot of these systems. | ||
So the way you need to film it and talk about it is with mystery. | ||
It's this Hitchcock type. | ||
You say very little. | ||
You leave it to your imagination to see what happens. | ||
Here everything was in the open. | ||
Right. | ||
Even in terms of the actual construction of the brain when they had that... | ||
Foam looking, whatever, gel brain. | ||
Right. | ||
Yeah. | ||
If they gave a little bit more subtle mystery, I think I would have enjoyed that movie a lot more. | ||
But the second time, really because of you, you said I think it's your favorite sci-fi movie. | ||
It's absolutely one of my favorite sci-fi movies. | ||
One of my favorite movies, period. | ||
I loved it. | ||
Yeah, so I watched it again and also Sam Harris said that he also hated the movie and then watched it again and liked it. | ||
So I give it a chance. | ||
Why would you see a movie again after you hate it? | ||
Because maybe you're self-aware enough to think there's something unhealthy about the way I hated the movie. | ||
Like you're introspective enough to know. It's like, I have the same experience with Batman, okay? | ||
I watched... | ||
Which one? | ||
Dark Knight, I think. | ||
Christian Bale? | ||
Christian Bale one. | ||
So, to me, the first time I watched that is a guy in a costume, like, speaking excessively with an excessively low voice. | ||
I mean, it's just something with like little bunny ears, not bunny ears, but like little ears. | ||
It's so silly. | ||
But then you go back and, okay, if we just accept that that's the reality of the world we live in, what's the human nature aspects that are being explored here? | ||
What is the beautiful conflict between good and evil that's being explored here? | ||
And what are the awesome graphic effects that are being exhibited, right? | ||
So if you can just suspend that, that's beautiful. | ||
The movie can become quite fun to watch. | ||
But still, to me, not to offend anybody, but superhero movies are still difficult for me to watch. | ||
Yeah, who was talking about that recently? | ||
Was it Kelly? | ||
Kelly Slater? | ||
No. | ||
No, it was yesterday. | ||
Yesterday. | ||
It's Kyle. | ||
It's Kyle. | ||
Yeah, he doesn't like superhero movies or something. | ||
Right. | ||
He doesn't like superhero movies. | ||
We were talking about Batman, about Christian Bale's voice, and he's like, the most ridiculous thing was that he's actually Batman, not his voice. | ||
It's true. | ||
I'm Batman. | ||
That part of it is way less ridiculous than the fact that he's Batman. | ||
He's Batman. | ||
Because anybody can do that voice. | ||
Yeah. | ||
But I contradict myself. | ||
I'm a hypocrite because Game of Thrones or Tolkien's Lord of the Rings, it's totally believable to me. | ||
Yeah, of course. | ||
Dragons and... | ||
Well, that's a fantasy world, right? | ||
That's the problem with something like Batman or even Ex Machina is that it takes place in this world. | ||
Whereas they're in Middle Earth. | ||
They're in a place that doesn't exist. | ||
It's like Avatar. | ||
If you make a movie about a place that does not exist, you can have all kinds of crazy shit in that movie. | ||
Because it's not real. | ||
That's right. | ||
Yeah. | ||
So... | ||
But at the same time, Star Wars is harder for me. | ||
And you're saying Star Wars is a little more real because it feels feasible. | ||
Like you could have spaceships flying around. | ||
Right. | ||
What's not feasible about Star Wars to you? | ||
Oh, I'm not. | ||
I'll leave that one to Neil deGrasse. | ||
He was getting angry about the robot that's circular, when it rolls around. | ||
He's like, it would just be slippery. | ||
Like, trying to roll around all over the sand, it wouldn't work. | ||
It would get no traction. | ||
I was like, that's true. | ||
Because if you had, like, glass tires, smooth tires, and you tried to drive over sand, you'd get nothing. | ||
Yeah. | ||
He's actually the guy that made me realize, you know the movie Ghost with Patrick Swayze? | ||
It was at this podcast or somewhere, he was talking about the fact that this guy could go through walls, right? | ||
It's a beautiful romantic movie that everybody should watch, right? | ||
But he doesn't seem to fall through chairs when he sits on them. | ||
Yeah. | ||
Right? | ||
So he can walk through walls, but he can put his hand on the desk. | ||
Yeah. | ||
And he can sit. | ||
Like, his butt has a magical shield that is in this reality. | ||
This is a quantum shield that protects him from falling. | ||
Yeah. | ||
So that's... | ||
You know, those devices are necessary in movies. | ||
I get it. | ||
Yeah, but you got a good point. | ||
He's got a good point, too. | ||
It's like, there's cut-the-shit moments. | ||
They don't have to be there. | ||
You know, you just have to work them out in advance. | ||
But the problem is a lot of movie producers think that they're smarter than people. | ||
They just decide, ah, just put it in there. | ||
The average person is not going to care. | ||
I've had that conversation with movie producers about martial arts. | ||
And I was like, well, this is just nonsense. | ||
You can't do that. | ||
Like, because I was explaining martial arts to someone. | ||
And he was like, ah, the average person is not going to care. | ||
I'm like, oh, the average person. | ||
Okay, but you brought me in as a martial arts expert to talk to you about your movie. | ||
And I'm telling you right now, this is horseshit. | ||
Yeah, I'm a huge believer in Steve Jobs' philosophy, where, forget the average-person discussion, because, first of all, the average person will care. | ||
Steve Jobs would really push the design of the interior of computers to be beautiful, not just the exterior. | ||
Even if you never see it, if you have attention to detail in every aspect of the design, even if it's completely hidden from the actual user in the end, somehow that karma, whatever it is, that love for everything you do, seeps through the product. | ||
And the same, I think, with movies. | ||
If you talk about 2001: A Space Odyssey, there are so many details. | ||
I think there's probably these groups of people that study every detail of that movie and other Kubrick films. | ||
Those little details matter. | ||
Somehow they all come together to show how deeply passionate you are about telling the story. | ||
Well, Kubrick was a perfect example of that because he would put layer upon layer upon layer of detail into films that people would never even recognize. | ||
Like, there's a bunch of correlations between the Apollo moon landings and The Shining. | ||
People have actually studied it to the point where they think that it's some sort of a confession that Kubrick faked the moon landing. | ||
It goes from the little boy having the rocket ship on his sweater to the number of the room that things happen. | ||
There's a bunch of very bizarre connections in the film that Kubrick unquestionably engineered, because he was just a stupid-smart man. | ||
I mean, he was so goddamn smart that he would do complex mathematics for fun in his spare time. Kubrick was a legitimate genius, and he engineered that sort of complexity into his films. | ||
He didn't have cut-the-shit moments in his movies, nothing I can recall. No, not even close. | ||
This was very interesting. | ||
I mean, but that probably speaks to the reality of Hollywood today, that the cut-the-shit moments don't affect the bottom line of how much the movie makes. | ||
Well, it really depends on the film, right? | ||
I mean, the cut-the-shit moments that Neil deGrasse Tyson found in Gravity, I didn't see because I wasn't aware of what the effects of gravity on a person's hair would be. | ||
You know, he saw it and he was like, this is ridiculous. | ||
And then there were some things like, why are these space stations so close together? | ||
I just let it slide while the movie was playing, but then he went into great detail about how preposterous it would be that those space stations were that close together that you could get to them so quickly. | ||
That's with Sandra Bullock and the good-looking guy. | ||
And George Clooney. | ||
George Clooney. | ||
Yeah, the good-looking guy. | ||
So did that pass muster with Neil deGrasse Tyson? | ||
No. | ||
He tore it apart. | ||
And when he tore it apart, people went crazy. | ||
They got so angry at him. | ||
Yeah, he reads the negative comments, as you've talked about. | ||
I actually recently, because of doing a lot of work on artificial intelligence and lecturing about it and so on, have plugged into this community of folks that are thinking about the future of artificial intelligence, artificial general intelligence, and they are very much out-of-the-box thinkers, to where the kind of messages I get are... So I let them kind of explore those ideas without sort of engaging in those discussions. | ||
I think very complex discussions should be had with people in person. | ||
That's what I think. | ||
And I think that when you allow comments, just random anonymous comments to enter into your consciousness, like you're taking risks. | ||
And you may run into a bunch of really brilliant ideas. | ||
That are, you know, coming from people that are considerate, that have thought these things through, or you might just run into a river of assholes. | ||
And it's entirely possible. | ||
I peeked into my comments today on Twitter and I was like, what in the fuck? | ||
I started reading a couple of them, some just morons. | ||
And I'm like, alright, arguing about some shit, I don't even know what the fuck they were talking about. | ||
But that's the risk you take when you dive in. | ||
You're going to get people that are disproportionately, you know, delusional or whatever it is in regards to your position on something. | ||
Or whether or not they even understand your position. | ||
They'll argue something that's an incorrect interpretation of your position. | ||
Yeah, and you've actually, from what I've heard, been really good at being open-minded, on this podcast and so on. | ||
And that's something I try to preach as well. | ||
So in AI discussions, when you're talking about AGI, there's a difference between narrow AI and general artificial intelligence. Narrow AI is the kind of tools that are being applied now and being quite effective. | ||
And then there's general AI, which is a broad categorization of concepts that are human-level or superhuman-level intelligence. | ||
When you talk about AGI, Artificial General Intelligence, there seems to be two camps of people. | ||
One is people who are really working deep in it, that's the camp I kind of sit in, and a lot of those folks tend to roll their eyes and just not engage in any discussion of the future. | ||
Their idea is: it's really hard to do what we're doing, and it's just really hard to see how this becomes intelligent. | ||
And then there's another group of people who say, yeah, but you're being very short-sighted, that you may not be able to do much now, but the exponential, the hard takeoff overnight, it can become super intelligent, and then it'll be too late to think about. | ||
Now, the problem with those two camps, as with any camps, Democrat, Republican, any camps, is that they seem to be talking past each other, as opposed to recognizing that both have really interesting ideas. | ||
If you go back to the analogy of touch of death, of this idea of MMA, right? | ||
So I'm in this analogy. | ||
I'm going to put myself in the UFC for a second. | ||
In this analogy, I'm ranked in the top 20. I'm working really hard. | ||
My dream is to become a world champion. | ||
I'm training three times a day. | ||
I'm really working. | ||
I'm an engineer. | ||
I'm trying to build my skills up. | ||
And then there's other folks that come along, like Steven Seagal and so on, that kind of talk about other kinds of martial arts, other ideas of how you can do certain things. | ||
And I think Steven Seagal might be on to something. | ||
I think we really need to be open-minded. | ||
Like Anderson Silva, I think, talks to Steven Seagal. | ||
Or somebody talks to Steven Seagal, right? | ||
Well, Anderson Silva thinks Steven Seagal is... | ||
I want to put this in a respectful way. | ||
And Anderson Silva has a wonderful sense of humor. | ||
And Anderson Silva is very playful. | ||
And he thought it would be hilarious if people believed that he was learning all of his martial arts from Steven Seagal. | ||
He also legitimately loves Steven Seagal movies, so he treated him with a great deal of respect. | ||
He also recognizes that Steven Seagal actually is a master of Aikido. | ||
He really does understand Aikido and was one of the very first Westerners that was teaching in Japan. | ||
Speaks fluent Japanese, was teaching at a dojo in Japan, and is a legitimate master of Aikido. | ||
The problem with Aikido is, it's one of those martial arts that has merit in a vacuum. | ||
If you're in a world where there's no NCAA wrestlers, or no Judo players, or no Brazilian Jiu Jitsu black belts, or no Muay Thai kickboxers, there might be something to that Aikido stuff. | ||
But in the world where all those other martial arts exist, and we've examined all the intricacies of hand-to-hand combat, it falls horribly short. | ||
Well, see, this is the point I'm trying to make. | ||
You just said that we've investigated all the intricacies. | ||
You said all the intricacies of hand-to-hand combat. | ||
I mean, you're just speaking, but you want to open your mind to the possibility that Aikido has some techniques that are effective. | ||
Yeah, when I say all, you're correct. | ||
That's not a correct way of describing it. | ||
Because there's always new moves that are being, like, for instance, in this recent fight between Anthony Pettis and Tony Ferguson, Tony Ferguson actually used Wing Chun in a fight. | ||
He trapped one of Anthony Pettis' hands and hit him with an elbow. | ||
He basically used a technique that you would use on a Wing Chun dummy, and he did it in an actual world-class mixed martial arts fight. | ||
And I remember watching it, wow, going, this crazy motherfucker actually pulled that off. | ||
Because it's a technique that you just rarely see anybody getting that proficient at it that fights in MMA. And Ferguson is an extremely creative and open-minded guy, and he figured out a way to make that work in a world-class fight. | ||
So, and let me then ask you the question, there's these people who still believe, quite a lot of them, that there is this touch of death, right? | ||
So, do you think it's possible to discover, through this rigorous scientific process that is MMA, which started pretty recently, not the touch of death, but a 10x improvement in the amount of power the human body can generate in punching? | ||
No, certainly not 10x. | ||
I think you can get incremental improvements, but it's all based entirely on your frame. | ||
Like, if you're a person that has very small hands and narrow shoulders, you're kind of screwed. | ||
There's not really a lot of room for improvement. | ||
You can certainly get incremental improvement in your ability to generate power, but you'll never be able to generate the same kind of power as, say, a guy with a very big frame like Brock Lesnar or Derrick Lewis, or anyone who has the classic elements that go with being able to generate large amounts of power: wide shoulders, large hands. | ||
There are a lot of characteristics of the human frame itself. Even for those people, there's only so much power you can generate, and we pretty much know how to do that correctly. | ||
So the way you're talking about this as a martial arts expert now is kind of the way a lot of the experts in robotics and AI talk about AI when the topic of a touch of death is brought up. | ||
Now, the analogy is not perfect. | ||
I tend to use probably too many analogies. | ||
We maybe know the human body better than we know the possibility of AI. I would assume so, right? | ||
Because the possibility of AI is basically limitless once AI starts redesigning itself. | ||
It's not obvious that that's true. | ||
Our imagination allows it to be true. | ||
I'm of two minds. | ||
I can hold both beliefs that are contradicting in my mind. | ||
One is that idea is really far away, almost bordering on BS, and the other is it can be there overnight. | ||
I think you can believe both those things. | ||
There's another quote, from Barbara Wootton. | ||
It's from a poem I heard in a lecture somewhere that I really like, which is: it's from the champions of the impossible rather than the slaves of the possible that evolution draws its creative force. | ||
So I see Elon Musk as a representative of the champion of the impossible. | ||
I see exponential growth of AI within the next several decades as the impossible. | ||
But it's the champions of the impossible that actually make the impossible happen. | ||
Why would exponential growth of AI be impossible? | ||
Because it seems inevitable to me. | ||
So, it's not impossible. | ||
I'm sort of using the word impossible meaning... | ||
Magnificent? | ||
Yeah, it feels very difficult. | ||
Very, very difficult. | ||
We don't even know where to begin. | ||
Grand. | ||
Yep, like the touch of death actually feels. | ||
Yeah, but see, the touch of death is horse shit. | ||
But see, you're an expert. | ||
Someone's like, ah, and they touch you in the chest. | ||
But we don't have the ability in the body to generate that kind of energy. | ||
How do you know that? | ||
That's a good question. | ||
It's never been done. | ||
We understand so much about physiology. | ||
How do you know it's never been done? | ||
There could be someone out there with magic that has escaped my grasp. | ||
No, you've studied, you've talked about with Graham Hancock, you've talked about the history, maybe it was in Roman times, that idea was discovered and then it was lost. | ||
Because weapons are much more effective ways of delivering damage. | ||
Now I find myself in a very uncomfortable position of defending the concept, as a martial artist, defending the concept of this. | ||
What martial arts did you study? | ||
Jiu-Jitsu and Judo and wrestling. | ||
Those are the hard ones. | ||
Jiu-Jitsu, Judo and wrestling, those are absolute martial arts, in my opinion. | ||
This is what I mean. | ||
If you are a guy who just has a fantastic physique and incredible speed and ridiculous power, you just can generate ridiculous power. | ||
You know who Deontay Wilder is? | ||
Yes. | ||
Heavyweight champion of the world, boxer. | ||
You have, what's his name? | ||
Tyson Fury. | ||
Tyson Fury on tomorrow. | ||
Tomorrow, yes. | ||
Two undefeated guys, right? | ||
Yes. | ||
Deontay Wilder has fantastic power. | ||
I mean, he just knocks people flying across the ring. | ||
He's just... | ||
I think Deontay Wilder, if he just came off the street, if he was 25 years old and no one ever taught him how to box at all, and you just wrapped his hands up and had him hit a bag, he would be able to generate insane amounts of force. | ||
If you're a person that really didn't have much power, and you had to box Deontay Wilder, and you were both the same age, and you were a person that knew boxing and you stood in front of Deontay, it's entirely possible that Deontay Wilder could knock you into another dimension, even though he had no experience in boxing. | ||
If he just held on to you and hit you with a haymaker, he might be able to put you out. | ||
If you're a person who is, let's say, built like you, a guy who exercises, who's strong, and then there's someone who's identically built like you, who's a black belt in Brazilian Jiu Jitsu, and you don't have any experience in martial arts at all, you're fucked. | ||
Right? | ||
Yes. | ||
If you're a person who's built like you, who's a guy who exercises and is healthy, and you grapple with a guy who's even stronger than you and bigger than you, but he has no experience in Brazilian Jiu-Jitsu, he's still fucked. | ||
Yeah. | ||
That's the difference. | ||
That's why I think Brazilian Jiu-Jitsu and Judo and wrestling in particular, those are absolutes in that you have control of the body. | ||
Yes. | ||
And once you grab a hold of a person's body, there's no... | ||
Lucky triangle chokes in jiu-jitsu. | ||
That's right. | ||
But I think I would say jiu-jitsu is the highest representative of that. | ||
I think in wrestling and judo, having practiced those, I've never been quite as humble as I have been in jiu-jitsu. | ||
Especially when I started, I was powerlifting. | ||
I was a total meathead. | ||
And a 130-pound guy or girl could tap you easily. | ||
Yeah, it's confusing. | ||
It's very confusing. | ||
In wrestling, you can get pretty far with that meathead power. | ||
And in judo... | ||
A little bit less so at its highest levels. | ||
If you go to Japan, for example, the whole dream of Judo is to effortlessly throw your opponent. | ||
But if you go to gyms in America and so on, there's some hard wrestling-style gripping and just beating each other up pretty intensely, where we're not talking about beautiful uchi matas or these beautiful throws. | ||
We're talking about some scrapping, some wrestling style. | ||
Yeah. | ||
Yeah, no, I see what you're saying. | ||
Yeah, my experience with jiu-jitsu was very humbling when I first started out. | ||
I had a long background in martial arts and striking, and even wrestled in high school. | ||
And then I started taking jiu-jitsu, and a guy who was my size, and I was young at the time, and he was basically close to my age, just mauled me. | ||
And he wasn't even a black belt. | ||
I think he was a purple belt. | ||
He might have been a blue belt. | ||
I think he was a purple belt. | ||
And just destroyed me. | ||
Just did anything he wanted to me. | ||
Choked me. | ||
Armbarred me. | ||
And I remember thinking, man, I am so delusional. | ||
I thought I had a chance. | ||
I thought just based on taking a couple classes and learning what an armbar is and then being a strong person who has a background in martial arts that I would be able to at least hold him off a little bit. | ||
No. | ||
That's so beautiful. | ||
I feel lucky to have had that experience of having my ass kicked in Philadelphia, which is where I came up. | ||
Because in science you don't often get that experience. | ||
In the space of ideas you can't choke each other out. | ||
You can't beat each other up in science. | ||
So it's easy to go your whole life. | ||
I have so many people around me telling me how smart I am. | ||
There's no way to actually know if I'm smart or not, because I think I'm full of BS. And in the same realm as fighting, it's what Rickson Gracie said, or Saulo Ribeiro or somebody: the mat doesn't lie. | ||
There's this deep honesty in it that I'm really grateful. | ||
Almost like wanting... you know, you talk about bullies, or even just my fellow academics, they could benefit significantly from training a little bit. | ||
I think so too. | ||
It's a beautiful thing. I think it's been talked about, sort of requiring it in high school. | ||
Yeah, we've talked about it many times, yeah. | ||
I think it's a more humbling sport, to be honest, than wrestling, because you could, in wrestling, like I said, get away with some muscle. | ||
It's also what martial arts are supposed to be, in that a small person who knows technique can beat a big person who doesn't know the technique. | ||
That's right. | ||
That's what we always hoped for, right? | ||
When we saw the Bruce Lee movies, and Bruce Lee, who's a smaller guy, could beat all these bigger guys just because he had better technique. | ||
That is actually real in jiu-jitsu, and it's one of the only martial arts where that's real. | ||
Yeah, and in Philadelphia, you had Steve Maxwell here, right? | ||
Sure. | ||
That was the spring of jiu-jitsu in Philadelphia. | ||
Yeah, he was one of the very first American black belts in jiu-jitsu way back in the day. | ||
I believe he was a black belt in the very early 90s when jiu-jitsu was really just starting to come to America. | ||
And he had Maxercise. | ||
Maxercise, yeah. | ||
In Philadelphia. | ||
It's still there. | ||
And then I trained at Balance, which is a few Gracie folks: Phil Migliarese, Rick Migliarese, Josh Vogel. | ||
I mean, especially the Migliarese brothers, these couple of black belts, they came up together. | ||
They're... | ||
Well, they're smaller. | ||
They're little guys. | ||
And I think those were the guys that really humbled me pretty quickly. | ||
Well, little guys are the best to learn technique from. | ||
Yeah. | ||
Because they can't rely on strength. | ||
There's a lot of really big, powerful, you know, 250-pound jiu-jitsu guys who are never going to develop the sort of subtlety of technique that some smaller guys have, like the Miyao brothers, who from the very beginning have never had an advantage in weight and size. | ||
And so they've never been able to use anything but perfect technique. | ||
Eddie Bravo's another great example of that, too. | ||
He competed in the 140-pound, 145-pound class. | ||
But to get back to artificial intelligence, so the idea is that there's two camps. | ||
There's one camp that thinks that the exponential increase in technology and that once artificial intelligence becomes sentient it could eventually improve upon its own design and literally become a god in a short amount of time. | ||
And then there's the other school of thought that thinks that is so far outside of the realm of what is possible today that even the speculation of this eventually taking place is kind of ludicrous to imagine. | ||
Right, exactly. | ||
And a balance needs to be struck, because I'd like to talk about sort of the short-term threats that are there. | ||
And that's really important to think about. | ||
But the long term threats, if they come to fruition, will overpower everything, right? | ||
That's really important to think about. | ||
But what happens is if you think too much about the encroaching doom of humanity, there's some aspect to it that is paralyzing, where it turns you off from actually thinking about these ideas. | ||
There's something so appealing. | ||
It's like a black hole that pulls you in. | ||
And if you notice, folks like Sam Harris and so on spend a large amount of the time talking about the negative stuff, about something that's far away. | ||
Not that it's wrong to talk about it, but they spend very little time on the potential positive impacts. | ||
In the near term and also the negative impacts in the near term. | ||
Let's go over those. | ||
Yep. | ||
Fairness. | ||
So the more and more we put decisions about our lives into the hands of artificial intelligence systems, whether you get a loan, or in an autonomous vehicle context, or in terms of recommending jobs for you on LinkedIn, all these kinds of things, the idea of fairness, of bias in these machine learning systems, becomes a really big threat, because the way current artificial intelligence systems function is they train on data. | ||
So there's no way for them to somehow gain a greater intelligence than the data we provide them with. | ||
So we provide them with actual data and so they carry over, if we're not careful, the biases in that data, the discrimination that's inherent in our current society as represented by the data. | ||
So they'll just carry that forward. | ||
Like how so? | ||
So there's people working on this, more so to show the real negative impacts, in terms of getting a loan, or saying whether this particular human being should be convicted of a crime or not. | ||
There are ideas there that can carry over. You know, in our criminal system there's discrimination. | ||
And if you use data from that criminal system to then assist deciders, judges, juries, lawyers, in making a decision about what kind of penalty a person gets, they're going to carry that forward. | ||
So you mean like racial, economic biases? | ||
Racial, economic, yeah. | ||
Geographical? | ||
And that's, I don't study that exact problem, but you're aware of it because of the tools we're using. | ||
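As a toy illustration of that carry-over, here is a minimal sketch using scikit-learn with entirely fabricated data (the features, the bias, and the numbers are all hypothetical): a model trained on historically biased loan decisions learns the bias even though nobody programs it in.

```python
# Hypothetical illustration: a model trained on biased historical decisions
# reproduces the bias, with no explicit rule telling it to discriminate.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
income = rng.normal(50, 15, n)          # a legitimate feature
group = rng.integers(0, 2, n)           # a protected attribute (fabricated)
# Historical approvals were biased against group 1, independent of income:
approved = (income + rng.normal(0, 5, n) - 10 * group) > 45

X = np.column_stack([income, group])
model = LogisticRegression(max_iter=1000).fit(X, approved)
print(model.coef_)  # the model learns a real negative weight on `group`
```

Nothing here is a real lending dataset; the point is only that whatever discrimination is in the training data comes out the other side.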
So, the two ways... I'd like to talk about neural networks, Joe. | ||
Sure, let's do it. | ||
Okay, so with the current approaches, there's been a lot of demonstrated, exciting new improvements in our advancement of artificial intelligence. | ||
And those, for the most part, have to do with neural networks, something that's been around since the 1940s. | ||
It's gone through two AI winters where everyone was super hyped and then super bummed and super hyped again and bummed again and now we're in this other hype cycle. | ||
And what neural networks are is these collections of interconnected simple compute units. | ||
They're all similar. | ||
It's kind of like it's inspired by our own brain. | ||
We have a bunch of little neurons interconnected, and the idea is these interconnections are really dumb and random, but if you feed it some data, they'll learn to connect, just like they do in our brain, in a way that interprets that data. | ||
They form representations of that data and can make decisions. | ||
But there's only two ways to train those neural networks that we have now. | ||
One is we have to provide a large data set. | ||
If you want the neural network to tell the difference between a cat and a dog, you have to give it 10,000 images of a cat and 10,000 images of a dog. | ||
You need to give it those images. | ||
And who tells you what a picture of a cat and a dog is? | ||
It's humans. | ||
So it has to be annotated. | ||
So as teachers of these artificial intelligence systems, we have to collect this data. | ||
We have to invest a significant amount of effort and annotate that data. | ||
And then we teach neural networks to make that prediction. | ||
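Here is a minimal sketch of that supervised recipe, assuming PyTorch, with random tensors standing in for the human-annotated photos:

```python
# Minimal sketch of supervised training: learn to predict human-provided labels.
import torch
import torch.nn as nn

model = nn.Sequential(                     # a tiny stand-in image classifier
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, 2),                      # two outputs: cat, dog
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

images = torch.randn(64, 3, 32, 32)        # stand-in for annotated photos
labels = torch.randint(0, 2, (64,))        # 0 = cat, 1 = dog, from human annotators

for step in range(100):                    # fit the network to the annotations
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```

The network can only ever be as good as those annotations, which is exactly the limitation being described.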
What's not obvious there is how poor a method that is for achieving any kind of greater degree of intelligence. | ||
You're just not able to get very far besides very specific narrow tasks of cat versus dog, or should I give this person a loan or not? | ||
These kind of simple tasks. | ||
I would argue autonomous vehicles are actually beyond the scope of that kind of approach. | ||
And then the other realm of where neural networks can be trained is if you can simulate that world. | ||
So if the world is simple enough or is conducive to be formalized sufficiently to where you can simulate it. | ||
So a game of chess, there's rules. | ||
A game of Go, there's rules. | ||
So you can simulate it. | ||
The big exciting thing about Google DeepMind is that they were able to beat the world champion by doing something called competitive self-play, which is to have two systems play against each other. | ||
They don't need the human. | ||
They play against each other. | ||
But that only works, and that's a beautiful idea and super powerful and really interesting and surprising, but that only works on things like games and simulation. | ||
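Here is a minimal toy sketch of that self-play loop; the Game and Policy classes are hypothetical placeholders, not any real library's API:

```python
# Toy competitive self-play: two copies of a policy improve by playing each
# other, with no human data. Everything here is a deliberately crude stand-in.
import random

class Game:
    """Stand-in two-player game: the higher move value wins."""
    def play(self, policy_a, policy_b):
        return 0 if policy_a.move() >= policy_b.move() else 1  # winner's index

class Policy:
    def __init__(self):
        self.skill = 0.0
    def move(self):
        return self.skill + random.random()
    def learn_from(self, winner):  # crude update: shift toward the winner
        self.skill += 0.1 * (winner.skill - self.skill) + 0.01

game, a, b = Game(), Policy(), Policy()
for episode in range(1000):       # no human in the loop
    players = (a, b)
    w = game.play(a, b)
    players[1 - w].learn_from(players[w])
print(a.skill, b.skill)           # both improve by playing each other
```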
So now, sorry to keep going to analogies like the UFC, for example, but if I wanted to train a system to become the world champion, to be, what's his name, Nurmagomedov, right? | ||
I could play the UFC game. | ||
I could create two neural networks that use competitive self-play to play in that virtual world. | ||
And they could become, state-of-the-art, the best fighter ever in that game. | ||
But transferring that to the physical world, we don't know how to do that. | ||
We don't know how to teach systems to do stuff in the real world. | ||
Some of the stuff that freaks you out often is Boston Dynamics robots. | ||
Every day I go to the Instagram page and just go, what the fuck are you guys doing? | ||
Engineering our demise. | ||
Marc Raibert, the CEO, spoke at the class I taught. | ||
He calls himself a bad boy of robotics. | ||
So he's having a little fun with it. | ||
He should definitely stop doing that. | ||
Don't call yourself a bad boy of anything. | ||
That's true. | ||
How old is he? | ||
Okay, he's one of the greatest roboticists of our generation. | ||
That's great. | ||
That's wonderful. | ||
However, don't call yourself a bad boy, bro. | ||
Okay. | ||
So you're not the bad boy of MMA? | ||
Definitely not. | ||
I'm not even the bad man. | ||
Bad man? | ||
Definitely not a bad boy. | ||
Okay. | ||
That's so silly. | ||
Yeah, those robots are actually functioning in the physical world. | ||
That's what I'm talking about. | ||
And they are using something called, I think the term was coined in the 70s or 80s, good old-fashioned AI. Meaning, there is nothing going on that you would consider artificially intelligent, which is usually connected to learning. | ||
So these systems aren't learning. | ||
It's not like you dropped a puppy into the world and it kind of stumbles around and figures stuff out and learns. | ||
And gets better and better and better and better. | ||
That's the scary part. | ||
That's the imagination. | ||
That's what we imagine is we put something in this world. | ||
At first, it's like harmless. | ||
It falls all over the place. | ||
And all of a sudden, it figures something out. | ||
And like Elon Musk says, it'll move so fast you can only see it with strobe lights. | ||
There's no learning component there. | ||
This is just purely hydraulics and electric motors, there are 20 to 30 degrees of freedom, and it's doing hard-coded control algorithms to handle the task of how you move efficiently through space. | ||
So this is the task roboticists work on. | ||
A really, really hard problem in robotic manipulation is taking an arm, grabbing a water bottle, and lifting it. | ||
Super hard. | ||
Somewhat unsolved to this point. | ||
And learning to do that, we really don't know how to do that. | ||
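For contrast, here is a minimal sketch of the "good old-fashioned", hard-coded control mentioned above: a PID loop drives one joint toward a target angle, and nothing in it learns. The gains and the toy joint model are made up for illustration.

```python
# Hard-coded control, no learning: a PID loop steers one joint to a target angle.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral, self.prev_error = 0.0, 0.0

    def command(self, target, measured, dt):
        error = target - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy joint with simple integrator dynamics (a placeholder physical model)
pid, angle, dt = PID(2.0, 0.1, 0.05), 0.0, 0.01
for _ in range(1000):
    angle += pid.command(1.0, angle, dt) * dt  # converges toward 1.0 radian
print(round(angle, 3))
```

A real legged robot runs far more sophisticated controllers than this, but the flavor is the same: engineered equations, not learned behavior.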
Right, but this is... | ||
What we're talking about essentially is the convergence of these robotic systems with artificially intelligent systems. | ||
That's right. | ||
And as artificially intelligent systems evolve, and then this convergence... | ||
becomes complete, you're going to have the ability to do things like the computer that beat humans at Go. You're going to have creativity. You're going to have a complex understanding of language and expression. And you're going to have, I mean, perhaps even engineered things like emotions, like jealousy and anger. I mean, it's entirely possible that, as you were saying, we're going to have systems that could potentially be biased the way human beings | ||
are biased towards people of certain economic groups or certain geographic groups, and would use the data that they have to discriminate just like human beings discriminate. | ||
If you have all that in an artificially intelligent robot that has autonomy and that has the ability to move, this is what people are totally concerned with and terrified of: all of these different systems that are currently in semi-crude states, they can't pick up a water bottle yet, they can't really do much other than backflips, but, you know, I'm sure you've seen the more recent Boston Dynamics ones. | ||
Parkour? | ||
Yeah, I saw that one the other day. | ||
They're getting better and better and better, and it's increasing every year. | ||
Every year they have new abilities. | ||
Did you see the Black Mirror episode, Metalhead? | ||
Yeah, and I think about it quite a lot, because it's... | ||
Functionally, we know how to do most aspects of that. | ||
Right now. | ||
Right now. | ||
Pretty close, yeah. | ||
Pretty close. | ||
I mean, I don't remember exactly. | ||
There's some kind of pebble shooting situation where it hurts you by shooting you somehow. | ||
Well, it has bullets, didn't it? | ||
Bullets, yeah. | ||
It's basically a gun. | ||
It had a knife that stuck into one of its arms, remember? | ||
Spoiler alert. | ||
It's just an amazing episode of how terrifying it would be if some emotionless robot with incredible abilities is coming after you and wants to terminate you. | ||
And I think about that a lot because I love that episode because it's terrifying for some reason. | ||
But when I sit down and actually in the work we're doing, think about how we would do that. | ||
So we can do the actual movement of the robot. | ||
What we don't know how to do is to have robots that do the full thing, which is have a goal of pursuing humans and eradicating them. | ||
Spoiler alert all over the place. | ||
I think the goal of eradicating humans, so assuming their values are not aligned somehow, that's one. | ||
We don't know how to do that. | ||
And two is the entire process of just navigating all over the world is really difficult. | ||
So we know how to go up the stairs, but how to navigate the path you took from home to the studio today, how to get through that full path, is very much an unsolved problem. | ||
But is it? Because you could engineer it, you could program it into your Tesla? | ||
You could put it into your navigation system and have it stop at red lights, drive for you, take turns, and it can do that? | ||
So, first of all, that, I would argue, is still quite far away, but it's within 10, 20 years. | ||
Well, how much can it do now? | ||
It can stay inside the lane on the highway or on different roads, and it can change lanes. | ||
And what's being pushed now is they're trying to be able to enter and exit a highway. | ||
So it's some basic highway driving. | ||
It doesn't stop at traffic lights. | ||
It doesn't stop at stop signs. | ||
And it doesn't interact with the complex, irrational human beings, pedestrians, cyclists, cars. | ||
This is the onion I talked about. | ||
In 2005, the DARPA Grand Challenge... | ||
DARPA organized this challenge in the desert. | ||
It says, let's go across the desert. | ||
Let's see if we can build an autonomous vehicle that goes across the desert. | ||
In 2004, they did the first one and everybody failed. | ||
We're talking about some of the smartest people in the world really tried and failed. | ||
And so they did it again in 2005. There were a few that finished. | ||
Stanford won. | ||
There's a really badass guy from CMU, Red Whittaker. | ||
I think he's like a marine. | ||
He led the team there. | ||
And they succeeded. | ||
Four teams finished. | ||
Stanford won. | ||
That was in the desert. | ||
And there was this feeling that we saw the autonomous driving. | ||
But that's that onion. | ||
Because you then, okay, what's the next step? | ||
We've got a car that travels across the desert autonomously. | ||
What's the next? | ||
So in 2007, they did the Urban Grand Challenge. | ||
The Urban Challenge. | ||
Where you drove around the city a little bit. | ||
And again, super hard problem. | ||
People took it on. | ||
CMU won that one. | ||
Stanford second, I believe. | ||
And then there was definitely a feeling like, yeah, now we had a car drive around the city. | ||
It's definitely solved. | ||
The problem is those cars were traveling super slow, first of all. | ||
And second of all, there's no pedestrians. | ||
It wasn't a real city. | ||
It was artificial. | ||
It was just basically having to stop at different stop signs. | ||
Again, one other layer of the onion. | ||
And you say, okay, when we actually have to put this car in a city like LA, how are we going to make this work? | ||
Because if there's no cars in the street and no pedestrians in the street, driving around is still hard, but doable, and I think solvable in the next five years. | ||
When you put pedestrians in... everybody jaywalks. | ||
If you put human beings into this interaction, it becomes much, much harder. | ||
Now, it's not impossible, and I think it's very doable, and with completely new, interesting ideas, including revolutionizing infrastructure and rethinking transportation in general, it's possible to do in the next 5 to 10 years, maybe 20, but it's not easy like everybody says. | ||
But does anybody say it's easy? | ||
Yeah. | ||
There's a lot of hype behind autonomous vehicles. | ||
Elon Musk himself and other people have promised autonomous vehicles on timelines that have already passed. | ||
The claim going around was that by 2018 we'd have autonomous vehicles. | ||
Now, they're semi-autonomous now, right? | ||
I know they can brake for pedestrians. | ||
If they see pedestrians, they're supposed to brake for them and avoid them. | ||
Right? | ||
That's part of the, technically no. | ||
Wasn't that an issue with an Uber car that hit a pedestrian that was operating autonomously? | ||
That's right. | ||
Someone, a homeless person, stepped off of a median right into traffic, and the car hit her, and then they found out that one of the safety settings wasn't in place. | ||
That's right. | ||
But that was an autonomous vehicle being tested in Arizona. | ||
And unfortunately, it was a fatality. | ||
A person died. | ||
A pedestrian was killed. | ||
So what happened there, that's the thing I'm saying is really hard. | ||
That's full autonomy. | ||
That's technically when you can remove the steering wheel, and the car drives itself and takes care of everything. | ||
Everything I've seen, everything we're studying (we're studying drivers in Tesla vehicles, we're building our own vehicles), it seems that it'll be a long way off before we can solve the fully autonomous driving problem. | ||
Because of pedestrians. | ||
Two things: pedestrians and cyclists, and the edge cases of driving. | ||
All the stuff we take for granted. | ||
The same reason we take for granted how hard it is to walk, how hard it is to pick up this bottle. | ||
Our intuition about what's hard and easy is really flawed as human beings. | ||
Can I interject? | ||
What if all cars were autonomous? | ||
That's right. | ||
If we got to a point where every single car on the highway is operating off of a similar algorithm or off the same system, then things would be far easier, right? | ||
Because then you don't have to deal with random kinetic movements: people just changing lanes, people looking at their cell phones, not paying attention to what they're doing, all the things you have to be wary of right now when driving, plus pedestrians and bicyclists. | ||
Totally. | ||
And that's in the realm of things I'm talking about where you think outside the box and revolutionize our transportation system. | ||
That requires government to play along. | ||
Seems like that's going that way though, right? | ||
Do you feel like that one day we're going to have autonomous driving pretty much everywhere? | ||
Especially on the highway? | ||
It is going that way, but it's very slow moving. | ||
Government does stuff with infrastructure very slowly. | ||
One of the biggest things you can do for autonomous driving, one that would solve a lot of problems, is to paint lane markings. | ||
Regularly. | ||
And even that has been extremely difficult to do for politicians. | ||
Right, because right now there's not really the desire for it. | ||
But to explain to people what you mean by that, when the lanes are painted very clearly, the cameras and the autonomous vehicles can recognize them and stay inside those lanes much more easily. | ||
Yeah, there's two ways that cars see the world. | ||
Three. | ||
There's different sensors. | ||
The big one for autonomous vehicles is LIDAR, which is these lasers being shot all over the place in 360 degrees, and they give you this point cloud of how far stuff is away, but they don't give you the visual texture information, like what brand of water bottle this is. | ||
And cameras give you that information. | ||
So what Tesla is using (they have eight cameras, I think) is cameras; they perceive the world with cameras. | ||
And those two things require different things from the infrastructure, those two sensors. | ||
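To make the point-cloud idea concrete, here is a minimal Python sketch of how a sweep of laser returns becomes geometry. The two-dimensional simplification and the toy scan are assumptions for illustration, not any real sensor's API:

```python
import math

def polar_to_cartesian(scan):
    """Turn one LIDAR sweep into a 2D point cloud.

    `scan` is a list of (angle_deg, range_m) returns: each laser pulse
    reports how far away the nearest surface is in that direction.
    The output is pure geometry, with no color or texture, which is
    why LIDAR alone can't tell you what brand a water bottle is.
    """
    points = []
    for angle_deg, range_m in scan:
        theta = math.radians(angle_deg)
        points.append((range_m * math.cos(theta),
                       range_m * math.sin(theta)))
    return points

# Toy sweep: a surface 2 m directly ahead, another 5 m to the left.
print(polar_to_cartesian([(0.0, 2.0), (90.0, 5.0)]))
```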
Cameras see the world the same as our human eyes see the world. | ||
So they need lane markings, they need infrastructure to be really nicely visible, traffic lights to be visible. | ||
So the same kind of things us humans like to have is the cameras like to have. | ||
And lane marking is a big one. | ||
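As a rough illustration of why paint quality matters to a camera, here is a sketch of the classic lane-detection pipeline: edge detection followed by a Hough-transform vote for straight lines. Production systems use learned models rather than this, and the synthetic image is made up for the example; it assumes OpenCV and NumPy are installed:

```python
import numpy as np
import cv2  # OpenCV

# Synthesize a toy road image: dark asphalt, two bright painted lane lines.
img = np.zeros((200, 200), dtype=np.uint8)
cv2.line(img, (60, 199), (90, 0), color=255, thickness=4)    # left line
cv2.line(img, (140, 199), (110, 0), color=255, thickness=4)  # right line

# Classic pipeline: find high-contrast edges, then vote for straight lines.
# Faded or missing paint means weak edges, and the detector finds nothing.
edges = cv2.Canny(img, 50, 150)
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                        minLineLength=50, maxLineGap=10)
print(0 if lines is None else len(lines), "lane-line segments detected")
```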
There's a lot of interesting infrastructure improvements that can happen, like traffic lights. | ||
Traffic lights are super dumb right now. | ||
They sense nothing about the world, about the density of pedestrians, about approaching cars. | ||
If traffic lights could communicate with a car, which makes perfect sense. | ||
It's right there. | ||
There's no size limitations. | ||
It can have a computer inside of it. | ||
You could coordinate different things, including the same kind of pedestrian problem. | ||
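A hedged sketch of what that communication could look like. Real deployments use standards like the SPaT (signal phase and timing) messages in SAE J2735; the field names and the coasting rule below are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class SignalPhaseMessage:
    """Illustrative broadcast from a 'smart' traffic light to nearby cars."""
    intersection_id: str
    current_phase: str         # e.g. "red" or "green"
    seconds_to_change: float   # time until the phase flips
    pedestrians_detected: int  # from the light's own sensors

def should_coast(msg: SignalPhaseMessage, eta_seconds: float) -> bool:
    # If the light will still be red when the car arrives, ease off early
    # instead of braking hard at the line: smoother and more efficient.
    return msg.current_phase == "red" and msg.seconds_to_change > eta_seconds

msg = SignalPhaseMessage("main-and-5th", "red", 12.0, pedestrians_detected=3)
print(should_coast(msg, eta_seconds=8.0))  # True: the light stays red, so coast
```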
Well, we have sensors now on streets. | ||
So when you pull up to certain lights, especially at night, the light will be red. | ||
You pull up, it instantaneously turns green because it recognizes that you've stepped over or driven over a sensor. | ||
That's right. | ||
So that's a step in the right direction, but that's really 20- or 30-year-old technology. | ||
So you want to have something like the power of a smartphone inside every traffic light. | ||
It's pretty basic to do, but what's way outside of my expertise is how you get government to make these kinds of improvements. | ||
So, correct me if I'm mistaken, but you're looking at things in terms of what we can do right now, right? | ||
And a guy like Elon Musk or Sam Harris is saying, yeah, but look at where technology leads us. | ||
If you go back to 1960, the kind of computers that they used to do the Apollo mission, you got a whole room full of computers that doesn't have nearly the same power as the phone that's in your pocket right now. | ||
Now, if you go into the future and exponentially calculate what's going to take place in terms of our ability to create autonomous vehicles, our ability to create artificial intelligence, and all of these things going from what we have right now to what could be in 20 years, | ||
we very well might look at some sort of an artificial being that can communicate with you, some sort of an ex machina type creature. | ||
I mean, that's not outside the realm of possibility at all. | ||
You have to be careful with the at all part. | ||
unidentified
|
At all. | |
Our ability to predict the future is really difficult, but I agree with you. | ||
It's not outside the realm of possibility. | ||
There's a few examples that are brought along, just because I enjoy these predictions, of how bad we are at predicting stuff. | ||
From the very engineers, the very guys and gals like me sitting before you made some of the worst predictions in history in terms of both pessimistic and optimistic. | ||
The Wright brothers... one of the Wright brothers, before they flew in 1903, predicted two years earlier that it would be 50 years: "I confess that in 1901 I said to my brother Orville that man would not fly for 50 years. | ||
Two years later we ourselves were making flights. | ||
This demonstration of my inability as a prophet gave me such a shock." | ||
So that's a pessimistic estimation versus an optimistic estimation. | ||
Exactly. | ||
And the same with Albert Einstein and Fermi; they made these kinds of pessimistic predictions. | ||
Fermi, three years before the first critical chain reaction (he led the nuclear development of the bomb), | ||
said that he had 90% confidence that it was impossible. | ||
Three years before. | ||
Okay, so that's on the pessimistic side. | ||
On the optimistic side, the history of AI is laden with optimistic predictions. | ||
In 1965, one of the seminal people in AI, Herbert Simon, said, machines will be capable within 20 years of doing any work a man can do. | ||
He also said, within 10 years, a digital computer will be the world's chess champion. | ||
That was in '58. And we didn't do that until 1997, so 40 years later. | ||
Yeah, but that's one person, right? | ||
I mean, it's a guy taking a stab in the dark based on what data? | ||
What's he basing this off of? | ||
Our imagination. | ||
unidentified
|
Right. | |
We have more data points now, don't you think? | ||
unidentified
|
No. | |
Not about the future. | ||
That's the thing. | ||
Not about the future, but about what's possible right now. | ||
Right. | ||
And if you look at... | ||
The past is a really bad predictor of the future. | ||
If you look at the past... | ||
What we've done, the immense advancement of technology has given us, in many ways, optimism about what's possible. | ||
But exactly what is possible, we're not good at. | ||
So, I am much more confident that the world will look very fascinatingly different in the future. | ||
Whether AI will be part of that world is unclear. | ||
It could be we will all live in a virtual reality world. | ||
Or, for example, one of the things I really think about is, to me, a really dumb AI on one billion smartphones is potentially more impactful than a super intelligent AI on one smartphone. | ||
The fact that everybody now has smartphones, this kind of access to information, the way we communicate, the globalization of everything: the potential impact of even subtle improvements in AI could completely change the fabric of our society, in a way where these discussions about an ex machina type lady walking around will be silly, because we'll all be either living on... | ||
Or... there are so many exciting possibilities. | ||
right? | ||
And what I believe in is we have to think about them. | ||
We have to talk about them. Technology is always the source of danger, of risk. All of the biggest things that threaten our civilization, at the small and large scale, are connected to misuse of technology we develop. | ||
And at the same time, it's that very technology that will empower us and save us. | ||
So there's Max Tegmark, brilliant guy, Life 3.0. | ||
I recommend people read his book on artificial general intelligence. | ||
He talks about the race. | ||
There's a race that can't be stopped. | ||
One is the development of technology. | ||
And the other is the development of our wisdom of how to stop or how to control the technology. | ||
And it's this kind of race. | ||
And our wisdom is always like one step behind. | ||
And that's why we need to invest in it and keep always thinking about new ideas. | ||
So right now we're talking about AI. We don't know what it's going to look like in five years. | ||
We have to keep thinking about it. | ||
We have to, through simulation, explore different ideas; through conferences, have debates; come up with different approaches for how to solve particular problems, like I said with bias, or how to deal with deepfakes, where you can make Donald Trump or former President Obama say anything, or Facebook's hyper-targeted advertisements. | ||
How we can deal with those situations and constantly have this race of wisdom versus the development of technology. | ||
But not to sit and think, well, look at the development of technology. | ||
Imagine what it could do in 50 years and we're all screwed. | ||
It's important to sort of be nervous about it in that way, but it's not conducive to figuring out what we do about it. | ||
And the people that know what to do about it are the people trying to build this technology, building this future one step at a time. | ||
What do you mean by know what to do about it? | ||
Because, like, let's put it in terms of Elon Musk. | ||
Right. | ||
Like, Elon Musk is terrified of artificial intelligence because he thinks by the time it becomes sentient, it'll be too late. | ||
It'll be smarter than us and we'll have essentially created our successors. | ||
Yes. | ||
And let me quote Joe Rogan and say that's just one guy. | ||
Yeah. | ||
Well, Sam Harris thinks the same thing. | ||
There's quite a few people who think that. | ||
Sam Harris I think is one of the smartest people I know and Elon Musk, intelligence aside, is one of the most impactful people I know. | ||
He's actually building these cars and in the narrow AI sense, he's built these autopilot systems that we've been studying. | ||
The way that system works is incredible. | ||
It was very surprising to me on many levels. | ||
It's an incredible demonstration of what AI can do in a positive way in the world. | ||
So I don't – but people can disagree. | ||
I'm not sure of the functional value of his fear about the possibility of this future. | ||
Well, if he's correct, there's functional value in hitting the brakes before this takes place. | ||
Just to be a person who's standing on top of the rocks with a light to warn the boats, hey, there's a rock here. | ||
Pay attention to where we're going because there's perils ahead. | ||
I think that's what he's saying. | ||
And I don't think there's anything wrong with saying that. | ||
And I think there's plenty of room for people saying what he's saying and people saying what you're saying. | ||
I think what would hurt us is if we tried to silence either voice. | ||
I think what we need in terms of our understanding of this future is many, many, many, many, many of these conversations where you're dealing with the... | ||
The current state of technology versus a bunch of creative interpretations of where this could go and have discussions about where it should go or what could be the possible pitfalls of any current or future actions. | ||
I don't think there's anything wrong with this. | ||
So when you say, like, what's the benefit of thinking in a negative way? | ||
Well, it's to prevent our demise. | ||
So, totally. | ||
I agree 100%. | ||
Negativity or worry about the existential threat is really important to have as part of the conversation. | ||
But there's this level. | ||
There's this line. | ||
It's hard to put into words. | ||
There's a line that you cross when that worry becomes... | ||
Hyperbole. | ||
Yeah, and then there's something about human psyche where it becomes paralyzing for some reason. | ||
Right. | ||
Now, when I have beers with my friends, the non-AI folks, we actually go, we cross that line all day and have fun with it. | ||
Maybe I should get you drunk right now. | ||
Maybe. | ||
I regret every moment of it. | ||
I talked to Steve Pinker. | ||
His book Enlightenment Now kind of highlights that. He totally doesn't find that appealing, because that's crossing all realms of rationality and reason. | ||
When you say that appealing, what do you mean? | ||
Crossing the line into what will happen in 50 years. | ||
What could happen. | ||
What could happen. | ||
He doesn't find that appealing. | ||
He doesn't find it appealing because he's studied, and I'm not sure I agree with him to the degree that he takes it. | ||
He finds that there's no evidence. | ||
He wants all our discussions to be grounded in evidence and data. | ||
He highlights the fact that there's something about human psyche that desires this negativity. | ||
There's something undeniable where we want to create and engineer the gods that overpower us and destroy us. | ||
We want to? | ||
Or we worry about it? | ||
I don't know if we want to. | ||
Let me rephrase that. | ||
We want to worry about it. | ||
There's something about the psyche. | ||
Because you can't take the genie and put it back in the bottle. | ||
unidentified
|
That's right. | |
When you say there's no reason to think this way. | ||
But if you do have cars that are semi-autonomous now, and if you do have computers that can beat human beings who are world Go champions, and if you do have computers that can beat people at chess, and you do have people that are consistently working on artificial intelligence, and you do have Boston Dynamics getting these robots to do all sorts of spectacular physical stunts... | ||
And then you think about the possible future convergence of all these technologies, and the possibility of this exponential increase in technology that allows them to be sentient, within a decade, two decades, three decades. | ||
What more evidence do you need? | ||
You're seeing all the building blocks of a potential successor being laid out in front of you, and you're seeing what we do with every single aspect of technology. | ||
We constantly and consistently improve and innovate with everything, whether it's computers or cars or anything. | ||
Everything today is better than everything that was 20 years ago. | ||
So if you looked at artificial intelligence, which does exist to a certain extent, and you look at what it could potentially be 30, 40, 50 years from now, whatever it is, why wouldn't you look at all these data points and say, hey, this could go bad? | ||
I mean, it could go great, but it could also go bad. | ||
I do not want to be mistaken as the person who's not the champion of the impossible. | ||
I agree with you completely. | ||
I don't think it's impossible. | ||
I don't think it's impossible at all. | ||
I think it's inevitable. | ||
I don't... | ||
I think it is inevitable, yes. | ||
It's the Sam Harris argument. | ||
If superintelligence is nothing more than information processing... | ||
Same as the argument of the simulation, that we're living in a simulation. | ||
It's very difficult to argue against the possibility that we're living in a simulation. | ||
The question is when and what the world would look like. | ||
Right. | ||
So it's, like I said, a race. | ||
And it's difficult. | ||
You have to balance those two minds. | ||
I agree with you totally. | ||
And I disagree with my fellow robotics folks who don't want to think about it at all. | ||
Of course they don't. | ||
They want to buy new houses. | ||
They've got a lot of money invested in this adventure. | ||
They want to keep the party rolling. | ||
They don't want to pull the brakes. | ||
Everybody, pull the cords out of the walls. | ||
We've got to stop. | ||
No one's going to do that. | ||
No one's going to come along and say, hey, we've run all this data through a computer and we've found that if we just keep going the way we're going, 30 years from now we will have a successor that will decide that human beings are outdated, inefficient, and dangerous to the actual world that we live in, and it's going to start wiping them out. | ||
It doesn't exist right now. | ||
But if that did happen, if someone did come to the UN and had this multi-stage presentation with data that showed that if we continue on the path, we have seven years before artificial intelligence decides to eliminate human beings based on these data points. | ||
What do they do? | ||
What do the Boston Dynamics people do? | ||
Well, I'm building a house in Cambridge. | ||
What are you talking about, man? | ||
I'm not going anywhere. | ||
unidentified
|
Come on. | |
I just bought a new Tesla. | ||
I need to finance this thing. | ||
Hey, I got credit card bills. | ||
I got student loans I'm still paying off. | ||
How do you stop people from doing what they do for a living? | ||
How do you say that, hey, I know that you would like to look at the future with rose-colored glasses on, but there's a real potential pitfall that could be the extermination of the human species? | ||
Right. | ||
And obviously I'm going way far with this. | ||
Yeah, I like it. | ||
I think every one of us trying to build these systems is similar, in a way, to how you were talking about the touch of death, | ||
in that my dream, and the dream of many roboticists, is to create intelligent systems that will improve our lives. | ||
And working really hard at it. | ||
Not for a house in Cambridge. | ||
Not for a billion-dollar startup-sale paycheck. | ||
We love this stuff. | ||
Some of you. | ||
Obviously, the motivations are different for every single human being that's involved in every endeavor. | ||
And we're trying really hard to build these systems and it's really hard. | ||
So whenever the question is, well, looked at historically, this is going to take off. | ||
It can potentially take off at any moment. | ||
It's very difficult as an engineer to really be cognizant of how it takes off, because you're trying to make it take off in a positive direction and you're failing. | ||
Everybody is failing. | ||
It's been really hard. | ||
And so you have to acknowledge that overnight, some Elon Musk type character may come along. You know, with The Boring Company or with SpaceX, people didn't think anybody but NASA could do what Elon Musk is doing, and he's doing it. | ||
It's hard to think about that too much. | ||
You have to do that. | ||
But the reality is we're trying to create these super intelligent beings. | ||
Sure, but isn't the reality also that we have done things in the past because we were trying to do them, and then realized they have horrific consequences for the human race? Like Oppenheimer and the Manhattan Project, you know, when he said, "Now I am become Death, | ||
the destroyer of worlds," quoting the Bhagavad Gita, when the first nuclear bomb was detonated and he realized what he'd done. | ||
Just because something's possible to do doesn't necessarily mean it's a good idea for human beings to do it. | ||
Now, we haven't destroyed the world with Oppenheimer's discovery and through the work of the Manhattan Project. | ||
We've managed to somehow or another keep the lid on this shit for the last 60 years. | ||
unidentified
|
Which is incredible. | |
It's crazy, right? | ||
I mean, for the last, what, 70 years? | ||
How long has it been? | ||
70 sounds right. | ||
10,000, 20,000 nukes all over the world right now. | ||
It's crazy. | ||
I mean, we literally could kill everything on the planet. | ||
And somehow, we don't. | ||
Somehow. | ||
Somehow, in some amazing way, we have not. | ||
But that doesn't mean we... | ||
I mean, that's a very short amount of time in relation to the actual lifespan of the Earth itself and certainly in terms of the time human history has been around. | ||
And nuclear weapons, global warming is another one. | ||
Sure, but that's a side effect of our actions, right? | ||
We're talking about a direct effect of human ingenuity and innovation, the nuclear bomb. | ||
It's a direct effect. | ||
We tried to make it. | ||
We made it. | ||
There it goes. | ||
Global warming is an accidental consequence of human civilization. | ||
So you can't – I don't think it's possible to not build a nuclear bomb. | ||
You don't think it's possible to not build it. | ||
Because people are tribal, they speak different languages, they have different desires and needs, and they were at war. | ||
So if all these engineers were working towards it, it was not possible to not build it. | ||
Yep, and like I said, there's something about us chimps in a large collective where we are born and push forward towards progress of technology. | ||
You cannot stop the progress of technology. | ||
So the goal is how to develop, how to guide that development into a positive direction. | ||
But surely, if we do understand that this has taken place, and we did drop these enormous bombs on Hiroshima and Nagasaki and killed Untold amounts of innocent people with these detonations that it's not necessarily always a good thing to pursue technology. | ||
Nobody is so... | ||
You see what I'm saying? | ||
Yes, 100%. | ||
I agree with you totally. | ||
So I'm more playing devil's advocate than anything. | ||
But what I'm saying is you guys are looking at these things like we're just trying to make these things happen. | ||
And what I think people like Elon Musk and Sam Harris and a bunch of others that are gravely concerned about the potential for AI are saying is, I understand what you're doing, but you've got to understand the other side of it. | ||
We've got to understand that there are people out there that are terrified that if you do extrapolate, if you do take this relentless thirst for innovation and keep going with it, if you look at what we can do, what human beings can do so far in our crude manner of 2018, all the amazing things they've been able to accomplish. | ||
It's entirely possible that we might be creating our successors. | ||
This is not outside the realm of possibility. | ||
And all of our biological limitations, we might figure out a better way. | ||
And this better way might be some sort of an artificial creature. | ||
Yep. | ||
AI began with our dream to forge the gods. | ||
I think that it's impossible to stop. | ||
Well, it's not impossible to stop if you go Ted Kaczynski and kill all the people. | ||
I mean, that's what Ted Kaczynski anticipated. | ||
You know, the Unabomber, do you know the whole story behind him? | ||
No. | ||
What was he trying to stop? | ||
He's a fascinating cat. | ||
Here's what's fascinating. | ||
There's a bunch of fascinating things about him. | ||
One of the more fascinating things about him, he was involved in the Harvard LSD studies. | ||
unidentified
|
Right. | |
So they were nuking that dude's brain with acid. | ||
And then he goes to Berkeley, becomes a professor, takes all his money from teaching and just makes a cabin in the woods and decides to kill people that are involved in the creation of technology because he thinks technology is eventually going to kill off all the people. | ||
So he becomes crazy and schizophrenic, and who knows what the fuck is wrong with him, and whether or not this would have taken place inevitably or whether this was a direct result of those experiments. We don't even know how much they gave him, or what the experiment entailed, or how many other people got their brains torched during these experiments. | ||
But we do know for a fact that Ted Kaczynski was a part of the Harvard LSD studies. | ||
And we do know that he did move to the woods, write his manifesto, and start blowing up people that were involved in technology. | ||
And the basic thesis of his manifesto that perhaps LSD opened his eyes to is that technology is going to kill all humans. | ||
Yeah. | ||
It was going to be the end of the human race, I think, I believe. | ||
The human race, so the solution... | ||
Is that what he said? | ||
The Industrial Revolution and its consequences have been a disaster for the human race. | ||
Yeah, he extrapolated. | ||
He was looking at where we're going and these people that were responsible for innovation, and he was saying they're doing this with no regard for the consequences on the human race. | ||
And he thought the way to stop that was to kill people. | ||
Obviously, he's fucking demented. | ||
But this is, I mean, he literally was saying what we're saying right now. | ||
You keep going, we're fucked. | ||
So the Industrial Revolution, we'll have to think about that. | ||
It's a really important message coming from the wrong guy, but... | ||
It's a great way to put it. | ||
Where is all this taking us? | ||
Yeah, where is it taking us? | ||
So I guess my underlying assumption is the current capitalist structure of society, that we always want a new iPhone. | ||
You just had one of the best reviewers on yesterday that always talks about... | ||
Marcus. | ||
Marcus, yeah. | ||
We always, myself too, Pixel 3. I have a Pixel 2. I'm thinking, maybe I need a Pixel 3. Maybe you do. | ||
I don't know. | ||
Better camera. | ||
Whatever that is, that fire that wants more, better, better. | ||
I just don't think it's possible to stop. | ||
And the best thing we can do is to explore ways to guide it towards safety where it helps us. | ||
When you say it's not possible to stop, you mean collectively as an organism, like the human race, that it's a tendency that's just built in? | ||
It's certainly possible to stop as an individual, because I know people, like my friend Ari, who's given up on smartphones, he went to a flip phone, and he doesn't check social media anymore, and he found it to be toxic, he didn't like it, he thought he was too addicted to it, and he didn't like where it was leading him. | ||
So on an individual level, it's possible. | ||
Individual level, but then, and just like with Ted Kaczynski, on the individual level, it's possible to do certain things that try to stop it in more dramatic ways. | ||
But I just think the force of our, this organism, this living, breathing organism that is our civilization, will progress forward. | ||
We're just curious apes. | ||
It's this desire to explore the universe. | ||
Why? | ||
Why do we want to do these things? | ||
Why do we look up and we want to travel? | ||
I don't think we're trying to optimize for survival. | ||
In fact, I don't think most of us would want to be immortal. | ||
I think it's like Neil deGrasse Tyson talks about. | ||
The fact that we're mortal, the fact that one day we'll die is one of the things that gives life meaning. | ||
And sort of trying to worry and trying to sort of say, wait a minute, where is this going? | ||
As opposed to riding the wave and riding the wave of forward progress. | ||
I mean, it's one of the things... | ||
He gets quite a bit of hate for it, ironically, but Steve Pinker really describes, with data, how our world is getting better and better. | ||
Well, he just gets hate from people that don't want to admit that there's a trend towards things getting better, because they feel like then people will ignore all the bad things that are happening right now and all the injustices. Which I think is a very short-sighted thing, but I think it's because of their own biases and the perspective that they're trying to establish and push. | ||
Instead of looking at things objectively, looking at the data and saying: I see where you're going, and it doesn't discount the fact that there's injustice in the world, and crime and violence, and all sorts of terrible things happening to good people on a daily basis. | ||
But what he's saying is just look at the actual trend of civilization and the human species itself and there's an undeniable trend towards peace. | ||
Slowly but surely working towards peace. | ||
Way safer today. | ||
Way safer today than it was a thousand years ago. | ||
Just – it is. | ||
It just is. | ||
Yeah, and there are these interesting arguments; his book kind of blew my mind with this funny joke. | ||
He says that some people consider giving the atom bomb the Nobel Peace Prize. | ||
Because he believes (I'm not an expert in this at all), or some people believe, that nuclear weapons are actually responsible for a lot of the decrease in violence. | ||
Because all of the major states that can do damage, Russia and the rest, have a strong disincentive from engaging in warfare. | ||
And so these are the kinds of things you don't, I guess, anticipate. | ||
So I think it's very difficult to stop that forward progress, but we have to really worry and think about, okay, how do we avoid the list of things that we worry about? | ||
So one of the things that people really worry about is the control problem. | ||
It's basically AI becoming not necessarily super intelligent, but super powerful. | ||
We put too much of our lives into it. | ||
That's where Elon Musk and others that want to provide regulation of some sort, saying, wait a minute, you have to put some bars on what this thing can do from a government perspective, from a company perspective. | ||
But how could you stop rogue states from doing that? | ||
Why would China listen to us? | ||
Why would Russia listen to us? | ||
Why would other countries that are capable of doing this and maybe don't have the same sort of power that the United States has and they would like to establish that kind of power, why wouldn't they just take the cap off? | ||
In a philosophical high-level sense, there's no reason. | ||
But if you engineer it in... | ||
So I'm a big... | ||
We do this thing with autonomous vehicles called arguing machines. | ||
We have multiple AI systems argue against each other. | ||
So it's possible to have some AI systems supervising other AI systems. | ||
In our nation, there's a Congress, with blue and red states represented, arguing, with discourse and debate going on; you could have AI systems like that too. | ||
It doesn't necessarily need to be one super powerful thing. | ||
It could be AI supervising each other. | ||
So there's interesting ideas there to play with. | ||
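A minimal sketch of that disagreement idea, assuming two independently built systems that each propose a steering command; the threshold and the interface are invented for illustration, not the actual research system:

```python
def arbitrate(primary: float, secondary: float, threshold: float = 0.15):
    """Two independent systems 'argue' by proposing steering angles (radians).

    While they agree, the averaged command is trusted; when they diverge
    past the threshold, control escalates to a supervisor, which could be
    a human driver or yet another overseeing system.
    """
    if abs(primary - secondary) > threshold:
        return None, "escalate: systems disagree, supervisor needed"
    return (primary + secondary) / 2.0, "agree"

print(arbitrate(0.02, 0.03))  # small gap: averaged command, 'agree'
print(arbitrate(0.02, 0.40))  # large gap: None, 'escalate: ...'
```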
Because ultimately, what are these artificial intelligence systems doing? | ||
We humans place power into their hands first. | ||
In order for them to run away with it, we need to put power into their hands. | ||
So we have to figure out how we put that power in initially so it doesn't run away and how supervision can happen. | ||
Right, but this is us, right? | ||
You're talking about rational people. | ||
What about other people? | ||
Why would they engineer limitations into their artificial intelligence, and what incentive would they have to somehow or another limit their artificial intelligence to keep it from having as much power as ours? | ||
There's really not a lot of incentive on their side, especially if there's some sort of competitive advantage for their artificial intelligence to be more ruthless, more sentient, more autonomous. | ||
I mean, it seems like, once again, once the genie's out of the bottle, it's going to be very hard. | ||
I have a theory, and this is a very bizarre theory, but I've been running with this for quite a few years now. | ||
I think human beings are some sort of a caterpillar. | ||
And I think we're creating a cocoon, and through that cocoon, we're going to give birth to a butterfly, and then we're going to become something. | ||
And I think we're going to have some sort of a symbiotic connection to these electronic things, where they're going to replace our parts, our failing parts, with far superior parts, until we're not really a person anymore. | ||
Like, what was that Scarlett Johansson movie? | ||
The Ghost in the Shell? | ||
I tried to watch part of it. | ||
It's pretty stupid. | ||
But she's hot as fuck, so it kept my attention for a little bit. | ||
But in that, they took her brain and put it in this artificial body that had superpowers. | ||
And they basically replaced everything about her that was in her consciousness with these artificial parts. | ||
All of her frame, everything was just some new thing that was far superior. | ||
And she had these abilities that no human being will ever have. | ||
I really wonder why we have this insatiable... | ||
Why can't... | ||
If we're so logical... | ||
We're so logical and so thoughtful in some ways. | ||
Why can't we be that way when it comes to materialism? | ||
Well, I think one of the reasons why is because materialism is the main engine that pushes innovation. | ||
If it wasn't for people's desire to get the newest, latest, and greatest thing, what would fund these new TVs, cell phones, computers? | ||
Why do you really need a new laptop every year? | ||
Is it because of engineered obsolescence where the laptop dies off and you have to get a new one because they fucked you and they built a shitty machine that's designed to die so you buy a new one? | ||
You really like iPhones, don't you? | ||
Well, it's not even iPhones. | ||
It's a laptop. | ||
Is it because you just see the number? | ||
2.6 gigahertz is better than 2.4. | ||
Oh, it's the new one. | ||
It has a 12 megapixel webcam instead of an 8. And for whatever reason, we have this desire to get those new things. | ||
I think that's what fuels innovation. | ||
And my cynical view of this thing that's happening is that we have this bizarre desire to fuel our demise, and that we're doing so by fueling technology, by motivating these companies to continually innovate. | ||
If everybody just said, you know what, man, I'm really into log cabins, and I want an axe so I can cut my own firewood, and I realize that TV rots my brain, I just want to read books. | ||
So fuck off. | ||
And everybody started doing that. | ||
And everybody started living like, when it gets dark out, I'll use candles. | ||
And you know what? | ||
I'm going to get my water from a well. | ||
And you know what? | ||
I'm going to do... | ||
And I like living better that way. | ||
If people started doing that, there would be no need for companies to continually make new computers, to make new phones, to make new smart watches, or whatever the fuck they're making. | ||
To make cars that can drive themselves. | ||
These things that we're really, really attached to, if you looked at the human organism, you somehow or another could objectively remove yourself from society and culture and all the things that make us a person, and you look at what we do, what does this thing do? | ||
We found this planet, there's these little pink monkeys and brown monkeys and yellow monkeys, and what are they all into? | ||
Well, they all seem to be into making stuff. | ||
And what kind of stuff are they making? | ||
Well, they keep making better and better stuff that's more and more capable. | ||
Well, where's it going? | ||
Well, it's going to replace them. | ||
They're going to make a thing that's better than them. | ||
They're engineering these things slowly but surely to do all the things they do but do them better. | ||
Yeah, and it's a fascinating theory. | ||
I mean, it's not a theory. | ||
It's an instructive way to think about intelligence and life, period. | ||
So if you step back, look across human history, and look at Earth as an organism. | ||
What is this thing doing? | ||
The thing is, I think in terms of scale and in terms of time, you can look that way at so many things. | ||
Like, aren't there billions or trillions of organisms on our skin right now, on both of us, that have little civilizations, right? | ||
They have a different mechanism by which they operate and interact. | ||
But for us to say that we're intelligent and those organisms are not is a very narrow view. | ||
So they are operating under some force of nature that we can't fully explain, something Darwin worked on understanding small elements of with his evolutionary theory. | ||
But there's other more interesting forces at play that we don't understand. | ||
And there's some kind of force. | ||
It could be that a fundamental force of physics that Einstein never got a chance to discover is our desire for an iPhone update. | ||
Some fundamental force of nature, somehow gravity and the strong force and these things described by physics add up to this drive for new things, for creation. | ||
And the fact that we die, the fact that we're mortal, the desires that are built into us, whether sexual or intellectual or whatever drives us apes... somehow that all combines into this progress, and towards what... | ||
It is a compelling way to think. If an alien species did visit Earth, I think they would probably see the smartphone situation. | ||
They'd see how many little lights are on and how us apes are looking at them. | ||
Some people have said that they would think the overlords are the phones, not the people. | ||
Mm-hmm. | ||
So to think that that's now moving into a direction where the future will be something that is beyond human or symbiotic with human ways we can't understand is really interesting. | ||
Not just that, but something that we're creating ourselves. | ||
Creating ourselves. | ||
And it's a main focal point of our existence. | ||
That's our purpose. | ||
Yeah. | ||
I mean, if you think about a main focal point, if you think about the average person, what they do, there's a great percentage of our population that has jobs where they work, and one of the ways that they placate themselves doing these things that they don't really enjoy doing is earning money for objects. | ||
They want a new car. | ||
They want a new house. | ||
They want a bigger TV. They want a this or that. | ||
And the way they motivate themselves to keep showing up at this shitty job is to think, if I just put in three more months, I can get that Mercedes. | ||
If I just do this or that, I can finance this new Pixel 3. Yeah, and it's interesting, because the politicians... what's the American dream? | ||
It's this thing you hear: I want my children to be better off than me. | ||
This kind of desire, you can almost see it taken farther and farther; there will be a presidential candidate in 50, 100 years | ||
who'll say: I want my children to be robots. | ||
You know what I mean? | ||
Like sort of this idea that that's the natural evolution and that is the highest calling of our species. | ||
That scares me because I value my own life. | ||
But does it scare you if it comes out perfect? | ||
Like if each robot is like a god and each robot is beautiful and loving and they recognize all the great parts of this existence and they avoid all the jealousy and the nonsense and all the stupid aspects of being a person. | ||
We realize that a lot of these things are just sort of biological engineered tricks that are designed to keep us surviving from generation after generation but now here in this fantastic new age we don't need them anymore. | ||
Yeah, it's... | ||
Well, first, one of the most transformative moments of my life was when I met Spot Mini in person, which is one of the legged robots in Boston Dynamics. | ||
For the first time when I met them, met that little fella, there was... | ||
I know exactly how it works. | ||
I know exactly how every aspect of it works. | ||
It's just a dumb robot. | ||
But when I met him, and he got up, and he looked at me... | ||
There it is right there. | ||
Have you seen it dance now? | ||
Yeah, the dance. | ||
The new thing? | ||
Yep. | ||
The dance is crazy. | ||
But see, it's not crazy on the technical side. | ||
unidentified
|
Right. | |
It's engineered. | ||
It's obvious. | ||
It's programmed. | ||
But it's crazy to watch. | ||
Like, wow. | ||
The reason the moment was transformative is I know exactly how it works. | ||
And yet, by watching it, something about the feeling of it... | ||
You're like, this thing is alive. | ||
And there was this terrifying moment, not terrifying exactly, but terrifying and appealing, where you realize: this is the future. | ||
Right. | ||
Like, this thing represents some future that is totally, that we cannot understand. | ||
Just like a future in the 18th century, a future with planes and smartphones was something we couldn't understand. | ||
That this thing, that little dog could have had a human consciousness in it. | ||
That was the feeling I had. | ||
And I know exactly how it works. | ||
There's nothing close to the intelligence, but it just gives you this picture of what the possibilities are of these living creatures. | ||
And I think that's what people feel when they see Boston Dynamics. | ||
Look how awesome this thing running around is. | ||
They don't care about the technicalities and how far away we are. | ||
They see it. | ||
Look, this thing is pretty human. | ||
And the possibilities of human-like things that supersede humans and can evolve and learn quickly, exponentially fast. | ||
It's this terrifying frontier that really makes us think, as it did for me. | ||
Maybe terrifying is a weird word. | ||
Because when I look at it, and I'm not irrational, and I look at it, there's videos that show the progression of Boston Dynamics robots. | ||
From several years ago to today, what they're capable of. | ||
And it is a fascinating thing, because you're watching all the hard work of these engineers and all these people that have designed these systems and have figured out all these problems that these things encounter, and they've come up with solutions, and they continue to innovate. | ||
And they're constantly doing it, and you're seeing this problem, and you're like, wow, what are we going to see in a year? | ||
What am I going to see in three years? | ||
What am I going to see in five years? | ||
Absolutely fascinating, because if you extrapolate and you just keep going, boy, you go 15, 20, 30, 50, 100 years from now, you have ex machina. | ||
Yeah, you have ex machina, at least in our imagination. | ||
In our imagination. | ||
And the problem is there will be so many other things that are super exciting and interesting. | ||
Sure, but that doesn't mean it's not crazy. | ||
I mean there's many other things you could focus on also that are also going to be bizarre and crazy. | ||
Sure. | ||
But what about it? | ||
Just it. | ||
It's going somewhere. | ||
That fucker is getting better. | ||
The parkour one is bananas. | ||
You see it hopping from box to box and left to right and leaping up in the air, and you're like, whoa. | ||
That thing doesn't have any wires on it. | ||
It's not connected to anything. | ||
It's just jumping from box to box. | ||
If that thing had a machine gun and it was running across a hill at you, you'd be like, oh, fuck, how long does its battery last? | ||
How many bullets does it have? | ||
Let me just say that I would pick Tim Kennedy over that dog for the next 50 years. | ||
50? | ||
Yeah. | ||
Man, I'm a big Tim Kennedy fan. | ||
I'm talking about that. | ||
But he'll probably have some robotic additions to his body to improve the... | ||
Well, then is he Tim Kennedy anymore? | ||
If the brain is Tim Kennedy, then he's still Tim Kennedy. | ||
That's the way we think about it. | ||
But there is huge concern (the UN is meeting about this) around autonomous weapons. | ||
Allowing AI to make decisions about who lives and who dies is really concerning in the short term. | ||
It's not about a robotic dog with a shotgun running around. | ||
It's more about our military wanting to make destruction as efficient as possible while minimizing risk to human life. | ||
Drones? | ||
Drones. | ||
There's something really uncomfortable to me about drones; compare with what Dan Carlin covers in Hardcore History about Genghis Khan. | ||
There's something impersonal about what drones are doing; it moves you away from the actual destruction that you're achieving, and I worry that our ability to encode ethics into these systems will go wrong in ways we don't expect. | ||
And so, I mean, folks at the UN talk about, well, you have these automated drones that drop bombs over a particular area. | ||
So the bigger and bigger the area is over which you allow an artificial intelligence system to make a decision to drop the bombs, the weirder and weirder it gets. | ||
There's some line, now presumably if there's like three tanks that you would like to destroy with a drone, it's okay for an AI system to say, I would like to destroy those three, like I'll handle everything, just give me the three tanks. | ||
But this makes me uncomfortable as well because I think I'm opposed to most wars. | ||
But it's just military is military and they try to get the job done. | ||
Now what if we now expand that to 10, 20, 100 tanks? | ||
Where you now let the AI system drop bombs all over very large areas. | ||
How can that go wrong? | ||
And that's terrifying. | ||
And there's practical engineering solutions to that. | ||
Oversight. | ||
And that's something that engineers sit down and work through. | ||
There's an engineering ethic where you encode safeguards, and you have meetings about how do we make this safe. | ||
That's what you worry about. | ||
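As a toy example of what encoding such a limit might look like, here is a sketch of a guardrail that refuses to delegate an engagement decision once the area of authority grows too large. Everything here (the names, the one-square-kilometer cap, the human-approval fallback) is invented for illustration:

```python
MAX_AUTONOMOUS_ZONE_KM2 = 1.0  # invented cap on delegated authority

def authorize(zone_km2: float, confirmed_targets: int) -> str:
    """Guardrail: a small, well-specified task may proceed; anything
    broader is kicked back to a human decision-maker."""
    if confirmed_targets == 0:
        return "deny: nothing confirmed, no autonomous engagement"
    if zone_km2 > MAX_AUTONOMOUS_ZONE_KM2:
        return "escalate: zone too large, human approval required"
    return "proceed: bounded task within delegated authority"

print(authorize(zone_km2=0.2, confirmed_targets=3))    # proceed
print(authorize(zone_km2=50.0, confirmed_targets=20))  # escalate
```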
The thing that keeps me up at night is the 40,000 people that die every year in auto crashes. | ||
I worry about not... | ||
You have to understand, I worry about the future of AGI taking over, but that's not as large... | ||
AGI? AGI, Artificial General Intelligence. | ||
That's kind of the term that people have been using for this. | ||
But maybe because I'm in it, I worry more about the 40,000 people that die every year in the United States, and the 1.2 million that die worldwide, from auto crashes. | ||
There's something... | ||
That is more real to me about the death that's happening now that could be helped. | ||
And that's the fight. | ||
But, of course, if this threat becomes real, then... | ||
Then that's a much, you know, that's a serious threat to humankind. | ||
And that's something that should be thought about. | ||
I also worry about the AI winter. | ||
Like I mentioned, there have been two winters, in the 70s and from the 80s into the 90s, | ||
when funding completely dried up, but more importantly, people just stopped getting into artificial intelligence and became cynical about its possibilities. | ||
Because there was a hype cycle where everyone was really excited about the possibilities of AI. And then they realized, you know, five, ten years into the development, that we didn't actually achieve anything. | ||
It was just too far off. | ||
Too far off. | ||
Same as it was for virtual reality. | ||
For the longest time, virtual reality was something that was discussed even in the 80s and the 90s, but it just died off. | ||
Nobody even thought about it. | ||
Now it's come back to the forefront when there's real virtual reality that you can use, like the HTC Vive or things along those lines, where you can put these headsets on, and you really do see these alternative worlds that people have created in these video games. | ||
You realize there's a practical application for this stuff because the technology is caught up with the concept. | ||
Yeah, and I actually don't know where people stand on VR. We do quite a bit of stuff with VR for research purposes for simulating robotic systems, but I don't know where the hype is. | ||
I don't know if people have calmed down a little bit on VR. So there was a hype in the 80s and 90s, I think. | ||
I think it's ramped up quite a bit. | ||
What's the other one... the Oculus Rift? And what else? | ||
Those are the main ones, and there's other headsets that you can work and use with. | ||
Yeah, and there's some you can use just with a Samsung phone, correct? | ||
Yeah, and the next generation, coming in the next year or two, are going to be all standalone systems. | ||
So there's going to be an Oculus Rift coming out you don't need a computer for at all. | ||
So the ultimate end-game fear, the event horizon of that, is the Matrix. | ||
Right? | ||
That's what people are terrified of, of some sort of a virtual reality world where you don't exist in the physical sense anymore. | ||
They just plug something into your brain stem, just like they do in The Matrix, and you're just locked into this artificial world. | ||
Is that terrifying to you? | ||
That seems to be less terrifying than AI killing all of humankind. | ||
Well, it depends. | ||
What is life? | ||
That's the real question, right? | ||
If you only exist inside of a computer program, but it's a wonderful program, and whatever your consciousness is, and we haven't really established what that is, right? | ||
I mean, there's a lot of really weird hippie ideas out there about what consciousness is. | ||
Your body's just like an antenna man, and it's just like tuning into consciousness, and consciousness is all around you. | ||
It's Gaia. | ||
It's the Mother Earth. | ||
It's the universe itself. | ||
It's God. | ||
It's love. | ||
Okay, maybe. | ||
I don't know. | ||
But if you could take that, whatever the fuck it is, and send it in a cell phone to New Zealand, is that where your consciousness is now? | ||
Because if we figure out what consciousness is and get it to the point where we can turn it into a program or duplicate it, I mean, that sounds so far away. | ||
But if you went up to someone from 1820 and said, hey man, one day I'm going to take a picture of my dick and I'm going to send it to this girl. | ||
She's going to get it on her phone. | ||
They'd be like, what the fuck are you talking about? | ||
A photo? | ||
What do you mean? | ||
What's a photo? | ||
Oh, it's like a picture, but you don't draw it. | ||
It's perfect. | ||
It looks exactly like that. | ||
It's in HD and I'm going to make a video. | ||
Of me taking a shit, and I'm going to send it to everyone. | ||
They're like, what the fuck is this? | ||
That's not even possible. | ||
Get out of here. | ||
That is essentially you're capturing time. | ||
You're capturing moments in time in a sense that's not very crude, but still crude in terms of comparing it to the actual world itself. | ||
In the moment where it's happening. | ||
Like here, you and I are having this conversation. | ||
We're having it in front of this wooden desk. | ||
There's paper in front of you. | ||
To you and I, we have access to all the textures, the sounds. | ||
We can feel the air conditioning. | ||
We can look up. | ||
We can see the ceiling. | ||
We got the whole thing in front of us because we're really here. | ||
But to many people that are watching this on YouTube right now, they're getting a minimized, crude version of this. | ||
That's similar. | ||
But it feels real. | ||
It feels pretty real. | ||
It's pretty close. | ||
It's pretty close. | ||
So, I mean, I've listened to your podcast for a while. | ||
You usually have... | ||
So, when I listen to your podcast, it feels like I'm sitting in with friends listening to a conversation. | ||
So, it's not as intense as, for example, Dan Carlin's Hardcore History, where the guy's, like, talking to me about the darkest aspects of human nature. | ||
His show's so good, I don't think you can call it a podcast. | ||
It's an experience. | ||
You're there. | ||
I was hanging out with him and Genghis Khan, and World War I, World War II. Painfotainment is an episode he did where he talks about very dark ideas about our human nature: the desire to watch the torture and suffering of others. | ||
There's something really appealing to us. | ||
He has this whole episode how throughout history we liked watching people die. | ||
unidentified
|
Mm-hmm. | |
And there's something really dark. | ||
You're saying that if somebody streamed something like that now, it would probably get hundreds of millions of views. | ||
Yeah, it probably would. | ||
And we're protecting ourselves from our own nature because we understand the destructive aspects of it. | ||
That's why YouTube would pull something like that. | ||
If you tied a person in between two trucks and pulled them apart and put that on YouTube, it would get millions of hits. | ||
But YouTube would pull it, because we've decided as a society, collectively, that those kinds of images are gruesome and terrible for us. | ||
But nevertheless, that experience of listening to his podcast slash show, it feels real. | ||
Just like VR for me, there's really strongly real aspects to it. | ||
Where I'm not sure, if the VR technology gets much better, what you'd choose if you had the choice: do you want to live your life in VR? You're going to die just like you would in real life. | ||
Meaning your body will die. | ||
You're just going to hook up yourself to a machine like it's a deprivation tank. | ||
And just all you are is in VR and you're going to live in that world. | ||
Which life would you choose? | ||
Would you choose a life in VR or would you choose a real life? | ||
That was the guy's decision in The Matrix, right? | ||
The guy decided in The Matrix he wanted to be a special person in The Matrix. | ||
He was eating that steak, talking to the guys, and he decided he was going to give up. | ||
Remember that? | ||
Yep. | ||
So what decision would you make? | ||
What is reality if it's not what you're experiencing? | ||
If you're experiencing something, but it's not tactile in the sense that you can't drag it somewhere and put it on a scale and take a ruler to it and measure it, but in the moment of being there, it seems like it is. | ||
What is missing? | ||
What is missing? | ||
Well, it's not real. | ||
Well, what is real then? | ||
What is real? | ||
Well, that's the ultimate question in terms of like, are we living in a simulation? | ||
That's one of the things that Elon brought up when I was talking to him. | ||
And this is one thing that people have struggled with. | ||
If we are one day going to come up with an artificial reality that's indiscernible from reality, In terms of emotions, in terms of experiences, feel, touch, smell, all of the sensory input that you get from the regular world, if that's inevitable, if one day we do come up with that, how are we to discern whether or not we have already created that and we're stuck in it right now? | ||
That we can't. | ||
unidentified
|
We can't. | |
And there's a lot of philosophical arguments for that, but it gets at the nature of reality. | ||
I mean, it's fascinating because we're totally clueless about what it means to be real. | ||
What it means to exist. | ||
To exist. | ||
So consciousness for us, I mean, it's incredible. | ||
You can look at your own hand. | ||
I'm pretty sure I'm on the Joe Rogan Experience podcast. | ||
I'm pretty sure this is not real. | ||
I'm imagining all of it. | ||
There's a knife in front of me. | ||
I mean, it's surreal. | ||
And I have no proof that it's not fake. | ||
And those kinds of things actually come into play with the way we think about artificial intelligence too. | ||
Like, what is intelligence? | ||
unidentified | Right. | |
It seems like... | ||
It seems like we're easily impressed by algorithms and robots we create that appear to have intelligence, but we still don't know what intelligence is or how close those things are to us. | ||
And we think that we ourselves, as this biological entity that can think and talk and cry and laugh, are somehow or another more important than some sort of silicon-based thing that we create that does everything we do, but far better. | ||
Yeah, I think if I were to take a stand, a civil rights stand, and I hope, since I'm young, I'll one day run for president on this platform, by the way (well, I can't because I'm Russian, but maybe they'll change the rules), it would be defending the idea that robots will have rights. | ||
Robots' lives matter. | ||
And I actually believe that we're going to have to start struggling with the idea of how we interact with robots. | ||
I've seen the abuse of robots too often, not just the Boston Dynamics ones, but literally people: you leave them alone with the robot, the dark aspects of human nature come out, and it's worrying to me. | ||
I would like a robot that spars, but only can move at like 50% of what I can move at, so I can fuck it up. | ||
unidentified | Yeah. | |
You'd be able to practice really well. | ||
You would develop some awesome sparring instincts with that robot, but there would still be consequences. | ||
If you did fuck up and got lazy and it leg-kicked you and you didn't check it, it would hurt. | ||
I would love to see a live stream of that session, because there are so many ways it could go. | ||
I mean, I've practiced on a dummy. | ||
There are aspects of a dummy that are helpful. | ||
Yeah, in terms of positioning and where your stance is and technique. | ||
Yeah, there's something to it. | ||
I can certainly see that going wrong in ways where a robot might not respect you tapping. | ||
Yeah. | ||
Or a robot decides to beat you to death. | ||
It's tired of you fucking it up every day. | ||
And one day you get tired. | ||
Or what if you sprain your ankle and it gets on top of you and mounts you and just starts blasting you in the face? | ||
It does a heel hook or something. | ||
Right. | ||
And you'd have to be able to say, stop! | ||
Stop! | ||
Well then, no, you're going to have to use your martial art to defend yourself. | ||
Yeah, right, because if you make it too easy for the robot to just stop anytime, then you're not really going to learn. | ||
Like, one of the consequences of training, if you're out of shape, is if you get tired, people fuck you up. | ||
And that's incentive for you to not get tired. | ||
Like, there are so many times that I would be in the gym, like, doing strength and conditioning, and I think about moments where I got tapped. | ||
Where guys caught me in something and I was exhausted and I couldn't get out of the triangle. | ||
I'm like, shit! | ||
And I'd just really push on the treadmill or the Airdyne bike or whatever it was I was doing, thinking about those moments of getting tired. | ||
Yeah, that's what I think about when I do sprints and stuff: the feeling of competition, those nerves of stepping in there. | ||
It's really hard to do that kind of visualization, but it builds. | ||
It's effective, though, and so is the feeling that there are consequences to you not having any energy. | ||
So you have to muster up the energy. | ||
Because if you don't, you're gonna get fucked up. | ||
Or something bad's gonna happen to someone you care about. | ||
Or something's gonna happen to the world. | ||
Maybe you're a superhero. | ||
You're saving the world from the robots. | ||
That's right. | ||
To go back to what we were talking about, I'm sorry to interrupt you, but just to bring this all back around, what is this life and what is consciousness and what is this experience? | ||
And if you can replicate this experience in a way that's indiscernible, will you choose to do that? | ||
Lex, you don't have much time left, but we have an option. | ||
We have an option and we can take your consciousness as you know it right now, put it into this program. | ||
You will have no idea that this has happened. | ||
You're going to close your eyes, you're going to wake up, you're going to be in the most beautiful green field. | ||
There's going to be naked women everywhere. | ||
Feasts everywhere you go. | ||
There's going to be just picnic tables filled with the most glorious food. | ||
You're going to drive around in a Ferrari every day and fly around in a plane. | ||
You're never going to die. | ||
You're going to have a great time. | ||
Or take your chances. | ||
See what happens when the lights shut off. | ||
Well, first of all, I'm a simple man. | ||
I don't need multiple women. | ||
One is good. | ||
I'm romantic in that way. | ||
That's what you say. | ||
But that's in this world. | ||
This world, you've got incentive to not be greedy. | ||
In this other world where you can breathe underwater and fly through the air and, you know... | ||
No, I believe that scarcity is the fundamental ingredient of happiness. | ||
So if you give me 72 virgins or whatever it is and... | ||
You just keep one slut? | ||
Not a slut. | ||
A requirement, you know, somebody intelligent and interesting. | ||
Who enjoys sexual intercourse. | ||
Well, not just enjoys sexual intercourse. | ||
Like you. | ||
A person. | ||
Well, that and keeps things interesting. | ||
Lex, we can engineer all this into your experience. | ||
You don't need all these different women. | ||
I get it. | ||
I understand. | ||
We've got this program for you. | ||
Don't worry about it. | ||
Okay, you want one woman, and a normal car, like maybe a Saab or something like that. | ||
Nothing crazy. | ||
Yeah. | ||
unidentified | Right? | |
Yeah. | ||
You're a simple man. | ||
I get it. | ||
No, no, no. | ||
You want to play chess with someone who could beat you every now and then, right? | ||
Yeah, but not just chess. | ||
So engineer some flaws. | ||
She needs to be able to lose her shit every once in a while. | ||
Yeah, The Matrix. | ||
Pull up the girl in the red dress. | ||
Which girl in the red dress? | ||
It comes right here. | ||
Remember, he goes like, did you notice the girl in the red dress? | ||
It's like the one that catches his attention. | ||
I don't remember this. | ||
This is right at the very beginning when he's telling them what the Matrix is. | ||
She walks by right here. | ||
Oh, there she is. | ||
Ba-bam! | ||
That's your girl. | ||
The guy afterwards is like, I engineered that. | ||
I'm telling you, it's just not... | ||
It's not. | ||
Well, yeah, but then I have certain features. | ||
Like, I'm not an iPhone guy, I like Android, so that may be an iPhone person's girl. | ||
But that's nonsense. | ||
unidentified | Mm-hmm. | |
So if an iPhone came along that was better than Android, you wouldn't want to use it? | ||
No, my definition of better is different. | ||
I know, for me, happiness lies... | ||
In Android phones? | ||
Yeah, Android phones. | ||
Close connection with other human beings who are flawed but interesting, who are passionate about what they do. | ||
Yeah, but this is all engineered into your program. | ||
Yeah, yeah. | ||
I'm requesting features here. | ||
Yeah, you're requesting features. | ||
But why Android phones? | ||
Is that like, I'm a Republican. | ||
I'm a Democrat. | ||
I like Androids. | ||
I like iPhones. | ||
Is that what you're doing? | ||
You're getting tribal? | ||
No, I'm not getting tribal. | ||
I'm totally not tribal. | ||
I was just representing... | ||
I figured the girl in the red dress just seems like an iPhone as a feature set. | ||
What? | ||
The kind of features I'm asking for... | ||
She's too hot? | ||
Yeah, and it seems like she's not interested in Dostoevsky. | ||
How would you know? | ||
That's so prejudiced of you, just because she's beautiful and she's got a tight-fitting dress? | ||
That's true. | ||
I don't know. | ||
That's very unfair. | ||
How dare you? | ||
You sexist son of a bitch. | ||
I'm sorry. | ||
Actually, that was totally... | ||
She probably likes Nietzsche and Dostoevsky and Camus and Hesse. | ||
She did her PhD in astrophysics, possibly. | ||
Yeah, no, that's... | ||
We're talking about all the trappings. | ||
Look at that. | ||
Bam, I'll take her all day. | ||
iPhone, Android. | ||
I'm not involved in this conversation. | ||
I'll take her if she's a Windows phone. | ||
How about that? | ||
I don't give a fuck. | ||
Windows phone? | ||
Oh, come on now. | ||
I'll take her if she's a Windows phone. | ||
I'll go with a flip phone from the fucking early 2000s. | ||
I'll take a Razr, a Motorola Razr phone with like 37 minutes of battery life. | ||
We're talking about all the learned experiences and preferences that you've developed in your time here on this actual real Earth, or what we're assuming is the actual real Earth. | ||
But how are we... | ||
I mean, if you really are taking into account the possibility that one day something or someone, whether it's artificial intelligence or us, figures out how to engineer a world, some sort of... | ||
Of a simulation that is just as real as this world. | ||
Like where it's impossible to discern. | ||
Not only is it impossible to discern, people choose not to discern anymore. | ||
unidentified | Right. | |
Because it's so... why bother? | ||
Why bother discerning? | ||
That's a fascinating concept to me. | ||
But I think that world... not to sound hippie or anything, but I think we live in a world that's pretty damn good. | ||
It is pretty good. | ||
But improving it by having such fine ladies walking around is not necessarily a positive delta. | ||
Okay, but that's one aspect of the improvement. | ||
What about improving it in this new world? | ||
There's no drone attacks in Yemen that kill children. | ||
There's no murder. | ||
There's no rape. | ||
There's no sexual harassment. | ||
There's no racism. | ||
All the negative aspects of our current culture are engineered out. | ||
I think a lot of religions have struggled with this. | ||
And of course I would say I would want a world without that. | ||
But part of me thinks that our world is meaningful because of the suffering in the world. | ||
Right, that's a real problem, isn't it? | ||
That is a fascinating concept that's almost impossible to ignore. | ||
Do you appreciate love because of all the hate? | ||
You know, like if you have a hard time finding a girlfriend and just no one's compatible and all the relationships go bad. | ||
I'm single, by the way. | ||
Holla. | ||
Letting the ladies know. | ||
But if you do have a hard time connecting with someone and then you finally do connect with someone after all those years of loneliness, and this person's perfectly compatible with you, how much more will you appreciate them than a guy like Dan Bilzerian, who's flying around in a private jet banging tens all day long? | ||
Maybe he's fucking drowning in his own sorrow. | ||
Maybe he's got too much prosperity. | ||
Yeah, we have that with social networks too. | ||
The people that... | ||
I mean, you're pretty famous. | ||
The amount of love you get is huge. | ||
Because of the overflow of love, it might be difficult to appreciate the more genuine little moments of love. | ||
It's not for me. | ||
No. | ||
I spent a lot of time thinking about that. | ||
And I also spent a lot of time thinking about how... | ||
Titanically bizarre my place in the world is. | ||
I mean, I think about it a lot, and I spent a lot of time being poor and being a loser. | ||
I mean, my childhood was not the best. | ||
I went through a lot of struggle when I was young that I cling to like a safety raft. | ||
You know, I don't ever think there's something special about me. | ||
And I try to let everybody know that anybody can do what I've done. | ||
You just have to just keep going. | ||
It's like 99% of this thing is just showing up and keep going. | ||
Keep improving, keep working at things, and keep going. | ||
Put the time in. | ||
But the interesting thing is, a couple of days ago I went back to your first podcast and listened to it. | ||
You haven't really changed much. | ||
I mean, the audio got a little better. | ||
But just like the genuine nature of the way you interact hasn't changed. | ||
And that's fascinating because, you know, fame changes people. | ||
Well, I was already famous then. | ||
Oh, in a different way. | ||
Yeah, I was already famous from Fear Factor. | ||
I already had stand-up comedy specials. | ||
I'd already been on a sitcom. | ||
Yeah. | ||
I wasn't as famous as I am now, but I understood what it is. | ||
I'm a big believer in adversity and struggle. | ||
I think they're very important for you. | ||
It's one of the reasons why I appreciate martial arts. | ||
It's one of the reasons why I've been drawn to it as a learning tool, not just as a puzzle that I'm fascinated by, trying to figure out how to get better at it. | ||
And martial arts is a really good example because you're never really the best, especially when there are just so many people doing it. | ||
It's like you're always going to get beat by guys. | ||
And then I was never putting the kind of time into it as an adult outside of my Taekwondo competition. | ||
I was never really putting all day every day into it like a lot of the people that I would train with would. | ||
And so I'd always get dominated by the really best guys. | ||
So there's a certain amount of humility that comes from that as well. | ||
But there's a struggle in that you're learning about yourself and your own limits. | ||
And the limits of the human mind and endurance and just not understanding all the various interactions of techniques. | ||
There's humility to that in that I've always described martial arts as a vehicle for developing your own human potential. | ||
But I think marathon running has similar aspects. | ||
I think when you figure out a way to keep pushing and push through, the control of... | ||
Your mind and your desire and overcoming adversity. | ||
I think overcoming adversity is critical for humans. | ||
We have this set of reward systems that are designed to reward us for overcoming: for overcoming obstacles, for overcoming relationship struggles, for overcoming physical limitations. | ||
And those rewards are great. | ||
And they're some of the most amazing moments in life when you do overcome. | ||
And I think this is sort of engineered into the system. | ||
So for me, fame is almost like a cheat code. | ||
It's like you don't really want it. | ||
Don't dwell on that, man. | ||
That's like a free buffet. | ||
You want to go hunt your own food. | ||
You want to make your own fire. | ||
You want to cook it yourself and feel the satisfaction. | ||
You don't want people feeding you grapes while you lie down. | ||
What is the hardest thing? | ||
So you talk about challenge a lot. | ||
What's the hardest thing? | ||
When have you been really humbled? | ||
Martial arts, for sure. | ||
The most humbling. | ||
Yeah, from the moment I started, I mean, I got really good at Taekwondo, but even then I'd still get the fuck beaten out of me by my friends. | ||
By training partners, especially when you're tired and, you know, you're rotating partners and guys are bigger than you. | ||
It's just humbling. | ||
You know, martial arts are very humbling. | ||
Yeah, so that – and I got to call you out on something. | ||
So you talk about education systems sometimes. | ||
I've heard you say it's a little broken, high school and so on. | ||
I'm not really calling you out. | ||
I just want to talk about it because I think it's important, and because I'm somebody who loves math. | ||
You talked... your own journey was that school didn't give you, uh, passion, value. | ||
Well, you can maybe talk to that, but for me, and maybe I'm sick in the head or something, math was exciting the way martial arts were exciting for you, because it was really hard. | ||
I wanted to quit. | ||
And the idea in education that seems to be a little flawed nowadays is that we want to make education easier. | ||
That we want to make it, you know, more accessible and so on. | ||
Accessible, of course, is great. | ||
And those are all good goals. | ||
But you kind of forget in that that it's also supposed to be hard. | ||
And teachers... | ||
Just the way your wrestling coach, if you quit, if you say, I can't do any more, and you come up with some kind of excuse, looks at you once and says, get your ass back on the mat. | ||
I wish math teachers did the same. | ||
It's almost like it's cool now to say, ah, math sucks. | ||
Math's not for me. | ||
Or science sucks. | ||
This teacher's boring. | ||
I think there's room for a culture that says, no, no, no, you're not. | ||
If you just put in the time and you struggle, then that opens up the universe to you. | ||
Like whether you become a Neil deGrasse Tyson or the next Fields Medal winner in mathematics. | ||
I would not argue with you for one second. | ||
I would also say that one of the more beautiful things about human beings is that we vary so much, and that one person who is just obsessed with playing the trombone, and to me, I don't give a fuck about trombones, but that's okay. | ||
Like, I can't be obsessed about everything. | ||
Some people love golf, and they just want to play it all day long. | ||
I've never played golf a day in my life, except miniature golf, and just fucking around. | ||
But that's not bad or good. | ||
And I think there's definitely some skills that you learn from mathematics that are hugely significant if you want to go into the type of fields that you're involved in. | ||
For me, it's never been appealing. | ||
But it's not that it was just difficult. | ||
It's also that it just, for whatever reason, who I was at that time in that school with those teachers, having the life experience that I had, that was not what I was drawn to. | ||
But what I was drawn to was literature. | ||
I was drawn to reading. | ||
I was drawn to stories. | ||
I was drawn to possibilities and creativity. | ||
I was drawn to all those things. | ||
You were an artist a bit too. | ||
Yeah. | ||
I used to want to be a comic book illustrator. | ||
That was a big thing when I was young. | ||
I was really into comic books. | ||
I was really into... | ||
It was traditional comic books and also a lot of the horror comics from the 1970s, the black and white ones, like Creepy and Eerie. | ||
Did you ever see those things? | ||
Creepy and Eerie? | ||
Like black and white? | ||
Yeah, they were a comic book series that existed way back in the day. | ||
They were all horror. | ||
And they were really cool illustrations and these wild stories. | ||
But it was comic books, but they were all black and white. | ||
That's Creepy and Eerie. | ||
Oh, that's the actual name. | ||
Yeah. | ||
Eerie and Creepy were the names. | ||
See, like, that was from... what year was that? | ||
It says September, but it doesn't say what year. | ||
I used to get these when I was a little kid, man. | ||
I was like eight, nine years old in the 70s. | ||
Good and evil. | ||
Yeah. | ||
They were my favorite. | ||
That's a cover of them. | ||
They would have covers that were done by Frank Frazetta, Boris Vallejo, and just really cool shit. | ||
I loved those when I was little. | ||
I was always really into horror movies and really into... | ||
Look at this werewolf one. | ||
That was one of my favorite ones. | ||
That was a crazy werewolf that was on all fours. | ||
Who's the hero usually? | ||
unidentified | Superhero? | |
Everybody dies in those. | ||
That's the beautiful thing about it. | ||
Everybody gets fucked over. | ||
That was the thing that I really liked about them. | ||
Nobody made it out alive. | ||
There was no one guy who figured it out and rescued the woman and they rode off into the sunset, uh-uh. | ||
You'd turn the corner and there'd be a fucking pack of wolves with glowing eyes waiting to tear everybody apart and that'd be the end of the book. | ||
I was just really into the illustrations. | ||
I found them fascinating. | ||
I love those kind of horror movies and I love those kinds of illustrations. | ||
So that's what I wanted to do when I was young. | ||
Yeah, I think the education system, we talked about creativity, is probably not as good at inspiring and feeding that creativity. | ||
Because I think math and wrestling can be taught systematically. | ||
I think creativity is something, well, actually I know nothing about it. | ||
So I think it's harder to take somebody like you when you're young and inspire you to pursue that fire, whatever is inside. | ||
Well, one of the best ways to inspire people is by giving them these alternatives that are so uninteresting. | ||
Like saying, you're going to get a job selling washing machines. | ||
And you're like, fuck that! | ||
I'm going to figure out a way to not get a job selling washing machines. | ||
Some of the best motivations that I've ever had have been terrible jobs. | ||
Because you have these terrible jobs and you go, okay, fuck that. | ||
I'm going to figure out a way to not do this. | ||
And whether you want to call it ADD or ADHD or whatever it is that makes kids squirm in class. | ||
I didn't squirm in every class. | ||
I didn't squirm in science class. | ||
I didn't squirm in interesting subjects. | ||
There were things that were interesting to me that I would be locked in and completely fascinated by. | ||
And there were things where I just couldn't wait to run out of that room. | ||
And I don't know what... | ||
The reason is, but I do know that a lot of what we call our education system is engineered for a very specific result. | ||
And that result is you want to get a kid who can sit in class and learn so that they can sit in a job and perform. | ||
And that, for whatever reason, that was just... | ||
I mean, I didn't have the ideal childhood. | ||
Maybe if I did, I would be more inclined to lean that way, but... | ||
I didn't want to do anything like that. | ||
Like, I couldn't wait to get the fuck out of school. | ||
So I didn't ever have to listen to anybody like that again. | ||
And then just a few years later, I mean, you graduate from high school when you're 18. When I was 21, I was a stand-up comic. | ||
And I was like, I found it. | ||
This is it. | ||
I'm like good. | ||
I found there's an actual job that nobody told me about, where you could just make fun of shit and people go out and pay money to hear you create jokes and routines and bits. | ||
Really? | ||
You weren't terrified? | ||
Of stand-up? | ||
No, getting on stage and... | ||
Oh, I was definitely nervous the first time. | ||
Probably more nervous than any... | ||
Seems harder than fighting from my perspective. | ||
No, it's different. | ||
It's different. | ||
The consequences aren't as grave, but that's one of the... | ||
Are they not? | ||
No. | ||
Like embarrassment and not... | ||
You don't get pummeled. | ||
I mean, you could say, like, emotionally it's probably more devastating or as devastating. | ||
But man, losing a fight, it fucks you up for a long time. | ||
You feel like shit for a long time. | ||
But then you win and you feel amazing for a long time, too. | ||
When you kill on stage, you only feel good for like an hour or so and that goes away. | ||
It feels normal. | ||
It's just normal. | ||
It's just life, you know? | ||
But I think that it prepared me, like competing in martial arts, the fear of that, and then how hard it is to stand opposite another person who's the same size as you, who's equally well-trained, who's also a martial arts expert, and they ask you, are you ready? | ||
Are you ready? | ||
You bow to each other, and then they go, fight! | ||
And then you're like, fight! | ||
Here we go. That, to me, was probably some of the best prep. To do that from the time I was 15 till I was 21 was probably the best preparation for anything that was difficult to do, because it was so fucking scary. And then to go from that into stand-up, I think it prepared me for stand-up, because I was already used to doing things that were scary. | ||
And now I seek scary things out. | ||
I seek difficult things out. | ||
Like picking up the bow and learning that. | ||
Yes, archery, which is really difficult. | ||
That's one of the reasons why I got attracted even to playing pool. | ||
Pool is very difficult. | ||
It's very difficult to control your nerves in high-pressure situations. | ||
So that, there's... | ||
There's some benefits to that. | ||
But it goes back to what you were saying earlier. | ||
How much of all this stuff, like when you were saying that scarcity, there's real value in scarcity, and that there's real value in struggle. | ||
How much of all this is just engineered into our human system that has given us the tools and the incentive to make it to 2018 with the human species? | ||
Yeah, I think it's whoever the engineer is, whether it's God or nature or whatever, I think it's engineered in somehow. | ||
We get to think about that when you try to create an artificial intelligence system. | ||
When you imagine what's a perfect system for you, we talked about this with the lady, what's the perfect system for you? | ||
If you had to really put down on paper and engineer the experience of your life, you start to realize it actually looks a lot like your current life. | ||
So this is the problem that companies are facing, like Amazon, in trying to create Alexa. | ||
What do you want from Alexa? | ||
Do you want a tool that says what the weather is, or do you want Alexa to say, Joe, I don't want to talk to you right now? | ||
I'd have an Alexa where you have to work her over, like, Alexa, come on. | ||
What did I do? | ||
I'm sorry. | ||
Listen, if I was rude, I was insensitive, I was tired, the commute was really rough. | ||
And she should be like, I'm seeing somebody else. | ||
unidentified | Alexa! | |
Do you remember Avatar Depression? | ||
The movie Avatar, and depression as a psychological effect after seeing the movie somehow? | ||
Yeah, it was a real term that people were using, that psychologists were using, because people would see the movie Avatar, which I loved. | ||
A lot of people said, oh, it's fucking Pocahontas with blue people. | ||
To those people, I say, fuck off! | ||
You want to talk about suspension of disbelief? | ||
That, to me, that movie was the ultimate suspension of disbelief. | ||
I love that movie. | ||
I fucking love that. | ||
I know James Cameron's working on like 15 sequels right now, all simultaneously. | ||
I wish that motherfucker would dole them out. | ||
He's like a crack dealer that gets you hooked once, and then you're just waiting outside in the cold, shivering for years. | ||
Avatar depression was a psychological term that psychologists were using to describe this mass influx of people that saw that movie and were so enthralled by the way the Na'vi lived on Pandora that they were depressed when they came back to this stupid world. | ||
Didn't want to leave. | ||
They wanted to be like the blue guy in Avatar. | ||
And it also... | ||
There was a mechanism in that film where this regular person became a Na'vi. | ||
He became it through the Avatar. | ||
And then eventually that Tree of Life or whatever it was, they transferred his essence into this Avatar and he became one of them. | ||
He became one of them. | ||
He absorbed their culture. | ||
And it was very much like our romanticized versions of the Native Americans. | ||
unidentified | Mm-hmm. | |
That they lived in symbiotic relationship with the earth. | ||
They only took what they needed. | ||
They had a spiritual connection to their food and to nature, and their existence was noble and honorable; it wasn't selfish, it was powerful and spiritual. And we're missing these things. | ||
We're missing these things and I think we are better at romanticizing them and craving them as opposed to living them. | ||
I mean, you look at movies like Happy People with... | ||
A Year in the Taiga. | ||
A Year in the Taiga. | ||
I mean, I'm Russian, so... | ||
Yeah, Werner Herzog's film. | ||
Amazing movie. | ||
Part of you wants to be like, well, I want to be out there in nature, focusing on simple survival, setting traps for animals, cooking some soup, with a family around you, and just kind of focusing on the basics. | ||
And I'm the same way. | ||
I go out hiking and I go out in nature. | ||
I would love to pick up hunting. | ||
I crave that. | ||
But if you just put me in the forest, I'll probably be like... | ||
I'm taking your phone away and you're staying here. | ||
That's it. | ||
You're never going to return to your Facebook and your Twitter and your robots. | ||
I don't know if I'll be so romantic about that notion anymore. | ||
I don't know either, but I think that's also the genie in the bottle discussion. | ||
I think that genie's been out of the bottle for so long that you'd be like, but what about my Facebook? | ||
What if I got some messages? | ||
Let me check my email real quick. | ||
unidentified | No, no, no. | |
We're in the forest. | ||
There's no Wi-Fi out here. | ||
No Wi-Fi ever? | ||
What the fuck? | ||
How do people get their porn? | ||
This is still porn! | ||
unidentified | No! | |
That's another understudied thing, again, I'm not an expert, but: the impact of internet pornography on culture. | ||
Oh, yeah. | ||
It's significant and also ignored to a certain extent. | ||
And if not ignored, definitely purposefully... | ||
Left out of the conversation. | ||
Yeah, there's another PhD student. | ||
A person from Google came to give a tech talk, and he opened by saying, 90% of you in the audience have this month Googled a pornographic term in our search engine. | ||
And it was really a great opener because people were just all really uncomfortable. | ||
Yeah. | ||
Because we just kind of hide it away. | ||
But it certainly has an impact. | ||
Well, I think there's a suppression aspect to that, too, that's unhealthy. | ||
We have a suppression of our sexuality because we think that somehow or another it's negative. | ||
And especially for women. | ||
I mean, for women versus men: a man who is a sexual conqueror is thought to be a stud, whereas a woman who seeks out multiple desirable sexual partners is thought to be troubled. | ||
There's something wrong with her. | ||
You know, they're criticized. | ||
They use terms like we used earlier, like slut or whore. | ||
You know, there's no... | ||
You call a man a male slut, they'll start laughing. | ||
Yup, that's me, dude. | ||
Like, men don't give a fuck about that. | ||
It's not stigmatized. | ||
But somehow or another, through our culture, it's stigmatized for women. | ||
And then the idea of masturbation is stigmatized. | ||
All these different things where the Puritan roots of our society start showing, and our religious ideology starts showing, when we discuss the issues that we have with sex and pornography. | ||
Right, and for me this is something I think about a little bit because my dream is to create an artificial intelligence, a human-centered artificial intelligence system that provides a deep, meaningful connection with another human being. | ||
And you have to consider the fact that pornography or sex dolls will be part of that journey somehow in society. | ||
The dummy you'd be using for martial arts would likely be an offshoot of the development of sex robots. | ||
And we have to think about what's the impact of those kinds of robots on society. | ||
Well, women in particular are violently opposed to sex robots. | ||
I've read a couple of articles written by women about sex robots and the possibility of future sex robots. | ||
And I shouldn't say violently, but it's always negative. | ||
So is the idea that men would want to have sex with some beautiful thing that's programmed to love them as opposed to earning the love of a woman. | ||
But you don't hear that same interpretation from men. | ||
From men, it seems to be that there's a thought about maybe it's kind of gross, but also that it's inevitable. | ||
And then there's like this sort of nod to it, like how crazy would that be if you had the perfect woman, like the woman in the red dress in The Matrix. | ||
She comes over to your house and she's perfect. | ||
Because you're not thinking about the alternative, which is a male robot doll, which will now be able to satisfy your girlfriend, wife better than you. | ||
I think you'll hear from guys a lot more then. | ||
Maybe. | ||
Or maybe, like, good luck with her. | ||
She's fucking annoying. | ||
She's always yelling at me. | ||
Let her yell at the robot. | ||
He's not going to care. | ||
Then that robot turns into a grappling dummy. | ||
Yeah, and maybe she can just go ahead and get fat with the robot. | ||
He's not even going to care. | ||
Go ahead. | ||
Just sit around, eat Cheetos all day and scream at them. | ||
He's your slave. | ||
Good. | ||
I mean, it can work both ways, right? | ||
It can work the same way, you know: a woman would see a man that is interested in a sex robot as disgusting and pathetic. | ||
A man could see the same thing in a woman that's interested in a sex robot. | ||
Like, okay, is that what you want? | ||
You're some crude thing that just wants physical pleasure and you don't even care about a real actual emotional connection to a biological human being? | ||
Like, okay, well then you're not my kind of woman anyway. | ||
Yeah. | ||
But if done well, those are the kinds of things, in terms of threats of AI, that to me can change the fabric of society. Because I'm old school, in the sense that I like monogamy, for example. | ||
unidentified | Well, you say that because you don't have a girlfriend. | |
So you're longing for monogamy. | ||
One is better than zero. | ||
The real reason I don't have a girlfriend is because, and it's fascinating, with people like you actually, and with Elon Musk, time is a huge challenge. Because of how romantic I am, because of how much I care about the people around me, I feel like it's a significant investment of time. | ||
And also the amount of work that you do. | ||
I mean, if you're dedicated to a passion like artificial intelligence and The sheer amount of fucking studying and research and... | ||
And programming, too. | ||
There's certain disciplines in which you have to... | ||
Certain disciplines require... | ||
Like Steven Pressfield talks about writing. | ||
You can get pretty far with two, three hours a day. | ||
When you're programming, when you're... | ||
A lot of the engineering tasks, they just take up hours. | ||
It's just hard. | ||
Which is why I really... | ||
One of the reasons... | ||
I may disagree with him on a bunch of things, but Elon's an inspiration because I think he's a pretty good dad, right? | ||
And he finds the time for his sons while being probably an order of magnitude busier than I am. | ||
And it's fascinating to me how that's possible. | ||
Well, once you have children... | ||
I mean, there obviously are people that are bad dads. | ||
But once you have children, your life shifts in almost... | ||
It's an indescribable way because you're different. | ||
It's not just that your life is different. | ||
There hasn't been a moment while we're having this conversation that I haven't been thinking about my children. | ||
Thinking about what they're doing, where they are. | ||
It's always running in the background. | ||
It's a part of life. | ||
You're connected to these people that you love so much and they rely on you for guidance and for warmth and affection. | ||
But how did your life have to change? | ||
You just change, man. | ||
When you see the baby, you change. | ||
When you start feeding them, you change. | ||
When you hold them, you change. | ||
You hold their hand while they walk, you change. | ||
When they ask you questions, you change. | ||
When they laugh and giggle, you change. | ||
When they smack you in the face and you pretend to fall down, they laugh, you change. | ||
You just change, man. | ||
You change. | ||
You become a different thing. | ||
You become a dad. | ||
So you almost can't help it. | ||
Some people do, but that's what's sad. | ||
Some people resist it. | ||
I know people that have been terrible, terrible parents. | ||
They'd rather stay out all night and never come home, and they don't want to take care of their kids, and they split up with the wife or the girlfriend who's got the kid, and they don't give child support. | ||
It's a really common theme, man. | ||
I mean, there's a lot of men out there that don't pay child support. | ||
That's a dark, dark thing. | ||
You have a child out there that needs food and you're so fucking selfish. | ||
You don't want to provide resources. | ||
Not only do you not want to be there for companionship, you don't want to provide resources to pay for the child's food. | ||
You don't feel responsible for it. | ||
I mean, that was my case when I was a kid. | ||
My dad didn't pay child support. | ||
And we were very poor. | ||
It's one of the reasons why we were so poor. | ||
And I know other people that have had that same experience. | ||
So it's not everyone that becomes a father, or that impregnates a woman, I should say, and becomes a father. | ||
And the other side is true, too. | ||
There's women that are terrible mothers for whatever reason. | ||
I mean, maybe they're broken psychologically. | ||
Maybe they have mental health issues. | ||
Whatever it is, there's some women that are fucking terrible moms. | ||
And it's sad. | ||
But it makes you appreciate women that are great moms so much more. | ||
Yeah, when I see guys like you, the inspiration is... so I'm looking for something structural: what's the process to fit people into your life? | ||
But what I hear is, when it happens, you just do. | ||
unidentified | You change. | |
But this is the thing, man. | ||
We're not living in a book. | ||
We're not living in a movie. | ||
It doesn't always happen. | ||
Like, you have to decide that you want it to happen, and you've got to go looking for it, because if you don't, you could just be older, right? | ||
And still alone. | ||
There's a lot of my friends that have never had kids and now they're in their 50s. | ||
I mean, comedians, right? | ||
unidentified | Yes! | |
You have to be on the road a lot. | ||
Not just on the road. | ||
You have to be obsessed with comedy. | ||
Like, it's got to be something... | ||
You're always writing new jokes because you're always writing a new... | ||
Especially if you put out a special, right? | ||
Like, I just did a Netflix special. | ||
It's out now. | ||
So I really have like a half hour of new material. | ||
That's it. | ||
It's great by the way, Strange Times. | ||
Thank you very much. | ||
It's the first special I've watched. | ||
It was actually really weird, sorry to go on a tangent, but I've listened to you quite a bit, and I've never watched you doing comedy. | ||
And it was so different. | ||
Because like here you're just like improvising, you're like a jazz musician here. | ||
It's like a regular conversation. | ||
The stand-up special, it was clear, everything is perfect. | ||
The timing, it's like watching you do a different art almost. | ||
It's kind of interesting. | ||
It's like a song or something. | ||
There's some riffing to it, there's some improvisation to it, but there's a very clear structure to it. | ||
But it's so time intensive, and you've got to be obsessed with it to continue to do something like that. | ||
So for some people, that travel and the road, that takes priority over all things, including relationships, and then you never really settle down. | ||
And so you never have a significant relationship with someone that you could have a child with. | ||
And I know many friends that are like that. | ||
And I know friends that have gotten vasectomies because they don't want it. | ||
They like this life. | ||
And there's nothing wrong with that either. | ||
I always was upset by this notion that in order to be a full and complete adult, you have to have a child. | ||
You have to be a parent. | ||
And I think even as a parent, where I think it's probably one of the most significant things in my life, I reject that notion. | ||
I think you could absolutely be a fully developed person and an amazing... | ||
Influence in society, an amazing contributor to your culture and your community without ever having a child, whether you're a man or a woman. | ||
It's entirely possible. | ||
And the idea that it's not is silly. | ||
Like, we're all different in so many different ways, you know, and we contribute in so many different ways. | ||
Like, there's going to be people that are obsessed with mathematics, there's going to be people that are obsessed with literature, there's going to be people that are obsessed with music, and they don't all have to be the same fucking person, because you really don't have enough time for it to be the same person. | ||
You know, and there's going to be people that love having children. | ||
They love being a dad or love being a mom. | ||
And there's going to be people that don't want to have nothing to do with that and they get snipped early and they're like, fuck off! | ||
I'm going to smoke cigarettes and drink booze and I'm going to fly around the world and talk shit. | ||
And those people are okay too. | ||
It's the way we interact with each other that's most important. | ||
That's what I think. | ||
The way human beings... the way we form bonds and friendships, the way we contribute to each other's lives, the way we find our passion and create, those things are what's really important. | ||
Yeah, but there's also an element, just looking at my parents: they're still together. | ||
They got together when, I mean, the standard was you got together when you're like 20, or, I should know this, but 23, 20, whatever, young. | ||
And there is an element there where you don't want to be too rational. | ||
You just want to be – just dive in. | ||
Right. | ||
Should you be an MMA fighter? | ||
Should you be – Like, I'm in academia now, so I'm a research scientist at MIT. The pay is much, much lower than all the offers I'm getting non-stop. | ||
Is it rational? | ||
I don't know. | ||
But your passion is doing what you're doing currently. | ||
unidentified | Yeah. | |
But it's like it's... | ||
What are the other offers? | ||
Like what kind of other jobs? | ||
Are they appealing in any way? | ||
Yeah. | ||
Yeah, they're appealing. | ||
So I'm making a decision that's similar to actually getting married, which is... | ||
So the offers are... | ||
Well, I shouldn't call them out, but Google, Facebook, the usual AI research, pretty high positions. | ||
And the... | ||
There's just something in me that says the edge, the chaos of this environment at MIT is something I'm drawn to. | ||
It doesn't make sense. | ||
So I can do what I'm passionate about in a lot of places. | ||
You just kind of dive in. | ||
I had a sense that a lot of our culture creates that momentum. | ||
You just kind of have to go with it. | ||
That's why my parents got together. | ||
A lot of couples wouldn't be together if they weren't culturally forced to be together, if divorce wasn't seen as such a negative thing. | ||
They grew together and created a super happy connection. | ||
I'm a little afraid of over-rationality about choosing the path of life. | ||
You're saying relationships don't always make sense. | ||
They don't have to make sense. | ||
I think I'm a big believer in doing what you want to do. | ||
And if you want to be involved in a monogamous relationship, I think you should do it. | ||
But if you don't want to be involved in one, I think you should do that too. | ||
I mean, if you want to be like a nomad and travel around the world and just live out of a backpack, I don't think there's anything wrong with that. | ||
As long as you're healthy and you survive and you're not depressed and you're not longing for something that you're not participating in... | ||
But I think the problem is when you are doing something you don't want to be doing. | ||
It brings me back to, was it Thoreau's quote, I guess? | ||
I always fuck up who made this. | ||
What? | ||
I think I know which one you're going to say. | ||
Yeah, most men lead lives of quiet desperation. | ||
That's real, man. | ||
That's real. | ||
That's what you don't want. | ||
I think it's Thoreau, right? | ||
You don't want quiet desperation. | ||
I fucking love that quote because I've seen it. | ||
I've seen it in so many people's faces. | ||
And that's one thing I've managed to avoid. | ||
And I don't know if I avoided that by luck or just by the fact I'm stupid and I just follow my instincts whether they're right or wrong and I make it work. | ||
This goes back to what we were discussing in terms of what is the nature of reality. Are we just finding these romanticized interpretations of our own biological needs, our human reward systems creating these beautiful visions of what life is and what is important: poetry and food and music and all the passions, and dancing, and holding someone in your arms that you care for deeply? Are all those things just little tricks? | ||
Are all those little biological tricks there just to keep this very strange dance of human civilization going, so that we can keep creating new and better products, keep moving innovation toward this ultimate, eventual goal of artificial intelligence, of giving birth to the gods? | ||
Yeah, giving birth to the gods. | ||
Yeah, so, you know, I did want to mention one thing, the one thing I really don't understand fully but have been thinking about for the last couple of years: the application of artificial intelligence to politics. | ||
I've heard you talk about sort of government being broken in the sense that one guy, one president, that doesn't make any sense. | ||
So you get, like, people get hundreds of millions of likes on their Facebook pictures and Instagram, and we're always voting with our fingers every single day. | ||
And yet for the election process, it seems that we're voting like once every four years. | ||
unidentified | Right. | |
It feels like this new technology could bring about a world where the voice of the people can be heard on a daily basis, where you could speak about the issues you care about, whether it's gun control and abortion, all these topics that are so debated. | ||
It feels like there needs to be an Instagram for our elections. | ||
I agree, yeah. | ||
And I think there's room for that. | ||
I've been thinking about writing a few papers proposing different technologies. | ||
It just feels like the people that are playing politics are old school. | ||
The only problem with that is the influencers. | ||
If you look at Instagram, I mean, should Nicki Minaj be able to decide how the world works because she's got the most followers? Should Kim Kardashian? Who's influencing things, and why? And you have to deal with the fickle nature of human beings. | ||
Do we give enough patience to the decisions of these so-called leaders that we're electing, or do we just decide, fuck them, they're out, new person in? Because we have a really short attention span, especially today, with the news cycle so quick. | ||
So Instagram might be a bad example, because, yeah, on Twitter you start following Donald Trump, and you start to sort of idolize these certain icons, and do we necessarily want them to represent us? | ||
I was thinking more about the Amazon product reviews model, recommender systems, or Netflix: the movies you've watched, Netflix learning enough about you to represent you in your next movie selection. | ||
So, what are the kinds of movies that you, Joe Rogan, would like? | ||
These recommender systems, these artificial intelligence systems, learn based on your Netflix selections, and that could be a deeper understanding of who you are than you're even aware of. | ||
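What Lex is gesturing at here, a system learning a compact picture of your taste from your selections, is roughly what matrix-factorization recommenders do. Below is a minimal sketch of that idea in Python; the movie names, ratings, and parameters are invented for illustration, and this is not Netflix's or Amazon's actual system.

```python
# Minimal matrix-factorization recommender sketch: learn a small "taste"
# vector per user and per movie so their dot product predicts ratings.
# All names and numbers here are invented for illustration.
import numpy as np

# Rows = users, columns = movies; 0 means "not rated yet".
ratings = np.array([
    [5.0, 4.0, 0.0, 1.0],   # user 0
    [4.0, 0.0, 5.0, 1.0],   # user 1
    [1.0, 1.0, 0.0, 5.0],   # user 2
])
movies = ["Little Mermaid", "Avatar", "The Matrix", "Godfather 2"]

n_users, n_movies = ratings.shape
k = 2                                          # latent "taste" dimensions
rng = np.random.default_rng(0)
U = rng.normal(scale=0.1, size=(n_users, k))   # user factors
M = rng.normal(scale=0.1, size=(n_movies, k))  # movie factors

observed = ratings > 0
lr, reg = 0.05, 0.02
for _ in range(2000):
    # Compute error only on the ratings we actually observed.
    err = (U @ M.T - ratings) * observed
    U -= lr * (err @ M + reg * U)              # gradient step, user side
    M -= lr * (err.T @ U + reg * M)            # gradient step, movie side

# Recommend user 0's highest-predicted unseen movie.
pred = U[0] @ M.T
pred[observed[0]] = -np.inf                    # mask what's already rated
print("Next movie for user 0:", movies[int(np.argmax(pred))])
```

The learned row U[0] is the system's compressed picture of that user's taste, which is the "deeper understanding of who you are" being described above.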
And I think there's that element. | ||
I'm not sure exactly, but there's that element of learning who you are. | ||
Like, do you think drugs should be legalized or not? | ||
Do you think immigration – should we let everybody in or keep everybody out? | ||
Should we – all these topics with the red and blue teams now have a hard answer. | ||
Of course you keep all the immigrants out or of course you need to be more compassionate. | ||
Of course. | ||
But for most people, it's really a gray area. | ||
And exploring that gray area the way you would explore the gray area of Netflix, what is the next movie you're watching? | ||
Do you want to watch Little Mermaid or Godfather 2? | ||
That process of understanding who you are, it feels like there's room for that in our book. | ||
Well, the problem is, of course, that there's grave consequences to these decisions that you're going to make in terms of the way it affects the community, and you might not have any information that you're basing this on at all. | ||
You might be basing all these decisions on... | ||
Misinformation, propaganda, nonsense, advertising. | ||
You could be easily influenced. | ||
You might not have looked into it at all. | ||
You could be ignorant about the subject and it might just appeal to certain dynamics that have been programmed into your brain because you grew up religious or you grew up an atheist. | ||
The real problem is whether or not people are educated about the consequences these decisions are going to lead to. | ||
It's information. | ||
I mean, I think there's going to be a time in our life where our ability to access information is many steps better than it is now with smartphones. | ||
I think we're going – like Elon Musk has some Neuralink thing that he's working on right now. | ||
He's being very vague about it. | ||
Increasing the bandwidth of our human interaction with machines is what he's working on. | ||
Yeah. | ||
I'm very interested to see where this leads, but I think that we can assume that because something like the internet came along and because it's so accessible to you and I right now with your phone, just pick it up, say, hey, Google, what the fuck is this? | ||
And you get the answer almost instantaneously. | ||
That's gonna change what a person is as that advances. | ||
And I think we're much more likely looking at some sort of a symbiotic connection between us and artificial intelligence and computer-augmented access to information than we are looking at the rise of some artificial being that takes us over and fucks our girlfriend. | ||
Wow, yeah, that's the real existential threat. | ||
Yeah, I think so. | ||
That's, to me, super exciting. | ||
The phone is a portal to this collective that we have, this collective consciousness, and it gives people a voice. | ||
I would say, if anyone's like me, you really know very little about the politicians you're voting for, or even the issues. | ||
Like, global warming, I'm embarrassed to say, I know very little about. If I'm actually being honest with myself, I've heard different things. I know what I'm supposed to believe as a scientist, but I actually know nothing about... | ||
Concrete, right? | ||
Nothing concrete about... | ||
About the process itself. | ||
About the environmental process and why is it so certain? | ||
You know, scientists apparently completely agree, so as a scientist, I kind of take on faith oftentimes what the community agrees. | ||
In my own discipline, I question. | ||
But outside, I just kind of take on faith. | ||
And the same thing with gun control and so on. | ||
You just kind of say, which team am I on? | ||
And I'm just going to take that on. | ||
I just feel like it's such a disruptible space to where people could be given just a tiny bit more information to help them. | ||
Well, maybe that's where something like Neuralink comes along and just enhances our ability to access this stuff in a way that's much more tangible than just being able to Google search it. | ||
And maybe this process is something that we really can't anticipate. | ||
It's going to have to happen to us, just like we were talking about cell phone images that you could just send to Australia with the click of a button, that no one would have ever anticipated 300 years ago. | ||
Maybe we are beyond our capacity for understanding the impact of all this stuff. | ||
The kids coming up now. | ||
What is that world going to look like? | ||
When you're too old, you'll be like 95 sitting on a porch with a shotgun with Clint Eastwood. | ||
And what do those kids look like when they're 18 years old? | ||
Robots. | ||
Fucking x-ray vision and they could read minds. | ||
Yeah, what is going to happen? | ||
You'd be saying robots are everywhere these days. | ||
Back in my day, we used to put robots in their place. | ||
Yeah, right? | ||
Like they were servants. | ||
I'd shut them off. | ||
Pull the plug. | ||
I'd go fuck your mom. | ||
Now they want to go to the same school as us? | ||
unidentified | Yeah, and they want to run for president. | |
They want to run for president. | ||
Yeah, they're more compassionate and smarter, but we still hate them because they don't go to the bathroom. | ||
Yeah. | ||
Well, not we. | ||
Half the country will hate them, and the other will love them, and the Abraham Lincoln character will come along. | ||
That's what I'm pitching myself for. | ||
Yeah, Abraham Lincoln of the robot world. | ||
Oh, the robot world. | ||
Those are the speeches that everybody quotes. | ||
And one other thing I've got to say about academia. | ||
Okay. | ||
In defense of academia. | ||
So you've had a lot of really smart people on, including Sam Harris and Jordan Peterson. | ||
And often the word academia is used to replace a certain concept. | ||
So I'm part of academia. | ||
And most of academia is engineering, is biology, is medicine, is hard sciences. | ||
It's the humanities that are slippery. | ||
Exactly. | ||
And I think it's a subset of the humanities, one that I know nothing about, and a subset I don't want to speak about. | ||
Gender studies. | ||
Say it! | ||
I don't know. | ||
I don't know. | ||
Candyman! | ||
I actually live on Harvard campus. | ||
I'm at MIT, but I live on Harvard campus. | ||
It's there. | ||
Do they have apartments for you guys? | ||
How does that work? | ||
Yeah, they hand them out. | ||
No, I just don't care. | ||
When you say live on the campus, what do you mean? | ||
Oh, sorry, like in Harvard Square. | ||
Oh, Harvard Square in Cambridge. | ||
In Cambridge, yeah. | ||
I used to go to Catch a Rising Star when it existed. | ||
There used to be a great comedy club in Cambridge. | ||
There's a few good comedy clubs there, right? | ||
Well, there's a Chinese restaurant that has stand-up there still. | ||
How does that work? | ||
Well, it's upstairs. | ||
There's like this comedy club up there. | ||
Yeah. | ||
Do you ever, because you've done, I think your specials in Boston? | ||
Yes, I did at the Wilbur Theatre. | ||
Have you ever considered just going back to Boston and doing like that Chinese restaurant? | ||
The Ding Ho? | ||
Yeah. | ||
That was before my time. | ||
When I came around, I started in 1988, the Ding Ho had already ended. | ||
But I got to be friends with guys like Lenny Clark and Tony V and Kenny Rogerson, all these people that told me about the Ding Ho, the comics that were... | ||
And Barry Crimmins, who just passed away, rest in peace, who was really the godfather of that whole scene. | ||
And one of the major reasons why that scene was so... | ||
It had such... | ||
Really some rock-solid morals and ethics when it came to the creation of material and standards. | ||
A lot of it was Barry Crimmins because that's just who he was as a person. | ||
But that was before my time. | ||
I came around like four years after that stuff. | ||
And so there was tons of comedy clubs. | ||
It was everywhere, but I just didn't get a chance to be around that Ding Ho scene. | ||
And you stayed in Boston for how many years before you moved out here? | ||
I was in New York by, I think, 91, 92. So I was in Boston for like four or five years doing stand-up. | ||
How'd you get from Boston to New York? | ||
My manager. | ||
Met my manager. | ||
I wanted to use this opportunity for you to talk. | ||
unidentified | About what? | |
Share about Connecticut. | ||
unidentified | Oh. | |
People from Connecticut get so upset at me. | ||
It's become a running theme to talk shit about Connecticut here. | ||
I've heard you do it once. | ||
I just had a buddy who did a gig in Connecticut. | ||
He told me it was fucking horrible. | ||
I go, I told you, bitch. | ||
You should have listened to me. | ||
Don't book gigs in Connecticut. | ||
The fuck's wrong with you? | ||
There's 49 other states. | ||
Go to Alaska. | ||
It's great. | ||
You go back to Boston and do small gigs? | ||
Sometimes, yeah. | ||
I'll do... | ||
Yeah, Laugh Boston is a great club. | ||
I used to do Nick Comedy Stop and all the other ones there. | ||
But I love the Wilbur. | ||
The Wilbur is a great place to perform. | ||
I love Boston. | ||
I would live there if it wasn't so fucking cold in the winter. | ||
But that's what keeps people like me out. | ||
It keeps the pussies away. | ||
Listen, we've got to end this. | ||
We've got to wrap it up. | ||
We've already done three hours, believe it or not. | ||
It flies by. | ||
It did. | ||
It flew by. | ||
Can I say two things? | ||
Sure, sure. | ||
So first, I've got to give a shout out to my... | ||
Shout out? | ||
Shout out. | ||
To a long, long time friend, Matt Harandi from Chicago. | ||
He's been there all along. | ||
He's a fan of the podcast, so he's probably listening. | ||
Him and his wife, Fadi, had a beautiful baby girl. | ||
So I want to send my love to him. | ||
And I told myself I'll end it this way. | ||
Okay. | ||
Let me end it the way Elon ended it. | ||
Love is the answer. | ||
Love is the answer. | ||
It probably is. | ||
Unless you're a robot. | ||
Bye! |