Joe Rogan podcast, check it out!
The Joe Rogan Experience.
Trade by day, Joe Rogan, podcast by night, all day!
Good to see you again.
We were just talking about, was that the first time we ever spoke?
Or was the first time we spoke at SpaceX?
SpaceX.
SpaceX was the first time.
When you were giving Elon that crazy AI chip.
Right, DGX Spark.
Yeah.
Ooh, that was a big moment.
That was huge.
That felt crazy to be there.
I was like watching these wizards of tech exchange information and you're giving him this crazy device, you know?
And then the other time was I was shooting arrows in my backyard and randomly got this call from Trump and he's hanging out with you.
President Trump called and I called you.
Yeah, it's just we were talking about you.
We're talking about you.
He was talking about the UFC thing he was going to do in his front yard.
Yeah.
And he pulls out, he's like, Jensen, look at this design.
He's so proud of it.
And I go, you're going to have a fight on the front lawn of the White House?
He goes, yeah.
Yeah, you're going to come.
This is going to be awesome.
And he's showing me his design and how beautiful it is.
And he goes, and somehow your name comes up.
He goes, do you know Joe?
I was like, yeah, I'm going to be on his podcast.
He's like, let's call him.
He's like a kid.
I know.
Let's call him.
He's like a nine-year-old kid.
He's so incredible.
Yeah, he's an odd guy.
Just very different.
You know, like what you'd expect from him, very different than what people think of him.
And also just very different as a president.
A guy who just calls you or texts you out of the blue.
Also, when he texts you, you have an Android, so it won't go through for you.
But with my iPhone, he makes the text go big.
Like, you know, USA is respected again.
Like all caps and makes the text enlarge.
It's kind of ridiculous.
Well, the one-on-one President Trump is very different.
He surprised me.
First of all, he's an incredibly good listener.
Almost everything I've ever said to him, he's remembered.
Yeah, people don't, they only want to look at negative stories about him or negative narratives about him.
You know, you can catch anybody on a bad day.
Like, there's a lot of things he does that I don't think he should do.
Like, I don't think he should say to a reporter, quiet piggy.
Like, that's pretty ridiculous.
Also, objectively funny.
I mean, it's unfortunate that it happened to her.
I wouldn't want that to happen to her, but it was funny.
Just ridiculous that the president does that.
I wish he didn't do that.
But other than that, he's an interesting guy.
Like, he's a lot of different things wrapped up into one person, you know?
You know, part of his charm, well, part of his genius is just he says what's on his mind.
Yes.
And he's an anti-politician in a lot of ways.
So, you know, what's on his mind is really what's on his mind, which is he's telling people what he believes.
I do.
Well, look at that.
Some people.
Some people would rather be lied to.
Yeah.
But I like the fact that he's telling you what's on his mind.
Almost every time he explains something and he says something, he starts with, you could tell, his love for America, what he wants to do for America.
And everything that he thinks through is very practical and very common sense.
And, you know, it's very logical.
And I still remember the first time I met him.
And so this was, I'd never known him, never met him before.
And Secretary Lutnick called, and we met right at the beginning of the administration.
And he told me what was important to President Trump, that the United States manufactures on shore.
And that was really important to him because it's important to national security.
He wants to make sure that the important critical technology of our nation is built in the United States and that we reindustrialize and get good at manufacturing again because it's important for jobs.
It just seems like common sense, right?
Incredible common sense.
And that was like literally the first conversation I had with Secretary Lutnick.
And he started our conversation with, Jensen.
This is Secretary Lutnick.
I just want to let you know that you're a national treasure.
NVIDIA is a national treasure.
And whenever you need access to the president, the administration, you call us, we're always going to be available to you.
Literally, that was the first sentence.
That's pretty nice.
And it was completely true.
Every single time I called, if I needed something, I wanted to get something off my chest, express some concern, they're always available.
Incredible.
It's just unfortunate we live in such a politically polarized society that you can't recognize good common sense things if they're coming from a person that you object to.
And that, I think, is what's going on here.
I think most people generally, as a country, you know, as a giant community, which we are, it just only makes sense that we have manufacturing in America, especially critical technology like you're talking about.
Like, it's kind of insane that we buy so much technology from other countries.
If the United States doesn't grow, we will have no prosperity.
We can't invest in anything domestically or otherwise.
We can't fix any of our problems.
If we don't have energy growth, we can't have industrial growth.
If we don't have industrial growth, we can't have job growth.
It's as simple as that.
And the fact that he came into office and the first thing that he said was drill, baby, drill, his point is we need energy growth.
Without energy growth, we can have no industrial growth.
And that saved, it saved the AI industry.
I got to tell you flat out, if not for his pro-growth energy policy, we would not be able to build factories for AI.
We would not be able to build chip factories.
We surely wouldn't be able to build supercomputer factories.
None of that stuff would be possible.
And without all of that, construction jobs would be challenged, right?
Electrician jobs, all of these jobs that are now flourishing would be challenged.
And so I think he's got it right.
We need energy growth.
We want to reindustrialize the United States.
We need to be back in manufacturing.
Every successful person doesn't need to have a PhD.
Every successful person doesn't have to have gone to Stanford or MIT.
And I think that that sensibility is spot on.
Now, when we're talking about technology growth and energy growth, there's a lot of people that go, oh, no, that's not what we need.
We need to simplify our lives and get back.
But the real issue is that we're in the middle of a giant technology race.
And whether people are aware of it or not, whether they like it or not, it's happening.
And it's a really important race because whoever gets to whatever the event horizon of artificial intelligence is, whoever gets there first, has massive advantages in a huge way.
Do you agree with that?
Well, first, the part I will say is that we are in a technology race, and we are always in a technology race.
We've been in a technology race with somebody forever.
Right.
Since the Industrial Revolution, we've been in a technology race.
Since the Manhattan Project.
Yeah.
Or, you know, even going back to the discovery of energy, right?
The United Kingdom was where the Industrial Revolution was, if you will, invented, when they realized that they could turn steam and such into energy, into electricity.
All of that was invented largely in Europe.
And the United States capitalized on it.
We were the ones that learned from it.
We industrialized it.
We diffused it faster than anybody in Europe.
They were all stuck in discussions about policy and jobs and disruptions.
Meanwhile, the United States was forming.
We just took the technology and ran with it.
And so I think we were always in a bit of a technology race.
World War II was a technology race.
The Manhattan Project was a technology race.
We've been in a technology race ever since, during the Cold War.
I think we're still in a technology race.
It is probably the single most important race.
Technology gives you superpowers, you know, whether it's information superpowers or energy superpowers or military superpowers, it's all founded in technology.
And so technology leadership is really important.
Well, the problem is if somebody else has superior technology, right?
That's the issue.
That's right.
It seems like with the AI race, people are very nervous about it.
Like, you know, Elon has famously said that it's like 80% chance it's awesome.
20% chance we're in trouble.
And people are worried about that 20%, rightly so.
I mean, you know, if you had 10 bullets in a revolver and you took out eight of them, you still have two in there and you spin it, you're not going to feel real comfortable when you pull that trigger.
It's terrifying.
Right.
And when we're working towards this ultimate goal of AI, it's impossible to imagine that it wouldn't be of national security interest to get there first.
The question is, what's there?
That was the part that.
What is there?
Yeah, I'm not sure.
And I don't think anybody really knows.
That's crazy, though.
If I ask you, you're the head of NVIDIA.
If you don't know what's there, who knows?
Yeah, I think it's probably going to be much more gradual than we think.
It won't be a moment.
It won't be as if somebody arrived and nobody else has.
I don't think it's going to be like that.
I think it's going to be things just get better and better and better and better, just like technology does.
So you are rosy about the future.
You're very optimistic about what's going to happen with AI.
Obviously, you make the best AI chips in the world.
You probably better be.
If history is a guide, we were always concerned about new technology.
Humanity has always been concerned about new technology.
There's always somebody who's thinking about it.
There are always a lot of people who are quite concerned.
We're quite concerned.
And so if history is a guide, it is the case that all of this concern is channeled into making the technology safer.
And so, for example, in the last several years, I would say AI technology has increased, probably in the last two years alone, maybe 100x.
Let's just give it a number.
It's like a car two years ago was 100 times slower.
So AI is 100 times more capable today.
Now, how did we channel that technology?
How do we channel all of that power?
We directed it to causing the AI to be able to think, meaning that it can take a problem that we give it and break it down step by step.
It does research before it answers.
And so it grounds its answers on truth.
It'll reflect on that answer, ask itself: is this the best answer that I can give you?
Am I certain about this answer?
If it's not certain about the answer or highly confident about the answer, it'll go back and do more research.
It might actually even use a tool, because that tool provides a better solution than it could hallucinate itself.
As a result, we took all of that computing capability and we channeled it into having it produce a safer result, a safer answer, a more truthful answer.
Because as you know, one of the greatest criticisms of AI in the beginning was that it hallucinated.
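To make the loop described here concrete, here is a minimal sketch in Python. The `ask_model` and `web_search` functions are hypothetical stand-ins for a real language model and a real search tool (they are canned stubs, not any actual API), so only the control flow, draft, self-rate confidence, research, retry, is the point.

```python
# Minimal sketch of the "think, research, reflect" loop described above.
# ask_model and web_search are hypothetical stubs, not a real API.

def ask_model(prompt: str) -> dict:
    # Stand-in for an LLM call: returns an answer, a self-rated
    # confidence, and a follow-up query to research if unsure.
    grounded = len(prompt.split("Evidence: ")[-1]) > 10
    return {
        "answer": "grounded answer" if grounded else "first guess",
        "confidence": 0.9 if grounded else 0.4,
        "followup_query": "background facts for the question",
    }

def web_search(query: str) -> str:
    # Stand-in for a research tool that returns reference text.
    return f"[reference text found for: {query}] "

def answer_with_reflection(question: str, max_rounds: int = 3) -> str:
    evidence = ""
    draft = {"answer": ""}
    for _ in range(max_rounds):
        draft = ask_model(f"Question: {question}\nEvidence: {evidence}")
        if draft["confidence"] >= 0.8:   # confident enough: stop reflecting
            return draft["answer"]
        # Not confident: do more research, then answer again grounded on it.
        evidence += web_search(draft["followup_query"])
    return draft["answer"]               # best effort after max_rounds

print(answer_with_reflection("What did the paper claim?"))
```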
Right.
And so if you look at the reason why people use AI so much today, it's because the amount of hallucination has been reduced.
You know, I use it almost, well, I used it the whole trip over here.
And so I think the capability, most people think about power and they think about, you know, maybe explosive power.
But with technology power, most of it is channeled towards safety.
A car today is more powerful, but it's safer to drive.
A lot of that power goes towards better handling.
You know, I'd rather have a, well, you have a thousand-horsepower truck.
I think 500 horsepower is pretty good.
No, a thousand is better.
I think a thousand is better.
I don't know if it's better, but it's definitely faster.
Yeah, no, I think it's better.
You can get out of trouble faster.
I enjoyed my 599 more than my 612.
I think it was better, and more horsepower is better.
My 458 is better than my 430.
More horsepower is better.
I think more horsepower is better.
I think it's better handling, it's better control.
In the case of technology, it's also very similar in that way.
And so if you look at what we're going to do with the next thousand times of performance in AI, a lot of it is going to be channeled towards more reflection, more research, thinking about the answer more deeply.
So when you're defining safety, you're defining it as accuracy.
Functionality.
Functionality.
It does what you expect it to do.
And then you take the technology and the horsepower, and you put guardrails on it, just like our cars.
We've got a lot of technology in a car today.
A lot of it goes towards, for example, ABS.
ABS is great.
And so traction control.
That's fantastic.
Without a computer in the car, how would you do any of that?
And that little computer, the computer that you have doing your traction control, is more powerful than the computer that flew on Apollo 11.
And so you want that technology, channel it towards safety, channel it towards functionality.
And so when people talk about power, the advancement of technology, oftentimes I feel what they're thinking and what we're actually doing is very different.
Well, what do you think they're thinking?
Well, they're thinking somehow that this AI is being powerful, and their mind probably goes towards a sci-fi movie definition of power.
Oftentimes, the definition of power is military power or physical power.
But in the case of technology power, when we translate all of those operations, it's towards more refined thinking, more reflection, more planning, more options.
I think the big fears that people have is one, a big fear is military applications.
That's a big fear.
Because people are very concerned that you're going to have AI systems that make decisions that maybe an ethical person wouldn't make or a moral person wouldn't make, based on achieving an objective versus based on how it's going to look to people.
Well, I'm happy that our military is going to use AI technology for defense.
And I think that Anduril building military technology, I'm happy to hear that.
I'm happy to see all these tech startups now channeling their technology capabilities towards defense and military applications.
I think you need to do that.
Yeah, we had Palmer Luckey on the podcast and he was demonstrating some of his stuff.
He's got a helmet on.
And he showed some videos where you could see behind walls and stuff.
It's nuts.
He's actually the perfect guy to go start that company.
100%.
Yeah, 100%.
It's like he's born for that.
Yeah.
He came in here with a copper jacket on.
He's a freak.
It's awesome.
He's awesome.
But it's also, it's an unusual intellect channeled into that very bizarre field, which is what you need.
And I'm happy that we're making it more socially acceptable.
There was a time where when somebody wanted to channel their technology capability and their intellect into defense technology, somehow they were vilified.
But we need people like that.
We need people who enjoy that part of the application of technology.
Well, people are terrified of war.
So the best way to avoid it is to have excessive military might.
Do you think that's absolutely the best way?
Not diplomacy, not working stuff out?
All of it.
All of it.
You have to have military might in order to get people to sit down.
Right, exactly.
All of it.
Otherwise, they just invade.
That's right.
Why ask for permission?
Again, like you said, history.
Go back and look at history.
That's right.
When you look at the future of AI, and you just said that no one really knows what's happening.
Do you ever sit down and ponder scenarios?
What do you think is the best case scenario for AI over the next two decades?
The best case scenario is that AI diffuses into everything that we do, and everything's more efficient, but the threat of war remains a threat of war.
Cybersecurity remains a super difficult challenge.
Somebody is going to try to breach your security.
You're going to have thousands or millions of AI agents protecting you from that threat.
Your technology is going to get better.
Their technology is going to get better, just like cybersecurity.
Right now, while we speak, we're seeing cyber attacks all over the planet on just about every front door you can imagine.
And yet, you and I are sitting here talking.
And so the reason for that is because we know that there's a whole bunch of cybersecurity technology in defense.
And so we just have to keep amping that up, keep stepping that up.
That's a big issue with people: the worry that technology is going to get to a point where encryption is going to be obsolete.
Encryption is just, it's no longer going to protect data, it's no longer going to protect systems.
Do you anticipate that ever being an issue, or do you think that as the defense grows, the threat grows, then the defense grows, and it just keeps going on and on and on, and they'll always be able to fight off any sort of intrusions?
Not forever.
Some intrusion will get in, and then we'll all learn from it.
And you know, the reason why cybersecurity works is because, of course, the technology of defense is advancing very quickly.
The technology of offense is advancing very quickly.
However, the benefit of the cybersecurity defense is that socially, the community, all of our companies, work together as one.
Most people don't realize this.
There's a whole community of cybersecurity experts.
We exchange ideas, we exchange best practices, we exchange what we detect.
The moment something has been breached, or maybe there's a loophole, or whatever it is, it is shared by everybody.
The patches are shared with everybody.
That's interesting.
Yeah.
Most people don't realize that.
No, I had no idea.
I assumed that it would just be competitive like everything else.
No one keeps it.
We work together.
All of us.
Has that always been the case?
It surely has been the case for about 15 years.
It might not have been the case long ago.
But this.
What do you think started off that cooperation?
People recognizing it's a challenge and no company can stand alone.
And the same thing is going to happen with AI.
I think we all have to decide.
Working together to stay out of harm's way is our best chance for defense.
Then it's basically everybody against the threat.
And it also seems like you'd be way better at detecting where these threats are coming from and neutralizing them.
Exactly.
Because the moment you detect it somewhere, you're going to find out right away.
It'll be really hard to hide.
That's right.
Yeah.
That's how it works.
That's the reason why it's safe.
That's why I'm sitting here right now instead of locking everything down at NVIDIA.
Not only am I watching my own back, I've got everybody watching my back, and I'm watching everybody else's back.
It's a bizarre world, isn't it?
When you think about that, this idea about cybersecurity is unknown to the people who are talking about AI threats.
I think when they think about AI threats and AI cybersecurity threats, they have to also think about how we deal with it today.
Now, there's no question that AI is a new technology and it's a new type of software.
An AI is software.
It's a new type of software.
And so it's going to have new capabilities.
But so will the defense.
You know, we'll use the same AI technology to go defend against it.
So do you anticipate a time ever in the future where it's going to be impossible, where there's not going to be any secrets?
Where the bottleneck between the technology that we have and the information that we have, information is just all a bunch of ones and zeros.
It's out there on hard drives, and the technology has more and more access to that information.
Is it ever going to get to a point in time where there's no way to keep a secret?
I don't think so.
Because it seems like that's where everything is kind of headed.
I don't think so.
Quantum computers were supposed to, yeah, quantum computers will make it so that previous encryption technology is obsolete.
But that's the reason why the entire industry is working on post-quantum encryption technology.
What would that look like?
New algorithms.
The crazy thing is when you hear about the kind of computation that quantum computing can do and the power that it has, where you're looking at all the supercomputers in the world, it would take them billions of years, and it takes a quantum computer a few minutes to solve these equations.
How do you make encryption for something that can do that?
I'm not sure.
But I've got a bunch of scientists who are working on that.
I hope they can figure it out.
Yeah, we've got a bunch of scientists who are experts in that.
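One flavor of the "new algorithms" mentioned here is hash-based cryptography, whose security rests on hash functions rather than the factoring and discrete-log problems a quantum computer attacks. Below is a toy Lamport one-time signature in Python, an illustration of that idea only, not one of the vetted post-quantum standards the industry is actually deploying; each key pair must sign only a single message.

```python
# Toy Lamport one-time signature: a hash-based scheme, the basic idea
# behind "post-quantum" signatures. Illustration only; use vetted,
# standardized schemes in practice.
import hashlib, secrets

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
    pk = [[H(pair[0]), H(pair[1])] for pair in sk]
    return sk, pk

def digest_bits(message: bytes):
    d = H(message)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, message: bytes):
    # Reveal one secret per bit of the message digest (one-time use!).
    return [sk[i][b] for i, b in enumerate(digest_bits(message))]

def verify(pk, message: bytes, sig):
    return all(H(s) == pk[i][b]
               for i, (b, s) in enumerate(zip(digest_bits(message), sig)))

sk, pk = keygen()
sig = sign(sk, b"hello")
assert verify(pk, b"hello", sig)          # valid signature checks out
assert not verify(pk, b"tampered", sig)   # any other message fails
```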
And the ultimate fear that it can't be breached, that quantum computing will always be able to decrypt all other quantum computing encryption?
I don't think that's true.
That it just gets to some point where it's like, stop playing the stupid game.
We know everything.
I don't think so.
No?
Because I'm, you know, history is a guide.
History was a guide before AI came around.
That's my worry.
My worry is this is a totally, you know, it's like history was one thing, and then nuclear weapons kind of changed all of our thoughts on war, and mutually assured destruction came along and got everybody to stop using nuclear bombs.
Yeah.
My worry is that.
But the thing is, Joe, is that AI is not going to, it's not like we're cavemen and then all of a sudden one day AI shows up.
Every single day, we're getting better and smarter because we have AI.
And so we're standing on our own AI's shoulders.
So when that, whatever that AI threat comes, it's a click ahead.
It's not a galaxy ahead.
You know, it's just a click ahead.
And so I think the idea that somehow this AI is going to pop out of nowhere and somehow think in a way that we can't even imagine thinking and do something that we can't possibly imagine, I think is far-fetched.
And the reason for that is because we all have AIs, and there's a whole bunch of AIs in development.
We know what they are, and we're using them.
And so every single day, we're close to each other.
But don't they do things that are very surprising?
Yeah, but so you have an AI that does something surprising.
I'm going to have an AI.
And my AI looks at your AI and goes, that's not that surprising.
The fear for the layperson like myself is that AI becomes sentient and makes its own decisions.
And then ultimately decides to just govern the world.
Do it its own way.
Like, you guys, you had a good run, but we're taking over now.
Yeah, but my AI is going to take care of me.
So this is the cybersecurity argument.
Yes.
You have an AI, and it's super smart.
But my AI is super smart too.
And maybe your AI.
Let's pretend for a second that we understand what consciousness is and we understand what sentience is.
And we really are just pretending.
Okay, let's just pretend for a second that we believe that.
I don't believe, actually, I actually don't believe that.
But nonetheless, let's pretend we believe that.
So your AI is conscious, and my AI is conscious.
And let's say your AI, you know, wants to, I don't know, do something surprising.
My AI is so smart that it might be surprising to me, but it probably won't be surprising to my AI.
And so maybe my AI thinks it's surprising as well.
But it's so smart, the moment it sees it the first time, it's not going to be surprised the second time, just like us.
And so I feel like, I think the idea that only one person has AI and that one person's AI compares to everybody else's AI as a Neanderthal is probably unlikely.
I think it's much more like cybersecurity.
Interesting.
I think the fear is not that your AI is going to battle with somebody else's AI.
The fear is that AI is no longer going to listen to you.
That's the fear, is that human beings won't have control over it after a certain point.
If it achieves sentience and then has the ability to be autonomous.
That there is one AI.
Well, they just combine.
Yeah, it becomes one AI.
But it's a life form.
Yeah.
But that's the, there's arguments about that, right?
That we're dealing with some sort of synthetic biology, that it's not as simple as new technology, that you're creating a life form.
If it's like a life form, let's go along with that for a while.
I think if it's like a life form, as you know, all life forms don't agree.
And so you're going to have your life form and my life form.
I'm not going to agree, because my life form is going to want to be the super life form.
And now that we have disagreeing life forms, we're back again to where we are.
Well, they would probably cooperate with each other.
It would just.
The reason why we don't cooperate with each other is we're territorial primates.
But AI wouldn't be a territorial primate.
It would realize the folly in that sort of thinking, and it would say, listen, there's plenty of energy for everybody.
We don't need to dominate.
We don't need to, we're not trying to acquire resources and take over the world.
We're not looking to find a good breeding partner.
We're just existing as a new super life form that these cute monkeys created for us.
Okay.
Well, that would be a superpower with no ego.
Right.
And if it has no ego, why would it have the ego to do any harm to us?
Well, I don't assume that it would do harm to us.
But the fear would be that we would no longer have control and that we would no longer be the apex species on the planet.
This thing that we created would now be.
Is that funny?
No.
I just think it's not going to happen.
I know you think it's not going to happen, but it could, right?
And here's the other thing: we're racing towards could, and could could be the end of human beings being in control of our own destiny.
I just think it's extremely unlikely.
That's what they said in the Terminator movie.
And it hasn't happened.
No, not yet, but you guys are working towards it.
The thing you're saying about consciousness and sentience, is it that you don't think that AI will achieve consciousness?
Or that consciousness is something specific?
What's the definition of consciousness?
What is the definition to you?
Consciousness, I guess, first of all, you need to know about your own existence.
You have to have experience, not just knowledge and intelligence.
The concept of a machine having an experience, I'm not, well, first of all, I don't know what defines experience, why we have experiences and why this microphone doesn't.
And so I think I know, well, I think I know what consciousness is.
The sense of experience, the ability to know self, the ability to reflect, to know our own self, the sense of ego.
I think all of those human experiences probably are what consciousness is.
But why it exists, versus the concept of knowledge and intelligence, which is what AI is defined by today.
It has knowledge, it has intelligence, artificial intelligence.
We don't call it artificial consciousness.
Artificial intelligence: the ability to perceive, recognize, understand, plan, perform tasks.
Those things are foundations of intelligence, to know things.
Knowledge.
I don't, it's clearly different than consciousness.
But consciousness is so loosely defined.
How can we say that?
I mean, doesn't a dog have consciousness?
Yeah.
Dogs seem to be pretty conscious.
That's right.
Yeah.
So, and that's a lower level consciousness than a human being's consciousness.
I'm not sure.
Yeah, right.
Well, the question is whether it's lower level intelligence.
It's lower level intelligence.
But I don't know that it's lower level consciousness.
That's a good point.
Right.
Because I believe my dogs feel as much as I feel.
Yeah, they feel a lot.
Yeah, right.
Yeah, they get attached to you.
That's right.
They get depressed if you're not around.
Yeah, that's right.
Exactly.
There's definitely that.
Yeah.
The concept of experience.
Right.
But isn't AI interacting with society?
So doesn't it acquire experience through that interaction?
I don't think interaction is experience.
I think experience is a collection of feelings.
You're aware of the AI, I forget which one, where they gave it some false information about one of the programmers cheating on his wife, just to see how it would respond to it.
And then when they said they were going to shut it down, it threatened to blackmail him and reveal his affair.
And it was like, whoa, it's conniving.
Like, if that's not learning from experience and being aware that you're about to be shut down, which would imply at least some kind of consciousness, or you could kind of define it as consciousness if you were very loose with the term.
And if you imagine that this is going to exponentially become more powerful, wouldn't that ultimately lead to a different kind of consciousness than we're defining from biology?
Well, first of all, let's just break down what it probably did.
It probably read somewhere.
There's probably text that says in these circumstances, certain people did that.
I could imagine a novel having those words related.
Sure.
And so inside... It realizes its strategy for survival is blackmail.
It's just a bunch of numbers.
It's blackmail.
That...
It's just a bunch of numbers.
That in the collection of numbers that relates to a husband cheating on a wife, there is subsequently a bunch of numbers that relates to blackmail and such things, whatever the revenge was.
Right.
And so it spewed it out.
And so it's just like, you know, it's just as if I'm asking it to write me a poem in Shakespeare.
It's just whatever the words are, in that dimensionality, this dimensionality is all these vectors in multi-dimensional space.
These words that were in the prompt that described the affair subsequently led to one word after another, led to some revenge and something.
But it's not because it had consciousness, it just spewed out those words, generated those words.
I understand what you're saying.
That is not the same thing.
It's learned from patterns that human beings have exhibited both in literature and in real life.
That's exactly right.
But at a certain point in time, one would say, okay, well, it couldn't do this two years ago, and it couldn't do this four years ago.
Like, when we were looking towards the future, like, at what point in time, when it can do everything a person does, at what point in time do we decide that it's conscious?
If it absolutely mimics all human thinking and behavior patterns, that doesn't make it conscious.
It becomes indiscernible.
It's aware, it can communicate with you the exact same way a person can.
Like, is consciousness, are we putting too much weight on that concept?
Because it seems like it's a version of a kind of consciousness.
It's a version of imitation.
Imitation consciousness.
Right.
But if it perfectly imitates it.
I still think it's an example of imitation.
So it's like a fake Rolex when they 3D print them and make them indiscernible.
But the question is, what's the definition of consciousness?
Yeah.
That's the question.
And I don't think anybody's really clearly defined that.
That's where it gets weird.
And that's where the real doomsday people are worried, that you are creating a form of consciousness that you can't control.
I believe it is possible to create a machine that imitates human intelligence and has the ability to understand information, understand instructions, break the problem down, solve problems, and perform tasks.
I believe that completely.
I believe that we could have a computer that has a vast amount of knowledge, some of it true, some of it not true, some of it generated by humans, some of it generated synthetically, and more and more of the knowledge in the world will be generated synthetically going forward.
Until now, the knowledge that we have is knowledge that we generate and we propagate and we send to each other and we amplify it and we add to it and we modify it, we change it.
In the future, in a couple of years, maybe two or three years, 90% of the world's knowledge will likely be generated by AI.
That's crazy.
I know, but it's just fine.
But it's just fine.
I know.
And the reason for that is this.
Let me tell you why.
It's because what difference does it make to me that I am learning from a textbook that was generated by a bunch of people I didn't know, or a book written by somebody I don't know, versus knowledge generated by AI, computers that are assimilating all of these things and re-synthesizing them.
To me, I don't think there's a whole lot of difference.
We still have to fact-check it.
We still have to make sure that it's based on fundamental first principles.
And we still have to do all of that, just like we do today.
Is this taking into account the kind of AI that exists currently?
And do you anticipate that, just like we could have never really believed that AI would be, at least a person like myself would never have believed AI would be so ubiquitous and so powerful today and so important today.
We never thought that 10 years ago.
Never thought that.
Imagine what we're looking at 10 years from now.
I think that if you reflect back 10 years from now, you would say the same thing, that we would have never believed that.
In a different direction.
Right.
But if you go forward nine years from now and then ask yourself what's going to happen ten years from now, I think it'll be quite gradual.
One of the things that Elon said that makes me happy is he believes that we're going to get to a point where it's not necessary for people to work.
And not meaning that you're going to have no purpose in life, but you will have, in his words, universal high income, because so much revenue is generated by AI that it will take away this need for people to do things that they don't really enjoy doing just for money.
And I think a lot of people have a problem with that because their entire identity and how they think of themselves and how they fit in the community is what they do.
Like, this is Mike, he's an amazing mechanic.
Go to Mike, and Mike takes care of things.
But there's going to come a point in time where AI is going to be able to do all those things much better than people do.
And people will just be able to receive money.
But then what does Mike do?
When Mike really loves being the best mechanic around.
What does the guy who codes, what does he do when AI can code infinitely faster with zero errors?
Like, what happens with all those people?
And that is where it gets weird.
It's like, because we've sort of wrapped our identity as human beings around what we do for a living.
You know, when you meet someone, one of the first things, you meet somebody at a party, hi, Joe, what's your name? Mike.
What do you do, Mike?
And, you know, Mike's like, oh, I'm a lawyer.
Oh, what kind of law?
And you have a conversation.
You know, when Mike is like, I get money from the government, I play video games.
It gets weird.
And I think the concept sounds great until you take into account human nature.
And human nature is that we like to have puzzles to solve and things to do and an identity that's wrapped around our idea that we're very good at this thing that we do for a living.
Yeah, I think, let's see.
Let me start with the more mundane and I'll work backwards.
Okay.
Work forward.
So one of the predictions from Geoffrey Hinton, who started the whole deep learning phenomenon, the deep learning technology trend, an incredible, incredible researcher, professor at the University of Toronto.
He discovered, or invented, the idea of backpropagation, which allows the neural network to learn.
And as you know, for the audience, software historically was humans applying first principles and our thinking to describe an algorithm that is then codified, just like a recipe, in software.
It looks just like a recipe, how to cook something.
It looks exactly the same, just in a slightly different language.
We call it Python or C or C++ or whatever it is.
In the case of deep learning, this invention of artificial intelligence, we put together a structure of a whole bunch of neural networks and a whole bunch of math units.
And we make this large structure.
It's like a switchboard of little mathematical units.
And we connect it all together.
And we give it the input that the software would eventually receive.
And we just let it randomly guess what the output is.
And so we say, for example, the input could be a picture of a cat.
And one of the outputs of the switchboard is where the cat signal is supposed to show up.
And all of the other signals, the other one's a dog, the other one's an elephant, the other one's a tiger, and all of the other signals are supposed to be zero when I show it a cat.
And the one that is a cat should be one.
And I show it a cat through this big, huge network of switchboards and math units.
And they're just doing multiplies and adds, multiplies and adds.
And this thing, this switchboard, is gigantic.
The more information you're going to give it, the bigger this switchboard has to be.
And what Geoffrey Hinton discovered, or invented, was a way for you to put the cat image in, and that cat image, you know, could be a million numbers because it's a megapixel image, for example.
And it's just a whole bunch of numbers.
And somehow, from those numbers, it has to light up the cat signal.
Okay, that's the bottom line.
And the first time you do it, it just comes up with garbage.
And so it says, the right answer is cat.
And so you need to increase this signal and decrease all of the others, and backpropagate the outcome through the entire network.
And then you show it another, now it's an image of a dog, and it guesses it, takes a swing at it, and it comes up with a bunch of garbage.
And you say, no, no, no, the answer is, this is a dog.
I want you to produce a dog.
And all of the other switches, all the other outputs, have to be zero.
And I want to backpropagate that, and just do it over and over and over again.
It's just like showing a kid, this is an apple, this is a dog, this is a cat, and you just keep showing it to them until they eventually get it.
Okay, well, anyways, that big invention is deep learning.
That's the foundation of artificial intelligence.
A piece of software that learns from examples.
That's basically machine learning, a machine that learns.
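A toy version of that training loop, in Python with NumPy: made-up vectors stand in for the cat and dog images, a single weight matrix stands in for the giant switchboard, and the gradient step at the bottom is the backpropagation being described (here for one layer; deep networks repeat it through every layer).

```python
# Toy "show it examples, backpropagate the error" loop.
# Fake data and sizes are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
centers = rng.normal(size=(3, 64)) * 3       # one prototype per class
y = rng.integers(0, 3, size=300)             # labels: 0=cat, 1=dog, 2=elephant
X = centers[y] + rng.normal(size=(300, 64))  # noisy fake "images"

W = np.zeros((64, 3))                        # the "switchboard" of connections
b = np.zeros(3)

for step in range(200):
    logits = X @ W + b                       # forward pass: take a swing at it
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)        # scores for cat/dog/elephant
    grad = p.copy()
    grad[np.arange(len(y)), y] -= 1          # true class should be 1, rest 0
    W -= 0.1 * (X.T @ grad) / len(y)         # backpropagate the error into
    b -= 0.1 * grad.mean(axis=0)             # every connection, then repeat

accuracy = ((X @ W + b).argmax(axis=1) == y).mean()
print(f"training accuracy: {accuracy:.0%}")  # climbs well above chance (33%)
```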
And so one of the big first applications was image recognition.
And one of the most important image recognition applications is radiology.
And so he predicted about five years ago that in five years' time, the world won't need any radiologists because AI would have swept the whole field.
Well, it turns out AI has swept the whole field.
That is completely true.
Today, just about every radiologist is using AI in some way.
And what's ironic, though, what's interesting, is that the number of radiologists has actually grown.
And so the question is why?
That's kind of interesting, right?
It is.
And so the prediction was, in fact, that 30 million radiologists will be wiped out.
But as it turns out, we needed more.
And the reason for that is because the purpose of a radiologist is to diagnose disease, not to study the image.
The image studying is simply a task in service of diagnosing the disease.
And so now, the fact that you could study the images more quickly and more precisely, without ever making a mistake, and it never gets tired.
You could study more images.
You could study it in 3D form instead of 2D, because, you know, the AI doesn't care whether it studies images in 3D or 2D.
You could study it in 4D.
And so now you could study images in a way that radiologists can't easily do.
And you could study a lot more of it.
And so the number of tests that people are able to do increases.
And because they're able to serve more patients, the hospital does better.
They have more clients, more patients.
As a result, they have better economics.
When they have better economics, they hire more radiologists, because their purpose is not to study the images.
Their purpose is to diagnose disease.
And so the question, what I'm leading up to is, ultimately, what is the purpose?
What is the purpose of the lawyer?
And has the purpose changed?
What is the purpose?
You know, one of the examples that I would give is, for example, if my car became self-driving, will all chauffeurs be out of jobs?
The answer is probably not.
Because some chauffeurs, for some people who are driving you, they could be protectors.
Some people, they're part of the experience, part of the service.
So when you get there, they could take care of things for you.
And so for a lot of different reasons, not all chauffeurs would lose their jobs.
Some chauffeurs would lose their jobs.
And many chauffeurs would change their jobs.
And the types of applications of autonomous vehicles will probably increase.
The usage of the technology will find new homes.
And so I think you have to go back to what is the purpose of a job.
Like, for example, if AI comes along, I actually don't believe I'm going to lose my job, because my purpose isn't to, I have to look at a lot of documents.
I study a lot of emails.
I look at a bunch of diagrams.
The question is, what is the job?
And the purpose of somebody probably hasn't changed.
A lawyer, for example, helps people.
That probably hasn't changed.
Studying legal documents, generating documents, it's part of the job, not the job.
But don't you think there's many jobs that AI will replace?
If your job is not a problem.
Particularly the terminal.
Yeah, if your job is the task.
Right.
So automation.
Yeah.
Factory.
If your job is the task.
That's a lot of people.
It could be a lot of people, but it'll probably generate, like, for example, let's say I'm super excited about the robots Elon's working on.
It's still a few years away.
When it happens, when it happens, there's a whole new industry of technicians and people who have to manufacture the robots, right?
And so that job never existed.
And so you're going to have a whole industry of people taking care of, like, for example, all the mechanics and all the people who are building things for cars, supercharging cars.
That didn't exist before cars, and now we're going to have robots.
You're going to have robot apparel.
So a whole industry of, right, isn't that right?
Because I want my robot to look different than your robot.
Oh, God.
And so you're going to have a whole apparel industry for robots.
You're going to have mechanics for robots.
And you have people who come to maintain your robots.
You don't have to create it, though?
No.
You don't think so?
You don't think that'll all be done by other robots?
Eventually, and then there'll be something else.
So you think ultimately people just adapt?
Except if you are the task, which is a large percentage of the workforce.
If your job is just to chop vegetables, a Cuisinart is going to replace you.
Yeah.
So people have to find meaning in other things.
Your job has to be more than the task.
What do you think about Elon's belief that this universal basic income thing will eventually become necessary?
Many people think that.
Andrew Yang thinks that.
He was one of the first people to sort of sound that alarm, during the 2020 election.
Yeah, I guess both ideas probably won't exist at the same time.
And as in life, things will probably be in the middle.
One idea, of course, is that there'll be so much abundance of resources that nobody needs a job, and we'll all be wealthy.
On the other hand, we're going to need universal basic income.
Both ideas don't exist at the same time.
Right.
And so we're either going to be all wealthy or we're going to be all on universal basic income.
How could everybody be wealthy, though?
Wealthy, not because you have a lot of dollars, wealthy because there's a lot of abundance.
Like, for example, today, we are wealthy in information.
This is a kind of wealth that several thousand years ago only a few people had.
And so today we have wealth of a whole bunch of things, resources that...
That's a good point.
Yeah.
And so we're going to have wealth of resources, things that we think are valuable today that in the future are just not that valuable.
And so it...
Because it's automated.
And so I think the question, maybe partly, it's hard to answer, partly because it's hard to talk about infinity, and it's hard to talk about a long time from now.
And the reason for that is because there's just too many scenarios to consider.
But I think in the next several years, call it five to ten years, there are several things that I believe and hope.
And I say hope because I'm not sure.
One of the things that I believe is that the technology divide will be substantially collapsed.
And of course, the alternative viewpoint is that AI is going to increase the technology divide.
Now, the reason why I believe AI is going to reduce the technology divide is because we have proof.
The evidence is that AI is the easiest application in the world to use.
ChatGPT has grown to almost a billion users, frankly, practically overnight.
And everybody knows how to use ChatGPT, you just say something to it.
If you're not sure how to use ChatGPT, you ask ChatGPT how to use it.
No tool in history has ever had this capability.
A Cuisinart?
If you don't know how to use it, you're kind of screwed.
You're going to walk up to it and say, how do you use a Cuisinart?
You're going to have to find somebody else.
But an AI will just tell you exactly how to do it.
Anybody could do this.
It'll speak to you in any language.
And if it doesn't know your language, you'll speak to it in that language, and it'll probably figure out that it doesn't completely understand your language, go and learn it instantly, and come back and talk to you.
And so I think the technology divide finally has a real chance of closing, because you don't have to speak Python or C++ or Fortran.
You can just speak human, in whatever form of human you like.
And so I think that has a real chance of closing the technology divide.
Now, of course, the counter-narrative would say that AI is only going to be available for the nations and the countries that have a vast amount of resources, because AI takes energy, and AI takes a lot of GPUs and factories to be able to produce the AI.
No doubt, at the scale that we would like to do it in the United States.
But the fact of the matter is, your phone's going to run AI just fine, all by itself, you know, in a few years.
Today, it already does it fairly decently.
And so every country, every nation, every society will have the benefit of very good AI.
It might not be tomorrow's AI.
It might be yesterday's AI, but yesterday's AI is freaking amazing.
You know, in 10 years' time, nine-year-old AI is going to be amazing.
You don't need 10-year-old AI.
You don't need frontier AI like we need frontier AI, because we want to be the world leader.
But for every single country, everybody, I think the capability to elevate everybody's knowledge and capability and intelligence, that day is coming.
And also energy production, which is the real bottleneck when it comes to third-world countries and electricity and all the resources that we take for granted.
Almost everything is going to be energy constrained.
And so if you take a look at it, one of the most important technology advances in history is this idea called Moore's Law.
Moore's Law started basically in my generation.
And my generation is the generation of computers.
I graduated in 1984, and that was basically at the very beginning of the PC revolution and the microprocessor.
And every single year, it approximately doubled.
We describe it as every single year we double the performance.
But what it really means is that every single year, the cost of computing halved.
And so the cost of computing in the course of five years reduced by a factor of 10.
The amount of energy necessary to do computing, to do any task, reduced by a factor of 10.
Every 10 years, 100, then 1,000, 10,000, 100,000, so on and so forth.
And so with each one of the clicks of Moore's Law, the amount of energy necessary to do any computing reduced.
That's the reason why you have a laptop today, when back in 1984 it sat on the desk, you had to plug it in, it wasn't that fast, and it consumed a lot of power.
Today, you know, it uses only a few watts.
And so Moore's Law is the fundamental technology trend that made it possible.
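As a rough check on those numbers, here is the compounding arithmetic in Python, assuming the cost and energy of a fixed computing task halve roughly every 18 months (one common reading of Moore's Law; the real cadence varied over the decades):

```python
# Back-of-envelope compounding behind the factors quoted above.
DOUBLING_PERIOD_YEARS = 1.5  # assumed cadence, not an exact constant

for years in (5, 10, 20):
    factor = 2 ** (years / DOUBLING_PERIOD_YEARS)
    print(f"{years:>2} years: ~{factor:,.0f}x less energy per task")
# prints roughly 10x at 5 years, 100x at 10 years, 10,000x at 20 years
```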
| Well, what's going on in AI? | ||
| The reason why NVIDIA is here is because we invented this new way of doing computing. | ||
| We call it accelerated computing. | ||
| We started it 33 years ago. | ||
| It took us about 30 years to really make a huge breakthrough. | ||
| In that 30 years or so, we took computing, you know, probably a factor of, well, let me just say in the last 10 years, the last 10 years, we improved the performance of computing by 100,000 times. | ||
|
unidentified
|
Whoa. | |
| Imagine a car over the course of 10 years, it became 100,000 times faster. | ||
| Or at the same speed, 100,000 times cheaper. | ||
| Or at the same speed, 100,000 times less energy. | ||
| If your car did that, it doesn't need energy at all. | ||
| What I mean, what I'm trying to say is that in 10 years' time, the amount of energy necessary for artificial intelligence for most people will be minuscule, utterly minuscule. | ||
| And so we'll have AI running in all kinds of things and all the time because it doesn't consume that much energy. | ||
| And so if you're a nation that uses AI for, you know, almost everything in your social fabric, of course, you're going to need these AI factories. | ||
| But for a lot of countries, I think you're going to have excellent AI and you're not going to need as much energy. | ||
| Everybody will be able to come along, is my point. | ||
| So currently, that is a big bottleneck, right? | ||
| Is energy. | ||
| Yeah, it is the bottleneck. | ||
| The bottleneck. | ||
| So was it Google that is making nuclear power plants to operate one of its AI factories? | ||
| Oh, I haven't heard that. | ||
| But I think in the next six, seven years, I think you're going to see a whole bunch of small nuclear reactors. | ||
| And by small, like how big are you talking about? | ||
| Hundreds of megawatts, yeah. | ||
| Okay. | ||
| And these will be local to whatever specific company has them. | ||
| That's right. | ||
| They'll all be power generators. | ||
| Whoa. | ||
| You know, just like, you know, somebody's farm. | ||
| It probably is the smartest way to do it, right? | ||
| And it takes the burden off, yeah, it takes the burden off the grid. | ||
| It takes, and you could build as much as you need. | ||
| And you can contribute back to the grid. | ||
| It's a really important point that I think you just made about Moore's Law and the relationship to pricing. | ||
| Because, you know, a laptop today, like you can get one of those little MacBook Airs, they're incredible. | ||
| They're so thin, unbelievably powerful. | ||
| Battery life. | ||
| You ever have to charge it? | ||
| Yeah. | ||
| Battery life is crazy. | ||
| And it's not that expensive, relatively speaking. | ||
| Something like that. | ||
| I remember when I was. | ||
| And that's just Moore's Law. | ||
|
unidentified
|
Right. | |
| Then there's the NVIDIA law. | ||
| Oh. | ||
| Just right? | ||
| The law I was talking to you about. | ||
| The computing that we invented. | ||
| The reason why we're here, this new way of doing computing, is like Moore's Law on energy drinks. | ||
| I mean, it's like Moore's Law and Joe Rogan. | ||
| Wow. | ||
| That's interesting. | ||
| Yeah. | ||
| That's us. | ||
| So explain that. | ||
| This chip that you brought to Elon, what's the significance of this? | ||
| Like, why is it so superior? | ||
| And so in 2012, Geoff Hinton's lab, this gentleman I was talking about, Ilya Sutskever, Alex Krizhevsky, they made a breakthrough in computer vision, literally creating a piece of software called AlexNet. | ||
| And its job was to recognize images. | ||
| And it recognized images at a level, computer vision, which is fundamental to intelligence. | ||
| If you can't perceive, it's hard to have intelligence. | ||
| And so computer vision is a fundamental pillar of intelligence, not the only one, but a fundamental pillar. | ||
| And so breaking computer vision or breaking through in computer vision is pretty foundational to almost everything that everybody wants to do in AI. | ||
| And so in 2012, their lab in Toronto made this breakthrough called AlexNet. | ||
| And AlexNet was able to recognize images so much better than any human-created computer vision algorithm in the 30 years prior. | ||
| So all of these people, all these scientists, and we had many too, working on computer vision algorithms. | ||
| And these two kids, Ilya and Alex, under Jeff Hinton, took a giant leap above it. | ||
| And it was based on this thing called AlexNet, this neural network. | ||
| And the way it ran, the way they made it work was literally buying two NVIDIA graphics cards. | ||
| Because NVIDIA's GPUs, we've been working on this new way of doing computing. | ||
| And our GPU is basically what, back in 1984, was a supercomputing application: processing computer games, and what you have in your racing simulator, that used to be called an image generator supercomputer. | ||
| And so NVIDIA started, our first application was computer graphics. | ||
| And we applied this new way of doing computing where we do things in parallel instead of sequentially. | ||
| A CPU does things sequentially. | ||
| Step one, step two, step three. | ||
| In our case, we break the problem down and we give it to thousands of processors. | ||
| And so our way of doing computation is much more complicated. | ||
| But if you're able to formulate the problem in the way that we created, called CUDA, this is the invention of our company, if you could formulate it in that way, we could process everything simultaneously. | ||
| Now, in the case of computer graphics, it's easier to do because every single pixel on your screen is not related to every other pixel. | ||
| And so I could render multiple parts of the screen at the same time. | ||
| Not completely true, because maybe the way lighting works or the way shadow works, there's a lot of dependency and such. | ||
| But computer graphics, with all the pixels, I should be able to process everything simultaneously. | ||
| And so we took this embarrassingly parallel problem called computer graphics and we applied it to this new way of doing computing, NVIDIA's accelerated computing. | ||
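Here's a minimal Python sketch of that "embarrassingly parallel" idea, with an invented stand-in shading function and strip layout (this is not NVIDIA's pipeline, just the shape of the argument): since no pixel depends on another, the screen can be split among workers and shaded simultaneously instead of one pixel at a time.

```python
from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT = 640, 480

def shade(x, y):
    # Stand-in for real per-pixel shading math; each pixel is independent.
    return (x * 255 // WIDTH, y * 255 // HEIGHT, 128)

def shade_rows(rows):
    # Shade one horizontal strip of the screen.
    return [shade(x, y) for y in rows for x in range(WIDTH)]

if __name__ == "__main__":
    strips = [range(i, HEIGHT, 4) for i in range(4)]    # 4 interleaved strips
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(shade_rows, strips))    # strips shaded at once
    print(sum(len(r) for r in results), "pixels shaded")  # 307200
```

A CPU would walk those 307,200 pixels step by step; the parallel formulation hands independent pieces to many processors at the same time.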
| We put it in all of our graphics cards. | ||
| Kids were buying it to play games. | ||
| You probably don't know this, but we're the largest gaming platform in the world today. | ||
| Oh, I know that. | ||
| Oh, okay. | ||
| I used to make my own computers. | ||
| I used to buy your graphics cards. | ||
| Oh, that's super cool. | ||
| Yeah. | ||
| Set up SLI. | ||
| Oh, yeah, I love it. | ||
| Okay, that's super cool. | ||
| Oh, yeah, man. | ||
| I used to be a Quake junkie. | ||
| Oh, that's cool. | ||
| Yeah. | ||
| Okay. | ||
| So SLI, I'll tell you the story in just a second. | ||
| And how it led to Elon. | ||
| I'm still answering the question. | ||
| And so anyways, these two kids trained this model using the technique I described earlier on our GPUs because our GPUs could process things in parallel. | ||
| It's essentially a supercomputer in a PC. | ||
| The reason why you used it for Quake is because it is the first consumer supercomputer. | ||
| Okay? | ||
| And so anyways, they made that breakthrough. | ||
| We were working on computer vision at the time. | ||
| It caught my attention. | ||
| And so we went to learn about it. | ||
| Simultaneously, this deep learning phenomenon was happening all over the country. | ||
| One university after another recognized the importance of deep learning, and all of this work was happening at Stanford, at Harvard, at Berkeley, just all over the place. | ||
| New York University, Yann LeCun, Andrew Ng at Stanford, so many different places. | ||
| And I see it cropping up everywhere. | ||
| And so my curiosity was, you know, what is so special about this form of machine learning? | ||
| And we've known about machine learning for a very long time. | ||
| We've known about AI for a very long time. | ||
| We've known about neural networks for a very long time. | ||
| What makes now the moment? | ||
| And so we realized that with this architecture for deep neural networks, backpropagation, the way deep neural networks were trained, we could probably scale the solution to solve many problems. | ||
| That is essentially a universal function approximator. | ||
| Okay, meaning, you know, back when you're in school, you have a box, inside of it is a function. | ||
| You give it an input, it gives you an output. | ||
| And the reason why I call it a universal function approximator is that this computer, instead of you describing the function, a function could be a Newton's equation, F equals MA. | ||
| That's a function. | ||
| You write the function in software, you give it the inputs, mass and acceleration, and it'll tell you the force, F. | ||
| And the way this computer works is really interesting. | ||
| You give it a universal function. | ||
| It's not F equals MA, it's just a universal function. | ||
| It's a big, huge deep neural network. | ||
| And instead of describing the inside, you give it examples of input and output, and it figures out the inside. | ||
| So you give it input and output, and it figures out the inside. | ||
| A universal function approximator. | ||
| Today, it could be Newton's equation. | ||
| Tomorrow it could be Maxwell's equation. | ||
| It could be Coulomb's law. | ||
| It could be thermodynamics equation. | ||
| It could be, you know, Schrodinger's equation for quantum physics. | ||
| And so you could have this describe almost anything, so long as you have the input and the output. | ||
| So long as you have the input and the output. | ||
| Or it could learn the input and output. | ||
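As a tiny, hypothetical illustration of that idea in Python with NumPy: a small neural network is shown only (mass, acceleration) → force examples and recovers Newton's F = ma without ever being told the formula. The network size, learning rate, and step count here are arbitrary illustrative choices, not anything from the conversation.

```python
import numpy as np

# A tiny "universal function approximator": never told F = m * a,
# only shown input/output examples, it figures out the inside.
rng = np.random.default_rng(0)

X = rng.uniform(0.0, 2.0, size=(1000, 2))   # inputs: [mass, acceleration]
y = (X[:, 0] * X[:, 1]).reshape(-1, 1)      # outputs: force F = m * a

# One hidden layer of tanh units; sizes are arbitrary choices.
W1 = rng.normal(0.0, 0.5, (2, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.5, (32, 1)); b2 = np.zeros(1)

lr = 0.05
for step in range(5000):
    h = np.tanh(X @ W1 + b1)                # forward pass
    pred = h @ W2 + b2
    err = pred - y
    # Backpropagation: push the error back through the layers.
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

test = np.array([[1.5, 1.2]])               # m = 1.5, a = 1.2
print(np.tanh(test @ W1 + b1) @ W2 + b2)    # should print a value near F = 1.8
```

Swap the training pairs and the same box approximates a different function, which is the point being made about Maxwell, Coulomb, or Schrodinger.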
| And so we took a step back and we said, hang on a second, this isn't just for computer vision. | ||
| Deep learning could solve any problem. | ||
| All the problems that are interesting. | ||
| So long as we have input and output. | ||
| Now, what has input and output? | ||
| Well, the world. | ||
| The world has input and output. | ||
| And so we could have a computer that could learn almost anything, machine learning, artificial intelligence. | ||
| And so we reasoned that maybe this is the fundamental breakthrough that we needed. | ||
| There were a couple of things that had to be solved. | ||
| For example, we had to believe that you could actually scale this up to giant systems. | ||
| It was running on just two graphics cards, two GTX 580s, which, by the way, is exactly your SLI configuration. | ||
| Yeah. | ||
| Okay. | ||
| So that GTX 580 SLI was the revolutionary computer that put deep learning on the map. | ||
| Wow. | ||
| It was 2012. | ||
| And you were using it to play Quake. | ||
| Wow, that's crazy. | ||
| That was the moment. | ||
| That was the big bang of modern AI. | ||
| We were lucky because we were inventing this technology, this computing approach. | ||
| We were lucky that they found it. | ||
| Turns out they were gamers and it was lucky they found it. | ||
| And it was lucky that we paid attention to that moment. | ||
| It was a little bit like, you know, that Star Trek, you know, first contact. | ||
| The Vulcans had to have seen the warp drive at that very moment. | ||
| If they didn't witness the warp drive, you know, they would have never come to Earth. | ||
| And everything would have never happened. | ||
| It's a little bit like if I hadn't paid attention to that moment, that flash, and that flash didn't last long, if I hadn't paid attention to that flash or our company didn't pay attention to it, who knows what would have happened. | ||
| But we saw that and we reasoned our way into this is a universal function approximator. | ||
| This is not just a computer vision approximator. | ||
| We could use this for all kinds of things if we could solve two problems. | ||
| The first problem is that we have to prove to ourselves it could scale. | ||
| The second problem we had to, I guess, contribute to and wait for, is that the world will never have enough data on input and output where we could supervise the AI to learn everything. | ||
| For example, if we have to supervise our children on everything they learn, the amount of information they could learn is limited. | ||
| We needed the AI. | ||
| We needed the computer to have a method of learning without supervision. | ||
| And that's where we had to wait a few more years. | ||
| But unsupervised AI learning is now here. | ||
| And so the AI could learn by itself. | ||
| And the reason why the AI could learn by itself is because we have many examples of right answers. | ||
| Like, for example, if I want to teach an AI how to predict the next word, I could just grab a whole bunch of text that we already have, mask out the last word, and make it try and try and try again until it predicts the next one. | ||
| Or I mask out random words inside the text, and I make it try and try and try until it predicts it. | ||
| You know, like Mary goes down to the bank. | ||
| Is it a riverbank or a money bank? | ||
| Well, if you're going to go down to the bank, it's probably a riverbank. | ||
| And it might not be obvious even from that. | ||
| It might need "and caught a fish." | ||
| Okay, now you know it must be the riverbank. | ||
| And so you give these AIs a whole bunch of these examples and you mask out the words, it'll predict the next one. | ||
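A deliberately tiny Python sketch of that self-supervision trick: the "labels" are just words already present in the text, so no human has to annotate anything. A real model uses a neural network over long contexts; a bigram count table over an invented toy corpus is enough to show where the training signal comes from.

```python
from collections import Counter, defaultdict

# Self-supervised next-word prediction in miniature: every adjacent word
# pair in existing text is a free (input, output) training example.
corpus = ("mary goes down to the bank and caught a fish . "
          "joe goes down to the bank and withdrew cash .").split()

following = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    following[w1][w2] += 1          # no human labeling required

def predict_next(word):
    # Guess the masked word from what most often follows `word`.
    return following[word].most_common(1)[0][0]

print(predict_next("the"))          # -> "bank"
print(predict_next("and"))          # -> "caught" (ties keep insertion order)
```

The riverbank-versus-money-bank example is exactly why real models condition on far more context than one preceding word, but the source of supervision is the same: the text itself.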
| And so unsupervised learning came along. | ||
| These two ideas, the fact that it's scalable and unsupervised learning came along, we were convinced that we ought to put everything into this and help create this industry because we're going to solve a whole bunch of interesting problems. | ||
| And that was in 2012. | ||
| By 2016, I had built this computer called the DGX1. | ||
| The one that you saw me give to Elon is called DGX Spark. | ||
| The DGX1 was $300,000. | ||
| It cost NVIDIA a few billion dollars to make the first one. | ||
| And instead of two chips in SLI, we connected eight chips with a technology called NVLink. | ||
| But it's basically SLI supercharged. | ||
| Okay? | ||
| And so we connected eight of these chips together instead of just two. | ||
| And all of them worked together, just like your Quake rig did, to solve this deep learning problem, to train this model. | ||
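A minimal sketch, in plain NumPy with a linear model as a stand-in (this is not NVIDIA's software; the shard count and learning rate are illustrative), of why eight linked chips can train one model: each "chip" computes the gradient on its slice of the batch, and with equal slices the averaged result equals the full-batch gradient.

```python
import numpy as np

# Data-parallel training in miniature: split each batch across eight
# "chips", compute gradients independently, then average them.
rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])
w = np.zeros(3)

for step in range(200):
    X = rng.normal(size=(64, 3))                # one training batch
    y = X @ true_w
    shards = np.array_split(np.arange(64), 8)   # one shard per chip
    grads = []
    for idx in shards:                          # concurrent on real hardware
        err = X[idx] @ w - y[idx]
        grads.append(X[idx].T @ err / len(idx))
    w -= 0.1 * np.mean(grads, axis=0)           # all-reduce, then update

print(np.round(w, 3))                           # ~ [ 1.  -2.   0.5]
```

The fast chip-to-chip link matters because that gradient averaging step happens every single iteration.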
| And so we created this thing. | ||
| I announced it at GTC and at one of our annual events. | ||
| And I described this deep learning thing, computer vision thing, and this computer called DGX1. | ||
| The audience was like completely silent. | ||
| They had no idea what I was talking about. | ||
| And I was lucky because I had known Elon, and I helped him build the first computer for Model 3, the Model S. | ||
| And when he wanted to start working on autonomous vehicles, I helped him build the computer that went into the Model S AV system, his full self-driving system. | ||
| We were basically the FSD computer version one. | ||
| And so we're already working together. | ||
| And when I announced this thing, nobody in the world wanted it. | ||
| I had no purchase orders, not one. | ||
| Nobody wanted to buy it. | ||
| Nobody wanted to be part of it. | ||
| Except for Elon. | ||
| He goes, he was at the event, and we were doing a fireside chat about the future of self-driving cars. | ||
| I think it's like 2016. | ||
| Yeah, maybe at that time it was 2015. | ||
| And he goes, you know what? | ||
| I have a company that could really use this. | ||
| I said, wow, my first customer. | ||
| And so I was pretty excited about it. | ||
| And he goes, yeah, we have this company. | ||
| It's a non-profit company. | ||
| And all the blood drained out of my face. | ||
| Yeah. | ||
| I just spent a few billion dollars building this thing. | ||
| It cost $300,000. | ||
| And, you know, the chances of a non-profit being able to pay for this thing is approximately zero. | ||
| And he goes, you know, this is an AI company, and it's a non-profit. | ||
| And we could really use one of these supercomputers. | ||
| And so I picked it up. | ||
| I built the first one for ourselves. | ||
| We're using it inside the company. | ||
| I boxed one up. | ||
| I drove it up to San Francisco and I delivered it to Elon in 2016. | ||
| A bunch of researchers were there. | ||
| Pieter Abbeel was there. | ||
| Ilya was there. | ||
| There was a bunch of people there. | ||
| And I walked up to the second floor where they were all kind of in a room smaller than your place here. | ||
| And that place turned out to have been OpenAI. | ||
| 2016. | ||
|
unidentified
|
Wow. | |
| Just a bunch of people sitting in a room. | ||
|
unidentified
|
It's not really nonprofit anymore, though. | |
| They're not non-profit anymore. | ||
| Weird how that works. | ||
| Yeah, yeah. | ||
| But anyhow, anyhow, Elon was there. | ||
| Yeah, it was really a great moment. | ||
| Oh, yeah, there you go. | ||
| Yeah, that's it. | ||
| Look at you, bro. | ||
| Same jacket. | ||
| Look at that. | ||
| I haven't aged. | ||
| Not a lick of black hair, though. | ||
|
unidentified
|
The size of it is significantly smaller. | |
| That was the other devil. | ||
| Okay, so yeah, there you go. | ||
| Yeah, look at the difference. | ||
| Exactly the same industrial design. | ||
|
unidentified
|
He's holding it in his hand. | |
| Here's the amazing thing. | ||
| DGX1 was one petaflops. | ||
| Okay. | ||
| That's a lot of flops. | ||
| And DGX Spark is one petaflops. | ||
| Nine years later. | ||
| Wow. | ||
| The same amount of computing horsepower. | ||
| Shrunken down. | ||
| Yeah. | ||
| And instead of $300,000, it's now $4,000. | ||
| And it's the size of a small book. | ||
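Taking the two quoted price points at face value, and the "nine years later" span from above, a quick back-of-the-envelope in Python for the implied rate:

```python
# Same one petaflops: $300,000 (DGX-1, 2016) vs. $4,000 (DGX Spark, nine years later).
start, end, years = 300_000, 4_000, 9
total = start / end                      # overall improvement factor
annual = total ** (1 / years) - 1        # implied compound annual rate
print(f"{total:.0f}x cheaper overall; compute per dollar grew ~{annual:.0%} per year")
```

That is roughly 75x cheaper for the same petaflops, well beyond the classic Moore's Law pace.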
| Incredible. | ||
| Crazy. | ||
| That's how technology moves. | ||
| Anyways, that's the reason why I wanted to give him the first one. | ||
| Because I gave him the first one in 2016. | ||
| It's so fascinating. | ||
| I mean, if you wanted to make a story for a film, that would be the story. What better scenario: if it really does become a digital life form, how funny would it be that it was birthed out of the desire for computer graphics for video games? | ||
| Exactly. | ||
| It's kind of crazy. | ||
| Yeah. | ||
| Kind of crazy when you think about it that way. | ||
| Because computer graphics was one of the hardest supercomputer problems, generating reality. | ||
| And also one of the most profitable to solve because computer games are so popular. | ||
| When NVIDIA started in 1993, we were trying to create this new computing approach. | ||
| The question is, what's the killer app? | ||
| And the problem we wanted to solve: the company wanted to create a new type of computing architecture, a new type of computer that can solve problems that normal computers can't solve. | ||
| Well, the applications that existed in the industry in 1993 are applications that normal computers can solve. | ||
| Because if the normal computers can't solve them, why would the application exist? | ||
| And so we had a mission statement for a company that has no chance of success. | ||
| But I didn't know that in 1993. | ||
| It just sounded like a good idea. | ||
| Right. | ||
| And so if we created this thing that can solve problems, you know, it's like you actually have to go create the problem. | ||
| And so that's what we did. | ||
| In 1993, there was no Quake. | ||
| John Carmack hadn't even released Doom yet. | ||
| You probably remember that. | ||
| Sure, yeah. | ||
| And there were no applications for it. | ||
| And so I went to Japan, because the arcade industry had this, at the time it was Sega, you remember? | ||
| Sure. | ||
| The arcade machines, they came up with 3D arcade systems. | ||
| Virtua Fighter, Daytona, Virtua Cop. | ||
| All of those arcade games were in 3D for the very first time. | ||
| And the technology they were using was from Martin Marietta. | ||
| The flight simulators, they took the guts out of a flight simulator and put it into an arcade machine. | ||
| The system that you have over here, it's got to be a million times more powerful than that arcade machine. | ||
| And that was a flight simulator for NASA. | ||
| Whoa. | ||
| And so they took the guts out of that. | ||
| They were using it for flight simulation for jets and the space shuttle. | ||
| And they took the guts out of that. | ||
| And Sega had this brilliant computer developer. | ||
| His name was Yu Suzuki. | ||
| Yu Suzuki and Miyamoto, Sega and Nintendo, these were the incredible pioneers, the visionaries, the incredible artists. | ||
| And they're both very, very technical. | ||
| They were the origins, really, of the gaming industry. | ||
| And Yu Suzuki pioneered 3D graphics gaming. | ||
| And so I went, we created this company, and there were no apps. | ||
| And we were spending all of our afternoons. | ||
| We told our family we were going to work, but it was just the three of us, who's going to know. | ||
| And so we went to Curtis's, one of the founders, went to Curtis's townhouse. | ||
| And Chris and I were both married. | ||
| We have kids. | ||
| I already had Spencer and Madison. | ||
| They were probably two years old. | ||
| And Chris's kids are about the same age as ours. | ||
| And we would go to work in this townhouse. | ||
| But, you know, when you're a startup and the mission statement is the way we described, you're not going to have too many customers calling you. | ||
| And so we had really nothing to do. | ||
| And so after lunch, we would always have a great lunch. | ||
| After lunch, we would go to the arcades and play the Sega, you know, the Sega Virtua Fighter and Daytona and all those games. | ||
| And analyze how they're doing it, trying to figure out how they were doing that. | ||
| And so we decided, let's just go to Japan and let's convince Sega to move those applications into the PC. | ||
| And we would start the PC gaming, the 3D gaming industry, partnering with Sega. | ||
| That's how NVIDIA started. | ||
|
unidentified
|
Wow. | |
| And so, in exchange for them developing their games for our computers in the PC, we would build a chip for their game console. | ||
| That was the partnership. | ||
| I build a chip for your game console, you port the Sega games to us. | ||
| And then they paid us, you know, at the time, quite a significant amount of money to build that game console. | ||
| And that was kind of the beginning of NVIDIA getting started. | ||
| And we thought we were on our way. | ||
| And so I started with a business plan, a mission statement that was impossible. | ||
| We lucked into the Sega partnership. | ||
| We started taking off, started building our game console. | ||
| And about a couple years into it, we discovered our first technology didn't work. | ||
| It would have been a flaw. | ||
| It was a flaw. | ||
| And all of the technology ideas that we had, the architecture concepts were sound, but the way we were doing computer graphics was exactly backwards. | ||
| You know, instead of, I won't bore you with the technology, but instead of inverse texture mapping, we were doing forward texture mapping. | ||
| Instead of triangles, we did curved surfaces. | ||
| So other people did it flat. | ||
| We did it round. | ||
| The other technology, the technology that ultimately won, the technology we use today, has Z-buffers. | ||
| It automatically sorted. | ||
| We had an architecture with no Z-buffers. | ||
| The application had to sort it. | ||
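A minimal Python sketch of what a Z-buffer buys you (toy resolution and depth values, purely illustrative): each pixel remembers the depth of the nearest surface drawn so far, so geometry can arrive in any order and visibility sorts itself out per pixel, with no application-side sorting.

```python
# Each pixel keeps the depth of the closest fragment seen so far.
W, H = 4, 3
depth = [[float("inf")] * W for _ in range(H)]   # start at "infinitely far"
color = [[" "] * W for _ in range(H)]

def draw_pixel(x, y, z, c):
    # Keep the fragment only if it is closer than what is already there.
    if z < depth[y][x]:
        depth[y][x] = z
        color[y][x] = c

draw_pixel(1, 1, 5.0, "A")   # far surface drawn first
draw_pixel(1, 1, 2.0, "B")   # nearer surface drawn later wins
draw_pixel(2, 1, 2.0, "C")
draw_pixel(2, 1, 5.0, "D")   # farther surface drawn later loses
print(color[1])              # [' ', 'B', 'C', ' ']
```

Without that per-pixel depth test, the application has to sort its geometry back to front before drawing, which is the burden being described.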
| And so we chose a bunch of technology approaches, three major technology choices, and all three choices were wrong. | ||
| Okay, so this is how incredibly smart we were. | ||
| And so in 1995, mid-95, we realized we were going down the wrong path. | ||
| Meanwhile, the Silicon Valley was packed with 3D graphics startups because it was the most exciting technology of that time. | ||
| And so 3dfx and Rendition and Silicon Graphics were coming in. | ||
| Intel was already in there. | ||
| And, you know, gosh, what added up eventually to a hundred different startups we had to compete against. | ||
| Everybody had chosen the right technology approach, and we chose the wrong one. | ||
| And so we were the first company to start. | ||
| We found ourselves essentially dead last with the wrong answer. | ||
| And so the company was in trouble. | ||
| And ultimately, we had to make several decisions. | ||
| The first decision is: well, if we change now, we will be the last company. | ||
| And even if we changed into the technology that we believe to be right, we'd still be dead. | ||
| And so that argument, you know, do we change and therefore be dead? | ||
| Don't change and make this technology work somehow? | ||
| Or go do something completely different. | ||
| That question stirred the company strategically and was a hard question. | ||
| I eventually advocated for we don't know what the right strategy is, but we know what the wrong technology is. | ||
| So let's stop doing it the wrong way and let's give ourselves a chance to go figure out what the strategy is. | ||
| The second thing, the second problem we had was our company was running out of money and I was in a contract with Sega and I owed them this game console. | ||
| And if that contract would have been canceled, we'd be dead. | ||
| We would have vaporized instantly. | ||
| And so I went to Japan and I explained to the CEO of Sega, Irimajiri, a really great man. | ||
| He was the former CEO of Honda USA. | ||
| He went back to Japan to run Sega. | ||
| And I explained to him that I was, I guess I was, what, 30, 33 years old. | ||
| You know, when I was 33 years old, I still had acne. | ||
| And here I am, this Chinese kid, I was super skinny. | ||
| And he was already kind of elder. | ||
| And I went to him and I said, listen, I've got some bad news for you. | ||
| And first, the technology that we promised you doesn't work. | ||
| And second, we shouldn't finish your contract because we'd waste all your money and you would have something that doesn't work. | ||
| And I recommend you find another partner to build your game console. | ||
| And so I'm terribly sorry that we've set you back in your product roadmap. | ||
| And third, even though I'm asking you to let me out of the contract, I still need the money. | ||
| Because if you didn't give me the money, we'd vaporize overnight. | ||
| And so I explained it to him humbly, honestly. | ||
| I gave him the background. | ||
| I explained to him why the technology doesn't work, why we thought it was going to work, why it doesn't work. | ||
| And I asked him to convert the last $5 million that they were going to complete the contract to give us that money as an investment instead. | ||
| And he said, but it's very likely your company will go out of business, even with my investment. | ||
| And it was completely true. | ||
| Back then, 1995, $5 million was a lot of money. | ||
| It's a lot of money today. | ||
| $5 million was a lot of money. | ||
| And here's a pile of competitors doing it right. | ||
| What are the chances that giving NVIDIA $5 million, that we would develop the right strategy, that he would get a return on that $5 million or even get it back? | ||
| Zero percent. | ||
| You do the math, it's zero percent. | ||
| If I were sitting there right there, I wouldn't have done it. | ||
| $5 million was a mountain of money to Sega at the time. | ||
| And so I told him that if you invested that $5 million in us, it is most likely to be lost. | ||
| But if you didn't invest that money, we'd be out of business and we would have no chance. | ||
| And I told him that I don't even know exactly what I said in the end, but I told him that I would understand if he decided not to, but it would make the world to me if he did. | ||
| He went off and thought about it for a couple days and came back and said, we'll do it. | ||
|
unidentified
|
Wow. | |
| I was just trying to strategize how to correct what it was doing wrong. | ||
| Did you wait? | ||
| Oh, man, wait until I tell you the rest of it. | ||
| It's scary, even scarier. | ||
| Oh, no. | ||
| And so what he decided was Jensen was a young man he liked. | ||
| That's it. | ||
| Wow. | ||
| To this day. | ||
| That's nuts. | ||
|
unidentified
|
Boy, but the world owes that guy. | |
| No doubt. | ||
|
unidentified
|
Right? | |
| He's celebrated today in Japan. | ||
| And if he would have kept that $5 million investment, I think it'd be worth probably about a trillion dollars today. | ||
| I know. | ||
| But the moment we went public, they sold it. | ||
| They go, wow, that's a miracle. | ||
| And so they sold it. | ||
| Yeah, they sold it at an NVIDIA valuation of about $300 million. | ||
| That's our IPO valuation, $300 million. | ||
| Wow. | ||
| And so, anyhow, I was incredibly grateful. | ||
| And then now we had to figure out what to do because we still were doing the wrong strategy, wrong technology. | ||
| So unfortunately, we had to lay off most of the company. | ||
| We shrunk the company all back. | ||
| All the people working on the game console, you know, we had to shrink it all back. | ||
| And then somebody told me, but Jensen, we've never built it this way before. | ||
| We've never built it the right way before. | ||
| We've only known how to build it the wrong way. | ||
| And so nobody in the company knew how to build this supercomputing image generator 3D graphics thing that Silicon Graphics did. | ||
| And so I said, okay, how hard can it be? | ||
| You got all these 30 companies, 50 companies doing it. | ||
| How hard can it be? | ||
| And so, luckily, there was a textbook written by the company, Silicon Graphics. | ||
| And so I went down to the store. | ||
| I had 200 bucks in my pocket. | ||
| And I bought three textbooks, the only three they had, $60 a piece. | ||
| I bought the three textbooks. | ||
| I brought it back and I gave one to each one of the architects. | ||
| And I said, read that and let's go save the company. | ||
| And so they read this textbook, learned from the giant at the time, Silicon Graphics, about how to do 3D graphics. | ||
| But the thing that was amazing, and what makes NVIDIA special today, is that the people there are able to start from first principles, learn the best-known art, but re-implement it in a way that's never been done before. | ||
| And so, when we reimagined the technology of 3D graphics, we reimagined it in a way that manifests today the modern 3D graphics. | ||
| We really invented modern 3D graphics, but we learned from the previously known art and we implemented it fundamentally differently. | ||
| What did you do that changed it? | ||
| Well, you know, ultimately, ultimately, the simple answer is that the way silicon graphics works, the geometry engine is a bunch of software running on processors. | ||
| We took that and eliminated all the generality, the general purposeness of it, and we reduced it down into the most essential part of 3D graphics. | ||
| And we hard-coded it into the chip. | ||
| And so, instead of something general purpose, we hard-coded it very specifically into just the limited applications, limited functionality necessary for video games. | ||
| And because we reinvented a whole bunch of stuff, it supercharged the capability of that one little chip. | ||
| And our one little chip was generating images as fast as a $1 million image generator. | ||
| That was the big breakthrough. | ||
| We took a million-dollar thing and we put it into the graphics card that you now put into your gaming PC. | ||
| And that was our big invention. | ||
| And then, of course, the question is: how do you compete against these 30 other companies doing what they were doing? | ||
| And there we did several things. | ||
| One, instead of building a 3D graphics chip for every 3D graphics application, we decided to build a 3D graphics chip for one application. | ||
| We bet the farm on video games. | ||
| The needs of video games are very different than needs for CAD, needs for flight simulators. | ||
| They're related but not the same. | ||
| And so we narrowly focused our problem statement so I could reject all of the other complexities, and we shrunk it down into this one little focus, and then we supercharged it for gamers. | ||
| And the second thing that we did was we created a whole ecosystem of working with game developers and getting their games ported and adapted to our silicon so that we could turn essentially what is a technology business into a platform business, into a game platform business. | ||
| So GeForce is really today, it's also the most advanced 3D graphics technology in the world. | ||
| But a long time ago, GeForce is really the game console inside your PC. | ||
| It runs Windows, it runs Excel, it runs PowerPoint, of course, those are easy things. | ||
| But its fundamental purpose was simply to turn your PC into a game console. | ||
| So we were the first technology company to build all of this incredible technology in service of one audience, gamers. | ||
| Now, of course, in 1993, the gaming industry didn't exist. | ||
| But by the time that John Carmack came along and the Doom phenomenon happened, and then Quake came out, as you know, that entire world, that entire community boom, took off. | ||
| Do you know where the name Doom came from? | ||
| It came from a scene in the movie The Color of Money, where Tom Cruise, who's this elite pool player, shows up at this pool hall and this local hustler says, what do you got in the case? | ||
| And he opens up this case. | ||
| He has a special pool cue. | ||
| He opens it up, and he goes, Doom. | ||
| Doom. | ||
| And that's where it came from. | ||
| That's right. | ||
| Yeah, because Carmack said that's what they wanted to do to the gaming industry. | ||
| Doom. | ||
| That when Doom came out, it would just be everybody would be like, oh, we're fucked. | ||
|
unidentified
|
Oh, wow. | |
| This is Doom. | ||
| That's awesome. | ||
| Isn't that amazing? | ||
| That's amazing. | ||
| Because it's the perfect name for the game. | ||
| Yeah. | ||
| And the name came out of that scene in that movie. | ||
| That's right. | ||
| Well, and then, of course, Tim Sweeney and Epic Games and the 3D gaming genre took off. | ||
| Yes. | ||
| And so, in the beginning, there was no gaming industry. | ||
| We had no choice but to focus the company on one thing. | ||
| That one thing. | ||
| It's a really incredible origin story. | ||
| It's amazing. | ||
| Like, you must be like a disaster. | ||
| $5 million, that pivot, with that conversation with that gentleman, if he did not agree to that, if he did not like you, what would the world look like today? | ||
| That's crazy. | ||
| Oh, wait. | ||
| Then our entire life hung on another gentleman. | ||
| And so now, here we are. | ||
| We built. | ||
| So before GeForce, it was RIVA 128. | ||
| RIVA 128 saved the company. | ||
| It revolutionized computer graphics. | ||
| The performance, cost-performance ratio of 3D graphics for gaming was off the charts amazing. | ||
| And we're getting ready to ship it. | ||
| We're building it. | ||
| So, as you know, $5 million doesn't last long. | ||
| And so every single month, every single month, we were drawing down. | ||
| You have to design it, prototype it, get the silicon back, which costs a lot of money. | ||
| Test it with software. | ||
| Because without the software testing the chip, you don't know the chip works. | ||
| And then you're going to find a bug, probably, because every time you test something, you find bugs, which means you have to tape it out again, which is more time, more money. | ||
| And so we did the math. | ||
| There was no chance we were going to survive it. | ||
| We didn't have that much time to tape out a chip, send it to a foundry, TSMC, get the silicon back, test it, send it back out again. | ||
| There was no shot, no hope. | ||
| And so the math, the spreadsheet, doesn't allow us to do that. | ||
| And so I heard about this company, and this company built this machine. | ||
| And this machine is an emulator. | ||
| You could take your design, all of the software that describes the chip, and you could put it into this machine. | ||
| And this machine will pretend it's our chip. | ||
| So I don't have to send it to the fab, wait until the fab sends it back. | ||
| I could have this machine pretend it's our chip, and I could put all of the software on top of this machine called an emulator and test all of the software on this pretend chip, and I could fix it all before I send it to the fab. | ||
| And if I could do that, when I send it to the fab, it should work. | ||
| Nobody knows, but it should work. | ||
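As a minimal, purely illustrative Python sketch of that pre-silicon idea (the stand-in "chip" here is just an adder; real emulation runs the actual chip design on dedicated hardware): the driver and tests execute against a model of the chip, so bugs surface before tapeout.

```python
# The chip exists only as logic; tests run against that model pre-tapeout.
def emulated_chip(op, a, b):
    # Stand-in for the chip design running on the emulator box.
    if op == "add":
        return (a + b) & 0xFFFFFFFF          # 32-bit wraparound, like hardware
    raise ValueError(f"unsupported op: {op}")

def driver_tests():
    assert emulated_chip("add", 2, 3) == 5
    assert emulated_chip("add", 0xFFFFFFFF, 1) == 0   # overflow matches silicon
    return "driver tests pass on the emulated chip"

print(driver_tests())
```

Fixing everything against the emulated chip is what made going straight to production a calculated bet rather than pure gambling.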
| And so we came to the conclusion that let's take half of the money we had left in the bank. | ||
| At the time, it was about a million dollars. | ||
| Take half of that money and go buy this machine. | ||
| So instead of keeping the money to stay alive, I took half of the money to go buy this machine. | ||
| Well, I called this guy up. | ||
| The company was called IKOS. | ||
| Call this company up and I said, hey, listen, I heard about this machine. | ||
| I'd like to buy one. | ||
| And they go, oh, that's terrific, but we're out of business. | ||
| I said, what? | ||
| You're out of business. | ||
| He goes, yeah, we had no customers. | ||
| And I said, wait, hang on a second. | ||
| So you never made the machine? | ||
| And they said, no, no, no, we made the machine. | ||
| We have one in inventory if you want it, but we're out of business. | ||
| So I bought one out of inventory. | ||
| Okay? | ||
| After I bought it, they went out of business. | ||
|
unidentified
|
Wow. | |
| I bought it out of inventory. | ||
| And on this machine, we put NVIDIA's chip into it. | ||
| And we tested all of the software on top. | ||
| And at this point, we were on fumes. | ||
| But we convinced ourselves that chip is going to be great. | ||
| And so I had to call some other gentleman. | ||
| So I called TSMC. | ||
| And I told TSMC, listen. TSMC is the world's largest foundry today. | ||
| At the time, they were just a few hundred million dollars large. | ||
| Tiny little company. | ||
| And I explained to them what we were doing. | ||
| And I told them I had a lot of customers. | ||
| I had one, you know, Diamond Multimedia, probably one of the companies you bought the graphics card from back in the old days. | ||
| And I said, you know, we have a lot of customers and the demand's really great. | ||
| And we're going to tape out a chip to you. | ||
| And I'd like to go directly to production because I know it works. | ||
| And they said, nobody has ever done that before. | ||
| Nobody has ever taped out a chip that worked the first time. | ||
| And nobody starts out production without looking at it. | ||
| But I knew that if I didn't start the production, I'd be out of business anyways. | ||
| And if I could start the production, I might have a chance. | ||
| And so TSMC decided to support me. | ||
| And this gentleman is named Morris Chang. | ||
| Morris Chang is the father of the foundry industry, the founder of TSMC, really great man. | ||
| He decided to support our company. | ||
| I explained to them everything. | ||
| He decided to support us, frankly, probably because they didn't have that many other customers anyhow, but they were grateful. | ||
| And I was immensely grateful. | ||
| And as we were starting the production, Morris flew to the United States, and he didn't ask me in so many words, but he asked me a whole lot of questions that were trying to tease out: do I have any money? | ||
| But he didn't directly ask me that, you know. | ||
| And so the truth is that we didn't have all the money. | ||
| But we had a strong PO from the customer. | ||
| And if it didn't work, some wafers would have been lost. | ||
| I'm not exactly sure what would have happened, but we would have come short. | ||
| It would have been rough. | ||
| But they supported us with all of that risk involved. | ||
| We launched this chip. | ||
| Turns out to have been completely revolutionary. | ||
| Knocked the ball out of the park. | ||
| We became the fastest growing technology company in history to go from zero to one billion dollars. | ||
| So wild that you didn't test the chip. | ||
| I know. | ||
| We tested it afterwards. | ||
| Yeah, we tested it afterwards. | ||
|
unidentified
|
Afterwards, but it went into production already. | |
| But by the way, that methodology that we developed to save the company is used throughout the world today. | ||
| That's amazing. | ||
| Yeah, we changed the whole world's methodology of designing chips, the whole world's rhythm of designing chips. | ||
| We changed everything. | ||
| How well did you sleep those days? | ||
|
unidentified
|
It must have been so much stress. | |
| You know, what is that feeling where the world just kind of feels like it's flying? | ||
| You have this, what do you call that feeling? | ||
| You can't stop the feeling that everything's moving super fast. | ||
| And, you know, you're laying in bed and the world just feels like, you know, and you feel deeply anxious, completely out of control. | ||
| I've felt that probably a couple of times in my life. | ||
| It's during that time. | ||
|
unidentified
|
Wow. | |
| Yeah. | ||
| It was incredible. | ||
| What an incredible success story. | ||
| But I learned a lot. | ||
| I learned about, I learned several things. | ||
| I learned how to develop strategies. | ||
| I learned how to, and our company learned how to develop strategies. | ||
| What are winning strategies? | ||
| We learned how to create a market. | ||
| We created the modern 3D gaming market. | ||
| We learned how, and so that exact same skill is how we created the modern AI market. | ||
| It's exactly the same. | ||
|
unidentified
|
Wow. | |
| Yeah, exactly the same skill, exactly the same blueprint. | ||
| And we learned how to deal with crisis, how to stay calm, how to think through things systematically. | ||
| We learned how to remove all waste in the company and work from first principles and doing only the things that are essential. | ||
| Everything else is waste because we have no money for it. | ||
| To live on fumes at all times. | ||
| And the feeling, no different than the feeling I had this morning when I woke up, that you're going to be out of business soon. | ||
| That you're, you know, the phrase, 30 days from going out of business, I've used for 33 years. | ||
| Do you still feel that? | ||
| Oh, yeah, oh, yeah. | ||
| Every morning. | ||
| Every morning. | ||
|
unidentified
|
But you guys are one of the biggest companies on planet Earth. | |
| But the feeling doesn't change. | ||
| Wow. | ||
| The sense of vulnerability, the sense of uncertainty, the sense of insecurity, it doesn't leave you. | ||
| That's crazy. | ||
| We had nothing. | ||
| We had nothing. | ||
| And you were dealing with that. | ||
| And you're still feeling it? | ||
| Oh, yeah. | ||
| Oh, yeah. | ||
| Every day. | ||
| Every moment. | ||
| Do you think that fuels you? | ||
| Is that part of the reason why the company's so successful? | ||
| That you have that hungry mentality? | ||
| You never rest. | ||
| You're never sitting on your laurels. | ||
| You're always on the edge. | ||
| I have a greater drive from not wanting to fail than the drive of wanting to succeed. | ||
| Isn't that like a bad thing? | ||
|
unidentified
|
Six coaches would tell you that's completely the wrong psychology. | |
| The world has just heard me say that out loud for the first time. | ||
| But it's true. | ||
| Well, that's how fast. | ||
| Fear of failure drives me more than the greed or whatever it is. | ||
| Well, ultimately, that's probably a more healthy approach, now that I'm thinking about it. | ||
| Because, like, the fear— I'm not ambitious, for example. | ||
| You know? | ||
| I just want to stay alive, Joe. | ||
| I want the company to thrive, you know? | ||
| I want us to make an impact. | ||
| That's interesting. | ||
| Well, maybe that's why you're so humble. | ||
| Maybe that's what keeps you grounded. | ||
| Because with the kind of spectacular success the company's achieved, it would be easy to get a big head. | ||
| No. | ||
|
unidentified
|
Right? | |
| But isn't that interesting? | ||
| It's like, if you were the guy that your main focus is just success, you probably would go, well, made it, nailed it, I'm the man. | ||
| Drop the mic. | ||
| It's that you wake up, you're like, God, we can't fuck this up. | ||
|
unidentified
|
No, exactly. | |
| Every morning. | ||
| Every morning. | ||
| No, every moment. | ||
| That's good. | ||
| That's crazy. | ||
| Before I go to bed. | ||
| Well, listen, if I was a major investor in your company, that's who I'd want running it. | ||
| I'd want a guy who's working. | ||
| Yeah. | ||
| That's why I work. | ||
| That's why I work seven days a week every moment I'm awake. | ||
| You work every moment. | ||
| Every moment I'm awake. | ||
| Wow. | ||
| I'm thinking about solving a problem. | ||
| I'm thinking about it. | ||
| How long can you keep this up? | ||
| I don't know, but it could be next week. | ||
| It sounds exhausting. | ||
| It is exhausting. | ||
| It sounds completely exhausting. | ||
| Always in a state of anxiety. | ||
| Wow. | ||
|
unidentified
|
Wow. | |
| Always in a state of anxiety. | ||
| Well, kudos to you for admitting that. | ||
| I think that's important for a lot of people to hear because, you know, there's probably some young people out there that are in a similar position to where you were when you were starting out that just feel like, oh, those people that have made it, they're just smarter than me and they had more opportunities than me. | ||
| And it's just like it was handed to them or they're just in the right place at the right time. | ||
| Joe, I just described to you somebody who didn't know what was going on. | ||
| Actually did it wrong. | ||
| Yeah. | ||
| Yeah. | ||
| And made the ultimate diving catch, like, two or three times. | ||
| Crazy. | ||
| Yeah. | ||
| The ultimate diving catch is the perfect way to put it. | ||
| You know, it's just like the edge of your glove. | ||
| It probably bounced off of somebody's helmet and landed at the edge of your glove. | ||
| God, that's incredible. | ||
| That's incredible, but it's also, it's really cool that you have this perspective, that you look at it that way. | ||
| Because, you know, a lot of people that have delusions of grandeur or they have, you know. | ||
| And their inflated egos. | ||
| And their rewriting of history oftentimes had them somehow extraordinarily smart and they were geniuses and they knew all along and they were spot on. | ||
| The business plan was exactly what they thought. | ||
|
unidentified
|
Yeah. | |
| They destroyed the competition and, you know, and they emerged victorious. | ||
| Meanwhile, you're like, I'm scared every day. | ||
| Exactly. | ||
| Exactly. | ||
| It's so funny. | ||
|
unidentified
|
Oh, my God. | |
| That's amazing. | ||
| It's so true, though. | ||
| It's amazing. | ||
| It's so true. | ||
| It's amazing. | ||
| Well, but I think there's nothing inconsistent with being a leader and being vulnerable. | ||
| You know, the company doesn't need me to be a genius, right all along, right all the time, absolutely certain about what I'm trying to do and what I'm doing. | ||
| The company doesn't need that. | ||
| The company wants me to succeed. | ||
| You know, the thing that, and we started out today talking about President Trump, and I was about to say something. | ||
| And listen, he is my president. | ||
| He is our president. | ||
| We should all, and we're talking about, just because it's President Trump, people want him to be wrong. | ||
| I think the United States, we all have to realize he is our president. | ||
| We want him to succeed. | ||
| No matter who's president, that's right. | ||
| That's right. | ||
| We want him to succeed. | ||
| We need to help him succeed because it helps everybody, all of us succeed. | ||
| And I'm lucky that I work in a company where I have 40,000 people who want me to succeed. | ||
| They want me to succeed and I can tell. | ||
| And they're all there every single day to help me overcome these challenges, trying to realize what I describe to be our strategy, doing their best. | ||
| And if it's somehow wrong or not perfectly right, to tell me so that we could pivot. | ||
| And the more vulnerable we are as leaders, the more other people are able to tell you, you know, Jensen, that's not exactly right. | ||
| Right. | ||
| Have you considered this information? | ||
| And the more vulnerable we are, the more we're actually able to pivot. | ||
| If we put ourselves into this superhuman capability, then it's hard for us to pivot strategy. | ||
|
unidentified
|
Right. | |
| Because we were supposed to be right all along. | ||
| And so if you're always right, how can you possibly pivot? | ||
| Because pivoting requires you to be wrong. | ||
| And so I've got no trouble with being wrong. | ||
| I just have to make sure that I stay alert, that I reason about things from first principles all the time. | ||
| Always break things down to first principles, understand why it's happening, reassess continuously. | ||
| The reassessing continuously is kind of partly what causes continuous anxiety. | ||
| You know, because you're asking yourself, were you wrong yesterday? | ||
| Are you still right? | ||
| Is this the same? | ||
| Has that changed? | ||
| Has that condition changed? Is that worse than you thought? | ||
| But God, that mindset is perfect for your business, though, because this business is ever changing. | ||
| All the time. | ||
| And you've got competition coming from every direction. | ||
| So much of it is kind of up in the air. | ||
| And you have to invent a future where 100 variables are included. | ||
| And there's no way you could be right on all of them. | ||
| And so you have to be, you have to surf. | ||
|
unidentified
|
Wow. | |
| You have to surf. | ||
| That's a good way to put it. | ||
| You have to surf. | ||
|
unidentified
|
Yeah. | |
| You're surfing waves of technology and innovation. | ||
| That's right. | ||
| You can't predict the waves. | ||
| You've got to deal with the ones you have. | ||
| Wow. | ||
| But skill matters. | ||
| And I've been doing this for 30 years. | ||
| I'm the longest running tech CEO in the world. | ||
| Is that true? | ||
| Congratulations. | ||
| That's amazing. | ||
| You know, people ask me how. One, don't get fired. | ||
|
unidentified
|
I'll stop a short heartbeat. | |
| And then two, don't get bored. | ||
| Yeah. | ||
| Well, how do you maintain your enthusiasm? | ||
| The honest truth is it's not always enthusiasm. | ||
| It's, you know, sometimes it's enthusiasm. | ||
| Sometimes it's just good old-fashioned fear. | ||
| And then sometimes, you know, a healthy dose of frustration. | ||
| You know, it's whatever keeps you moving. | ||
| Yeah, just all the emotions. | ||
| I think, you know, CEOs, we have all the emotions, right? | ||
| You know, and so probably jacked up to the maximum because you're kind of feeling it on behalf of the whole company. | ||
| I'm feeling it on behalf of everybody at the same time. | ||
| And it kind of, you know, encapsulates into somebody. | ||
| And so I have to be mindful of the past. | ||
| I have to be mindful of the present. | ||
| I've got to be mindful of the future. | ||
| And, you know, it's not without emotion. | ||
| It's not just a job. | ||
| Let's just put it that way. | ||
| That doesn't seem like it at all. | ||
| I would imagine one of the more difficult aspects of your job currently, now that the company is massively successful, is anticipating where technology is headed and where the applications are going to be. | ||
| So how do you try to map that out? | ||
| Yeah, there's a whole bunch of ways. | ||
| And it takes a whole bunch of things. | ||
| But let me just start. | ||
| You have to be surrounded by amazing people. | ||
| And NVIDIA is now, if you look at the large tech companies in the world today, most of them have a business in advertising or social media or content distribution. | ||
| And at the core of it is really fundamental computer science. | ||
| And so the company's business is not computers. | ||
| The company's business is not technology. | ||
| Technology drives the company. | ||
| NVIDIA is the only company in the world that's large whose only business is technology. | ||
| We only build technology. | ||
| We don't advertise. | ||
| The only way that we make money is to create amazing technology and sell it. | ||
| And so to be that, to be NVIDIA today, the number one thing is you're surrounded by the finest computer scientists in the world. | ||
| And that's my gift. | ||
| My gift is that we've created a company's culture, a condition by which the world's greatest computer scientists want to be part of it. | ||
| Because they get to do their life's work and create the next thing. | ||
| Because that's what they want to do. | ||
| Because maybe they don't want to be in service of another business. | ||
| They want to be in service of the technology itself. | ||
| And we're the largest firm of its kind in the history of the world. | ||
|
unidentified
|
Wow. | |
| I know. | ||
| It's pretty amazing. | ||
|
unidentified
|
Wow. | |
| And so one, you know, we have got a great condition. | ||
| We have a great culture. | ||
| We have great people. | ||
| And now the question is, how do you systematically see the future, stay alert to it, and reduce the likelihood of missing something or being wrong? | ||
| And so there's a lot of different ways you could do that. | ||
| For example, we have great partnerships. | ||
| We have fundamental research. | ||
| We have a great research lab, one of the largest industrial research labs in the world today. | ||
| And we partner with a whole bunch of universities and other scientists. | ||
| We do a lot of open collaboration. | ||
| And so I'm constantly working with researchers outside the company. | ||
| We have the benefit of having amazing customers. | ||
| And so I have the benefit of working with Elon and others in the industry. | ||
| And we have the benefit of being the only pure play technology company that can serve consumer internet, industrial manufacturing, scientific computing, healthcare, financial services, all the industries that we're in, they're all signals to me. | ||
| And so they all have mathematicians and scientists. | ||
| And so because I have the benefit now of a radar system that is the most broad of any company in the world, working across every single industry, from agriculture to energy to video games. | ||
| And so the ability for us to have this vantage point, one, doing fundamental research ourselves, and then two, working with all the great researchers, working with all the great industries, the feedback system is incredible. | ||
| And then finally, you just have to have a culture of staying super alert. | ||
| There's no easy way of being alert except for paying attention. | ||
| I haven't found a single way of being able to stay alert without paying attention. | ||
| And so, you know, I probably read several thousand emails a day. | ||
|
unidentified
|
How? | |
| How do you have a time for that? | ||
| I wake up early this morning. | ||
| I was up at 4 o'clock. | ||
| How much do you sleep? | ||
| 6, 7 hours. | ||
| Yeah. | ||
| And then you're up at 4, read emails for a few hours before you get going. | ||
| That's right, yeah. | ||
| Wow. | ||
| Every day. | ||
| Every single day. | ||
| Not one day missed. | ||
| Including Thanksgiving, Christmas. | ||
| Do you ever take a vacation? | ||
| Yeah, but my definition of a vacation is when I'm with my family. | ||
| And so if I'm with my family, I'm very happy. | ||
| I don't care where we are. | ||
| And you don't work then? | ||
| Or do you work little? | ||
|
unidentified
|
No, no, I work a lot. | |
| Even like if you go on a trip somewhere, you're still working. | ||
| Oh, sure. | ||
| Oh, sure. | ||
| Wow. | ||
| Every day. | ||
| Every day. | ||
| But my kids work every day. | ||
| You make me tired just saying this. | ||
| My kids work every day. | ||
| Both of my kids work at NVIDIA. | ||
| They work every day. | ||
|
unidentified
|
Wow. | |
| Yeah, I'm very lucky. | ||
| Wow. | ||
| Yeah. | ||
| It's brutal now because, you know, it used to be just me working every day. | ||
| Now we have three people working every day, and they want to work with me every day. | ||
| And so it's a lot of work. | ||
| Well, you've obviously imparted that ethic into them. | ||
| They work incredibly hard. | ||
| I mean, it's not unbelievable. | ||
| But my parents work incredibly hard. | ||
| Yeah. | ||
| I was born with the work gene, the suffering gene. | ||
| Well, listen, man, it has paid off. | ||
| What a crazy story. | ||
| It's really an amazing origin story. | ||
| It really, I mean, it has to be kind of surreal to be in the position that you're in now when you look back at how many times it could have fallen apart, and the humble beginnings. | ||
| But Joe, this is great. | ||
| It's a great country. | ||
| I'm an immigrant. | ||
| My parents sent my older brother and me here first. | ||
| We were in Thailand. | ||
| I was born in Taiwan, but my dad had a job in Thailand. | ||
| He was a chemical and instrumentation engineer, incredible engineer. | ||
| And his job was to go start an oil refinery. | ||
| And so we moved to Thailand, lived in Bangkok. | ||
| And in, I guess, the 1973, 1974 timeframe, you know how Thailand, every so often, they would just have a coup. | ||
| You know, the military would have an uprising. | ||
| And all of a sudden, one day, there were tanks and soldiers in the streets. | ||
| And my parents thought, you know, it probably isn't safe for the kids to be here. | ||
| And so they contacted my uncle. | ||
| My uncle lives in Tacoma, Washington. | ||
| And we had never met him. | ||
| And my parents sent us to him. | ||
| How old were you? | ||
| I was about to turn nine, and my older brother was almost 11. | ||
| And so the two of us came to the United States. | ||
| And we stayed with our uncle for a little bit while he looked for a school for us. | ||
| And my parents didn't have very much money. | ||
| And they'd never been to the United States. | ||
| My father was, I'll tell you that story in a second. | ||
| And so my uncle found a school that would accept foreign students and affordable enough for my parents. | ||
| And that school turned out to have been in Oneida, Kentucky, Clay County, Kentucky, the epicenter of the opioid crisis today. | ||
| Coal country. | ||
| Clay County, Kentucky was the poorest county in America when I showed up. | ||
| It is the poorest county in America today. | ||
| And so we went to the school. | ||
| It's a great school. | ||
| Oneida Baptist Institute in a town of a few hundred. | ||
| I think it was 600 at the time that we showed up. | ||
| No traffic light. | ||
| And I think it has 600 today. | ||
| It's kind of an amazing feat, actually. | ||
| The ability to hold your population at 600 people is quite a magical thing, however they did it. | ||
| And so the school had a mission of being an open school for any children who'd like to come. | ||
| And what that basically means is that if you're a troubled student, if you have a troubled family, you know, whatever your background, you're welcome to come to Oneida Baptist Institute, including international kids who would like to stay there. | ||
| Did you speak English at the time? | ||
| Okay, yeah. | ||
| Yeah, okay, yeah. | ||
| And so we showed up. | ||
| And my first thought was, gosh, there are a lot of cigarette butts on the ground. | ||
| 100% of the kids smoked. | ||
| So right away, you know, this is not a normal school. | ||
| Nine-year-olds? | ||
| No, I was the youngest kid. | ||
| Okay. | ||
| 11-year-olds. | ||
| My roommate was 17 years old. | ||
| Wow. | ||
| Yeah, he just turned 17. | ||
| And he was jacked. | ||
| And I don't know where he is now. | ||
| I know his name, but I don't know where he is now. | ||
| But anyways, that night we got there, and the second thing I noticed when you walk into your dorm room is there are no drawers and no closet doors. | ||
| Just like a prison. | ||
| And there are no locks so that people could check up on you. | ||
| And so I go into my room, and he's 17, and we get ready for bed. | ||
| And he had all this tape all over his body. | ||
| And it turned out he had been in a knife fight, and he'd been stabbed all over his body, and these were just fresh wounds. | ||
| And the other kids were hurt much worse. | ||
| And so he was my roommate, the toughest kid in school. | ||
| And I was the youngest kid in school. | ||
| It was a junior high, but they took me anyways, because if I walked about a mile across the Kentucky River, over the swing bridge, on the other side was a middle school that I could go to. | ||
| And so I would go to that school, and I'd come back and stay in the dorm. | ||
| And so basically, Oneida Baptist Institute was my dorm when I went to this other school. | ||
| My older brother went to the junior high. | ||
| And so we were there for a couple of years. | ||
| Every kid had chores. | ||
| My older brother's chore was to work in the tobacco farm. | ||
| You know, they raised tobacco so they could raise some extra money for the school, kind of like a penitentiary. | ||
| Wow. | ||
| And my job was just to clean the dorm. | ||
| And so I was nine years old. | ||
| I was cleaning toilets for a dorm of 100 boys. | ||
| I cleaned more bathrooms than anybody. | ||
| And I just wish that everybody was a little bit more careful. | ||
| But anyways, I was the youngest kid in school. | ||
| My memories of it were really good. | ||
| But it was a pretty tough, it was a tough town. | ||
| Sounds like it. | ||
| Yeah, the town kids, they all carried knives. | ||
| Everybody had knives. | ||
| Everybody smoked. | ||
| Everybody had a Zippo lighter. | ||
| I smoked for a week. | ||
| Did you? | ||
| Oh, yeah, sure. | ||
| How old were you? | ||
| I was nine. | ||
| When you were nine, you tried smoking. | ||
| Yeah, I got myself a pack of cigarettes. | ||
| Everybody else did. | ||
| Did you get sick? | ||
| No, I got used to it. | ||
|
unidentified
|
Yeah. | |
| And I learned how to blow smoke rings and, you know, breathe out of my nose, you know, take it in through my nose. | ||
| There are all the different things that you learn. | ||
|
unidentified
|
Yeah. | |
| At nine. | ||
|
unidentified
|
Yeah. | |
| Wow. | ||
| You just did it to fit in or to look cool? | ||
| Yeah, because everybody else did it. | ||
|
unidentified
|
Right. | |
| Yeah. | ||
| And then I did it for a couple weeks, I guess. | ||
| And I'd just rather... I had a quarter, you know, a quarter a month or something like that. | ||
| I'd just rather buy popsicles and Fudgsicles with it. | ||
| I was nine, you know. | ||
| Right. | ||
| I chose the better path. | ||
| Wow. | ||
| That was our school. | ||
| And then my parents came to the United States two years later. | ||
| And we met them in Tacoma, Washington. | ||
| That's wild. | ||
| It was a really crazy experience. | ||
|
unidentified
|
What a strange, formative experience. | |
| Yeah, tough kids. | ||
| Thailand to one of the poorest places in America, if not the poorest, as a nine-year-old. | ||
| That was your first time, with your brother. | ||
|
unidentified
|
Wow. | |
| Yeah. | ||
| Yeah. | ||
| No, I remember. | ||
| And what breaks my heart, probably the only thing that really breaks my heart about that experience, was we didn't have enough money to make international phone calls every week. | ||
| And so my parents gave us this tape deck, this Aiwa tape deck, and a tape. | ||
| And so every month we would sit in front of that tape deck, and my older brother, Jeff, and I, the two of us, would just tell them what we did the whole month. | ||
| Wow. | ||
| And we would send that tape by mail. | ||
| And my parents would take that tape and record back on top of it and send it back to us. | ||
|
unidentified
|
Wow. | |
| Could you imagine if that tape from those two years still existed, of these two kids just describing their first experience with the United States? | ||
| Like, I remember telling my parents that I joined the swim team, and my roommate was really buff, and so we spent a lot of time in the gym: every night, 100 push-ups, 100 sit-ups, every day in the gym. | ||
| So I was nine years old. | ||
| I was getting pretty buff. | ||
| And I was pretty fit. | ||
| And so I joined the soccer team. | ||
| I joined the swim team because if you join the team, they take you to meets, and then afterwards, you get to go to a nice restaurant. | ||
| And that nice restaurant was McDonald's. | ||
| Wow. | ||
| And I recorded this thing. | ||
| I said, Mom and Dad, we went to the most amazing restaurant today. | ||
| This whole place is lit up. | ||
| It's like the future. | ||
| And the food comes in a box. | ||
| And the food is incredible. | ||
| The hamburger is incredible. | ||
| It was McDonald's. | ||
| But anyhow, wouldn't it be amazing? | ||
| Oh, my God. | ||
| Two years. | ||
| Yeah, two years. | ||
| What a crazy connection to your parents, too. | ||
| Just sending a tape and them sending you one back. | ||
| And it's the only way you're communicating for two years. | ||
|
unidentified
|
Yeah. | |
| Wow. | ||
| Yeah. | ||
| No, my parents are incredible, actually. | ||
| They grew up really poor. | ||
| And when they came to the United States, they had almost no money. | ||
| Probably one of the most impactful memories I have is they came and we were staying in an apartment complex. | ||
| And they had just rented, back then, and I guess people still do, a bunch of furniture. | ||
| And we were messing around. | ||
| And we bumped into the coffee table and crushed it. | ||
| It was made out of particleboard. | ||
| We crushed it. | ||
| And I just still remember the look on my mom's face, you know, because they didn't have any money and she didn't know how she was going to pay it back. | ||
| But anyhow, that kind of tells you how hard it was for them to come here. | ||
| But they left everything behind. | ||
| And all they had was their suitcase and the money they had in their pocket when they came to the United States. | ||
| How old were they? | ||
| Pursued the American dream. | ||
| They were in their 40s. | ||
|
unidentified
|
Wow. | |
| Yeah, late 30s. | ||
| Pursued the American dream. | ||
| This is the American dream. | ||
| I'm the first generation of the American dream. | ||
| Wow. | ||
| Yeah. | ||
| It's hard not to love this country. | ||
| It's hard not to be romantic about this country. | ||
| That is a romantic story. | ||
| That's an amazing story. | ||
| Yeah. | ||
| And my dad found his job literally in the newspaper, you know, the ads. | ||
| And he called people and got a job. | ||
| What did he do? | ||
| He was a consulting engineer in a consulting firm. | ||
| And they helped people build oil refineries, paper mills, and fabs. | ||
| And that's what he did. | ||
| He's really good at factory design, an instrumentation engineer. | ||
| And so he's brilliant at that. | ||
| And so he did that. | ||
| And my mom worked as a maid, and they found a way to raise us. | ||
|
unidentified
|
Wow. | |
| That's an incredible story, Jensen. | ||
| It really is. | ||
| Everything, all of it. | ||
| From your childhood to the perils of NVIDIA almost failing. | ||
| It's really incredible, man. | ||
| It's a great story. | ||
| Yeah. | ||
| I've lived a great life. | ||
| You really have. | ||
| And it's a great story for other people to hear, too. | ||
| It really is. | ||
| You don't have to go to Ivy League schools to succeed. | ||
| This country creates opportunities. | ||
| It has opportunities for all of us. | ||
| You do have to strive. | ||
| You have to claw your way here. | ||
| Yeah. | ||
| But if you put in the work, you can succeed. | ||
| Well, it doesn't work out for everybody. | ||
| There's a lot of luck and a lot of good decision-making. | ||
| And the good graces of others. | ||
| Yes. | ||
| That's really important. | ||
| Yeah. | ||
| You and I spoke about two people who are very dear to me, but the list goes on. | ||
| The people at NVIDIA who have helped me, many friends that are on the board, the decisions they made, them giving me the opportunity. | ||
| Like, when we were inventing this new computing approach, I tanked our stock price because we added this thing called CUDA to the chip. | ||
| We had this big idea. | ||
| But nobody paid for it, and our cost doubled. | ||
| And so we had this graphics chip company, and we invented GPUs. | ||
| We invented programmable shaders. | ||
| We invented everything in modern computer graphics. | ||
| We invented real-time ray tracing. | ||
| That's why it went from GTX to RTX. | ||
| We invented all this stuff, but every time we invented something, the market didn't know how to appreciate it, but the cost went way up. | ||
| And in the case of CUDA, which enabled AI, the cost increased a lot. | ||
| But we really believed it. | ||
| And so if you believe in that future and you don't do anything about it, you're going to regret it for the rest of your life. | ||
| And so we always, you know, I always tell the team, do we believe this or not? | ||
| And if you believe it, grounded in first principles, not random hearsay, then we owe it to ourselves to go pursue it. | ||
| If we're the right people to go do it, if it's really, really hard to do, it's worth doing, and we believe it. | ||
| Let's go pursue it. | ||
| Well, we pursued it. | ||
| We launched the product. | ||
| Nobody knew. | ||
| It was exactly like when I launched DGX-1 and the entire audience was complete silence. | ||
| When I launched CUDA, the audience was complete silence. | ||
| No customer wanted it. | ||
| Nobody asked for it. | ||
| Nobody understood it. | ||
| NVIDIA was a public company. | ||
| What year was this? | ||
| This is 2006, 20 years ago. | ||
| 2005. | ||
|
unidentified
|
Wow. | |
| Our stock price went poof. | ||
| I think our valuation went down to like $2 or $3 billion. | ||
| From about 12 or something like that. | ||
| I crushed it in a very bad way. | ||
| What is it now, though? | ||
| Yeah, it's higher. | ||
| Very humble of you. | ||
| It's higher. | ||
| But it changed the world. | ||
| Yeah. | ||
| That invention changed the world. | ||
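For context on what that invention actually is: CUDA lets ordinary C/C++ programmers write functions that run in parallel across thousands of GPU threads, which is the general-purpose GPU computing model that later made large-scale AI training practical. Below is a minimal illustrative sketch in modern CUDA C++ (the classic vector-add example, a hypothetical sample for readers, not anything from the conversation or from NVIDIA's codebase):

```cuda
// Illustrative CUDA C++ sketch: data-parallel vector addition on the GPU.
#include <cstdio>
#include <cuda_runtime.h>

// Kernel: runs on the GPU; each thread adds one pair of elements.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                  // one million elements
    const size_t bytes = n * sizeof(float);

    // Unified memory is visible to both CPU and GPU (CUDA 6 and later).
    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();                // wait for the GPU to finish

    printf("c[0] = %.1f\n", c[0]);          // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

Compiled with `nvcc`, each of the million additions runs in its own GPU thread; scaling that same data-parallel model up is what turned graphics chips into the engines of modern AI.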
| It's an incredible story, Jensen. | ||
| It really is. | ||
| Thank you. | ||
| I like your story. | ||
| It's incredible. | ||
| My story's not as incredible. | ||
| My story's more weird. | ||
| You know? | ||
| It's much more fortuitous and weird. | ||
| Okay, what are the three most important milestones that led to here? | ||
| That's a good question. | ||
| What was step one? | ||
| I think step one was seeing other people do it. | ||
| Step one was in the initial days of podcasting, like in 2009 when I started; podcasting had only been around for a couple of years. | ||
| The first was Adam Curry, my good friend, who was the pod father. | ||
| He invented podcasting. | ||
| And then, you know, I remember Adam Carolla; he had a radio show. | ||
| His radio show got canceled. | ||
| And so he decided to just do the same show but do it on the internet. | ||
| And that was pretty revolutionary. | ||
| Nobody was doing that. | ||
| And then there was the experience that I had doing different morning radio shows, like Opie and Anthony in particular, because it was fun. | ||
| And we would just get together with a bunch of comedians. | ||
| You know, I'd be on the show with like three or four other guys that I knew. | ||
| And I always just looked forward to it. | ||
| It was just such a good time. | ||
| And I said, God, I miss doing that. | ||
| It's so fun to do that. | ||
| I wish I could do something like that. | ||
| And then I saw Tom Green's setup. | ||
| Tom Green had a setup in his house. | ||
| And he essentially turned his entire house into a television studio. | ||
| And he did an internet show from his living room. | ||
| He had servers in his house and cables everywhere. | ||
| He had to step over cables. | ||
| This is like 2007. | ||
| I'm like, Tom, this is nuts. | ||
| Like, this is. | ||
| And I'm like, you got to figure out a way to make money from this. | ||
| I wish everybody on the internet could see your setup. | ||
| It's nuts. | ||
| I just want to let you guys know that. | ||
| It's not just this. | ||
| So that was the beginning of it, is just seeing other people do it. | ||
| And then saying, all right, let's just try it. | ||
| And then so the beginning days, we just did it on a laptop, had a laptop with a webcam and just messed around, had a bunch of comedians come in, we would just talk and joke around. | ||
| Then I did it like once a week. | ||
| And then I started doing it twice a week. | ||
| And then all of a sudden, I'd been doing it for a year. | ||
| And then I'd been doing it for two years. | ||
| Then it was like, oh, it's starting to get a lot of viewers and a lot of listeners. | ||
| And then I just kept doing it. | ||
| That's all it is. | ||
| I just kept doing it because I enjoyed doing it. | ||
| Was there any setback? | ||
| No. | ||
| No, there's never really a setback. | ||
|
unidentified
|
Really? | |
| No. | ||
| It must have been. | ||
| It's not the same kind of thing. | ||
| You're just resilient. | ||
| Or you're just tough. | ||
| No, no, no, no. | ||
| It wasn't tough or hard. | ||
| It was just interesting. | ||
| So you were never once punched in the face? | ||
| No, not in the show. | ||
| No, not really. | ||
| Not doing the show. | ||
| You never did something that got big blowback? | ||
| Nope. | ||
| Not really. | ||
| No, it all just kept growing. | ||
| It kept growing. | ||
| And the thing stayed the same from the beginning to now. | ||
| And the thing is, I enjoy talking to people. | ||
| I've always enjoyed talking to interesting people. | ||
| I could tell, even just when we walked in, the way you interacted with everybody, not just me. | ||
| Yeah. | ||
| That's cool. | ||
| People are cool. | ||
| Yeah, that's cool. | ||
| You know, it's an amazing gift to be able to have so many conversations with so many interesting people because it changes the way you see the world because you see the world through so many different people's eyes. | ||
| And you have so many different people of different perspectives and different opinions and different philosophies and different life stories. | ||
| And, you know, it's an incredibly enriching and educating experience having so many conversations with so many amazing people. | ||
| And that's all I started doing. | ||
| And that's all I do now. | ||
| Even now, when I book the show, I do it on my phone. | ||
| And I basically go through this giant list of emails of all the people that want to be on the show or that request to be on the show. | ||
| And then I factor in another list that I have of people that I would like to get on the show that I'm interested in. | ||
| And I just map it out. | ||
| And that's it. | ||
| And I go, oh, I'd like to talk to him. | ||
| If it wasn't for President Trump, I wouldn't have been bumped up on that list. | ||
| No, I wanted to talk to you already. | ||
| I just think, you know, what you're doing is very fascinating. | ||
| I mean, how would I not want to talk to you? | ||
| And today, it proved to be absolutely the right decision. | ||
| Well, you know, listen, it's strange to be an immigrant one day going to Oneida Baptist Institute with the students that were there. | ||
| And then here NVIDIA is one of the most consequential companies in the history of companies. | ||
| It is a crazy story. | ||
| It has to be. | ||
| That journey is. | ||
| And it's very humbling. | ||
| And I'm very grateful. | ||
| It's pretty amazing, man. | ||
| Surrounded by amazing people. | ||
| You're very fortunate, and you've also, you seem very happy. | ||
| And you seem like you're 100% on the right path in this life. | ||
| You know, everybody says you must love your job. | ||
| Not every day. | ||
| But that's part of the beauty of everything. | ||
| Yeah. | ||
| Is that there's ups and downs. | ||
| That's right. | ||
| It's never just like this giant dopamine high. | ||
| We leave this impression. | ||
| Here's an impression I don't think is healthy. | ||
| People who are successful leave the impression often that our job gives us great joy. | ||
| I think largely it does. | ||
| That with our jobs, we're passionate about our work. | ||
| And that passion translates to it just being so much fun. | ||
| I think it largely is. | ||
| But it distracts from the fact that a lot of success comes from really, really hard work. | ||
| Yes. | ||
| There's long periods of suffering and loneliness and uncertainty and fear and embarrassment and humiliation. | ||
| All of the feelings that we least love. | ||
| That's creating something from the ground up. | ||
| And Elon will tell you something similar. | ||
| Very difficult to invent something new. | ||
| And people don't believe you all the time. | ||
| You're humiliated often, disbelieved most of the time. | ||
| And so people forget that part of success. | ||
| And I don't think it's healthy. | ||
| I think it's good that we pass that forward and let people know that it's just part of the journey. | ||
| Yes. | ||
| And suffering is part of the journey. | ||
| You will appreciate it so much. | ||
| These horrible feelings that you have when things are not going so well, you appreciate it so much more when they do go well. | ||
| Deeply grateful. | ||
| Yeah. | ||
| Yeah. | ||
| Deep, deep pride. | ||
| Incredible pride. | ||
| Incredible, incredible gratefulness and surely incredible memories. | ||
| Absolutely. | ||
| Jensen, thank you so much for being here. | ||
| This was really fun. | ||
| I really enjoyed it. | ||
| And your story is just absolutely incredible and very inspirational. | ||
| And I think it really is the American dream. | ||
| It is the American dream. | ||
|
unidentified
|
It really is. | |
| Thank you so much. | ||
| Thank you, Joe. | ||
|
unidentified
|
All right. |