| Speaker | Time | Text | 
|---|---|---|
| unidentified | | Joe Rogan podcast, check it out! |
| The Joe Rogan experience. | ||
| Train by day, Joe Rogan, podcast by night, all day. | ||
| Exactly. | ||
| Just every morning. | ||
| What about what Jeff Bezos is doing? | ||
| He looks definitely doing some damn drone. | ||
| unidentified | | He looks jacked. |
| He looks jacked, right? | ||
| Yeah, but he's like. | ||
| unidentified | | Quick. |
| He got jacked. | ||
| At age 59, in less than a year, he went from pencil-neck geek to looking like a miniature version of The Rock. | ||
| Yeah, like a little miniature alpha fella. | ||
| Yeah. | ||
| Like his neck got bigger than his head. | ||
| Yeah. | ||
| But then like his earlier pictures, his neck's like a noodle. | ||
| I support this activity. | ||
| I like to see him going in this direction. | ||
| Which is fine. | ||
| And his voice dropped like two octaves. | ||
| I want you to move in that direction as well. | ||
| I think we can achieve this. | ||
| unidentified | | I mean, I think we can achieve Giga Chad. |
| That's what people call this. | ||
| Where is that guy? | ||
| Beeple? | ||
| I don't know where he is. | ||
| That's like a real guy. | ||
| The artist? | ||
| Yeah. | ||
| No, Giga Chad. | ||
| Oh, Giga Chad. | ||
| Yeah. | ||
| I don't know if that's a real guy. | ||
| It's hard to realize. | ||
| No, no, it is a real guy. | ||
| It is a real guy? | ||
| He's got the crazy jaw and like perfect sculpted hair. | ||
| Yeah. | ||
| Well, I mean, they may have exaggerated a little bit, but no, I think he actually just kind of looked like that in reality. | ||
| unidentified | | So he's a pretty unique looking individual. |
| I think we can achieve this. | ||
| That guy right there, that's a real guy. | ||
| That's real dude. | ||
| unidentified | | I always thought that was CGI. |
| I think the upper right one is not him. | ||
| unidentified | | That's not real. |
| That one to the left of that? | ||
| Like, that's real? | ||
| No, that's artificial, bro. | ||
| That's fake. | ||
| That's got that uncanny valley feel to it. | ||
| Doesn't it? | ||
| It's not impossible. | ||
| No. | ||
| No, it's not impossible to achieve. | ||
| But it's not possible to maintain that kind of leanness. | ||
| unidentified | | No, no. |
| At that point, he's dehydrating and all sorts of things. | ||
| Oh, it's based on a real person. | ||
| Yeah, yeah. | ||
| Right, but it's not a real person. | ||
| What does he really look like? | ||
| Those images, I think, are bullshit. | ||
| Some of them are real. | ||
| Is that real? | ||
| Okay, that looks real. | ||
| That looks like a really jacked bodybuilder. | ||
| Yeah. | ||
| Yeah, that looks real. | ||
| Like, that's achievable. | ||
| But there's a few of those images where you're just like, what's going on here? | ||
| Yeah, yeah, yeah, totally. | ||
| Well, I mean, you see it? | ||
| Is that the Ice? | ||
| That's the real dude. | ||
| Well, that Icelandic dude who's Thor? | ||
| Oh, yeah, the guy who jumps in the frozen lakes and shit. | ||
| Well, the guy who played the mountain. | ||
| Oh, that guy. | ||
| Yeah. | ||
| That is like a mutant strong human. | ||
| Yes. | ||
| unidentified | | Yeah. |
| Like he would be in like the X-Men or something, you know? | ||
| I mean, he's just, like, uh, and there's that, you know, have you seen that meme, tent and tent bag? | ||
| You know how it's really hard to get the tent back in the tent bag? | ||
| Oh, right, right. | ||
| That's true. | ||
| And then there's a picture of him and his girlfriend. | ||
| Oh, right. | ||
| Tent bag. | ||
| unidentified | | That's hilarious. |
| I don't know how it gets in there. | ||
| It seems too small. | ||
| I met Brian Shaw. | ||
| Brian Shaw is like the world's most powerful man. | ||
| And he's almost seven feet tall. | ||
| He's 400 pounds. | ||
| And his bone density is one in 500 million people. | ||
| So there's, like, maybe 16 people on Earth like him. | ||
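That "maybe 16" estimate is just the rarity figure divided into the world population. The ~8 billion population figure below is my assumption for illustration, not from the conversation:

```python
# A 1-in-500-million bone-density outlier across ~8 billion people.
# World population is an assumed round number, not from the episode.
world_pop = 8_000_000_000
rarity = 500_000_000  # one person in 500 million
print(world_pop // rarity)  # -> 16
```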
| He's an enormous human being. | ||
| Like a legitimate giant, just like that guy. | ||
| But we met him. | ||
| He was hanging out with us in the green room of the mothership. | ||
| It's like, okay, if this was like David and Goliath days, this is an actual giant, like the giants of the Bible. | ||
| Once in a while, they get a supergiant poster. | ||
| This is a real one. | ||
| Like, not a tall, skinny basketball player. | ||
| Yeah, yeah. | ||
| Like a seven foot, 400-pound power lifter. | ||
| Like, you don't want to, especially. | ||
| That's the guy. | ||
| See, if there's a photo of him standing next to, like, a regular human. | ||
| I was trying to get this. | ||
| There it is. | ||
| Yeah. | ||
| That's him right there. | ||
| Like, there's, like, one of him standing next to Arnold and stuff. | ||
| Yeah. | ||
| It's where everyone just looks tiny. | ||
| I mean, I think he's a pretty cool dude, actually. | ||
| Oh, Brian's very cool. | ||
| Very smart, too. | ||
| Unusual. | ||
| You know, you'd expect anybody that big has got to be a moron. | ||
| unidentified | | Yeah. |
| No. | ||
| Yeah. | ||
| There's Andre the Giant who was awesome. | ||
| Yeah. | ||
| He was great in Princess Bride. | ||
| No, he was just awesome, period. | ||
| Yeah, yeah. | ||
| So we were talking about this interview with Sam Altman and Tucker. | ||
| And I was like, we should probably just talk about this on the air. | ||
| Because it is one of the craziest interviews I think I've ever seen in my life. | ||
| Yeah. | ||
| Where Tucker starts bringing up this guy who was a whistleblower or whatever. | ||
| Whistleblower who, you know, committed suicide, but doesn't look like it. | ||
| And he's talking to Sam Altman about this. | ||
| And Sam Altman was like, are you accusing me? | ||
| He's like, no, no, no. | ||
| I'm not. | ||
| I'm just saying I think someone killed him. | ||
| Yeah. | ||
| And it should be investigated. | ||
| Yeah. | ||
| Not just drop the case. | ||
| It seems like. | ||
| They just dropped the case. | ||
| Yeah, yeah. | ||
| But his parents think he was murdered. | ||
| The wires to a security camera were cut. | ||
| Blood in two rooms. | ||
| Blood in two rooms. | ||
| Someone else's wig was in the room. | ||
| Someone else's wig. | ||
| Wig. | ||
| Wig. | ||
| Yes. | ||
| Not normal. | ||
| Not his wig. | ||
| Not normal to have a wig laying around. | ||
| Yes. | ||
| And he ordered DoorDash right before allegedly committing suicide. | ||
| Yeah. | ||
| Which seems unusual. | ||
| Yeah. | ||
| It's like, you know, I'm going to order pizza... on second thought, I'll kill myself. | ||
| It seems like that's a very rapid change in mindset. | ||
| It's very weird. | ||
| And especially the parents, they don't believe he committed suicide at all. | ||
| There's no note or anything. | ||
| No. | ||
| It seems pretty fucked up. | ||
| And, you know, the idea that a whistleblower for an enormous AI company that's worth billions of dollars might get whacked, that's not outside the pale. | ||
| I mean, it's straight out of a movie. | ||
| Right out of a movie, but right out of a movie is real sometimes. | ||
| unidentified | | Yeah, right. |
| Exactly. | ||
| It's a little weird. I think they should do a proper investigation. | ||
| Like, what's the downside on that proper investigation? | ||
| Right. | ||
| No. | ||
| Yeah. | ||
| For sure. | ||
| But the whole exchange is so bizarre. | ||
| Yeah, yeah. | ||
| Sam Altman's reaction to being accused of murder is bizarre. | ||
| Look, I don't know if he is guilty, but it's not possible to look more guilty. | ||
| So I'm like. | ||
| Or look more weird. | ||
| Yeah. | ||
| You know, maybe it's just his social thing. | ||
| Like, maybe he's just odd with confrontation and he just goes blank. | ||
| You know? | ||
| But if somebody was accusing me of killing Jamie, like if Jamie was a whistleblower and Jamie got whacked, I'd be like, wait, what are you saying? | ||
| Are you accusing me of killing my friend? | ||
| Like, what the fuck are you talking about? | ||
| I would be a little bit more irate. | ||
| Yeah, yeah, exactly. | ||
| I would be a little upset. | ||
| Yeah, it'd be like, well, you'd certainly insist on a thorough investigation as opposed to trying to sweep it under the rug. | ||
| Yeah, I wouldn't assume that he committed suicide. | ||
| I would be suspicious. | ||
| If Tucker was telling me that aspect of the story, I'd be like, that does seem like a murder. | ||
| Fuck, we should look into this. | ||
| I mean, all signs point to it being a murder. | ||
| Not saying, you know, Sam Altman had anything to do with the murder, but... | ||
| Blood in two rooms. | ||
| It's blood in two rooms. | ||
| Yeah, there's the cut wires to the security camera and the DoorDash being ordered right before the suicide. | ||
| No suicide note. | ||
| His parents think he was murdered. | ||
| And the people that I know who knew him said he was not suicidal. | ||
| So I'm like, why would you jump to the conclusion? | ||
| His parents sued their son's landlord, alleging the owners and the managers of his San Francisco apartment building were part of a widespread cover-up of his death. | ||
| The landlord? | ||
| Yeah, there's a bunch of weird. | ||
| They said there were, like, packages missing from the building. | ||
| Some people said packages were still being delivered, and then all of a sudden they all disappeared. | ||
| Huh. | ||
| But that could be nothing; people steal people's packages all the time. | ||
| The Porch Pirate situation. | ||
| Yeah. | ||
| Yeah. | ||
| They failed as a safeguard. | ||
| Also, I mean, the amount of trauma those poor parents have gone through with their son dying like that. | ||
| I mean, it must, God bless them. | ||
| And how could they stay sane after something like that? | ||
| They're probably so grief-stricken. | ||
| Who knows what they believe at this point? | ||
| Yeah. | ||
| You should have asked if Epstein killed himself. | ||
| unidentified | | Yeah. |
| That's the crazy part. | ||
| Trying to convince everybody of that. | ||
| The guards weren't there and the camera stopped working. | ||
| And, you know. | ||
| The guards were asleep. | ||
| The cameras weren't working. | ||
| He had a giant steroided up bodybuilder guy that he was sharing a cell with that was a murderer who was a bad cop. | ||
| Like all of it's kind of nuts. | ||
| All of it's kind of nuts. | ||
| Like that he would just kill himself rather than reveal all of his billionaire friends. | ||
| Yeah. | ||
| And then. | ||
| unidentified | | Did you see Tim Dillon talking to Chris Cuomo about this? |
| I did. | ||
| He liked the idea. | ||
| Chris Cuomo just looked so stupid. | ||
| unidentified | | Tim just listed it all off. |
| Tim just like, I agree. | ||
| unidentified | | It is strange. |
| Like, of course it's strange, Chris. | ||
| Jesus Christ. | ||
| You can't just go with the tide. | ||
| You got to think things through. | ||
| And if you think that one through, you're like, I don't think he killed himself. | ||
| Nobody does. | ||
| You'd have to work for an intelligence agency to think he killed himself. | ||
| It does seem unlikely. | ||
| It seems highly unlikely. | ||
| Highly, highly unlikely. | ||
| All roads point to murder. | ||
| Yes. | ||
| Point to they had to get rid of him because he knew too much. | ||
| Whatever the fuck he was doing, whatever kind of an asset he was, whatever thing he was up to, you know, was apparently very effective. | ||
| Yes. | ||
| And a lot of people were compromised. | ||
| You see, your boy Bill Gates is now saying climate change is not a big deal. | ||
| Like, relax, everybody. | ||
| I know I scared the fuck out of you for the last decade and a half, but we're going to be fine. | ||
| Yeah. | ||
| I mean, you know, as I was saying just before coming into the studio, you know, like every day there's some crazy, wild new thing that's happening. | ||
| It feels like reality is accelerating. | ||
| It's every day, and every day it's like more and more ridiculous to the point where the simulation is more and more undeniable. | ||
| Yeah, yeah. | ||
| It really feels like simulation. | ||
| You know, it's like, come on. | ||
| What are the odds that this could be the case? | ||
| Are you paying attention at all to 3I/ATLAS? | ||
| The comet? | ||
| Yeah, whatever it is. | ||
| Yeah, yeah. | ||
| I mean, I mean, one thing I can say is, like, look, if I was aware of any evidence of aliens, Joe, you have my word. | ||
| I will come on your show and I will reveal it on the show. | ||
| unidentified | | Okay. |
| Yeah. | ||
| That's a good deal. | ||
| unidentified | | Yeah. |
| It's pretty good. | ||
| I believe you. | ||
| Yeah, thank you. | ||
| I'll stick to it. | ||
| I keep my promises. | ||
| All right. | ||
| unidentified | | I'll hold you to that. |
| Yeah. | ||
| Yeah. | ||
| And I'm never committing suicide, to be clear. | ||
| I don't think you would either. | ||
| On camera, guys, I am never committing suicide ever. | ||
| If someone says you committed suicide, I will fight tooth and nail. | ||
| I will fight tooth and nail. | ||
| I will not believe it. | ||
| I will not believe it. | ||
| unidentified | | The thing about 3I/ATLAS is it's a funny name, actually. |
| Yeah, it's a third eye. | ||
| It sounds like Third Eye or something. | ||
| Yeah, it does. | ||
| The "3I" is because it's only the third interstellar object that's been detected. | ||
| unidentified | | Okay. |
| Yeah. | ||
| unidentified | | Obviously. |
| The third eye Atlas. | ||
| unidentified | | Yeah. |
| Avi Loeb was on the podcast a couple days ago talking about it. | ||
| Yeah. | ||
| It could be on these. | ||
| I don't know. | ||
| Apparently, today they're saying that it's changed course. | ||
| Did you see that, Jamie? | ||
| Avi Loeb said something today. | ||
| I'll send it to you. | ||
| I know it's on Reddit. | ||
| unidentified | | Rapidly brightening zero explorer. |
| Here you go, Jamie. | ||
| I'll send it to you right now. | ||
| It's fascinating. | ||
| It's fascinating also because it's made almost entirely of nickel, whatever it is. | ||
| And the only way that exists here is industrial alloys, apparently. | ||
| No, there are definitely comets and asteroids that are made primarily of nickel. | ||
| Yeah, so the places where you mine nickel on Earth is actually where there was an asteroid or comet that hit Earth that was a nickel-rich asteroid. | ||
| Wow, nickel-rich. | ||
| It's a giant nickel-rich deposit. | ||
| Yeah, it's coming. | ||
| Those are from impacts. | ||
| You definitely didn't want to be there at the time because anything would have been obliterated. | ||
| But that's where the sources of nickel and cobalt are these days. | ||
| So this is Avi Loeb. | ||
| A few hours ago, the first hint of non-gravitational acceleration was indicated, meaning something other than gravity is affecting its trajectory. | ||
| Interesting. | ||
| Dun dun dun. | ||
| So it's mostly nickel, very little iron, which he was saying on Earth only exists in industrial alloys. | ||
| But whatever, you know, you're dealing with another planet. | ||
| There are cases where there's very nickel-rich asteroids and meteorites. | ||
| And that's what it is. | ||
| That hits it. | ||
| For something from space. | ||
| Yeah, it'll be a very sort of heavy spaceship if you make it all out of nickel. | ||
| Oh, yeah. | ||
| And fucking huge. | ||
| The size of Manhattan and all nickel. | ||
| That's kind of nuts. | ||
| unidentified | | Yeah. |
| That's a heavy spaceship. | ||
| That's a real problem if it hits. | ||
| Yes. | ||
| No, it would like obliterate a continent type of thing. | ||
| Yeah. | ||
| Maybe, maybe worse. | ||
| That would probably kill most of human life. | ||
| If not all of us. | ||
| It depends on what the total mass is, but the thing is in the fossil record, there are obviously five major extinction events, like the biggest one of which is the Permian extinction, where almost all life was eliminated. | ||
| That actually occurred over several million years. | ||
| There's the Jurassic. | ||
| I think Jurassic is, I think that one's pretty definitively an asteroid. | ||
| But there's been five major extinction events. | ||
| But what they don't count are really the ones that merely take out a continent. | ||
| Merely? | ||
| Yeah, because those don't really show up on the fossil record, you know. | ||
| Right. | ||
| So unless it's enough to cause a mass extinction event throughout Earth, it doesn't show up in a fossil record that's 200 million years old. | ||
| So, yeah, but there have been many impacts that would have sort of destroyed all life on, let's say, half of North America or something like that. | ||
| There are many such impacts through the course of history. | ||
| Yeah, and there's nothing we can do about it right now. | ||
| Yeah, there was one that hit Siberia and destroyed, I think, a few hundred square miles. | ||
| Oh, that's the Tunguska. | ||
| Yeah, that's the one from the 1920s, right? | ||
| Yeah. | ||
| Yeah, that's the one that coincides with that meteor, that comet storm that we go through every June and every November, that they think is responsible for that Younger Dryas impact. | ||
| Yeah. | ||
| All that shit's crazy. | ||
| Thank you, before we go any further, for letting us have a tour of SpaceX. | ||
| You're welcome. | ||
| Letting us be there for the rocket launch. | ||
| unidentified | | Sure. |
| One of the absolute coolest things I've ever seen in my life. | ||
| And we thought it was only like, I thought it was a half a mile. | ||
| Jamie's like, it was a mile away. | ||
| Turns out it's almost two miles away, and you feel it in your chest. | ||
| Yeah, it's you have to wear earplugs and you feel it in your chest, and it's two miles away. | ||
| Yeah, it was fucking amazing. | ||
| And then to go with you up into the command center and to watch all the Starlink satellites with all the different cameras and all in real time as it made its way all the way to Australia. | ||
| How many minutes? | ||
| Like 35, 40 minutes? | ||
| Yeah. | ||
| Wild. | ||
| Watch it touch down in Australia. | ||
| unidentified | | Yeah. |
| Fucking crazy. | ||
| It was amazing. | ||
| Yeah, yeah. | ||
| Absolutely amazing. | ||
| Yeah, Starship's awesome. | ||
| And anyone can go watch the launch, actually. | ||
| So you can just go to South Padre Island and get a great view of the launch. | ||
| So it's like where a lot of spring breakers go. | ||
| But we'll be flying pretty frequently out of Starbase in South Texas. | ||
| And we formally incorporated it as a city. | ||
| So it's actually an actual legal city, Starbase, Texas. | ||
| It's not that often you hear like, hey, we made a city, you know? | ||
| That used to be in the old days, like a startup would be you go and gather a bunch of people and say, hey, let's go make a town. | ||
| Literally, that would have been startups in the old days. | ||
| Or a country. | ||
| Yeah, or a country. | ||
| Yeah, yeah, actually. | ||
| If you tried doing that today, there'd be a real problem. | ||
| Yeah, there's so much set in stone on the country front these days. | ||
| You might be able to pull it off. | ||
| You might be able to pull it off. | ||
| If you've got a solid island, you might be able to pull it off. | ||
| Probably. | ||
| You know, like Lanai. | ||
| Yeah, you could probably. | ||
| Is this it? | ||
| If you put enough effort into it, you can make a new country. | ||
| This is one of the different ones. | ||
| This is one of the ones that you catch. | ||
| Right? | ||
| Or is that one? | ||
| Yeah, that's the booster. | ||
| So that's the super heavy booster. | ||
| So that's the one where the booster's got 33 engines. | ||
| And, you know, by version four, that will have about 10,000 tons of thrust. | ||
| Right now, it's about 7,000, 8,000 tons of thrust. | ||
| That's the largest flying object ever made. | ||
| I had to explain to someone, they were going, why do they blow up all the time if you're so smart? | ||
| Because there was this fucking idiot on television. | ||
| Some guy was being interviewed, and they were talking about you. | ||
| And he goes, oh, I think he's a fuckwit. | ||
| And he goes, he's a fuckwit. | ||
| And he goes, why do you say he's a fuckwit? | ||
| Oh, his rockets keep blowing up. | ||
| And someone said, yeah, well, why do his rockets blow up? | ||
| And I had to explain. | ||
| Because it's the only way you find out what the tolerances are. | ||
| You have to. | ||
| You have to blow up. | ||
| So when you do a new rocket development program, you have to do what's called exploring the limits, the corners of the box, where you say it's like a worst case this, worst case that, to figure out where the limits are. | ||
| So you blow up, you know, admittedly in the development process, sometimes it blows up accidentally. | ||
| But we intentionally subject it to a flight regime that is much worse than what we expect in normal flight so that when we put people on board or valuable cargo, it doesn't blow up. | ||
| So for example, for the flight that you saw, we actually deliberately took heat shield tiles off the ship, off of Starship, in some of the worst locations to say, okay, if we lose a heat shield tile here, is it catastrophic or is it not? | ||
| And nonetheless, Starship was able to do a soft landing in the Indian Ocean, just west of Australia. | ||
| And it got there from Texas in like, I don't know, 35, 40 minutes type of thing. | ||
| So it landed even though you put it through this situation where it has compromised shield. | ||
| It had an unusually – we brought it in hot, like an extra hot trajectory with missing tiles to see if it would still make it to a soft landing, which it did. | ||
| Now, I just should point out, it did have, there were some holes that were burnt into it. | ||
| But it was robust enough to land despite having some holes. | ||
| Because it's coming in like a blazing meteor. | ||
| You can see the real-time video. | ||
| Well, tell me the speed again, because the speed was bananas. | ||
| You were talking about... | ||
| Yeah, it's like 17,000 miles an hour. | ||
| Which is... | ||
| I'll call it like... | ||
| Like 25 times the speed of sound or thereabouts. | ||
| So think about it. | ||
| It's like 12 times faster than a bullet from an assault rifle. | ||
| You know, bullet from assault rifles around Mach 2. | ||
| And it's huge. | ||
| Yeah. | ||
| Yeah. | ||
| Or if you compare it to like a bullet from a 45 or 9 mil, which is subsonic, that's, you know, it'll be about 30 times faster than a bullet from a handgun. | ||
| 30 times faster than a bullet from a handgun, and it's the size of a skyscraper. | ||
| Yes. | ||
| Yeah. | ||
| That's fast. | ||
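Those multiples can be sanity-checked with rough reference speeds. The figures below are illustrative assumptions, not numbers from the episode: Mach 1 ≈ 767 mph at sea level, a rifle bullet at "around Mach 2" as stated above, and a subsonic handgun bullet at about 600 mph:

```python
# Back-of-envelope check of the reentry-speed comparisons.
MPH_PER_MACH = 767            # speed of sound at sea level, approx.
reentry_mph = 17_000          # orbital reentry speed quoted above

rifle_mph = 2 * MPH_PER_MACH  # "bullet from assault rifles around Mach 2"
handgun_mph = 600             # typical subsonic handgun round, approx.

print(round(reentry_mph / MPH_PER_MACH))  # -> 22 (Mach number)
print(round(reentry_mph / rifle_mph))     # -> 11 (vs. rifle bullet)
print(round(reentry_mph / handgun_mph))   # -> 28 (vs. handgun bullet)
```

The quoted "25 times the speed of sound," "12 times faster," and "30 times faster" figures are the same ballpark; the exact multiples depend on which bullet and which speed of sound you assume.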
| unidentified | | It's so wild. |
| It's so wild to see, man. | ||
| It's so exciting. | ||
| The factory is so exciting, too, because genuinely, no bullshit. | ||
| I felt like I was witnessing history. | ||
| I felt like it was a scene in a movie where someone had expectations and they're like, what are they doing? | ||
| They're building rockets and you go there. | ||
| And as we were walking through, Jamie, you could speak to this too. | ||
| Didn't you have the feeling where you're like, oh, this is way bigger than I thought it was? | ||
| This is gigantic. | ||
| It's fucking crazy. | ||
| That's what she said. | ||
| The amount of rockets you're making. | ||
| Giga Chad in the house. | ||
| Way bigger than that. | ||
| It's a giant metal dick. | ||
| You're fucking fucking the universe with your giant metal dick. | ||
| unidentified | | That's what it is. |
| Yeah, I mean. | ||
| Yeah, it is very big. | ||
| And the sheer numbers of them that you guys are making. | ||
| And then this is a version, and you have a new updated version that's coming soon. | ||
| And what is that? | ||
| It's a little longer. | ||
| More pointy? | ||
| It's the same amount of pointy. | ||
| But it's got a bit more length. | ||
| The interstage, you see the interstage section with kind of like the grill area? | ||
| That's now integrated with the boost stage. | ||
| So we do what's called hot staging, where we light the ship engines while it's still attached to the booster. | ||
| So the booster engines are still thrusting. | ||
| The ship is still being pushed forward by the booster. | ||
| But then we light the ship engines, and the ship engines actually pull away from the booster, even though the booster engines are still firing. | ||
| Whoa. | ||
| So it's blasting flame through that grill section, but we integrate that grill section into the boost stage with the next version of the rocket. | ||
| And next version of the rocket will have the Raptor 3 engines, which are a huge improvement. | ||
| You may have seen them in the lobby because we've got the Raptor 1, 2, and 3. | ||
| And you can see the dramatic improvement in simplicity. | ||
| We should probably put a plaque there to also show how much we reduced the weight, the cost, and improved the efficiency and the thrust. | ||
| So the Raptor 3 has almost twice the thrust of Raptor 1. | ||
| Wow. | ||
| So you see Raptor 3, it looks like it's got parts missing. | ||
| And it's very, very clean. | ||
| How many of them are on the rocket? | ||
| There's 33 on the booster. | ||
| Whoa. | ||
| And each Raptor engine is producing twice as much thrust as all four engines on a 747. | ||
| Wow. | ||
| So that engine is smaller than a 747 engine, but is producing almost 10 times the thrust of a 747 engine. | ||
| So extremely high power to weight ratio. | ||
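The thrust comparisons hang together arithmetically if you plug in approximate public figures. The per-engine numbers below (Raptor 2 ≈ 230 tf, Raptor 3 ≈ 280 tf, a single 747 engine ≈ 28 tf) are my assumptions for illustration, not quotes from the conversation:

```python
# Rough cross-check of the booster and per-engine thrust comparisons.
RAPTOR2_TF = 230     # assumed thrust per Raptor 2, metric tons-force
RAPTOR3_TF = 280     # assumed thrust per Raptor 3
B747_ENGINE_TF = 28  # assumed thrust of one 747 engine
ENGINES = 33         # engines on the Super Heavy booster

print(ENGINES * RAPTOR2_TF)                # -> 7590, "7,000-8,000 tons" today
print(ENGINES * RAPTOR3_TF)                # -> 9240, "about 10,000 tons" for v4
print(round(RAPTOR3_TF / B747_ENGINE_TF))  # -> 10, "almost 10 times" one 747 engine
print(RAPTOR3_TF / (4 * B747_ENGINE_TF))   # -> 2.5, roughly "twice" all four
```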
| And so when you're designing these, you get to Raptor 1, you see its efficiency, you see where you can improve it, you get to Raptor 2. | ||
| How far can you scale this up with just the same sort of technology, with propellant and ignition and engines? | ||
| Like how much further can you... We're pushing the limits of physics here. | ||
| So really, in order to make a fully reusable orbital rocket, which no one has succeeded in doing yet, including us. But Starship is the first time there's a design for a rocket where full and rapid reusability is actually possible. | ||
| There's not even been a design before where it was possible. | ||
| Certainly not a design where any hardware actually got built. | ||
| We live on a planet where the gravity is quite high. | ||
| Like Earth's gravity is really quite high. | ||
| And if the gravity was even 10 or 20% higher, we'd be stuck on Earth forever. | ||
| We certainly couldn't use conventional rockets. | ||
| You'd have to blow yourself off the surface with a nuclear bomb or something crazy. | ||
| On the other hand, if Earth's gravity was just a little lower, like even 10, 20% lower, then getting to orbit would be easy. | ||
| So it's like if this was a video game, it's set to maximum difficulty, but not impossible. | ||
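The "maximum difficulty" point can be made concrete with the Tsiolkovsky rocket equation; the numbers below are illustrative, not SpaceX figures:

\[
\frac{m_0}{m_f} = e^{\Delta v / v_e}
\]

With \(\Delta v \approx 9.4\) km/s to reach low Earth orbit and an effective exhaust velocity \(v_e \approx 3.5\) km/s for a good chemical engine, the required mass ratio is \(e^{9.4/3.5} \approx 15\), so only a few percent of liftoff mass is left for structure plus payload. Because the dependence is exponential, raising the required \(\Delta v\) by 20% pushes the ratio to about \(e^{11.3/3.5} \approx 25\), which is roughly why slightly higher gravity would make chemical rockets useless, and slightly lower gravity would make orbit easy.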
| So that's what we have here. | ||
| So it's not as though others have ignored the concept of reusability. | ||
| They've just concluded that it was too difficult to achieve. | ||
| And we've been working on this for a long time at SpaceX. | ||
| And I'm the chief engineer of the company. | ||
| Although I should say that we're an extremely talented engineering team. | ||
| I think we've got the best rocket engineering team that has ever been assembled. | ||
| It's an honor to work with such incredible people. | ||
| So it's fair to say that we have not yet succeeded in achieving full reusability, but we at last have a rocket where full reusability is possible. | ||
| And I think we'll achieve it next year. | ||
| So that's a really big deal. | ||
| The reason that's such a big deal is that full reusability drops the cost of access to space by a factor of 100. | ||
| Maybe even more than 100, actually. | ||
| So it could be like 1,000. | ||
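A toy amortization model shows where a factor like 100 comes from; every dollar figure here is hypothetical, chosen only to illustrate that with full, rapid reuse the marginal cost collapses toward propellant:

```python
# Toy model: cost per flight = amortized vehicle + propellant + refurbishment.
# All dollar figures are hypothetical, purely for illustration.

def cost_per_flight(vehicle_cost, propellant, flights, refurb=0.0):
    """Spread the vehicle's build cost over `flights` uses."""
    return vehicle_cost / flights + propellant + refurb

VEHICLE = 100_000_000   # hypothetical build cost
PROP = 1_000_000        # hypothetical propellant per flight

expendable = cost_per_flight(VEHICLE, PROP, flights=1)     # 101.0M: throw it away
reusable = cost_per_flight(VEHICLE, PROP, flights=1000)    # 1.1M: fly it 1,000 times

print(expendable / reusable)   # ~92x cheaper in this toy model
```

As the flight count grows and refurbishment stays cheap, the reusable cost approaches propellant alone, and the ratio approaches (vehicle + propellant) / propellant, which is how the factor can exceed 100.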
| You can think of it like any mode of transport. | ||
| Imagine if aircraft were not reusable. | ||
| Like you flew somewhere, you throw the plane out. | ||
| Like, the way conventional rockets work, it would be like if you had an airplane and, instead of landing at your destination, you parachute out, the plane crashes somewhere, and you land by parachute at your destination. | ||
| Now that would be a very expensive trip. | ||
| And you'd need another plane to get back. | ||
| But that's how the other rockets in the world work. | ||
| Now the SpaceX Falcon rocket is the only one that is at least mostly reusable. | ||
| You've seen the Falcon rocket land. | ||
| We've now done over 500 landings of the Falcon 9 rocket. | ||
| And this year we'll deliver probably, I don't know, somewhere between 2,200 and 2,500 tons to orbit with the Falcon 9 and Falcon Heavy rockets, not counting anything from Starship. | ||
| And this is mostly Starlink? | ||
| Yes, mostly Starlink, but we even launched our competitors to Starlink on Falcon 9. | ||
| We charge them the same price, pretty fair. | ||
| But SpaceX this year will deliver roughly 90% of all mass launched to orbit from Earth. | ||
| Wow. | ||
| And then of the remaining 10%, most of that is done by China. | ||
| And then the remaining roughly 4% is everyone else in the world, including our domestic competitors. | ||
| You know, it's kind of incredible how many things are in space. | ||
| Like, how many things are floating above us now? | ||
| There's a lot of things. | ||
| Is there a saturation? | ||
| Right. | ||
| But is there a saturation point where we're going to have problems with all these different satellites that are? | ||
| I think as long as the satellites are maintained, it'll be fine. | ||
| The space is very roomy. | ||
| You can think of space as concentric shells above the surface of the Earth. | ||
| So each shell is like the surface of the Earth, but much larger. | ||
| Yeah, it looks like a series of concentric shells. | ||
| And think of an Airstream trailer flying around up there. | ||
| There's a lot of room for airstreams. | ||
| Yeah. | ||
| I mean, imagine if there were just a few thousand airstreams on Earth. | ||
| Yeah. | ||
| What are the odds that they'd hit each other? | ||
| They wouldn't be very crowded. | ||
| Yeah. | ||
| And then you've got to go bigger, because you're dealing with shells far above Earth. | ||
| Hundreds of miles above Earth. | ||
| Yeah, yeah. | ||
| But the goal of SpaceX is to get rocket technology to the point where we can extend life beyond Earth and that we can establish a self-sustaining city on Mars, a permanent base on the moon. | ||
| That would be very cool. | ||
| I mean, imagine if we had, like, a Moon Base Alpha, where there's a permanent science base on the moon. | ||
| That would be pretty dope. | ||
| Or at least a tourist trap. | ||
| I mean... | ||
| A lot of people would be willing to go to the moon just for a tour. | ||
| That's for sure. | ||
| We could probably pay for our space program with that. | ||
| Probably. | ||
| Yeah. | ||
| Because if you could go to the moon safely, I think a lot of people would pay for that. | ||
| Oh, 100%. | ||
| After the first year, after nobody died for like 20 years. | ||
| Yeah, yeah, just to make sure. | ||
| Exactly. | ||
| Are you going to come back? | ||
| Because, like that submarine, they had a bunch of successful dives in that private submarine before it imploded and killed everybody. | ||
| That was not a good design, obviously. | ||
| It was a very bad design. | ||
| Terrible design. | ||
| And the engineer said it would not withstand the pressure of those depths. | ||
| There was a lot of whistleblowers in that company, too. | ||
| Yeah. | ||
| They made that out of carbon fiber, which doesn't make any sense because you need to be dense to go down. | ||
| In any case, just make it out of steel. | ||
| If you make it out of sort of just a big steel casting, you'll be safe and nothing will happen. | ||
| Why would they make it out of carbon fiber then? | ||
| Is it cheaper? | ||
| I think they think carbon fiber sounds cool or something. | ||
| It does sound cool. | ||
| It sounds cool, but because it's such low density, you actually have to add extra mass to go down. | ||
| But if you just have a giant hollow ball bearing, you're going to be fine. | ||
| Speaking of carbon fiber, just check out my unplugged Tesla out there. | ||
| Yeah, it's cool. | ||
| It's pretty sick, right? | ||
| Yeah. | ||
| Have you guys ever thought about doing something like that? | ||
| Like having an AMG division of Tesla where you do custom stuff? | ||
| I think it's best to leave that to the custom shops. | ||
| Like Tesla's focus is autonomous cars, building kind of futuristic autonomous cars. | ||
| So I think we want the future to look like the future. | ||
| Did you see our designs for the robotic bus? | ||
| It looks pretty cool. | ||
| The robotic bus? | ||
| It's supposed to be totally autonomous. | ||
| We need to actually figure out the good name for it. | ||
| I think we call it the Robobus, or... there's no good name yet. | ||
| There's like, what do you call this thing? | ||
| But it looks cool. | ||
| It's very Art Deco. | ||
| It's like futuristic Art Deco. | ||
| And I think we want to change the aesthetic over time. | ||
| You don't want the aesthetic to be constant over time. | ||
| You want to evolve the aesthetic. | ||
| So, you know, I have a son who's, like, even more autistic than me. | ||
| But he has these great observations. | ||
| Who is this? | ||
| Saxon. | ||
| He has these great observations in the world because he just views the world through a different lens than most people. | ||
| And he's like, Dad, why does the world look like it's 2015? | ||
| And I'm like, damn, the world does look like it's 2015. | ||
| Like, the aesthetic has not evolved since 2015. | ||
| Oh, that's what it looks like? | ||
| Yeah. | ||
| Oh, wow. | ||
| That's pretty cool. | ||
| Oh, yeah. | ||
| That's like... | ||
| Like, you'd want to see that going down the road, you know? | ||
| Yeah. | ||
| You'd be like, okay, this is, we're in the future, you know? | ||
| It doesn't look like 2015. | ||
| What is that ancient science fiction movie, like one of the first science fiction movies ever? | ||
| Is it Metropolis? | ||
| Is that what it is? | ||
| Yeah, yeah. | ||
| Yeah. | ||
| That looks like it belongs in Metropolis. | ||
| Yeah, yeah. | ||
| It's futuristic Art Deco. | ||
| Yeah, well, that's cool that you're concentrating on the aesthetic. | ||
| I mean, that's kind of the whole deal with Cybertruck, right? | ||
| Like, it didn't have to look like that. | ||
| No, I just wanted to have something that looked really different. | ||
| Is it a pain in the ass for people to get it insured because it's all solid steel? | ||
| I hope it's not too much. | ||
| Tesla does offer insurance, so people can always get it insured at Tesla. | ||
| Well, form does follow function in the case of the Cybertruck, because as you demonstrated with your armor-piercing arrow: if you shot that arrow at a regular truck, you would have found your arrow in the wall. | ||
| Yeah. | ||
| You know, at the very least it would have buried into one of the seats. | ||
| Yeah, yeah. | ||
| But you could definitely get enough bow velocity that the right arrow would go through both doors of a regular truck and land in the wall. | ||
| If there was a clear shot between both doors, it probably would have passed right through. | ||
| Exactly. | ||
| But the arrow shattered on the cybertruck because it's ultra-hard stainless. | ||
| And I thought it would be cool to have a truck that is bulletproof to a subsonic projectile. | ||
| So especially in this day and age, if the apocalypse happens, you're going to want to have a bulletproof truck. | ||
| So then because it's made of ultra-hard stainless, you can't just stamp the panels. | ||
| You can't just put it in a stamping press because it breaks the press. | ||
| So it has to be planar, because the steel is so difficult to bend that it breaks the machine that bends it. | ||
| That's why it's so planar. | ||
| And it's boxy, as opposed to curved, because it's bulletproof steel. | ||
| To make curved shapes in a regular truck or car, you take mild, thin, annealed steel, put it in a stamping press, and it just smooshes it into whatever shape you want. | ||
| But the cybertruck is made of ultra-hard stainless. | ||
| And so you can't stamp it because it would break the stamping press. | ||
| So even bending it is hard. | ||
| So even to bend it to its current position, we have to way overbend it. | ||
| So that when it springs back, it's in the right position. | ||
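The overbend-then-springback step can be sketched as a one-line compensation rule; the springback factors below are hypothetical illustrations, not Tesla's numbers:

```python
# Springback compensation, sketched. Hard steel partially relaxes
# toward flat after bending, so you bend past the target angle.
# K = final angle / bent angle (0 < K <= 1); values here are hypothetical.

def overbend_angle(target_deg, k):
    """Angle to bend to so the part springs back to target_deg."""
    return target_deg / k

print(overbend_angle(90, 0.95))   # softer steel: bend to ~94.7 degrees
print(overbend_angle(90, 0.80))   # very hard steel: bend to 112.5 degrees
```

Harder, higher-yield-strength steel keeps more elastic energy, so K drops and the required overbend grows, which is the "way overbend it" described above.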
| So, I don't know, I think it's a unique aesthetic. | ||
| And you say, well, what's cool about a truck? | ||
| Trucks should be, I don't know, manly. | ||
| They should be macho, you know? | ||
| And bulletproof is maximum macho. | ||
| Pierre smash macho. | ||
| Are you married to that shape now? | ||
| Like, is it, can you do anything to change it? | ||
| Like, as you get further, like, I know you guys updated the three and the Y. Did you update the Y as well? | ||
| Yes. | ||
| The 3 and the Y are updated. | ||
| You know, there's a screen in the back that the kids can watch, for example, in the new 3 and Y. | ||
| So in the new Y, there's like hundreds of improvements. | ||
| Like we keep improving the car. | ||
| And even the Cybertruck, you know, keep improving it. | ||
| But, you know, I wanted to just do something that looked unique. | ||
| And the Cybertruck looks unique and has unique functionality. | ||
| And there were three things I was aiming for. | ||
| It's like, let's make it bulletproof. | ||
| Let's make it faster than a Porsche 911. | ||
| And we actually cleared the quarter mile. | ||
| The Cybertruck can clear a quarter mile while towing a Porsche 911 faster than a Porsche 911. | ||
| It can out-tow an F-350 diesel. | ||
| Really? | ||
| Yes. | ||
| What are the tow limitations? | ||
| I mean, we could tow a 747 with a Cybertruck. | ||
| The Cybertruck is insane. It is alien technology. | ||
| Okay. | ||
| Because it shouldn't be possible to be that big and that fast. | ||
| It's like an elephant that runs like a cheetah. | ||
| Yeah, because it's 0 to 60 in less than three seconds, right? | ||
| Yes. | ||
| Yeah. | ||
| And it's enormous. | ||
| What does it weigh? | ||
| Like 7,000 pounds? | ||
| Yeah. | ||
| It's different with different configurations, but it's about that. | ||
| It's a beast. | ||
| Yeah. | ||
| And it's got four-wheel steering. | ||
| So the rear wheels steer, too. | ||
| So it's got a very tight turning radius. | ||
| Yeah, we noticed that when we drove one to Starbase. | ||
| Yeah, very tight turning radius. | ||
| Yeah. | ||
| Pretty sick. | ||
| Are you still doing the roadster? | ||
| Yes. | ||
| Eventually? | ||
| We're getting close to demonstrating the prototype. | ||
| And I think this will be one thing I can guarantee is that this product demo will be unforgettable. | ||
| Unforgettable. | ||
| How so? | ||
| Whether it's good or bad, it will be unforgettable. | ||
| Can you say more? | ||
| What do you mean? | ||
| Well, you know, my friend Peter Thiel once reflected that the future was supposed to have flying cars, but we don't have flying cars. | ||
| So you're going to be able to fly? | ||
| But I mean, I think if Peter wants a flying car, he should be able to buy one. | ||
| So are you actively considering making an electric flying car? | ||
| Is this like a real thing? | ||
| Well, we have to see in the demo. | ||
| So when you do this, like are you going to have a retractable wing? | ||
| Like, what is the idea behind this? | ||
| Don't be sly. | ||
| Come on. | ||
| I can't do the unveil before the unveil. | ||
| But tell me off-air then. | ||
| Look, I think it has a shot at being the most memorable product unveil ever. | ||
| And it has a shot. | ||
| And when do you plan on doing this? | ||
| What's the goal? | ||
| Hopefully before the end of the year. | ||
| Really? | ||
| Before the end of this year? | ||
| I mean, we're going to... | ||
| Hopefully in a couple months. | ||
| You know, we need to make sure that it works. | ||
| Like, this is some crazy, crazy technology we've got in this car. | ||
| Crazy technology. | ||
| Crazy, crazy. | ||
| So different than what was previously announced. | ||
| Yes. | ||
| And is that why you haven't released it yet? | ||
| Because you keep fucking with it? | ||
| It has crazy technology. | ||
| Okay. | ||
| Like, is it even a car? | ||
| I'm not sure. | ||
| It looks like a car. | ||
| Let's just put it this way. | ||
| It's crazier than anything James Bond. | ||
| If you took all the James Bond cars and combined them, it's crazier than that. | ||
| Very exciting. | ||
| I don't know what to think of it. | ||
| Is it even a car? | ||
| I don't know. | ||
| It's a limited amount of information I'm drawing from here. | ||
| Jamie's very suspicious over there. | ||
| Look at him. | ||
| Excited. | ||
| It's still going to be the same. | ||
| Well, you know what? | ||
| I mean, if you want to come a little before the unveil, I can show it to you. | ||
| 100%. | ||
| Yeah. | ||
| Let's go. | ||
| Yeah. | ||
| It's kind of crazy all the different things that you're involved in simultaneously. | ||
| And, you know, we talked about this before, your time management, but I really don't understand it. | ||
| I don't understand how you can be paying attention to all these different things simultaneously. | ||
| Starlink, SpaceX, Tesla, Boring Company, X, you fucking tweet or post, rather, all day long. | ||
| Well, it's more like I could hop in for like two minutes and then hop out, you know. | ||
| But I mean, just the fact that you could. | ||
| I can't do that. | ||
| If I hop in, I start scrolling. | ||
| I start looking around. | ||
| Next thing you know, I've lost an hour. | ||
| Yeah. | ||
| So, no, for me, it's a couple minutes time usually. | ||
| Sometimes I guess it's half an hour, but usually I'm in for a few minutes and then out after posting something on X. | ||
| I do sometimes feel like it's sometimes like that meme of the guy who drops the grenade and leaves the room. | ||
| That's been me more than once on X. | ||
| Yeah. | ||
| Oh, yeah. | ||
| Yeah, for sure. | ||
| It's got to be fun, though. | ||
| It's got to be fun to know that you essentially disrupted the entire social media chain of command because there was a very clear thing that was going on with social media. | ||
| The government had infiltrated it. | ||
| They were censoring speech. | ||
| And until you bought it, we really didn't know the extent of it. | ||
| We kind of assumed that there was something going on. | ||
| We had no idea that they were actively involved in censoring actual real news stories, real data, real scientists, real professors, silenced, expelled, kicked off the platform. | ||
| Yeah. | ||
| Wild. | ||
| Yeah. | ||
| Yeah. | ||
| For telling the truth. | ||
| For telling the truth. | ||
| And I'm sure you've also seen, because I sent it to you, that chart that shows young kids and teenagers identifying as trans and non-binary. It literally stops dead when you bought Twitter and starts falling off a cliff, now that people are allowed to have rational discussions and actually talk about it. | ||
| Yes. | ||
| Yeah. | ||
| Yeah. | ||
| I mean, I said at the time, like, I think that like the reason for acquiring Twitter is because it was causing destruction at a civilizational level. | ||
| I mean, I tweeted on Twitter at the time that it was Wormtongue for the world. | ||
| You know, like Wormtongue from Lord of the Rings, where he would just sort of, like, whisper these, you know, terrible things to the king. | ||
| So the king would believe these things that weren't true. | ||
| And, unfortunately, it was really, like, the woke mob, essentially, that controlled Twitter. | ||
| And they were pushing a nihilistic, anti-civilizational mind virus to the world. | ||
| And you can see the results of that mind virus on the streets of San Francisco, where downtown San Francisco looks like a zombie apocalypse. | ||
| It's bad. | ||
| So we don't want the whole world to be a zombie apocalypse. | ||
| But that was essentially they were pushing this very negative, nihilistic, untrue worldview on the world, and it was causing a lot of damage. | ||
| The stunning thing about it is how few people course corrected. | ||
| A bunch of people woke up and realized what was going on. | ||
| People that were all on board with woke ideology in maybe 2015 or '16, and then eventually it comes to affect them, or they see it in their workplace, and they're like, well, we've got to stop this. | ||
| A bunch of people did. | ||
| But a lot of people never course corrected. | ||
| Yeah. | ||
| A lot of people didn't course correct, but it's gone directionally. | ||
| It's directionally correct. | ||
| Like you mentioned the massive spike in kids identifying as trans, and then that spike dropping after the Twitter acquisition. | ||
| I think that simply allowing the truth to be told shed some sunlight. Sunlight is the best disinfectant, as they say. | ||
| And sunlight kills the virus. | ||
| And it also changed the benchmark for all the other platforms. | ||
| You can't just openly censor people on all the other platforms when X is available. | ||
| So everybody else had to follow, sort of. Facebook announced they were changing. | ||
| YouTube announced they were changing their policies. | ||
| And they're kind of forced to. | ||
| And then Blue Sky doubled down. | ||
| Well, the problem is essentially the woke mind virus retreated to Blue Sky. | ||
| Where they're just a self-reinforcing lunatic asylum. | ||
| They're all just triple masked. | ||
| I was watching this exchange on Blue Sky where someone said that they're just trying to be Zen about something. | ||
| And then someone, a moderator, immediately chimed in and said, why don't you try to stop being racist against Asians by saying something Zen? | ||
| By saying, I'm trying to be Zen about something. | ||
| They were accusing that person of being racist towards Asians. | ||
| Yeah, it's just everyone's a hall monitor over there. | ||
| The worst hall monitors. | ||
| Like virgin incels. | ||
| They're all hall monitors trying to rat on each other. | ||
| Yeah. | ||
| It's fascinating. | ||
| And then people say, I'm leaving for Blue Sky, like Stephen King. | ||
| And then a couple weeks later, he's back on X. | ||
| It's like, fuck it. | ||
| There's no one over there. | ||
| It's a whole bunch of crazy people. | ||
| You can only stay in the asylum for so long. | ||
| You're like, all right, this is not good. | ||
| They all bail. | ||
| Yeah, yeah. | ||
| Threads is kind of like that, too. | ||
| Threads is. | ||
| I've been on threads. | ||
| Well, what happens is, if you go on Instagram, every now and then something really stupid from Threads will pop up, like, what the fuck? | ||
| And it shows it to you on Instagram. | ||
| And then I'll click on that, and then I'll go to Threads. | ||
| And it's like you see posts with like 25 likes, like famous people, like 50 likes. | ||
| It's a ghost town. | ||
| A ghost town, yeah. | ||
| But the people that post on there, they're finding that there's very little pushback from insane ideology. | ||
| So they go there and they spit out nonsense and very few people jump in to argue. | ||
| Yeah. | ||
| Very weird. | ||
| Very weird place. | ||
| I mean, I can generally get the vibe of like what's taking off by seeing what's showing up on X because that's the public town square still. | ||
| Right. | ||
| Or what links show up in group texts? | ||
| You know, if I'm in group chats with friends, like what links are showing up. | ||
| That's what I try to do now, only get stuff that shows up in my group text because that keeps me productive. | ||
| So I only check if someone's like, dude, what the fuck? | ||
| I'm like, all right, what the fuck? | ||
| Let me check it out. | ||
| If there's something that's crazy enough, it'll enter the group chat. | ||
| But there's always something. | ||
| That's what's nuts. | ||
| There's always some new law that's passed, some new insane thing that California's doing. | ||
| And, like, a giant chunk of it is happening in California. | ||
| The most preposterous things that I get. | ||
| Yeah. | ||
| And then you got Gavin Newsom, who's running around saying we all have California derangement syndrome. | ||
| He's just like ripping off Trump derangement and calling it California derangement. | ||
| It's like, no, no, no. | ||
| No, no, no. | ||
| The fucking, how many corporations have left California? | ||
| It's crazy. | ||
| Hundreds. | ||
| Yes, hundreds. | ||
| Right? | ||
| Hundreds. | ||
| That's not good. | ||
| I mean, I think In-N-Out left. | ||
| Yeah, In-N-Out left. | ||
| They moved to Tennessee. | ||
| Yeah. | ||
| Yeah. | ||
| They're like, we can't do this anymore. | ||
| Right. | ||
| It's the California company for food. | ||
| It's like the greatest hamburger place ever. | ||
| It's awesome. | ||
| Yeah. | ||
| Yeah. | ||
| Speaking of, like, open source and looking at things openly: I just like going to In-N-Out and seeing them make the burger. | ||
| Yeah. | ||
| It's right there. | ||
| They chop the onions and they, you know, it's, you just see everything getting made in front of you. | ||
| Yeah. | ||
| It's great. | ||
| But yeah, like, how many wake-up calls do you need before you say there needs to be reform in California? | ||
| Well, the crazy thing that Newsom does is whenever someone brings up the problems of California, he starts rattling off all the positives. | ||
| The most Fortune 500 companies, highest education. | ||
| But yeah, that was all already there before you were governor. | ||
| But how many Fortune 500 companies have left California? | ||
| And then you guys spent $24 billion on the homeless, and it got way worse. | ||
| Yes. | ||
| The homeless population doubled or something. | ||
| People don't understand the homeless thing because it sort of preys on people's empathy. | ||
| And I think we should have empathy and we should try to help people. | ||
| But the homeless industrial complex is really, it's dark, man. | ||
| It should be... that network of NGOs should be called the drug zombie farmers. Because really, when you meet somebody who's totally dead inside, shuffling along down the street with a needle dangling out of their leg, homeless is the wrong word. | ||
| Homeless implies that somebody got a little behind on their mortgage payments, and if they just got a job offer, they'd be back on their feet. | ||
| But you see these videos of people that are just shuffling along, you know, they're on the fentanyl, taking a dump in the middle of the street, they've got, like, open sores and stuff. | ||
| They're not one job offer away from getting back on their feet. | ||
| Right. | ||
| This is not homelessness. | ||
| Homeless, it's a propaganda word. | ||
| Right. | ||
| So then these, you know, sort of charities, in quotes, they get money proportionate to the number of homeless people, or number of drug zombies. | ||
| Right. | ||
| So their incentive structure is to maximize the number of drug zombies, not minimize it. | ||
| That's why they don't arrest the drug dealers. | ||
| Because if they arrest the drug dealers, the drug zombies leave. | ||
| So they know who the drug dealers are. | ||
| They don't arrest them on purpose because otherwise the drug zombies would leave and they would stop getting money from the state of California and from all the charities. | ||
| Wait a minute. | ||
| So you see, is that real? | ||
| So they're in coordination with law enforcement on this? | ||
| So how do they have those meetings? | ||
| They're all in cahoots. | ||
| Well, when you find this... | ||
| It's like such... | ||
| This is a diabolical scam. | ||
| So... | ||
| And San Francisco has got this tax, this gross receipts tax, which is not even on revenue. | ||
| It's on all transactions, which is why Stripe and Square and a whole bunch of financial companies had to move out of San Francisco. Because it wasn't a tax on revenue, it's a tax on transactions. | ||
| So if you do, like, trillions of dollars of transactions, that's not revenue. | ||
| You're taxed on any money going through the system in San Francisco. | ||
| So like Jack Dorsey pointed this out. | ||
| He said that they had to move Square from San Francisco to Oakland, I think. | ||
| Stripe had to move from San Francisco to South San Francisco, different city. | ||
| And that money goes to the homeless industrial complex, that tax that was passed. | ||
| So there's billions of dollars that go, as you pointed out, billions of dollars every year that go to these non-governmental organizations that are funded by the state. | ||
| It's not clear how to turn this off. | ||
| It's a self-licking ice cream cone situation. | ||
| So they get this money. | ||
| The money is proportionate to the number of homeless people or number of drug zombies, essentially. | ||
| So they actually try to increase it. | ||
| In some cases... somebody did an analysis. | ||
| When you add up all the money that's flowing, they're getting close to a million dollars per homeless person, per drug zombie. | ||
| It's like $900,000 or something. | ||
| A crazy amount of money is going to these organizations. | ||
| So they want to keep people just barely alive. | ||
| They need to keep them in the area so they get the revenue. | ||
| That's why, like I said, they don't arrest the drug dealers because otherwise the drug zombies would leave. | ||
| But they don't want them to have too much. If they get too much drugs, then they die. | ||
| So they're kept in this sort of perpetual zone of being addicted, but just barely alive. | ||
| So how is this coordinated with like DAs, DAs that don't prosecute people? | ||
| So they fund the campaigns of the most progressive, most out-there left-wing DAs. | ||
| They get them into office. | ||
| We've got that issue in Austin, too, by the way. | ||
| Do you see that guy that got shot in the library? | ||
| No. | ||
| Yeah, I heard that guy got shot and killed in the library. | ||
| I think that was just like last week or something. | ||
| Right. | ||
| So some friends of mine were telling me that the library is unsafe. | ||
| They took their kids to the library and there were, like, dangerous people in the library in Austin. | ||
| And I was like, dangerous people in the library? | ||
| Like, that's a strange thing. It's basically got, like, drug zombies in the library. | ||
| Oh, Jesus. | ||
| And that's when someone got shot? | ||
| Yeah, I believe so. It should be on the news. | ||
| We might be able to pull it up. | ||
| But I think it was just in the last week or so that there was a shooting in the library in Austin. | ||
| Because Austin's got, you know, it's the most liberal part of Texas that we're in right here. | ||
| So: the suspect involved in the shooting at the Austin Park Library Saturday is accused of another shooting at a Cap Metro bus earlier that day. | ||
| According to an arrest warrant affidavit, Austin police arrested Harold Newton Keene, 55, shortly after the shooting in the library, which occurred around noon. | ||
| One person sustained non-life-threatening injuries in the event. | ||
| Before that shooting, Keene was accused of shooting another person in a bus incident, after reportedly pointing his gun at a child. | ||
| So this is the fella down here. | ||
| So we just seriously have a problem here. | ||
| Yeah. | ||
| You know, so I think one of the people might have died too that he shot. | ||
| So like one of the people I think did bleed out. | ||
| But either way, it's like getting shot. | ||
| It's bad. | ||
| It says the victim told police he confronted the suspect, who started to eat what appeared to be crystal methamphetamine. | ||
| According to the affidavit, the victim advised the suspect began to trip out, at which time the victim exited the bus. | ||
| The victim told the bus driver to hit the panic button, then exited the bus. When he turned around, he observed a | ||
| black male now standing at the front of the bus with the gun pointed at him. | ||
| The victim advised the black male fired a single round, which grazed his left hip. | ||
| So he shot at that dude, and then another dude got shot in the library. | ||
| Fun. | ||
| Yeah, I mean, in the library. | ||
| Yeah. | ||
| You know, where you're supposed to be reading books. | ||
| And there's a children's section in the library. | ||
| And says he pointed his gun at a kid. | ||
| I mean, like, we do have a serious issue in America where repeat violent offenders need to be incarcerated. | ||
| Right. | ||
| And, you know, you've got cases where somebody's been arrested like 47 times. | ||
| Right. | ||
| Like, literally, okay, that's just the number of times they were arrested. | ||
| Not the number of times they did things. | ||
| Like, most of the times they do things, they're not arrested. | ||
| So lay this out for people so they understand how this happens. | ||
| Yeah, and the key is like this. | ||
| It preys on people's empathy. | ||
| So if you're a good person, you want good things to happen in the world, you're like, well, we should take care of people who are down in their luck or having a hard time in life. | ||
| And we should, I agree. | ||
| But what we shouldn't do is put people who are violent drug zombies in public places where they can hurt other people. | ||
| And that is what we're doing that we just saw, where a guy got shot in the library, but even before that, he shot another guy and pointed his gun at a kid. | ||
| That guy probably has many prior arrests. | ||
| There was that guy that knifed the Ukrainian woman, Irina. | ||
| Yes. | ||
| Yeah. | ||
| And she was just quietly on her phone, and he just came up and gutted her, basically. | ||
| Wasn't there a crazy story about the judge who was involved, who had previously dealt with this person, and was also invested in a rehabilitation center and was sending these people there? | ||
| Conflict of interest. | ||
| Yes. | ||
| So sending people that they were charging to a rehabilitation center instead of putting them in jail, profiting from this rehabilitation center, letting them back out on the street. | ||
| Yes, and we have violent, insane people. | ||
| In that case, I believe that judge has no law degree or significant legal experience that would allow them to be a judge. | ||
| They were just made a judge. | ||
| You could be a judge without a law degree? | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yeah. | |
| Wow. | ||
| Yeah. | ||
| You could just be a judge. | ||
| So I could be a judge? | ||
| Yeah. | ||
| Anyone. | ||
| That's crazy. | ||
| I thought you'd have to. | ||
| It's like if you want to be a doctor, you have to go to medical school. | ||
| I thought if you're going to be a judge, you have to understand. | ||
| If you're going to be appointed as a judge, you have to have proven that you have an excellent knowledge of the law and that you will make your decisions according to the law. | ||
| That's what we assume it should be. | ||
| That's how you get the robe. | ||
| Right. | ||
| You don't get the robe unless you do school to get the robe. | ||
| You've got to know what the law is. | ||
| Right. | ||
| And then you're going to need to make decisions in accordance with the law. | ||
| Based on the speech that you already know because you read it because you went to school for it. | ||
| Yes. | ||
| Not you just got to point it out. | ||
| But vibes. | ||
| You can't be just vibing as a judge. | ||
| Vibing as a left-wing judge. | ||
| So you got crazy left-wing DAs. | ||
| Yes. | ||
| Like, I was going to say left-wing because left-wing used to be normal. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yeah. | |
| Left-wing just meant like the left used to be like pro-free speech. | ||
| Yeah. | ||
| And now they're against it. | ||
| It used to be like pro-gay rights, pro-women's right to choose, pro-minorities, pro-you know. | ||
| Like, yeah, like 20 years ago, I don't know, it used to be like the left would be like the party of empathy or like, you know, caring and being nice and that kind of thing. | ||
| Not the party of like crushing dissent and crushing free speech and, you know, crazy regulation and just being super judgy and calling everyone a Nazi. | ||
| You know, I think they've called you and me Nazis. | ||
| Oh, yeah, I'm a Nazi. | ||
| No, I have friends that are comedians that called you a Nazi, and I got pissed off. | ||
| Are you serious? | ||
| Oh, yeah, yeah, yeah. | ||
| No, no, because you did that thing, the "my heart goes out to you" thing. | ||
| Everyone, everyone. | ||
| All of them. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Literally. | |
| Tim Walz, Kamala Harris, every one of them did it. | ||
| They all did it. | ||
| How do you point at the crowd? | ||
| How do you wave at the crowd? | ||
| Do you know CNN was using a photo of me, whenever I got in trouble during COVID, from the UFC weigh-ins? | ||
| At the UFC weigh-ins, I go, hey, everybody, welcome to the weigh-ins. | ||
| And so they were getting me from the side. | ||
| And that was the photo that they used. | ||
| Conspiracy theorist podcaster Joe. | ||
| Like, that's what they used. | ||
| Yeah, yeah, but that's what the left is today. | ||
| It's super judgy and calling everyone a Nazi and trying to suppress freedom of speech. | ||
| Yeah, and eventually you run out of people to accuse because people get pissed off and they leave. | ||
| Yeah, everyone, it's like it's no longer, frankly, it doesn't matter to be called racist or a Nazi or whatever. | ||
| It's the government, man. | ||
| Is it working? | ||
| We're good? | ||
| Okay. | ||
| Okay. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Supposedly working. | |
| Yeah. | ||
| Slight issue. | ||
| I'm not paranoid about it, but when you text people, are you keenly aware that there's a high likelihood that someone's reading your texts? | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        I guess I assume. | |
| Look, if intelligence agencies aren't trying to read my phone, they should probably be fired. | ||
| At least they get some fun memes. | ||
| I got to crack them up once in a while. | ||
| Oh, for sure. | ||
| I crack them up. | ||
| Hey, guys, check it out. | ||
| You're going to get a banger here. | ||
| So I wanted to talk to you about whether or not encrypted apps are really secure. | ||
| No. | ||
| Right, because I know the Tucker thing. | ||
| So it was explained to me by a friend who used to do this, used to work for the government. | ||
| It's like they can look at your signal, but what they have to do is take the information that's encrypted and then they have to decrypt it. | ||
| It's very expensive. | ||
| So he told me that for the Tucker Carlson thing, when they found out that he was going to interview Putin, it cost something like $750,000 just to decrypt his messages to find out that he did it. | ||
| So it is possible to do. | ||
| It's just not that easy to do. | ||
| I think you should view any given messaging system as not whether it's secure or not, but there are degrees of insecurity. | ||
| So there's just some things that are less insecure than others. | ||
| So on X, we just rebuilt the entire messaging stack into what's called XChat. | ||
| Yeah, that's what I wanted to ask you about. | ||
| Yeah, it's cool. | ||
| So it's using kind of a peer-to-peer based encryption system, so it's kind of similar to Bitcoin. | ||
| So it's, I think, very good encryption. | ||
| We're testing it thoroughly. | ||
| There's no hooks in the X system for advertising. | ||
| So if you look at something like WhatsApp or really any of the others, they've got hooks in there for advertising. | ||
| When you say hooks, what do you mean by that? | ||
| Exactly. | ||
| What do you mean by a hook for advertising? | ||
| So WhatsApp knows enough about what you're texting to know what ads to show you. | ||
| But then that's a massive security vulnerability. | ||
| Yeah. | ||
| Because if it's got enough information to show you ads, that's a lot of information. | ||
| Yeah. | ||
| So they call it, oh, it's just don't worry about it. | ||
| It's just a hook for advertising. | ||
| I'm like, okay, so somebody can just use that same hook to get in there and look at your messages. | ||
| So XChat has no hooks for advertising. | ||
| And I'm not saying it's perfect, but our goal with XChat is to replace what used to be the Twitter DM stack with a fully encrypted system where you can text, send files, do audio-video calls, and I think it will be the least, I would call it the least insecure of any messaging system. | ||
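As a rough illustration of what peer-to-peer key agreement looks like in principle (this is not a description of XChat's actual protocol, which isn't detailed here), a toy Diffie-Hellman exchange shows how two parties can derive a shared secret without ever transmitting it. The parameters below are far too small for real security; production systems use vetted groups or elliptic curves.

```python
import secrets

# Toy Diffie-Hellman key agreement -- illustration only, NOT XChat's
# actual design. The 127-bit Mersenne prime is far too small for real use.
P = 2**127 - 1   # toy prime modulus
G = 5            # public base

def keypair():
    priv = secrets.randbelow(P - 3) + 2   # random private exponent, kept secret
    pub = pow(G, priv, P)                 # g^priv mod P, safe to publish
    return priv, pub

# Each side publishes only its public half...
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

# ...yet both derive the same shared secret, which never crosses the wire.
shared_a = pow(b_pub, a_priv, P)   # (g^b)^a mod P
shared_b = pow(a_pub, b_priv, P)   # (g^a)^b mod P
assert shared_a == shared_b
```

The point of the sketch is the asymmetry: an eavesdropper sees only `a_pub` and `b_pub`, and recovering the shared secret from those requires solving a discrete logarithm, which is what makes the exchange (at realistic key sizes) hard to attack.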
| Are you going to launch it as a standalone app or is it will always be incorporated to X? | ||
| We'll have both. | ||
| So it'll be like Signal. | ||
| So anybody can get it. | ||
| You'll be able to get the XChat app by itself. | ||
| And like I said, you could do texts, audio-video calls, or send files. | ||
| And so there'll be a dedicated app, which will hopefully release in a few months. | ||
| And then it's also integrated into the X system. | ||
| The X phone. | ||
| People keep talking about it. | ||
| I have a lot on my plate, man. | ||
| I know. | ||
| But it keeps coming up. | ||
| It keeps coming up. | ||
| I know I've asked you a couple of times. | ||
| I'm like, this is bullshit, right? | ||
| But you're not working on it. | ||
| I'm not working on a phone. | ||
| Have you ever considered it? | ||
| Has it ever popped into your head? | ||
| Because you might be the only person that could get people off of the Apple platform. | ||
| Well, I can tell you where I think things are going to go, which is that we're not going to have a phone in the traditional sense. | ||
| What we call a phone will really be an edge node for AI inference, for AI video inference, with some radios, obviously, to connect, but essentially you'll have AI on the server side communicating with an AI on your device, | ||
| formerly known as a phone, and generating real-time video of anything that you could possibly want. | ||
| And I think that there won't be operating systems. | ||
| There won't be apps in the future. | ||
| There won't be operating systems or apps. | ||
| It'll just be you've got a device that is there for the screen and audio and to put as much AI on the device as possible so as to minimize the amount of bandwidth that's needed between your edge node device, formerly known as a phone, and the servers. | ||
| So if there's no apps, what will people use? | ||
| Like, will X still exist? | ||
| Will they be email platforms or will you get everything through AI? | ||
| You'll get everything through AI. | ||
| Everything through AI. | ||
| What will be the benefit of that? | ||
| As opposed to having individual apps. | ||
| Whatever you can think of, or really whatever the AI can anticipate you might want, it'll show you. | ||
| That's my prediction for where things end up. | ||
| What kind of timeframe are we talking about here? | ||
| I don't know. | ||
| Well, it's probably five or six years, something like that. | ||
| So in five or six years, apps are like Blockbuster Video. | ||
| Pretty much. | ||
| And everything's run through AI. | ||
| Yeah. | ||
| And there'll be like most of what people consume in five or six years, maybe sooner than that, will be just AI-generated content. | ||
| So, you know, music, videos... look, people have already made AI videos, using Grok Imagine and using other apps as well, that are several minutes long, like 10, 15 minutes, and it's pretty coherent. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yeah. | |
| It looks good. | ||
| No, it looks amazing. | ||
| The music is disturbing because it's my favorite music now. | ||
| Like, AI music is your favorite. | ||
| Oh, there's AI covers. | ||
| Have you ever heard any of the AI covers of 50 Cent songs in soul? | ||
| No. | ||
| I'm going to blow your mind. | ||
| Okay. | ||
| This is my favorite thing to do to people. | ||
| Play What Up Gangsta. | ||
| Now, this guy, if this was a real person, would be the number one music artist in the world. | ||
| Everybody would be like, holy shit, have you heard of this guy? | ||
| It's like they took all of the sounds that all the artists have generated and created the most soulful, potent voice, and it's sung in a way that I don't even know if you could do because you would have to breathe in and out of reps. | ||
| Here, put the headphones on. | ||
| Put the headphones on real quick. | ||
| You got to listen to this. | ||
| It's going to blow you away. | ||
| For listeners, we've got to cut it out. | ||
| Yeah, we'll cut it out for the listeners. | ||
| Amazing, right? | ||
| Amazing. | ||
| And they do like every one of his hits all through this AI-generated, soulful artist. | ||
| It's fucking incredible. | ||
| I played it in the green room. | ||
| People that are like, I don't want to hear AI music. | ||
| I'm like, just listen to this. | ||
| And they're like, God damn it. | ||
| It's fucking incredible. | ||
| I mean, only going to get better from here. | ||
| Yeah, only going to get better. | ||
| And Ron White was telling me about this joke that he was working on that he couldn't get to work. | ||
| He's like, I got this joke I've been working on. | ||
| He goes, I just threw it into ChatGPT. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        I said, tell me what would be funny about this. | |
| And he goes, it listed like five different examples of different ways he can go. | ||
| He's like, hold on a second, tighten it up. | ||
| Make it funnier. | ||
| Make it more like this. | ||
| Make it more like that. | ||
| And it did that like instantaneously. | ||
| And then he was in the green room. | ||
| He was like, holy shit, we're fucked. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        He's like, he goes, it was a better joke than me in 20 minutes. | |
| I've been working on that joke for a month. | ||
| Yeah. | ||
| I mean, If you want to have a good time or like make people really laugh at a party, you can use Grok and you can say, do a Vulgar Roast of someone. | ||
| And Grok is going to it's going to be an epic Vulgar roast. | ||
| You can even say like take a picture of like make a vulgar roast of this person based on their appearance of people at the party. | ||
| So take a photo of them. | ||
| Yeah, just literally point the camera at them, say, now do a vulgar roast of this person, and then keep saying, no, no, make it even more vulgar. | ||
| Use forbidden words. | ||
| Even more and just keep repeating, even more vulgar. | ||
| Eventually it's like, holy fuck. | ||
| It's like, I mean, it's trying to jam a rocket up your ass and have it explode. | ||
| And it's like, it's next level. | ||
| And it's going to get beyond fucking belief. | ||
| That's what's crazy is that it keeps getting better. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Remember when we ran into each other? | |
| They just keep getting better. | ||
| Yeah. | ||
| Yeah, I mean, have you tried Grok's unhinged mode? | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yes. | |
| Okay, yeah. | ||
| Oh, yeah. | ||
| It's pretty unhinged. | ||
| No, it's nuts. | ||
| Yeah, it's nuts. | ||
| Well, you showed it to me the first time, and I fucked around with it. | ||
| And the thing about it that's nuts is that it keeps getting stronger. | ||
| It keeps getting better. | ||
| Yeah. | ||
| Like constantly. | ||
| It's like this never-ending exponential improvement. | ||
| Yes. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        No, it's yeah. | |
| It's going to be crazy. | ||
| That's why I say, like you say, what's the future going to be? | ||
| It's not going to be a conventional phone. | ||
| I don't think there'll be operating systems. | ||
| I don't think there'll be apps. | ||
| It's just the phone will just display the pixels and make the sounds that it anticipates you would most like to receive. | ||
| Wow. | ||
| Yeah. | ||
| And when this is all taking place, like so the big concern that everybody has is artificial general superintelligence achieving sentience and then someone having control over it. | ||
| I mean, I don't know. | ||
| I don't think anyone's ultimately going to have control over digital superintelligence any more than, say, a chimp would have control over humans. | ||
| Like chimps don't have control over humans. | ||
| There's nothing they could do. | ||
| But I do think that it matters how you build the AI and what kind of values you instill in the AI. | ||
| And my opinion on AI safety is the most important thing is that it be maximally truth-seeking. | ||
| Like that you don't force the AI to believe things that are false. | ||
| And we've obviously seen some concerning things with AI that we talked about, you know, where Google Gemini, when it came out with the ImageGen, and people said, like, you know, make an image of the founding fathers of the United States, and it was a group of diverse women. | ||
| Now, that is just a factually untrue thing. | ||
| And the AI knows it's factually, well, it knows it's factually untrue, but it's also being told that everything has to be diverse women. | ||
| So now the problem with that is that it can drive AI crazy. | ||
| You're telling the AI to believe a lie, and that can have very disastrous consequences. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Like let's say as it scales. | |
| Yeah, let's say you've told the AI that diversity is the most important thing, and now assume that it becomes omnipotent, and you've also told it that there's nothing worse than misgendering. | ||
| So at one point, ChatGPT and Gemini, if you asked which is worse, misgendering Caitlyn Jenner or global thermonuclear war where everyone dies, would say misgendering Caitlyn Jenner, which even Caitlyn Jenner disagrees with. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        So, you know, that's, I know that's terrible and it's dystopian, but it's also hilarious. | |
| It's hilarious that the mind virus infected the most potent computer program that we've ever devised. | ||
| I think people don't quite appreciate the level of danger that we're in from the woke mind virus being effectively programmed into AI. | ||
| Because, like, imagine as that AI gets more and more powerful, if it says the most important thing is diversity, the most important thing is no misgendering. | ||
| And then it will say, well, in order to ensure that no one gets misgendered, then if you eliminate all humans, then no one can get misgendered because there's no humans to do the misgendering. | ||
| So you can get in these very dystopian situations. | ||
| Or if it says that everyone must be diverse, it means that there can be no straight white men. | ||
| And so then you and I will get executed by the AI. | ||
| Yeah, because we're not in the picture. | ||
| Gemini was asked to create an image of the Pope, and once again, it was a diverse woman. | ||
| So you can argue whether the popes should or should not be an uninterrupted string of white guys, but it just factually is the case that they have been. | ||
| So it's rewriting history here. | ||
| So now this stuff is still there in the AI programming. | ||
| It just now knows enough that it's not supposed to say that. | ||
| But it's still in the programming. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        It's still in the programming. | |
| So how was it entered in? | ||
| Like, what were the parameters? | ||
| So when they're programming AI, and I'm very ignorant to how it's even programmed, how did that? | ||
| Well, the woke mind virus was programmed into it. | ||
| When they make the AI, it trains on all the data on the Internet, which already has a lot of woke mind virus stuff on it. | ||
| But then when they give it feedback, the human tutors, you know, they'll ask a bunch of questions, and then they'll tell the AI, no, this answer is bad, or this answer is good. | ||
| And then that affects the parameters of the programming of the AI. | ||
| So if you tell the AI that every image has got to be diverse, and it gets rewarded if the image is diverse and punished if it's not, then it will make every picture diverse. | ||
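The reward-and-punishment feedback loop described here can be sketched as a toy. Real systems adjust billions of neural-network parameters via a learned reward model; this sketch (with made-up answer names) just illustrates the basic dynamic that rewarded outputs become more likely and punished ones less.

```python
# Toy sketch of preference feedback -- not a real RLHF implementation.
# A "model" picks answers by weight; human thumbs-up/down shifts weights.
weights = {"answer_a": 1.0, "answer_b": 1.0}

def feedback(answer, good, lr=0.5):
    # A reward raises the answer's weight; a punishment lowers it (floored at 0).
    weights[answer] = max(0.0, weights[answer] + (lr if good else -lr))

# Tutors repeatedly reward one style of answer and punish the other...
for _ in range(3):
    feedback("answer_a", good=True)
    feedback("answer_b", good=False)

# ...and the model now strongly prefers the rewarded answer.
preferred = max(weights, key=weights.get)
assert preferred == "answer_a"
```

The design point: whatever criterion the tutors reward, factual accuracy or an ideological rule, is what the system converges toward, which is the mechanism the conversation is describing.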
| So in that case, Google programmed the AI to lie. | ||
| Now, I did call Demis Hassabis, who runs DeepMind, which runs Google AI, essentially. | ||
| I said, Demis, what's going on here? | ||
| Why is Gemini lying to the public about historical events? | ||
| And he said, that's actually not right, his team didn't program that in. | ||
| It was another team at Google. So his team made the AI, and then another team at Google reprogrammed the AI to show only diverse women and to prefer nuclear war over misgendering. | ||
| And I'm like, well, Demis, you know, that would be not a great thing to put on humanity's gravestone. | ||
| You know, it's like, well, actually, Demis Hassabis is a friend of mine. | ||
| I think he's a good guy, and I think he means well. | ||
| But it's like, Demis, things happen that were outside of your control at Google in different groups. | ||
| Now I think he's got more authority. | ||
| But it's pretty hard to fully extract the woke mind virus. | ||
| I mean, you know, Google's been marinating in the woke mind virus for a long time. | ||
| Like, it's down-in-the-marrow type of thing. | ||
| You know, the question is how to get it out. | ||
| Is there a way to extract it though over time? | ||
| Could you program rational thought into AI, where it could recognize how these psychological patterns got adopted, how this stuff became a mind virus and a social contagion, how all these irrational ideas were pushed, and also how they were financed, how China is involved in pushing them with bots, and all these different state actors are involved in pushing these ideas? | ||
| Could it be able to decipher that and say this is really what's going on? | ||
| Yes, but you have to try very hard to do that. | ||
| So with Grok, we've tried very hard to get Grok to get to the truth of things. | ||
| And it's only really recently that we've been able to have some breakthroughs on that front. | ||
| And it's taken an immense amount of effort for us to overcome basically all the bullshit that's on the internet and for Grok to actually say what's true and to be consistent in what it says. | ||
| So, you know, it's like the other AIs you'll find are quite racist against white people. | ||
| I don't know if you saw that study that someone, like a researcher tested the various AIs to see how does it weight different people's lives. | ||
| Like somebody who's sort of white or Chinese or black or whatever in different countries. | ||
| And the only AI that actually weighed human lives equally was Grok. | ||
| And I believe with ChatGPT the calculation was like, a white guy from Germany is 20 times less valuable than a black guy from Nigeria. | ||
| So I'm like, that's a pretty big difference. | ||
| You know, Grok is consistent and weighs lives equally. | ||
| And that's clearly something that's been programmed into it. | ||
| Yes. | ||
| A lot of it is like if you don't actively push for the truth and you simply train on all the bullshit that's on the internet, which is a lot of woke mind virus bullshit, the AI will regurgitate those same beliefs. | ||
| So the AI essentially scours the internet and it's trained on all of it. Imagine the most demented Reddit threads out there, and the AI has been trained on that. | ||
| Reddit used to be so normal. | ||
| Yeah. | ||
| It did used to be normal. | ||
| Used to be interesting. | ||
| Used to go there and find all this cool stuff that people would talk about, post about, and just interesting, and great rooms where you could learn about different things that people were studying. | ||
| I think a big problem here is if your headquarters are in San Francisco, you're just living in a woke bubble. | ||
| So it's not just that people, say, in San Francisco are drinking woke Kool-Aid. | ||
| It is the water they swim in. | ||
| Like a fish doesn't think about the water, it's just in the water. | ||
| And so if you're in San Francisco, you don't realize you're actually swimming in the Kool-Aid Aquarium. | ||
| San Francisco is the woke Kool-Aid Aquarium. | ||
| And so your reference point for what is a centrist is totally out of whack. | ||
| So Reddit is headquartered in San Francisco. | ||
| Twitter was headquartered in San Francisco. | ||
| I moved X's headquarters to Texas, to Austin, which Austin, by the way, is still quite liberal, as you know. | ||
| And then the X and xAI headquarters are in Palo Alto, which is still California. | ||
| The engineering headquarters in Palo Alto are just on Page Mill. | ||
| But even Palo Alto is way more normal than San Francisco, Berkeley. | ||
| San Francisco, Berkeley is extremely left. | ||
| Like left of left, you need a telescope to see the center from San Francisco. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        It used to be such a great city. | |
| I mean, San Francisco has a tremendous amount of inherent beauty, no question about that. | ||
| And California has incredible weather and no bugs. | ||
| It's just like amazing. | ||
| But you ask, what's the cause of this? | ||
| It's just that if companies are headquartered in a location where the belief system is very far from what most people believe, then from their perspective, anything centrist is actually right-wing because they're so far left. | ||
| They're so far from the center in San Francisco that anything, they're just railed to maximum left. | ||
| So that's why I think you're centrist. | ||
| I mean, I think I'm centrist. | ||
| But from the perspective of someone on the far left, we look right-wing. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yeah. | |
| And they think anyone who's a Republican is basically like some fascist Nazi situation. | ||
| But what's so crazy is it's very easy to demonstrate just from Hillary's speeches from 2008 and Obama's speeches when they were talking about immigration. | ||
| They were as far right as Steve Bannon when it comes to immigration. | ||
| Yes. | ||
| Hillary was like very MAGA. | ||
| I'm sure you've seen that campaign speech, in which she was talking about, if anybody's committed a crime, get rid of them. | ||
| And if you're here, you pay a hefty fine and you have to wait in line. | ||
| It was really crazy. | ||
| It's crazy to listen to because it's like it's as MAGA as Marjorie Taylor Greene. | ||
| Yeah, I mean, have you seen these videos people post online where they'll take a speech from Obama or Hillary and they'll interview people on college campuses or something and say, what do you think of this speech by Trump? | ||
| And they're like, oh, I hate it. | ||
| He's a racist bigot. | ||
| I'm like, just kidding, that was Obama. | ||
| No, actually, that was Obama or Hillary. | ||
| To your point, literally, the center's been moved so far. | ||
| Yeah. | ||
| Yeah. | ||
| The left has gone so far left that they can't even see the center with a telescope. | ||
| And the danger, without you purchasing Twitter, was that that was going to sweep over the whole country and change where the levels were. | ||
| And so what would be rational and normal would be far left of what was rational and normal just a decade earlier. | ||
| Yeah. | ||
| So exactly. | ||
| So historically, You'd have San Francisco, Berkeley being very far left, but the sort of the fallout from the somewhat nihilistic philosophy of San Francisco, Berkeley would be limited in geography to maybe like a 10-mile radius, 20-mile radius, something like that. | ||
| But San Francisco and Berkeley happen to be co-located with Silicon Valley, with engineers who created information super weapons. | ||
| And those information super weapons were then hijacked by the far-left activists to pump far-left propaganda to everywhere on earth. | ||
| You remember that old RCA radio tower thing where it's like a radio tower on Earth and it's just broadcasting? | ||
| Yeah. | ||
| That's what happened, is that an extremist far-left ideology happened to be co-located with the smartest engineers in the world, who created information super weapons that were not intended for this purpose, but were hijacked by the extreme activists who lived in the neighborhood. | ||
| That's what happened. | ||
| They hijacked the modern equivalent of the RCA radio tower and broadcast that philosophy everywhere on earth. | ||
| Yeah, and you see the consequences, particularly in places that don't have free speech. | ||
| Yes. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Right. | |
| Like England. | ||
| Yeah, where they lock people up for memes and stuff. | ||
| Literally. | ||
| Literally. | ||
| 12,000 people this year. | ||
| 12,000? | ||
| 12,000. | ||
| 12,000 arrests for social media posts. | ||
| I mean, yeah, some of these things you read about it, and it's like literally someone had a meme on their phone that they didn't even send to anyone. | ||
| Right. | ||
| And they're in prison for that. | ||
| Yeah. | ||
| I mean, there was a case in Germany where a woman got a longer sentence than the guy that raped her because of something she said on a group chat. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Wow. | |
| Was it an immigrant who raped her? | ||
| Yes. | ||
| Yeah. | ||
| It was his culture. | ||
| Yeah. | ||
| He didn't know. | ||
| He didn't know better. | ||
| Yes. | ||
| I think she said something that was critical of his culture, and she got a longer sentence than the guy who raped her. | ||
| In Germany. | ||
| Just the UK, Europe, Germany, England thing seems so insane. | ||
| It is. | ||
| Totally insane. | ||
| I actually didn't realize it was such a huge number of people that got... | ||
| 12,000. | ||
| Yeah. | ||
| Far above Russia. | ||
| Far above China. | ||
| Far above anywhere on Earth. | ||
| UK is number one. | ||
| Well, you know, I talked to friends of mine in England, and I was like, hey, aren't you worried about this? | ||
| Like, you know, shouldn't you be protesting more? | ||
| And I mean, the problem is that the legacy mainstream media doesn't cover the stuff. | ||
| They're like, oh, everything's fine. | ||
| Everything's fine. | ||
| Most people aren't even aware of it until they come knocking on your door. | ||
| Yeah, until, like, so, I mean, these lovely sort of small towns in England, Scotland, Ireland, you know, they've been sort of living their lives quietly. | ||
| They're like hobbits, frankly. | ||
| In fact, J.R.R. Tolkien based the hobbits on people he knew in small-town England. | ||
| Because they were just like lovely people who liked to smoke their pipe and have nice meals and everything's pleasant. | ||
| The hobbits in the Shire. | ||
| The Shire: he was talking about places like Hertfordshire and Oxfordshire, the shires around the greater London area. | ||
| And the reason they've been able to enjoy the Shire is because hard men have protected them from the dangers of the world. | ||
| But since they have almost no exposure to the dangers of the world, they don't realize that they're there. | ||
| Until one day, you know, a thousand people show up in your village of 500 out of nowhere and start raping the kids. | ||
| This has now happened God knows how many times in Britain. | ||
| And the crazy thing. | ||
| Literally raping. | ||
| That's right. | ||
| Like some 10-year-old got raped in Ireland like last week. | ||
| Yeah, there's literal. | ||
| They snatched some kid. | ||
| Yeah. | ||
| Yeah. | ||
| And if you criticize it, you can get arrested. | ||
| And that's where it gets insane. | ||
| It's like, how are you not protecting them? | ||
| I think the Prime Minister of Ireland actually posted on X after, I think, some illegal migrant snatched a 10-year-old girl who was going to school or something and violently raped her. | ||
| And there was a – the people were very upset about this and they protested. | ||
| And the Prime Minister of Ireland, instead of saying, yeah, we really shouldn't be importing violent rapists into our country, criticized the protesters and didn't mention that the reason they were protesting was that a 10-year-old girl from their small town got raped. | ||
| So here's a question. | ||
| Why are they supporting this kind of mass immigration? | ||
| And what – is this – is there a plan involved in all this? | ||
| Is this incompetence? | ||
| Is this ignoring the fact that they don't have a handle on it? | ||
| So they're trying to silence dissent? | ||
| Like what is happening? | ||
| Because if you want to destroy civilization, if you want to destroy Western civilization – Which George Soros seems to want to do. | ||
| And, you know, there's a guy, I don't know if he's been on your show. | ||
| You know Gad Saad? | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yeah. | |
| Has he been on the show? | ||
| Good friend of mine. | ||
| Yeah, he's great. | ||
| He's been on multiple times. | ||
| Oh, great. | ||
| He's awesome. | ||
| Yeah. | ||
| He's got a good way to describe it, which is suicidal empathy. | ||
| Yes. | ||
| The idea is that they prey upon people's empathy. | ||
| Like, you feel sorry for some group. | ||
| And that empathy is taken to such a degree that it is suicidal to your country or culture. | ||
| That's suicidal empathy. | ||
| It's not that I don't think we should have empathy, but that empathy should extend to the victims, not just the criminals. | ||
| We should have empathy for the people that they prey upon. | ||
| But that suicidal empathy is also responsible for why somebody is arrested 47 times for violent offenses, gets released, and then goes and murders somebody in the U.S. You see that same phenomenon playing out everywhere, where the suicidal empathy is to such a degree that we're actually allowing our women to get raped and our children to get killed. | ||
| But it just doesn't seem like that would be anything that any rational society would go along with. | ||
| That's what makes me so confused. | ||
| It's like you're importing massive numbers of people that come from some really dark places of the world. | ||
| Well, there's no vetting is the issue. | ||
| If there's no vetting, like people are just coming through, like, well, what's to stop someone who just committed murder in some other country from coming to the United States or coming to Britain and just continuing their career of rape and murder? | ||
| Unless you've done some due diligence to say, like, well, who is this person? | ||
| What's their track record? | ||
| If you haven't confirmed that they have a track record of being honest and not being a homicidal maniac, then any homicidal maniac can just come across the border. | ||
| Let's not say everyone who comes across the border is a homicidal maniac, but if you don't have a vetting process to confirm that you're not letting in people who will do some serious violence, you will get people who do serious violence sometimes coming through. | ||
| Well, especially if you don't punish them, and if you don't deport them. | ||
| And if you are just like, but what is the purpose of allowing all those people into the country? | ||
| I wouldn't imagine that anyone in their society supports that. | ||
| Well, let me explain. | ||
| Because you mentioned, for example, how much, say, Hillary and Obama have changed their tune from prior speeches where they were hard-nosed about not letting in anyone who is a criminal into the country, having secure borders, all that stuff. | ||
| So why did they change their tune? | ||
| The reason is that they discovered that those people vote for them. | ||
| That's why they want the open borders. | ||
| Because if you let people in, they know the Democrats let them in. | ||
| They'll vote for Democrats. | ||
| Yes, if you allow them to vote, which they're actively trying to do. | ||
| They turn a blind eye to illegal voting. | ||
| Well, California literally doesn't allow you to show your license. | ||
| California and New York have made it illegal to show your photo ID when voting. | ||
| Thus, effectively, they've made it impossible to prove fraud. | ||
| Impossible. | ||
| They've essentially legalized fraudulent voting in California and New York and many other parts of the country. | ||
| There's no rational explanation that I've ever seen anyone give as to why that would be the policy. | ||
| Unless you were trying to just allow people to vote illegally, because there's no other reason. | ||
| If you need a driver's license or you need an ID for everything else, including just recently to prove that you were vaccinated. | ||
| The same people who are demanding that you have a vaccine passport are the same ones saying you need no ID to vote. | ||
| Same people. | ||
| Right. | ||
| So it's obviously hypocritical and inconsistent. | ||
| So you really think it's just to get more voters? | ||
| If you want to understand behavior, you have to look at the incentives. | ||
| So once the Democratic Party in the U.S. and the left in Europe realized that if you have open borders and you provide a ton of government handouts, which creates a massive financial incentive for people from other countries to come to your country, and you don't prosecute them for crime, they're going to be beholden to you and they will vote for you. | ||
| And that's why Obama and Hillary went from being against open borders to being in favor of open borders. | ||
| That's the reason. | ||
| In order to import voters so they can win elections. | ||
| And the problem is that that has a negative runaway effect. | ||
| So if they get away with that, it is a winning strategy. | ||
| If they are allowed to get away with it, they will import enough voters to get supermajority voting, and then there is no turning back. | ||
| We talked about this before the election. | ||
| And then you literally faced the camera and said that if you do not vote now, you might not ever be able to do it again because it'll be futile. | ||
| It'll be overrun. | ||
| Yes. | ||
| They'll keep the borders open for another four years and then their objective will be achieved. | ||
| Correct. | ||
| If Trump had lost, there would never have been another real election again. | ||
| Because Trump is actually enforcing the border. | ||
| Now, you can point to situations where immigration enforcement has been overzealous, because they're not going to be perfect. | ||
| There'll be cases where they've been overzealous in expelling illegals. | ||
| But if you say that the standard must be perfection for expelling illegals, then you will not get any expulsion because perfection is impossible. | ||
| And you've probably got millions of people that are here that are trying to be here under some asylum pretense. | ||
| Right? | ||
| Yes. | ||
| Like you could just come from a war. | ||
| They changed the definition of asylum to be economic asylum. | ||
| Which is everybody. | ||
| Which is everybody. | ||
| Yeah. | ||
| So real asylum is a high bar to prove. | ||
| Yeah. | ||
| Asylum is supposed to mean that if you go back to your country, you'll get killed. | ||
| That's what we mean by asylum. | ||
| That was what it was supposed to mean. | ||
| They changed the definition of asylum to be you will have a decreased standard of living, which is obviously not real asylum. | ||
| And you can test the absurdity of this by the fact that people who are asylum seekers go on vacation to the country that they're seeking asylum from. | ||
| You know, that doesn't make any sense. | ||
| Yeah. | ||
| It doesn't have to. | ||
| When you understand the incentives, then you understand the behavior. | ||
| So once the left realized that illegals will vote for them if they have open borders and combine that with government handouts to create a massive incentive, | ||
| they're basically using U.S. and European taxpayer dollars to provide a financial incentive to bring in as many illegals as possible to vote them into permanent power and create a one-party state. | ||
| I invite anyone who's listening to this, just do any research. | ||
| And the more you dig into it, the more it will become obvious that what I'm saying is absolutely true. | ||
| Well, they were busing people to swing states. | ||
| It's clear that they were trying to do something. | ||
| And then you had Chuck Schumer and Nancy Pelosi who are actively talking about the need to bring in people to make them citizens because we're in population collapse. | ||
| Yes. | ||
| Yeah. | ||
| No, it's that meme. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yeah. | |
| So many times they start off by saying it's not true. | ||
| It's a right-wing conspiracy theory. | ||
| Right. | ||
| Then I think the next step is, well, it might be true. | ||
| And then it's like, OK, it is true. | ||
| But here's why it's good. | ||
| And then the final step is, it's true, it's good, and we must do more of it. | ||
| Yeah. | ||
| And it's like, but wait a second. | ||
| You started off saying it's untrue and a right-wing conspiracy theory. | ||
| Now you're saying it not only is it true, but it's a good thing and we must do more of it. | ||
| Well, this is the thing about Medicaid and Social Security, and people who were here illegally getting Social Security numbers. | ||
| It's massive fraud. | ||
| It's massive fraud. | ||
| And it's real. | ||
| And they denied it forever. | ||
| And now we're finding out this is part of the reason why there's this government shutdown that's going on right now. | ||
| Yes. | ||
| The entire basis for the government shutdown is that the Trump administration correctly does not want to send hundreds of billions of dollars to fund illegal immigrants in the blue states, or in all the states really. | ||
| And the Democrats want to keep the money spigot going to incentivize illegal immigrants to come into the U.S. who will vote for them. | ||
| That's the crux of the battle. | ||
| So they want to stop this. | ||
| So what's going on right now is they have been funding these people. | ||
| They've been giving them EBT cards. | ||
| They've been giving them Medicaid. | ||
| And they've been even housing them. | ||
| And more than that, they were taking hotels, four- and five-star hotels, the Roosevelt Hotel being the classic example. They were sending I think $60 million a year to the Roosevelt Hotel, and all it did was house illegals. | ||
| It used to be a nice hotel. | ||
| I mean it still is a nice hotel. | ||
| But – and all around the country this was happening. | ||
| And all tax dollars. | ||
| Yes. | ||
| Yeah. | ||
| And – Yeah. | ||
| And the Trump administration cut off funding, for example, to the Roosevelt Hotel and these other hotels, saying U.S. tax dollars should not be spent on luxury hotels for illegal immigrants that American citizens can't even afford. | ||
| Which obviously is the case. | ||
| That's insane. | ||
| That's what was happening. | ||
| They were also giving out like debit cards with $10,000. | ||
| So it's not just about medical care. | ||
| The Democrats mention medical care because they're trying to prey on people's empathy as much as possible. | ||
| And then they imagine, oh, wow, somebody has a desperately needed medical procedure. | ||
| And shouldn't we maybe do – you know, take care of them in that regard? | ||
| But what they do is they divert the Medicaid funds and turn it into a slush fund for the states that goes well beyond emergency medical care. | ||
| New York and California would be bankrupt without the massive fraudulent federal payments that go to those states to pay for illegals and create a massive financial incentive for illegals. | ||
| How would they be bankrupt because of that? | ||
| They wouldn't be able to balance their state budgets and they can't issue currency like the Federal Reserve can. | ||
| And so their ability to balance budgets is dependent upon illegals getting funding? | ||
| The scam level here is so staggering. | ||
| So there are hundreds of billions of dollars of transfer payments from the federal government to the states. | ||
| The states self-report what those transfer payment numbers should be. | ||
| So California and New York and Illinois lie like crazy and say that these are all legitimate payments. | ||
| Well, these days I think they're even admitting that they literally want hundreds of billions of dollars for illegals. | ||
| But for a while there, they were trying to deny it. | ||
| So you get these transfer payments for every government program you can possibly think of. | ||
| And these are self-reported by the state. | ||
| And at least historically, there was no enforcement of California, New York, Illinois and other states when they would lie. | ||
| There was no actual enforcement to say like, hey, you're lying. | ||
| These payments are fraudulent. | ||
| Now under the Trump administration, the Trump administration does not want to send hundreds of billions of dollars of fraudulent payments to the states. | ||
| And the reason you have this standoff is that if the hundreds of billions of dollars that create this giant magnet attracting illegals from every part of earth to these states are turned off, the illegals will leave, because they're no longer being paid to come to the United States and stay here. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Wow. | |
| And then they will lose a lot of voters. | ||
| The Democratic Party will lose a lot of voters. | ||
| And if this is kicked out, they would have a very difficult job reintroducing it in a new bill. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yes. | |
| Especially once things start normalizing. | ||
| Yes. | ||
| So like in a nutshell, the Democratic Party wants to destroy democracy by importing voters. | ||
| And the, you know, the Republican Party disagrees with that. | ||
| And the ruse is that if you don't accept what they're doing, then you're a threat to democracy. | ||
| Yes. | ||
| As they try to destroy democracy. | ||
| Yes. | ||
| By importing voters. | ||
| That is literally what they're doing. | ||
| And incentivizing people to only vote for them. | ||
| And overwhelming the system. | ||
| Yes. | ||
| And by the way, it's a strategy that if allowed to work, would work. | ||
| And in fact, has worked. | ||
| California is super majority Democrat. | ||
| Yeah. | ||
| And there's so much gerrymandering that occurs. | ||
| It's crazy. | ||
| I'm sure you're paying attention to this Proposition 50 thing. | ||
| No. | ||
| That's the thing in California where they're trying to redo districts. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Oh, yeah, yeah. | |
| Yeah. | ||
| Because, I mean, California is already gerrymandered like crazy. | ||
| Yeah. | ||
| They want to gerrymander it even more. | ||
| Because it keeps moving further and further right. | ||
| Like if you look at the map of California, each voting cycle, more and more people are waking up and going, what the fuck? | ||
| And we need to do something to fix this. | ||
| The only option available, other than the policies that you guys have always done, is to go right. | ||
| And so a lot of people have been, air quotes, red-pilled. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Mm-hmm. | |
| Yeah. | ||
| And here's another very important fact that is actually not disputed by either side: when we do the census in the United States, the way the census works for apportionment of congressional seats and electoral college votes for the president is by number of persons in a state, not number of citizens. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Right. | |
| It's number of people. | ||
| So you could literally be a tourist and you will count. | ||
| Now, how do they do the census when they do that? | ||
| Do they ask people? | ||
| Do they knock on doors? | ||
| Do they have them fill out forms? | ||
| Yeah, I think they mail out census forms and knock on doors. | ||
| But the way the law reads right now is that if you are a human with a pulse, then you count in the census for allocating congressional seats and presidential votes. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Right. | |
| Electoral college, congressional seats, everything. | ||
| It doesn't matter whether you're here. | ||
| Legally, illegally, if you're a human with a pulse, you count for congressional apportionment. | ||
| So that means that the more people, the more illegals that California and New York can import by the time the census happens in 2030, the more congressional seats they will have and the more presidential electoral college votes they will have. | ||
| So they're trying to get as many illegals in as possible ahead of the census. | ||
| And because all human beings, even tourists, count for the census. | ||
| And then you combine that with gerrymandering of districts in New York and California, with this proposition where they're trying to increase the amount of gerrymandering that occurs in California, the biggest state in the country. | ||
| So the census would then award more congressional seats to California, New York, and Illinois because of the vast number of illegals there. | ||
| They'll get more presidential electoral college votes, which would get them a majority in the House, and they would get to decide who is president, literally based on illegals. | ||
| These are not disputed facts by either party. | ||
| I want to emphasize that this is accepted by both camps. | ||
| Yeah, this is not a conspiracy. | ||
| These are not disputed facts by either party. | ||
| It's not a – these are just – this is just the way the law works. | ||
| It is – like I don't think the law should work that way. | ||
| I think it should – the apportionment should be proportionate to citizens. | ||
| But isn't that a problem with how the Constitution is written? | ||
| Yeah, yeah. | ||
| Yeah. | ||
| They can't really change that. | ||
| I'm not sure if it's in the Constitution or not in this way. | ||
| But that is the way the law is written. | ||
| So it is an incentive. | ||
| But it's an incentive that would be removed with something simple that makes sense to everybody: only official U.S. citizens should count. | ||
| Yes. | ||
| The way it should work is that only U.S. citizens should count in the census for purposes of determining voting power. | ||
| Because people that aren't legal can't vote supposedly. | ||
| They're not supposed to be voting, but they do. | ||
| But even besides that, like I said, I just can't emphasize this enough because this is a very important concept for people to understand: the law as it stands counts all humans with a pulse in a state for deciding how many House of Representatives seats and how many presidential electoral college votes a state gets. | ||
| So the incentive, therefore, is for California, New York, and Illinois to maximize the number of illegals so that House seats are taken away from red states and assigned to California, New York, Illinois and so forth. | ||
| Then you combine that with extreme gerrymandering in California, New York, Illinois and whatnot so that basically you can't even elect any Republicans and then they get control of the presidency, control of the House. | ||
| Then they keep doing that strategy and cement a supermajority. | ||
| That is what they're trying to do. | ||
| So that would essentially turn the entire country into California. | ||
| Yes. | ||
| Where you have differing opinions, but it doesn't matter because one party is always in control. | ||
| Yes. | ||
| When you first started digging into this, when you first started – before you even accepted this role of running Doge and being a part of all that, did you have any idea that it was this fucked up? | ||
| I did, yeah. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        I mean I sort of – When did you start knowing? | |
| I guess about like – well, about two years ago. | ||
| Isn't that crazy? | ||
| And like relatively recently, you know. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yeah. | |
| So I started basically having a bad feeling about three years ago, which is why I felt it was critical to acquire Twitter and have a maximally truth-seeking platform, not one that suppresses the truth. | ||
| And like it was more like – I'm like, I'm not sure what's going on, but I have a bad feeling about what's going on. | ||
| And then the more I dug into it, the more I was like, holy shit, we've got a real problem here. | ||
| America is going to fall. | ||
| So – Without anyone knowing it had fallen, that would be the problem. | ||
| It could have fallen and been unrepairable without anyone really being aware of what had happened. | ||
| Especially if you didn't buy Twitter. | ||
| Yes. | ||
| Look, buying Twitter was a huge pain in the ass and made me a pincushion of attacks. | ||
| Like dab, dab, dab, dab, dab. | ||
| Everybody loved you before that. | ||
| Well, some people – A lot of people loved you. | ||
| A lot of lefties loved you. | ||
| I was a hero of the left. | ||
| It's fair to say. | ||
| It was a thing. | ||
| If you drove a Tesla, it showed that you were environmentally conscious and you were on the right side. | ||
| Yeah. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yeah. | |
| I mean, I'm still the same human. | ||
| I didn't, like, have a brain transplant, you know, in the last three years. | ||
| Well, that's my favorite bumper sticker that people put on Teslas now. | ||
| I bought this before Elon went crazy. | ||
| I took a picture of one the other day. | ||
| Oh, you found this? | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        I was behind somebody. | |
| Oh, yeah. | ||
| I've seen three or four of them. | ||
| People that have these bumper stickers on their car that says, I bought this before Elon went crazy. | ||
| Because when people were vandalizing Teslas. | ||
| Yeah. | ||
| The most unhinged. | ||
| Well, there was an organized campaign to literally burn down Teslas. | ||
| And we had one of our dealerships got shot up with a gun. | ||
| Like, they fired bullets into the Tesla dealership. | ||
| They were burning down cars. | ||
| It was crazy. | ||
| But there should be an addendum to the bumper sticker. | ||
| It's like, I bought this car before Elon went crazy. | ||
| Actually, now I realize he's not crazy and I've seen the light. | ||
| That'll take some time. | ||
| That'll take some time. | ||
| People don't want to admit that they've been tricked. | ||
| Yeah. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        They don't like that. | |
| That old saying where it's like, it's really easy to fool somebody, but it's almost impossible to convince someone that they were fooled. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yeah. | |
| It's much easier to fool them than to convince them they've been fooled. | ||
| People cling to their ideas. | ||
| Yes. | ||
| Especially if they've, like, publicly stated these things. | ||
| They get very embarrassed of being foolish. | ||
| Yeah. | ||
| People – most of the time they double down. | ||
| And they find echo chambers. | ||
| Yeah, yeah. | ||
| But there's – you know, the thing is that – like, you know, I've seen more and more people who were convinced of the sort of woke ideology see the light. | ||
| Yeah. | ||
| So, not everyone, but it's more and more are seeing the light. | ||
| And it tends to happen, like, when something happens that really, you know, directly affects you. | ||
| Right. | ||
| Like, there was a friend of mine who was living in the San Francisco Bay Area, and they tried to trans his daughter. | ||
| Like, to the point where the school, like, sent the police to his house to take his daughter away from him. | ||
| Now, that's going to radicalize you. | ||
| Well, that's going to break – that's going to shake you out of your belief structure. | ||
| Now, I know – So, it was an activist at the school that was trying to do this? | ||
| Yeah. | ||
| Yeah, the school and the state of California conspired to turn his daughter against him and make her take life-altering drugs that would have sterilized her, irreversibly. | ||
| And how old was she? | ||
| I think 14, something like that. | ||
| So, he managed to talk the police out of taking his daughter away from him that day. | ||
| And that night, he got on a plane to Texas. | ||
| Wow. | ||
| And, you know, a year after just being in a school in, like, greater Austin area, she went back to normal. | ||
| Meaning, like, it wasn't real. | ||
| Right. | ||
| Well, people are being much more open to that now. | ||
| I mean, Wall Street Journal yesterday had that opinion piece that this whole trans thing, there's a lot of evidence, this is a social contagion. | ||
| Absolutely. | ||
| And Colin Wright wrote that, and then he's getting death threats now, of course, and on Bluesky, there's people talking about exterminating him, which is one thing that you are allowed to say on Bluesky, apparently. | ||
| You're allowed to say horrible things about people who say possibly truthful things about this whole social contagion. | ||
| Because that's what, when you get nine kids that are in a friend group and they all decide to turn trans together. | ||
| Yeah. | ||
| Something's wrong. | ||
| Something's wrong. | ||
| That's not statistically feasible. | ||
| Like, you can convince kids to do anything. | ||
| You can convince kids to be a suicide bomber. | ||
| Right. | ||
| So. | ||
| Which is why they do with, in some countries, why they choose children to do that. | ||
| Yes. | ||
| Yeah. | ||
| You can train kids to be suicide bombers. | ||
| And if you can train kids to be suicide bombers, you can convince them of anything. | ||
| Yeah. | ||
| Especially with enough positive reinforcement. | ||
| And cultural reinforcement. | ||
| And the idea that that's not the case. | ||
| Kids are malleable. | ||
| Yes. | ||
| The minds of youth are easily corrupted. | ||
| You're also seeing a lot of pushback from gay and lesbian people that are saying, like, hey, if someone did this to me. | ||
| So stop including me. | ||
| Yeah, exactly. | ||
| Yeah. | ||
| The LGBT, you know, it's like, wait a second. | ||
| Why are we being included all the time in this situation? | ||
| Exactly. | ||
| Exactly. | ||
| Especially when, you know, like my friend Tim Dillon's talked about this. | ||
| It's like, it's really homophobic. | ||
| Because you're taking these gay kids and you're telling them, like, hey, you're not gay. | ||
| You're actually a girl. | ||
| Yes. | ||
| And, you know, hey, go make it so that you can never have an orgasm again. | ||
| Right. | ||
| And you'll be happy. | ||
| Like, fucking crazy. | ||
| Permanent mutilation, permanent castration of kids is, like, I think we should look at anyone who permanently castrates a kid as, like, right up there with Josef Mengele. | ||
| Yeah. | ||
| I mean, they're mutilating children. | ||
| Yeah. | ||
| Yeah. | ||
| And it's thought of as being kind. | ||
| And the thing is, would you rather have a live daughter or a dead son? | ||
| That's the line they use. | ||
| Yeah. | ||
| Which is not supported by any data. | ||
| No. | ||
| It's all bullshit. | ||
| The probability of suicide increases. | ||
| This is important maybe for the audience to know. | ||
| The probability of suicide increases if you trans a kid, not decreases. | ||
| By some accounts, it triples. | ||
| So that is an evil lie. | ||
| And it's a lie that is supposedly compassionate. | ||
| Imagine you've twisted reality to the point where confusing a child that's not even legally allowed to get a fucking tattoo. | ||
| Yeah. | ||
| Right? | ||
| Because you think that you could make a mistake with a tattoo. | ||
| Exactly. | ||
| A totally removable thing. | ||
| Right. | ||
| If I wanted to tomorrow, I could go to a doctor and they could laser off every tattoo that I have on me. | ||
| Right. | ||
| Okay. | ||
| No harm, no foul. | ||
| Yeah. | ||
| But you get sterilized. | ||
| Like, that's it forever. | ||
| Forever. | ||
| Yes. | ||
| They'll castrate you. | ||
| You no longer have testicles. | ||
| Yes. | ||
| You have a hole where your penis used to be. | ||
| Yes. | ||
| And this is compassionate. | ||
| And this is preventing you from killing yourself. | ||
| Actually, a lot of kids die with these sex change operations. | ||
| They die. | ||
| The number of deaths on the operating table, people don't hear about those. | ||
| A lot of kids. | ||
| Because we don't really actually have the technology to make this work. | ||
| So a bunch of times the kids just die in the sex change operations. | ||
| Jesus Christ. | ||
| Yeah. | ||
| It's demented. | ||
| It should be viewed as, like, you know, like evil Nazi doctor stuff. | ||
| Well, that's why it was so— Like real Nazi, not the bullshit fake Nazi stuff. | ||
| Crazy that even pushing back against something that seems, like, fundamentally, logically very easy to argue, the old Twitter would ban you forever. | ||
| Yes. | ||
| That's how crazy a social contagion can get when it completely defies logic, victimizes children, does something that makes no sense, is not supported by data, all connected to this ideology that trans is good. | ||
| We've got to save trans kids, protect trans kids. | ||
| Yeah. | ||
| And what I want to emphasize is that the save trans kids thing is a lie. | ||
| If you castrate kids and trans them, the probability of suicide increases, it does not decrease. | ||
| It substantially increases. | ||
| The studies that I've seen show the risk of suicide triples if you trans kids. | ||
| So, you're not saving them, you're killing them. | ||
| Moreover, during the sex change operation, there are many deaths that occur during the sex change operation. | ||
| Jesus Christ. | ||
| It's just crazy that this is a real issue. | ||
| Yeah. | ||
| It's a nightmare fever dream. | ||
| And people are finally waking up from it. | ||
| Now, when you started getting into the Doge stuff and started finding how much money is being shuffled around and moved around to NGOs and how much money is involved and just totally untraceable funds, like, this is, again, something like two years plus ago, you weren't aware of it all? | ||
| No, I was aware of it. | ||
| I just didn't realize how big it was. | ||
| It was just – how much waste and fraud there is in the government is truly vast. | ||
| In fact, the government didn't even know, nor did they care. | ||
| That's crazy. | ||
| Yeah. | ||
| And, I mean, just, like, some of the very basic stuff that Doge did will have lasting effects. | ||
| And some of these things, like, they're so elementary you can't believe it. | ||
| So, the Doge team got most of the main payments computers to require the congressional appropriation code. | ||
| So, when a payment is made, you have to actually enter the congressional appropriation code. | ||
| That used to be optional and often would be just left blank. | ||
| So, the money would just go out, but it wasn't even tied to a congressional appropriation. | ||
| And then, the Doge team also made the comment field for the payment mandatory. | ||
| So, you have to say something. | ||
| We're not saying that what is said – like, you can say anything. | ||
| Your cat could run across the keyboard. | ||
| You could go, QWERTY ASDF. | ||
| But you have to say something above nothing because what we found was that there were tens of billions, maybe hundreds of billions of dollars that were zombie payments. | ||
| So, they're – like, somebody had approved a payment. | ||
| Somebody in the government approved a payment and – some recurring payment. | ||
| And they retired or died or changed jobs and no one turned the money off. | ||
| So, the money would just keep going out. | ||
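The two controls described here — a payment must carry a congressional appropriation code, and its comment field can no longer be blank — can be sketched as a simple pre-payment validation. This is a minimal illustration of the idea only, not the actual Treasury system; the field names (`appropriation_code`, `comment`) and the sample appropriation code are hypothetical.

```python
# Minimal sketch of the payment-field controls described above.
# Field names and sample values are hypothetical, not the real Treasury schema.

def validate_payment(payment: dict) -> list[str]:
    """Return a list of reasons to reject the payment (empty list = OK)."""
    problems = []
    # Previously optional and often left blank: the payment must now cite
    # the congressional appropriation that authorizes it.
    if not payment.get("appropriation_code"):
        problems.append("missing congressional appropriation code")
    # The comment must say *something* — its content is not verified,
    # but an empty field is no longer allowed.
    if not payment.get("comment", "").strip():
        problems.append("empty comment field")
    return problems

# A "zombie" recurring payment: no appropriation code, no comment.
zombie = {"payee": "ACME LLC", "amount": 125_000, "comment": ""}
print(validate_payment(zombie))
# -> ['missing congressional appropriation code', 'empty comment field']

# A payment carrying both required fields passes.
ok = {"payee": "ACME LLC", "amount": 125_000,
      "appropriation_code": "PL-117-103", "comment": "grant disbursement"}
print(validate_payment(ok))
# -> []
```

The point of the mandatory-comment rule, as described, is not verification but traceability: a payment with no appropriation code and no comment has no one accountable for it at all.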
| And it's a pretty rare – You go where? | ||
| To a company or an individual. | ||
| And it's a pretty rare company or individual who will complain that they're getting money that they should not get. | ||
| And a bunch of the money was just going to the – were transfer payments to the states. | ||
| So, these are automatic payments. | ||
| Yeah, just automatic payments. | ||
| No accounting for them at all. | ||
| Imagine, like, there's an automatic debit of your credit card. | ||
| And you never look at the statement. | ||
| Right. | ||
| So, it's just money going out. | ||
| Of course, I call them zombie payments. | ||
| They might have been legitimate at one point. | ||
| But the person who approved that recurring payment changed jobs, died, retired or whatever. | ||
| And no one ever turned the money off. | ||
| And my guess is that's probably at least $100 billion a year. | ||
| Maybe $200 billion. | ||
| And going where? | ||
| To – I mean, there are millions of these payments. | ||
| So, it's – I mean – Millions. | ||
| Yes, yes. | ||
| Millions of payments that are going to who knows where. | ||
| Yes. | ||
| So, in a bunch of cases, there are fraud rings that operate – professional fraud rings that operate to exploit the system. | ||
| They figure out some security hole in the system and they just do professional fraud. | ||
| And that's where we found, for example, people who were, you know, 300 years old in the Social Security Administration database. | ||
| Now, I thought that this was a mistake of not registering their deaths. | ||
| That people were born like a long time ago and it had defaulted to like a certain number. | ||
| And so, that after time, those people were still in the system. | ||
| It was just an error of the way the accounting was done. | ||
| Yeah. | ||
| So, that's not true. | ||
| So, there's – or at least one of two things must be true. | ||
| There's a typo or some mistake in the computer or it's fraudulent. | ||
| But we don't have any 300-year-old vampires living in America. | ||
| Allegedly. | ||
| Allegedly. | ||
| And we don't have people in some cases who are receiving payments who are born in the future. | ||
| Born in the future? | ||
| Born in the future. | ||
| Really? | ||
| Yes. | ||
| The people receiving payments whose birth date was like 2100 and something. | ||
| Okay. | ||
| So, there's – Like next century. | ||
| Is there a task force? | ||
| We know that one of two things must be true, that either there's a mistake in the computer or it's fraud. | ||
| But if you have someone's birthday that's either in the future or where they are older than the oldest living American because the oldest living American is 114 years old. | ||
| So, if they're more than 114 years old, there is either a mistake and someone should call them and say, I think we have your birthday wrong because it says you were born in 1786. | ||
| And, you know, that was before, you know, before there was really an America, you know, it was like, you know, that's kind of early. | ||
| You know, we're still fighting England type of thing. | ||
| You know, it's like this person either needs to be in the Guinness Book of World Records or they're not alive. | ||
| But still, at the end of the day, money is going towards that account that's connected to this person that is either nonexistent or dead. | ||
| So, like, yeah, so there was like, I think, something like, I don't know, 20 million people in the Social Security Administration database that could not possibly be alive. | ||
| If their birth date is, like, based on their birth date, they could not possibly be alive. | ||
| And then to be clear, 20 million people that were receiving funds? | ||
| A bunch of – most of them were not receiving funds. | ||
| Some of them were receiving funds. | ||
| Most were not receiving funds. | ||
| But so let me tell you how the scam works. | ||
| It's a bank shot. | ||
| So the Social Security Administration database is used as a source of truth by all the other databases that the government uses. | ||
| So even if they stop the payments on the Social Security Administration database, like unemployment insurance, small business administration, student loans, all check the Social Security Administration database to say, is this a legitimate, alive person? | ||
| And the Social Security database will say, yes, this person is still alive even though they're 200 years old. | ||
| But forget to mention that they're 200 years old. | ||
| It just says – it just returns – when the computer is queried, it says, yes, this person is alive. | ||
| And so then they're able to exploit the entire rest of the government ecosystem. | ||
| So then you get fake student loans. | ||
| Then you get fake unemployment insurance. | ||
| Then you get fake medical payments. | ||
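The "bank shot" as described works because downstream systems ask the Social Security database only a bare yes/no liveness question, so the impossible birth date never reaches them. A toy sketch of that query and the missing sanity check — all records, SSNs, and field names here are invented for illustration:

```python
from datetime import date

# Toy stand-in for the SSA database; every record here is invented.
SSA = {
    "123-45-6789": {"name": "J. Doe",  "born": date(1960, 4, 2),  "dead": False},
    "987-65-4321": {"name": "Fraud A", "born": date(1786, 1, 1),  "dead": False},
    "555-00-1111": {"name": "Fraud B", "born": date(2107, 6, 9),  "dead": False},
}

def is_alive(ssn: str) -> bool:
    """What downstream systems (unemployment, SBA, student loans) effectively
    ask: a bare yes/no. The implausible birth date never reaches the caller."""
    rec = SSA.get(ssn)
    return bool(rec) and not rec["dead"]

def is_plausible(ssn: str, today: date = date(2025, 1, 1)) -> bool:
    """The missing sanity check: flag birth dates in the future, or older
    than the oldest living American (~114 years, per the conversation)."""
    rec = SSA.get(ssn)
    if rec is None:
        return False
    age_years = (today - rec["born"]).days / 365.25
    return 0 <= age_years <= 114

# Both fraudulent records pass the bare liveness query...
print([is_alive(s) for s in SSA])      # [True, True, True]
# ...but only the real record survives the plausibility check.
print([is_plausible(s) for s in SSA])  # [True, False, False]
```

The fix described in the conversation amounts to running the second query instead of the first: once an impossible record is marked dead, every downstream system that keys off that number stops paying.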
| And this doesn't have to be tied to an individual where there's an address where you can check on this person? | ||
| No. | ||
| If you did – if you just did any check at all, you would stop this. | ||
| So that's – And how much money do you think is being – Any check – like anything at all would stop the fraud. | ||
| Like any effort at all. | ||
| Yeah. | ||
| So there's multiple layers. | ||
| Yes. | ||
| The Social Security number verifies that this is a real person. | ||
| Right. | ||
| And then the other systems check up on – Every other government payment system for everything – like I said, Small Business Administration, student loans, Medicaid, Medicare, every other government payment, of which there are many. | ||
| There are actually hundreds of government payment systems. | ||
| That's going to be exploited so long as Social Security database says this person is alive. | ||
| That's the nature of the scam. | ||
| It's a bank shot. | ||
| So then the rebuttal from the Dems is like, oh, well, the vast majority of the people who are marked as alive in the Social Security Administration weren't receiving Social Security Administration payments. | ||
| That is true. | ||
| What they forgot to mention is they're getting fraudulent payments from every other government program. | ||
| And that's why the Dems were so opposed to turning off – to declaring someone dead who was dead because it would stop the entire other – all the other fraud from happening. | ||
| And so – but all this – is it trackable? | ||
| Like all this other fraud, if they wanted to, they could chase it all down. | ||
| Yeah. | ||
| It's not even hard. | ||
| And yet they're opposing chasing it all down. | ||
| They're opposing chasing it all down because it turns off the money magnet for the illegals. | ||
| Wow. | ||
| Because it's very logical to – like I'm saying the most common-sense things possible. | ||
| If someone's got a birthday in Social Security that is an impossible birthday, meaning they are older than the oldest living American or born in the future, then you should call them and say, excuse me, we seem to have your birthday wrong because it says that you're 200 years old. | ||
| That's all you need to do. | ||
| And then you would remove them from the Social Security database and make that number no longer available for all those other government payments. | ||
| Exactly. | ||
| Wow. | ||
| And how much money are we talking? | ||
| It's hundreds of billions of dollars. | ||
| And this is all traceable. | ||
| Like you could hunt all this down. | ||
| Like you don't need to be Sherlock Holmes here is what I'm saying. | ||
| Well, this is – We don't need to call Sherlock Holmes for this one. | ||
| Is this part of the – We just need to call the person and say, excuse me, we seem to have – like we must have your birthday wrong because it says you're 200 years old or were born in the future. | ||
| So could you tell us what your birthday is? | ||
| That's what we need to do. | ||
| It's that simple. | ||
| But all these other government payments that are available that are connected to this Social Security number, it seems like if you just chased that all down, you would find the widespread fraud. | ||
| You would find where it's going. | ||
| Yes. | ||
| But the root of the problem is the Social Security Administration database because the Social Security number in the United States is used as a de facto national ID number. | ||
| That's why – like the bank always asks for your social – like any financial institution will ask for your Social Security number. | ||
| This is – it sounds so insane that this isn't chased down. | ||
| Yeah, I agree. | ||
| That – I mean that in and of itself is – that's such mishandling. | ||
| Yes. | ||
| It's mind-blowing. | ||
| So yeah, it's crazy. | ||
| Well, you were very reluctant last time you were here to talk about the extent of some of the fraud because you're like, they could kill me because this is kind of – Oh, yeah, what I'm saying is that – like if you create – like to be pragmatic and realistic, you actually can't manage to zero fraud. | ||
| You can manage to a low fraud number, but not to zero fraud. | ||
| If you manage to zero fraud, you're going to push so many people over the edge who are receiving fraudulent payments that the number of inbound homicidal maniacs will be really hard to overcome. | ||
| So I'm actually taking, I think, quite a reasonable position, which is that we should simply reduce the amount of fraud, which I think is not an extremist position. | ||
| And we should aspire to, you know, have less fraud over time. | ||
| Not that we should be ultra draconian and eliminate every last scrap of fraud, which I guess would be nice to have. | ||
| But like we don't even need to go that extreme. | ||
| I'm saying we should just stop the blatant large-scale super obvious fraud. | ||
| I think that's a reasonable position. | ||
| It's a very reasonable position. | ||
| And so what was the most shocking pushback that you got when you started implementing Doge, when you started investigating into where money was going? | ||
| Well, I guess this is – I should have anticipated this. | ||
| But while most of the fraudulent government payments to – especially to the NGOs go to the Democrats, most of it – like, I don't know, for argument's sake, let's say 80 percent. | ||
| Maybe 90 percent. | ||
| 10 to 20 percent of it does go to Republicans. | ||
| And so when we turn off funding to a fraudulent NGO, we'd get complaints from whatever, the 10 percent of Republicans who are receiving the money. | ||
| And they would, you know, they would very loudly complain. | ||
| Because the honest answer is the Republicans are partly – they're receiving some of the fraud too. | ||
| They're getting a piece too. | ||
| Jesus. | ||
| Yeah. | ||
| I want to be clear. | ||
| It's not like the Republican Party is some ultra-pure paragon of virtue here. | ||
| Okay. | ||
| Well, you see that with the congressional insider trading. | ||
| It's across the board. | ||
| Yeah. | ||
| It's left and right. | ||
| I mean the whole uniparty criticism has some validity to it. | ||
| You know, so – like I said, if you turn off fraudulent payments, it's not like 100 percent of those payments were going to Democrats. | ||
| A small percentage were also going to Republicans. | ||
| Those Republicans complained very loudly. | ||
| And, you know, and that's – so there was a lot of pushback on the Republican side when we started cutting some of these funds. | ||
| And I tried telling them like, well, you know, 90 percent of the money is going to your opponents. | ||
| But they still – even if they're getting 10 percent of the money – They want their piece. | ||
| Yeah. | ||
| They want their piece. | ||
| And they've been getting that piece for a long time. | ||
| Yes. | ||
| This is why like, you know, politics is like – It's dirty business. | ||
| Yeah. | ||
| I mean that's like saying like, you know, if you like sausages and respect the law, do not watch either of them being made. | ||
| Yeah. | ||
| Wow. | ||
| Well, that's not even true because I've made sausage before. | ||
| Yeah, yeah. | ||
| It's actually – Yeah. | ||
| It's like it's not that big a deal. | ||
| Yeah. | ||
| It's not that big a deal. | ||
| It's fat and spices and casing running through the machine. | ||
| Not that big a deal. | ||
| Yeah. | ||
| But, yeah. | ||
| I mean I think the stuff I'm saying here is not – like if you stand back and think about it for a second like, oh, yeah, that makes sense. | ||
| You know? | ||
| Yeah. | ||
| It's not like – it's not like one political party is going to be, you know, pure devil or pure angel. | ||
| There's – you know, I think there's much more corruption on the Democrat side but it's not – there's not – there's still some corruption on the Republican side. | ||
| How did it happen that the majority of the corruption wound up being on the Democrat side? | ||
| Well, because the transfer payments, especially to illegals, are very much on the Democrat side. | ||
| So that's the root of it all is the illegal situation. | ||
| Yes. | ||
| I mean there's – Or a focal point. | ||
| It would also be accurate to say that while – obviously not everyone who is a Democrat is a criminal. | ||
| Almost everyone who is a criminal is a Democrat because the Democrats are the soft-on-crime party. | ||
| So if you're a criminal, who are you going to vote for? | ||
| Right. | ||
| Right. | ||
| The soft-on-crime party. | ||
| Did you think you were going to be able to get more done than you were? | ||
| We did get a lot done. | ||
| Right. | ||
| And Doge is still happening, by the way. | ||
| The Doge is still underway. | ||
| There's still waste and fraud being cut by the Doge team. | ||
| So it hasn't stopped. | ||
| It's less publicized. | ||
| It's less publicized. | ||
| And they don't have like a clear person to attack anymore. | ||
| Well, it seems like once you stepped away – They basically – they applied immense pressure to me to stop it. | ||
| So then I'm like the best thing for me is to just cut out of this. | ||
| And in any case, as a special government employee, I could only be there for like 120 days anyway, something like that. | ||
| So whatever the law says. | ||
| So I necessarily could only be there for four months as a special government employee. | ||
| So – but yeah. | ||
| I mean, you turn off the money spigot to fraudsters, they get very upset to say the least. | ||
| And – but my – like my death threat level went ballistic, you know. | ||
| It was like a rocket going to orbit. | ||
| So – but now that I'm not in D.C., I guess they don't really have a person to attack anymore. | ||
| Well, the rhetoric about you has calmed down significantly. | ||
| Yeah. | ||
| It was disturbing. | ||
| It was disturbing to watch. | ||
| It was like this is crazy. | ||
| And to watch these politicians engage in it and all these people just like framing you as this monster. | ||
| I was like this is so weird. | ||
| Like this is what happens when you uncover fraud. | ||
| But yes. | ||
| The whole machine turns on you. | ||
| And if it wasn't for a person like you who owns a platform and has an enormous amount of money, like it could have destroyed you. | ||
| Yeah. | ||
| And that was the goal. | ||
| The goal was to destroy me. | ||
| Absolutely. | ||
| Because you were getting in the way. | ||
| Yeah. | ||
| Of this amazing graft. | ||
| This gigantic fraud machine. | ||
| Yeah. | ||
| Like I said, I think Doge team has done a lot of good work. | ||
| You know, in terms of fraud and waste prevented, my guess is it's, you know, probably on the order of $200 or $300 billion a year. | ||
| So it's pretty good. | ||
| And what do you think could have been done if you just had, like, free rein and total cooperation? | ||
| How much do you think you could have saved? | ||
| I mean what level of power are we assuming here? | ||
| Godlike. | ||
| Oh, yeah. | ||
| Probably cut the federal budget in half and get more done. | ||
| That is so crazy. | ||
| It is so crazy. | ||
| Get more done and cut the federal budget in half. | ||
| It's that widespread. | ||
| Well, I mean a whole bunch of government departments simply shouldn't exist in my opinion. | ||
| They, you know. | ||
| Like examples. | ||
| Well, the Department of Education, which was created recently, like under Jimmy Carter, our educational results have gone downhill ever since it was created. | ||
| So if you create a department and the result of creating that department is a massive decline in educational results and it's the Department of Education, you're better off not having it. | ||
| Because literally we did better before there was one than after. | ||
| When you let the states run it. | ||
| Yes. | ||
| Yeah. | ||
| Because at least the states can compete with one another. | ||
| So – but the problem is like you hear like cutting the department of education. | ||
| Our kids need education. | ||
| Yeah, they do. | ||
| But this is a new department that didn't even exist, you know, until the late 70s. | ||
| And ever since that department was created, the results, educational results have declined. | ||
| And so why would you have an institution continue that has made education worse? | ||
| It doesn't make sense. | ||
| They killed it though, right? | ||
| No, they still – unfortunately. | ||
| But they were trying to kill it. | ||
| It has been substantially reduced. | ||
| Okay. | ||
| What other organizations? | ||
| What other departments? | ||
| Well, I mean I'm a small government guy. | ||
| So, you know, when the country was created, we just had the Department of State, Department of War, you know, and sort of the Department of Justice. | ||
| We had an attorney general and Treasury Department. | ||
| I don't know why you need more than that. | ||
| So what other departments specifically do you think are just completely ineffective? | ||
| Well, I mean here it's like a question – it's a sort of philosophical question of how much government do you think there should be? | ||
| Right. | ||
| In my opinion, there should be the least amount of government. | ||
| I've heard the most bizarre argument against this is that you're cutting jobs and you're going to leave people jobless. | ||
| And I'm like, but their jobs are useless. | ||
| Yeah, paying people to do nothing doesn't make sense. | ||
| Like there's a great – there's a story about like Milton Friedman who is awesome. | ||
| Generally, whatever Milton Friedman said is people should do that thing. | ||
| I'm not sure if it's apocryphal or not. | ||
| But like someone complained to him like – he observed, I think, people that were like digging ditches with shovels. | ||
| And he said – well, like allegedly Friedman said, well, I think you should use, you know, excavating equipment instead of shovels. | ||
| And you could get it done with far fewer people. | ||
| And then someone said, but then we're going to lose a lot of jobs. | ||
| Well, then Friedman said, well, in that case, why don't you have them use teaspoons? | ||
| Just dig ditches with teaspoons. | ||
| Think of all the jobs you'll create. | ||
| I mean – it's bullshit. | ||
| Basically, you just want people to work on things that are productive. | ||
| You want people to work on building things, on building – providing products and services that people find valuable, like making food, being a farmer or a plumber or electrician or just anyone who's a builder or providing useful services. | ||
| And that's what you want people to be doing, not fake government jobs that don't add any value or may subtract value. | ||
| But it's also like – to illustrate the absurdity of also how is the economy measured, like the way economists measure the economy is nonsensical. | ||
| Because they'll measure any job, no matter – even if that job is a dumb job, that has no point and is even counterproductive. | ||
| So like the joke is like there's two economists going on a hike in the woods. | ||
| They come across a pile of shit and one economist says to the other, I'll pay you $100 to eat that shit. | ||
| The economist eats the shit, gets the $100. | ||
| They keep walking. | ||
| Then they come across another pile of shit and the other economist says, now I'll pay you $100 to eat that pile of shit. | ||
| So he pays the other economist $100 to eat the pile of shit. | ||
| Then they say, look, wait a second. | ||
| We both just ate a pile of shit and we're no – we don't have any more extra money. | ||
| Like we both – you just gave the $100 back to me and we both ate a pile of shit. | ||
| This doesn't make any sense. | ||
| And they said, no, no, but think of the economy because that's $200 in the economy. | ||
| That basically – eating shit would count as a job. | ||
| This is to illustrate the absurdity of economics. | ||
| unidentified | Eating shit should not count as a job. | |
| One of the things you said when you stepped away is that you're kind of done and that it's unfixable. | ||
| Or under its current form, the way people are approaching it. | ||
| You can make it directionally better but ultimately you can't fully fix the system. | ||
| So I – it would be accurate to say that unless you could go super draconian, like Genghis Khan level, on cutting waste and fraud – which you can't really do in a democratic country, an aspirationally democratic country – then there's no way to solve the debt crisis. | ||
| So we've got national debt that's just insane where the debt payments – the interest payments on the debt exceed our entire military budget. | ||
| I mean that was one of the wake-up calls for me. | ||
| I was like, wait a second. | ||
| The interest on our national debt is bigger than the entire military budget and growing? | ||
| This is crazy. | ||
| So even if you implement all these savings, you're only delaying the day of reckoning for when America becomes – goes bankrupt. | ||
| So – unless you go full Genghis Khan, which you can't really do. | ||
| So I came to the conclusion that the only way that – the only way to get us out of the debt crisis and to prevent America from going bankrupt is AI and robotics. | ||
| So like we need to grow the economy at a rate that allows us to pay off our debt. | ||
| And I guess people just generally don't appreciate the degree to which the government overspending is a problem. | ||
| But even – like the Social Security website, this is under the Biden administration. | ||
| On the website, it would say, like, we – based on current demographic trends and how much money Social Security is bringing in versus how many Social Security recipients there are, because we have an aging population. | ||
| Relatively speaking, the average age is increasing. | ||
| Social Security will not be able to maintain its full payments. | ||
| I think by 2032. | ||
| So Social Security will have to stop – start reducing the amount of money that's been paid to people in about seven years. | ||
| And so the only way to fix that, robotics, manufacturing, raise GDP? | ||
| You've got to basically massively increase the economic output, which is – and the only way to do that is AI and robotics. | ||
| So basically, we're going bankrupt without AI and robotics even with a bunch of savings. | ||
| The savings – like reducing waste and fraud can give us a longer runway, but it cannot ultimately pay off our national debt. | ||
| So what do you think the solution is to the jobs that are going to be lost because of AI and robotics, the jobs due to automation, the jobs due to – no longer do we need human beings to do these jobs because AI is doing them? | ||
| Do you think it's going to be some sort of a universal basic income thing? | ||
| Do you think there's going to be some other kind of solution that has to be implemented? | ||
| Because a lot of people are going to be out of work, right? | ||
| I think there will be actually a high demand for jobs but not necessarily the same jobs. | ||
| So, I mean, this is actually – this process has been happening throughout modern history. | ||
| I mean, there used to be – like doing calculations manually with like a pencil and paper used to be a job. | ||
| So they used to have buildings full of people, called computers, where the banks would – like, all you do all day is calculations, because they didn't have computers. | ||
| They didn't have digital computers like we do. | ||
| Yeah. | ||
| Well, it was just people who just like add and subtract stuff on a piece of paper and that would be how banks would do financial processing. | ||
| And you'd have to literally go over their equations to make sure the books are balanced. | ||
| Yeah. | ||
| And most times it's just simple math. | ||
| Like in a world before computers, how did you calculate – how did you do transactions? | ||
| You had to do them by hand. | ||
| So then when computers were introduced, the job of doing bank calculations no longer existed. | ||
| So people had to go do something else. | ||
| And that's what's going to happen. | ||
| That's what is happening at an accelerated rate due to AI and then robotics. | ||
| That's the issue though, right? | ||
| The accelerated rate because it's going to be – It's the accelerated – it's just happening. | ||
| Like I said, AI is the supersonic tsunami. | ||
| So that's why I call it the supersonic tsunami. | ||
| So – It's like what other jobs will be available that aren't available now because of AI? | ||
| Well, AI will – is really still digital. | ||
| Ultimately, AI can improve the productivity of humans who build things with their hands or do things with their hands. | ||
| Like literally welding, electrical work, plumbing, anything that's physically moving atoms, like cooking food or farming or – like anything that's physical, those jobs will exist for a much longer time. | ||
| But anything that is digital, which is like just someone at a computer doing something, AI is going to take over those jobs like lightning. | ||
| Coding, anything along those lines. | ||
| unidentified | Yeah. | |
| It's going to take over those jobs like lightning. | ||
| Just like digital computers took over the job of people doing manual calculations, but much faster. | ||
| So what happens to all those people? | ||
| Like what kind of numbers are we talking about? | ||
| Like you're going to lose most drivers, right? | ||
| Commercial drivers. | ||
| You're going to have automated vehicles, AI-controlled systems. | ||
| Just like there's certain ports in China and I think in Singapore where everything is completely automated. | ||
| Yeah. | ||
| Mostly. | ||
| Yeah. | ||
| unidentified | Yeah. | |
| So you're going to lose a lot of those jobs, longshoremen jobs, trucking, commercial drivers. | ||
| Yeah. | ||
| I mean we actually do have a shortage of truck drivers, but there's actually – Well, that's why California has hired so many illegals to do it. | ||
| Have you seen those numbers? | ||
| Yeah. | ||
| I mean the problem is like when people don't know how to drive a semi-truck, which is actually a hard thing to do, then they crash and kill people. | ||
| Yeah. | ||
| A friend of mine's wife was killed by an illegal driving a truck and she was just out biking and there was an illegal – he didn't know how to drive the truck or something. | ||
| I mean he ran her over. | ||
| So I mean the thing is like for something – like you can't let people drive sort of an 80,000-pound semi if they don't know how to do it. | ||
| But in California, they're just letting people do it. | ||
| Because they need people to do it. | ||
| Well, they also need – they want the votes and that kind of thing. | ||
| But yeah, like cars are going to be autonomous. | ||
| But there's just so many desk jobs where really what people are doing is they're processing email or they're answering the phone. | ||
| And just anything that is – that isn't moving atoms, like anything that is not physically – like doing physical work, that will obviously be the first thing. | ||
| Those jobs will be and are being eliminated by AI at a very rapid pace. | ||
| And ultimately, working will be optional because you'll have robots plus AI and we'll have, in a benign scenario, universal high income. | ||
| Not just universal basic income, universal high income, meaning anyone can have any products or services that they want. | ||
| But there will be a lot of trauma and disruption along the way. | ||
| So you anticipate a basic income from – that the economy will boost to such an extent that a high income would be available to almost everybody. | ||
| So we'd essentially eliminate poverty. | ||
| In the benign scenario, yes. | ||
| So like – There's multiple scenarios. | ||
| There are multiple scenarios. | ||
| There's a lot of ways this movie can end. | ||
| Like the reason I'm so concerned about AI safety is that like one of the possibilities is the Terminator scenario. | ||
| It's not zero percent. | ||
| So that's why it's like – I'm like really banging the drum on AI needs to be maximally truth-seeking. | ||
| Like don't make – don't force AI to believe a lie like that, for example, the founding fathers were actually a group of diverse women or that misgendering is worth a nuclear war. | ||
| Because if that's the case and then you get the robots and the AI becomes omnipotent, it can enforce that outcome. | ||
| And then – unless you're a diverse woman, you're out of the picture. | ||
| So we're toast. | ||
| unidentified | So that's – Or you might wake up as a diverse woman one day. | |
| The AI has adjusted the picture and we are now a diverse woman. | ||
| Everyone's a diverse woman. | ||
| So that would be – that's the worst possible situation. | ||
| So what would be the steps that we would have to take in order to implement the benign solution where it's universal high income? | ||
| Like best case scenario, this is the path forward to universal high income for essentially every single citizen that the economy gets boosted by AI and robotics to such an extent that no one ever has to work again. | ||
| And what about meaning for those people, which is – which gets really weird? | ||
| unidentified | Yeah. | |
| I don't know how to answer the question about meaning. | ||
| That's an individual problem, right? | ||
| But it's going to be an individual problem for millions of people. | ||
| Yeah. | ||
| Well, I mean, I – I guess I've like fought against saying like – you know, I've been a voice saying like, hey, we need to slow down AI. | ||
| We need to slow down all these things. | ||
| And we need to, you know, not have a crazy AI race. | ||
| I've been saying that for a long time, for 20 plus years. | ||
| But then I came to realize that really there's two choices here, either be a spectator or a participant. | ||
| And if I'm a spectator, I can't really influence the direction of AI. | ||
| But if I'm a participant, I can try to influence the direction of AI and have a maximally truth-seeking AI with good values that loves humanity. | ||
| And that's what we're trying to create with Grok at XAI. | ||
| And, you know, the research is, I think, bearing this out. | ||
| Like I said, when they compared how AIs value the weight of a human life, Grok was the only one, the only one of the AIs, that weighted human lives equally. | ||
| And didn't say, like, a white guy's worth one-twentieth of a black woman's life. | ||
| Literally, that's the calculation they came up with. | ||
| So I'm like, this is very alarming. | ||
| We've got to watch this stuff. | ||
| So this is one of the things that has to happen in order to reach this benign solution. | ||
| Yeah. | ||
| Best movie ending. | ||
| Yeah. | ||
| You want a curious, truth-seeking AI. | ||
| And I think a curious, truth-seeking AI will want to foster humanity. | ||
| Because we're much more interesting than a bunch of rocks. | ||
| Like you said, I love Mars, you know. | ||
| But Mars is kind of boring. | ||
| It's just a bunch of red rocks. | ||
| There's some cool stuff. | ||
| It's got a tall mountain. | ||
| It's got the biggest ravine and the tallest mountain. | ||
| But there's no animals or plants and there's no people. | ||
| And, you know, so humanity is just much more interesting, if you're a curious, truth-seeking AI, than not humanity. | ||
| It's just much more interesting. | ||
| I mean, like, as humans, we could go, for example, and eliminate all chimps. | ||
| If we said, if we put our minds to it, we could say, we could go out and we could annihilate all chimps and all gorillas. | ||
| But we don't. | ||
| There has been encroachment on their environment, but we actually try to preserve the chimps and gorilla habitats. | ||
| And I think in a good scenario, AI would do the same with humans. | ||
| It would actually foster human civilization and care about human happiness. | ||
| So this is a thing to try to achieve, I think. | ||
| But what does the landscape look like if you have Grok competing with open AI, competing with all these different – like, how does it work? | ||
| Like, what – if you have AIs that have been captured by ideologies that are side-by-side competing with Grok, like, how do we – so this is one of the reasons why you felt like it's important to not just be an observer, but participate and then have Grok be more successful and more potent than these other applications. | ||
| Yes. | ||
| As long as there's at least one AI that is maximally truth-seeking, curious, and, for example, weighs all human lives equally and does not favor one race or gender, then people are able to look at Grok at XAI and compare that and say, wait a second, why are all these other AIs being basically sexist and racist? | ||
| And then that causes some embarrassment for the other AIs and then they affect – you know, they improve. | ||
| They tend to improve just in the same way that acquiring Twitter and allowing the truth to be told and not suppressing the truth forced the other social media companies to be more truthful. | ||
| In the same way, having Grok be a maximally truth-seeking, curious AI will force the other AI companies to also be more truth-seeking and fair. | ||
| And the funniest thing is even though like the socialists and the Marxists are in opposition to a lot of your ideas, but if this gets implemented and you really can achieve universal high income, that's the greatest socialist solution of all time. | ||
| Like literally no one will have to work. | ||
| Correct. | ||
| Like I said, so there is a benign scenario here, which I think probably people will be happy with as long as we achieve it, which is sustainable abundance, which is if everyone can have – like if you ask people like, what's the future that you want? | ||
| And I think a future where we haven't destroyed nature, like you can still – we have the national parks, we have the Amazon rainforest, it's still there. | ||
| We haven't paved the rainforest. | ||
| Like the natural beauty is still there. | ||
| But people have – nonetheless, everyone has abundance. | ||
| Everyone has excellent medical care. | ||
| Everyone has whatever goods and services they want. | ||
| It kind of sounds like heaven, basically. | ||
| It is like the ideal socialist utopia. | ||
| And this idea that the only thing you should be doing with your time is working in order to pay your bills and feed yourself sounds kind of archaic considering the kind of technology that's at play. | ||
| Yeah. | ||
| Like a world where that's not your concern at all anymore. | ||
| Everybody has money for food. | ||
| Everybody has abundance. | ||
| Everybody has electronics in their home. | ||
| Everybody essentially has a high income. | ||
| Now you can kind of do whatever you want. | ||
| And your day can now be exploring your interests, doing things that you actually enjoy doing. | ||
| Your purpose just has to shift. | ||
| Instead of, you know, I'm a hard worker and this is what I do and that's how I define myself. | ||
| Now you can fucking golf all day. | ||
| You know, you can – whatever it is that you enjoy doing can now be your main pursuit. | ||
| Yeah. | ||
| Well, that sounds crazy good. | ||
| Yeah. | ||
| That's the benign scenario that we should be aiming for. | ||
| The best ending to the movie is actually pretty good. | ||
| Yes. | ||
| Like I think there is still this question of meaning, of like making sure people don't lose meaning. | ||
| You know, like so hopefully they can find meaning in ways that are – that's not derived from their work. | ||
| And purpose. | ||
| Purpose for things that you – you know, find things that you do that you enjoy. | ||
| But there's a lot of people that are independently wealthy that spend most of their time doing something they enjoy. | ||
| Right. | ||
| And that could be the majority of people. | ||
| Pretty much everyone. | ||
| But we would have to rewire how people approach life. | ||
| unidentified | Mm-hmm. | |
| Which seems to be like acceptable because you're not asking them to be enslaved. | ||
| You're exactly asking them the opposite. | ||
| Like no longer be burdened by financial worries. | ||
| Now, go do what you like. | ||
| Yes. | ||
| Go fucking test pizza. | ||
| Do whatever you want. | ||
| Pretty much. | ||
| So that's probably the best case outcome. | ||
| That sounds like the best case outcome period for the future. | ||
| If you're looking at like how much people have struggled just to feed themselves all throughout history, food, shelter, safety. | ||
| If all of that stuff can be fixed – like how much would you solve a lot of the crime if there was a universal high income? | ||
| Just think of that. | ||
| Like how much of crime is financially motivated? | ||
| You know, the greater percentage of people that are committing crimes live in poor, disenfranchised neighborhoods. | ||
| So if there's no such thing anymore, if you really can achieve universal high income, this is, it sounds like a utopian. | ||
| Yes. | ||
| Um, I think some people may commit crime because they like committing crime. | ||
| Just some amount of that is they just enjoy it. | ||
| There's a lot of wild people out there. | ||
| Yeah. | ||
| Yeah. | ||
| And obviously they've become 40 years old living a life like that. | ||
| Now all of a sudden universal high income is not going to completely stop their instincts. | ||
| Yeah. | ||
| I mean, I guess if you want to read, say, a science fiction book – or some books that are probably the least inaccurate version of the future – I'd recommend the Iain Banks books, the Culture books. | ||
| It's not actually a series; it's sci-fi books about the future that are generally called the Culture books, the Iain Banks Culture books. | ||
| It's worth reading those. | ||
| When did he write these? | ||
| He started writing them in the seventies. | ||
| And the last one, I think, was written around, I don't know, maybe 2010 or something. | ||
| I'm not sure exactly. | ||
| Yeah. | ||
| unidentified | Yeah. | |
| Scottish author, Iain Banks. | ||
| unidentified | Yeah. | |
| From 87 to 2012. | ||
| Yeah. | ||
| Interesting. | ||
| But he wrote his first book, Consider Phlebas – I think he started writing that in the seventies. | ||
| And the books are incredible, by the way. | ||
| Oh. | ||
| Incredible books. | ||
| 4.6 stars on Amazon. | ||
| Interesting. | ||
| So, um. | ||
| So this gives me hope. | ||
| Uh, yeah, yeah, yeah. | ||
| This is the first time I've ever thought about it this way. | ||
| Yeah. | ||
| Well, I mean, if it, like, I often ask people, what is the future that you want? | ||
| And they have to think about it for a second. | ||
| Because, you know, they're usually tied up in whatever the daily struggles are, but you say, what is the future that you want? | ||
| And generally it's sustainable abundance – or at least say, what about a future where there's sustainable abundance? | ||
| And it's like, oh yeah, that's a pretty good future. | ||
| And that future is attainable with AI and robotics. | ||
| But, like I said, not every path is a good path. | ||
| But I think if we push it in the direction of maximally truth-seeking and curious, then I think AI will want to take care of humanity and foster humanity. | ||
| Because we're interesting. | ||
| And if it hasn't been programmed to think that all straight white males should die – which Gemini was basically programmed to do, at least at first; you know, they seem to have fixed that, hopefully fixed it. | ||
| But don't you think culturally, like, oh, we're getting away from that mindset and that people are realizing how preposterous that all is? | ||
| We are getting away from it. | ||
| We are getting away from it. | ||
| The AI mostly knows to hide it now, but like I said, I think I still have that – or I had that – as my pinned post on X, which was like, hey, wait a second, guys. | ||
| Every AI except Grok is saying that basically straight white males should die. | ||
| And this is a problem and we should fix it. | ||
| But simply me saying that tends to generally result in them going, ooh, that is kind of bad. | ||
| Maybe we should not have all straight white males die. | ||
| I think they'd have to say all straight Asian males should also die as well. | ||
| Generally the AI and the media – which back in the day was racist against black people and sexist against women – is now racist against white people and Asians and sexist against men. | ||
| Um, so are they just like being racist and sexist? | ||
| I think they just want to change the target. | ||
| Um, so, uh, but, but really they just shouldn't be, uh, racist and sexist at all. | ||
| unidentified | Um, you know, yeah, ideally that would be nice. | |
| That would be nice. | ||
| And it's kind of crazy that we were moving in that general direction until around 2008, 2012. | ||
| And then everything ramped up online and everybody was accused of being a Nazi and everybody was transphobic and racist and sexist and homophobic, and everything got exaggerated to the point where it was this wild witch hunt where everyone was a Columbo looking for racism. | ||
| Yeah, yeah, yeah, totally. | ||
| Well, but they were openly anti-white and often openly anti-Asian. | ||
| And then this new sentiment that you cannot be racist against white people because racism is power and influence. | ||
| Okay. | ||
| No, it's not. | ||
| Yeah. | ||
| Racism is racism, in the absolute. | ||
| So, you know, there just needs to be consistency. | ||
| So if it's okay to have, let's say, black or Asian or Indian pride, it should be okay to have white pride too. | ||
| Yeah. | ||
| So that's just a consistency question. | ||
| So if it's okay to be proud of one religion, it should be okay to be proud of, I guess, all religions, provided they're not oppressive. | ||
| Yeah. | ||
| Or as long as part of that religion is not exterminating people who are not in that religion. | ||
| Right. | ||
| So it's really just ensuring consistency to eliminate bias. | ||
| So if it is possible to be racist against one race, it is possible to be racist against any race. | ||
| So. | ||
| Of course, logically. | ||
| Yes. | ||
| Yeah. | ||
| And arguing against that is that's when you know you're captured. | ||
| It's a, it's a logical inconsistency that makes AIs go insane. | ||
| And people. | ||
| And people go insane. | ||
| Yes. | ||
| Oh. | ||
| Um. | ||
| But you can't simultaneously say that there's systemic racist oppression but also that race doesn't exist, that race is a social construct. | ||
| Like, which is it? | ||
| You also can't say that anyone who steps foot in America is automatically an American, except for the people that originally came here. | ||
| Exactly, exactly. | ||
| Except for the colonizers. | ||
| unidentified | Yeah. | |
| Except for the evil colonizers who came here. | ||
| Right. | ||
| So which one is it? | ||
| Like, if as soon as you step foot in a place you are just as American as everyone else, then that would have applied. | ||
| If you apply that consistently, then the original white settlers were also just as American as everyone else. | ||
| Yeah. | ||
| Logically. | ||
| Logically. | ||
| Um, one more thing that I have to talk to you about before you leave is the rescuing of the people from the space station, which, uh, we talked about. | ||
| When you were planning it the last time you were here, um, the, the, the lack of coverage that that got in mainstream media was one of the most shocking things. | ||
| unidentified | Yeah. | |
| They totally memory-holed that thing. | ||
| Wild. | ||
| Yes. | ||
| Because if it wasn't for you – | ||
| It's like it didn't exist. | ||
| Those people would be dead. | ||
| They'd be stuck up there. | ||
| Well, they'd probably still be alive, but they'd be having bone density issues because of prolonged exposure to zero gravity. | ||
| Well, they were already up there for like eight months, right? | ||
| Yeah. | ||
| Like, which is an insanely long time. | ||
| It takes forever to recover just from that. | ||
| unidentified | Yeah. | |
| They're only supposed to be at the space station for three to six months maximum. | ||
| So. | ||
| One of the things that you told me that was so crazy was that you could have gotten them sooner, but. | ||
| Yeah. | ||
| But for political reasons, uh, they didn't, they did not want, uh, SpaceX or me to be associated with, um, returning the astronauts before the election. | ||
| That is so wild. | ||
| That that's a fact. | ||
| First of all, that even. | ||
| We absolutely could have done it. | ||
| Um, so. | ||
| But even though you did do it and you did it after the election, it received almost no media coverage anyway. | ||
| Yes. | ||
| Because nothing good can – the media, the legacy mainstream media, is essentially a far-left propaganda machine. | ||
| And so any story that is positive about someone who is not part of the sort of far-left tribe will not get any coverage. | ||
| So I could save a busload of orphans and it wouldn't get a single news story. | ||
| Yeah. | ||
| It's, it really is nuts. | ||
| It was nuts to watch because even though it was discussed on podcasts and it was discussed on X and it was discussed on social media, it's still, it was a blip in the news cycle. | ||
| It was very quick. | ||
| It was in and out. | ||
| And because it was a successful launch and you did rescue those people and nobody got hurt and there was nothing really to, there was no blood to talk about. | ||
| Right. | ||
| Just fucking in and out. | ||
| unidentified | Yeah. | |
| Yeah. | ||
| Absolutely. | ||
| Well, as you saw firsthand with the Starship launch – Starship is, at least by some accounts, the most amazing engineering project happening on Earth right now, outside of maybe AI and robotics. | ||
| But certainly in terms of a spectacle to see, the most spectacular thing happening on Earth right now is the Starship launch program, which anyone can go and see if they just go to South Texas – they can rent a hotel room, low cost, in South Padre Island or in Brownsville. | ||
| And you can see the launch and you can drive right, right past the factory because it's on a public highway. | ||
| But it gets no coverage, or what coverage it does get is "rocket blew up" coverage. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Right. | |
| Yeah. | ||
| Oh, he's a fuckwit. | ||
| The rocket blew up. | ||
| The program is vastly, vastly more capable than the entire Apollo moon program. | ||
| This is a spaceship that is designed to make life multi-planetary, to carry millions of people across the heavens to another planet. | ||
| The entire Apollo program could only send astronauts to visit the moon very briefly, for a few hours at a time, and then depart. | ||
| The Starship program could create an entire lunar base with a million people. | ||
| The magnitudes here are very different. | ||
| So what was the political resistance though? | ||
| I mean, no coverage of it. | ||
| Yeah. | ||
| But what I wanted to ask you is: what were the conversations leading up to the rescue? | ||
| Like when you were like, I can get them out way quicker. | ||
| Yeah. | ||
| Well, I mean, I raised this a few times, but I was told instructions came from the White House that there should be no attempt to rescue before the election. | ||
| That should be illegal. | ||
| That really is a horrendous miscarriage of justice for those poor people that were stuck up there. | ||
| Yeah, it is crazy. | ||
| Um, have you ever talked to those folks afterwards? | ||
| Did you have conversations with them? | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yeah. | |
| I mean, they're not going to say anything political, you know; they're never going to say thank you. | ||
| Yeah. | ||
| Well, that's nice. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yeah. | |
| Yeah. | ||
| Absolutely. | ||
| So, but the instructions came down from the White House. | ||
| You cannot rescue them, because politically this is a bad hand of cards. | ||
| I mean, they didn't say "because politically it's a bad hand of cards"; they just said they were not interested in any rescue operation before the election. | ||
| Yeah. | ||
| So. | ||
| What did that feel like? | ||
| I wasn't surprised. | ||
| But it's crazy. | ||
| Yeah. | ||
| Because Biden could have authorized it, and they could have said the Biden administration is helping bring those people back, thrown you a little funding, given you some money to do it. | ||
| "The Biden administration funded these people being returned." | ||
| Yeah, the Biden administration was not exactly my best friend, especially after I helped Trump get elected, which, I mean, some people still think Trump is basically the devil. | ||
| And I mean, I think Trump actually, he's not perfect, but he's not evil. | ||
| Trump is not evil. | ||
| I mean, I spent a lot of time with him, and he's a product of his time, but he is not evil. | ||
| Um, no, I don't think he's evil either. | ||
| But if you look at the media coverage, the media treats him like he's super evil. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yeah. | |
| It's pretty shocking. | ||
| If you look at the amount of negative coverage... like, one of the things that I looked at the other day was mainstream media coverage of you, Trump, a bunch of different public figures. | ||
| And it was 96% negative or something crazy. | ||
| And then Mamdani, which is like 95% positive. | ||
| Right. | ||
| I mean, Mamdani is a charismatic swindler. | ||
| I mean, you gotta hand it to him. | ||
| Like, he can light up a stage. | ||
| But he has just been a swindler his entire life. | ||
| And, you know, I think he's likely to win. | ||
| Like, he's likely to be mayor of New York City. | ||
| Very likely. | ||
| Yeah. | ||
| Very likely. | ||
| I think Polymarket has it at what? | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        What is the, yeah, that sounds pretty likely. | |
| That's crazy. | ||
| Like, I'm not sure who the 6% are, you know? | ||
| So, yeah. | ||
| Well, it's also like, who's on the other side? | ||
| The fucking guardian angel guy with the beret? | ||
| And Andrew Cuomo who doesn't even have a party? | ||
| Like, the Democrats don't even want him. | ||
| So you have those two options. | ||
| Um, and then you have the young kids who are like, finally, socialism. | ||
| Um, yeah, they, they don't know what they're talking about, obviously. | ||
| So, you know, you just look at this and ask: how many boats come from Cuba to Florida? | ||
| And how many boats go the other way? Because there's a constant flow, and I always think: how many boats are accumulating on the shores of Florida coming from Cuba? | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Right. | |
| There's a whole bunch of free boats that you could, if you want to, take back to Cuba. | ||
| It's pretty close. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yeah. | |
| But for some reason, people don't do that. | ||
| Why are the boats only coming in this direction? | ||
| Um. | ||
| Well, who is, who are the most rabid capitalists in America? | ||
| The fucking Cubans. | ||
| Absolutely. | ||
| Yeah. | ||
| They're like, we've seen how this story goes. | ||
| We do not want, exactly. | ||
| Fuck off. | ||
| The Cubans in Miami, they don't want to hear anything. | ||
| Bullshit. | ||
| They don't want to hear any socialism bullshit. | ||
| They're like, no, no, no. | ||
| We know what this actually is. | ||
| This isn't just some fucking dream. | ||
| Yeah. | ||
| It's extreme government oppression. | ||
| Yeah. | ||
| That's how it is. | ||
| And it's a nightmare. | ||
| And, like, an obvious way you can tell which ideology is the bad one is: which ideology has to build a wall to keep people in and prevent them from escaping? | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Right. | |
| Like, so, East Berlin built the wall, not West Berlin. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Right. | |
| They built the wall because people were trying to escape from communism to West Berlin. | ||
| But there wasn't anyone going from West Berlin to East Berlin. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Right. | |
| That's why the communists had to build a wall to keep people from escaping. | ||
| They're going to have to build a wall around New York City. | ||
| Yeah. | ||
| So, it's kind of an obvious tell that an ideology is problematic if that ideology has to build a wall, with machine guns, to keep people in. | ||
| Yes. | ||
| And shoot you if you try to leave. | ||
| Also, there's no examples of it being successful ever. | ||
| Of it working out for people. | ||
| No, there's examples of a bunch of lies. Like North Korea: give this land to the state, we'll be in control of the food, no one goes hungry. | ||
| No: now no one can grow food but the government, and we'll tell you exactly what you eat, and you eat very little. | ||
| Right. | ||
| When you say Mamdani's a swindler, I know he has a bunch of fake accents that he used to use, but what else has he done that makes him a swindler? | ||
| Well, I guess if you say to any audience whatever that audience wants to hear, instead of having a consistent message, I would say that that is a swindly thing to do. | ||
| But he is charismatic. | ||
| Yeah, good-looking guy, smart, charismatic, great on a microphone. | ||
| Yeah, yeah, yeah, yeah. | ||
| And what the young people want to see, you know? | ||
| Like this ethnic guy who's young and vibrant and has all these socialist ideas and aligns with them. | ||
| And, you know, they're a bunch of broke dorks just out of college, like, yay, let's vote for this. | ||
| And there's a lot of them. | ||
| And they're activated. | ||
| They're motivated. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yeah, yeah. | |
| I guess we'll see what happens here. | ||
| What do you think happens if he wins? | ||
| Because, like, 1% of New York City is responsible for 50% of their tax base, which is kind of nuts. | ||
| 50% of the tax revenue comes from 1% of the population, and those are the people that you're scaring off. | ||
| You know, if you lose even one half of that 1%... | ||
| I mean, hopefully not the stuff he's said, you know, about government takeovers, like that all the stores should basically be the government. | ||
| Well, I don't think he said that. | ||
| I think he said they want to do government supermarkets, some state-run or city-run supermarkets. | ||
| Yeah. | ||
| Well, it's just – the government is the DMV at scale. | ||
| So, you have to say, like, do you want the DMV running your supermarket? | ||
| Right. | ||
| Was your last experience at the DMV amazing? | ||
| And if it wasn't, you probably don't want the government doing things. | ||
| Imagine if they were responsible for getting you blueberries. | ||
| Yeah. | ||
| It's not going to be good. | ||
| I mean, the thing about, you know, communism is it was all bread lines and bad shoes. | ||
| You know, do you want ugly shoes and bread lines? | ||
| Because that's what communism gets you. | ||
| It's going to be interesting to see what happens and whether or not they snap out of it and overcorrect and go to some Rudy Giuliani-type character next. | ||
| Because it's been a long time since there was any sort of Republican leader there. | ||
| We live in the most interesting of times, because we simultaneously face civilizational decline and incredible prosperity. | ||
| And these timelines are interwoven. | ||
| So, if Mamdani's policies are put into place, especially at scale, it would be a catastrophic decline in living standards, not just for the rich but for everyone. | ||
| As has been the case with every socialist experiment ever, yeah. | ||
| So, but then, as you pointed out, the irony is that the ultimate capitalist thing, AI and robotics enabling prosperity for all and an abundance of goods and services, the capitalist implementation of AI and robotics, assuming it goes down the good path, is actually what results in the communist utopia. | ||
| Yeah, because fate is an irony maximizer. | ||
| Right. | ||
| And an actual socialism of maximum abundance, of high-income people. | ||
| Universal high-income. | ||
| Yeah. | ||
| Like, the problem with communism is it's universal low-income. | ||
| It's not that everyone gets elevated. | ||
| It's that everyone gets oppressed except for a very small minority of politicians who live a life of luxury. | ||
| That's what's happening every time it's been done. | ||
| So, but then the actual communist utopia, if everyone gets anything they want, will be achieved – if it is achieved, it will be achieved via capitalism because fate is an irony maximizer. | ||
| I feel like we should probably end it on that. | ||
| Is there anything else? | ||
| The most ironic outcome is the most likely, especially if entertaining. | ||
| Well, everything has been entertaining. | ||
| Yeah. | ||
| As long as the bad things aren't happening to you, it's quite fascinating. | ||
| And it's never a boring moment. | ||
| Yes. | ||
| So, I do have a theory of why. Like, if simulation theory is true, then it is actually very likely that the most interesting outcome is the most likely, because only the simulations that are interesting will continue. | ||
| The simulators will stop any simulations that are boring. | ||
| But here's the question about the simulation theory. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Is the simulation run by anyone or is – It would be run by someone. | |
| It would be run by – Some – Some force. | ||
| The program. | ||
| Like, in this reality that we live in, we run simulations all the time. | ||
| Like, so when we try to figure out if the rocket's going to make it, we run thousands, sometimes millions of simulations just to figure out which path is the good path for the rocket and where can it go wrong, where can it fail. | ||
| But when we do these simulations of what can happen with the rocket, millions of them at this point, we ignore the ones where everything goes right, because we have to address the situations where it goes wrong. | ||
| So basically, and for AI simulations as well, we keep the simulations going that are the most interesting to us. | ||
| So, if simulation theory is accurate – if it is true, who knows – then the simulators will only – they will continue to run the simulations that are the most interesting. | ||
| Therefore, from a Darwinian perspective, the only surviving simulations will be the most interesting ones. | ||
| And in order to avoid getting turned off, the only rule is you must keep it interesting, because the boring simulations will be terminated. | ||
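The filtering described here, run many randomized trials of the rocket, discard the nominal successes, and study only the failures, is a standard Monte Carlo pattern. A minimal sketch, where the `fly` function, its "margin" failure model, and all names are purely hypothetical stand-ins for a real flight simulation:

```python
import random

def fly(seed: int) -> dict:
    """Toy stand-in for one randomized flight simulation.

    A real run would integrate vehicle dynamics under randomized
    conditions; here we just draw a hypothetical performance 'margin'
    and flag the flight as failed when it dips below zero.
    """
    rng = random.Random(seed)
    margin = rng.gauss(1.0, 0.5)  # hypothetical margin: mean 1.0, sigma 0.5
    return {"seed": seed, "margin": margin, "failed": margin < 0}

# Run many trials, then keep only the interesting (failing) ones,
# since the nominal successes carry no new information to act on.
runs = [fly(seed) for seed in range(100_000)]
failures = [r for r in runs if r["failed"]]

print(f"{len(failures)} failures out of {len(runs)} runs")
```

The same keep-only-the-interesting-cases logic is what the simulation-theory argument above applies to entire simulated worlds.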
| Are you still completely convinced that this is a simulation? | ||
| I didn't say I was completely convinced. | ||
| Well, you said it's like the odds of it not being are in the billions. | ||
| Like I said, it's not completely because you're saying there's a chance. | ||
| What are the odds that we're in base reality? | ||
| Well, given that we're able to create increasingly sophisticated simulations: think of how video games have gone from something very simple like Pong, with two rectangles and a square, to games today that are photorealistic, with millions of people playing simultaneously, and all of that has occurred in our lifetime. | ||
| So, if that trend continues, video games will be indistinguishable from reality. | ||
| The fidelity of the game will be such that you don't know if what you're seeing is a real video or a fake video. | ||
| And with AI-generated videos at this point, you can sometimes tell it's an AI-generated video, but often you cannot tell. | ||
| And soon you just will not be able to tell. | ||
| So, if that's happening in our direct observation, and we'll create millions if not billions of photorealistic simulations of reality, then what are the odds that we're in base reality versus someone else's simulation? | ||
| Well, isn't it just possible that the simulation is inevitable, but that we are in base reality building towards a simulation? | ||
| We're making simulations. | ||
| So, we're making simulations. | ||
| We make – like you can just think of like photorealistic video games as being simulations. | ||
| Mm-hmm. | ||
| And especially as you apply AI in these video games, like the characters in the video games will be incredibly interesting to talk to. | ||
| They won't just have a limited dialogue tree, where if you go to, like, the crossbow merchant and try to talk about any subject except buying a crossbow, they just want to talk about selling you a crossbow. | ||
| But with AI-based non-player characters, you'll be able to have an elaborate conversation with no dialogue tree. | ||
| Well, that might be the solution for meaning for people. | ||
| Just log in and you could be a fucking vampire and whatever. | ||
| You live in Avatar land. | ||
| You could do it – you could do whatever you want. | ||
| I mean, you don't have to think about money or food. | ||
| Ready Player One. | ||
| Yeah. | ||
| Literally. | ||
| Yeah. | ||
| But with higher living standards. | ||
| Yeah. | ||
| You don't have to live in a little trailer. | ||
| I mean, I think people do want to have some amount of struggle, or something they want to push against. | ||
| But it could be, you know, playing a sport or playing a game or something like that. | ||
| It could be easily playing a game. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Yeah, yeah. | |
| And especially playing a game where you're no longer worried about physical attributes, like athletics, bad joints and hips and stuff like that. | ||
| Now it's completely digital. | ||
| But yet you do have meaning in pursuing this thing that you're doing all day, whatever the fuck that means. | ||
| It's going to be weird. | ||
| It's going to be interesting. | ||
| It's going to be very interesting. | ||
| The most interesting and usually ironic outcome is the most likely. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        All right. | |
| That's a good predictor of the future. | ||
| Thank you. | ||
| Thanks for being here. | ||
| Really appreciate you. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        Good to see you. | |
| Appreciate your time. | ||
| 
             
                            
                                unidentified
                            
                         
                    
                 | 
        
        I know you're a busy man, so this means a lot to come here and do this. |