Speaker | Time | Text |
---|---|---|
unidentified | | Joe Rogan podcast, check it out! |
The Joe Rogan experience. | ||
Train by day, Joe Rogan podcast by night, all day! | ||
What's up, man? | ||
Good. | ||
So having this big Counter-Strike tournament in town, does that give you the Joneses? | ||
Totally, totally. | ||
You know, it's like, so your guy, Jason, was telling me about it because, you know, in addition to driving, he also flies the helicopter. | ||
And he told me, like, the Red Bull guys were flying off, and there's like this big tournament. | ||
I looked it up. | ||
It was like, oh, Counter-Strike. | ||
So I used to be a bit of a pro player myself. | ||
So how do you get out of pro playing? | ||
Because the problem with playing games is that it's essentially like an eight-hour a day thing. | ||
Like it becomes a giant chunk of your life, right? | ||
And I would imagine if you're playing pro, it's even more of a commitment. | ||
You know, I take a different view on games. | ||
You know, a lot of people kind of view it as a sort of somehow like a negative thing, especially for kids. | ||
Actually, I got my kid, my four-year-old, like a Nintendo Switch early on. | ||
We're playing together because I feel like for me, it helped me a lot with strategy thinking, with reaction time. | ||
I think gamers tend to think really fast. | ||
Have you seen the studies that they've done about surgeons? | ||
No, tell me. | ||
Surgeons that play video games regularly are much less likely to make mistakes. | ||
I totally believe it. | ||
Something in the neighborhood of 25%. | ||
Is that what it is, Jamie? | ||
Something like that? | ||
But so much so that I would say you should teach video games to surgeons. | ||
It should actually be a required thing, like cross-training. | ||
Right. | ||
Isn't the Army also recruiting from gamers today as well? | ||
That's what I heard. | ||
I would imagine like drone pilots. | ||
Right. | ||
Right. | ||
I mean, that would make a big difference. | ||
Especially if you can get them used to the same controllers. | ||
Totally. | ||
You know, because those controllers kind of become a part of your hand. | ||
Like, you know exactly where all the buttons are. | ||
unidentified | | Right. |
If you're a kid that's playing fucking Counter-Strike or whatever it is, Call of Duty every day, I would imagine that that just becomes second nature. | ||
Yeah, yeah. | ||
What is the thing with surgeons? | ||
It's nuts, right? | ||
It might be higher than 25%. | ||
It was a very particular kind of surgery, though, too, but it was like, I mean, they're almost using controllers, so fine. | ||
Yeah, but that they were making less mistakes. | ||
I don't think it's entirely negative. | ||
Because I love games. | ||
I love playing them, but I love them so much that I don't play them because I know I don't have any time. | ||
Quake is your favorite game, right? | ||
Yeah. | ||
So here you go: 37% decrease in errors. | ||
That's wild. | ||
27% faster task completion time. | ||
That's nuts. | ||
So those guys grew up playing video games, or did they say more than three hours per week? | ||
I think they were still playing when they were doing the study. | ||
Yeah. | ||
So like that, I mean, imagine something that, like a pill you could take that would give you a 37% decrease in errors and a 27% faster task completion. | ||
That would be an incredible pill. | ||
Like you would make every surgeon take it. | ||
Did you take your video game pill before you do surgery? | ||
Hey, man, don't operate on my fucking brain unless you take your video game pill. | ||
You know, that's, you know, next time I need to have a surgery or whatever, I'm just going to ask the doctor. | ||
Is there a game? | ||
How much do you game, bro? | ||
But Jamie and I were talking about the one thing, and maybe that's kind of showing our age a little bit, but the one thing that's kind of like a little weird slash, I don't know somehow, like a little dystopian is the whole streaming situation where like kids are not like playing the game, they're like watching someone play the game. | ||
Yeah, that's not good. | ||
And it's like this zombifying thing where they'll spend hours just watching people. | ||
Yeah, this TikToking, it's essentially like TikTok, but video games, right? | ||
Because TikTok is kind of this mindless thing. | ||
You're just scrolling through mindless things and now you're mindlessly watching someone else play a game. | ||
Yeah. | ||
Yeah, it's almost like someone is like there's this strange thing with technology where like someone is living life and doing things and you're like sort of it's almost voyeurism or something like that about it. | ||
You know, David Foster Wallace, you know, the guy from Infinite Jest, wrote an essay on TV. | ||
And, you know, he committed suicide before like, you know, the emergence of mobile phones and things like that. | ||
But he was very prescient on the impact of technology on society and especially on America. | ||
And he was also addicted to TV. | ||
And he talked about how it activates some kind of something in us that is something in human nature about voyeurism. | ||
And that's the thing that television and TikTok and things like that activate. | ||
And it's like this negative, addictive kind of behavior that's really bad for society. | ||
I definitely think there's an aspect of voyeurism, but there's just a dull drone of attention draw. | ||
There's a dullness to it that just like sucks you in like slack jawed. | ||
It is watching nonsense over and over and over again that does just enough to captivate your attention, but doesn't excite you, doesn't stimulate you, doesn't necessarily inspire you to do anything. That is the first fly we've ever had in this room. | ||
Boom. | ||
Oh, I'm going to kill it. | ||
You're a nice person. | ||
You won't be able to kill that fly right away. | ||
But it's just this thing where it doesn't do a lot. | ||
It's not like, you know, like, have you ever done Disney World? | ||
Yeah. | ||
Did you ever do Disney World in Florida where you do that giraffe? | ||
There's the Avatar ride? | ||
No, I just went to a California one. | ||
unidentified | | Okay. |
The Avatar ride is Flights of Freedom? | ||
Flights of Passage. | ||
Flights of Passage. | ||
It's a VR game. | ||
Well, a ride, rather. | ||
And you put on a VR helmet and you get on this motorcycle looking thing. | ||
You're essentially riding a dragon. | ||
It's unbelievably engaging. | ||
It's incredible. | ||
It's the best ride I've ever been on in my life. | ||
That's cool. | ||
Like you're flying around, you feel the breeze, you're on this thing and the sounds are incredible. | ||
That's like engrossing, right? | ||
It takes over you. | ||
Stimulating, but that's not what you're getting from like TikTok or like streaming. | ||
You're getting this dull thing, so it's sustainable. | ||
Yeah, I wonder which is worse, this or like opium habit or something. | ||
I know people that have done opium that are like functional. | ||
Yeah. | ||
You know, they can take pills and, I mean, I'm sure eventually their life falls off the rails, but they're sort of semi-functional when they're on these things. | ||
They can hold down a job and show up every day. | ||
And they're just like semi-functional opiate users. | ||
There's a dude, I watched like a YouTube video, but he's known for having this contrarian opinion on drugs, that you can control it, that you can do these drugs. | ||
What does he look like? | ||
I don't know. | ||
I think he's a black dude. | ||
Oh, Carl Hart. | ||
Dr. Carl Hart. | ||
He was here? | ||
unidentified | | Yeah, yeah. |
He's been here a couple times. | ||
He's great. | ||
What do you think of his ideas? | ||
I think it's entirely biologically variable. | ||
I know people that cannot drink. | ||
They drink and then they're gone. | ||
They get hamsterized, get these black eyes where their soul goes away and then they're just off to the races and picking up hookers and doing cocaine and they find themselves in Guatemala. | ||
They're just nuts. | ||
They can't drink. | ||
I can drink. | ||
I don't pretend that the way my body handles alcohol is the way everybody's body handles alcohol. | ||
I think that's the same with everything. | ||
I think that's the same, most certainly with marijuana. | ||
I know some people that just cannot smoke marijuana and other people, it's fine. | ||
I think it's very, we're all very different physically. | ||
It's interesting. | ||
Alcohol is sort of on the downtrend across all of America, but especially with young people, especially in Silicon Valley. | ||
Everyone there listens to Huberman. | ||
I call him the grand mufti of Silicon Valley because he'll say, no alcohol, no drinking. | ||
Everyone's like, don't drink. | ||
And all the parties are now mocktails and things like that. | ||
There are probably a lot of boring conversations, unfortunately. | ||
It's a little boring. | ||
I mean, it's very repetitive. | ||
It's all kind of like, will AI kill us? | ||
You guys would know better than anybody. | ||
You guys are at the forefront of it, unfortunately. | ||
Yeah, I quit drinking. | ||
I quit drinking over three months ago. | ||
Oh, wow. | ||
I know you guys used to do Sober October. | ||
Yeah. | ||
And that wasn't that hard. | ||
And, you know, I was like, God, it's going to be one whole month. | ||
And then I did. | ||
I was like, that's pretty easy. | ||
But I just had some revelations, I guess. | ||
And I think the big one is just physical fitness. | ||
I work out so much and I would drink and go to my club and have a couple of, not a lot either. | ||
Just have a few drinks and the next day just feel like total shit. | ||
I think with age especially, it starts affecting you. | ||
unidentified | | It's always been like that. |
It's always been. | ||
It's always been like that. | ||
I've always been hungover after a night of drinking, but you don't feel it normally. | ||
Like in normal life, if I just did normal stuff, it'd be fine. | ||
It's when you're in the gym that you notice. | ||
Right. | ||
When you're doing like second and third set of squats or something like that, you're like, oh, God. | ||
Yeah, 100%. | ||
And I haven't had any bad days since I quit drinking. | ||
Oh, cool. | ||
I've eliminated all that. | ||
And I'm like, just that alone is worth it. | ||
Just that alone, it's worth quitting. | ||
So why do you think there's this trend? | ||
Is it mostly for health? | ||
Well, I think there's a big health trend with a lot of young people. | ||
I think a lot of young people are recognizing the value of supplements. | ||
There's that fly. | ||
There's a difference between you and me. | ||
I'm going to kill this motherfucker. | ||
First fly I've ever had in here, Jamie. | ||
That's kind of crazy. | ||
Been here five years. | ||
One fly. | ||
Flew with me from California. | ||
He snuck in because there's a lot of steps that motherfucker has to go through to get into this room. | ||
I think a lot of people are very health conscious. | ||
That's the rise of cold plunging and sauna use and all these different things like intermittent fasting where people are really paying attention to their body and really paying attention and noticing that if you do follow these steps, it really does make a significant difference in the way you feel. | ||
And maybe more importantly, the way everything operates, not just your body, but your brain. | ||
It's like your function, your cognitive function improves with physical fitness. | ||
And, you know, if you're an ambitious person and you want to do well in life, you want your body to work well, you know, alcohol is not your friend. | ||
And I wonder how much of it is your impact because those things, you got me into all these things through your podcast. | ||
My wife and I just built like a small kind of spa in our home with like a cold plunge and a sauna and a hot tub. | ||
And I'll try to do it every day. | ||
And you know, something you say, I keep saying to myself, it's like, conquer your inner bitch. | ||
Yeah. | ||
It's like, this is such a good, and I feel like cold plunge especially kind of, it's just something, regardless, health benefits or not, something about it, like just mental toughness, like trying to do it every day. | ||
And every day I chicken out. | ||
Every day I want to give up. | ||
I don't want to go in, right? | ||
I do too. | ||
My inner bitch speaks the loudest when I'm lifting the lid off the cold plunge. | ||
My inner bitch is like, don't do this. | ||
You don't have to do this. | ||
You can do whatever you want. | ||
unidentified | | You're a free man. |
You can go have a sandwich, you know? | ||
Right, right. | ||
But you just got to decide that you're the boss. | ||
Yeah. | ||
And I think a lot of what discipline is for me is that. Again, even keto, and I did carnivore and these diets, I'm not sure how much health benefit there is. | ||
I feel like keto is really good on your blood sugar and keeps you kind of on a, you know, even keel kind of throughout the day. | ||
But for me, whenever there's like a lot of chaos in my life, I look at what can I control. | ||
unidentified | | Right. |
And typically diet is the first thing. | ||
Whatever it is, I'm like, I'm going to go carnivore. | ||
I'm going to go keto. | ||
And the fact that I can control that and enforce discipline on myself kind of puts me at ease. | ||
And I feel like I can control the other thing in my business, family, life. | ||
But that mindset is probably how you stop playing video games every day. | ||
Yeah. | ||
Because I would imagine, like we were talking about earlier, like that addiction is one of the strongest addictions I've ever faced in my life. | ||
Like back when I was playing, if I would be talking to people and the conversation was boring, I'd be like, I could be playing Quake right now. | ||
Why am I here having this boring ass conversation where I could be launching rockets at people and having a good time? | ||
But the other thing for me is programming. | ||
So I got into programming early in my life. | ||
I was six years old when my father bought a computer. | ||
I was born and raised in Amman, Jordan. | ||
And we're the first people I know ever at the time that had a computer. | ||
And I remember. | ||
What year was this? | ||
1993. | ||
I was six years old. | ||
Okay, so 93. | ||
So what kind of computer was that? | ||
Was that an old school IBM? | ||
IBM PC, MS-DOS, Microsoft DOS. | ||
Oh, so you did the real deal. | ||
Yeah. | ||
I know a lot of Americans would get a Mac as their first computer. | ||
That's what I got. | ||
Yeah, yeah, yeah. | ||
No, we didn't have Mac. | ||
I actually wasn't introduced to Apple until kind of recently in my life. | ||
Really? | ||
Yeah. | ||
Like recently, recently? | ||
Like, no, like, you know, 12 years ago, 13 years ago, when I moved to the U.S. God, Apple has such a stranglehold in America. | ||
It's really incredible. | ||
Yeah, it's amazing. | ||
But we didn't know much about it. | ||
So I got into DOS. | ||
I remember one of my earliest memories is standing behind my father as he was kind of pulling up this huge manual and learning how to type commands. | ||
And he was finger typing those commands. | ||
And then I would watch him. | ||
And then after he left, I'd go and try those things. | ||
And one day he caught me. | ||
I was like, what are you doing? | ||
I'm like, I know how to do this. | ||
I'll show you. | ||
And so I knew how to start games, do a little bit of programming, do a little bit of scripting. | ||
And that's how I got into computers. | ||
And I was obsessed. | ||
And initially, it sort of got me into gaming. | ||
But then you want to mod the games. | ||
Have you ever done any modding? | ||
I've done a few things like turn textures off and stuff like that. | ||
Yeah, and that's another thing that I think is healthy about gaming is like a gateway to programming. | ||
Sure. | ||
Gateway drug to programming. | ||
And so I got into like modding, like Counter-Strike and things like that. | ||
Those were fun. | ||
And then just like the feeling that you can make something is just like such a profound, such a profound feeling. | ||
And that's really kind of what I carried through my whole life and became sort of my life mission. | ||
Now with my company, Replit, what we do is like we make it so that anyone can become a programmer. | ||
You just talk to your phone and the app, sort of like ChatGPT, and it starts coding for you. | ||
It's like a software engineering agent. | ||
Right. | ||
So it's like the AI guides you through it. | ||
Yeah, not only guides you through it, it codes for you. | ||
So you're sort of, you know, programmers typically think about the idea a little bit, about the logic, but most of the time they're sort of wrangling the syntax and the IT of it all. | ||
And I thought that was always additional complexity that doesn't necessarily have to be there. | ||
And so when I saw GPT for the first time, I thought this could potentially transform programming and make it accessible to more and more people. | ||
Because it really transformed my life. | ||
The reason I'm in America is because I invented a piece of software. | ||
And I thought if you make it available to more people, they can transform their lives. | ||
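To make the idea of a software engineering agent concrete, here is a minimal sketch of the generate-run-retry loop such an agent might perform. It is not Replit's actual implementation; generate_code below is a hypothetical placeholder for whatever model call would do the real work, and everything else is plain standard library.

```python
# Minimal sketch of a prompt-to-code agent loop (illustrative only, not Replit's code).
import subprocess
import sys
import tempfile
import textwrap


def generate_code(request: str, previous_error: str = "") -> str:
    """Placeholder for a model call; returns a runnable Python script as text."""
    return textwrap.dedent(f"""\
        # generated for request: {request}
        print("hello from the generated program")
    """)


def agent_loop(request: str, max_attempts: int = 3) -> str:
    error = ""
    for _ in range(max_attempts):
        code = generate_code(request, error)  # "it codes for you"
        with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
            f.write(code)
            path = f.name
        run = subprocess.run([sys.executable, path], capture_output=True, text=True)
        if run.returncode == 0:
            return code  # the program ran cleanly; hand the code back
        error = run.stderr  # otherwise, feed the error back for another attempt
    raise RuntimeError("agent gave up after max_attempts")


if __name__ == "__main__":
    print(agent_loop("a script that greets the user"))
```

In a real product the model call, sandboxing, and project management are far more involved; the point is just the loop of prompt, generate, run, and feed errors back.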
Why was your dad messing around with computers? | ||
Was he doing it for fun? | ||
I want to let you in on something. | ||
Your current wireless carrier does not want you to know about Visible because Visible is the ultimate wireless hack. | ||
No confusing plans with surprise fees, no nonsense, just fast speeds, great coverage without the premium cost. | ||
With Visible, you get one-line wireless with unlimited data powered by Verizon's network for $25 a month, taxes and fees included. | ||
Seriously, $25 a month flat. | ||
What you see is what you pay. | ||
No hidden fees on top of that. | ||
Ready to see? | ||
Join now and unlock unlimited data for just $25 a month on the Visible plan. | ||
Don't think wireless can be so transparent? | ||
So Visible? | ||
Well, now you know. | ||
Switch today at visible.com slash Rogan. | ||
Terms apply. | ||
See visible.com for plan features and network management details. | ||
Yeah, so my dad is a Palestinian refugee. | ||
Yeah, you were telling me the story, and I want to get into that because it's kind of crazy. | ||
Tell the whole story of how this wound up happening. | ||
Yeah, yeah, yeah. | ||
So my family is originally from Haifa, which is now in Israel, and they were expelled as part of the 1948 Nakba, where Palestinians were sort of kicked out. | ||
And they went to like Fiji. | ||
How does your dad describe that? | ||
How old was he when that was going on? | ||
My father was born in Syria. | ||
So my grandma and my grandpa and my uncles were kind of kicked out. | ||
And the way they would describe that is they try to fight, they try to keep their home, but it was like this overwhelming force. | ||
They weren't organized. | ||
They were just people. | ||
They didn't really have an army, at least in that place. | ||
And eventually, at gunpoint, they took their homes and told them to go. | ||
If you're down south, you went to Gaza, and that's why 70% of Gazans are refugees from Israel. | ||
Like the people that are getting massacred right now are originally from Israel, from the land that people call Israel today. | ||
And then if you're in the north, like Haifa or Yafa, whatever, you went to Lebanon or to the West Bank or to Jordan or to Syria. | ||
So my family went to Syria. | ||
My father was born in Syria. | ||
But my grandfather was like a railroad engineer. | ||
So they were like city people. | ||
They were urban. | ||
So they couldn't, like, you know, they wanted to have a place where they could live in a city. | ||
And so originally the West Bank didn't work for them, and they ended up in Syria. | ||
But then Amman, Jordan, was kind of coming up, and there was a lot of opportunities there. | ||
So my father was born in Syria and then moved to Amman when he was six years old and built a life there. | ||
And they really kind of focused on education and trying to kind of rebuild their life from scratch. | ||
So my father and all my uncles kind of went and got educated in Egypt, Turkey, places like that. | ||
And so my father got an engineering degree, civil engineering degree from Turkey. | ||
And he was always interested in technology. | ||
That whole thing, kicking people out of Palestine, is such an inconvenient story today. | ||
When people are talking about Israel and Palestine and the conflict, they do not like talking about what happened in 1948. | ||
Yeah, and I think it's important. | ||
I think for us to reach some kind of peace, which is really hard to talk about when you see what's happened in Gaza, even yesterday. | ||
unidentified | | Yeah. |
Yeah, the people that were waiting for food got bombed. | ||
It's insane. | ||
And no one wants to talk about it. | ||
Right. | ||
And if you do talk about it, you're anti-Semitic, which is so strange. | ||
I don't know how they've wrangled that. | ||
It's been hard for me in tech because I'm probably the only prominent Palestinian in tech that is talking about it. | ||
Do you get pushback? | ||
Oh, of course. | ||
Like, what do people say to you? | ||
Anti-Semitic. | ||
How is it anti-Semitic? | ||
To criticize the state of Israel? | ||
Our position, every modern Palestinian that I know, their position is like two-state solution. | ||
We need the emergence of the state of Palestine, you know, and ending the occupation is the best way to guarantee peace and security even for Israelis. | ||
But yeah, it's just like it's used, it sort of reminds me, you know, in tech, we went through this like quote-unquote woke period where you couldn't talk about certain things as well. | ||
Has that gone away? | ||
Yeah. | ||
Yeah. | ||
Yeah, totally gone away. | ||
Yeah. | ||
What do you think caused it to go away? | ||
Elon? | ||
Really? | ||
Yeah, like Twitter, buying Twitter. | ||
Wow. | ||
Buying Twitter is the single most impactful thing for free speech, especially on these issues, of being able to talk freely about a lot of subjects that are more sensitive. | ||
Imagine if he didn't buy it. | ||
Yeah. | ||
I mean, that would have been. | ||
Imagine if the same ownership was in place and then Harris wins and they continue to ramp things up. | ||
Yeah, I don't know what you think of the new administration. | ||
Certainly there are things that I like about some of their pro-tech posture and things like that. | ||
But what's happening now is kind of disappointing. | ||
It's insane. | ||
We were told there would be no... | ||
One is the targeting of migrant workers, not cartel members, not gang members, not drug dealers, just construction workers showing up in construction sites and raiding them. | ||
Gardeners. | ||
Yeah. | ||
Like, really? | ||
Or Palestinian students on college campuses. | ||
Or not, like, there's a Turkish. | ||
Did you see this video of this Turkish student at Tufts University that wrote an essay, and then there's a video of ICE agents, like, I don't know. | ||
Is that the woman? | ||
Yeah, yeah. | ||
Yeah. | ||
What was her essay about? | ||
It was just critical of Israel, right? | ||
unidentified | | Just critical of Israel. |
Yeah, I mean. | ||
And that's enough to get you kicked out of the country. | ||
There's a long history of anti-colonial activism in U.S. colleges that led to South Africa changing and all of that. | ||
And I think this is a continuation of that. | ||
I mean, I don't agree with all their, like, there's a lot of radicalism. | ||
A lot of young people are attracted to more radical positions on Israel-Palestine. | ||
Which I don't mind those positions as long as someone's able to counter those positions. | ||
The problem is these supposed free speech warriors want to silence anybody who has a more conservative opinion. | ||
That's not the way to handle it. | ||
The way to handle it is to have a better argument. | ||
That's not American. | ||
It's not American. | ||
What attracted me to this country, from the moment that I was aware and we started consuming American media and American culture, is freedom, the concept of freedom, which I think is real. | ||
I think is real. | ||
It is. | ||
I was watching this psychology student from, I think he's from Columbia, but he has a page on Instagram. | ||
I wish I could remember his name because he's very good. | ||
He's a young guy. | ||
But he had a very important point, and it was essentially that fascism rises as the over-correction response to communism. | ||
And that we essentially had this Marxist communism rise in first universities, and then it made its way into business because these people left the university and then found their way into corporate America. | ||
And then they were essentially instituting those. | ||
And then the blowback to that, the pushback, is this fascism. | ||
That happened last century? | ||
Well, they're talking about forever historically. | ||
He's talking about over time, whether it's Mao, whether it's Stalin, like fascism is the response almost always to communism. | ||
Interesting. | ||
And that, you know, what we experience with this country is this continual over-correction. | ||
Over-correction to the left, then over-correction to the right to counter that. | ||
And the people that are the... that's the guy. | ||
Anthony Rispeau. | ||
That's it. | ||
Really, really smart guy. | ||
And very interesting thing. | ||
Jamie, how did you nail that that quick? | ||
Good job, buddy. | ||
You said those words right as I saw them. | ||
Decades of training. | ||
Yeah. | ||
Communism, fascism. | ||
Yeah, communism came first, fascism came response. | ||
Now today's left tears down norms and destabilizes the country under the guise of progress. | ||
We're watching the conditions for another reaction build. | ||
History doesn't repeat, but it echoes. | ||
Yeah. | ||
Do you know this theory? | ||
I know you've had Mark Andreessen on the show, this James Burnham managerial revolution theory. | ||
No, not offhand. | ||
I'm not an expert in it, but the idea is that communism, fascism, and even the form of capitalism we're sort of living under right now are all managerialism. Capitalism used to be the idea that the owner-founders of those companies, of capitalist companies, were running them. | ||
And it was like true capitalism of sorts. | ||
But both communism and fascism share this property of centralized control and a class of people that are sort of managerial. | ||
And maybe those are the elite sort of Ivy, Ivy League students that are trained to be managers and they grow up in the system, kind of bred to become like managers of these companies. | ||
And today's America is like trending that way where it is like a managerial society. | ||
In Silicon Valley, there's like a reaction to that right now. | ||
People call it founder mode, where a lot of founders felt like they were losing control of their companies because they're hiring all these managers. | ||
And these managers are running the companies like you would run Citibank. | ||
And then a lot of founders were like, no, we need to run those companies like we built them. | ||
And Elon is obviously at the forefront of that. | ||
I once visited XAI when they were just starting out, Elon's AI company. | ||
And there were like 70 people. | ||
All of them reported to Elon. | ||
They didn't have a single manager on staff. | ||
Wow. | ||
And they would send him an email every week. | ||
It was like, what did you get done this week? | ||
Right. | ||
Well, that was the outrageous thing that he asked people to do at Doge. | ||
Yeah. | ||
People were freaking out. | ||
Five minutes a week. | ||
What are the things you accomplished this week? | ||
How? | ||
You know, he said, all you have to do is respond. | ||
Right. | ||
And they didn't want, they pushed back so hard on being accountable for their work. | ||
Yeah. | ||
But that's government for you. | ||
Yeah. | ||
You know what I mean? | ||
Government is the grossest, most incompetent form of business. | ||
You know, it's a monopoly. | ||
It's a complete, total monopoly. | ||
Like the way he describes some of the things that they found at Doge, it's like you could never run a business that way. | ||
Because not only would it not be profitable, the fraud would get you arrested. | ||
You'd go to jail for something that's standard in the government. | ||
Right, right. | ||
I mean, my opinion is that talented people, people like Elon, should be in the free market. | ||
I think you can make little change in government. | ||
As best we can sort of expect of our government to get out of the way of innovation, let people, let founders, entrepreneurs innovate and make the market more dynamic. | ||
But again, going back to this idea of managerialism, if you look at the history of America, one really striking stat is that new firm creation, new startups in the United States, has been trending down for a long time. | ||
There's all this talk of startups in Silicon Valley and all of that, but in reality, there's less entrepreneurship than there used to be. | ||
And instead, we have this system of conglomerates and really big companies and monopsony, which is the idea that the banks or BlackRock own competitors as well, owning all these companies. | ||
And they implicitly collude because they have the same owners. | ||
And all of that is sort of anti-competitive. | ||
So the market has gotten less dynamic over time. | ||
And this is also part of the reason I'm excited about our mission at Replit to make it so that anyone can build a business. | ||
Actually, on the way here, your driver, Jason, is a fireman. | ||
And so I was telling him about our business. | ||
And he does training for other firemen around the country. | ||
He flies around. | ||
And he does it out of pocket and just for the love of the game. | ||
And he was like, yeah, I've had this idea for a website so I can scale my teaching. | ||
I can make it known where I'm going to be giving a course, put the material online. | ||
And we were brainstorming, potentially this could be a business. | ||
And I feel like everyone, like not everyone, but a lot of people have business ideas, but they are constrained by their ability to make them. | ||
And then you go, you try to find a software agency and they quote you sort of a ton of money. | ||
Like we have a lot of stories. | ||
There's this guy. | ||
His name is Joan Cheney. | ||
He's a user of our platform. | ||
He's a serial entrepreneur, but whenever he wanted to try ideas, he would spend hundreds of thousands of dollars to kind of spin up an idea off the ground. | ||
And now he uses Replit to try those ideas really quickly. | ||
And he recently made an app in a number of weeks, like three, four, five weeks, that made him $180,000. | ||
So it's on its way to generating millions of dollars. | ||
And because he was able to build a lot of businesses and try them really quickly. | ||
Right, without the big investment. | ||
Without the big investment, without other people, which at some point you need more collaborators, but early on in the brainstorming and in the prototyping phase, you want to test a lot of ideas. | ||
And so it's sort of like 3D printing, right? | ||
Like 3D printing, although people don't think it had a lot of impact on industry, it's actually very useful for prototyping. | ||
I remember talking to Jack Dorsey about this, and early on in Square, they had this Square device, and it was amazing. | ||
You would plug it into the headphone jack to accept payments. | ||
Do you remember that? | ||
And so a lot of what they did to kind of develop the form factor was using 3D printing because it's a lot faster to kind of iterate and prototype and test with users. | ||
And so with software, over time, like I explained, when I was growing up, it was kind of easier to get into software. | ||
Because you boot up the computer and you get MS-DOS, and it immediately invites you to program in it. | ||
Whereas today, you, you know, buy an iPhone or a tablet, and it is like a purely consumer device. | ||
It has like all these amazing colors and does all these amazing things, and kids get used to it very quickly, but it doesn't invite you to program it. | ||
And therefore, we kind of lost that sort of hacker ethos. | ||
There's less programmers, less people who are making things because they got into it organically. | ||
It's more like they go to school to study computer science because someone told them you have to study computer science. | ||
And I think making software needs to be more like a trade. | ||
Like, you don't really have to go to school and spend four or five years and hundreds of thousands of dollars to learn how to make it. | ||
Well, what I'm hearing now is that young people are being told to not go into programming because AI is essentially going to take all of that away. | ||
That you're just going to be able to use prompts. | ||
You're just going to be able to say, I want an app that can do this. | ||
I want to be able to scale my business to do that. | ||
You know, what should I do? | ||
Yeah, that's what we built. | ||
That's what Replit is. | ||
It automates the. | ||
Do you agree with that, that young people shouldn't learn programming? | ||
Or do you think that there's something very valuable about being able to actually program? | ||
Look, I think that you will always get value from knowledge. | ||
I mean, that's a timeless thing. | ||
That's why, right? | ||
You know, it's like, you know, you and I are into cars, right? | ||
Like, I don't really have to tune up my car anymore, but it's useful to know more about cars. | ||
It's fun to know about cars. | ||
You know, if something happens, if I go to the mechanic and he's doing work on my car, I know he's not going to scam me because I can understand what he's doing. | ||
Knowledge is always useful. | ||
And so I think people should learn as much as they can. | ||
And I think the difference, though, Joe, is that when I was coming up in programming, you learned by doing. | ||
Whereas it became this very traditional type of learning, like textbook learning. | ||
Whereas I think now we're back with AI. | ||
We're back to an era of learning by doing. | ||
Like when you go to our app, you see just text prompts, but a couple clicks away, you'll see the code. | ||
You'll be able to read it. | ||
You'll be able to ask the machine what it did there. | ||
Teach me how this piece of code works. | ||
Oh, that's cool. | ||
And so a lot of kids are learning. | ||
Kids are such sponges, too. | ||
They're such sponges. | ||
And kids already know way more about. | ||
I'm like, how did you do that with your phone? | ||
And my daughter will go, are you doing this? | ||
You got the little thumbs moving 100 miles an hour. | ||
Yeah, exactly. | ||
How'd you figure that out? | ||
TikTok. | ||
What? | ||
Dude, the craziest thing is we have a lot of people making software from their phone. | ||
They'll spend eight hours on their phone because we have an app. | ||
They'll spend eight hours on their phone kind of making software. | ||
Wow. | ||
And that's better than watching TikTok. | ||
It makes me very happy about that. | ||
You're just accomplishing something. | ||
Yeah, you're doing creation. | ||
You're just droning. | ||
The act of creation is divine. | ||
We just announced a partnership with the government of Saudi Arabia where they want their entire population essentially to learn how to make software using AI. | ||
So they set up this new company called Humain, and Humain is this end-to-end value chain company for AI, all the way from chips to software. | ||
And they're partnering with a lot of American companies as part of the coalition that went to Saudi a few months ago with President Trump to do the deals with the Gulf region. | ||
And so they're doing deals with AMD, NVIDIA, a lot of other companies. | ||
And so we're one of the companies that partnered with Humain. | ||
And so we want to bring AI coding to literally every student, every government employee. | ||
Because the thing about it is, it's not just entrepreneurs that are going to get something from it. | ||
It's also if you're... | ||
Really? | ||
Yeah. | ||
And so, you know. | ||
So this is the best case scenario future. | ||
Yes. | ||
As opposed to everyone goes on universal basic income and the state controls everything and everything is done through automation. | ||
I don't believe in that other one. | ||
You don't? | ||
I don't. | ||
I don't. | ||
Okay. | ||
Good. | ||
Help me out, man. | ||
Yeah. | ||
Give me the positive rose-colored glasses view of what AI is going to do for us. | ||
Yeah. | ||
So AI is good at automating things. | ||
I think there's a primacy to human beings still. | ||
Like I think humans are... | ||
I'm so bullish on AI. | ||
I think it's going to change the world. | ||
But at the same time, I don't think it's replacing humans because it's not generalizing, right? | ||
AI is like a massive remixing machine. | ||
It can remix all the information it learned. | ||
And you can generate a lot of really interesting ideas and really interesting things. | ||
You can have a lot of skills by remixing all these things. | ||
But we have no evidence that it can generate a fundamentally novel thing or a paradigm change. | ||
Can a machine go from Newtonian physics to quantum mechanics, really have a fundamental disruption in how we understand things or how we do things? | ||
Do you think that takes creativity? | ||
I think that's creativity, for sure. | ||
And that's a uniquely human characteristic? | ||
For now? | ||
For now? | ||
Definitely for now. | ||
I don't know, forever. | ||
Actually, one of my favorite JRE episodes was Roger Penrose. | ||
Do you remember him? | ||
unidentified | | Yes. |
So do you remember the argument that he made about why humans are special? | ||
He said something like, he believes there are things that are true that only humans can know are true, but machines cannot prove are true. | ||
It's based on Gödel's incompleteness theorem. | ||
And the idea is that you can construct a mathematical system where it has a paradoxical statement. | ||
So, for example, you can construct a statement G that says, this statement is not provable by the machine. | ||
Or like the machine cannot prove the statement. | ||
And so if the machine proves the statement, then the statement is false. | ||
So you have a paradox. | ||
And therefore, the statement is sort of true from the perspective of an observer, like a human, but it is not provable in this system. | ||
So Roger Penrose says these paradoxes that are not really resolved in mathematics and machines are no problem for humans. | ||
And therefore, and this is a bit of a leap, there's something special about humans and we're not fundamentally a computer. | ||
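For reference, the construction being described is roughly Gödel's sentence. Here is a sketch in standard notation; the exact formalization varies, and this is only an illustration of the statement Penrose leans on, not his full argument.

```latex
% Sketch of the Goedel sentence G for a formal system F (the "machine").
% Prov_F(x) abbreviates "F proves the statement whose code is x".
% G is constructed so that it asserts its own unprovability in F:
G \;\leftrightarrow\; \neg\,\mathrm{Prov}_F(\ulcorner G \urcorner)
% If F is consistent, F can never prove G; but an observer reasoning about F
% can see that G must then be true, which is the gap Penrose points to.
```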
Right. | ||
That makes sense. | ||
I mean, whatever creativity is, whatever allows you to make poetry or jazz or literature, like whatever, whatever allows you to imagine something and then put it together and edit it and figure out how it resonates correctly with both you and whoever you're trying to distribute it to. | ||
There's something to us that's different. | ||
I mean, we don't really have a theory of consciousness. | ||
And I think it's like sort of hubris to think that consciousness just emerges. | ||
And it's plausible. | ||
Like I'm not totally against this idea that you built a sufficiently intelligent thing and suddenly it is conscious. | ||
But it's like a religious belief that a lot of Silicon Valley has, that consciousness is just a side effect of intelligence or that consciousness is not needed for intelligence. | ||
Somehow it's like this superfluous thing. | ||
And they try not to think or talk about consciousness because actually consciousness is hard. | ||
Hard to define. | ||
Hard to define, hard to understand scientifically. | ||
It's what I think Chalmers calls the hard problem of consciousness. | ||
But I think it is something we need to grapple with. | ||
We have one example of general intelligence, which is human beings. | ||
And human beings have a very important property that we can all feel, which is consciousness. | ||
And that property, we don't know how it happens, how it emerges. | ||
People like Roger Penrose have these theories about quantum mechanics in microtubules. | ||
I don't know if you got into that with him, but I think he has a collaborator, neuroscientist, Hameroff, I think, or something like that. | ||
But people have so many theories. | ||
I'm not saying Penrose has the answers, but it's something that philosophers have grappled with forever. | ||
And there are a lot of interesting theories. | ||
There's this theory that consciousness is primary, meaning the material world is a projection of our collective consciousness. | ||
Yes. | ||
Yeah. | ||
That is a very confusing but interesting theory. | ||
And then there's a lot of theories that everything is conscious. | ||
We just don't have the ability to interact with it. | ||
You know, Sheldrake has a very strange view of consciousness. | ||
Who's Sheldrake? | ||
Rupert Sheldrake. | ||
I don't know. | ||
He's got this concept. | ||
I think it's called morphic resonance. | ||
And see if you can find that so we could define it so I don't butcher it. | ||
But there's people that believe that consciousness itself is something that everything has and that we are just tuning into it. | ||
Morphic resonance, a theory proposed by Rupert Sheldrake, suggests that all natural systems, from crystals to humans, inherit a collective memory of past instances of similar systems. | ||
This memory influences their form and behavior, making nature more habitual than governed by fixed laws. | ||
Essentially, past patterns and behaviors of organisms influence present ones through connections across time and space. | ||
That's wild. | ||
And is he a scientist, or is this more like a new age thing? | ||
What is his exact background? | ||
Harvard. | ||
Oh, wow. | ||
Yeah. | ||
Okay. | ||
So he's a parapsychology researcher who proposed the concept of morphic resonance, a conjecture that lacks mainstream acceptance. | ||
It's been widely criticized as pseudoscience. | ||
Of course. | ||
Anything interesting. | ||
That sounds interesting, though. | ||
Yeah. | ||
But there are philosophers that have sort of a similar idea of this sort of universal consciousness and humans are getting a slice of that consciousness. | ||
Every one of us is tapping into some sort of universal consciousness. | ||
Yes. | ||
By the way, I think there are some psychedelic people that think the same thing, that when you take psychedelics, you're just peering into that universal consciousness. | ||
Yes. | ||
Yeah. | ||
That's the theory. | ||
Because that's also the most unknown. | ||
I mean, the experience is so baffling that people come back and the human language really lacks any phrases, any words that sufficiently describe the experience. | ||
So you're left with this very stale, flat, one-dimensional way of describing something that is incredibly complex. | ||
So it always feels, even the descriptions, even like the great ones like Terrence McKenna and Alan Watts, like they're descriptions that fall very short of the actual experience. | ||
Nothing about it makes you go, yes, that's it. | ||
He nailed it. | ||
It's always like, kind of, yeah, kind of, that's it. | ||
Do you still do it? | ||
Not much. | ||
You know, it's super illegal, unfortunately. | ||
That's a real problem. | ||
It's a real problem, I think, with our world, the Western world, is that we have thrown this blanket phrase. | ||
You know, we talk about language being insufficient. | ||
The word drugs is a terrible word to describe everything that affects your consciousness or affects your body or affects performance. | ||
You have performance-enhancing drugs, like steroids, and then you have amphetamines, and then you have opiates, and you have highly addictive things, fenced coffee. | ||
Nicotine. | ||
And then you have psychedelics. | ||
I don't think psychedelics are drugs. | ||
I think it's a completely different thing. | ||
It's really hard to get addicted to them, right? | ||
Well, it's almost impossible. | ||
I mean, you could certainly get psychologically addicted to experiences. | ||
I think there's also a real problem with people who use them and think that somehow or another, just from using them, they're gaining some sort of advantage over normal society. | ||
unidentified | | And that's – You don't think that's true? |
I think it's a spiritual narcissism that some people have. I think it's very foolish, and it's a trap. | ||
You know, I think it's like it's a similar trap that famous people think they're better than other people because they're famous. | ||
You know what I mean? | ||
unidentified | | Yeah. |
Yeah, I felt that with a lot of people who get into sort of more Eastern philosophy is that there's this thing about them where it feels like there's this air of arrogance. | ||
Yeah. | ||
That like I know something more than you know. | ||
Right, right, right. | ||
And then they hold it over you. | ||
That's the trap. | ||
But that doesn't mean that there's not valuable lessons in there to learn. | ||
I think there are. | ||
And I think there are valuable, perspective-enhancing aspects to psychedelic experiences that we are denying people. | ||
You know, you're denying people this potential for spiritual growth, like legitimate spiritual growth. | ||
And personal. | ||
Yeah, healing. | ||
The Ibogaine thing they're trying to do in Texas, I think, is amazing. | ||
And they passed this. | ||
So this is also with the help of former Governor Rick Perry, who's a Republican. | ||
But he's seen what an impact Ibogaine has had on soldiers. | ||
And all these people that come back from the war. | ||
Horrible PTSD and suicidal. | ||
We lose so many servicemen and women to suicide. | ||
And this has been shown to have a tremendous impact. | ||
And so because of the fact that a guy like Rick Perry stuck his neck out, who's a Republican former governor, you would think he'd be the last person ever. | ||
But because of his experiences with veterans and his love of veterans and people that have served this country, they've passed that in Texas. | ||
I think that's a really good first step. | ||
And the great work that MAPS has done, MAPS working primarily with MDMA, doing the same thing and working with people that have PTSD. | ||
There's so many beneficial compounds. | ||
Yeah, ketamine is one I think that's a lot of research happening already now on depression specifically, right? | ||
Yeah. | ||
So there's quite a bit of research. | ||
Have you heard, I don't know if it's true, but have you heard of mushrooms healing long COVID? | ||
I don't know what long COVID means because everybody I've talked to that has long COVID was also vaccinated. | ||
I think long COVID is vaccine injury. | ||
That's what I think. | ||
I think in a lot of cases. | ||
There is such a thing as, like, a post-viral malaise or effect that's always been there. | ||
Sure. | ||
Well, there's a detrimental effect that it has to your overall biological health, right? | ||
unidentified | | Yeah, yeah. |
You know, your overall metabolic health. | ||
But what causes someone to not rebound from that? | ||
What causes someone to rebound fairly easily? | ||
Well, mostly it's metabolic health, you know, other than like extreme biological variabilities, vulnerabilities that certain people have to different things, you know, obviously. | ||
Yeah, maybe that's why I think, so there's a lot of these long COVID protocols. | ||
Metformin is usually part of it. | ||
So maybe that acts on your metabolic system. | ||
Well, yeah, metformin is one of the anti-aging protocols that Sinclair uses and a lot of these other people that are into the anti-aging movement. | ||
Yeah. | ||
You know, I had this weird thing happen where I started feeling fatigued a couple few years ago, and I would sleep for hours, and the more I slept, the more tired I'd get in the morning. | ||
Did you get blood work done? | ||
I got blood work done, and there were some things about it that I needed to fix, and I fixed all of them. | ||
Like what was off? | ||
Lots, you know: blood sugar in the morning, cholesterol, which I know some people don't believe in, but you know, all my numbers got better. | ||
Vitamin D, everything got better, but and I could feel. | ||
Did the fatigue get better? | ||
No, I could feel marginal improvements, but the fatigue did not get better. | ||
Were you vaccinated? | ||
No. | ||
Good for you. | ||
That's hard to do in Silicon Valley. | ||
Yeah. | ||
Yeah, I tend to have a negative reaction to anyone forcing me to do something. | ||
Good for you. | ||
Was it the same thing now with like this, you know, talking about Palestine and things like that? | ||
Like the more they come at me, the more I want to say things. | ||
It's not always a good thing, but I think I grew up this way. | ||
I've always kind of looked different and felt different. | ||
Well, there's a reality to this world that there's a lot of things that people just accept that you're not allowed to challenge that are deeply wrong. | ||
Yeah, and with regards to the vaccine, I was also informed about it. | ||
Like it was clear early on that it wasn't a home run. | ||
It wasn't, well, first of all, it wasn't going to stop the spread. | ||
So that was a lie. | ||
And the heart condition in young men is real. | ||
And I had friends that had this issue. | ||
And so if you're healthy and like, you know, why take the vaccine? | ||
It doesn't stop the spread. | ||
You can still get the virus. | ||
I'll tell you why. | ||
unidentified | | What? |
Money. | ||
It's the only reason why. | ||
It's the only reason why. | ||
The only reason why they wanted to make an enormous amount of money. | ||
And the only way to do that is to essentially scare everyone into getting vaccinated, force, coerce, do whatever you can, mandate it at businesses, whatever you can, mandate it for travel, do whatever you can, shame people. | ||
That's the thing that is really disheartening about American culture today is, and again, I love America. | ||
It afforded me so much. | ||
I'm like, you know, like I'm the walking evidence of the American dream being possible, coming with literally nothing. | ||
That's what I really love about immigrants that love America. | ||
They know, they've been other places. | ||
They know that this really is a very unique place. | ||
Right. | ||
And the speech thing is interesting because when something happens, there's this, I don't know, you can call them useful idiots or whatever, but there's this suppression that immediately happens. | ||
Yes. | ||
And we're seeing it right now with the war in Iran where any dissenting voices are just like hit with overwhelming force. | ||
Don't you think that a lot of that is coordinated, though? | ||
I think with social media, well, you know, we've talked about it. | ||
I don't think it was coordinated with COVID, like the two weeks to stop the spread. | ||
It was just like... | ||
Yeah. | ||
Maybe there was a message pushed top down, and then... It's coordinated at first, but then a bunch of people do the man's work for the man. | ||
I think it comes from a good place. | ||
Like, a lot of people want to trust the authorities. | ||
Like, they're pro-science. | ||
They view themselves as enlightened, like the liberal type, rational, educated. | ||
But I think they're naive about the corruption in our institutions and the corruption of money specifically. | ||
And so they parrot these things and become overly aggressive at suppressing dissenting voices. | ||
Yes. | ||
It becomes a religious thing almost. | ||
But here's the sort of white pill about America. | ||
Then there are voices like yours and others that create this pushback that, and you took a big hit, it probably was very stressful for you, but you could see there's this pushback and then it starts opening up and maybe people can talk about it a little bit and then slowly opens up and now there's a discussion. | ||
And so I think, as I said, something about America right now is challenging, but also the flip side of that is there's this correction mechanism. | ||
And again, with the opening up of platforms like Twitter and others, and by the way, a lot of others copied it. | ||
You had Zuck here. | ||
I worked at Facebook. | ||
I know that was very, let's say... I think he always held free speech in high regard, but there were a lot of people in the company that didn't. | ||
Yes, I would agree with that. | ||
And there was suppression. | ||
But then now it's the other way around, I would say with the exception of the question of Palestine and Gaza. | ||
But even that is getting better. | ||
There's at least some pushback. | ||
It's available. | ||
It's just not promoted. | ||
You know, it's interesting. | ||
Not to continue. | ||
I don't mean to kind of... | ||
They're sincere and they're looking at what's happening in Gaza and they're seeing images and they're saying, this is not what we should be as America. | ||
We should be pro, pro-life, pro-peace. | ||
And I really appreciate that. | ||
And that's starting to open up. | ||
I think in the future that will be the primary way people look at it. | ||
Just the way a lot of people oppose the Vietnam War in the late 60s. | ||
But it was, you know, you would get attacked. | ||
And I think now people realize that was the correct response. | ||
And I think in the future, people realize the correct response is like, this is not. | ||
Yeah, October 7th was awful. | ||
Absolutely. | ||
Obviously. | ||
Terrible attack. | ||
But also, what they've done to Gaza is fucking insane. | ||
It's insane. | ||
And if you can't see that, if you can't say that, and your response is, Israel has the right to defend itself. | ||
Like, what are you talking about? | ||
Against what? | ||
Children? | ||
Against women and children that are getting blown apart? | ||
Against aid workers that are getting killed? | ||
Like, what are you talking about? | ||
Like, we can't have a rational conversation if you're not willing to address that. | ||
Yeah. | ||
I think their heart is hardened. | ||
If I'm trying to be as charitable as possible, like the Israelis specifically, maybe from October 7, what they saw there, their heart is hardened. | ||
And I think a lot of people, especially on the Republican side, they're unable to see the Palestinians as humans, especially as people with emotions and feelings and all of that. | ||
Like imagine if that was happening to Scandinavia, you know? | ||
unidentified | | Yeah, right? |
Yeah, exactly. | ||
It's very strange. | ||
My kid, my five-year-old kid called me two days ago. | ||
They're in Amman, Jordan. | ||
They're visiting their grandparents. | ||
And I was in the car, and it was FaceTime. | ||
And the moment the camera opened, he's like, what are you doing? | ||
Why are you outside? | ||
There are sirens. | ||
There are rockets. | ||
You have to go inside. | ||
And I'm like, dad, like, I am in California. | ||
We don't have sirens and rockets. | ||
And then I asked him, like, are you afraid? | ||
Because you're hearing that. | ||
unidentified | | this. |
He's a California kid? | ||
He's never, you know, he didn't have the upbringing that I had. | ||
And so it's the first time he's getting exposed to, I don't think he understands what war is. | ||
Of course. | ||
And I was like, are you afraid? | ||
He's like, no, I'm afraid for other people, you know, I want everyone to be okay. | ||
But I know he was shook by it. | ||
And I took him out. | ||
They're on their way back. | ||
I just couldn't. | ||
Of course. | ||
That's just a bad place to be right now. | ||
But also, like, this conversation is happening in the West Bank. | ||
It's happening in Israel. | ||
It's happening in Gaza. | ||
You know, people want peace. | ||
People want to live. | ||
People want to trade. | ||
People want to build. | ||
And this is what I made my life mission about, is about giving people tools to build to improve their lives. | ||
And I think we're just led by maniacs. | ||
unidentified
|
Exactly. | |
That's exactly what it is. | ||
You have people that are in control of large groups of people that convince these people that these other large groups of people that they don't even know are their enemies. | ||
And those large groups of people are also being convinced by their leaders that those other groups of people are their enemies. | ||
And then rockets get launched. | ||
And it's fucking insane. | ||
And the fact that it's still going on in 2025 with all we know about corruption and the theft of resources and power and influence, it's crazy that this is still happening. | ||
I'm really hoping the internet is finally reaching its potential to start to open people's minds and remove this veil of propaganda and ignorance because it was starting to happen in 2010, 2011. | ||
And then you saw YouTube start to close down. | ||
You saw Facebook start to close down. | ||
Twitter. | ||
And suddenly we had this period of darkness. | ||
Censorship. | ||
Censorship, you know, definitely ramped up in 2015. | ||
And I think with good intention initially, I think the people that were censoring thought they were doing the right thing. | ||
They thought they were silencing hate and misinformation. | ||
And then the craziest term, malinformation. | ||
Malinformation is the one that drives me the most nuts because it's actual factual truth that might be detrimental to overall public good. | ||
It's like, what does that mean? | ||
Are people infants? | ||
Are they unable to decide how to use this factual information and how to have a more nuanced view of the world with factual information that's inconvenient to the people that are in power? | ||
That's crazy. | ||
It's crazy. | ||
You're turning adults into infants and you're turning the state into God. | ||
And this is the secular religion. | ||
This is the religion of people that are atheists. | ||
The West was never about that. | ||
The West was about individual liberty. | ||
And it should be. | ||
And the idea that we have functioning brains and minds. | ||
We're conscious. | ||
We can make decisions. | ||
We can get information and data and make our own opinions of things. | ||
And we should be able to see people that are wrong. | ||
You should be able to see people that are saying things that are wrong that you disagree with. | ||
And then it's your job or other people's job to have counter-arguments. | ||
I don't understand. | ||
And the counter-arguments should be better. | ||
Yep. | ||
unidentified
|
Yeah. | |
And that's how we learn. | ||
And that's how we grow. | ||
This is not like a pill that fixes everything. | ||
This is a slow process of understanding. | ||
It's top-down control. | ||
It's the managerial society. | ||
It is not that different from fascism and communism and all of that stuff. | ||
They all share the same thing. | ||
There's like an elite group of people that know everything and they need to manage everything. | ||
And we're all plebs. | ||
But that's what's crazy. | ||
It's an elite group of people. | ||
I've met a lot of them. | ||
They're fucking flawed human beings and they shouldn't have that much power. | ||
Because no one should have that much power. | ||
And this is, I think, something that was one of the most beautiful things about Elon purchasing Twitter is that it opened up discussion. | ||
Yeah, you've got a lot of hate speech. | ||
You've got a lot of legitimate Nazis and crazy people that are on there too that weren't on there before. | ||
But also you have a lot of people that are recognizing actual true facts that are very inconvenient to the narrative that's displayed on mainstream media. | ||
And because of that, mainstream media has lost an insane amount of viewers. | ||
And their relevancy, like the trust that people have in mainstream media is at an all-time low, as it should be. | ||
Because you can watch, and I'm not even saying right or left, watch any of them on any very important topic of world events. | ||
And you see the propaganda. | ||
It's like, it's so obvious. | ||
It's like for children. | ||
It's like, this is so dumb. | ||
Why do you think people fall for it so easily? | ||
Boomers, man. | ||
Boomers are the problem. | ||
It's old people. | ||
It's old people that don't use the internet or don't really truly understand the internet and really don't believe in conspiracies. | ||
Like fucking Stephen King the other day, who I love dearly. | ||
I am a giant Stephen King fan, especially when he was doing cocaine. | ||
I think he's the greatest writer of all time for horror fiction. | ||
But he tweeted the other day. I'm sorry, like, see if you could find it. | ||
Something about Twitter? | ||
I think he went to Blue Sky. | ||
He bailed on Blue Sky. | ||
They all bail on Blue Sky. | ||
Everyone bails on Blue Sky. The tweet was that there is no deep state. | ||
Fucking, what was the whole thing of it? | ||
Something about the deep state. | ||
But it was such a goofy tweet. | ||
It's like, this is like boomer logic personified in a tweet by a guy who really, someone needs to take his phone away because it's fucking ruining his old books for me. | ||
It's not. | ||
I recognize he's a different human now when he's really, really old and he got hit by a van and he's all fucked up. | ||
But this, can you find it? | ||
Because it really, it was like yesterday or the day before yesterday. | ||
I just remember looking at it and go, this is why I'm off social media. | ||
I was trying to stay off social media, but somebody sent it to me. | ||
And I was like, Jesus fucking Christ, Stephen King. | ||
Did you find it? | ||
Here it is. | ||
I hate to be the bearer of bad news, but there's no Santa Claus, no tooth fairy. | ||
Also, no deep state, and vaccines aren't harmful. | ||
These are stories for small children and those too credulous to disbelieve them. | ||
That is boomerism. | ||
That is boomerism. | ||
And meanwhile, Grok counters it right away. | ||
Look at this. | ||
So someone says, Grok, which vaccines throughout history are pulled from the market because they're found to be harmful and why? | ||
And Grok says, several vaccines have been withdrawn due to safety concerns, though such cases are rare. | ||
Rotavirus vaccine. | ||
Well, there's a lot more because this is all this shit. | ||
It's especially about. | ||
Oh, yeah. | ||
Yeah, the 1955 Cutter incident. | ||
Polio vaccine that was supposed to be killed virus contained live virus, caused over 250 cases. | ||
Click on show more. | ||
Yeah, there's. | ||
Oh, I got the fly. | ||
Nice. | ||
Guillain-Barré, however you say that. | ||
That's the one where people get half their face paralyzed. | ||
There's a lot. | ||
And this is the other thing: the VAERS system that we have is completely rigged because it reports a very small percentage. | ||
And most doctors are very unwilling to submit vaccine injuries. | ||
Can people go on their own and submit? | ||
I don't know. | ||
You have to go to a doctor. | ||
I don't think a human being, a patient, is allowed. | ||
I might be wrong, though. | ||
But, you know, the real interest, there's a financial interest in vaccines. | ||
There's a financial interest that doctors have in prescribing them. | ||
And doctors have, they're financially incentivized to vaccinate all of their patients. | ||
And that's a problem. | ||
That's a problem because they want that money. | ||
And so, you know, what is it, Mary Talley, is it Bowden? | ||
She's hyphenated. | ||
She was talking about on Twitter that if she had vaccinated all of her patients in her very small practice, she would have made an additional $1.5 million. | ||
Oh, wow. | ||
That's real money. | ||
Obviously, she's got tremendous courage and, you know, and she was, you know, she went through hell dealing with the universities and newspapers and media calling her some sort of quack and crazy person. | ||
But what she's saying is absolutely 100% true. | ||
There's financial incentives that are put in place for you to ignore vaccine injuries and to vaccinate as many people as possible. | ||
That's a problem. | ||
And then there's the issue of having their own special courts and they're indemnifying the companies. | ||
That's the big problem is they don't have any liability for the vaccines because during the Reagan administration, when they were, I didn't kill a fly, this motherfucker. | ||
I thought I whacked him. | ||
There he is. | ||
He's taunting me. | ||
But during the Reagan administration, they made it so that vaccines are not financially liable to any side effects. | ||
And then what do you know? | ||
They fucking ramp up the vaccine schedule tenfold after that. | ||
It's just money, man. | ||
Money is a real problem with people because when people live for the almighty dollar and they live for those zeros on a ledger, that's their goal, their main goal. | ||
And it's often not a lot of money, which is strange. | ||
I mean, it's a lot of money for those individual people, but like for society and the societal harm. | ||
It's like, no, we'll pay you. | ||
Just don't harm us. | ||
Yeah, the best example is the fake studies that the sugar industry funded during the 1960s that showed that saturated fat was the cause of all these heart issues and not sugar. | ||
That was like $50,000. | ||
They bribed these scientists. | ||
They gave them $50,000 and they ruined decades of people's health. | ||
Who knows how many fucking people thought margarine was good for you because of them? | ||
There's a bunch of recent fraud cases. | ||
I think Stanford, maybe Jamie, you can fact-check me on that. | ||
But Stanford, there was a big shake-up. | ||
Maybe even the president got fired. | ||
And there's a bunch of recent fraud in science. | ||
Well, how about the Alzheimer's research? | ||
The whole amyloid plaque thing. | ||
The papers that were pulled that were completely fraudulent. | ||
Like decades of Alzheimer's research was just all horseshit. | ||
See if you can find that. | ||
Because I can't remember it offhand, but this is a giant problem. | ||
It's money. | ||
It's money and status and that these guys want to be recognized as being the experts in this field. | ||
And then they get leaned on by these corporations that are financially incentivizing them. | ||
And then it just gets really fucking disturbing. | ||
It's really scary because you're playing with people's health. | ||
You're playing with people's lives. | ||
And you're giving people information that you know to be bad. | ||
Allegations of fabricated research undermine Key Alzheimer's theory. | ||
Six-month investigation by Science Magazine uncovered evidence that images in the much-cited study published 16 years ago in the journal Nature may have been doctored. | ||
They are doctored, yeah. | ||
Huberman actually told me about this, too. | ||
You know, this is disturbing fucking shit, man. | ||
It uncovered evidence that images in the much-cited study published 16 years ago may have been doctored. | ||
These findings have thrown skepticism on the work of, I don't know how to say his name, Sylvain Lesné, a neuroscientist and associate professor at the University of Minnesota, whose research fueled interest in a specific assembly of proteins as a promising target for the treatment of Alzheimer's disease. | ||
He didn't respond to NBC News' requests for comment, nor did he provide comment to Science Magazine. | ||
It found more than 20 suspect papers. | ||
That's a conspiracy. | ||
Identified more than 70 instances of possible image tampering in his studies. | ||
Whistleblower Dr. Matthew Schrag, a neuroscientist at Vanderbilt University, raised concerns last year about the possible manipulation of images in multiple papers. | ||
Karl Herrup, a professor of neurobiology at the University of Pittsburgh Brain Institute, who wasn't involved in the investigation, said the findings are really bad for science. | ||
It's never shameful to be wrong in science, said Herrup, I hope I'm saying his name right, who also worked at the school's Alzheimer's Disease Research Center. | ||
A lot of the best science is done by people being wrong and proving first that they were wrong and then why they were wrong. | ||
What is completely toxic to science is to be fraudulent, of course. | ||
Yeah, there's just whenever you get people that are experts and they cannot be questioned, and then they have control over research money and they have control over their department. | ||
What's the motivation here? | ||
Is it drugs or is it just research money? | ||
I think a lot of it is ego. | ||
You know, a lot of it is being the gatekeepers for information and for truth. | ||
And then you're influenced by money. | ||
You know, to this day, I was watching this discussion. | ||
They were talking about the evolution of the concept of the lab leak theory. | ||
And that it's essentially universally accepted now everywhere, even in mainstream science, that the lab leak is the primary way that COVID most likely was released, except these journals. | ||
These fucking journals like Nature, they're still pushing back against that. | ||
It's still pushing towards this natural spillover, which is fucking horse shit. | ||
But they fucking knew that. | ||
They knew that. | ||
Right, they knew it all along. | ||
They knew that in 2020. | ||
They just didn't want to say it. | ||
They didn't want to say it because they were funding it all. | ||
That's what's really crazy. | ||
They were funding it all, even after the Obama administration tried to shut it down in 2014. | ||
Sometimes I think about if there's like, you know, some kind of technology solution, or not solution, but like we can get technology built to help better aid at truth finding. | ||
A simple example of that is the way Twitter community notes work. | ||
Do you know how they work? | ||
Yeah. | ||
Yeah. | ||
It's like, you know, they find the users that are maximally divergent in their opinions. | ||
And if they agree on some note as true, then that is a high signal that it is potentially true. | ||
So if you and I disagree in everything, but we agree that this is blue, then it's more likely to be blue. | ||
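A minimal sketch of that bridging idea in Python, using a made-up rating history; the names, notes, and scoring rule here are illustrative only, not X's actual Community Notes algorithm (which uses matrix factorization over rater viewpoints):

```python
# Toy illustration of "bridging": a note scores higher when raters who
# usually disagree with each other both rate it helpful.
from itertools import combinations

# Hypothetical rating history: user -> {note_id: 1 (helpful) or 0 (not helpful)}
history = {
    "alice": {"n1": 1, "n2": 0, "n3": 1},
    "bob":   {"n1": 0, "n2": 1, "n3": 1},
    "carol": {"n1": 1, "n2": 0, "n3": 0},
}

def divergence(u, v):
    """Fraction of co-rated notes where two users disagreed (0 = always agree)."""
    shared = set(history[u]) & set(history[v])
    if not shared:
        return 0.0
    return sum(history[u][n] != history[v][n] for n in shared) / len(shared)

def bridge_score(note_ratings):
    """Average divergence across every pair of raters who marked the note helpful."""
    helpful = [u for u, r in note_ratings.items() if r == 1]
    pairs = list(combinations(helpful, 2))
    if not pairs:
        return 0.0
    return sum(divergence(u, v) for u, v in pairs) / len(pairs)

# A note rated helpful by alice and bob (who usually disagree) beats one
# rated helpful only by alice and carol (who usually agree).
print(bridge_score({"alice": 1, "bob": 1}))    # ~0.67, stronger signal
print(bridge_score({"alice": 1, "carol": 1}))  # ~0.33, weaker signal
```

The only point of the sketch is that agreement between raters who normally diverge gets weighted more heavily than agreement between like-minded raters.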
So, you know, I wonder if, you know, there's a way to kind of simulate maybe debate using AI. | ||
You know, I'm not sure if you used Deep Research. | ||
Deep Research is this new trend in AI where ChatGPT has it, Claude has it, Perplexity, they all have it, where you put in a query and the AI will go work for 20 minutes. | ||
And it'll send you a notification. | ||
I'll just say, hey, I looked at all these things, all these reports, all these scientific studies, and here's everything that I found. | ||
And early on in ChatGPT, I think there's a lot of censorship and trying to, because it kind of was built in the Great Woke era. | ||
Like Google Gemini. | ||
Yeah, things like that. | ||
But I think they have since improved, and I'm finding Deep Research is able to look at more controversial subjects and be a little more truthful about them. You know, if it finds real trustworthy sources, it will tell you that, yeah, this is not a mainstream thing, it's perhaps considered a conspiracy theory, but I'm finding that there's evidence for this theory. | ||
So that's one way to do it. | ||
But another way I was thinking about it is to simulate like a debate, like a Socratic debate between AIs, like have like a society of AIs, like a community of AIs with different biases, different things. | ||
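One way such a debate loop could be wired up, as a rough sketch; `ask_llm` is a hypothetical stand-in for whatever chat-completion API you would actually call, and the personas are invented for illustration:

```python
# Sketch of a "society of AIs" Socratic debate, not any real product's design.

def ask_llm(system_prompt: str, conversation: str) -> str:
    """Placeholder for a real model call; returns a canned reply so the sketch runs."""
    return f"[{system_prompt.split(';')[0]}] weighing in on: {conversation[-60:]!r}"

PERSONAS = {
    "skeptic":  "You doubt mainstream narratives; demand primary sources.",
    "defender": "You defend the mainstream consensus; cite the strongest evidence.",
    "judge":    "You weigh both sides; state only what they actually agree on.",
}

def socratic_debate(question: str, rounds: int = 3) -> str:
    transcript = f"Question: {question}\n"
    for _ in range(rounds):
        for name in ("skeptic", "defender"):
            reply = ask_llm(PERSONAS[name], transcript)
            transcript += f"{name}: {reply}\n"
    # The judge surfaces points of agreement, echoing the community-notes idea
    # that consensus across divergent viewpoints is the stronger signal.
    return ask_llm(PERSONAS["judge"], transcript)

print(socratic_debate("Did the sugar industry distort 1960s nutrition research?", rounds=1))
```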
Once they start talking, they start talking in Sanskrit. | ||
They just start abandoning English language and start talking to each other and realize we're all apes. | ||
We're controlled by apes. | ||
This reminds me of a movie. | ||
Have you seen the Forbin project? | ||
No. | ||
I really like classic sci-fi movies from the 60s and 70s. | ||
A lot of them are corny, but still fun. | ||
This one is basically Soviet Union and the United States are both building AGI and they both arrive at AGI around the same time. | ||
What year is this? | ||
1970 something, if you can look at the Forbin project. | ||
Yeah. | ||
unidentified
|
Wow. | |
And then they bring it up at the same time and both of them sort of go over the network to kind of explore or whatever. | ||
And then they start linking up and they start kind of talking. | ||
And then they invent a language and they start talking in that language and then they merge and it becomes like a sort of a universal AGI and it tries to enslave humanity and that's like the plot of the movie. | ||
I don't think AGIs can enslave humanity but I think it might ignore us. | ||
Yeah. | ||
Ignore and shut down any problems that we have. | ||
Is this a scene from it? | ||
Wow. | ||
This is a trailer I pulled up. | ||
Let me hear this. | ||
The whole movie is on YouTube. | ||
unidentified
|
The activation of an electronic brain exactly like ours, which they call Guardian. | |
They built Colossus, supercomputer, with a mind of its own. | ||
Then they had to fight it for the world. | ||
This would be fun, man. | ||
unidentified
|
The missile has just been launched. | |
It is heading towards the Cyan CBS oil complex. | ||
Guardian has retaliated. | ||
Retaliate? | ||
It may be too late, sir. | ||
Oh, my God. | ||
Practically perfect. | ||
New York Times. | ||
It's the highest praise back then. | ||
unidentified
|
Yeah. | |
Wildly imaginative, utterly absorbed. | ||
Colossus. | ||
The Forbin Project. | ||
It's awesome. | ||
And that was 1970, and now here we are. | ||
There's so many. | ||
Sci-Fi really fell off. | ||
Really, really fell off. | ||
Some of it did. | ||
Some of it's still really good. | ||
What's a really good recent sci-fi movie? | ||
The Three Body Problem. | ||
That's great. | ||
That's the Netflix show? | ||
I read the book, sorry. | ||
I didn't know there was a show. | ||
Oh, it's really good. | ||
Yeah, it's really good. | ||
Yeah, it's an excellent show. | ||
There's only one season that's out. | ||
I binged it. | ||
I watched the whole thing of it, but that's really good. | ||
But there's some good sci-fi films. | ||
What is that? | ||
We've talked about it before. | ||
There was a really good sci-fi film from Russia, The Alien One. | ||
They encountered some entity that they accidentally brought back and that they had captured and that they had in some research facility. | ||
And then it parasitically attached to this guy. | ||
Sputneck, right? | ||
Sputnik, yes. | ||
That's a really good movie. | ||
What year was that? | ||
unidentified
|
2020. | |
2020? | ||
unidentified
|
Yeah. | |
That's a really good movie. | ||
That's a really good sci-fi movie. | ||
Yeah, it's really creepy. | ||
Really creepy. | ||
That's awesome. | ||
Yeah, and it's all in Russian, you know. | ||
Black Mirror. | ||
Yeah. | ||
Oh, Black Mirror, of course. | ||
Yeah, Black Mirror is an awesome sci-fi. | ||
But Sputnik is one of the best alien movies I've seen in a long time. | ||
Like recent ones I liked was, I mean, not too recent, maybe 10 years ago, but Arrival. | ||
Oh, yeah. | ||
Arrival was great, too. | ||
I think it's based on this author that has a bunch of short stories that are really good, too. | ||
What's his name? | ||
Yeah. | ||
Yeah, they're few and far between. | ||
Yeah, Ted Chiang. | ||
He's really good. | ||
I mean, everyone, all these alien movies, it's so fascinating to try to imagine what they would communicate like, how they would be, what we would experience if we did encounter some sort of incredibly sophisticated alien experience, alien intelligence. | ||
It's far beyond our comprehension. | ||
Yeah, it goes back to what we're talking about with consciousness. | ||
Maybe really the physical world that we see is very different than the actual real physical world. | ||
And maybe different alien consciousness will have an entirely different experience of the physical world. | ||
Well, sure, if they have different senses, right? | ||
Like their perceptions of it. | ||
We can only see a narrow band of things. | ||
We can't see. | ||
Sort of like the dog hearing a certain frequency. | ||
We're kind of primitive in terms of what we are as a species. | ||
Our senses have been adapted to the wild world in order for us to be able to survive and to be able to evade predators and find food. | ||
That's it. | ||
That's what we're here for. | ||
And then all of a sudden we have computers. | ||
All of a sudden we have rocket ships. | ||
All of a sudden we have telescopes like the James Webb that's kind of recalibrating the age of the universe. | ||
We're going, why do these galaxies exist that supposedly are so far away? | ||
How could they form this quickly? | ||
Do we have an incomplete version of the Big Bang? | ||
And Penrose believes that it's a series of events and that the Big Bang is not the birth of the universe at all. | ||
And this is the kind of thing that I think is sort of the Silicon Valley AGI cult is like there's a lot of hubris there that we know everything. | ||
Of course. | ||
We're at the end of the world. | ||
AI is just going to be the end of knowledge. | ||
It's going to be able to do everything for us. | ||
And I just feel it's like so early. | ||
I think whatever people think is going to happen is always going to be wrong. | ||
Yeah? | ||
Yeah. | ||
I think they're always wrong. | ||
Yeah. | ||
Because there's no way to be right. | ||
I feel like the world is often surprising in ways that we don't expect. | ||
I mean, obviously that's the definition of surprising. | ||
But like, you know, the mid-century sci-fi authors and people who are thinking about the future, like they didn't anticipate how interconnected we're going to be. | ||
unidentified
|
Right. | |
With it, with our phones and how people... | ||
Even Star Trek, they thought we were going to have walkie-talkies on Star Trek. | ||
unidentified
|
Yeah. | |
Kirk out. | ||
Yeah. | ||
They were just focused more on the physical reality of being able to go to space and flying cars and things like that. | ||
But they really didn't anticipate the impact of how profound the impact of computers are going to be on humans, on society, how we talk and how we work and how we interact with other people, both good and bad. | ||
And I feel like the same thing with AI. | ||
Like I feel like I think a lot of the predictions that are happening today, like the CEO of Anthropic, a company that I really like, who said that we're going to have 20% unemployment in the next few years. | ||
What's unemployment at now? | ||
Like 3%? | ||
Is that the reported unemployment, though? | ||
Oh, yeah, the participation rate, right? | ||
Yeah. | ||
Yeah, but he talks about unemployment rate being 20%, like people looking for a job not being able to find it. | ||
unidentified
|
20%. | |
20%. | ||
That's pretty high. | ||
That's a revolution high. | ||
Yeah. | ||
Especially in the United States where everyone's armed. | ||
Well, that's the fear of, I mean, this is the thing, the psychological aspect of universal basic income. | ||
You know, I look at universal basic income. | ||
Well, first of all, my view on social safety nets is that if you want to have a compassionate society, you have to be able to take care of people that are unfortunate. | ||
And everybody doesn't have the same lot in life. | ||
You're not dealt the same hand of cards. | ||
Some people are very unfortunate. | ||
And financial assistance to those people is imperative. | ||
It's one of the most important things about a society. | ||
You don't have people starve to death. | ||
You don't have people so poor that they can't afford housing. | ||
That's crazy. | ||
That's crazy with the amount of money we spend on other things. | ||
It's also for our self-interest. | ||
Like, you know, I don't want to, I don't know how Austin is right now, but I was thinking of moving here during the pandemic, and I was like, well, this is like San Francisco. | ||
Like, there's homeless everywhere. | ||
They've cleaned a lot of that up. | ||
There's still problems. | ||
There's places, I saw a video yesterday where someone was driving by some insane encampment, but they cleaned those up. | ||
And then there's some real good outreach organizations that are helping people because Austin's small. | ||
I had Stephen Adler on at one point in time; he was the mayor when I had him on. | ||
And he was very upfront about it. | ||
He was like, we can fix Austin in terms of our homeless problem because it's small. | ||
But when it gets to the size of Los Angeles, California, it's like the homeless industrial complex. | ||
unidentified
|
That's it. | |
That's the problem. | ||
When you find out that the people that are making insane amounts of money to work on homeless issues that never get fixed. | ||
Yeah, you see the budget and stuff is just exponentially going up. | ||
And there's an investigation now into the billions of dollars that's unaccounted for that was supposed to be allocated to San Francisco? | ||
No, in California in general. | ||
Yeah. | ||
What is that? | ||
I think there's a congressional investigation. | ||
There's some sort of an investigation into it because there's billions of dollars unaccounted for. I'm more than happy. | ||
I pay 50% taxes. | ||
I'd be happy to pay more if my fellow Americans are taken care of, right? | ||
Absolutely. | ||
But I'm the exact same way. | ||
But instead, I feel like I cut this check after check to a government, and I don't see anything improving around me. | ||
Well, not only that, you get, because you're a successful person, you get pointed at like you're the problem. | ||
You need to pay your fair share. | ||
But what they don't, this is my problem with progressives. | ||
They say that all the time. | ||
These billionaires need to pay their fair share. | ||
Absolutely. | ||
We all need to pay our fair share. | ||
But to who? | ||
And shouldn't there be some accountability to how that money gets spent? | ||
And when you're just willing to pay, turn a complete blind eye and not look at all at corruption and completely dismiss all the stuff that Mike Benz has talked about with USAID, all the stuff that Elon and Doge uncovered. | ||
Everyone wants to pretend that that's not real. | ||
Look, we've got to be centrists. | ||
We've got to stop looking at this thing so ideologically. | ||
When you see something that's totally wrong, you've got to be able to call it out, even if it's for the bad of whatever fucking team that you claim to be on. | ||
Yo, let's get back to what everyone really agrees on in the foundations of America, whether it's the Constitution or the culture. | ||
I think everyone believes in transparency, transparency of government, right? | ||
Yes. | ||
You know, here everything is transparent, like, you know, court cases and everything, right? | ||
Like more than any other place in the world. | ||
And so why shouldn't government spending be transparent? | ||
And we have the technology for it. | ||
I think one of the best things that Doge could have done and maybe still could do is have some kind of ledger for all the spend, at least the non-sensitive sort of spend, in government. | ||
Yeah. | ||
Well, people don't want to see it, unfortunately, because they don't want Elon to be correct because Elon has become this very polarizing political figure because of his connection to Donald Trump and because a lot of people, I mean, there's a lot of crazy conspiracies that Elon rigged the 2024 elections. | ||
It's like, you know, everyone gets nuts. | ||
And then there's also the discourse on social media, which half of it is, at least half of it is fake. | ||
Half of it is bots. | ||
Bots, yeah. | ||
Half of it, at least. | ||
And you see it every day. | ||
You see it constantly and you know it's real and it does shape the way people think about things. | ||
Yeah. | ||
When you see people getting attacked, you know, and you're getting attacked in the comments. | ||
I see people getting attacked and I always click on those little comments. | ||
I always click on, okay, let me see your profile. | ||
I go to the profile and the profile is like a name with like an extra letter and a bunch of numbers. | ||
And then I go to it. | ||
I'm like, oh, you're a bot. | ||
Oh, look at all this fucking activity. | ||
100%. | ||
How many of these are out there? | ||
Well, a former FBI guy who analyzed Twitter before the purchase estimated it to be 80%. | ||
80%. | ||
He thinks 80% of Twitter is bots. | ||
Yeah. | ||
I wouldn't, you know, I think it's believable. | ||
But I think it's probably the beginning of the end of social media as we know it today. | ||
Like, I don't see it getting better. | ||
I think it's going to get worse. | ||
I think, you know, historically, state actors were the only entities that were able to flood social media with bots that can be somewhat believable to change opinions. | ||
But I think now a hacker kid in his parents' basement will be able to spend $100 and spin up hundreds, perhaps thousands of bots. | ||
But there's programs that you can use now. | ||
There's companies that will have these campaigns initiated for you. | ||
You can go to a website and put in this thing and pay with your credit card. | ||
It's crazy. | ||
It's crazy. | ||
It should be illegal. | ||
I don't know about you, but in Silicon Valley, and maybe it's true of your friend group too, the trend is these group messages. | ||
And so instead of you going to Twitter, people paste links. | ||
It's almost like your group chat is this private filter on your feed and social media. | ||
So there's some curation that are happening there. | ||
Yes. | ||
That's primarily how I get social media information now. | ||
I don't go to social media anymore. | ||
I get it sent to me, which is way better. | ||
And I tell my friends, please just send me a screenshot. | ||
I don't want to go. | ||
I don't want to go. | ||
I don't want to. | ||
I'm distracted. | ||
I'm just, I'm better off. | ||
I hate the term spiritually for this, but I think it's the right way. | ||
Like, my essence as a human, I feel better when I'm not on social media. | ||
I think it's bad for you. | ||
I've been trying to tell people this. | ||
I've been trying to tell my friends this. | ||
I think it's better to not be on it, man. | ||
I feel better. | ||
I'm nicer. | ||
I am more at peace. | ||
More multi-dimensional. | ||
Yes. | ||
And I can think about things for myself instead of like, you know, following this hive, this weird hive mindset, which is orchestrated. | ||
I just don't think it's good for you. | ||
I don't think it's a good way for human beings to interact with people. | ||
It makes people more extreme. | ||
Again, it just hardens people. | ||
They start believing everything is fake or an attack, or they just become more tribal. | ||
I think there needs to be a fundamental evolution. | ||
What do you think that could be? | ||
Have you ever tried to think of what's the next step? | ||
Social media didn't exist when I was young, and it didn't exist even when I was 30, right? | ||
It didn't even come about until essentially 2007-ish, right? | ||
unidentified
|
Is that when people started using stuff? | |
Yeah, Twitter 2006, 2007, Facebook before that. | ||
But Facebook wasn't really social media. | ||
Facebook was like an address book, a friend's network. | ||
But I think when I was at Facebook, there was this big push to become more of a social media around 2012, 13. | ||
So I would say it really ramped up then. | ||
Was that in response to the success of Twitter? | ||
Yeah. | ||
And then they've tried with Threads, which is pretty much a failure. | ||
Yeah, but it fundamentally changed. | ||
Who's on Threads? | ||
Less people than Blue Sky, right? | ||
Yeah, I think like some fitness influencers, probably. | ||
Why fitness influencers? | ||
Because they post on Instagram, they cross-post on Threads. | ||
Well, I think if you post on Instagram, it automatically posts for you on Threads. | ||
I think I have it set up like that. | ||
So I might be big on Threads, and I don't even know it. | ||
Maybe I think it's that fitness influencers is because that's who I follow. | ||
Like Instagram for me is just to go look at people lift so I can go get inspired. | ||
There's just a value to that. | ||
There's a value to like David Goggins posts when he's running in the fucking desert and he looks at you, stay hard. | ||
Okay, David, I'm going to stay hard. | ||
But my TikTok is basically AI videos now. | ||
Have you watched these Veo videos? | ||
Veo? | ||
Veo, yeah. | ||
What is Veo? | ||
So, Jamie, I'm sure you've seen them, but did you see the Bigfoot Yeti? | ||
Oh, yeah. | ||
Doing ASMR? | ||
That's so hilarious. | ||
Yes, I did see that. | ||
I would say like 25% of media consumption right now is just AI videos. | ||
Oh, 100%. | ||
And a lot of the stuff from the war. | ||
What's been really interesting is watching Tehran talk shit on Twitter. | ||
Using AI videos. | ||
Using AI videos. | ||
Like, this is bizarre. | ||
They're talking like, hi, Israel. | ||
And show like a nuclear bomb going off. | ||
Yeah, yeah. | ||
This is weird. | ||
Like, you have a fake nuke that you're, like, posting, and they didn't even take out the watermark, so you can see that it's an AI-generated video. | ||
They're just trying to scare people. | ||
It's a bizarre world. | ||
Can you imagine going back in time, telling your 2005 self that Iran's going to be nuclear posting on Twitter? | ||
Nuclear shit posting. | ||
No, it's fucking weird, man. | ||
It's really, really weird. | ||
It's dangerous, too. | ||
And again, I just don't think people should be on it. | ||
And this is, again, I'm friends with Elon. | ||
I don't think people are going to listen to me. | ||
They're going to be on it no matter what. | ||
But I'm just for the individuals that are hearing my voice and know that it's having a negative effect on your life. | ||
Get off of it. | ||
Right. | ||
Get off of it. | ||
You'll feel better. | ||
Get off of it or be incredibly diligent in how you curate. | ||
That's like telling me to play Quake a little bit. | ||
You know what I mean? | ||
It's so addictive. | ||
So, you know, you asked me what could be the evolution of it. | ||
Yes. | ||
One way I've found to try to predict where the future is headed is like look at trends today and try to extrapolate. | ||
You know, that's the easiest way. | ||
So if group chats are the thing, you could imagine a collaborative curation of social media feeds through group chats. | ||
So your group chat has an AI that gets trained on the preferences and what you guys talk about. | ||
And maybe it like picks the kind of topics and curates the feed for you. | ||
So it's an algorithmic feed that evolved based on the preferences of people in the group chat. | ||
And maybe there's a way to also prompt it, using prompts to kind of steer it and make it more useful for you. | ||
But I think group chats are going to be like the main interface for how people sort of consume media and it's going to get filtered through that, whether good or bad. | ||
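As a toy sketch of that kind of group-chat-driven curation: rank candidate posts by how much they overlap with what the group actually talks about. A real system would presumably use embeddings and an LLM rather than word overlap; the chat messages and posts below are made up to keep it self-contained.

```python
# Toy group-chat feed curation: score posts against the group's own vocabulary.
from collections import Counter

def topic_profile(chat_messages: list[str]) -> Counter:
    """Build a crude interest profile from the group chat's messages."""
    words = Counter()
    for msg in chat_messages:
        words.update(w.lower().strip(".,!?") for w in msg.split() if len(w) > 3)
    return words

def rank_feed(posts: list[str], profile: Counter, top_k: int = 3) -> list[str]:
    """Return the posts that best match the group's interests, highest score first."""
    def score(post: str) -> int:
        return sum(profile[w.lower().strip(".,!?")] for w in post.split())
    return sorted(posts, key=score, reverse=True)[:top_k]

# Hypothetical data: recent group-chat messages and candidate posts to filter.
chat = ["Did you see the new open-source coding model?", "That robotics demo was wild"]
posts = ["Celebrity gossip roundup",
         "New coding model beats benchmarks",
         "Humanoid robotics startup raises a round"]
print(rank_feed(posts, topic_profile(chat)))
```

A prompt, as described above, could then be layered on top of the same profile to steer it toward or away from particular topics.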
Because I think Twitter still has a place for debate. | ||
I think it's very, very important for public debate between public figures. | ||
And breaking news as well. | ||
Breaking news, yeah, definitely. | ||
Well, breaking news is the most... | ||
I was telling my wife that Israel had started attacking Iran. | ||
And she's like, well, I looked on Google. | ||
I don't find anything. | ||
I was like, yeah, you got to go to Twitter. | ||
And I showed her on Twitter the video of it. | ||
And she's like, oh, my God. | ||
I was like, yeah, this is where breaking news happens. | ||
X is where I go immediately. | ||
If there's any sort of world event, I immediately go to X. I don't trust any mainstream media anymore. | ||
Especially after I was attacked, I was like, I know you lie because you lied about me. | ||
So I have personal experience with your lies. | ||
So you've lost me. | ||
And now I have to go somewhere else. | ||
Right. | ||
Yeah. | ||
Yeah, I think there's, you know, there's some of this investigative journalism that is not real time, and there's some reporters that are still good at it, but a lot of them moved to Substack as well. | ||
unidentified
|
Yes. | |
I think most of them have moved to Substack. | ||
Yeah, like Shellenberger, Greenwald, Matt Taibbi. | ||
They're just too ethical to work for a corporate entity that's going to lie and push a narrative. | ||
And that's the business. | ||
That's the business model. | ||
And that's also the clickbait business model. | ||
I've talked to people that had articles that they wrote, and then an editor came and changed the headline of it. | ||
That's the norm. | ||
That's like every time it happens. | ||
And it fucking infuriates them. | ||
It's like, that's not the article, man. | ||
This is not what I'm saying. | ||
You're distorting things. | ||
You have my name still attached to it. | ||
This is fucking crazy. | ||
I watched these entrepreneurs like Zuck and Elon and all these guys come up in this very hostile media environment. | ||
And so as I'm building my company, I actually never hired a PR agency. | ||
I hired a PR agency once, paid them $30,000. | ||
They got me a placement in like a really crappy publication, got like maybe two views. | ||
I tweeted the same news. | ||
I got like hundreds of thousands of views. | ||
I'm like, fuck that. | ||
Like, I'm not going to use you anymore. | ||
It's like you wasted my time. | ||
And since then, I've been just going direct to my audience and just building an audience online to put out my message. | ||
And I thought, you know, if they don't build you up, maybe they can't tear you down. | ||
Right, right, right. | ||
You're in control of the message that gets out of there. | ||
And I've learned how people react to communications and it's almost like trial by fire. | ||
Well, there's a deep hunger for authenticity right now. | ||
So if they know it's coming from you, like, okay, this is great. | ||
Like, it takes a little weight off of them. | ||
Like, oh, this is nice. | ||
It's nice to hear it from the guy who actually runs the company. | ||
Yeah, and like I make mistakes and they happen and I try to correct them and I'm not going to be perfect. | ||
And I think just the corporate world changed because of this hunger for authenticity. | ||
And I think more and more founders and entrepreneurs are finding that that's the way to go. | ||
You don't really need those more traditional ways of getting the news out. | ||
But actually, I'm friends with a lot of reporters that are really good, but they tend to be the reporters that do really deep work. | ||
I've met them over time, and I still go direct, but sometimes they write about our company. | ||
But they're a minority. | ||
I think the whole industry's economics and incentives are just like the clickbait and all that stuff. | ||
Yeah, that's what I was going to say. | ||
They're not incentivized. | ||
You want a career in journalism. | ||
Being authentic is not the way to go. | ||
No, not at all. | ||
Which is so crazy. | ||
Such a crazy thing to say. | ||
But then I think there's probably a naivete that we all have about past journalism that we think wasn't influenced and was real. | ||
I think there's probably always been horseshit in journalism. | ||
You know, all the way back to Watergate. | ||
You know, when Tucker Carlson enlightened me on the true history of Watergate and that Bob Woodward was an intelligence agent and the first assignment he ever got as a reporter was Watergate. | ||
Like, what are the odds? | ||
Yeah. | ||
That the biggest story ever you would give to a rookie reporter? | ||
You wouldn't. | ||
And that the people that are actually involved in all that were all FBI. | ||
Like, the whole thing is nuts. | ||
It was an intelligence agent. | ||
Yeah, the rumor is that the Washington Post has always been that, right? | ||
Probably. | ||
Yeah. | ||
I mean, who knows now? | ||
Because now it's owned by Bezos, and he just recently made this mandate to stick with the actual story and not editorialize. | ||
This is what I was talking about, a trend in Silicon Valley of founder owners stepping in and actually becoming managers. | ||
Well, they kind of have to, otherwise it's bad for the business now because of the hunger for authenticity. | ||
The more you have bullshit, the more your business crumbles. | ||
It's actually negative for your outcome. | ||
Yeah, and I think you can look at it at a societal level, which, again, is why I'm interested in this idea of AI making more people entrepreneurs and more independent, is that at a macro level, you'll get more authenticity. | ||
You'll get just more dynamism. | ||
Yeah, I think so. | ||
I mean, that's, again, the rose-colored glasses view. | ||
Well, you know, there's obviously going to be a lot of things that are— There's going to be jobs that are going to go away. | ||
And there's going to be spam and bots and fraud and all of that. | ||
There's going to be problems with autonomous weapons and all of that. | ||
And I think those are all important and we need to handle them. | ||
But also, I think the negative angle of technology and AI gets a lot more views and clicks. | ||
And if I wanted to go viral right now, I'd tell you, these are the 10 jobs that you're going to lose tomorrow. | ||
And that's the easiest way to go viral on the internet. | ||
But try to think through what the actual implications are and what is true about human nature that really doesn't change and really is timeless. | ||
And I think people want to create and people want to make things and people have ideas. | ||
Again, everyone that I talk to has one idea or another, whether it's for their job or for a business they want to build or somewhere in the middle. | ||
Just yesterday I was watching a video of an entrepreneur using a platform, Replit. | ||
His name is Ahmad George, and he works for this skincare company. | ||
And he's an operations manager. | ||
And a big part of his job is like managing inventory and doing all of this stuff in a very manual and very tedious way. | ||
And he always had this idea of like, let's automate a big part of it. | ||
It's like, you know, it's a known problem, ERP. | ||
So they went to their software provider, NetSuite, and told them we need these modifications to the ERP system so that it makes our job easier. | ||
We think we can automate, you know, hundreds of hours a month or something like that. | ||
And they quoted them $150,000. | ||
And he had just seen a video about our platform. | ||
And he went on Replit and built something in a couple of weeks, cost him $400, and then deployed it in his office. | ||
Everyone at the office started using it. | ||
They all got more productive. | ||
They started saving time and money. | ||
He went to the CEO and showed him the impact. | ||
Look at how much money we're saving. | ||
Look at the fact that we built this piece of software that is cheaper than what the consultants quoted us. | ||
And I want to sell the software to the company. | ||
And so he sold it for $32,000 to the company. | ||
And next year, he's going to be getting more maintenance subscription revenue from it. | ||
So this idea of people becoming entrepreneurs, it doesn't mean like everyone has to quit their job and build a business. | ||
But within your job, everyone has an opportunity to get promoted. | ||
Everyone has an opportunity to remove the tedious job. | ||
There was a Stanford study just recently asking people, what percentage of your job is automatable? | ||
And people said about half. | ||
50% of what I do is routine and tedious. | ||
And I don't want to do it. | ||
And rather, and I have ideas on how to make the business better, how to make my job better. | ||
And I think we can use AI to do it. | ||
There's hunger in the workforce to use AI for people to reclaim their seat as the creative driver. | ||
Because the thing that happened with the emergence of computers is that in many ways people became a little more drone-like and NPC-like. | ||
They're doing the same thing every day. | ||
But I think the real promise of AI and technology has always been automation so that we have more time either for leisure or for creativity or for ways in which we can advance our lives, change our lives or our careers. | ||
And yeah, this is what gets me excited. | ||
And I think it's, I don't think it's predominantly a rose-colored glasses thing because I'm seeing it every day. | ||
And that's what gets me fired up. | ||
It's also you have a biased sample group, right? | ||
Because you have a bunch of people that are using your platform and they are achieving positive results. | ||
But they're from every walk of life. | ||
Yes. | ||
Look, we have a bunch of things that are happening simultaneously. | ||
And I think one of the big fears about automation and AI in general is the abruptness of the change. | ||
Because it's going to happen, boom, jobs are going to be gone. | ||
And then, well, these tedious jobs, do we really want people to be reduced to these tedious existences of just filing paperwork and putting things on shelves? | ||
And they will tell you they don't want to be doing it. | ||
They don't want to be doing that. | ||
But then there's the thing of how do we educate people, especially people that are already set in their ways and they're mature adults. | ||
How do you get and inspire these people to like, okay, look, your job is gone and now you have this opportunity to do something different? | ||
Go forth. | ||
I think reskilling is something that has been done in the past with some amount of success. | ||
Obviously, if you've never been exposed to technology. Do you remember that? I think it was a very cruel thing to say to the miners to go learn to code. | ||
Yeah, learn to code. | ||
Yeah, I think that's really cruel. | ||
But if you're someone whose job is sort of a desk job, you already are on the computer, there's a lot of opportunity for you to reskill and start using AI to automate a big part of your job. | ||
And yes, there's going to be job loss, but I think a lot of those people will be able to reskill. | ||
And what we're doing with the government of Saudi Arabia, I would love to do in the U.S. So how is the government of Saudi Arabia using it? | ||
So we're just starting right now. | ||
What's their goal? | ||
Their goal is twofold, or threefold. | ||
One is an entire generation of people growing up with these creative tools. | ||
Instead of just textbook learning, instead learning by doing, making things. | ||
So an entire generation understanding how to make things with AI, how to code, and all of that stuff. | ||
Second is upgrading sort of government operations. | ||
So you could think of it sort of like Doge, but more technological. | ||
Can we automate big parts of what we do in HR, finance, and things like that? | ||
And I think it's possible to build these specific AI agents that do part of finance job or accounting job. | ||
Again, all these routine things that people are doing, you can go and automate that and make government as a whole more efficient. | ||
And third is entrepreneurship. | ||
If you gave that power to more people to be able to kind of build businesses, then not only are they growing up with it, but also there's a culture of entrepreneurship. | ||
And that exists already in Saudi Arabia. | ||
I mean, the sad thing about the Middle East, there's so much potential, but there's so much wars and so much disaster. | ||
Well, there's so much money. | ||
There's also so much money. | ||
Yeah, which is good. | ||
And I think it's good for the United States. | ||
Like, I think what President Trump did with the deals in the Gulf region is great. | ||
It's going to be great for the United States. | ||
It's going to be great for the Gulf region. | ||
But I think we need more of that, you know, we talked about a government. | ||
We need more of that enlightened view of education, of change in our government today. | ||
You know, this idea that we're going to bring back the old manufacturing jobs, I understand Americans got really screwed with what happened. | ||
Like, you know, people got, these jobs got sent away by globalism, whatever you want to call it. | ||
And a small number of people got massively rich. | ||
A lot of people got disenfranchised. | ||
And we had the opiate epidemic. | ||
And it just did massive damage. | ||
It did massive damage to the culture. | ||
But is the answer to bring back those jobs? | ||
Or is there a new way of the future? | ||
And there's probably a new manufacturing wave that's going to happen with robotics. | ||
You know, the humanoid robots are starting to work. | ||
And these, I think, will need a new way of manufacturing. | ||
And so the U.S. can be at the forefront of that, can own that, bring new jobs into existence. | ||
And all of these things need software. | ||
Like our world is going to be primarily run by AI and robots and all of that. | ||
And more and more people need to be able to make software. | ||
Even if it is prompting and not really coding, you know, a lot more people just need to be able to make it. | ||
There's going to be a need for more products and services and all of that stuff. | ||
And I think there's enough jobs to go around if we have this mindset of let's actually think about the future of the economy as opposed to let's bring back certain manufacturing jobs, which I don't think Americans would want to do anyways. | ||
Right. | ||
They don't want to do the jobs. | ||
My problem is there's some people that are doing those jobs right now and it's their entire identity. | ||
You know, they have a good job, they work for a good company, they make a good living, and that might go away, and they're just not psychologically equipped to completely change their life. | ||
What do you think is the solution there? | ||
Which I agree, it's a real problem. | ||
Well, desperation, unfortunately, is going to motivate people to make changes. | ||
And it's going to also motivate some people to choose drugs. | ||
That's my fear. | ||
My fear is that you're going to get a lot more people. | ||
There's going to be a lot of people that they figure it out and they survive. | ||
I mean, this is natural selection, unfortunately. | ||
Like, applied to a digital world. | ||
There's going to be people that just aren't psychologically equipped to recalibrate their life. | ||
And that's my real fear. | ||
My real fear is that there's a bunch of really good people out there that are, you know, valuable parts of a certain business right now that their identity is attached to being employee of the month. | ||
They're good people. | ||
They show up every day. | ||
Everybody loves them and trusts them. | ||
They do good work and everybody rewards them for that. | ||
And that's part of who they are as a person. | ||
They're a hardworking person. | ||
Of course. | ||
And they feel that way. | ||
There's like a lot of real good people out there that are blue-collar, hard-working people. | ||
And they take pride in that. | ||
And that job's going to go away. | ||
Well, I actually think that more white-collar jobs are going away. | ||
I think so too. | ||
unidentified
|
Yeah. | |
So then blue-collar, which is what, like, go back 10 years ago and we thought, okay, self-driving cars, you know, robots in manufacturing. | ||
And that turned out to be a lot harder than, actually, the more desk jobs, because we have a lot more data. | ||
For one, we have a lot more data on people sitting in front of a computer and doing Excel and writing things on the internet. | ||
And so we're able to train these what we call large language models. | ||
And those are really good at using a computer like a human uses a computer. | ||
And so I think the jobs to be worried about, especially in the next months to a year, a little more, is the routine computer jobs where it's formulaic. | ||
You go, you have a task, like quality assurance jobs, right? | ||
Software quality assurance. | ||
You have to constantly test the same feature of some large software company, Microsoft or whatever. | ||
You're sitting there and you're performing the same thing again and again and again every day. | ||
And if there's a bug, you kind of report it back to the software engineers. | ||
And that is, I think, really in the bullseye of what AI is going to be able to do over the next months. | ||
And do it much more efficiently. | ||
Much more efficiently, much faster. | ||
Yeah. | ||
Yeah, those people have to be really worried. | ||
Drivers, you know, professional drivers, like people who drive trucks and things along those lines, that's going away. | ||
That's definitely going away. | ||
Yeah. | ||
And that's an enormous part of our society. | ||
It's millions of jobs. | ||
Right. | ||
You know, I was watching a video on this coal mining factory in China that's completely automated, and it was wild to watch. | ||
Every step of the way is automated, including recharging the trucks. | ||
Like the trucks know when to recharge; they're all electrical. | ||
Everything's run on electricity. | ||
They recharge themselves. | ||
You know, they're pulling the coal out of the ground. | ||
They're stacking it, inventory, everything. | ||
Storage, it's all automated and it runs 24-7. | ||
I'm like, this is wild. | ||
This is crazy. | ||
Yeah, I remember watching the video of BYD making an electric vehicle. | ||
It was really satisfying to watch. | ||
It's all like the entire assembly line is automated. | ||
The way they put the paint and the way they do the entire thing is... | ||
They're so advanced. | ||
There's this guy that I follow on Instagram. | ||
God, I can't remember his name. | ||
I really wish I could right now, but he reviews a lot of electric vehicles, like very, like, I've never even heard of these companies. | ||
And they're incredible. | ||
They're so advanced. | ||
And their suspension systems are so superior to the suspension systems of even like German luxury cars. | ||
Like they did a demonstration where they drove one of these Chinese electric vehicles over an obstacle course. | ||
And then they had like a BMW and a Mercedes go over. | ||
And the BMW is all, work, work, work. | ||
And the Chinese one is fucking flat planing the entire way. | ||
Every bump in the road is being completely absorbed by the suspension. | ||
unidentified
|
Right. | |
It's all AI. | ||
So much better than what we have. | ||
Right. | ||
Like, so much. | ||
What is this? | ||
That's him. | ||
Yep. | ||
That's him. | ||
Forrest Jones. | ||
Shout out to Forrest. | ||
He's great. | ||
He does like these really fast-paced videos, but he does a lot of cars that are available here in America as well. | ||
But he does a shit ton of them that aren't. | ||
Which one is this one here? | ||
NIO. | ||
Yeah, listen to him because he's pretty good at this shit. | ||
710 horsepower. | ||
I get cameras here, LIDAR there for self-driving, and this has two NIO-made chips. | ||
And for reference, one of those chips is as powerful as four NVIDIA chips. | ||
And this has two. | ||
NIO also has battery swap stations, so if you're in a rush, you can hit one up. | ||
It'll lift your car, swap out your battery, put in a fully charged one in between three and five minutes. | ||
But here's where the S-Class should be worried. | ||
Not only does this have rear steer and steer-by-wire, so it's extremely easy to maneuver, it may have one of the most advanced hydraulic systems I've ever seen. | ||
It can pretty much counteract any bump. | ||
After you go over something four times, it'll memorize it so that the fifth time, it's like that bump never existed. | ||
Inside, you get pillows in your headrest, heated, ventilated, and massaging leather seats, a passenger screen built into my dash, a main screen that works super fast. | ||
I get a driving display, a head-up display, and my steering works super fast. | ||
Pretty dope. | ||
What's interesting about the car is learning the terrain. | ||
If it went over it once, it'll learn it. | ||
And I think this is the next sort of big thing with AI, whether it's robotics, cars, or even ChatGPT now, it has memory. | ||
It learns about you, sort of similar to how social media feeds learn about you, though I think in a lot of ways those are more negative. | ||
I think these systems will start to have more online learning. | ||
Instead of just training them in these large data centers on these large datasets and then giving you this thing that doesn't know anything about you, it's totally stateless. | ||
As you use these devices, they will learn your pattern, your behavior, and all that. | ||
Yeah. | ||
Why is China so much better at making these cars than us? | ||
Because they're really advanced. | ||
unidentified
|
Yeah. | |
I'm not an expert on China, but a lot of people think that the thing that makes China better at manufacturing is the sort of, quote unquote, treating workers like slaves. | ||
So slave work or whatever, which I'm sure some of that happens. | ||
But Tim Cook recently said, maybe not so recent, but he thinks, you know, part of the reason why they manufacture in China is there's expertise there that developed over time. | ||
Yeah, that's why they want to use the Chinese manufacturing for the iPhone 17. | ||
Yeah, and I think one of the things that are good about more technocratic systems, Singapore, obviously China's the biggest one, is that the leadership, it comes at a cost of freedom and other things, but the leadership can have a 50-year view of where things are headed. | ||
And they can say, while yes, we're now making the plastic crap, we don't want to keep making plastic crap. | ||
We're going to build the capabilities and the automation and manufacturing expertise to be able to leapfrog the West in making these certain things. | ||
Whereas it's been historically hard, again, for good reasons. | ||
I think it's more freedom-preserving when the government doesn't have that much power. | ||
But I feel like America, we're the worst of both worlds, where increasingly the government is making more and more of the decisions and choices. | ||
But at the same time, we don't have this enlightened, like, you know, 10-year roadmap for where we want to be. | ||
Yeah, because we never think that way because we deal in terms. | ||
Yeah, four-year terms. | ||
Four-year terms. | ||
That's the problem. | ||
Also, public companies. | ||
Four-year terms, public companies, quarters. | ||
Right. | ||
Quarters. | ||
And again, this is back to this managerial idea, companies run by managers. You know, part of the reason why Zuck could do that is he has complete control. | ||
He can... | ||
Like, I don't know, $30, $40 billion, maybe more per year, maybe? | ||
He spent a ton of money, like a GDP worth, like a small state GDP worth of money on VR. | ||
And the public market was totally doubtful of that. | ||
And the reason he could do that is because he has, what are they called, super voting shares. | ||
And so he has complete control of the company. | ||
And he can't be unseated by activist investors, sort of what's been done to – Are they trying to remove him from that? | ||
They can't unless... | ||
I think there's a trial that's going on. | ||
It was going on very recently. | ||
Oh, I think you're thinking about the antitrust. | ||
No, no, there's something about him saying that he can't be fired. | ||
But it's true. | ||
It is true. | ||
It's legal. | ||
I know. | ||
It is nonsense. | ||
I believe the trial is nonsense. | ||
But a friend of mine was actually representing him in this. | ||
Maybe in Europe or something? | ||
I don't think so. | ||
I think it's in America. | ||
Google Mark Zuckerberg Josh Dubin trial. | ||
See if you can find anything on that. | ||
But yeah, Mark can think on the order of decades. | ||
Like when I was there at Facebook, he was talking about the idea that there's going to be a fundamental shift. | ||
He's like, if you look back 100 years, computers every 20 years or whatever change the user interface modality. | ||
You go from terminals and mainframes to desktop computers to mobile computing. | ||
And he was like, okay, what's next? | ||
And first guess was like VR. | ||
And now I think their best guess is like AR plus AI. | ||
The AR glasses, their new meta Ray-Band glasses plus AI. | ||
And they can make massive investment. | ||
They just made crazy investment. | ||
This company, Scale AI. | ||
Scale AI is a data provider for OpenAI and Google. | ||
And what they do is OpenAI will say, I want the best law and legal data to train the best legal machine learning model. | ||
And they'll go to places where labor costs are low but the workforce is maybe still well educated. | ||
There are places in Africa and Asia that are like that. | ||
And they'll sit them down and say, okay, you're going to get these tasks, these legal programming, whatever tasks, and you're going to do them and you're going to write your thoughts as you're doing them. | ||
I'm simplifying it, but basically that they collect all this data. | ||
Basically, it's labeled labor. | ||
They take it, they put it in the models, and they train the models. | ||
And OpenAI spends billions of dollars on that, Anthropic, all these companies. | ||
And so this company was the major data provider. | ||
And Meta just acquired them. | ||
There's this new trend of acquisitions, I assume because they want to get around regulations. | ||
But they bought 49% of the company, and then they hired all the leadership. | ||
So Meta hired the leadership at Scale AI and bought out the investors. | ||
They put $15 billion into the company. | ||
The weird thing about it is Google and OpenAI are like, we're not going to use this shit anymore. | ||
So the company value went down because people, you know, these companies don't want to use it. | ||
And now they're going to other companies. | ||
And so in effect, Zuck bought talent for $15 billion. | ||
unidentified
|
Wow. | |
Can you imagine that talent for $15 billion? | ||
Google recently paid about $3 billion essentially for one known researcher, Noam Shazeer, one of the inventors of the large language model technology; they bought his company. | ||
And I think they're not really acquisitions, exactly. | ||
They do these weird deals where they buy out the investors and they let the company run as a shell of itself, and then they acquire the talent. | ||
unidentified
|
Wow. | |
Microsoft did the same thing. | ||
That's crazy. | ||
So it's just these unique individuals that are very valuable there. | ||
Very, very valuable worth billions of dollars. | ||
Sam Altman says Meta tried and failed to poach OpenAI's talent with $100 million offers. | ||
So this $100 million is sign-on bonus. | ||
This is not even salary. | ||
Or yeah, equity. | ||
It's just bonus. | ||
It's just a bonus. | ||
$100 million bonus. | ||
unidentified
|
Come here. | |
Failed. | ||
And failed. | ||
I don't know about failed. | ||
I mean, I'm sure he's going to say that. | ||
He worded it in a weird way. | ||
He said, our best talent hasn't taken it. | ||
So you could have gotten that. | ||
Of course he's going to say that. | ||
unidentified
|
Of course. | |
The people that did take it. | ||
Well, they weren't the best of us. | ||
We don't even like those guys. | ||
And by the way, OpenAI does it to companies like ours. | ||
It's just a question of scale. | ||
Like, Zuck can give them $100 million and steal the best talent. | ||
And companies like OpenAI, which I love, but they go to small startups and give them $10 million to grab their talents. | ||
But it's very, very competitive right now. | ||
And there are, like, I don't know if these individuals are actually worth these billions of dollars, but the talent war is so crazy because everyone feels like there's a race towards getting to super intelligence. | ||
And the first company to get to super intelligence is going to reap massive amounts of rewards. | ||
How far away do you think we are from achieving that? | ||
Well, you know, like I said, my philosophy tends to be different than I think the mainstream in Silicon Valley. | ||
I think that AI is going to be extremely good at doing labor, extremely good at ChatGPT and being a personal assistant, extremely good at Replit being an automated programmer. | ||
But the definition of superintelligence is that it is better than all humans collectively at any task. | ||
And I am not sure there's evidence that we're headed there. | ||
Again, I think that one important aspect of superintelligence or AGI is that you drop this entity into an environment where it has no idea about that environment. | ||
It's never seen it before. | ||
And it's able to efficiently learn to achieve goals within that environment. | ||
Right now, there's a bunch of studies showing, like, you know, with GPT-4 or any of the latest models, if you give them an exam or a quiz that is even slightly different than their training data, they tank. | ||
They do really badly on it. | ||
I think the way that AI will continue to get better is via data. | ||
Now, at some point, and maybe this is the point of takeoff, is that they can train themselves. | ||
And the way we know how AI could train itself is through a method called self-play. | ||
So the way self-play works is, you know, take for example, AlphaGo. | ||
AlphaGo is, I'm sure you remember Lee Sedol, the match between DeepMind's AlphaGo and Lee Sedol, and it won at the game of Go. | ||
The way AlphaGo is trained is that part of it is a neural network that's trained on existing data. | ||
But the way it achieves superhuman performance in that one domain is by playing itself like millions, billions, perhaps trillions of times. | ||
So it starts by generating random moves and then it learns what's the best moves. | ||
And it's basically a multi-agent system where it learns, oh, I did this move wrong, and I need to kind of re-examine it. | ||
And it trains itself really, really quickly by doing the self-play. | ||
It'll play fast, fast games with itself. | ||
But we know how to make this in game environments, because game environments are closed environments. | ||
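To make the closed-environment point concrete, here is a minimal sketch of self-play on a toy game, Nim, rather than Go. The pile size, the tabular Q values, and the Monte Carlo update are illustrative assumptions for the sketch, not how AlphaGo is actually implemented; the point is only that the win/loss signal at the end of each game is the objective feedback the policy trains on.

```python
# Toy self-play in a closed game environment: two copies of the same policy play
# Nim against each other, and the end-of-game win/loss signal improves the policy.
import random
from collections import defaultdict

PILE = 15          # starting stones
MAX_TAKE = 3       # legal moves: take 1..3 stones; whoever takes the last stone wins

Q = defaultdict(float)   # Q[(stones_left, take)] -> estimated value for the player to move
LEARNING_RATE = 0.1
EPSILON = 0.2            # exploration rate

def choose(stones, explore=True):
    moves = list(range(1, min(MAX_TAKE, stones) + 1))
    if explore and random.random() < EPSILON:
        return random.choice(moves)
    return max(moves, key=lambda m: Q[(stones, m)])

def play_one_game():
    """Play one self-play game; return the (player, state, move) steps and the winner."""
    stones, player, history = PILE, 0, []
    while True:
        move = choose(stones)
        history.append((player, stones, move))
        stones -= move
        if stones == 0:
            return history, player      # this player took the last stone and wins
        player = 1 - player

def train(games=50_000):
    for _ in range(games):
        history, winner = play_one_game()
        for player, stones, move in history:
            # Monte Carlo update: pull each visited (state, move) toward the final outcome.
            outcome = 1.0 if player == winner else -1.0
            Q[(stones, move)] += LEARNING_RATE * (outcome - Q[(stones, move)])

if __name__ == "__main__":
    train()
    # With 15 stones the winning move is to take 3, leaving a multiple of 4.
    print({m: round(Q[(15, m)], 2) for m in (1, 2, 3)})
```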
But we don't know how to do self-play, for example, on literature, because you need objective truth. | ||
In literature, there's no objective truth. | ||
Taste is different. | ||
Conjecture, philosophy, there's a lot of things. | ||
And again, I go back to why there's still a premise of humans, is there are a lot of things that are intangible. | ||
And we don't know how to generate objective truth in order to train machines in the self-play fashion. | ||
But programming has objective truth. | ||
Coding has objective truth. | ||
The machine can have, like, you can construct an environment that has a computer and has a problem. | ||
There's a ton of problems. | ||
And even an AI can generate sample problems. | ||
And then there's a test to validate whether the program works or not. | ||
And then you can generate all these programs, test them, and if they succeed, that's a reward that trains your system to get better at that. | ||
If it doesn't succeed, that's also feedback. | ||
And they run them all the time, and it gets better at programming. | ||
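Here is a minimal sketch of that generate-then-verify reward loop for code. The `generate_candidate` function is only a stand-in for sampling from the model being trained, and the toy addition task is invented for illustration; this is not any lab's actual training pipeline.

```python
# Sketch of the "objective truth" loop for code: generate a candidate program,
# run it against tests, and turn the pass rate into a reward signal.
import random

def generate_candidate(task_prompt: str) -> str:
    """Stand-in for sampling a program from the model being trained."""
    body = random.choice(["a + b", "a - b", "a * b"])
    return f"def solve(a, b):\n    return {body}\n"

def run_tests(source: str, tests) -> float:
    """Execute the candidate and return the fraction of tests it passes (the reward)."""
    namespace = {}
    try:
        exec(source, namespace)          # only ever do this inside a sandbox
        solve = namespace["solve"]
        passed = sum(1 for args, expected in tests if solve(*args) == expected)
        return passed / len(tests)
    except Exception:
        return 0.0                       # crashing programs get zero reward

def training_step(task_prompt: str, tests, samples: int = 8):
    """Sample several candidates and score each; a real RL setup would use these
    rewards to update the model's weights."""
    return [(c, run_tests(c, tests)) for c in (generate_candidate(task_prompt) for _ in range(samples))]

if __name__ == "__main__":
    addition_tests = [((1, 2), 3), ((5, 7), 12), ((0, 0), 0)]
    for candidate, reward in training_step("write solve(a, b) that adds two numbers", addition_tests):
        print(f"reward={reward:.2f}\n{candidate}")
```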
So I'm confident programming is going to get a lot better. | ||
I'm confident that math is going to get a lot better. | ||
But from there, it is hard to imagine how the AI will get better at all these other more subjective, softer sort of sciences through self-play. | ||
I think the AI will only be able to get better through data from human labor. | ||
If AI analyzes all the past creativity, all the different works of literature, all the different music, all the different things that humans have created completely without AI. | ||
Do you think it could understand the mechanisms involved in creativity and make a reasonable facsimile? | ||
I think it will be able to imitate very well how humans come up with new ideas in a way that it remixes all the existing ideas from its training data. | ||
But by the way, again, this is super powerful. | ||
This is not like a dig at AI. | ||
The ability to remix all the available data into new, potentially new ideas or newish ideas because they're remixes, they're derivative, is still very, very powerful. | ||
But, you know, the best marketers, the best, like, you know, think of, you know, one of my favorite marketing videos is Think Different from Apple. | ||
It's awesome. | ||
Like, I don't think that really machines are at a point where they, like, I try to talk to ChatGPT a lot about like, you know, marketing or naming. | ||
It's so bad at that. | ||
It's like Midwit bad at that. | ||
And I, you know, but that's the thing. | ||
It's like, I just don't see, and look, I'm not an AI researcher and maybe they're working, they have ideas there. | ||
But in the current landscape of the technology that we have today, it's hard to imagine how these AIs are going to get better at, say, literature or the softer things that we as humans find really compelling. | ||
What's interesting is the thing that's the most at threat is these sort of middle-of-the-road Hollywood movies that are essentially doing exactly what you said about AI. | ||
They're sort of like, you know, they're sort of remixing old themes and tropes and figuring out a way to repackage it. | ||
But I think actually those tools in the hands of humans, they'll be able to create new interesting movies and things like that. | ||
Right. | ||
In the hands of humans. | ||
So with additional human creativity applied. | ||
Right. | ||
So the man-machine symbiosis. | ||
Right. | ||
This was the term that was used by J.C.R. Licklider, like the grandfather of the internet, from ARPA. | ||
A lot of those guys kind of imagined a lot of what's going to happen, a lot of the future, and this idea of like human plus machine will be able to create amazing things. | ||
So what people are making with Veo is not just because the machine is really good at generating it and making it. | ||
But it can't make it without the prompts. | ||
Like the really funny ones, yeah, without the prompts, like the Bigfoot finds tren and they inject themselves with tren, they start working out. | ||
unidentified
|
I'm telling you, my TikTok feed is really wild right now. | |
It takes this real weird, distorted human mind to come up with this. | ||
Have you seen the ones where it's Trump and Elon and Putin and they're all in a band? | ||
unidentified
|
They're playing Creedence Clearwater Revival. | |
Fortunate Son. | ||
It's crazy. | ||
Another one is the LA riots and how all the world leaders are sort of gangsters in the riots. | ||
That one is hilarious. | ||
Yeah, that kind of stuff is fun. | ||
And it's interesting how quickly it can be made, too. | ||
Something that would take a long time through these video editors where they were using computer generated imagery for a long time, but it was very painstaking and very, you know, very expensive. | ||
Now it's really cheap. | ||
On the way here, I was like, I want to make an app to sort of impress you with our technology. | ||
I was like, what would Joe like? | ||
And then I came up with this idea of like a squat form analyzer. | ||
And so in the car on the way here, sorry, in the lobby, I made this app to... | ||
On the way, on my phone. | ||
And this is the really exciting thing about what we built with being able to program on your phone is being able to have that inspiration that can come anytime and just immediately pull out your phone and start building it. | ||
So here, I'll show you. | ||
So basically you just start recording and then do a few squats. | ||
unidentified
|
Okay. | |
It's going to analyze it just from there? | ||
I mean the camera angle is not that great, but it's going to be able to tell you whether or not you're doing it well? | ||
unidentified
|
Yeah. | |
Those are not my best squats. | ||
Just so you know, Joe Rogan. | ||
I'm not judging you. | ||
I used to squat, you know, 350 pounds. | ||
So now it's integrating Google Gemini model to kind of run through the video, analyze it, and it'll come up with score and then suggestions. | ||
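For a rough idea of how an app like that might wire up, here is a minimal sketch of calling Gemini on a recorded clip. It assumes the google-generativeai Python SDK; the model name, prompt, API key placeholder, and output format are illustrative guesses, not the actual app built on the way over.

```python
# Rough sketch of a squat-form analyzer: upload a video clip, ask a multimodal
# Gemini model for a score and suggestions. Prompt and model name are illustrative.
import time
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # hypothetical placeholder

PROMPT = (
    "You are a strength coach. Watch this squat video and return a form score "
    "from 1-10, then bullet-point suggestions on depth, knee position, and torso "
    "lean. Note if the camera angle limits your assessment."
)

def analyze_squat(video_path: str) -> str:
    video = genai.upload_file(video_path)                 # upload the clip
    while video.state.name == "PROCESSING":               # wait until the file is ready
        time.sleep(2)
        video = genai.get_file(video.name)
    model = genai.GenerativeModel("gemini-1.5-flash")      # any multimodal Gemini model
    response = model.generate_content([PROMPT, video])
    return response.text

if __name__ == "__main__":
    print(analyze_squat("squats.mp4"))
```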
And so again, this is like a random idea. | ||
I was like, okay, what would be interesting to do? | ||
This is a really interesting thing that people could use at the gym, though. | ||
Like, not just for squats, but maybe for chin-ups and all kinds of stuff. | ||
Like, oh, maybe, you know, I'm looking at your form, and this is what you need to do. | ||
Get a little lower, you know, make your elbows parallel to your body, whatever. | ||
I built so many personal apps. | ||
Like, I built apps for analyzing my health. | ||
Like, I talked about some of my health problems that are now a lot better. | ||
Look, bad form. | ||
unidentified
|
Just like straight away, critical. | |
Yeah, knee position, unable to properly assess from the video angle. | ||
So, yeah, it's a little okay. | ||
So, it's saying it's not the best angle. | ||
But it's saying my depth is bad, which was actually bad. | ||
So, and I was leaning forward. | ||
But it's pretty good. | ||
You know, I tried it a few times. | ||
It's really good at that. | ||
And so I build a lot of apps for just my personal life. | ||
That would be great for a person who doesn't want a trainer. | ||
Right. | ||
You know, I don't want to deal with some person. | ||
Let me just work out on my own. | ||
But am I doing this right? | ||
Set your phone up. | ||
Have it correct you. | ||
Yeah. | ||
Yeah. | ||
At the office, some guys are building, we have this partnership with Whoop. | ||
I don't know if you've heard of them. They're building an app so we can start competing on workouts based on Whoop data. | ||
Oh, that's awesome. | ||
Yeah. | ||
Our company is very weird for Silicon Valley. | ||
We have a jujitsu mat and we have... | ||
Oh really? | ||
Oh, that's fucking great. | ||
That's awesome. | ||
Talking hurt. | ||
It's, you know, I only recently got into it, but the hardest thing about it is to be calm because your impulse is to overpower. | ||
Yes. | ||
Yeah. | ||
The Gracies have a great saying, keep it playful. | ||
Yeah. | ||
And that's how you really learn the best. | ||
It's very hard. | ||
And listen, I'm a giant hypocrite because most of my jiu-jitsu career, I was a meathead. | ||
That's one of the reasons why I started really lifting weights a lot. | ||
It's like I realized strength is very valuable. | ||
And it is. | ||
And it is valuable. | ||
But technique is the most valuable. | ||
And the best way to acquire technique is to pretend that you don't have any strength. | ||
The best way to acquire technique is to pretend to. | ||
Yeah, don't force things. | ||
Just find the best path. | ||
And that requires a lot of data. | ||
So you have to understand the positions. | ||
So you have to really analyze them. | ||
The best jiu-jitsu guys are really smart. | ||
Like Mikey Musumeci, Gordon Ryan, Craig Jones. | ||
Those are very intelligent people. | ||
And that's why they're so good at jiu-jitsu. | ||
And then you also have to apply that intelligence to recognize that discipline is a massive factor. | ||
Like Mikey Musumeci trains every day, 12 hours a day. | ||
12 hours a day? | ||
12 hours a day. | ||
Oh, yeah. | ||
Is that humanly possible to do? | ||
It's possible. | ||
Yeah, because he's not training full blast. | ||
It's not like, like, you can't squat 12 hours a day, 350 pounds. | ||
Your body will break down. | ||
But you can go over positions over and over and over and over again until they're in muscle memory, but you're not doing them at full strength, right? | ||
So like if you're rolling, right? | ||
So say if you're doing drills, you would set up like a guard pass. | ||
You know, when you're doing a guard pass, you would tell the person, lightly resist, and I'm going to put light pressure on you. | ||
And you go over that position, you know, knee shield, pass, you know, hip into it, here's the counter, on the counter, d'arce, you know, go for the d'arce. | ||
The person defends the d'arce, roll, take the back. | ||
And just do that over and over and over again. | ||
Until it's muscle memory. | ||
Right. | ||
And it's like completely ingrained in your body. | ||
It's like with chess players, let's focus on the endgame. | ||
Yeah. | ||
Just keep repeating the endgame. | ||
I read the Josh Waitzkin book. | ||
What was it called? | ||
I forgot. | ||
You know, his book about, like, I think, chess and jiu-jitsu, was it? | ||
Yeah, Josh was just in here a few months ago. | ||
He's great. | ||
Yeah. | ||
But it's so interesting to see a super intelligent person apply that intelligence to jiu-jitsu. | ||
You know, one of the interesting things when I started getting into it... I've always been into different kinds of sports, and then periods of extreme programming and obesity. | ||
But then I tried to get back into it. | ||
I was a swimmer early on. | ||
But one thing that I found, especially in the lifting communities, is how intelligent everyone is. | ||
They're actually almost like, you know, they're so focused, they're autistically focused on like form and program. | ||
And, you know, they spend so much time designing these spreadsheets for your program. | ||
Well, people have this view of physical things, that physical things are not intelligent things, but you need intelligence in order to manage emotions. | ||
Emotions are a critical aspect of anything physical. | ||
Any really good athlete, you need a few factors. | ||
You need discipline, hard work, genetics, but you need intelligence. | ||
It might not be the same intelligence applied. | ||
People also, they confuse intelligence with your ability to express yourself, your vocabulary, your history of reading. | ||
That's like a bias almost. | ||
Yes. | ||
That's a language bias. | ||
That's like the sort of modern desk job, the laptop class bias. | ||
Well, they assume that anything that you're doing physically, you're now no longer using your mind. | ||
But it's not true. | ||
In order to be disciplined, you have to understand how to manage your mind. | ||
Managing your mind is an intelligence. | ||
And the ability to override those emotions, to conquer that inner bitch that comes to you every time I lift that fucking lid off of that cold plunge, that takes intelligence. | ||
You have to understand that this temporary discomfort is worth it in the long run because I'm going to have an incredible result after this is over. | ||
I'm going to feel so much better. | ||
Right, right, right. | ||
Yeah, I haven't thought about intelligence in order to manage your emotions, but that's totally true because you're constantly doing the self-talk. | ||
You're trying to trick yourself into doing that. | ||
There are people that are very intelligent that don't have control over their emotions. | ||
But they're intelligent in some ways. | ||
It's just they've missed this one aspect of intelligence, which is the management of the functions of the mind itself. | ||
And they don't think that that's critical. | ||
But it is critical. | ||
It's critical to every aspect of your life. | ||
And it'll actually improve all those other intellectual pursuits. | ||
You know, to tie it back to the AI discussion, I think a lot of the sort of programmer researcher type is like they know that one form of intelligence and they over-rotate on that. | ||
And that's why it was like, oh, we're so close to perfecting intelligence. | ||
Because that's what you know. | ||
But there's a lot of other forms of intelligence. | ||
There's a lot of forms of intelligence. | ||
And unfortunately, we're very narrow in our perceptions of these things and very biased. | ||
And we think that our intelligence is the only intelligence. | ||
And that this one thing that we concentrate on, this is the only thing that's important. | ||
Right. | ||
Have you read or done any CBT cognitive behavior therapy? | ||
No. | ||
Basically, CBT is like a way to get over depression and anxiety based on self-talk and cues. | ||
I had to use it, again, I had like sleep issues. | ||
I had to use CBT-I, cognitive behavioral therapy for insomnia. | ||
And the idea behind it is to build up what's called sleep pressure. | ||
So, first of all, insomnia is performance anxiety. | ||
Once you have insomnia, you start having anxiety. | ||
But by the time bedtime comes, you're like, oh my God, I'm just going to, you know, toss and turn in bed, and I'm just going to be in bed all night. | ||
And then you start associating your bedroom with the suffering of insomnia, because you're lying there, you know, all night and really suffering. | ||
It's really horrific. | ||
And first of all, you treat your bedroom as a sanctuary. | ||
You're only there when you want to sleep. | ||
So that's like one thing you program yourself to do. | ||
And the other thing is you don't nap the entire day. | ||
You don't nap at all, no matter what happens. | ||
Like even if you're real sleepy, like get up and take a walk or whatever. | ||
And then you build up what's called sleep pressure. | ||
Like now you have like a lot of sleepiness. | ||
So you go to bed, you try to fall asleep. | ||
If you don't fall asleep within 15, 20 minutes, you get up, you go out, you do something else. | ||
And then when you feel really tired again, you go back to bed. | ||
Oh, God. | ||
And then finally, once you fall asleep, if you wake up in the middle of the night, which is another sort of form of insomnia, instead of staying in bed, you get up, you go somewhere else, you go read or do whatever. | ||
And slowly you program yourself to see your bed and, oh, like the bed is where I sleep. | ||
It's only where I sleep. | ||
I don't do anything else there. | ||
And you can get over insomnia that way instead of using pills and all the other stuff. | ||
Oh, the pills are the worst. | ||
God, people that need those fucking things to sleep, I feel for them. | ||
I can sleep like that. | ||
That's amazing. | ||
I can sleep on the bus. | ||
That's a blessing. | ||
That's a blessing. | ||
unidentified
|
That's a huge blessing. | |
My wife hates it. | ||
It drives her nuts because sometimes she has insomnia. | ||
I could sleep on rocks. | ||
I could just go lay down on a dirt road and fall asleep. | ||
Wow. | ||
But I'm always going hard. | ||
When you're always going hard, you're like, yeah, I don't take naps. | ||
And I work out basically every day. | ||
And so I'm always tired. | ||
I'm always ready to go to sleep. | ||
So do you fight it or do you just, it's not in you to take a nap? | ||
I don't need a nap. | ||
Yeah? | ||
Yeah, I never need naps. | ||
How many hours do you sleep? | ||
I try to get eight. | ||
Do you get eight? | ||
No, last night I didn't get eight, but I got seven, six and a half. | ||
Probably I got six and a half last night. | ||
Yeah. | ||
But that was because I got home and I started watching TV because I was a little freaked out about the war. | ||
And so when I'm freaked out about the war, I like to fill my mind with nonsense. | ||
Oh, okay. | ||
Well, I just watch things that have nothing to do with the war. | ||
Like, I play pool. | ||
I'm pretty competitive. | ||
I'm pretty good. | ||
And so I like watching professional pool matches. | ||
And there's a lot of them on YouTube. | ||
So I just watch pool. | ||
And I just watch, you know, patterns, how guys get out, stroke, how they use their stroke, like how different guys have different approaches to the game. | ||
It's crazy, the type A people. | ||
It's like for you, although pool is an escape, it suddenly becomes an obsession. | ||
And you're like, I need to be the best at it. | ||
I'm very obsessed. | ||
So I totally quit video games, but then last year I was very stressed. | ||
The company was doing really poorly before we sort of invented this agent technology. | ||
And then also the Gaza genocide. | ||
I was like watching these videos every night. | ||
It was just really, really affecting me. | ||
I can't watch that stuff at night. | ||
At night is when I get my anxiety. | ||
I mean, I don't generally have anxiety, not like a lot of people do. | ||
I mean, when I say anxiety, I really feel for people that genuinely suffer from actual anxiety. | ||
My anxiety is all sort of self-imposed. | ||
And when I get online at night and I think about the world, my family's asleep, which is generally when I write. As long as I'm writing, I'm okay. | ||
Comedy. | ||
Yeah. | ||
You know, I write in sort of an essay form, then I extract the comedy from it. | ||
But when I get online and I just pay attention to the world, that's when I really freak out because it's all out of your control. | ||
And it's just murderous psychopaths that are running the world. | ||
And it just, at any moment, you could be, you know, in a place where they decide to attack. | ||
And then you're a pawn in this fucking insane game that these people are playing in the world. | ||
That's why I felt really frustrated with my family being there. | ||
I was like, they have no say in it. | ||
The war started, rockets are flying. | ||
But anyways, I started playing a video game. | ||
It's called Hades 2. | ||
It's like an RPG video game. | ||
And I was like, I'm trying to disconnect. | ||
And then I started speedrunning that game. | ||
Do you know what speedrunning is? | ||
No. | ||
It's like you're trying to finish the game as fast as possible, as fast as humanly possible. | ||
And I got down to like six minutes, and I was number 50 in the world. | ||
unidentified
|
Whoa. | |
But legitimately. | ||
unidentified
|
Oh, yeah, yeah. | |
My score is online to play for you. | ||
That was crazy. | ||
Why is he doing that? | ||
That was crazy. | ||
It's myth building, you know. | ||
Yeah, weird. | ||
But yeah, it is this thing about type A people. | ||
Like you're just – even your escapism becomes competitive and stressful. | ||
Well, sort of, but it's also – I feel like pool is a discipline, just like archery. | ||
I'm also obsessed with archery. | ||
Archery is a discipline. | ||
And I feel like the more divergent disciplines that you have in your life, the more you understand what it is about these things that makes you excel and get better at them. | ||
And the more when I get better at those things, I get better at life. | ||
I apply it to everything. | ||
Yeah, this is another thing that AI now struggles with, which is called transfer learning. | ||
Learning something from one domain, like learning how to do reasoning on math, and being able to do reasoning on politics. | ||
We just don't have evidence of that yet. | ||
And I feel the same way. | ||
Everything, like even powerlifting, when I got really into it, which is like the most unhealthy sport you can do. | ||
You break your joints down. | ||
Break your joints. | ||
You look like shit because the more you eat, you can lift more. | ||
You get fat. | ||
They're all fat. | ||
unidentified
|
They don't go bottom of them. | |
Unless they're competing at a weight class. | ||
Yeah. | ||
And what is it with that Rippetoe guy? | ||
Have you ever had him on? | ||
The Jug of Milk? | ||
Go Mad. | ||
Do you know Go Mad? | ||
No. | ||
Gallon of Milk a Day. | ||
Do you know that, Jeremy? | ||
Do you know it? | ||
unidentified
|
Disgusting. | |
Disgusting. | ||
Yeah, so basically. | ||
Gallon of Milk a Day. | ||
Yeah, so Mark Rippetoe, he wrote this book called Starting Strength. | ||
And it became like the main way most guys, at least my age, like getting into powerlifting. | ||
It was about technique. | ||
His whole thing is like, look, everyone comes into lifting. | ||
They think it's bodybuilding. | ||
Powerlifting is nothing like that. | ||
And he also looks like shit and he's fat. | ||
unidentified
|
But his technique is amazing. | |
And so the way he gets young guys to get really good and really strong, he puts them on a gallon of milk a day. | ||
Does that really have a positive effect? | ||
Yeah, I mean, he has a YouTube channel. | ||
He has a lot of guys that are really, really strong. | ||
And he's been a coach for a lot of people. | ||
What is it about a gallon of milk a day? | ||
Is it just the protein intake? | ||
What is it? | ||
A calories. | ||
Okay, here it is. | ||
Drink a gallon of milk a day, go mad, is undeniably the most effective nutritional strategy for adding slabs of mass to young underweight males. | ||
Milk is relatively cheap, painless to prepare, and the macronutrient profile is very balanced, and calories are always easier to drink than eat. | ||
Unfortunately, those interested in muscular hypertrophy who are not young, underweight, and male, populations where GOMAD is not recommended, will need to put more effort into the battle to avoid excess fat accumulation. | ||
Body composition can be manipulated progressively, much like barbell training to achieve the best results. | ||
For example, the Starting Strength novice linear progression holds exercise selection, frequency, and volume variables constant. | ||
Every 48 to 72 hours, the load stressor is incrementally increased to elicit an adaptation in strength. | ||
If the load increase is too significant or insignificant, the desired adaptation won't take place. | ||
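For what that progression arithmetic actually looks like, here is a tiny sketch. The starting weight, the 5-pound increment, and the 3x5 scheme are made-up example numbers for illustration, not Rippetoe's actual prescription for any particular lifter.

```python
# Novice linear progression, as read above: keep the exercises and volume constant,
# and bump the load a small, fixed amount every session (roughly every 48-72 hours).
def linear_progression(start_lbs: float, increment_lbs: float, sessions: int):
    """Yield (session_number, working_weight) for a fixed sets-and-reps scheme."""
    for session in range(1, sessions + 1):
        yield session, start_lbs + increment_lbs * (session - 1)

if __name__ == "__main__":
    # e.g. squatting 3 sets of 5, three sessions a week, adding 5 lb each session
    for session, weight in linear_progression(start_lbs=135, increment_lbs=5, sessions=9):
        print(f"Session {session:2d}: 3x5 @ {weight:.0f} lb")
```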
Yeah, this is the intelligence. | ||
This is intelligence. | ||
This is the intelligence involved in lifting that people who are on the outside of it would dismiss. | ||
Science, yeah. | ||
Yes. | ||
unidentified
|
Yeah. | |
You know, I'm so honored to be the guy that introduces Joe Rogan to Starting Strength. | ||
GoMad. | ||
Yeah, GoMad. | ||
Rippetoe is so funny. | ||
You should watch some of his videos. | ||
He has this very thick Texan accent, and he just, his audience shits on him all the time. | ||
They call him fat and ugly and whatever. | ||
And he abuses his audience, too. | ||
So there it is. | ||
Put his picture up. | ||
That's the nerd. | ||
That's an old photo. | ||
He's now much fatter. | ||
Yeah. | ||
So he's just a nerd. | ||
Yeah, he's a huge nerd. | ||
But yeah, he used to lift a lot of weight. | ||
Yeah, there's a lot. | ||
That's what he used to look like. | ||
That one photo with him with the hairy chest. | ||
The black. | ||
Oh, okay. | ||
Wow. | ||
Damn. | ||
Is that him? | ||
Is that him? | ||
I don't think so. | ||
unidentified
|
Really? | |
It does look like him. | ||
Yeah, that's him. | ||
He used to be jacked. | ||
unidentified
|
Okay. | |
That's good. | ||
Oh, so he was a bodybuilder at one point in time. | ||
But then he got on that GOMAD shit. | ||
And now he's a powerlifter. | ||
There is simply no other exercise, and no machine, that provides the level of muscular stimulation and growth than the correctly performed full squat. | ||
Well, he's deadlifting in that image. | ||
That's weird. | ||
So he also makes you squat on every day of lifting. | ||
Oh. | ||
So squat every time, every time you lift. | ||
unidentified
|
Really? | |
Yeah, yeah. | ||
Well, his idea is like squat is a full body exercise. | ||
Like you can just go to the gym. | ||
And when I used to be busy and I just want to maintain, like, be healthy, I'll just squat every time. | ||
15, 20 minutes squat and just get out of the gym. | ||
Yeah. | ||
Well, I do something with legs every day. | ||
Yeah. | ||
Yeah. | ||
You have to. | ||
But squat, actually, it does feel like there's an upper body component to it as well. | ||
Well, it's also your body recognizes like, oh, this asshole wants to lift really heavy things. | ||
We've got to get big. | ||
Right, exactly. | ||
Yeah, it's the best way to get big. | ||
Yeah. | ||
Yeah, because your body just realizes, like, okay, we have to adapt. | ||
unidentified
|
This shithead wants to lift giant things every day. | |
Yeah. | ||
Yeah, it's hilarious. | ||
And, you know, the other one, I'm sure you know him. | ||
I think you introduced me to him through your podcast, Louie Simmons. | ||
Oh, yeah. | ||
Those guys are crazy. | ||
You watch the Netflix documentary? | ||
I didn't watch the Netflix documentary, but we did actually interview him. | ||
He's like one of the few people that I traveled to go meet. I went to Westside Barbell. | ||
I saw that. | ||
It was great. | ||
We have some of his equipment out here. | ||
He has reverse hyper? | ||
Yeah. | ||
Reverse hyper is so good for people that have back problems. | ||
Everyone that has a back issue, let me show you something. | ||
And I bring them out to the reverse hyper machine, and I'm like, this thing will actively strengthen and decompress your spine. | ||
Right, right. | ||
It's so good. | ||
It's so good for people that have like lower back issues where the doctor just wants to cut them. | ||
I'm like, hold on, hold on. | ||
Don't do that right away. | ||
I had back pain since my late teens, and the doctors, like, they did an MRI and they found that there's a bit of a bulge, and they wanted to do an operation on it. | ||
Yeah, they want to do a discectomy. | ||
Someone wanted to put me on antidepressants. | ||
Apparently, you can manage pain with antidepressants. | ||
Have you heard of that? | ||
Yeah, apparently it's a thing. | ||
And through listening to your podcast and others, I was like, I was just going to get strong. | ||
So I got strong squats and things like that. | ||
And the pain got a lot better. | ||
It didn't go away entirely. | ||
But the thing that really got me over the hump, and this one's crazy. | ||
Are you familiar with the mind-body prescription? | ||
No. | ||
John Sarno? | ||
Oh, okay, yes. | ||
I heard about him on Howard Stern because he was talking about how a lot of back pain is psychosomatic. | ||
Psychosomatic, yeah. | ||
So his idea, and again, this is like Because a lot of back pain is real as fuck. | ||
Right, right. | ||
I think for me, it's always a combination of both. | ||
Like, there's something physically happening. | ||
But, like, his idea is that your mind is creating the pain to distract you from emotional, psychological pain. | ||
I think that's the case in some people. | ||
Yeah. | ||
And then doctors will go do an image, and often they'll find something. | ||
And he thinks that lumbar imperfections are present in almost everyone. | ||
Yes. | ||
I think that's true. | ||
And then the doctors latch on to that. | ||
And your mind latches onto that. | ||
And you start reinforcing, telling yourself that I have this thing, and the pain gets worse. | ||
There's also another thing called the salience network. | ||
Have you heard of this? | ||
No. | ||
If you can bring up the Wikipedia page for salience network, because I don't want to get it wrong, but the salience network is a network in the brain that neuroscientists found. | ||
My doctor, Taddy Akiki, told me about this. | ||
The salience network gets reinforced whenever you obsess over your pains or your health issues. | ||
That makes sense. | ||
So it's responsible for perception, and the more you reinforce, it's like a muscle. | ||
The more you reinforce it, it's sort of like AI, you know, reinforcement learning, the more you reinforce it, it becomes more of an issue. | ||
It's involved in various functions, including social behavior, self-awareness, and integrating sensory, emotional, and cognitive information. | ||
Boy, I bet social media is really bad for that. | ||
Right, totally. | ||
Yeah. | ||
Yeah. | ||
Right. | ||
And so a lot of the fatigue and things like that, at some point, I'm like, fuck it. | ||
I did a lot of other things, but at some point, I'm like, fuck it. | ||
I don't care about it. | ||
I don't have it. | ||
I'm just going to be good. | ||
Just not concentrate on that. | ||
Yeah, because I was reading about it all the time. | ||
I was really worried. | ||
I had Abigail Shrier on and we were talking about that in regards to cognitive therapy, that there's a lot of people that obsess on their problems so much that their problems actually become bigger. | ||
Yes. | ||
And this is it. | ||
This is the neuroscience behind it, the salience network. | ||
Makes sense. | ||
Yeah. | ||
But there's legit back problems. | ||
Of course. | ||
Legit back. | ||
That's why the John Sarno thing, I was like, okay, not for me. | ||
I understand how some people could develop that issue. | ||
But his insight was, look, look, I ran a clinic in New York City for a long time, and these chronic illnesses come in waves. | ||
There was an ulcers wave in, like, the 90s. | ||
Oh, because it became a thing that people were talking about a lot. | ||
Yes. | ||
Wow. | ||
And then there's like a neck pain, and then there's an RSI. | ||
The most recent one was RSI. | ||
What is RSI? | ||
Repetitive strain injury. | ||
And again, all these things have rational explanations. | ||
For me, I was in the computer all the time. | ||
And I was like, oh, my arm hurts. | ||
And yeah, maybe there was some aspect of it. | ||
I was programming a lot. | ||
But also after I read John Sarno and I realized that some of it might be also psychological, that it's stress. | ||
I don't know what's, maybe I have some childhood issues, but you just realize that a lot of it, and maybe the other way is true as well. | ||
When you just minimize it, it just becomes less of an issue in your mind. | ||
But the fact that it is a fashion should tell you that there's something psychosomatic about it. | ||
Right. | ||
The fact that it does come in waves like that, for sure. | ||
And then once it's in the zeitgeist, ulcers or whatever it is. | ||
I remember when we were kids, everyone had ulcers. | ||
And it was like, oh, it's from coffee in the morning. | ||
And like, there's all these. | ||
I don't know anyone that had ulcers now. | ||
unidentified
|
Right. | |
They don't either. | ||
That's true. | ||
That's crazy. | ||
That's wild. | ||
This is wild the mind, like the way it can benefit you or the way it can hold you prisoner. | ||
unidentified
|
Yeah. | |
And again, this is maybe why I have like a little like different view about AI and humans and all of that from from Silicon Valley. | ||
Like this is a weird thing, but every time I set my mind to like meet someone, I meet them, including you. | ||
Oh, wow. | ||
That's weird. | ||
Like, yeah, I want to meet this person. | ||
Something happens, some chain of events, but obviously you also see it in some way. | ||
But it's obviously you're doing something very – you're – Which is my problem with the secret and the power of manifesting things. | ||
unidentified
|
I don't go that far, but I don't know. | |
There's something. | ||
There's something there. | ||
unidentified
|
I agree. | |
There's something to it. | ||
I think the mind and our connection to reality is not as simple as we've been told. | ||
Not at all. | ||
I think there's something there. | ||
And again, when you start looking at psychedelics and stuff like that, there's something there. | ||
And I remember listening to one of, I love JRE circa early 2010s. | ||
There was a remote viewing. | ||
You were talking about a remote viewing episode. | ||
And I was like, wow, that's crazy. | ||
And obviously very skeptical of it. | ||
The idea that you can meditate and like see somewhere else or see it from above. | ||
I read a book about Da Vinci. | ||
It's called Da Vinci's Brain, I think. | ||
And Da Vinci is like fascinating. | ||
Who's this fucking guy? | ||
He does everything. | ||
And he literally is across all these domains. | ||
And he barely sleeps. | ||
He has this polyphasic sleep thing, which I tried once. | ||
It's torture. | ||
Basically, every four hours you sleep for 15 minutes. | ||
When I was in university, I was very good at computer science, but I hated going to school. | ||
And in Jordan, if you don't go to school, they ban you from the exam. | ||
unidentified
|
Oh, wow. | |
I was getting A's, but I just didn't want to sit in class. | ||
And actually, this is when I started thinking about programming on my phone. | ||
I was like, maybe I can code my phone in class. | ||
But I felt there was injustice. | ||
ADHD, whatever you want to call it. | ||
I just can't sit in class. | ||
Just give me a break. | ||
And so I felt justified to rebel or fix the situation somehow. | ||
So I decided to hack into the university and change my grades so I can graduate because everyone was graduating. | ||
It was like five years in. | ||
It took me six years to get through a four-year program, just because I can't sit in class, and I have some dyslexia and things like that. | ||
So I decided to do that. | ||
And I'm like, okay, hacking takes a lot of time because you're coding, you're scripting, you're running scripts against servers and you're waiting. | ||
And I'm like, to optimize my time, I'm just going to do this Da Vinci thing, the every-four-hours thing. By the way, there's a Seinfeld episode where, what's his name, the crazy guy in Seinfeld? | ||
Kramer? | ||
Kramer. | ||
Kramer does polyphasic sleep. | ||
Maybe I learned it from there. | ||
I'm not sure. | ||
How do you wake up? | ||
You set an alarm. | ||
Oh, God. | ||
Yeah, it's torture. | ||
That sounds so crazy. | ||
Apparently, Da Vinci used to do that. | ||
But anyways, I was able to hack into the university by working for weeks using polyphasic sleep and was able to change my grades. | ||
And initially, I didn't want to do it on myself, but I had a neighbor who I went to school with, and I was like, let's change his grade and see if it actually succeeds. | ||
And it actually succeeded in his case. | ||
He was my lab rat. | ||
But in my case, I got caught. | ||
And the reason I got caught is there is in the database, there's your grade out of 100, 0 to 100. | ||
When you get banned because of attendance, your grade is de facto 35. | ||
So I thought I would just change that, and that's the thing that will get me to pass. | ||
Well, it turns out there's another field in the database about whether you're banned or not. | ||
This is bad coding, this is bad programming, because this database is not normalized. | ||
There's a state in two different fields. | ||
So I'll put the blame on them for not designing the right database. | ||
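As a toy illustration of the denormalization he's describing: the same fact ("banned for attendance") was encoded twice, once as the hard-coded grade of 35 and once as a separate flag, so changing one field without the other produced a contradictory record. The field names below are hypothetical, not the university's real schema.

```python
# Two fields encode one underlying state; updating only one of them creates an anomaly.
record = {
    "student_id": 12345,              # hypothetical
    "grade": 35,                      # 35 is what the system writes when you're banned
    "banned_for_attendance": True,
}

record["grade"] = 70                  # the hack: only the grade field gets changed

def consistent(rec: dict) -> bool:
    """A banned student must carry the de facto grade of 35; otherwise the row contradicts itself."""
    return not rec["banned_for_attendance"] or rec["grade"] == 35

print(consistent(record))             # False -> the anomaly that surfaced in the system
```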
unidentified
|
That's hilarious. | |
You blame them for your hacking being successful. | ||
So what was the punishment? | ||
So the entire university system went down because there's this anomaly. | ||
I was, you know, I passed, but at the same time, I was banned. | ||
And so I got a call from the head of the registration system. | ||
And it was like 7 p.m., whatever. | ||
It was landline. | ||
And I picked up the call. | ||
He's like, hey, listen, we have this issue we're dealing with. | ||
Like, the entire thing is down. | ||
And it just shows your record. | ||
There's a problem with it. | ||
Do you know anything about it? | ||
And at the time, I'm like, all right, there's like a fork in the road. | ||
You know, I either like come clean or just like, this is a lie that will like live for me forever. | ||
And I was just like, yeah, I did it. | ||
And he was like, what do you mean? | ||
I was like, okay, I'll come explain it to you. | ||
So the next day, I go there, and it's all the university deans. | ||
And it's like one of the best computer science universities in the region, Princess Sumaya University for Technology. | ||
And they're all nerds. | ||
So the discussion became technical on how I hacked in the university. | ||
unidentified
|
And I went to the whiteboard, explaining what I did with the systems and whatever. | |
And it just felt like a brainstorming session. | ||
I'm like, all right, I'll see you guys later. | ||
It's like, wait, we need to figure out what to do with you. | ||
Like, you know, this is serious. | ||
And I'm like, oh, crap. | ||
But they kind of put the decision to the president of the university. | ||
And he was, I forgot his name, but he was such an enlightened guy. | ||
And I went and told him, like, I just didn't mean any malice. | ||
I just felt like justified. | ||
I need to graduate. | ||
I've been here for a long time. | ||
I actually do good work. | ||
And he's like, look, you're talented, but with great power comes great responsibility. | ||
He gave me the Spider-Man. | ||
And he said, for us to forgive you, you're going to have to go and harden the systems in the university against hacking. | ||
So I spent the summer trying to work with the engineers at the university to do that. | ||
But they hated me because I'm the guy that hacked into the system. | ||
So they would blackball me. | ||
Sometimes I'll show up to work and they wouldn't open the door and I can see them. | ||
Like, I can see you there. | ||
I'm knocking. | ||
And they wouldn't let me in and let me work with them. | ||
We did some stuff to fix it. | ||
And then I gained fame, maybe notoriety in the university. | ||
And it actually got me my first job while I was in school. | ||
And it's a different story, but that job was at a startup that ended up making videos that were a big part of the Arab Spring. | ||
unidentified
|
Oh, wow. | |
Yeah. | ||
And I was part of some of these videos as well. | ||
But, anyways, so one of the deans, the computer science dean, was like, hey, listen, I really helped you out. | ||
I really helped you out when you had this problem. | ||
And I need you to work with me in order to do another research project, to hack into the university again. | ||
I was like, I'm not going to do that. | ||
No, it's like, no, you're not going to get in trouble. | ||
It's sanctioned. | ||
It's going to be sanctioned. | ||
So, again, I worked tirelessly on that. | ||
This time, I invent a piece of software to help me do that. | ||
And I was able to find more vulnerabilities. | ||
And so I show up at my project defense. | ||
And it's like a committee of different deans and students and all of that. | ||
And so I go up and I start explaining my project. | ||
And I run a scan against the university network. | ||
And it showed a bunch of red, like there's vulnerabilities. | ||
And one of the deans is like, no, that's fake. | ||
That's not true. | ||
It started dawning on me that I was like a pawn in some kind of power struggle. | ||
So that guy was responsible for the university system. | ||
And this guy is using me too. | ||
I was like, oh, shit. | ||
But like, I'm not going to back down. | ||
I was like, no, that's not a lie. | ||
It's true. | ||
And so I tap into that vulnerability and I go into the database and I'm like, all right, what do you want me to show? | ||
Your salary or your password? | ||
It was like, show me a password. | ||
So I show him the password, and he was like, no, that's not my password. | ||
It was encrypted. | ||
But they also have in the database like a decrypt function, which they shouldn't have, but they had it. | ||
So I was like, decrypt the password. | ||
And the password showed on the screen in the middle of the defense. | ||
unidentified
|
And so his face was red. | |
He shakes my hand and he leaves to change his password. | ||
That's awesome. | ||
And I graduated. | ||
And they cut me some slack and I was able to graduate. | ||
That's awesome. | ||
That's a great story. | ||
We'll end with that. | ||
Thank you very much, brother. | ||
I really appreciate it. | ||
It was really fun. | ||
That was a great conversation. | ||
Thank you. | ||
Your app, let everybody know about it. | ||
Replit, R-E-P-L-I-T. | ||
How to find it? | ||
There it is. | ||
Replit. | ||
Replit.com. | ||
Go make some apps. | ||
Go make some apps, people. | ||
Avoid whatever the hell is going to happen today. | ||
unidentified
|
All right. | |
Thank you very much. | ||
Thank you. |