Speaker | Time | Text |
---|---|---|
I think the somewhat obnoxious suggestion I had was that we need something like a Truth and Reconciliation Commission, which was set up in post-apartheid South Africa, where, you know, as long as we got to the truth and we figured out what had happened, there can be reconciliation. | ||
But the first step is we need to come clean, we need to have some accountability for... | ||
I don't want us to have lots of arrests. | ||
I don't want lots of people to just get prosecuted. | ||
But I do think we need to have a lot more transparency into what exactly was going on in the sausage-making factory. | ||
And my suspicion is that that sort of transparency... | ||
will very much discourage a repeat of this behavior. | ||
unidentified | | All right, Peter Thiel, let's go. |
Last time we did this was about a year and a half ago, pre-Trump re-election. | ||
I would say the world was a little different. | ||
The first time we ever sat down was 2018. The world was very different. | ||
We started to get to know each other in about 2016. That feels like a different world altogether. | ||
So my first question to you is, you used to be the counterculture tech bro. | ||
Now everybody has come around to your positions, it seems. | ||
But as a guy that likes being in the opposition, just personally, how do you feel about that? | ||
I feel a lot better about it. | ||
I don't think there's any virtue in being in the opposition of one for the sake of that or anything like that. | ||
So, no, it's quite strange how much Silicon Valley shifted. | ||
And of course, in some ways, even the Trump election in 2024 feels almost miraculous on some level. | ||
If you look at the total votes, it wasn't that enormous a shift. | ||
But if you believe in demographic determinism, it's an enormous shift because the electorate's different in 2024 from 2016. The Republicans in 2016, one would have said, were old, white people. | ||
And eight years later, a lot of them would not be alive. | ||
And they would be replaced by a younger, more diverse electorate. | ||
And so the Republicans should never have won another election, if you believed in the sort of demographic determinism of identity politics, where you do not vote based on reason or argument, but on sub-rational factors like your gender or your race or your sexual orientation or other things like that. | ||
And for Trump and J.D. Vance to win, there were millions and millions of people who had to change their minds. | ||
And that's what I think. | ||
It was so impressive and so hopeful that somehow there was an argument that was made. | ||
It convinced people. | ||
It worked. | ||
And we have another chance in this country. | ||
What did you see in J.D. early? | ||
Because you were basically his first backer. | ||
And I think the first real show that he did, like primetime show online, was my show. | ||
And it was because you said to me at lunch one day, you were like, why don't you chat with this guy? | ||
And you could see how rough he is in that. | ||
And now he's just become perfectly polished, I would say. | ||
Man, I always, I don't want to exaggerate my abilities in these things, but I think, for a long time, I had a keenly felt sense that there was some need to rethink Republican priorities. | ||
We needed to somehow move on from, you know, the RINO Republicanism of the Bush years. | ||
And even the zombie Reaganism of the 1980s. | ||
And there were elements of these things that could be kept, but then you really needed to update it in various ways. | ||
And I always found him to be very thoughtful, interested in trying to figure these things out. | ||
And then somehow, yeah, it all came together. | ||
Were you even surprised the way he has so stepped into this thing? | ||
I mean, to me, it started at that debate. | ||
Suddenly it was like, oh, he was ready to level up, and now you see it in these speeches he's given all over the world at this point. | ||
I don't know what the right euphemism is, but I always thought he had a high ceiling. | ||
Fair enough. | ||
Why do you think you were sort of first in? | ||
I mean, I know you're a contrarian thinker in general, but when it comes to all of these tech guys that are now part of the administration, many of whom, I mean, Elon and David Sacks, guys that you started PayPal with 20-plus years ago, you were really the outsider on all of these guys. | ||
Maybe I was out of my mind. | ||
I had no idea what I was doing in 2016. I had this naive idea in 2016: outside of Silicon Valley, roughly half the people vote for one party and half vote for the other. | ||
And so, on some level, I thought supporting Trump for president in 2016 was the least contrarian thing I'd done in my life. | ||
You know, if you're just doing what half the population is doing, there's nothing... | ||
But in that world, it's a pretty insular world, and you still live there at the time. | ||
And I thought even with respect to that world, you know, they were, most of them were in the Democratic Party. | ||
They were having these intra-democratic fights, and I wasn't... | ||
Fighting over, you know, was it going to be, you know, Hillary or Bernie Sanders or, you know, I thought, yeah, you can have intense fights there. | ||
And then, you know, on the Republican side was just this different thing. | ||
And yet there was some way where it got weaponized and it was dangerous in ways that were greater than ever before. | ||
I mean, you know, obviously there were ways people were very anti-Bush or there were sort of probably a lot of... | ||
Republicans were very deranged about Clinton in different ways. | ||
But certainly, the way it happened with Trump was, I think, quite unprecedented. | ||
I think the... | ||
And then, of course, I was in a... | ||
I was not in a sort of executive CEO role at one of these companies, where you are in sort of a... | ||
Much higher pressure position. | ||
You know, I was maybe a board member. | ||
But you were still on the board at Facebook, right? | ||
I think that is, in some ways, you know, a much less public role than if you're a senior executive or CEO at one of these companies. | ||
And there was, yeah, the felt sense of pressure on these people was extraordinary. | ||
You know, it's always sort of a question, I think, to some extent, people changed their minds and shifted. | ||
You know, I've known. | ||
I've known, you know, someone like Elon is sort of like, in some ways he's one of a kind, in some ways he's paradigmatic for a lot of these things. | ||
I've known Elon since 2000. And for the first 20 years, I wouldn't call him a conventionally liberal person, but, you know, I would think that it was a little bit more on the, you know, liberal, vaguely libertarian. | ||
You know, he was, Tesla was a flagship company, and it was always the Republicans didn't really believe in climate change, and so it was more naturally, it felt like the Democratic Party was naturally friendlier to Tesla than the Republican Party, and that's sort of, that was roughly where Elon was, and then for a variety of reasons, you know, he really shifted a lot. | ||
And then I think there were... | ||
I think there are different stories like this for a lot of people, why they shifted that are interesting. | ||
So I think my story in a way is not quite as interesting because I've had a lot of these beliefs for a longer period of time. | ||
The thing that's interesting is why did so many of these other people change their minds? | ||
So you think it's possible if you had had a leading position at one of these companies that the pressures would have been different on you? | ||
I think they would have been extraordinary. | ||
unidentified | | And there was... |
Yeah, it's always, there's always sort of a question, you know, why was Silicon Valley, why were the big tech companies so liberal, so left-wing? | ||
And one version was, well, it's because of the CEOs, sort of these woke billionaire types or communist billionaires or something weird like that. | ||
Then there's, maybe it was the snowflake millennial workforce, sort of this bottom-up pressure. | ||
But I think the factor that was much bigger than those two was this top-down governmental regulatory pressure. | ||
And if you're a big company, there's a lot of surface area. | ||
There's a lot of areas where you're possibly regulated, you're in a regulatory gray zone of sorts. | ||
Maybe it's especially true if you're doing some kind of new technology where people don't quite know. | ||
And so there was an incredible amount of pressure that came from the government. | ||
It tended to come much more from the left than the right on these companies. | ||
Yeah, it felt very, very dangerous to go too much against that, I think. | ||
So when you see an evolution, let's say someone like Zuckerberg, who, you know, you had a party during inauguration weekend, and I happened to be there, but J.D. was there, and obviously you were there, and many senators were there, and Mike Johnson was there, but Zuckerberg was there, and Sam Altman. | ||
It felt like, and Sacks, it just felt like the next version of what sort of government is going to be, and this thing we see coming together. | ||
But when you see an evolution like Zuckerberg seems to be having right now, do you think that that's more calculated, a calculated business self-preservation? | ||
Or do you see that it seems to be authentic and that he's coming around? | ||
Or do you even think it matters, I guess, would be a better question. | ||
I think it matters a lot. | ||
I think it's always hard to know exactly why. | ||
It's probably overdetermined. | ||
But yeah, I think there's a part of it that's an Elon story, where if Elon can get away with this much, maybe I can get away with a little bit less. | ||
And you're sort of, you know, what's the biking term? | ||
You're drafting behind Elon. | ||
Right, he's basically the rabbit and the dogs are all chasing. | ||
You're drafting behind Elon. | ||
I think, but then I think there's also... | ||
You know, it's always, if you do a certain policy, like wokeness or feminism or liberalism or socialism, and it doesn't work, this is a very ambiguous thing. | ||
And it can mean one of two things. | ||
It can mean you need to do more, or you should just maybe stop doing it altogether. | ||
And I think for a lot of Silicon Valley, for many years... | ||
There was sort of this intensification of the politics that happened in the 2010s and went in some ways into overdrive during COVID in 2021. And it was, well, maybe we're not doing enough. | ||
We need to pay more taxes. | ||
We need to do more DEI, more wokeness. | ||
And at some point, it dawns on people. | ||
It's just not working altogether. | ||
And then, you know, you don't shift back a little bit at that point. | ||
Maybe you shift quite a lot. | ||
And I think something like that is what has happened in so many of these places. | ||
And, you know, I have a decent number of conversations with tech CEOs and what they tell me one-on-one. | ||
And again, maybe it's very biased because they know what my perspective is, but what they tell me one-on-one is, you know, I mean, maybe they aren't fully supportive of Trump, but... | ||
They are so against the political correctness and all the ways in which this has just damaged their companies. | ||
The companies don't work. | ||
I think I know you well enough to know the answer to this, but as you watch a lot of these people come to the positions that you were staking out when they were not popular to have, just on a personal note, do you ever feel like the apology or... | ||
Just some sort of acknowledgement would be nice? | ||
Because you've been hit very hard in the media over the years, and it turns out that the culture actually did shift your way. | ||
I know nobody gets the apology or something, but sometimes there's something nice about someone else's evolution when it comes with that. | ||
You know, yes, that would be nice, but I... You don't have time! | ||
I think if someone says that they... | ||
They'd come around to agreeing with me. | ||
That's a pretty big apology right there. | ||
That's good enough. | ||
That's good enough for me. | ||
What would you say to the people that right now are freaking out, who are just like, wait a minute, you've got all of these tech bros, and they built these companies, particularly PayPal 20 years ago. | ||
They've done all these things. | ||
They own all these companies. | ||
They have so many interests all together, and that's too much concentrated power. That, you know, that argument, which we're hearing a lot of right now. Man, I don't even know what that means. Like, where are people using this power? | ||
How are they using it? | ||
I'm not making the argument. | ||
And this is, again, too psychological, but so many of these things I always think are projections. There were so many critiques the left had of President Trump. | ||
You know, it was sort of like, you know, maybe he didn't have all his marbles. | ||
And it was like, well, no, it turned out this was sort of a projection of the person who came after Trump. | ||
Right. | ||
Or there were all these ways that he was this fascist who was threatening democracy in all these ways. | ||
And, you know, I mean, man, so much of the stuff that has come out with USAID. | ||
Crazy. | ||
All these ways that, you know, this center-left establishment, you know, was doing the exact opposite of, or was doing the things it was accusing the other side of. | ||
And so, I don't know. | ||
I think there was, you know, I always, the sociology of it, I always think of our side. | ||
We're this ragtag rebel alliance. | ||
The people are, they're not natural. | ||
You know, they aren't naturally synced up in the sort of robotic way. | ||
Right, right. | ||
This alliance makes no sense in some way. | ||
It's very heterogeneous, you know. | ||
You have the, you know, autistic C-3PO policy wonk person, and you have the teenage Chewbacca person, and you have the... | ||
They're all autistic. | ||
I know they're all autistic. | ||
And that's sort of what our side looks like. | ||
And the other side... | ||
It was synced up. | ||
They were, you know, they were robotic ditto heads, you know, just stormtroopers or whatever, you know? | ||
Do you have, like, just like a fundamental aversion to that thing? | ||
Like, when you just see a whole bunch of people agreeing on something, the way that the left has just fallen into lockstep and they had no ability to argue with each other or just counter ideas, is that something that just automatically goes off in your head? | ||
Yes. | ||
It's a very subtle thing. | ||
There's obviously some point where consensus tells you that we have the truth, and then there's some point where consensus, too powerful consensus, tells you that you have something totalitarian going on. | ||
And so, yeah, if you have 51% in a democracy, you think the 51% are more right than the 49%. If you had 70%, that's even better. | ||
And if you're at 99.99% on one side, maybe you're in North Korea. | ||
And so there is some point where the wisdom of crowds gives way to the madness of crowds. | ||
And my intuition is always that this happens way sooner than you think. | ||
And again, this is very hard to quantify. | ||
But I think one of the other dynamics in this, you know, How bad was the echo chamber on both sides? | ||
And so, you know, there's a right-wing echo chamber where people talk to themselves and, you know, listen too much to themselves. | ||
And there was some critique of Twitter or X.com that, yeah, if you were on there, you saw all these other people saying things that agreed with you, and this wasn't necessarily reflective of the larger society. | ||
And then I think there was also... | ||
An incredible sort of center-left establishment, echo chamber, where, you know, you had to say that everything was great with Biden, and then overnight everything was, he had to go, and then somehow you had to move, in lockstep, you moved so quickly to Kamala Harris, you couldn't possibly have thought that decision through, and then she was, you know, she got branded as, you know, the 60-year-old. | ||
The person who should be a grandma gets rebranded as a 17-year-old teenager who's coming from an ABBA VSCO concert and is the youngest, most dynamic person you can imagine. | ||
And somehow, it was this incredibly forced, incredibly fast consensus that served them catastrophically. | ||
There were parts of it that worked. | ||
There were parts that were effective. | ||
There was part of the resistance in Trump 1.0. | ||
That was, you know, the Rachel Maddow, Anderson Cooper thing, kind of getting all the magnet files in lockstep, or whatever, you know, however you want to describe it, the filings in lockstep, and it was powerful. | ||
And this time around, it's gone very haywire. | ||
And maybe this is where, you know, the left media, it hasn't even helped the left. | ||
And I think that's not coming back. | ||
Well, the meme that they started doing after the election was, oh, we need our own Joe Rogan. | ||
And it's like, you guys had Joe Rogan. | ||
It's because you were so hysterical that you created all of us. | ||
I think he was a Bernie Sanders supporter, right? | ||
Right. | ||
So you've got to kind of admire their ability to purge people and put us all together. | ||
Or they had, I mean, I don't know, they had RFK. They had Elon. | ||
unidentified | | They had Tulsi. |
They had Elon. | ||
You know, they had all these people. | ||
Yeah, and somehow they've all come around to largely your... | ||
And I think, you know, again, I think there were very different stories for all of them in terms of where the shifts happened. | ||
But again, one of them was, at some point, the need to be in lockstep was too much. | ||
You know, if you have to have a full frontal lobotomy to remain a good Democrat in good standing, it's too much. | ||
You know, one of the other changes that's happened that's, I think... | ||
Quite, quite extraordinary. | ||
The only party, the only group that still believes in the elite universities, in a way, are the conservative Republicans. | ||
And so, if President Trump were here, he would tell you that he went to Penn, Ivy League school, only very smart people, David, go to Penn. | ||
And he'd talk about that for quite a while. | ||
And J.D. Vance went to Yale Law School. | ||
It's the top. | ||
And, you know, at the margins, you know, there are other reasons we think he's good, but this is a credential we like about him. | ||
And any sort of elite credentialing has collapsed on the left. | ||
It was, you know, Bill Clinton was a Rhodes Scholar, Yale Law. | ||
You know, Hillary Clinton was Yale Law. | ||
Obama was Harvard Law. | ||
I think Kerry was Harvard undergrad. | ||
And there was some way, you know, the Democratic Party... | ||
Used to be more elite. | ||
And then when Biden said he was a transitional candidate, it wasn't a transition. | ||
He was not going to have a sex change operation, but it was transitional from smart to dumb, from elite to very non-elite. | ||
And then Kamala was Howard, and even Howard was not Harvard. | ||
You couldn't even point this out. | ||
This is probably a racist thing to say or something. | ||
And then by the time you get to Waltz, it's way worse. | ||
The credentials are even dumber than Kamala's. | ||
There are no smart people left. | ||
And it's like a generational, it's a generational change. | ||
I think Chuck Schumer, Nancy Pelosi, they were talented people who, in some meritocratic sense, were very impressive, and are still very impressive. | ||
There's not one younger Democrat who's in that league. | ||
Gavin Newsom, Santa Clara University. | ||
Shapiro was probably too smart to put on the ticket, but it's Georgetown Law School. | ||
He was also too Jewish to put on the ticket. | ||
Well, maybe that too, but Georgetown Law School is like number 14. So even Josh Shapiro is so far below what you had with someone like... | ||
Yeah, like Clinton or Obama. | ||
And so I think—and then, you know, I don't know how much I want to read into it, but I'm tempted to say they don't want people who are smart. | ||
You know, being—the elite thing used to be, okay, you could think for yourself a little bit. | ||
You were part of this club where, you know, behind closed doors we could have a conversation and we have somewhat dissenting views and we could hash things out. | ||
I don't think any of this goes on anymore. | ||
And we just want these NPC robots. | ||
How much of that do you think is connected to Harvard as an institution? | ||
I've been to plenty of dinners at your places, and Harvard comes up an awful lot. | ||
Just what's gone on there, the rot, how it's changed, the endowment, all of these things. | ||
I don't come from that world, so I don't think I ever fully understand. | ||
I would always think, why is everyone so obsessed with what's going on at Harvard? | ||
But when you describe what you just described there, that the elites in their own way could do the behind-closed-doors thing and agree to disagree and have a conversation, yet they created a system that churned out robots, it seems that Harvard is... | ||
Is Harvard the epicenter of that? | ||
And what is that? | ||
Harvard is somehow... | ||
Harvard is probably... | ||
A number of these schools, there's some way where Harvard was the first among equals, and, you know, there was part of it that was an exclave of Washington, D.C. You have the Kennedy School of Government, and the Harvard Economics Department used to have a tremendous say in economic policy. | ||
And so, yeah, and then there was, you know, there's probably some Harvard, Massachusetts thing with, you know, I don't know, Kennedy, McGovern. | ||
I think there were only two Democrat presidents. | ||
Carter pretended to be allied with the Northeast establishment, and then he sort of double-crossed them halfway through. | ||
And Clinton, I mean, he was Yale, but he somehow managed to outflank the Harvard people. | ||
But maybe almost going back to JFK, it's been an incredible part of the power structure. | ||
And then I think in a strange way, it's gone. | ||
I think if you were a liberal student, let's say a liberal white male at Harvard, there's no longer any... | ||
Maybe there's still some way you can get on a track towards a good job on Wall Street or in Silicon Valley, but it no longer seems to track into the center-left establishment. | ||
When Allan Bloom spoke at Harvard in the late 1980s, the sort of obnoxious way he started his speech was, "my fellow elitists," because it was like everybody at Harvard was really elitist, but they pretended not to be. | ||
Today, you couldn't even, you know, they all say, no, we're all so egalitarian, and then if you're a Harvard student, you shouldn't expect anything more from that, and you should, you know. | ||
You're not special because you're here. | ||
And so somehow, yeah, you had this egalitarian rhetoric for a long time, but it's been, it's somehow, the form is the same, but the substance has changed. | ||
So as a guy that has encouraged an awful lot of young people to not go to college, and I'm very happy to say that virtually everyone I've hired is now a college dropout. | ||
It's become a running joke around here. | ||
I only hire college dropouts, and it works every time. | ||
I mean, do you take some sort of like... | ||
You know, I'm never sure quite how quickly it reverses, but I think there has been a very big decline, and it's very hard to track. | ||
Harvard's probably still, in some sense, the first among equals. | ||
They still have a $50 billion endowment. | ||
They can keep going for a very long time. | ||
But yeah, these things can go bad without people being able to correct it or notice it. | ||
I always think that cities are very different from companies. | ||
Companies normally have a finite lifespan, and there's a limit to how bad the politics in a company can get, because if it gets too messed up, the company just goes bankrupt, you go out of business. | ||
In a city, it's different. The average city that was ever founded still exists in the world today. | ||
And your average city is an immortal being. | ||
It lasts forever. | ||
And at some point you can mess it up so badly, it goes broke. | ||
Or it gets destroyed. | ||
Carthage gets destroyed. | ||
Maybe Detroit is past the point of no return. | ||
But the network effects in cities are so powerful that there are a lot of these in-between cities where... | ||
They can be extremely bad, extremely dysfunctional, and can persist for a long time. | ||
And I think if you modeled it, universities are actually closer to cities than companies. | ||
The top universities in the country were the ones that have been around for close to 400 years at this point. | ||
Harvard, Yale, Stanford was the big top university west of the Mississippi. | ||
So there's something about them that's very durable. | ||
But if an immortal being goes bad, it can go very bad for a long time. | ||
Does that tell you that we need institutions in some sense, even if they have largely been rotted out? | ||
That's something about the continuum from generations. | ||
Like we sort of need these ideas, even if they're not really what we think they are. | ||
I don't know exactly what it says. | ||
It tells you that there's not necessarily a straightforward mechanism for replacing Harvard. | ||
It also tells you that maybe Harvard can be a lot worse than it looks based on its endowment or the fact that it's still attracting all these talented people. | ||
It can tell you that it's extremely hard to reform internally. | ||
Maybe it tells you it's important to keep trying. | ||
I don't disagree with conservatives who try to influence these things and try to figure out a way to do it. | ||
I'm probably a little bit black-pilled on doing it myself. | ||
It does tell you it's probably quite hard to come up with alternative institutions. | ||
But then at some point... | ||
You have these occasional points where you have these phase shifts and people realize they're working so badly. | ||
Maybe the Democratic Party has realized that nothing's coming out of Harvard. | ||
And so, you know, we shouldn't particularly privilege these people. | ||
They're not learning anything. | ||
Or they're learning all the wrong things. | ||
I think they're learning very little. | ||
I did this function at Yale. | ||
I spoke at the Yale Political Union last September. | ||
You talk to a bunch of the students, and it's like, every class, there's not a single class that's as hard as any class they had in high school. | ||
Wow. | ||
It's just, again, this is maybe more on the humanities side, but it's unbelievably easy. | ||
You get an A in every class. | ||
Did you feel that? | ||
You were teaching at Stanford, weren't you? | ||
I think there's all these complicated things, but all these ways the motivational structure isn't great. | ||
People, you know, it's not clear where it leads. | ||
It's... | ||
There's all these ways these institutions have gotten pretty bad. | ||
They've persisted for a long time. | ||
Maybe they'll continue doing so, maybe they won't. | ||
What is it? | ||
The Private Truths, Public Lies book, by Timur Kuran, I believe. | ||
And the thesis is that you have a revolution when, not when everybody knows that the government's lying, but it's when everybody knows that everybody knows the government's lying. | ||
And then all of a sudden, at one point, it's like the emperor has no clothes and everybody can see the emperor has no clothes. | ||
Everybody knows that everyone knows the emperor has no clothes. | ||
And then things can happen. | ||
And so, maybe that's... | ||
Maybe we are at that point with these universities where they were kind of off. | ||
A lot of people kind of knew it. | ||
We're now at a point where just about everybody knows that everybody knows. | ||
It's similar to what... | ||
What's going on with DOGE and the government employees? | ||
Well, that's exactly what I was going to say. | ||
So that must be, I mean, there's a line that you said to me years ago that I say on this show probably once a week, which is that you said, I wouldn't be a libertarian if any of it worked. | ||
And to me, that distills it down perfectly, that if the system, whatever the system is, if it roughly worked, then you wouldn't need to be a libertarian because you'd say, okay, I can give in to this thing because it kind of works. | ||
And there were ways it used to work better than it does now. | ||
I mean, public schools in the U.S. were pretty high-functioning in the 1950s and 1960s, and they've declined massively. | ||
And then, yeah, probably, you know, it's hard to imagine this, but there was a way that, in the 1930s when the New Dealers took over, the government was meritocratic, and you had the best and brightest people working in it. It was still very talented in the '50s and early '60s. | ||
NASA had the most talented rocket scientists in the U.S. and the whole world. | ||
And you could do things. | ||
And so, yeah, there's a way that there's been a real decline in these institutions. | ||
And this is where I always think of libertarianism, classical liberalism. | ||
They're not necessarily timeless and eternal truths, but I think they are more true now than they were 60 years ago. | ||
That's interesting. | ||
So do you think that if DOGE keeps uncovering stuff and we keep cutting the waste and finding the fraud? | ||
Sure. | ||
There's more justification for a much smaller government. | ||
There's much less of a need for a big government than in the past. | ||
In all sorts of ways. | ||
I don't know. | ||
There's a... | ||
You know, one of the parts of academia that I've always thought is as rotten as the humanities is the sciences. | ||
And, you know, the thought experiment I always give is, you know, if you think about the government, you have the DMV, the post office, the NSA. What's the most messed up government agency? | ||
It's obviously the NSA because we have no idea what's going on there. | ||
Whereas, you know, the post office and the DMV, you go in, you can see no one's working. | ||
You can't get a pen, but you have a sense of what they're doing. | ||
So there's actually a little bit of accountability when it's so transparently incompetent. | ||
And then I think the humanities, you can tell that it's obviously woke ridiculousness. | ||
No one's reading books anymore or anything. | ||
The science is very, very hard to know. | ||
We had a tale of these two university presidents that got fired in the last year. | ||
There was the Harvard president, Claudine Gay, who was sort of the plagiarizing DEI woman. | ||
But then you had the Stanford president, Marc Tessier-Lavigne, where it was basically fraudulent dementia research. | ||
He was a neurobiologist who probably took tens of millions of dollars while engaged in fraudulent research. | ||
And for the conservatives, it's much easier for us to critique Claudine Gay than Marc Tessier-Lavigne. | ||
And that makes me think that maybe there's a complementarity between the rot in the humanities and the rot in the sciences, but probably the rot in the sciences is even worse. | ||
And at some point, we figure out we shouldn't be funding this anymore. | ||
It is. | ||
And it's really unstable. | ||
It can change a lot. | ||
In some sense, do you think that this moment that we're at, where we now are resetting some of this, and DOGE is coming in, and we're at this incredible moment with AI, and I want to talk about Palantir a little bit and all these things, do you think there was, sort of, no other way that history could have gone than to have gotten to this point? | ||
That things weren't going to work for a while, as you just described, and scientists would want to go to NASA, and we needed a state. | ||
And then over time, the bloat just got bigger and bigger, and there was really no way around that. | ||
It's sort of a function of our success, and then it led us to the strange moment we're in. | ||
Yeah, I had a feeling you were going to say that. | ||
Yeah, it feels like there's some natural entropy where, you know, these organizations become more bloated and bureaucratic over time, and there's some sort of entropic drift like that. | ||
I don't like being too determinist. | ||
I always think there's room for human agency. | ||
I don't think it had to happen this way. | ||
But, yeah, given that that's what has happened, that there has been an extraordinary decline, you know, the natural thing at this point is probably to downsize things significantly, to rationalize things, and maybe there's an opening for that happening much more than there was even nine years ago. | ||
I don't know. | ||
The crazy thing I kind of wonder is... | ||
Is whether this would have even been possible four years ago. If Trump had gotten re-elected in 2020, I somehow think we weren't quite ready to do this. | ||
And maybe, I don't know, this is too pessimistic a thing to say, but maybe we needed the four Biden years. | ||
No, I don't even think that's pessimistic. | ||
I think that's realist. | ||
They were educational, and it was just, you know, it was an ancien regime with a, you know, way past, with a... | ||
And then some of Biden encapsulated this ancien regime that was way, way past its sell-by date, and it took a while to educate enough people about that. | ||
It's interesting. | ||
Why do you think that would be a pessimist way of viewing it? | ||
Because you'd like to think we could just... | ||
I was hoping we could do all this stuff in 2017. So you saw the chance to fix these things, and that was sort of your calculation. | ||
Sure, sure. | ||
I thought there was a lot that was wrong in this country. | ||
I always thought Make America Great Again was the most pessimistic slogan a Republican had said. | ||
It was, we are no longer a great country. | ||
And then finally, President Trump was going to be this forcing function where we'd be able to talk about all these problems. | ||
And by and large, I don't think people were ready for this in 2017. It was much more shooting the messenger than listening to this very disturbing message that he had. | ||
Yeah, that is why I'm sympathetic to the determinist argument, because it sort of feels like he had to lose to get us here. | ||
I think... | ||
Again, I don't think one should be Pollyannish about everything getting fixed, but I think there's an opening now that did not exist eight years ago. | ||
It's up to the people in the Trump administration, it's up to people like you to keep pushing. | ||
It's going to still take a lot of human agency. | ||
To get this country into a better place. | ||
Back on track. | ||
You once said to me that you would never bet against Elon Musk. | ||
As you see him get involved in this thing, having started a company with him, and now so many other guys that you guys have worked with are part of this operation. | ||
I mean, do you see this as sort of like the best case scenario, what's happening right now? | ||
That there are these great people involved and that they really are doing the things that... | ||
That we need to be done, and it's not just with the little scalpel, but it is with the sledgehammer, and that's probably the only way? | ||
Yeah, I don't know. | ||
Look, I mean, it's so hard to judge this. | ||
I would never bet against Elon. | ||
I think he will, you know... | ||
We shouldn't be too determinist the other way, though. | ||
We shouldn't just assume that Elon will solve all our problems for us, and Elon will be much more likely to succeed if all of us do our share in trying to articulate this. | ||
We don't just count on him to do everything. | ||
Right. | ||
Have you gotten a call to do something since your whole crew has gotten a call? | ||
I've tried to encourage them to bring on board as many of the good people I know as possible. | ||
I feel, again, I'm... | ||
So many more people in this administration that I think are really first rate. | ||
I think it's off to a much stronger start than eight years ago. | ||
What do you make of the collapse of the gatekeepers? | ||
Because that seems to be the other story here. | ||
There's sort of the obvious story in that we're fixing government and it's becoming more transparent. | ||
But that's all to the backdrop of nobody is buying the BS that we've talked about here anymore and that cable news is in the tank and people are watching this instead of that and all of those things. | ||
Are you concerned that with no gatekeepers, where does that actually spin us off into? | ||
Do you have any concerns around any of that? | ||
That we won't agree on anything at some point? | ||
Yes, but it's always... | ||
It's always what's the problem. | ||
I don't know if we frame it morally, is the problem that we have too much relativism or too much totalitarianism. | ||
And I think the problem was way more in the totalitarian side than the relativist side. | ||
In the philosophy of science, I always think you can think of science as fighting a two-front war against excessive dogmatism and excessive skepticism. | ||
And so, if you're a good scientist, you can't be too dogmatic. | ||
A lot of early modern science was against the excess dogmatism of the Catholic Church and the decayed Aristotelianism, all this stuff. | ||
That's what 17th, 18th century science fought against excessive dogmatism. | ||
You had to think for yourself. | ||
You couldn't just be a dogmatic scientist. | ||
But you also can't be too skeptical if you're a scientist. | ||
Don't believe my senses. | ||
I don't think you're sitting in front of me. | ||
I might be in a simulation or who knows? | ||
I don't know. | ||
Everything might be an illusion. | ||
That's not a good attitude for science. | ||
So there's some balance between anti-dogmatic and anti-skeptical. | ||
You have to fight both. | ||
And so, yes, on an abstract level, you're right that if you are just anti-dogmatic. | ||
Which is, you know, the gatekeepers preserve the dogmas, and we're just fighting the gatekeepers, and if we were pure anti-dogmatic, that wouldn't be healthy, because then we're too skeptical. | ||
And there needs to be some balance. | ||
But my intuition is, you know, the problem in our society is not too much skepticism. | ||
And too little, you know, if you ask, where's that balance between too much dogmatism, too much skepticism? | ||
Or dogmatism is like totalitarian, skepticism is like nihilistic relativism, and we're way too close to the excess dogmatic side. | ||
And so, for example, to use the science example of this, you know, there are all sorts of places where science today fights skepticism. | ||
You're not supposed to be a climate skeptic. | ||
You're not supposed to be a vaccine skeptic. | ||
You're not supposed to be a stem cell skeptic. | ||
You're not supposed to be a Darwin skeptic. | ||
So all sorts of places where if you're too skeptical, you can't be a scientist. | ||
They're fighting skepticism all the time. | ||
I don't think if you asked any scientist, they could even name a thing where science is too dogmatic. | ||
And so if they're always fighting skepticism... | ||
They can't even mention one place where science is too dogmatic. | ||
That's telling you, it's incredibly dogmatic. | ||
And we need to be a little bit more skeptical. | ||
unidentified | | Right. |
That seems to be shifting now, at least. | ||
I think it has shifted some, but this is where, man, the gatekeepers were wrong, and they shouldn't be believed. | ||
And yeah, look, there probably are some point where people believe in too many conspiracy theories, they believe in too many crazy things. | ||
But directionally, people believe in way too few conspiracy theories 12 years ago. | ||
Way too few. | ||
Right, where now we could veer into everyone believing everything, and that creates another problem, but basically— I think we're still not even close to that. | ||
So that's interesting. | ||
So you're basically, you're okay with sort of that pendulum swinging as long as... | ||
I'm okay. | ||
As long as we don't end up in the multiverse while it swings. | ||
You know, I'm okay with Joe Rogan and the UFOs and ancient civilizations. | ||
And it is, it's, you know, I don't have to agree with it, but I think it's a very healthy corrective to the zombie dogmatic, you know, establishment that we had that just blocked all questions and blocked so much stuff that made sense. | ||
It's like, I don't know, it was wrong on so many things. | ||
And again, the COVID thing was like this watershed moment. | ||
And if they were so wrong about COVID, so wrong about, it was, it was the food market or the Wuhan lab. | ||
And you couldn't ask questions. | ||
And they didn't just block things that were wrong. | ||
They blocked a lot of things that were, I think, correct. | ||
Yeah. | ||
And then, you know. | ||
Why shouldn't we also rethink the climate science narrative and all these other things? | ||
Do you see climate as the big next one that people are going to really start challenging more outwardly? | ||
Because that seems like the major one that we still aren't quite addressing, although there's a few more people online willing to— I think there are a lot of versions. | ||
The big meta-narrative on science, the big meta-narrative, I would say, is people claim that there's a lot of science going on and that science is very healthy. | ||
And my skeptical view of science in general is I think most of these people are not scientists. | ||
They're not doing much work. | ||
They're not making any interesting new discoveries. | ||
They're just politicians getting money from the government. | ||
One of the people I know is this guy, Bob Laughlin, who's a physics professor at Stanford. | ||
He got a Nobel Prize in physics in 1998. He suffered from the extreme delusion that once he got a Nobel Prize in physics, he'd be free to look at anything he wanted to. | ||
And the area of science that he went after was he was convinced that most of the scientists, even at a place like Stanford, weren't really doing very much work, weren't doing very much science, were stealing money from taxpayers. | ||
And they weren't too thrilled to hear that? | ||
And you can imagine, this was a more taboo question. | ||
Yeah. | ||
More taboo topic than just going narrowly after climate science or, you know, or any of these things. | ||
And obviously he got promptly defunded and his grad students couldn't get PhDs anymore. | ||
And then my, you know, my sort of hermeneutic of suspicion is that if there's a topic you can't discuss, there's ideas you aren't allowed to articulate, my shortcut is they're just true. | ||
You just know they're true. | ||
So if you're not allowed to articulate this idea. | ||
That the universities haven't been doing much science. | ||
That the fraud that happened at Stanford, where the president wasted tens of millions of dollars on fraudulent dementia research, that is par for the course. | ||
That's what most of these people have been doing. | ||
This is why our society is stuck. | ||
And the fact that you're not allowed to articulate that, that suggests, to me, as a shortcut, it's probably directionally very true, and there's a lot of room to revisit this. | ||
Obviously, there are some narrow fields like climate science or COVID science that weren't so narrow that are important, but I would generalize it to all of science. | ||
It's string theory in physics. | ||
It's cancer research where they promised us they'd cure it in a few years for the last 50 years. | ||
It's always around the corner. | ||
So what is that? | ||
Do you think it's because there's so many people that are just kind of grifters or in over their head, or is it because most of us have no knowledge of any of this stuff? | ||
So if I was to sit down with the leading string theory expert, I would have no idea what they're talking about, and I would have to nod. | ||
Yes, I think there is a hyper-specialization of late modernity, that we have ever narrower groups of experts telling us how great and wonderful they are. | ||
There's something about that that makes it very hard to evaluate the experts. | ||
And then there's some layer, the political intuition I have is we should actually be very skeptical of what they're saying. | ||
Because the fact that it's so hard to evaluate the string theorists means we should be especially skeptical of it. | ||
And this is sort of a, yeah, so there's this hyper-specialization feature of late modernity. | ||
There are probably ways that the peer review process... | ||
You know, was used for so much of the evaluation of what got published and what got funded. | ||
And this was, you know, this was not necessarily a process that worked that well. | ||
It led to consensus. | ||
And then if the consensus was wrong, or maybe you just funded very boring things, things were very incremental. | ||
And so, yeah, I think there are maybe a lot of different reasons that it's sort of, you know, developed in this way. | ||
But yeah, the sweeping claim I would make is we should be skeptical of the scientific enterprise in its totality. | ||
Almost all of it is way exaggerated. | ||
And this would be the really tough critique of universities. | ||
It's sort of a debating point. | ||
I think there always are two different ways you can go after an opponent in an argument or something. | ||
You go after the opponent. | ||
Or in a military campaign or whatever, you go after your opponent at the weakest point, you're most likely to score at least, you know, a tactical win. | ||
You know, get a point. | ||
If you go after your opponent at the opponent's strongest point and you're able to win, it's hard. | ||
But if you're able to go beat them at their strongest point, that's game, set, match. | ||
And the university's weakest points in some ways are, you know... | ||
Let's say, all the nonsense in the humanities. | ||
But then, you know, what they tell their donors is, well, you know, the humanities don't really matter. | ||
What's really going on is the sciences. | ||
And so if we can show, let's say, I don't know, I'm not sure physics and string theory are the pinnacle, but if you could show that that's been a cul-de-sac where there's been no progress for 40 or 50 years, then maybe by transitivity, you know, if... | ||
The string theorists are the smartest physicists, and the physicists are the best scientists, and if they haven't done anything, maybe we should assume nobody's done very much, and there is again this wholesale revaluation that needs to happen. | ||
It's part of it also that even if people found out that some of that was complete nonsense and you really poked a hole in some of their, you know, most precious things, that a certain amount of this is just like, it's just sort of social behavior that people are just like, oh, we donate to the same things. | ||
We circulate in the same things. | ||
So, like, everyone's willing to kind of look away at the bad stuff because that's what their whole sort of world is about in some sense. | ||
Yeah. | ||
unidentified | | Thank you. |
Yeah, so I... Probably, coming back to the critique of the universities, if we critique them, I think you should try to critique them both at their weakest point and their strongest point, but we shouldn't let them... | ||
Just hide behind the science. | ||
What's that silly 80s song? | ||
You know, "She Blinded Me with Science." | ||
Right, right, right, exactly. | ||
They blinded you with science. | ||
Yeah, which that does not quite work anymore, I suppose. | ||
So you must be thrilled about Bobby, I assume, because he's basically coming in with that skepticism. | ||
I mean, that seems to be the driving point behind the entire movement. | ||
Yes, and again, there's a critique of Bobby. | ||
He's too skeptical, and how can you be against vaccines? | ||
He's categorically against them. | ||
But yes, my directional bias is we've had way too much of this totalitarian, top-down dogmatism. | ||
It's more corrupt than the Catholic Church was four or five hundred years ago. | ||
It's more dogmatic, you know? | ||
I think there were more divisions. | ||
There was more natural debate. | ||
You know, I don't know, the Dominicans and the Franciscans and the Jesuits, they weren't all the same. | ||
And there was natural heterogeneity within the Catholic Church. | ||
And this is... | ||
This is way more everybody is in lockstep. | ||
No interesting differences. | ||
No real debates. | ||
Let me ask you about Palantir, because it's come up a couple times on the show. | ||
We've played a bunch of videos of Alex Karp, who I think is one of the great thinkers these days. | ||
And also, he looks like the guy from Ready Player One who created the virtual world. | ||
But I had Joe Lonsdale sitting in that seat just a couple weeks ago, and we talked a bit about Palantir. | ||
And he smiled when I asked him, but I feel like everybody asked the same question. | ||
What? | ||
Is Palantir. | ||
That was the same smile he gave me. | ||
Well, I'm always tempted to do the literary answer from Lord of the Rings. | ||
It's this seeing stone which is very powerful. | ||
It helps you understand the world better. | ||
It was originally created by the elves. | ||
It was meant to be used for good purposes. | ||
It is potentially a very dangerous technology. | ||
It's very powerful. | ||
It ends up playing a very important role in the way the series goes. | ||
We could do the entire Lord of the Rings thing. | ||
I know you love the references. | ||
And basically, it ultimately gets used for good, even in the Lord of the Rings, where Aragorn shows Sauron the sword, Anduril, the sword that was reforged with which the One Ring was cut off Sauron's finger in the Palantir, and he retakes the Palantir for good. | ||
And then Sauron is fooled into launching a premature attack. | ||
And then that's how the hobbits get to Mount Doom in Mordor and destroy the One Ring. | ||
And so, yeah, there's sort of all these—so the Palantir ultimately does get used for good. | ||
It was—you know, the genesis of the company, we founded it over 20 years ago, was— Was to try to find some way to solve the terrorism intelligence community problem. | ||
And if you define technology as doing more with less, we needed to have more security with fewer violations of civil liberties, fewer intrusions of privacy. | ||
And I felt that the... | ||
The low-tech debate version of this was sort of a Luddite ACLU versus, you know, a heavy-handed, you know, dumb airport security where, you know, one silly shoe bomber makes everyone take their shoes off for two decades after that. | ||
And that was, you know, that's sort of, that was the initial goal. | ||
And then there's, yeah, there are all these ways that you want to sift through data to try to have a, you know, and, you know, the libertarian idea was we could not afford to have more terrorist attacks. | ||
Because if you have a terrorist attack and you don't have a high-tech solution, you will always get the heavy-handed, low-tech stuff that we got after 9-11 with the Patriot Act and all those things. | ||
And then, of course, there is some software layer to... | ||
Not just for the intelligence community, but for the military, for the defense capabilities generally, where we need to be able to coordinate resources and coordinate all these things much better. | ||
I don't want to propagandize this too much. | ||
I'm always bad at doing this, but I think Palantir is sort of a... | ||
Is the right way to do this, is the best solution. | ||
It's extremely hard to start defense-related tech companies. | ||
It's, again, this very locked-in, frozen, hard-to-change system. | ||
That's the glass-half-empty version. | ||
The glass-half-full version is, you know, there's also some room for something that can be massively improved, and at some point that matters. | ||
I don't know how much you're doing with the day-to-day there, but I know you love the philosophical debates behind that between privacy and liberty and all of those things. | ||
Is there that constant debate between, okay, we have to work with these governments and sort this information, but we don't want to step on these rights, and probably it's different in every country, so there must be just like an endless amount of just purely philosophic debate beyond just the tech stack that you guys are building to help countries make sure that, you know, things aren't blowing up, basically. | ||
Yes, but I also, I think there are, you know, I do think there are a lot of debates we have. | ||
At the end of the day, we are very pro the Western world. | ||
We want to work with Western, Western-allied governments. | ||
We believe that, in some sense, our side is still better than the other side on these things. | ||
And then I think there are a lot of ways that... | ||
That something like Palantir ends up being also constrained on government action, where if, you know, a lot of the abuses, a lot of the most extreme sort of overreach happened in contexts where people thought there would be no accountability ever inside the government. | ||
So, you know, if, and this was, you know, probably like... | ||
Something like the Guantanamo stuff in Bush 43. Nobody knew what was going on. | ||
You know, if nobody knows, it can go really crazy. | ||
And then, of course, at some point, people know, and the inmates take over the asylum. | ||
And it sort of went probably too much the other way. | ||
But there probably were a lot of deep state excesses. | ||
That happened where if Palantir had been used, there would have been more of a record of what people were doing. | ||
You know, the NSA has not worked with Palantir. | ||
And all these different suspicions, you know, one version is just they had a not invented here bias. | ||
But I think another version was there were probably a lot of things at NSA where it was too entangled. | ||
The FISA courts were way out of control. | ||
And if you had this outside software system where you actually tracked how often this stuff was used, there'd be much more risk this would have gotten exposed and would have been a track record, much tighter track record of these things, and it actually would have limited it. | ||
So, you know, we somehow got blocked at NSA very early on. | ||
And I tend to think that the explanation Alex and I are sympathetic to was that the people at NSA believed that whatever power the software gave them, it also limited their power because there would be more accountability. | ||
unidentified | | Right. |
It would basically sort of end up with a USAID situation on them, in some sense. | ||
At some point, you have a line-item tracker, you know exactly where the money's going. | ||
Like, the USAID thing worked because, I believe you had like a level five, there's a, I think the technical government thing on USAID is you have a level five description, which is the line-item description. | ||
And then there was a level four description, which is how you sort of group these things together. | ||
And I believe the level four was what was reported to the Trump political people. | ||
And then the level five and level four were wildly divergent. | ||
By the way, I believe something like this is what Oliver North was charged with in the 1980s, was that he engaged in fraud because he had mischaracterized various payments. | ||
And so I think so many of these, if we want to be aggressive, so many of the people working at USAID... This is the sort of thing that was far more possible in this sort of paper pencil. | ||
non-transparent world. | ||
What do you think the healthiest thing is to happen to some of these people? | ||
Like, as we uncover the fraud, you know, I'm not a huge fan of just starting to arrest people and people get caught up in things, and that's not a defense of anyone. | ||
And clearly there's been criminal abuse and fraud and all those things. | ||
But I think once we start arresting people, it puts us in sort of this cycle where we'll just all do it to each other. | ||
What's your general philosophy around that? | ||
I think the somewhat obnoxious suggestion I had was that we need something like a Truth and Reconciliation Commission, which was set up in post-apartheid South Africa, where, you know, as long as we got to the truth and we figured out what had happened, there can be reconciliation. | ||
But the first step is we need to come clean, we need to have some accountability for... | ||
I don't want us to have lots of arrests. | ||
I don't want lots of people to just get prosecuted. | ||
But I do think we need to have a lot more transparency into what exactly was going on in the sausage-making factory. | ||
And my suspicion is that that sort of transparency will very much discourage a repeat of this behavior. | ||
I think the FISA process was completely out of control. | ||
The Russia conspiracy theories 2016-2017, it was completely insane, contrived, in some ways extremely malicious, and we should come clean and we should publicize every single FISA investigation that was made. | ||
And we should discuss who are the people who drove these investigations. | ||
And then again, I'm not sure they should go to jail or get fired, but it should be at least part of their record in their career at these departments. | ||
unidentified | | Right. |
I don't know if you saw it. | ||
And I think there's the part where I'm sort of hopeful. | ||
I think we're already at a point where the FISA process is way less out of control than it was eight, nine years ago because people are at FBI and NSA, they are more hesitant to do this because they kind of worry there might be accountability at this point. | ||
But I also suspect it was really out of control in the recent past. | ||
Identity politics calls us always to... | ||
Relitigate the ancient past, you know, the sins of our forefathers. | ||
And what I think the Trump administration should be investigating is not the distant past, but the recent past. | ||
You know, it's more important to look into COVID-19 and Fauci and, you know, all these people than into 1619. And so what my suspicion is... | ||
That there was a lot of abuse. | ||
There was a lot of crazy stuff that happened because, and this again is sort of an obnoxious comparison, I think there was something about the Biden thing that was crazier than apartheid South Africa. | ||
Because apartheid South Africa by the 1980s, they knew they were on the wrong side of history. | ||
They knew their days were numbered. | ||
The Biden people, the Obama-Biden people, they were so delusional. | ||
They thought they were on the right side of history. | ||
And, you know, they were going to be the winners. | ||
They would always get to rewrite the rules. | ||
It was like, again, it was like the Ancien Regime, pre-revolutionary France. | ||
They thought it would go on forever. | ||
And so if you're in a world where you think you're on the winning side, you will always be able to get to rewrite the rules. | ||
You know, on one analysis, how much will you push the envelope? | ||
How much will you bend these rules? | ||
And so I think there was a lot. | ||
A lot. | ||
And there's a lot of strings that can be pulled, and if we pull enough of them, the fabric of the ancien regime of liberalism will completely unravel. | ||
It does tell you something beautiful about our mechanism to reset to truth, right? | ||
That they couldn't do it this time. | ||
I mean, the day before the election, Obama was on stage doing the very fine people hoax, and nobody bought it. | ||
And it was like... | ||
Man, the gall of this guy to put that lie out there when it's been debunked a jillion times. | ||
So remind me, what was that? | ||
The very fine people, you know, the Charlottesville, that there were very fine people on both sides, meaning the white supremacists versus the others. | ||
And it's like, wow, you really thought you could still do it. | ||
But enough of us have had the reset that they just couldn't do it anymore. | ||
Well, it's... | ||
Yeah, there are a lot of variations on this. | ||
One, you know, again... | ||
One hopeful thought I've had is that maybe identity politics was simply something that was a pre-internet phenomenon politically. | ||
Because identity politics, or the politics of identity politics, works by telling different people different messages, you can micro-target them, and nobody notices it. | ||
And I believe the last... | ||
The presidential election that was truly a pre-internet election was 2008 when Obama got elected president. | ||
And he could tell different things to different people. | ||
It was like Mark Penn was sort of the micro-targeting Clinton 90s person. | ||
But the basic version for Obama in 2008 was you told black people to vote for me because I'm black. | ||
And he told white people, vote for me because I'm post-racial. | ||
And by the time you get to 2016 with Hillary Clinton, it doesn't work anymore. | ||
You can't tell, you know, women to vote for you because you're a woman and men to vote for you because you're post-gender. | ||
Right. | ||
And then, you know, by 2024, I don't even think Kamala was nearly as bad as Hillary or Barack Obama were in, you know. | ||
8 and 16 years earlier. | ||
But even the smallest things where she had the fake accent and changed a little bit from one setting to another, it doesn't work anymore. | ||
And so, yeah, there's sort of a... | ||
If you say intersectionality, I don't know, maybe 7% of the population are black women. | ||
And so... | ||
It means that the black women are supposed to vote for you. | ||
And does that mean the other 93% of the people are going to move to the back of the bus or are going to get off the bus and should take a hike? | ||
Right. | ||
You go to them and you say black women or you'll say, they'll literally be like, black lesbians are the backbone of the economy. | ||
And then other people hear that and they're like, okay. | ||
So that micro-targeting basically because the internet has exposed everything. | ||
Yeah, people... | ||
In 2020, it seems delusional what they were thinking, but some of the intersectionality was, it was the union of blacks and women and all these people. | ||
It didn't work. | ||
You know, on some level, I think the Democrats vaguely intuited it because they went with Joe Biden in 2020, who was the old straight white male, even older than Mr. Trump. | ||
And so it was like this. | ||
Retro, non-DEI person. | ||
And then the promise was he was going to be the last one ever. | ||
And then after this, we'd have a diverse person. | ||
And in the abstract, yeah, there are more people who are not old straight white men than who are. | ||
But if you don't have an abstract, diverse person, it's always, you know, is it a white woman? | ||
Is it a black trans? | ||
unidentified | | You know, woman, girl, whatever. |
And in practice, you know, it's always too specific. | ||
And maybe, yeah, maybe they already knew in 2020 that it was somehow not quite going to work. | ||
And maybe that's why they were stuck with Biden for so long, because they knew that the post-Biden had to be a specific, diverse person, and then that would offend more people than it would include. | ||
It would exclude more than it would include. | ||
Again, it was in this internet world where it's transparent enough. | ||
Right, they set their own trapdoor, in essence. | ||
Let me ask you one other thing, and then we can continue this over dinner separately. | ||
It's interesting, because it seems to me that you're quite white-pilled at the moment, in some sense. | ||
Even when you're talking to me now, there's a smile, you seem very hopeful, and you usually are the contrarian. | ||
This is, I suppose, a big thing to end with, but we haven't really talked about AI. I know you're a sci-fi guy. | ||
I'm a big sci-fi guy. | ||
In all the sci-fi stories, we're about to set the biggest trap ever for ourselves, and humanity is about to basically hand over all of its individuality to the machine, whether it's The Matrix or Terminator or whatever. | ||
Are you... | ||
Where are you? | ||
I know you're not overly deterministic, but just sort of kind of blank slate. | ||
As we enter this phase of AI, and as we're about to cross this horizon, do you feel that we will be able to do it in a way that will allow humans to flourish for the most part? | ||
Or are you more worried about the other part? | ||
Or is it something that you're just going to kind of punt because you're not overly deterministic? | ||
Let me start by saying... | ||
Maybe I'm relatively more white-pilled than I have been at various points in the past. | ||
I still think there's an enormous array of incredible, messy problems. | ||
We didn't talk about all the foreign policy crises in Ukraine and Iran. | ||
Even bigger, the Taiwan crisis that's probably going to come to a head at some point in the next few years. | ||
There are all sorts of ways. | ||
You know, it's going to be very hard for the Trump administration to square the circle economically. | ||
You have to somehow figure out ways to extend the tax cuts, but we can't really borrow that much more money. | ||
The deficits are out of control. | ||
It's not that easy to cut the spending. | ||
You're not supposed to hike the taxes. | ||
You can't keep borrowing money. | ||
So there are sort of a number of, you know, extraordinarily difficult challenges. | ||
I like how you just spit the white pill out right there. | ||
unidentified | That's a little bit of a qualifier here. | |
I think the way I always think of the AI problem, or the existential risk problem, is that we have all these different existential risks, and there's nuclear war, there are bioweapons, there are killer robots, and all sorts of ways the AI technology can be potentially quite dangerous and quite scary. | ||
And I've always thought that, you know, among the other existential risks there's maybe climate change, various environmental forms of risk. | ||
And a part of me thinks these are real problems. | ||
We need to find a way to talk about them. | ||
You know, a lot of the people who are apocalyptic are probably not apocalyptic enough. | ||
Greta's only worried about climate change. | ||
She's not worried enough about AI. Right. | ||
And, you know, nuclear war and all these other ones. | ||
So, maybe she'll just get all these... | ||
She's mostly worried about Gaza now. | ||
Yeah. | ||
So, she's moved from the planet to the Palestinians. | ||
unidentified | Yeah. | |
Really jumped the shark or whatever. | ||
I don't know what the right metaphor is. | ||
unidentified | Maybe. | |
Yeah. | ||
Passed her sell-by date or something. | ||
But I think one of the risks that's also very big is that we don't get science and tech to work, that we don't have some sense of progress, because there are all these ways our society works by progressing. | ||
In the U.S., I define the middle class as the people who expect their kids to do better than themselves. | ||
And this generational compact, it somehow broke down between the boomers and the millennials. | ||
And, you know, we have this society where the younger people are in many ways not expected to do as well as their parents. | ||
And so this sort of no-growth, zero-growth stagnation in this society is going to derange our institutions. | ||
It's not going to work. | ||
Any default where there's no innovation and there's no progress, that's not idyllic. | ||
You know, if everyone's riding their bicycle a la Greta, you're probably in North Korea. | ||
And so I think the way I would articulate the worry that I have about AI, yes, I have some worries about the technology, and you don't want to downplay them. | ||
I have even more worries about going from the frying pan into the fire of, you know, worldwide totalitarian control of regulating it and stopping it. | ||
And, you know, the Rand Corporation, which used to be sort of a techno-optimist thing in the 60s and 70s, has basically been taken over by the EA people. | ||
Jason Matheny, sort of this bureaucratic guy, runs the Rand Corporation, and one of the things they're pushing is something called global compute governance. | ||
It's basically totalitarian, one-world government, control of the whole world to, you know, maybe in the limit case, monitor every keystroke on every computer to make sure nobody can program a dangerous AI. And that seems to me far worse than the alternative. | ||
And this was, you know, on some level, if we can think about the politics, the Biden administration was leaning very hard into this totalitarian EA direction. | ||
We were going to have this Luddite, heavy-handed, you know, slipping-towards-totalitarian control of technology. | ||
And then I think, you know, the Trump administration is this very needed corrective. | ||
It's going to be dangerous, it's going to be risky, but it's far safer to try this than to lock it down. | ||
So what do we title this episode? | ||
Gray-pilled Peter Thiel? | ||
I always think we need to remember, you know, look, it's up to us to do these things. | ||
These things are not inscribed. | ||
We have to make it work. | ||
We have to work. | ||
It's up to humans. | ||
It's up to the choices we make. | ||
I think if you have to have an attitude, you can be a little bit optimistic or a little bit pessimistic. | ||
But extreme optimism and extreme pessimism both tell you either that you don't need to do anything or that you can't do anything, and they both converge to laziness. | ||
And so, gray pill sounds kind of bland, but it's definitely better than, you know, happy-clappy optimism or, you know, hiding-in-a-basement pessimism. | ||
Good to see you, my friend. |