Speaker | Time | Text |
---|---|---|
We've got to figure out a way to make it so we just have a one-button thing where everything syncs up with one button. | ||
Is that possible one day? | ||
Maybe. | ||
We're live. | ||
Right now we're live? | ||
We're live live. | ||
Cool. | ||
How are you, sir? | ||
Yeah, I'm not too bad. | ||
Welcome. | ||
Thanks for coming, man. | ||
Appreciate it. | ||
Yeah, no worries. | ||
Thanks so much for having me. | ||
Sam Harris was going to be with us, but he flaked out last minute. | ||
He's a busy man. | ||
Yeah, he's a busy man. | ||
So, I'm interested to talk to you about a bunch of things, but one of the big ones is this idea of effective altruism. | ||
And this is something that you really promote to the point where, I don't know if this is true, but I read this about you, that everything that you make over $36,000 a year, you donate? | ||
Yeah, that's right. | ||
Wow. | ||
Yeah, so everything... | ||
Technically, it's everything above £20,000 in 2009 Oxford prices. | ||
So it's just adjusted for inflation, cost-of-living changes and stuff. | ||
But that's about $36,000. | ||
So you've just sort of decided that, which is, by the way, the top 1% for the whole world. | ||
Yeah, not quite. | ||
About 2%. | ||
Yeah, I'll be in the top. | ||
Still be in the top 2%, even despite... | ||
I thought it was $34,000. | ||
I think $34,000 puts you in the top 1%. | ||
I think it's $55,000. | ||
Oh, has it changed? | ||
Maybe since Trump's been in office. | ||
Yeah, that's right. | ||
But it's a... | ||
You know, what you're doing is... | ||
If that's really the case, that's a very charitable thing. | ||
Yeah, and it's also... | ||
I mean, it's most of my income over the course of my life. | ||
Like, especially as an academic, you're not going to earn tons. | ||
Though... | ||
Since effective altruism blew up, you end up getting things like speaking fees and, you know, and I give all that away as well. | ||
So it's gonna end up probably being, like, the large majority of income over the course of my life. | ||
Do you ever, like, want to buy something and be like, shit, if I wasn't so goddamn generous, I'd be able to get this? | ||
You know, I never do. | ||
Really? | ||
I, like, basically never think that, yeah. | ||
I think, like... | ||
I feel like in contemporary society, we just get bombarded with marketing stuff all the time, saying like, oh, you really need this thing if you're going to have a good life. | ||
And I think in almost every case, that's just not true. | ||
I think the psychological evidence just shows that once you're above a certain level of income, additional money just has a very small impact on your happiness. | ||
And in my own case, like... | ||
The things that make me happy are being surrounded by friends, that's free. | ||
Gym membership, that's like $40 a month or something. | ||
It's not very much. | ||
I can afford that. | ||
Being able to work on what I really am passionate about, and I already have that. | ||
So my life is just so good in so many ways, and I feel like there's so much of a focus on money and how money is the key to happiness, and I think it's just all bullshit, basically. | ||
Well, there's definitely some bullshit in it. | ||
And I see that a lot in my neighborhood because I live where white people go to breed. | ||
And they go to breed and they sit down and they just talk about things. | ||
They talk about Range Rovers and certain watches and certain purses and shoes. | ||
And it becomes this constant... | ||
The amazing thing is just how you adapt. | ||
It's called the hedonic treadmill. | ||
The richer you are, the richer you need to be. | ||
Oh, yeah. | ||
So I was once part of a conversation. | ||
I was going to give a talk and I was going to a family and I was on a private jet, in fact. | ||
And the conversation was a discussion of different private jets and which private jets are better than others, and so on. | ||
This other person has this really nice private jet. | ||
And it just means that, like, at no stage do you ever lose the, like, oh, I could just have this nicer thing. | ||
No, because you can get to the point where you want a jumbo jet, like one of those Qantas Airbuses and deck that out like a house. | ||
Yeah, I mean, I'm sure that one of those Richard Branson-type characters probably has something like that. | ||
Yeah, that's probably right. | ||
Well, it seems to get to this... | ||
You hit this critical mass stage where you, you know, like these billionaire characters, where they start buying $100 million yachts and $400 million yachts. | ||
And what is the most expensive yacht? | ||
I believe it's a half a billion dollars or more. | ||
That's incredible, yeah. | ||
And you need to have a staff to... | ||
Take care of it the whole time. | ||
And if it ever... | ||
The thing is, I think if I had a yacht, that would make my life worse. | ||
Because now I'd be stressing about this yacht, like, what if it gets damaged, like, I feel bad that I'm not using it. | ||
Mmm, yeah. | ||
Yeah, I would imagine. | ||
Unless... | ||
Well, I guess not, though, because if you kind of... | ||
Look at this. | ||
Oh, Jesus Christ! | ||
It's a billion? | ||
Billion dollars on a yacht. | ||
The Streets of Monaco is what it's called, and it is one billion dollars. | ||
Go to that thing. | ||
That's it? | ||
Thanks. | ||
Oh my god, it's a neighborhood! | ||
It's a floating neighborhood! | ||
I think on all of these things you should replace the cost with how many bed nets you could buy for children in sub-Saharan Africa. | ||
Oh, well, that's just ridiculous. | ||
Hold on, go up. | ||
Did one say $1.2 billion? | ||
Scroll up. | ||
Estimated price. | ||
Oh my god, the Eclipse. | ||
Oh, 450 million to 1.2 billion. | ||
That's like when you go to get it made, and you go like, how much is it gonna cost me? | ||
Like, between 450 million and 1.2 billion, you're like, ah, you know, normal money. | ||
Yeah, yeah. | ||
Normal shit. | ||
Fuck it change. | ||
That is fucking insane. | ||
Look at that goddamn thing. | ||
I mean, oh, it's a replica of the Monaco Grand Prix track. | ||
Oh my god, that's insane. | ||
So you can drive around on your yacht at a ridiculous rate of speed. | ||
So this guy probably has like a Ferrari that goes all over the surface of his crazy yacht. | ||
He's got a fake beach! | ||
But it hasn't been sold yet. | ||
Oh, it hasn't? | ||
So it's not actually owned yet, I don't think. | ||
Mmm. | ||
Oh, okay. | ||
It's gonna be interesting who buys that. | ||
They're gonna get a lot of attention. | ||
Well, there are enough people. | ||
There's a bunch of those people. | ||
Yeah. | ||
I mean, I don't know how many... | ||
If it's 1.2 billion, that's probably, there's only a couple of thousand people in the world who are worth that much. | ||
Really? | ||
Yeah. | ||
Even if they're willing to sink their whole fortune. | ||
How many billionaires do you think there are worldwide? | ||
Let's guess. | ||
Three and a half thousand billionaires. | ||
3,500? | ||
You sound very confident. | ||
I think it's about that, yeah. | ||
Oh, that's a large number. | ||
That is kind of crazy. | ||
3,500 people that have more than $1,000 million. | ||
Yeah. | ||
And there's old Will MacAskill. | ||
I know. | ||
35,000? | ||
Cuts it off. | ||
Half that. | ||
1,800. | ||
1,800 people that are billionaires? | ||
Oh, you're happy. | ||
Well, no, I'm just happy we've got a fact checker on here. | ||
Oh. | ||
Correct all my false statistics. | ||
Well, that's a lot of money, man. | ||
But it is one of those weird things where I do not think that money equates to happiness. | ||
One of the things that money does do is it alleviates the stress of bills. | ||
But a lot of that stress from bills can be alleviated by not buying as many things, right? | ||
It's like a lot of the stress of bills that people have is sort of self-imposed stress. | ||
Like you get a mortgage for a very large house, you have car payments, you have all these different things that you're paying for. | ||
So that kind of money stress that some people put themselves under is actually not really necessary, right? | ||
Yeah, absolutely. | ||
So if you broke it down to what do you actually need? | ||
Just need a nice place to live where it's not crime-ridden and it's safe. | ||
You need a bed. | ||
What else do you need? | ||
Food? | ||
Yeah, you need food, exercise, obviously. | ||
Are you one of those no TV dudes? | ||
Do you have a TV? | ||
No. | ||
Well, I watch, you know, Netflix, HBO. | ||
Oh, okay. | ||
All right. | ||
Just finished Veep, which I love. | ||
Is it good? | ||
Yeah, it gets better. | ||
The first seasons aren't so good, but then it gets really good. | ||
Really? | ||
I don't have that kind of patience for not-so-good seasons. | ||
Oh, yeah, I just get addicted. | ||
Even if I watch something and I think it's awful, I still just, I will get addicted. | ||
Right away? | ||
I have to watch all of it. | ||
Yeah, I have, like, the most compulsive personality. | ||
Have you seen House of Cards? | ||
I've not seen, deliberately not started House of Cards. | ||
Oh, that's a good show. | ||
That's a good show. | ||
I'm deep into that. | ||
Yeah, like, Game of Thrones makes my life worse. | ||
I, like, hate it. | ||
Really? | ||
I think it's amazing television, but I find it just so distressing. | ||
Because it's so good? | ||
I still have to watch it all the time. | ||
Why do you find it distressing? | ||
The violence? | ||
Yeah, the violence, people getting their heads popped and stuff. | ||
Oh, that one with the mountain? | ||
Yeah, that's the one that really stays with me. | ||
Woo! | ||
That's rough, yeah. | ||
It gives me a lot of anxiety because I know there's only two seasons left. | ||
And the next season, this one coming up, is only seven episodes, and the final season is only six. | ||
I'm so happy about that. | ||
It's like... | ||
It's not making me happy, Will. | ||
I'm not very happy about that at all. | ||
It's like someone saying they're going to stop selling heroin or something. | ||
And then you're like, well, I'm going to have to get hooked on OxyContin then. | ||
That's what I feel. | ||
I'm going to have to watch the whole season all over again, or the whole series. | ||
So you have a television. | ||
You have a computer, I'm sure. | ||
Yeah, of course. | ||
I have a computer. | ||
Yeah, I like move around a ton, so I normally, like I don't have a house, but it wouldn't be convenient to have a house because I'm traveling so much. | ||
So you rent an apartment or something? | ||
Yeah, I rent an apartment. | ||
You live in England? | ||
I live in Oxford most of the time. | ||
I spend quite a chunk of my time out in the Bay Area. | ||
Like, a significant part of our staff and the non-profit is out there. | ||
I've got lots of contacts, sister organizations out there. | ||
So most of your time, it seems like you're spending working for charitable organizations or... | ||
Yeah, so I have kind of three hats. | ||
So one is an academic, so I'm a professor at Oxford. | ||
Second is this kind of more public figure where I'm talking about these ideas through books or on this podcast and so on. | ||
And then third is I run a nonprofit called the Center for Effective Altruism, which is more about like finding the best charities, the ones that are doing the most good, going to help other people the most, and trying to promote them and try and get people to give more and to give more effectively. | ||
Yeah, we've gone over ineffective charities, or I shouldn't say ineffective, but charities that are, the way they're structured, when you look at how much money is actually going towards the charity itself, and how much is going towards the structure of the organization, it's kind of crazy. | ||
Yeah. | ||
Yeah, I mean, I think, so the focus with ineffective charities is normally on, like, yeah, how much is spent on overheads. | ||
Right. | ||
But I actually think that's not the most important thing. | ||
The most important thing is, what's the charity actually doing? | ||
Like, what's the actual program? | ||
So, one charity, for example, that I'm sure, like, you'll find funny is a charity called Homeopaths Without Borders. | ||
And it goes to Haiti, in particular, and distributes homeopathic remedies, which don't work. | ||
They don't provide any health benefit. | ||
And even if it had a 0% overhead cost, so it spent nothing on admin and everyone was a volunteer, it would still be a bad charity. | ||
You still shouldn't be giving to that charity. | ||
Right. | ||
That's a hilarious one. | ||
I didn't know that that one existed. | ||
Yeah, yeah. | ||
It's kind of small. | ||
I would imagine. | ||
Thankfully. | ||
Homeopaths without borders. | ||
Jeez. | ||
God. | ||
But then there's some super effective charities, like, you know, a program that's saving a life with every three and a half thousand dollars, like the Against Malaria Foundation. | ||
Even if they were spending a bunch on, you know, investigating what the best areas to focus on are, or, like, paying their staff more. | ||
What you should just care about is how much money you're putting in and what you're getting as an outcome. | ||
Right. | ||
Well, I think it's impossible for you to give $10 and all $10 is going to go directly to the charity because there's got to be overhead. | ||
There's got to be infrastructure. | ||
There's got to be a bunch of people working there, rent. | ||
There's costs. | ||
But the question is, at what point does it become kind of a scam? | ||
Because there are most certainly some organizations that appear to be charitable organizations but are really kind of a scam. | ||
Yeah, there's definitely some. | ||
So like the Kids Wish Network, for example, kind of like the Make-A-Wish Foundation, similar idea. | ||
And they spent 99% of their budget on fundraising. | ||
So they were just like this kind of charitable Ponzi scheme, basically. | ||
So they spent all their money on fundraising itself. | ||
Yeah, to then invest in more fundraising. | ||
And 1% somehow or another gets out there. | ||
Maybe it's not as high as 99%, but it was about 90%. | ||
Something crazy. | ||
So what does that money get to? | ||
What do they do with the actual money itself? | ||
And then the idea behind that was granting wishes for sick children. | ||
Do you remember the San Francisco thing with Bat Kid? | ||
There was a big event, lots of publicity around it. | ||
Was Bat Kid a child that had some strange disorder? | ||
Yeah, so the child... | ||
I don't know the details. | ||
I think the child had leukemia. | ||
Their wish was that they wanted to be Batman for the day. | ||
Oh, okay. | ||
This is a different thing. | ||
Yeah, okay, cool. | ||
So the Make-A-Wish Foundation set up this amazing story where they've got to drive in a Batmobile and have this fantastic day where they're basically Batman for the day. | ||
Kids' wish network is doing basically the same thing. | ||
They find seriously sick kids, often terminally ill kids, and say, what one thing would you want? | ||
And we'll make it happen. | ||
But there is a lot of focus on particularly bad charities. | ||
You know, the ones that are just really corrupt or completely dysfunctional. | ||
I think that's not actually the most important message. | ||
What's most important is just even among the charities that are kind of good, even the ones that are making a difference, there's still a vast difference in the impact that you have. | ||
Difference of hundreds or thousands of times between the charities that are merely good and the ones that are really the very best. | ||
And that's primarily dependent on what program are they focusing on. | ||
Hmm. | ||
So, is there any charity that people should avoid spending their money on? | ||
Like, are there charities that you feel like are just so ridiculously ineffective? | ||
Yeah, I mean, like, the ones we mentioned of Kids Wish Network or Homeopaths Without Borders. | ||
The Homeopaths Without Borders is just ridiculous. | ||
It's like voodoo on parade. | ||
Just stop. | ||
Yeah, I mean, there's another one, I can't remember it, but it does... | ||
Astrology Without Limits? | ||
Astrology Without Limits. | ||
No, it does dolphin therapy for autistic children, which has no evidence of working, but does actually just have some, like, risk of the children drowning. | ||
Oh, Jesus Christ. | ||
Yeah, so you can, like, cherry-pick these examples, but the thing is that these are just, like, not really representative. | ||
In general, I think charity's doing good, but the question is just, like, in the same way as if you're buying a product for yourself, you don't just want to get, like... | ||
You know, a laptop, as long as it works. | ||
You want to find, like, what's the best laptop I can get with my money? | ||
Right. | ||
Or if you're investing, you want to not just get, like, an okay return. | ||
You want to see, well, what's the best return I can get? | ||
Right. | ||
So in that sense, I think, like, the number of charities that you think are just, yeah, this is really competing for being the most effective charity in the world, that's actually very small. | ||
So GiveWell, for example, is an evaluator. | ||
It looks at all sorts of different global health and global development charities. | ||
And its list of charities that's like, yeah, this is just super good. | ||
You should really be donating to them. | ||
It's only seven charities long at the moment. | ||
Wow. | ||
And that's up from last year when it was only four charities long. | ||
Wow. | ||
Seven charities out of how many? | ||
I mean, what is the overall total of active charities? | ||
It's got to be in the thousands. | ||
Hundreds of thousands. | ||
Yeah, I'm sure. | ||
What got you involved in this? | ||
You're a young guy. | ||
You seem like you should be playing video games and skateboarding or something. | ||
I spent a lot of my teenage years playing video games. | ||
Yeah? | ||
Yeah. | ||
It was, again, compulsive personality. | ||
Yeah. | ||
I need to ban myself from doing it. | ||
So your compulsive personality is now going towards good things. | ||
Yeah, the key was managing my life so that the things I get really focused on and addicted to were good things rather than bad. | ||
So yeah, it all started back in... | ||
So it was back in high school, kind of undergraduate, that I became very convinced by the arguments of this philosopher, Peter Singer. | ||
Oh, I know Peter Singer. | ||
He's like a radical animal rights activist as well, right? | ||
Yeah, he has a few things. | ||
And he had this argument, which is that, you know, the way I tell the story is, imagine someone is walking past a shallow pond, and they see a child drowning in that shallow pond. | ||
And they could run in, and they could save the child. | ||
But they're wearing a really nice suit, a suit that costs like $3,000. | ||
And so they say, no, I'm not going to save that child. | ||
I'm just going to walk by and let it drown, because I don't want to lose the cost of this suit. | ||
I normally say, look, in moral philosophy, we have a technical term for people like that. | ||
They're called assholes. | ||
And this is how I convey it in my seminars. | ||
And obviously we all agree, like, yeah, come on, if it's just you could clearly save this child that's right in front of you, you ought to do that. | ||
The cost of $3,000 does not count. | ||
But then what Peter Singer's insight is, he says, well, what's the difference between that child that's right there in front of you and that child that's in sub-Saharan Africa who you could save? | ||
You'll never meet them, for sure. | ||
But you could still save their life with just a few thousand dollars if you donate it to a really effective non-profit. | ||
And he considers all the different ways in which these cases might be disanalogous, but decides ultimately, like, no, there's actually just no morally relevant difference. | ||
And so, yeah, we do just have an obligation to give away at least a very significant proportion of our income. | ||
And I was really convinced by this kind of on an intellectual level for many years, but I never really did anything about it. | ||
And not until I went to Oxford to do a postgraduate degree in philosophy. | ||
And in the summer in between, I needed some money, so I worked as a fundraiser for Care International, a global development charity. | ||
So I was one of those annoying people in the street who would kind of get in your way and then ask you to donate $10 a month. | ||
And it meant that all day, every day, I was talking about, like, look, this is the conditions of people in extreme poverty. | ||
We can do so much to help people at such little cost to ourselves. | ||
You know, why are we not doing this? | ||
And I was just over and over again kind of getting these apathetic responses. | ||
And I was just getting so frustrated because I just thought, look, these people are just not living up to their own values. | ||
People clearly do care, but there's some sort of block going on. | ||
And then I thought, well, I'm going to do philosophy. | ||
And at the time, I was planning to do philosophy of language, logic, very esoteric stuff. | ||
And so I thought, well, I'm not living up to my own values. | ||
I should really try and make a change. | ||
And so I went to Oxford, and I started asking a whole bunch of different academics, well, what's the impact of your work? | ||
What kind of a difference have you made? | ||
And normally they were like, I'm not really in it to make an impact. | ||
I'm just kind of interested in these ideas. | ||
And that was pretty disheartening. | ||
But I kept persisting until I met another postgraduate student called Toby Ord. | ||
And he just blew me away. | ||
Because he had also been convinced by these ideas, but he'd gone one step further. | ||
And he'd said, yep, I've made a commitment to give away almost all of my income over the course of my life, about a million pounds. | ||
At the time, he was living on 9,000 pounds, saving 2,000 pounds, and donating 2,000 pounds. | ||
So he was like really hardcore. | ||
But the thing, as well as actually taking these ideas and putting them into practice, what really blew me away was just how positive he was. | ||
And it was not that he was kind of wearing this hair shirt, flagellating. | ||
Instead, he was saying, look, this is an amazing way to live. | ||
We have this amazing opportunity to do a huge amount of good, to help so many other people, thousands of people, at what's actually a very low cost to ourselves. | ||
And having that one person who also kind of shared my worldview, shared my ambitions, just meant that little psychological block was lifted. | ||
And it meant that I was like, okay, cool, I'm on board. | ||
First, I kind of committed 10%. | ||
Then I was like, no, actually, I think I can do this further pledge. | ||
And then that meant I had this question of, well, I'm planning to give away like a million pounds over the course of my life. | ||
Where should that money go? | ||
You know, I want to make sure it has as big an impact as possible. | ||
And that meant I started digging into, well, how can we compare between different charities? | ||
I found there was a ton of work from health and development economics that could help us to answer this. | ||
And what began as this kind of side project between these two, you know, ivory tower academics, me and Toby, took off. We found that loads of people just were really taken by this idea, both of giving more, but in particular of giving more effectively. | ||
And over time, this kind of global movement called effective altruism started to form around these ideas and started to broaden in a couple of ways. | ||
So, one is that I broadened away from just charitable donations to also thinking about, well, what should I think about with respect to my personal consumption? | ||
What should I think about with respect to my career? | ||
If I'm really aiming to do as much good as possible, what should I do? | ||
And then secondly, also starting to think about cause areas other than just global poverty as well. | ||
And it tends to be the case that within the community at the moment, the cause areas that people think are the very most pressing are global health and development still for sure, but then also factory farming, where, again, there's just such a vast amount of suffering, which is completely unnecessary. | ||
And then also preservation of the long-run future of humanity and worrying about risks of global catastrophe, things that may be fairly unlikely, but would be very, very bad if they did happen, especially relating to new technology like novel pathogens, viruses you could design in a lab, and so on. | ||
Well, you're also very concerned with AI as well, right? | ||
Artificial intelligence? | ||
Yeah, that's exactly right. | ||
And that's, I think, in this category. If you look at the history of human progress, technological change just creates these huge step changes in how humanity progresses. | ||
So it was only 12 years, from 1933 to 1945, between Leo Szilard first coming up with the idea of the nuclear chain reaction. | ||
And that was just a purely conceptual idea on a bit of paper. | ||
12 years from that to then the deployment of the first nuclear bomb. | ||
And think how radical a change that is, suddenly being in the nuclear age. | ||
That was only 12 years. | ||
We went from the invention of the airplane to dropping an atomic bomb out of the airplane. | ||
I believe it was 50 years, right? | ||
Somewhere in the neighborhood of 50 years? | ||
Give or take a few? | ||
Yeah. | ||
So technological progress can suddenly go in these huge leaps. | ||
That we're not prepared for. | ||
That we're often very not prepared for. | ||
And I think artificial intelligence is in this category where we're really making radical progress in AI, especially over the last five years. | ||
It's really one of the fastest developing technologies, I think. | ||
And yet has huge potential in so many different ways. | ||
And as with any new technology, huge positive potential. | ||
Really, if you get AI right, you can solve almost any other problem. | ||
But also potential risks as well. | ||
Where there's risks that might be more familiar, you know, worries about automation, unemployment. | ||
Worries about autonomous weapons, which I think should be taken seriously. | ||
And then also just worries about, well, what if we really do manage to make human-level artificial intelligence? | ||
There are very good arguments that it would then quickly move to superhuman-level artificial intelligence. | ||
And what then? | ||
Are we now in a situation like the Neanderthals versus Homo sapiens where we've suddenly created this intelligence that is greater than our own? | ||
Are we able to control that? | ||
Are we able to ensure that transition is positive rather than negative? | ||
Have you ever considered the possibility when you look at all the impoverished people in the world, all the cruelty, all the people that are so just concerned with material possessions and shallow thinking and war and just the evil that men do? | ||
Is it possible that we're sort of an outdated concept, that what we are as these biological organisms that are still slaves to the whole Darwinian evolutionary, survival-of-the-fittest, natural-selection sort of paradigm that we've operated under for all these many thousands and hundreds of thousands of years as humans... Is it possible that we're giving birth to the next thing? | ||
That just like we don't long for the days when we used to be monkeys throwing shit at each other from the trees, one day we will be something different, whether it will be a combination of us and these machines, or whether we're going to augment our own intelligence with some sort of artificial, whether it's some sort of an exo-brain or something that's going to take us to that. | ||
Or it's going to be simply that we create artificial intelligence. | ||
Artificial intelligence no longer has use for us because we're illogical. | ||
And then that becomes the new life form. | ||
And then we're hiding in a cave somewhere, hoping the Terminators don't get us. | ||
Yeah, I mean, I think, like, over the long term, I mean, with all of these things, the question of kind of timelines is very hard. | ||
And sometimes people want to reject this sort of discussion because, oh, this is so far in the future. | ||
Whereas I think, like, if something's sufficiently important, we should be talking about it even if maybe it's, you know, decades or generations hence. | ||
It might not be, right? | ||
I mean, it might not be that far away. | ||
But who knows? | ||
Like with the atomic bomb, that was hugely fast progress. | ||
Just, you know, 12 years. | ||
So we want to be prepared. | ||
But then as for, like, yeah, is it going to be Homo sapiens around for the next, you know, in a thousand years' time? | ||
I think that would just be extremely unlikely. | ||
That will be around? | ||
You think we're not going to be around anymore? | ||
Yeah, I mean, I think if intelligent creatures are still around in a thousand years' time, it's going to be something that's not... | ||
Homo sapiens, like you said, there's kind of three... | ||
Or it's like not what we would consider kind of typical humans now. | ||
Well, we're obviously severely flawed, right? | ||
I mean, if you ask people, if you ask the average person, do you think that in your lifetime you can imagine a world without war? | ||
Most people say no. | ||
Like the vast majority of people say no. | ||
A world without crime, a world without violence, a world without theft. | ||
Most people say no. | ||
That just shows you how inherently flawed most people think the human species is. | ||
We know that we can do it in small groups. | ||
Like if the three of us were on an island, I'm pretty sure we wouldn't be stealing from each other and murdering each other, right? | ||
Just a few of us. | ||
But when you get to large-scale humanity, it becomes very easy to... | ||
Disassociate, or create this diffusion of responsibility where there's, you know, enough people so you don't really value them as much, and you're allowed to get away with some pretty heinous stuff. Especially when you consider drone warfare, things that we're able to do with long distance where we're not seeing the person that we're having the effect on. It's a very flawed thing, the human species. Wouldn't it be better if something better came along? | ||
I mean, I think there's, yeah. | ||
Sorta. | ||
Not good for you and I, though. | ||
We'd be obsolete. | ||
Yeah, I mean, well, we're going to be obsolete in a hundred years anyway. | ||
I mean, as in, we'll be dead. | ||
Right. | ||
So the question is just, will our kind of, you know, generations hence, will, you know, the question's not really about us, it's about our grandchildren. | ||
What really forces the idea... | ||
To be considered, what is valuable about life? | ||
Is it the experience? | ||
Is it happiness? | ||
Is it shared fun? | ||
Is it love? | ||
What's valuable about being a person? | ||
And how much of that is going to change if we're made out of something that people have created or maybe we're made out of something artificial intelligence has created because we've created something that's far superior to us. | ||
So yeah, I mean, I have a view on this, as you might expect. | ||
I mean, in my view, the thing that's valuable and the only thing that's valuable ultimately is conscious experience. | ||
So that's good conscious experiences, happiness, joy, and so on. | ||
That's positive. | ||
That's good for the world. | ||
Negative conscious experiences, suffering, pain, distress, those are bad for the world. | ||
And so that's why it's a good thing for me to do some service to you to benefit you, but I can't do anything good to benefit this bottle of water. | ||
Right. | ||
And so then the key question in terms of what we should think about, supposing it is the case that, you know, in a thousand years' time, it's now synthetic life, it's artificial intelligence or something like that that's in charge and there are no longer any humans, would this be good or bad? | ||
The question for me is, you know, are they having conscious experiences and are those conscious experiences good or bad? | ||
So that's it. | ||
Just conscious experience. | ||
That seems so selfish. | ||
It's a controversial view. | ||
There's a thought experiment which is often used to challenge this view. | ||
Do you want to hear it? | ||
Yes. | ||
So it's called the experience machine. | ||
And the idea is, supposing that tomorrow you could plug into this machine. | ||
It's like the most amazing VR you could ever have. | ||
And in this machine, you will live, let's say you'll live 200 years, and you'll be in the most amazing bliss. | ||
You'll have the most amazing experiences, you know, and your experiences will involve incredible relationships, incredible creative achievement, and so on. | ||
And it'll just be like the perfect life that you could live experientially for the next 200 years. | ||
And the question is... | ||
Insofar as you are self-interested, so put aside considerations you might have about wanting to make the world a better place, but just insofar as you care about yourself, would you plug into this thing? | ||
Bearing in mind that in a certain sense, all of these experiences are going to be fake. | ||
You're going to have experiences of having amazing friendships, writing great works of art and so on. | ||
But they're not going to be real. | ||
It's just sensory inputs provided by a computer. | ||
So the question is, would you, or ought you, insofar as you're self-interested, plug into this machine? | ||
What would you answer? | ||
That's a very good question. | ||
I might already be plugged into it, right? | ||
Oh, so this is a great question. | ||
And I think a good argument against it is the question: well, supposing you were already plugged in. | ||
Would you unplug? | ||
Supposing I told you that actually you're a banker in Monaco and... | ||
Fuck Monaco. | ||
I'm not interested in that. | ||
No. | ||
I want to stay right here. | ||
Yeah. | ||
Can I stay plugged in, please? | ||
Do I have to pay more? | ||
What do I have to do? | ||
You would have to do nothing, but... | ||
It's interesting, then, if people think... | ||
So most people... | ||
And it seemed like maybe you yourself would, intuitively, say, no, I wouldn't plug into this machine. | ||
I don't know if I would say that. | ||
I would have to really deeply consider it, because right now, it's just so abstract, this idea that that could be possible. | ||
It's fantasy. | ||
We're having fun. | ||
But... | ||
When you talk to the leading minds when it comes to virtual reality or artificial reality or simulation theory, when they start talking about what will be possible one day, without a doubt, within 100 years or 500 years or whatever the number is, they're going to be able to create an artificial reality that's indiscernible from this reality. | ||
You're going to be able to feel things. | ||
There's going to be emotions that come to you. | ||
They're going to be able to recreate every single aspect of an everyday life. | ||
It's just a matter of time. | ||
I mean, they're really close now. | ||
And not really close in terms of, like, they don't give you emotions and they don't give you feeling. | ||
But if you put on an HTC Vive and go through some of those virtual reality games, I mean, it's bizarre how real it feels. | ||
Yeah, yeah. | ||
And when you go back to like playing Pong, did you ever play Pong? | ||
You know, it's such a weird thing that that happened inside of our... | ||
When I was a kid, Pong came along and we were blown away. | ||
We couldn't believe that we could actually do something on the television. | ||
You could see it move. | ||
It was so fantastic. | ||
And if you gave that to one of my kids, they'd spit on it. | ||
They'd be like, what kind of piece of shit video game is this? | ||
They would think it's just so ridiculous. | ||
But to me, at the time, it was amazing. | ||
You go from that to one of these HTC Vive games, which has all taken place within my lifetime, and you go, well, a lifetime from now, if you follow the exponential increase in the ability, the technological innovation, it's going to be spectacular. | ||
It's going to be... | ||
So when that does happen, how will you be able to know... | ||
If it's indiscernible, how will you know if you're in it? | ||
And how do you know if you're not in it right now? | ||
That's the real question, right? | ||
Yeah, I mean, there are actually some arguments for thinking, you know, this is Nick Bostrom, a colleague of mine, his simulation argument, for thinking we are in a simulation right now. | ||
In fact, that it's very likely we are. | ||
Yeah. | ||
Do you buy that? | ||
I actually, I'm kind of agnostic. | ||
I think you should take the hypothesis seriously. | ||
But I think the... | ||
The argument doesn't quite go through for... | ||
What's attractive and what's not attractive about that theory to you? | ||
His version of it. | ||
Yeah, so the argument is that... | ||
Frame it, if you could, like his version of it. | ||
Yeah, so his argument is that in the future, supposing we believe that the human race doesn't go extinct, or post-humans don't go extinct over the next few thousand years... | ||
And secondly, that the people in the future have an interest in recreating their past, just for kind of historical interest or for learning, and that they're going to be interested in running simulations, because they're now going to have huge, amazing computing power. | ||
They're going to be able to create simulations of the past. | ||
That they're going to have some interest in running simulations of the past. | ||
Well, if that is true, then the number of simulations that these future people are going to be running will vastly outnumber the number of actual timelines, the kind of base universe, as it were. | ||
So for the one real universe where history kind of unfolds, there's also, let's call it, 10,000 simulations of that universe. | ||
And if that's true, then... | ||
It's the case that, well, given that these things really are indiscernible for the people who are inside them, it's overwhelmingly likely, just on the base rates, that I'm going to be in a simulation rather than in the real world. | ||
And what Nick Bostrom says actually is not that we definitely are in a simulation, but he just points out the conflict between these three kind of beliefs that we would seem to hold. | ||
One is that we're not going to go extinct in the near future. | ||
Two is that, you know, people in the future will have some interest in simulating the past. | ||
And thirdly, that we're not living in a simulation. | ||
And he himself gives, you know, a reasonable degree of belief. | ||
Maybe he thinks it's like 10% likely, 15% likely that we're in a simulation. | ||
Other people who understand the argument vary a bit more, but I think it's something you should at least be taking seriously. | ||
The reason I reject it is kind of even weirder, I think, or it's somewhat technical. | ||
But the basic thought is just that, according to the best guesses from cosmologists, we're actually in an infinite universe. | ||
The universe is infinitely big. | ||
Now, we can't affect an infinitely big universe. | ||
We're restricted by the speed of light to what we can affect and to what we can see. | ||
But the best idea, according to the best theory we have, the universe just kind of keeps on going. | ||
But if so, then there's already, like, an infinite number of observers, of people, kind of in that base universe. | ||
And that means that you've now got kind of an infinite number of people kind of experiencing things, and then you've got the simulations, and you've got like 10,000 simulations. | ||
But you can't say there's 10,000 times as many simulated beings as there are real beings, because there's already an infinite number of real beings. | ||
You're looking so consternated. | ||
No, no, no, go ahead, keep going. | ||
But that means if you've got... | ||
So the key of Bostrom's argument was that... | ||
You've got 10,000 times as many simulated beings as you have real, like, non-simulated beings. | ||
But the problem is there's an infinite number of real beings because the universe is infinite. | ||
Yeah, that's right. | ||
And so if you've already got an infinite number of real beings, the fact that you've got 10,000 times infinite, that's still infinite. | ||
Right. | ||
And you can't... | ||
It's kind of a case where, like, our best methods of assigning degrees of belief to things kind of run out. | ||
If you think it's, you know, there's an infinite number of... | ||
Simulated beings, an infinite number of real beings, then what's the chance of you being one or the other? | ||
I mean, like, we don't actually have the, like, tools to be able to answer that. | ||
Neil deGrasse Tyson was trying to explain this to me a couple of weeks ago, that there are infinities that are bigger than other infinities. | ||
Yeah, so that's also the case, but... | ||
Yeah, that was right. | ||
Broke my brain again. | ||
But the key is, we're all talking about the lowest, what's called cardinality, the smallest infinity, which is the size of the infinity of all the integers: one, two, three, four, the counting numbers. | ||
unidentified
|
Right. | |
And if you take that size of infinity and multiply it by 10,000, let's say, you just get the same number, which is infinity. | ||
Right. | ||
And then what Neil was saying was, yeah, there are these even bigger levels of infinity. | ||
So if you look at not just all the counting numbers, but all of the real numbers, including all the in-between values that you can't even write as fractions, that's just more numbers than the infinity of the counting numbers. | ||
I've spent a lot of time trying to understand why human beings are so obsessed with innovation, why human beings are so obsessed with technological progress. | ||
And one of the things that I continue to come to is that we think of everything in this world as being natural, like the behavior of butterflies and wolves and the way rivers run down from the mountain. | ||
But we don't think of ourselves and our own behavior as natural. | ||
We don't think of our own thirst for conquest and innovation and even materialism. | ||
I think materialism is probably a very natural reaction to our need to somehow or another fuel innovation. | ||
And that one of the ways to ensure that innovation is constantly fueled is that people are constantly obsessed with buying new things, constantly obsessed with the latest and greatest, which fuels innovation. | ||
And when you look at the universe itself, and you look at all the various things that we know to be natural processes in the universe, like in order to make a human being, a star has to explode. | ||
When you literally are made out of stardust, which is... | ||
When you run that by people for the first time, they go, wait, what? | ||
In order for you to have a carbon-based life form, it has to be created inside a burning, dying star, and that's the only way you make this thing, what you are right now. | ||
And then that thing makes artificial reality, and then that thing makes... | ||
Perhaps even crazier. | ||
I mean, if you follow the ideas of technological progress, if something gets to a point where it's indiscernible from reality, how do you know it's not a new reality? | ||
How do you know it's not a new kind of reality? Like, Jamie hipped me to these artificial worlds that people have created online, where they're essentially infinite and they're constantly changing and morphing and growing, and the games are terrible, people don't like them, because you go to places and there's fucking nothing there. | ||
Yeah, and you can go to an infinite number of these places and there's nothing there. These adventures are non-existent. | ||
So you're in these gigantic fake worlds where you're traveling from place to place, but right now we're looking at it in a very two-dimensional way. | ||
You're looking at it on a flat screen. | ||
One day it's not going to be two-dimensional. | ||
One day it's going to be something that you're interfacing with. | ||
Your consciousness is interfacing with it. | ||
Is it only real if we can take it and drop it on something? | ||
If we can hit it with a hammer? | ||
If we could put it on a scale? | ||
If we can use a measuring stick and measure it? | ||
Is it only real there? | ||
Or is it real if it follows every single check? | ||
Like if you check off every single item on the list of conscious reality and conscious experience? | ||
Yeah, I think that's a great question, because I think the dichotomy that a lot of people think in terms of natural, non-natural, I think it's just meaningless. | ||
I mean, people firstly think this is natural and this is not. | ||
I mean, in a sense, everything we're doing is natural because homo sapiens are part of a natural process. | ||
And maybe in another sense, everything we're doing is not natural. | ||
But then why does that matter? | ||
What's the moral relevance of something being natural versus not natural? | ||
Lots of stuff that happens in the natural world is just really awful. | ||
Huge amounts of cannibalism, murder, suffering. | ||
So it's not clear why we would care about something being natural rather than non-natural. | ||
But then the second question is, yeah, let's consider this virtual reality again, this experience machine that you could plug yourself into. | ||
And as part of the description, I said, oh, none of this would be real. | ||
You'd have all of these interactions with people that you think are friends and so on, but that wouldn't be real. | ||
And I think you could very well push back on that and say, why should something be physically instantiated? | ||
Like... | ||
In order for it to count as a real experience. | ||
Why is it not the case that in this virtual reality you're interacting with algorithms, but that's just as much... | ||
At least it's possible for that to be just as much friendship as if you're interacting with people who are, you know, flesh and blood. | ||
And I think it's hard to explain kind of what the difference would be. | ||
Because, you know, if you think about Star Trek... | ||
Jean-Luc Picard can be friends with Data, an android. | ||
He's not biological, but we think that you can still have moral worth and friendships and so on with creatures that are not made of human biology. | ||
In which case, why does the fact that something merely lives on silicon mean it couldn't count? | ||
Or if it's seemingly merely software, why does that mean you couldn't have a genuine friendship with that thing, if it acts in a sufficiently sophisticated way, perhaps? | ||
Isn't there also an issue with our incredibly limited ability to view reality itself? | ||
Because we're only viewing the dimensions that are relevant to us in this current state of carbon-based life form, this talking monkey clinging to the spaceship flying through the universe, right? | ||
This is what's important to us. | ||
But when you pay attention to those, the dudes who write on yellow legal pads and they get into quantum physics and they have all those crazy equations that nobody but them understands. Maybe you do. | ||
I look at that shit and I go, what the fuck are they writing? | ||
But they believe, I mean, what is the current model? | ||
They believe there's at least 11 dimensions. | ||
There perhaps could be more. | ||
What if there is a dimension that you can plug into that it's purely consciousness-driven, meaning there's no physical experience, there's no touching the ground, there's no gravity, but you exist in a conscious state and it's perpetual. | ||
Like, if you take a rocket ship, and it gets past our gravity and shoots off into distant space, and you have a clear shot of, you know, 14 billion years back to the beginning of the universe itself with nothing in the way, you're just gonna keep going for 14 billion light years. | ||
You're just gonna keep going. | ||
Like, what if there is a place that your consciousness can go to like that, where it's... | ||
It's no longer burdened by biology, by the timeline of birth to death. | ||
By the limitations of the flesh, but consciousness itself can exist in some bizarre dimension that we just don't have access to. | ||
So yeah, I mean, I think consciousness is probably just ultimately a physical process. | ||
Why do you think that? | ||
Ultimately, because of conservation of energy. | ||
The reason being, so, you know, there's this age-old philosophical debate between the monists and dualists. | ||
People who think, is consciousness just ultimately some sort of physical process? | ||
Or is it something special? | ||
So Descartes thought there was this... | ||
Pineal gland, this little bit of your brain, and your conscious kind of soul was just kind of steering your monkey body through this pineal gland. | ||
But the question is just for why... | ||
I think the strongest argument about why that couldn't be right is it seems to be... | ||
It would have to be creating energy out of nowhere. | ||
And we've never... | ||
It seems to be just a fixed law of the universe that that just can't happen. | ||
Because in order for, you know, this conscious mind to, if it's not merely a physical process, if it's not just the brain, in order for it to be able to affect what this physical entity is doing, it would have to use energy to be able to do that. | ||
So the energy would have to be coming from somewhere, and if it's not coming from just the physical realm, then suddenly we've got this counter-example to all the rest of science. | ||
Sort of, but are you aware of the theories of human neurotransmitters being pathways to other dimensions like dimethyltryptamine? | ||
Do you know about all that? | ||
I mean, I know about DMT. | ||
Do you know it's produced in the pineal gland? | ||
Where Descartes thought that all that stuff was going on, the seed of the soul, what the Egyptians called the Eye of Horus, and the reason why the Catholics and so many ancient religions were so focused on pine cones and their... | ||
Their art and their imagery, that's the pineal gland. | ||
That's the image of it. | ||
That's what it's supposed to represent. | ||
And for people who've had these intense transformative psychedelic experiences by consuming exogenous dimethyltryptamine, which is also produced by the brain, you have these insane transformative experiences where you feel like you are traveling to other dimensions. | ||
Yeah, so I think... | ||
I mean, I do want to say, like... | ||
Have you done any of that? | ||
I've never done DMT, no. | ||
Oh, you son of a bitch. | ||
Why not? | ||
What are you doing? | ||
You're wasting your time. | ||
I know. | ||
I'm such a good boy. | ||
But it's something that's in the brain. | ||
I mean, it's a natural product of human biology. | ||
I mean, whether it's natural or not isn't the question. | ||
Just, you know, if I'm going to have a career based on my brain, I want to be very careful to... | ||
To not break it? | ||
To not break it, yeah. | ||
Yeah, but it's one of the most transient drugs ever observed in the body. | ||
Your body brings it back to baseline in like 15 minutes. | ||
Okay, because, I mean, I do think people very often greatly overestimate the risks of non-legal drugs, like MDMA is, like, super safe and so on. | ||
Overestimate the risk, is that what you're saying? | ||
Of MDMA? Yeah. | ||
MDMA is weird, right? | ||
That's a weird one. | ||
It's not a natural drug. | ||
Dimethyltryptamine, I think the real concern would be psychological, because what you face is so bizarre. | ||
Terence McKenna had the best quote about it, I think he said, that you would risk death by astonishment. | ||
Yeah. | ||
It's so bizarre that it's almost a sin for a guy as smart as you to not experience it. | ||
But you just come right back and even when you're there, you're there. | ||
It's you. | ||
It's not like your consciousness dissolves into some bizarre quasi-living state and then you have to work your way back to being you again. | ||
No, you're you. | ||
You're Will MacAskill in the dimension, whatever the fuck it is. | ||
But what's crazy about it is that this is produced in the very area where Descartes believed the seat of the soul is, and so many different Eastern religions, all these different... | ||
Religions and all these different cultures, they were all convinced that that one gland had some massive significance in terms of the spirit and the soul, whatever that means, whatever the spirit means. | ||
So yeah, so then the question is just in these experiences, is it the case that you're like genuinely seeing into another dimension? | ||
Right. | ||
Or is it the case that you just have a new kind of perspective on consciousness? | ||
So one thing I do think is that, in terms of conscious experience, there's the sort of conscious experiences that humans have access to. | ||
And I think that must just be 0.001% of the entire landscape of possible conscious experiences. | ||
So if you think, imagine if you were a bat and you could echolocate. | ||
That's just a radically different conscious experience. | ||
I don't think that maps onto any sort of conscious experience that humans could have. | ||
Have you seen people do that? | ||
You see blind people? | ||
Some blind people can do that? | ||
It's pretty amazing. | ||
It is amazing. | ||
Very effectively, too. | ||
It's like shockingly effectively. | ||
Yeah, I think you're absolutely right. | ||
I mean, but there's also experiences, human experiences, that are available without drugs that some people have achieved through radical states of meditation and kundalini yoga, where they could achieve natural psychedelic states. | ||
Holotropic breathing, people that have done that have experienced, like, really radical psychological transformations and incredible psychedelic experiences from that as well. | ||
Yeah, and so I think, like... | ||
These sorts of experiences are very important, very interesting. | ||
I said that maybe we experience 0.01% of all possible conscious experiences, and that just allows you to see a little bit more of this potential vast landscape. | ||
Whereas I think there's nothing unmagical about saying ultimately that's all explained in terms of physics, in terms of different sorts of neurons firing and different sorts of transmitters and so on. | ||
We don't need to say, oh, and it's also this other thing which breaks all the known laws of physics that you're seeing into some other dimension in order for that to be an incredibly important thing. | ||
And nor is it unscientific to say we know almost nothing about consciousness. | ||
In terms of the areas of scientific inquiry, we have no understanding at all about the relationship between conscious experiences and, you know, what we would think of as physical processes. | ||
We really have no idea about, you know, if you give me any sufficiently complicated physical process, which ones are conscious and which are not. All we can go on is really this: well, I'm conscious, and so I know that things that are kind of like me are probably conscious too. | ||
And that's the best we've got, really. | ||
And this is known as the hard problem of consciousness. | ||
And philosophers often say that they've solved it with something, and I think it's always begging the question. | ||
I think we should be very open to the fact that, just as in, you know, 3000 BC, people had no idea about the laws of physics. | ||
This was just completely unexplored territory. | ||
We should think that in contemporary science, this is just a big, like, big black gap in our scientific understanding. | ||
And perhaps it's something maybe 21st century science, maybe 22nd century science can really get to grips with. | ||
It does seem like the ultimate question. | ||
Like, what is it for? | ||
Why is it here? | ||
What controls it? | ||
Is it in the mind? | ||
Is it external? | ||
Is the brain just an antenna that tunes into consciousness? | ||
The dimethyltryptamine question is so bizarre because it's the most potent psychedelic drug known to man and your brain makes it. | ||
What's it in there for? | ||
I don't know if this is a myth, but I've heard it's what gets made when you die. | ||
Yeah. | ||
They believe that during high rates of stress, your body believes you're going to die. | ||
And when you're dreaming, when you're in heavy REM sleep, your body produces larger amounts of it than baseline. | ||
But they don't know. | ||
It's really difficult. | ||
They've only just now, within the last few years, the Cottonwood Research Foundation, which... | ||
Dr. Rick Strassman has a big part of it. | ||
He's the guy who wrote the book DMT: The Spirit Molecule. | ||
He did a bunch of the first FDA-approved drug trials with civilians, where they took people and gave them a Schedule I drug, dimethyltryptamine, which is so crazy, that it's a Schedule I drug that your body produces. | ||
But they gave it to people intravenously over the course of several months and they documented all the different trips and all the different commonalities that these people had in their experiences. | ||
And he's working very closely with the Cottonwood Research Foundation. | ||
And one of the things that they found, that they've only recently discovered, is that it's produced by the pineal gland; before, that was just anecdotal evidence. | ||
We knew that DMT was produced by the liver and the lungs, but now they know for sure because they've isolated it in rats. | ||
So in living rats, they know that they produce DMT with the pineal gland. | ||
So that explains a lot of ancient Eastern mysticism and all the symbology, all these symbols that people had to represent this gland. | ||
Now they know, okay, well this gland definitely does produce this incredibly potent psychedelic drug. | ||
But now the question is, at what levels, during what periods of stress, do you have to bring someone to the point of death before they experience this? | ||
And if that is the case, is it possible that consciousness itself, since we haven't really figured out what exactly it is, is it possible that consciousness can travel through this chemical pathway? That maybe these intense dimethyltryptamine experiences are in fact a gateway to what people have assumed exists from the beginning of time, | ||
like an afterlife, or a sea of souls, or something, some stage of existence other than this physical existence that we all experience right now? | ||
Yeah, so, I mean, I feel like I'd be... | ||
Sounds like crazy talk, right? | ||
It sounds pretty crazy. | ||
It's coming out of my mouth and I'm going, what the fuck are you talking about, dude? | ||
I think I'd just be surprised if consciousness was just this one chemical. | ||
I think it's much more likely that it's this emergent phenomenon from this incredibly complex system of billions of different neurons firing in a certain way. | ||
And when you have a certain process that's sufficiently complex in the right way, somehow, and this is just this big black box that we've got no idea about, somehow subjective experience comes out of that. | ||
But it would seem... | ||
I mean, otherwise the issue is you could have just DMT sitting in a test tube or something, in a Petri dish. | ||
And it would seem like, oh, is this Petri dish conscious? | ||
That would seem really strange. | ||
Why would that be the case? | ||
If you're breathing air and the air keeps you alive, like you're breathing in and breathing out, you don't think that air carries the life with it to another place, right? | ||
Air is just a component of life. | ||
It's something that your body requires. | ||
Yeah. | ||
So, I mean, it's possible. | ||
Maybe it's the case. | ||
Though, again, I feel I'd be surprised if it was the case that this chemical is necessary for consciousness in some way. | ||
I'm not saying it's necessary. | ||
But I am curious as to how consciousness varies. | ||
You know, consciousness and the actual feeling of being alive varies depending upon your health, depending upon stress levels. | ||
Depending upon love and happiness. All these different factors change the way you view the world, which is really interesting, because in effect that changes consciousness, and you can be more, you know, more elevated. I guarantee you, all this effective altruism that you're concentrating on is somehow or another elevating your consciousness, because you're putting out so much love and so much happiness and you're helping so many people. | ||
There's so many positive benefits to your very existence. | ||
I've got to believe that somehow or another that manages to come back to you. | ||
I mean, it definitely comes back to me in kind of how I feel about my life. | ||
I mean, when we were talking about how money is just not the key to a happy life, the question is, well, what is? | ||
And the answers are having a great community, having a greater purpose in life, feeling like you're making a difference. | ||
So all of these reasons are why. | ||
So we've built up this kind of community around effective altruism. | ||
You know, people all around the world who are making a significant change. | ||
So for example, donating 10% of their income to the charities they think are most effective or pursuing a career that they think is really effective. | ||
And one thing I wasn't expecting from the outset, but I'm so happy happened, is that this strong community has formed. | ||
It's kind of like a little global village or something. | ||
And people have found that actually, far from being a sacrifice, as you might have expected, this is actually incredibly rewarding. | ||
Because you've now got this community of people who have shared aims with you, and you're all working towards this greater goal. | ||
And that's something that I think is very lacking in the world today. | ||
So many people just... | ||
They work 9 to 5, and they have a nice time on the weekend, but they're like, where is all of this going? | ||
At the end of my life, am I really going to think, yeah, I made the most of this? | ||
Whereas if you think at the end of your life, like, yep, I dedicated my life to helping others, and I had this transformative impact on thousands of people, you're not going to think at the end of your life, gee, I really wasted that. | ||
It's just not something I think you can really look back on that way. | ||
If you go deep, though, down the philosophical rabbit hole, you really consider that life is this temporary experience, and even benefiting someone through this temporary experience is still a temporary experience. It's like you're helping someone, you gave them a pillow for the ride, and it's a temporary ride. The ride comes to an end, and then what? And then what is the point of all this? Like, what is the point of effective altruism if you're just helping people during this temporary ride? That doesn't seem to mean anything. | ||
Yeah, so I think there's two things. | ||
I like your eyebrow. | ||
unidentified
|
It's really cool. | |
I can't help myself. | ||
I can do that too. | ||
Just raise up. | ||
unidentified
|
I just go, what the fuck is this? | |
Freak myself out. | ||
Well, we do get freaked out at this, you know, when you think of existential angst. | ||
The angst of existence. | ||
So I think there's two answers here. | ||
The first is that the ride is the goal, ultimately. | ||
Again, if you think the purpose of life is to increase the amount of happiness and reduce the amount of suffering, the final goal is good experiences, and the kind of anti-goal is bad experiences. | ||
So when we're sitting here talking, having a great time, this is us kind of achieving. | ||
This is us getting points on the win counter. | ||
Because we're having a good time. | ||
That's right, yeah. | ||
If we were really hating this, then we'd be losing. | ||
Well, even more so because we're broadcasting this live and millions of people are going to hear it. | ||
And hopefully they're enjoying it. | ||
Hopefully. | ||
And maybe if they're not, at least there's a little stress relief. | ||
Like maybe they're at the gym and they go, these fucking idiots! | ||
And they're doing squats and they're getting angry. | ||
Yeah. | ||
So I think that's the first thing. | ||
But then the second thing relates to this idea of cosmic significance. | ||
Where what often motivates... | ||
So you say, oh, we're just along for a ride. | ||
We're all going to get eaten up by the sun eventually, and so on. | ||
What's the kind of greater purpose of life? | ||
But I actually think there are some ways that our actions now can have much greater cosmic significance. | ||
And that's because, I think, if you think that the human race survives for the next few centuries, it seems kind of inevitable that we're going to spread to the stars. | ||
And I think that would be good. | ||
Again, from this perspective, we can go into more arguments if you want, of just saying what we want to do is promote happiness and reduce suffering. | ||
If that means we can live on other planets as well and have kind of thriving civilizations there, not only where the people are having great lives, but also making scientific, artistic contributions and so on, then that's a good thing to do as well. | ||
Well, there's no technological reason for thinking that we won't be able to do that in the future, given current rates of technological progress, unless something really bad happens along the way. | ||
And this kind of gets back to one of the things we talked about right at the start, that one of the focus areas of the effective altruism community is on trying to reduce risks of human extinction, of global catastrophic risks. | ||
These are the sorts of things that could imperil the human journey, as it were. | ||
And I think that if you're working to mitigate some of these things, then you're increasing the chance that we do get to the sort of level where humanity can have a thriving future, not just on this planet, but on other planets as well. | ||
And that actually means your actions really do have this huge cosmic significance. | ||
So the conscious effort to be a kind person, a generous person and effective altruism spreads and it impacts people. | ||
There's this ripple effect, and your good deeds could perhaps fuel enough people with this thought and with effective altruism, and more people might act on that, to the point where we reduce the amount of suffering, where we extend the lifespan of human beings, where we extend the areas where we have no war, where we reduce the amount of violence, to the point where we can successfully innovate and get off this planet. | ||
And then start from scratch with a new dictator on Mars. | ||
Donald Trump on Mars. | ||
How about that? | ||
Yeah, I mean, so I think... | ||
Putin on Mars. | ||
Well, if he could become president of Mars, I'd be pretty happy with that. | ||
It'd be fascinating. | ||
We'd have to go to war with Mars. | ||
Do you think, though, I mean, I've wondered about this many, many times. | ||
I wonder if it's an outdated idea, this idea of traveling to the stars. | ||
And again, I go back to this whole interdimensional thing. | ||
I wonder if that's the reason why we have never been visited by species from other planets. | ||
Maybe that's not what happens. | ||
Maybe they develop artificial realities. | ||
Like what Jamie was talking about to me with these artificial computer realities. | ||
If someone develops some sort of a matrix-like world where you can plug into it and experience an infinite number of things, an infinite number of artificially created dimensions that are indistinguishable from this, why would you want to, like, risk a six-month trip in a metal tube to another planet? | ||
I mean, maybe that's really retro. | ||
Maybe that's a really ancient way of looking at things. | ||
Maybe it's like zeppelins, like big flying balloons instead of, you know? | ||
So, yeah, the question you've raised is called the Fermi Paradox. | ||
Right. | ||
Which is, just given there's so much, so 100 billion stars in our galaxy, 8 billion galaxies in the affectable universe, 100 billion in the observable universe, and the universe is also pretty old, 15 billion years old. | ||
So if it was the case that life is very common, that it's very easy for us to, life to then develop to a level of advanced technological ability, we should expect to see evidence of aliens all over the place. | ||
But yet we see absolutely none. | ||
And that means that somewhere along the path from a habitable planet to a space-faring civilization, there must be some big filter. | ||
There must be some step that's just incredibly hard, or it's incredibly unlikely that civilization, or life, moves from that step to the next. | ||
And one hypothesis is this, yeah, like, people just... | ||
Civilization gets to a sufficiently advanced level and they just chill out. | ||
Or they go internally. | ||
Yeah, they go internal. | ||
The issue with that explanation, I think, is it's just not strong enough. | ||
Because... | ||
You'd have to think that, for this kind of filter to work, it has to be a really strong filter. | ||
Filter? | ||
Yeah, as in like, because there's just so many stars, so many Earths, so many seemingly habitable planets, it has to be the case that it's exceptionally unlikely at some stage or other. | ||
Like, not just really unlikely, as in like, you know, one-in-a-trillion unlikely, | ||
at some point on this path from habitable planet to spacefaring civilization. | ||
And so you'd have to think, of a trillion civilizations that get to this level of technological ability, they all choose to turn inward. | ||
And that seems just very unlikely. | ||
It seems like, well, at least one would really try and spread out. | ||
And if so, then we'd see evidence of that. | ||
Because, cosmically speaking, the time between getting to the level of technological capability where you can spread to the stars and the point where we'd be able to kind of see real evidence of that is kind of small. | ||
So I actually think that the reason that we can't see aliens is because the very first stages of life are incredibly unlikely. | ||
The move from nothing to kind of basic replication, and then secondly, the move from single-celled organisms to multi-celled organisms. | ||
And the reason for thinking this is very unlikely is it took an incredibly long time on Earth, billions of years before this happened. | ||
And in particular, in the move from single-celled to multi-celled life, that's only ever happened once. | ||
And so, given that we don't see any aliens, we should think some part of this is really hard. | ||
Our best guess is that that move from single-celled to multi-celled, and perhaps from the creation of the first cells as well, that was incredibly difficult. | ||
And that means that we're just exceptionally lucky to be alive, as it were. | ||
But if the universe is infinite, that means that this has happened an infinite number of times. | ||
That's right. | ||
Though it might be very far away, like sufficiently far away that we are not connectable to each other, like we can't contact each other or observe each other. | ||
But there's an infinite number of those infinitely far places. | ||
So there would be life in some other clusters of the universe. | ||
And again, the idea of an infinite universe is only a hypothesis. | ||
And I'm just deferring to other people who say it's the leading hypothesis. | ||
Well, the most puzzling hypothesis to me was the evidence of supermassive black holes being at the center of every galaxy. | ||
And that the hypothesis was that the supermassive black holes are exactly one half of one percent of the mass of the entire galaxy. | ||
And that if you go through those supermassive black holes, you may in fact go into a completely new universe, filled with hundreds of billions of galaxies, each with supermassive black holes at the center of those galaxies, which will take you to hundreds of billions of galaxies in another universe. | ||
It's never-ending, and that's what the real infinity is. | ||
It's not just the mass of all the things that we can observe in the 14 plus billion light years that we know of from the Big Bang to today. | ||
It's all of those things being portals to incredibly different, totally new universes. | ||
Okay, yes, it's turtles all the way down. | ||
Turtles all the way down. | ||
So the real question to me, and I proposed this to Brian Cox and didn't get a sufficient answer, is: why would we assume that there's someone more advanced than us? | ||
It is possible that someone, some species, something is the tip of the spear. | ||
That something is the first. | ||
That something is the most advanced life form in the universe. | ||
Why would we assume that someone would be more advanced than us if we are the most advanced thing that we can find? | ||
The only logic that I could point to was that we are relatively young in terms of the history and the age of the universe itself. | ||
The universe itself being roughly 14 billion years old. | ||
We are 4.6 billion. | ||
What is the age of the earth? | ||
Somewhere in there? | ||
Somewhere in the neighborhood, right? | ||
Relatively young when you consider that 10 billion years of life, give or take, or of existence happened before we came along. | ||
But why would we assume that there's anything out there that's more advanced? | ||
And why would we assume that this isn't as far as anybody's ever gotten? | ||
In terms of infinity, right? | ||
14 billion years seems like a long time. | ||
But in terms of infinity, it's a blink. | ||
So I think we should believe that in the... | ||
And again, let's now just ditch the infinity and just think about the observable universe, which is finite. | ||
Because people are pulled over, sweating in their cars right now. | ||
Yeah, yeah, exactly. | ||
Infinity... | ||
Have you ever heard of Graham's number? | ||
This is now a total digression. | ||
Of Graham's number... | ||
I don't believe so. | ||
What is Graham's number? | ||
It became known as the largest number ever seriously used in a mathematical proof. | ||
And Tim Urban of Wait But Why has this amazing post trying to explain just how big Graham's number is. | ||
And you have to use a special notation in order to be able to explain it. | ||
And numbers just get really big. | ||
And once you really start to think this through, you're just like, you're left just kind of walking back and forth. | ||
Yeah, not like just totally freaked out. | ||
Yeah. | ||
For our little monkey minds. | ||
Because you think like trillion is so big. | ||
unidentified
|
Yeah. | |
A trillion is just a speck of dust compared to Graham's number. | ||
unidentified
|
Right. | |
Even a trillion years is a speck of dust. | ||
When you consider the possibility of the universe itself being infinite, or the possibility that it's a continuous cycle of big bangs, to expansion, to contraction back to an infinitely small point, back to another big bang, which is a plausible possibility. | ||
Yeah, I mean, I think, yeah, I'm also very worried, you know, I'm not Neil deGrasse Tyson, I'm sure I'm butchering tons of the science. | ||
I think my understanding at the moment is that we currently think that the universe is just expanding and it just keeps expanding further. | ||
I know it was definitely a leading theory that it was going to expand and slow and then kind of crunch. | ||
Yes. | ||
But you mentioned humans being the most advanced kind of creature. | ||
I think that probably is correct in the observable, or certainly our galaxy, let's say. | ||
Well, we know it is in our solar system, right? | ||
Yeah, that's right. | ||
But I think we know it is in our galaxy as well. | ||
You think so? | ||
It's so far. | ||
But the thing is that it's like 100,000 light years. | ||
Oh, nothing. | ||
But when you're thinking about 15 billion years of the age of the universe, that's actually just a very short period of time. | ||
unidentified
|
Right. | |
But why would you assume that 100,000 light years away, there's not something exactly like us? | ||
So it's possible. | ||
But the thing is that if it was... | ||
Somewhat easy, or if it was just not incredibly difficult for intelligent life to evolve, then it would have happened in the past already and we would see evidence of it. | ||
And the fact that we don't see any evidence at all of intelligent life and other solar systems at all suggests that it's incredibly difficult for that to happen. | ||
But isn't that like being in the woods and unzipping your tent and sticking your head out and saying, I don't see anything. | ||
This must be empty woods. | ||
It's more like... | ||
I mean... | ||
You're talking about a very small area that you've observed and we've taken account of. | ||
So I think it's more like... | ||
Because I think... | ||
If an alien civilization, or us in the future, goes to kind of start, yeah, spreading to the stars, in the course of what's cosmically just a few seconds, a million years, let's say, there will be really significant evidence. | ||
You'd see Dyson spheres being constructed around suns, you know, to harness the sun's energy. | ||
You'd see some evidence of, like, galactic engineering projects and so on. | ||
It would be like a really big impact. | ||
Do you think you'd see that with hundreds of thousands of light years between us and the observable objects? | ||
But again, 100,000 light years is just not very long compared to the kind of 15 billion. | ||
So it would just be this amazing coincidence if it's the case that... | ||
A life that's as advanced or more advanced than us has evolved at just the same time as us, where 100,000 years, give or take, is basically just the same time, but hasn't evolved more than a million years ago, where we would start to see kind of major impacts of that. | ||
So if something within the observable universe... | ||
But we've observed so little. | ||
We don't even have really adequate photographs of anything outside of our solar system. | ||
I mean, everything is just radio spectrum. | ||
You know, the analysis that they're getting off of light waves, of what the components of the atmosphere are. | ||
So using your analogy, what I'm suggesting is that if it was the case that intelligent life was not that hard to come by, you'd stick your head out of the tent and it would look like Tokyo rather than looking like the woods. | ||
But why does it have to look like Tokyo? | ||
Why can't it look like Kansas? | ||
Why can't it be like really spread out and very little life? | ||
Because I think if life is spreading out, then it's just going to want to, what does life do? | ||
It just tries to harness resources and tries to grow more of itself. | ||
Maybe it reaches a point where it realizes that's futile. | ||
It just concentrates on effective altruism at home. | ||
So that's the turning inward suggestion again. | ||
And so maybe it's the case that like, yeah. | ||
Like, is it more important to get your shit together at home or to go all over the world with the same bullshit ideas? | ||
Right? | ||
And if that's the case... | ||
Wouldn't that be the same thing that you could turn towards interstellar travel? | ||
Like, wouldn't it be more important for these communities to concentrate on taking care of their planet and figuring out a way to work in some sort of harmonious fashion with the very nature of the planet itself rather than travel to the stars? | ||
I mean, possibly. | ||
But now imagine there's... | ||
So on this alien planet, there's 10 billion aliens, and they're like, let's say they're a thousand years more advanced than humans are at the moment. | ||
In order for this argument to work, it'd have to be the case that every single one of them makes that decision to just turn inwards and focus on... | ||
Why would that be the case? | ||
Because not all those people would be the ones that would innovate in the first place. | ||
It wouldn't have to be everyone that makes that decision, but it would have to be that everyone of a high enough consciousness to figure out how to make these interstellar machines decides not to harness this nuclear power and jet off into space. | ||
But I think over time that would just be everyone. | ||
Really? | ||
Well, yeah, I mean, just technological progress just keeps going, and eventually, like, I mean, obviously we're doing this, like, weird thought experiment. | ||
Right, right, right. | ||
Speculating on, like, economics and sociology of a hypothetical alien world. | ||
But, uh... | ||
I mean, just at some point, as a civilization progresses, then there's going to at least be many, many actors with sufficient power and capability to spread to the stars. | ||
And you need to say that every single one of them decides to turn inwards. | ||
So it's sort of like technology becomes very rare and then ultimately over time becomes very common, like the cell phone. | ||
Like the cell phone, yeah. | ||
unidentified
|
Right. | |
So when a cell phone was first invented, it was extremely rare and very expensive. | ||
Now everyone has one and the capabilities of those cell phones have greatly, greatly improved. | ||
Yeah. | ||
And that this will happen with everything, including space travel. | ||
Yeah, I mean, but also it doesn't need to be the case that it gets out to 10 billion people, even if it's just like 1,000 people or something. | ||
Again, it would just seem unlikely that, you know, in every civilization, even if it's just 1,000 people, every single one chooses not to, that not a single person thinks, | ||
hey, I just want there to be more of us spread out. | ||
Now, that obviously is dependent upon there being a more advanced civilization than human beings on planet Earth. | ||
Because if there weren't, if they were a few years behind us, like if they were stuck in the 1950s, or maybe they're stuck in ancient Greece, then obviously they don't have the capabilities yet. | ||
We might be the very most advanced. | ||
We might be the very tip of the spear, right? | ||
Yeah. | ||
And I just think, yeah, because I think it would be unlikely that... | ||
Something more advanced happened just a little bit faster than us, but not, say, 100 million years ago, which is not very long ago in cosmic terms. | ||
But it's still possible. | ||
I mean, it's still possible that something happened 100 years quicker than us, or that they haven't had the same setbacks that we've had in terms of, like, asteroidal impacts and natural catastrophe, supervolcanoes and the like. | ||
It's a real weird thought experiment, because you start thinking, and you start extrapolating, okay, well where are we gonna be? | ||
You know, where are we gonna be? | ||
And why would we do that? | ||
Like, that's one of the things that always gets me about this whole trip to Mars. I have a joke about it in my last comedy special. Somebody actually said this to me, and this was before California had solved its drought, or rather before Mother Nature solved our drought for us. People were like, hey man, we should really consider going to Mars, because, I mean, look at our environment, California's almost out of water. And my joke was like, we're right next to the fucking ocean. | ||
Like, there's so much water, you can't see the end of it. | ||
We have a salt problem, we don't have a water problem. | ||
Like, what are you gonna do? | ||
You gonna bring water to Mars? | ||
Like, that's the stupidest thing I've ever heard in my life. | ||
Yeah, there's this weird, when people start talking about Mars, I mean, I think, so there's the project of going to Mars, setting up a colony. | ||
Now, like, the aim of doing that because it's awesome? | ||
Totally on board with that. | ||
In the same way as, like, going to the moon, it's like, look what we can achieve. | ||
This is an exciting, like, global human project. | ||
Even just the space shuttle going into orbit, it's pretty badass, right? | ||
Yeah, exactly, exactly. | ||
But then this talk of like, oh, well, we need this in order to be able to survive as a species. | ||
I'm like, look, if you want to have this kind of refuge or colony in order to make the Earth more robust, Mars is just not a great place to pick. | ||
There's so many different ways that, I mean, Mars is like really inhospitable. | ||
And if you wanted to build a refuge, why not go under the sea? | ||
That's, like, going to be protected from, you know, viruses or asteroid impacts and so on. | ||
Not really, though. | ||
If one of those big things, like the one that slammed into the Yucatan, slams into where your village is in the sea? | ||
I mean, if you had this underwater village with, you know, 10 years of food supplies and so on, then you could, like, come back. | ||
Because the impact from the asteroid wasn't just that, like, it shook everyone up. | ||
It's that the sky is gone. | ||
unidentified
|
Mm-hmm. | |
The skies get clouded over with ash. | ||
The Earth rang for a million years. | ||
Oh, what is that? | ||
As in like... | ||
From the impact. | ||
Like... | ||
Yeah. | ||
That's so interesting. | ||
That's so insane. | ||
Yeah. | ||
When you think about how big that thing was that killed the dinosaurs 65 million years ago, and that there's hundreds of thousands of those things floating around in space. | ||
So yeah, I was asking some people at NASA just two days ago, actually, about how many of them we've managed to identify. | ||
Because they're serious about kind of scanning the skies to find them all. | ||
And the answer, I thought we had it covered. | ||
I thought this was something that NASA was like, yeah, yeah, we know where all the Earth killers are. | ||
And their response was like, no, we've got no idea. | ||
We don't know how many of them are out there, and so we don't know how many we've managed to track. | ||
There's a guy named Randall Carlson that I've had on the podcast a few times, and he's obsessed with the idea that asteroidal impacts were probably what ended the Ice Age, you know, 10 to 12,000 years ago. | ||
And there's a significant amount of physical evidence that points to this. | ||
Both in evidence of impact in nuclear glass. | ||
I think it's called trinitite. | ||
I forget the exact word. | ||
But it appears all throughout Europe and Asia at around that same timeline, around 10,000 to 12,000 years ago, when they do core samples. | ||
And it points to this idea that there were significant... | ||
impacts from asteroidal objects all over Europe and all over Asia around that time. They think some of them slammed into the ice caps; North America, a giant chunk of it, was covered in as much as two miles of ice just 10,000 years ago. And he points to an incredible amount of physical change in the environment that looks like it took place over a very short period of time. | ||
Like catastrophic change over an incredibly short amount of time that he believes points to these impacts melting the ice caps, creating massive amounts of flooding, killing off who knows how many people, resetting civilization in many different parts of the world. | ||
This evidence of the nuclear glass, of these micro-diamonds that also exist, they find them at nuclear test sites when they set off bombs, and they also find them at asteroid impact sites. | ||
And when you know that we have been hit many times in the past, and they do have evidence of that, and then you see the moon and all the different impact craters on the moon, you know that this is just what he calls a cosmic shooting gallery, essentially. | ||
He's like, it's very likely that that was the cause of the end of the Ice Age. | ||
There's a lot of this climate data that sort of seems to point to that as well. | ||
So this is now, like, really outside my area of expertise. | ||
I'll send you some links to some of his stuff, because he's been obsessed with this for about 30 years. | ||
Fascinating guy. | ||
The two things that would really surprise me about that are, firstly, just that there were so many ice ages, and it just seemed to be this, it comes on, goes off. | ||
Oh, sure, yeah. | ||
You know, fairly dynamic, predictable process, whereas asteroid impact, super random. | ||
So you wouldn't expect to have this kind of back and forth dynamic if it was asteroids that was doing it. | ||
And then secondly, my understanding would be that asteroids would cool the planet because asteroid hits, ash just spreads out all over the sky. | ||
That just blocks out sunlight. | ||
So it would surprise me if it had this kind of warming effect. | ||
Well, I think the idea is that, first of all, when it hits, the impact is massive, and it melts off just a huge chunk of the ice that is covering North America, right? | ||
And that's one of the things that causes this massive flooding and this massive changing of the topography. | ||
And as far as, like, what causes the natural... | ||
I don't know if it interrupts it temporarily, and then it comes back and gets warmer. | ||
But, yeah, that natural cycle of... | ||
Warming and cooling has been going on since they, I mean, from as far back as they can measure it. | ||
What he's talking about is significant quick changes. | ||
Also the extinction event that killed somewhere around 65% or more of all of the large mammals in North America. | ||
Really quickly, like woolly mammoths, really quickly. | ||
Sabertooth tigers, really quickly. | ||
They don't know about that. | ||
There's a lot of speculation back and forth about that. | ||
Because they think that humans did it, but then they found these mass dead sites where they're not consumed. | ||
What were the ones that he showed, where these woolly mammoths, they found them where their legs were broken, and it looked like just the impact of something had knocked them flat, and they had found like thousands of them in these mass death sites. | ||
Interesting. | ||
But I thought that the... | ||
So firstly, it just seemed to me like the homo... | ||
The idea that it was humans killing them all just seems like... | ||
Crazy. | ||
Oh no, I thought it just seems like such a good explanation. | ||
But they didn't even have... | ||
They had atlatls. | ||
That was like the best weapon they had at the time. | ||
They weren't even riding on horseback at the time. | ||
But then with respect to the death sites, I thought the mechanism for killing a woolly mammoth is you've got like 200 humans and you just chase the woolly mammoth off a cliff. | ||
That does work if you can get them near a cliff. | ||
But the idea of getting them all near cliffs and killing them all off by a bunch of people that hadn't figured out the wheel seems a little unlikely. | ||
It's just... | ||
It's possible. | ||
Like, over thousands of years. | ||
Because that's the thing, like, we often tell these stories about, you know, pre-civilization humans. | ||
It's like, oh, and then they migrated and made this great journey to Europe and so on. | ||
And often that's like, they moved a mile every year. | ||
unidentified
|
Right. | |
So it's like, great journey is actually just this very gradual thing. | ||
Yes, yes, very gradual. | ||
And similarly, if you've got this grave site and it's got, wow, hundreds of woolly mammoths in this one place, that might be over thousands of years. | ||
I mean, again, this is just something I... | ||
No, that's the thing. | ||
They're talking about carbon dating, that it's all within the same time period. | ||
You'd have to really go over his stuff with a fine-tooth comb and talk to him about it, because I'm not the right guy. | ||
I just listen to him and go, whoa, and then try to relay it as much as possible. | ||
There's a podcast that I actually retweeted today, because somebody brought it up on YouTube. | ||
It's available, so I'll send you to it afterwards and see what you think about it. | ||
But this is something, yeah, if you know the book Sapiens... | ||
No, you're like the fifth person to talk about it. | ||
I've got to get it. | ||
Everyone talks about Sapiens. | ||
Sapiens is like THE book. | ||
Pull on up to that. | ||
Pull that a little closer to you because it makes a big difference in the sound. | ||
But yeah, one of the things that most blew my mind there was how much megafauna there was in the early days of Homo sapiens. | ||
You know, moving across North America, there were two ton sloths. | ||
Huge giant sloths. | ||
And these are one of just very, very many massive megafauna that we just don't have anymore. | ||
Yeah, the blitzkrieg hypothesis is what they call the human animal killing off all of the other animals. | ||
It's a really troubling hypothesis because we don't want to think that we're capable of doing that. | ||
But obviously we do do that. | ||
I mean, we're doing it right now. | ||
We did it to the buffalo. | ||
I mean, we almost brought the bison to extinction. | ||
Did it to the dodo. | ||
Yeah. | ||
We're doing it just... | ||
Tasmanian tiger. | ||
There's a lot of different animals that within our lifetime have gone extinct. | ||
I mean, we're actually, like, in terms of extinctions, I'm not sure if we'll get the number right, but it would be pretty accurate to describe this as another mass extinction, maybe the sixth, because it's just huge, the number of species that have gone extinct as a result of human activity. | ||
And it's also one of those things where we don't think of it as being significant because it happens slowly over the course of many years, but if you look at it on a timeline, you're like, oh my god, look, everything's dying right now. | ||
Yeah, yeah, exactly. | ||
So it's... | ||
Slow by human standards, but very quick by geological standards. | ||
It's a fascinating subject, the end of the Ice Age happening so quickly, the animals dying off so quickly, and so many large mammals dying off so quickly. | ||
When you think about what we know people have done, like when we almost killed off the bison, we know why they did that, we know how they did that, and they did it with extraordinary weapons. | ||
I mean, they did it with high-powered rifles. | ||
They could shoot things from a far distance. | ||
They did it by shooting off trains. | ||
I mean, they did a lot of crazy shit back then. | ||
So we understand, I mean, and there's a lot of physical evidence. | ||
There's photographs of the actual piles of bones and all that crazy shit. | ||
When you take away those physical capabilities, the extraordinary physical capabilities, like even riding on horseback, there's a guy named Dan Flores, who's a fascinating guy, he's a scholar, who believes that even without the Europeans coming over here and market hunting and killing off all the bison, just the firearm and the horse with the Native Americans, it's entirely possible that they were going to eradicate the bison on their own. | ||
I mean, again, it just depends about timescales. | ||
So even if you're just killing like slightly more of the species, like killing just enough of the species that they're now below the, you know, two children for every two parents. | ||
unidentified
|
Right. | |
Viability stage. | ||
Yeah, exactly. | ||
Then just over sufficient time. | ||
Yeah. | ||
And remembering that, for Homo sapiens, the hunter-gatherer age was 190,000 years. | ||
It's very long time spans. | ||
Again, very short geologically, but... | ||
Yeah, very long time spans. | ||
So again, you don't have to be killing that many woolly mammoth to drive them to extinction over the course of several thousand years. | ||
What are your thoughts when it comes to the ethical reintroduction of animals that have gone extinct? | ||
Like, there are some people in Russia that are currently... | ||
Working on some form of a woolly mammoth. | ||
We're going to take woolly mammoth DNA from some of these frozen bodies that they've gotten. | ||
I mean, they've gotten some pretty intact woolly mammoths now, and they're going to try to clone one. | ||
Yeah, so I don't know the details of how this will work. | ||
I guess they have to gestate it in an elephant. | ||
But I mean, I think it's like scientifically interesting. | ||
I don't think there's anything wrong with it. | ||
I don't think there's anything... | ||
Like where you have woolly mammoths everywhere. | ||
Yeah, I mean, I think... | ||
I don't think there's any ethical imperative to do it. | ||
I think there's not an imperative not... | ||
Like, I would think just if there's more woolly mammoths, that's the same as there just being more elephants. | ||
And it might be of scientific interest. | ||
I heard... | ||
While we're on, like, hypotheses that we've heard where we're like, oh, that's cool, but it sounds ridiculous. | ||
Yeah, I heard the idea was reintroducing woolly mammoths to, like, stomp down snow in order to prevent... | ||
Yes. | ||
unidentified
|
...prevent... | |
Global warming. | ||
Yeah, to slow it down somehow or another. | ||
Yeah. | ||
That's definitely the sort of thing that people say over dinner, but... | ||
Yeah, well, the idea wasn't just stomp down snow, but also to eat the foliage. | ||
Okay. | ||
Yeah, there's like some defoliating thing that they're doing, where they would consume so many trees and so many plants that it would actually lower the temperature of the earth. | ||
unidentified
|
Like, what in the fuck? | |
Seems that you're skeptical of that. | ||
But, I mean, there is this philosophical question of whether you should... | ||
So, the question of biodiversity loss. | ||
Which has been huge. | ||
How do you value that? | ||
So is it the case that loss of a species, you can just cash that out in terms of impacts on individuals? | ||
Because obviously it's bad for the animals that die in the course of that, and we maybe have a loss of information that we can just not get back. | ||
But is there something intrinsically bad about just having fewer species? | ||
People act in a way that suggests they seem to believe yes, but it's hard. | ||
I think it's hard philosophically to cash that out. | ||
I think it's hard to explain why we would care so much about losing species when we don't seem to care about, you know, deliberately randomizing breeding and so on, so that we get more species. | ||
It seems like we're only just conservative about not losing them. | ||
But if it really is of value to have greater diversity of species, why do we not actively try and promote a greater amount of biodiversity rather than merely preventing loss of biodiversity? | ||
I think the reintroduction of species, if you have an environment that's stable, if you have some sort of an ecosystem that's stable, and then you reintroduce a predator or prey or some animal that's going to eat up all the foliage, you're running this big risk, and you're taking these big chances that you can sort of predict the future. | ||
You could look at A plus B, well, that's going to equal C. But it doesn't always work that way, and there have been disastrous results when they've introduced species to other environments where they're not native. | ||
You know what's going on with places like Australia? | ||
Australia is kind of hilarious in that regard. | ||
Yeah, so they introduced a type of frog to Australia. | ||
I'm going to butcher this as well. | ||
They introduced a type of frog to Australia. | ||
It took over. | ||
So they introduced rabbits to try and eat these frogs or something to eat the frogs. | ||
And then they took over and didn't kill the frogs. | ||
Well, then they introduced foxes to try to kill the rabbits, and they killed all the ground-nesting birds, and they introduced cats to kill the foxes, and cats to kill the rabbits. | ||
Well, especially back then. | ||
You know, when they were doing this in the 1800s in Australia, they really didn't know what the fuck they were doing. | ||
They were thinking short-term, right in front of them. | ||
They also brought in a bunch of animals that don't have natural predators, so they have to gun them down from the fucking sky. | ||
I mean, they have all these deer and stags and all these majestic beasts. | ||
I mean, have you ever seen a stag? | ||
They're incredible. | ||
They roar. | ||
They sound like a lion. | ||
And they have so many of them in Australia and particularly in New Zealand, but they don't have any natural predators. | ||
Zero. | ||
No predators. | ||
So they have to fly over in helicopters and gun them down. | ||
And they leave them. | ||
They just leave them to rot. | ||
They just have too many of them. | ||
It's the same with kangaroos as well. | ||
Have you seen those herds of kangaroos? | ||
Have you ever seen that? | ||
No, I haven't actually. | ||
Oh my god, there's a video that some guy took somewhere in Australia, and it is thousands and thousands of kangaroos running across this field, and it looks like some apocalypse, some apocalyptic kangaroo invasion. | ||
See if you can find that, Jamie, in a video, because it's worth seeing to realize, oh, this is what can happen when there's no predators. | ||
Animals just get completely out of control. | ||
Yeah, so I'm vegetarian and have been for a long time now. | ||
But with some other vegetarian friends, we had the conversation of, yeah, what would be the most ethical meat to eat? | ||
And I think we concluded that kangaroo would be the most ethical, because it's being killed anyway, because they just need to cull them, like, you've got this population explosion. | ||
It's on land that wouldn't be otherwise used for anything. | ||
They're roaming free. | ||
They've got pretty good lives. | ||
The environmental impact is therefore going to be low or non-existent as well. | ||
Obviously, kangaroo meat is very unusual in almost all of those regards. | ||
It's not, though. | ||
Yeah, I mean, it's very nutritious, apparently. | ||
Kangaroo is actually a type of deer, believe it or not. | ||
Yeah, I don't believe that. | ||
I thought it was a marsupial, which is a totally different... | ||
It is, but it's related to the deer in some... | ||
Look at these fuckers. | ||
Just hanging out. | ||
This is not the one I'm talking about, though. | ||
There's a bunch of them running across a field. | ||
This is just a large population of kangaroo. | ||
Yeah, they're somehow in the deer family in some strange way. | ||
See if Jamie can find that, too. | ||
Do you know that we have wallabies in Scotland? | ||
Yeah, I know. | ||
Yeah, on an island called Inchconnachan. | ||
Yeah, I've heard of that. | ||
I've visited them a number of times. | ||
And were they introduced to Scotland? | ||
Yeah, so Lady Arran Colquhoun had... | ||
Oh, that bitch. | ||
Yeah, well, no need for that. | ||
Who is she? | ||
So she, I actually don't know, but she owned the island. | ||
She owned a zoo on the island, like a personal zoo. | ||
And she died, I think. | ||
The zoo went to rack and ruin, so it just kind of... | ||
The wallabies just got out? | ||
And the wallabies took over, yeah. | ||
And the first evidence, because people wouldn't regularly visit this, was they would find these dead wallaby carcasses on the mainland. | ||
And that was because during the winter, the loch, Scottish for lake, would freeze over, and the wallabies would hop over the ice and then get hit by a car. | ||
But they're now very tame. | ||
It was a shame because I first found out about them back when it was still a bit of a secret. | ||
That's fascinating. | ||
Now it's become a bit of a tourist hotspot. | ||
Wow. | ||
It says that kangaroos are marsupials and more closely related to possums than deer. | ||
Oh, okay. | ||
So they're not related to deer, correct? | ||
Yeah. | ||
Someone had told me that they were in some way in the deer family, or cousins of deer, or something like that. | ||
Early explorers said that they were just, that's what their descriptions were, that they were like deer without antlers, and they stood upright like men. But I saw some, I mean, it's a Quora question, so I didn't find like an official scientist saying, here's the citation on it, but yeah. | ||
Yeah. | ||
I wish I had this my whole life. | ||
Someone who could just follow me around and correct me every time I say something. | ||
Well, this is an amazing time. | ||
Somebody put something up on Instagram today, and it was a quote from the 1800s about how an ancient philosopher, or an ancient scholar rather, would have given his life for the information that's available to the common schoolboy today. | ||
And this is from a quote from 1888. | ||
Wow, okay. | ||
Which is nothing now compared to what we can do. | ||
Yeah, I think there's another statistic. | ||
And again, it's unclear how you measure this, but in terms of written information at least, one newspaper has more written information in it than a typical person in the 1700s would be exposed to in their entire lifetime. | ||
I wonder what was the natural predator of kangaroos? | ||
Because kangaroos, they're a native animal to Australia, and if they didn't... | ||
Do you know, there was a giant predator in New Zealand, at least, at one point in time. | ||
It was called the Haast's eagle, and it was an enormous eagle, the biggest eagle they think ever lived. | ||
It had something like a 10-foot wingspan, and they believed it even hunted people. | ||
A huge, huge eagle. | ||
And it's a part of the... | ||
I guess it's the Maori? | ||
It's a part of their ancient mythology, and they found out that it was actually a real animal. | ||
Somewhere around the 1400s, it was made extinct through hunting. | ||
My understanding was in Australia, before humans invaded... | ||
Proconyles. | ||
My understanding was that there were just no major predators for... | ||
That's the Tasmanian tiger. | ||
The thylacine? | ||
Yeah, they call that thing the Tasmanian tiger. | ||
That died out during, like, modern human times. | ||
That's a crazy looking picture. | ||
Look at its face. | ||
Look at that mouth on that thing. | ||
Jesus Christ. | ||
But that, I believe those things died off in the 1930s. | ||
I just typed this in here. | ||
Now it's extinct, but the dingo is probably the closest related predator they have. | ||
When did it die? | ||
Thylacine is now extinct. | ||
However, humans arrived in Australia at least 50,000 years ago and introduced the dingo about 5,000 years ago. | ||
Hmm. | ||
So maybe those things were eating kangaroos. | ||
A big part of hunting kangaroos, I guess, would probably be catching them when they're not with their young, but they carry their young inside their body in that pouch, which makes them different from any other kind of animal that would be prey, because they can take care of their young and bounce away quickly. | ||
Well, this is why, so in terms of large mammals, humans killed every single type of large mammal other than kangaroos in Australia. | ||
I think there were kind of hundreds of different types originally. | ||
Oh, there's a bunch of different things other than kangaroos? | ||
Yeah, yeah, yeah. | ||
Like what? | ||
Again, I don't know. | ||
Maybe giant koalas, let's say. | ||
But yeah, and my understanding was the reason for that was because they didn't have natural predators. | ||
And so they just didn't know what to do with people. | ||
Yeah, exactly. | ||
Yeah, that makes sense. | ||
You know, they didn't have all of these, like, defensive mechanisms. | ||
Right. | ||
And also have wolves and coyotes and bears and all these different things that are chasing them down. | ||
That's interesting, the concept of what's the most ethical thing to eat. | ||
I would think you would think it would be like mollusks. | ||
Okay, so I do think it's totally fine to eat anything without a brain. | ||
Well, what I say is I don't eat anything with a brain. | ||
So that means that oysters, mussels, clams, they're okay. | ||
So I got convinced. | ||
I didn't used to be like this. | ||
I got convinced by an advocate for what's called bivalve veganism. | ||
I mean, it doesn't make a big difference. | ||
I don't really like these things. | ||
I eat them occasionally, but... | ||
You don't like, like, mussels? | ||
No. | ||
Really? | ||
Yeah, no. | ||
Have you ever had linguine with mussels, like at a good Italian restaurant with a nice red sauce? | ||
Yeah, I mean, so when they're good, they're fine. | ||
And when they're bad, they're really bad. | ||
Well, isn't that the case with everything? | ||
No. | ||
Some things, when they're good, they can be. | ||
Gross hamburgers. | ||
I mean, you can go down the line, you know, you can have rotten food. | ||
No, but, like, you know, good pizza is just amazing pizza. | ||
Or, like, I feel like the very best mussels I'm, like, meh towards. | ||
Really? | ||
Yeah. | ||
Oh, man, you need to go to a really good Italian restaurant. | ||
Yeah. | ||
Have you ever had linguine with clams? | ||
Do you like clams? | ||
unidentified
|
Yeah, I think so. | |
Again, I just feel pretty indifferent about them. | ||
Oh, you're crazy. | ||
You just need to go to a really good restaurant. | ||
You guys are eating in England, man. | ||
That's the problem. | ||
They don't know how to make Italian food there. | ||
Yeah, that is true. | ||
I mean, there's a few people right now that are screaming in England, I make good Italian food, you son of a bitch! | ||
I'm generalizing and I'm aware I'm ignorant in saying that. | ||
Look, I can't defend English cuisine. | ||
Oh, there's some great... | ||
Having been out to New York, San Francisco... | ||
Well, London has some amazing restaurants now. | ||
London does, yeah. | ||
But it was always the generalized, stereotypical knock was that the food in England was terrible. | ||
The first time I went there was pretty bad. | ||
But yeah, with respect to what's the most ethical meat, I think it is a really interesting question because I think... | ||
You know, the debate on vegetarianism and so on is normally phrased as this either-or thing, like not doing anything or just go vegetarian or vegan. | ||
But I was interested in this question of just, yeah, well, supposing you only want to go halfway: of the different foodstuffs, what are the ones that are going to do the most in terms of animal welfare if you cut them out? | ||
Because most people, when they go halfway to being vegetarian, they might cut out red meat, so cut beef and so on. | ||
And I actually think that's, if you care at least about the animal welfare side of things, I think that's just wrong. | ||
And I think there's two reasons for that. | ||
One is with respect to the amount of suffering that the animal has in the course of its life, where... | ||
The way that chickens are currently treated, if you look at just average, and again, we're talking about most chickens, though. | ||
You're talking about factory farming conditions? | ||
Factory farming conditions, which is well over 90%. I think like 99% of chickens that are eaten are raised in these conditions. | ||
Their lives are just... I think they're the worst-off creatures on the planet, basically. | ||
And cows, I think, often don't have great lives, but it's just nothing really compared to chickens. | ||
And I think pork is similar, like pigs also have really terrible lives. | ||
Whereas larger animals, cows, sheep, just in general aren't being treated as badly. | ||
And then the second question is, how many animals are you affecting? | ||
Where if you consume a steak or something, that's like a thousandth of a cow on average. | ||
Whereas you can easily eat kind of half a chicken. | ||
And that's a factor that people normally don't consider as well. | ||
And obviously, maybe you value a cow's life greater than a chicken's life or something. | ||
We do in some strange way. | ||
There's a hierarchy that humans have almost inherently, or at least we do in the Western world. | ||
Yeah, I think it's really hard to know. | ||
Like, this is one of the hardest... | ||
Philosophical questions I've thought about for ages and have eventually given up on, is: you've got an unhappy cow day and an unhappy chicken day, which is worse? | ||
How do you weight those two? | ||
You can't. | ||
Or an unhappy fish. | ||
People have very few feelings about fish. | ||
Yeah. | ||
Like you see a dead fish, people don't feel the same way they feel if you see a dead lamb. | ||
Yeah. | ||
But in general, I've become more sympathetic. | ||
I think there's a bias where, you know, we tend to sympathize more with things that look like us. | ||
Fish are these weird, you know, they kind of look alien. | ||
They don't take care of their young, too. | ||
That's a lot of differentiation. | ||
Yeah. | ||
And so over time, I've definitely become a lot more sympathetic to taking suffering of chickens and fish fairly seriously. | ||
But I think when you combine these two factors, fish might be in that category too, although there's less good information on them.
But certainly chickens and pigs compared to beef. | ||
I actually think if you just want to take out most of the suffering from your diet: removing chickens, caged eggs, which in the US is actually basically all eggs unless you keep the hens yourself...
And pork, and maybe fish. With those, I think you're removing most of the suffering from your diet.
Vastly more than when it comes to beef or milk. | ||
Yeah, well in terms of like the amount of individuals that get impacted, you're right. | ||
And that one cow can feed much more people obviously than one chicken can. | ||
So if you're taking one life in that form. | ||
What disturbs me most about factory farming... well, one thing that disturbs me is that it sort of already existed, and then I found out about it, and it was already there.
And I had been eating it all along. | ||
And that shocked me in that I was... | ||
I remember sitting back, I'd watched some documentary on it, and I remember sitting back thinking, like, this happened because we weren't paying attention. | ||
Because I was a grown man when I found out about it. | ||
I hadn't been paying attention. | ||
And when you leave people alone and you say, hey man, do you think you can get us some beef? | ||
The guy's like, yeah, yeah, yeah, I got it. | ||
Don't worry about it. | ||
You just stay over there in your city. | ||
I'll take care of it over here, out of sight, out of mind. | ||
And then when we find out about it, and then you hear about, in America we have these things called ag-gag laws. | ||
I'm sure you're aware of those. | ||
Unbelievable. | ||
Like, no possible justification for this. | ||
Terrifying. | ||
It's just because... | ||
So, yeah, where it's... | ||
They're hiding information. | ||
Hiding information, yeah. | ||
And there was a case where an animal welfare activist went into a factory farm and was filming instances of animal cruelty for a kind of documentary film that got presented.
And she got tried and had to go to prison for not intervening in the animal cruelty. | ||
That was just happening all the time, and she was the person just actually... | ||
So she got tried for not intervening, not stopping the animal cruelty? | ||
Yeah, which is happening all of the time. | ||
I thought she would get tried for violating the ag-gag laws. | ||
No, well, she was... | ||
Because it's an invasion of privacy on a corporation and corporate secrets. | ||
Yeah, I think it was prior to the ag-gag laws. | ||
Oh, so they found another way to try her, to discourage... | ||
That's right, yeah. | ||
That's so insane. | ||
But the thing you said earlier, when you were talking about the ways in which humans are broken... I think if you just look at, yeah, the suffering that humans are inflicting now, the thing that's really worrying is how mechanized it's become.
So imagine if, yeah, there was even, let's say, a chicken just right here in front of us. | ||
And I just for fun just kick it. | ||
People would be outraged. | ||
People would just think I'm this kind of despicable person. | ||
And that's the natural reaction, because I'm just caught inflicting unnecessary suffering on this creature. | ||
But then you can just modify the circumstances such that this natural emotional reaction of sympathy just fades away, where now it's this huge warehouse, and it's not just one chicken, it's hundreds of thousands of chickens, and it's all mechanized, and it's all taken out of sight. | ||
Suddenly, yeah. | ||
I mean, Joseph Stalin said, yeah, a single death is a tragedy, a million deaths is a statistic. | ||
I don't generally like to take life lessons from Stalin, but it's an extremely good quote. | ||
But he was talking about humans. | ||
And, you know, any death of a human is a tragedy.
And when they get to large numbers, it's sort of... | ||
It's very difficult to calculate because it's hard for people to understand or grasp the concept of a million people dying in a war. | ||
What's bizarre about factory farming is that it's all kind of done behind these warehouse walls. | ||
It's all undercover and it's all incredibly common and it's all not discussed. | ||
Like if a war is happening, I was going to say if a war is happening and 100,000 people a month are dying, we're discussing, you know, how do we mitigate this? | ||
How do we stop this? | ||
How do we bring peace? | ||
There's so few people wondering how to stop chicken suffering. | ||
Yeah, absolutely. | ||
I mean, we've looked into this, and one of the reasons it's such a priority area is just how little philanthropic money is going into this. When the focus is really on factory farming, not stray dogs and so on...
It's in the low tens of millions of dollars.
Of trying to stop factory farming? | ||
Yeah, or trying to mitigate it. | ||
What is the solution? | ||
Like, other than going vegetarian, have we reached this sort of unmanageable point, where population centers like Los Angeles, New York, whatever, that don't grow their own food, have gotten so massive that in order to fuel these people with food, especially with animal protein, you almost have to have these setups?
Yeah, I mean, if you've got the constraint of animal protein, I think the answer is probably still no, but the other thing is you just don't need that constraint of animal protein.
We eat radically more meat than we did, you know, 50 years ago, 100 years ago. | ||
Far more than we need to have a healthy diet. | ||
I mean, I've been vegetarian 11 years. | ||
Do you eat eggs, though? | ||
Free-range eggs? | ||
Yeah, I do. | ||
That's a, for me, I don't understand why people don't. | ||
Like, when PETA had that whole campaign about eggs being chickens' periods, I'm like, look...
I can understand you not wanting to eat factory farm chickens' eggs because these animals are tortured and they're confined and it's horrific, but you can definitely find eggs. | ||
And I have my own chickens. | ||
I have 22 chickens. | ||
And they lay eggs and I eat their eggs all the time and I ate five of them this morning. | ||
They're great. | ||
But when you're talking about those eggs, it's like... | ||
There's no suffering. | ||
The eggs come out. | ||
They don't become a chicken. | ||
You take them. | ||
It's free. | ||
And those chickens, by the way, they're a bunch of little murderers. | ||
They run around my yard. | ||
I've seen them eat a mouse before. | ||
If they found a bird that was down, like a nesting bird that had fallen out of a nest, they'll fuck that bird up. | ||
They eat anything that's on the ground. | ||
The only thing they don't seem to like, they don't seem to like slugs. | ||
Okay, you've tried to feed them slugs. | ||
No, they eat them. | ||
We pick up a rock in my garden. | ||
I'll pick up a rock and the chickens come over and just jack anything that's under the rock. | ||
They figured out that when I lift up the rock, there's bugs under there. | ||
They're little murderers, man. | ||
They're ruthless. | ||
They don't like slugs. | ||
They try them and then they start shaking their head. | ||
They try to get the slime off their beak and they kind of freak out. | ||
So yeah, I mean, within animal welfare activism there's actually quite a big divide between what you could call the abolitionists on one side and the welfarists on the other.
And the abolitionists' view is just, you know, the way we treat animals is like how we treated slaves. | ||
This is just, this is kind of the equivalent of slavery of our time. | ||
And, you know, imagine if we'd been in...
Slave-owning America, and someone said, like, hey, well, why don't we just cut down the number of slaves we hold?
That's just not taking it with enough moral seriousness.
The welfarists, in contrast, are more like, look, if we're going to quantify the suffering that humans inflict on animals now, 99% of it comes from factory farms.
If we could eliminate factory farms, sure, there's still something left.
Even if you agree we're not at the final stage, this is where the vast majority of both the animals used and the worst conditions are.
And so the welfarist would instead say, look, let's really just focus all of our attention on this.
And things like free-range eggs or circuses or fur are just really kind of not the main issue.
And, you know, I'm naturally most sympathetic to the kind of welfarist perspective.
But it is interesting.
I know lots of people who were in the welfarist camp and then moved to the abolitionist camp on welfarist grounds, where the kind of worry is just that if you're just trying to get people to do a little, then you're not actually going to move them at all. | ||
Whereas you need to have this hard moral line, and then people kind of see the integrity of that and follow it. | ||
Well, it seems to me that there was a slippery slope when agriculture and civilization were introduced, that someone was going to exploit it to the nth degree.
And figure, well, there's just got to be a better way to squeeze money out of this situation. | ||
And then next thing you know, you've got these factory pig farms. | ||
I'm sure you've seen the horrific one where they fly the drone over the lakes of pig piss and pig shit. | ||
Absolutely. | ||
And that these animals are living just completely confined where they can't even turn around. | ||
And they're just pumping them up with whatever the fuck they need to keep them alive until they get to a certain point where they can kill them. | ||
Yeah. | ||
And it is true. | ||
So many people would be absolutely... | ||
If that was right there in front of them, they would be sickened. | ||
Yeah, hence the ag-gag laws. | ||
In order to keep that money coming in, they have to keep people in the dark of these situations. | ||
And unless they go online and seek it out and watch these videos, and those videos are very polarizing too, because, you know, when you come to a lot of these animal rights organizations, a lot of them have roots in the Animal Liberation Organization, which doesn't even believe that you should have pets. | ||
They think that your pets are all, you know, prisoners. | ||
Yeah, it's so interesting, going back to Peter Singer, who wrote Animal Liberation, which was, you know, a founding text for what became the animal rights movement.
And what's interesting is that Singer doesn't believe in rights. | ||
He's a consequentialist. | ||
He's a utilitarian. | ||
He never used the word "rights" once.
So his approach would just be thinking, yeah, what's going to do the most good? | ||
And on the pets question... | ||
I don't want to speak for Peter, but he's going to think, well, if they have a good life and they're well treated, it just seems fine. | ||
And again, he'd want to say, like, the focus should be on... | ||
Suffering. | ||
Yeah, on the vast magnitudes of suffering that go on the factory farms. | ||
That's priority one, two, three, and four. | ||
Yeah, I have a hard time even entertaining the conversation that there's something wrong with a healthy pet dog. | ||
Like, that dog loves the owner, the people love the dog, and the dog has obviously gone through an incredible evolutionary process where it's gone from being a wolf to being a chihuahua. | ||
Like, if you think that thing should be out fending for itself in the forest, boy, you're dooming that little fucker to death. | ||
I mean, well, the question with the dog, in all of these cases, is that the animals wouldn't exist otherwise.
They wouldn't exist without people.
I mean, if it wasn't for people breeding them and making them this bulldog, this thing that can hardly even breathe and walks with a waddle... we're weird that we've done that in the first place.
Yeah, I mean, I find, especially the pets, like dogs that have difficulty breathing, genetic diseases, I find it kind of gross. | ||
It is gross. | ||
That we've kind of done that. | ||
I've got one. | ||
What type of dog do you have? | ||
It's a Shiba Inu English Bulldog mix.
Poor little fucker. | ||
He's a mess. | ||
We got him as a puppy, you know, because he was cute and, you know, he just seemed like he needed a home and we took him in. | ||
God, he's all messed up, man. | ||
I mean, I've had him for 10 years. | ||
He's had all these surgeries and can't walk right. | ||
His hips are all fucked up. | ||
It's just, like, they breed them to that point. He's half Shiba Inu, so he's actually better off than a lot of bulldogs.
Because he's 12 now. | ||
I don't think Bulldogs usually live that long. | ||
I don't think they live to that age. | ||
But he's got all sorts of, like, difficulties. | ||
He can't really run. | ||
You know, he's lazy. | ||
He just likes to lay down, snore. | ||
But the poor little things, like if you look at an actual, like, legit English bulldog with their flat faces, like, they have massive respiratory problems. | ||
Yeah, so I find the fact that we, like, engage in this process of breeding them that way kind of weird.
But then, like, yeah, if you're going to have a dog and look after it, well, like... | ||
It's not the problem, right? | ||
It's definitely, yeah. | ||
And so there's this question of just, if you're talking about that, are you just, like, distracting from the main issue, which is... | ||
Right. | ||
Well, it also seems to me that this is just like everything else in life. | ||
Like, as you go down the rabbit hole and you look at it deeper and deeper and deeper, you go, God, this is a complicated issue. | ||
How do you get all these people to stop eating so much meat so that you don't need so much meat, so that you don't need factory farming and have to get people aware of what is the consequences of going and buying a chicken sandwich? | ||
Well, do you know where that chicken came from? | ||
Here, check this out. | ||
Are you happy now? | ||
And a lot of people, they watch those videos and then they go, ah, fuck it. | ||
I'm hungry. | ||
I want a chicken sandwich. | ||
Yeah, lots of people do. | ||
I do think, though, like, so in the UK, at least, if you buy a pack of cigarettes, you get these pictures on them showing kind of what this is what your lungs will look like if you smoke 20 a day and there's warnings and things. | ||
Yeah, that doesn't stop people in some weird way. | ||
I mean, people are addicted to cigarettes. | ||
I think it must have some impact. | ||
I don't think it does. | ||
But I wonder if you could buy a pack of chicken and it would say, well, this is this field of piss and shit that this chicken grew up in. | ||
Right. | ||
Like the opposite of an ag-gag law. | ||
Like force it in your face. | ||
Yeah, because look, I mean, you're just giving the consumer more information. | ||
How can that be bad? | ||
Like if you went to the butcher shop, went to the butcher section of the grocery store, and there was videos that were playing constantly above the packaged meat that showed these animals getting like a piston through the head and hanging by their ankles and getting bled out while they bucked and kicked. | ||
How many people would still buy it? Or if there was a fucking conveyor belt of baby male chicks falling into a grinder?
Yeah, getting ground up.
Yeah, that would be a fascinating psychological experiment, to watch people walk up to that butcher counter and see those videos playing, if that became the law. And I mean, there's an amazing...
There's a comedy show, a sketch show, that did something kind of similar, which was, you know, they would go up to the butcher's counter and say, okay, I'd like some sausages. | ||
They'd go, okay, pick up a little baby pig and put it into this box. | ||
It's obviously fake. | ||
And just, like, do this action and sausages would come out. | ||
Obviously, they're not actually killing a pig. | ||
Right, right, right. | ||
And people would be outraged.
unidentified | Don't do that.
Like... | ||
And it's like, do you not know where pork comes from? | ||
So yeah, that's the thing that's just amazing is how people can call themselves animal lovers. | ||
Well, there's also people that love animals and eat meat. | ||
Like they'll eat steak and then get mad at people for hunting animals. | ||
I've experienced that personally. | ||
This is a good case, though, of the salience issue where... | ||
I mean, so I, like, oppose hunting. | ||
I think it's bad for the animal that gets killed. | ||
But the thing is, it's just so salient compared to factory farming. | ||
And it's like, you know, would I prefer that people hunt meat rather than, like, factory farming? | ||
It's like, of course. | ||
And, like, you do the math, it's like, not only am I behind that, I'm behind that, like, a thousand times. | ||
But again, the hunting is just, it's this very salient thing. | ||
You know, in the UK, it's huge about fox hunting and so on.
That's a different thing, because with fox hunting you're not eating it.
No, I mean, it's supposed to be for, yeah, it's kind of like vermin control.
Yeah, and there's some logic to that, that if you don't have natural predators you need to figure out some way to control certain populations that can be damaging, like fox, or in some places black bear. There's a bunch of different animals that you do have to control because they don't have a natural predator.
Yeah, but the thing that's incredible for me is just how people can have such strong views on that, and such strong views on hunting, and then just no reaction to factory farming.
They just don't see it. | ||
It's just because we are very manipulable as humans in terms of our moral reactions.
That's really worrying, because of how far wrong that can lead us.
Yeah, but there's certain animals that you have to control the populations of, especially invasive species, like pigs. | ||
Like, wild pigs are a huge problem in America in getting bigger and bigger. | ||
I know you guys don't have them as much in the UK, but in America, particularly in Texas, and now in Northern California, there's just massive, massive populations of wild pigs. | ||
And they give birth two to three times a year. | ||
And they can give birth to as many as three to six piglets. | ||
And then six months later, those piglets are ready to give birth. | ||
So they just boom, boom, boom, boom. | ||
And if you don't control their populations, what are you going to do? | ||
Are you going to let wolves loose to control their populations? | ||
I mean, they have to figure out how to do it. | ||
And so they've taken to a lot of the same strategies that they're using in New Zealand that we talked about with stags. | ||
In Texas, they have these helicopter hunts where they fly around in helicopters and just gun down hundreds and hundreds of these pigs. | ||
They do wind up donating the meat of those pigs to homeless shelters and people who need it.
It's actually very nutritious and very healthy and very good for you. | ||
And that's probably way better than buying pig from someone who's raised it in some horrific factory farming environment. | ||
For people that just want the animals to live and be unchallenged and, you know, unpreyed upon, I get it. | ||
It all seems very disturbing. | ||
But you've got to control the populations because you're not going to have any agriculture. | ||
I mean, they're going to find out where the farms are and they tear them apart at night. | ||
They're nocturnal animals. | ||
You can't stop them with fences. | ||
They go right through fences. | ||
They're huge, huge animals. | ||
Wild pigs create millions of dollars of damage in Riverside County. | ||
unidentified | Wow!
Yeah, I didn't know about this at all. | ||
Riverside County is super populated, but this is an enormous, enormous problem in this country. | ||
And by the way, when you look at that animal, what's really cool about pigs is that they morph. | ||
When you see that animal, it looks very different than a domestic pig, but it's the exact same animal. | ||
They're all the same species.
It's called Sus scrofa.
And when you take a domestic pig and you let it go, within months, within months of being free, their hair starts to change, their snout starts to elongate, their tusks start to grow longer. | ||
Once they become feral, once they realize they have to fend for themselves, there's an actual physiological change in the structure of their body. | ||
So interesting. | ||
It's fascinating. | ||
It's really crazy. | ||
Their hair gets thicker. | ||
They develop a thicker plate, the males do, around the chest to protect themselves from other males when they fight. | ||
It's bizarre. | ||
So those wild pigs that people see, there's a bunch of different kinds. | ||
Some of them are Russian boars. | ||
They're wild, you know, different kind of pig. | ||
But ultimately, they all interbreed with each other. | ||
Yeah, yeah. | ||
This is so interesting as well, coming back to the question of what's natural and not and so on. | ||
And people often think this about meat they're eating as well, where if you look at the, you know, chickens can barely stand because they've been so engineered to have these huge breasts. | ||
The pigs that you're talking about aren't meant to be pink, they're meant to be brown.
Cows, can you really imagine a cow evolving in the wild? | ||
Of course not. | ||
All of these things are incredibly unnatural through thousands of years of selective breeding. | ||
Well, cows don't live in the wild, but here's where it gets interesting. | ||
In Australia, when cows have gotten wild, they've gotten loose from these pens that people held them in, and then they become what they call scrub bulls, and they're out there in the wild, and people hunt them like they would hunt a wild animal, and they're very wary, and they run from people, they see people, they get the fuck out of there, and the bulls are incredibly violent. | ||
Like, the male cows, these scrub bulls, are some of the most dangerous things to hunt in the world. | ||
Because they'll actively chase you down. | ||
Like a bull. | ||
Like, you know, if you see, like, people trying to ride bulls, how bulls kick and, you know, they go crazy. | ||
Well, these scrub bulls are essentially those bulls, but many, many, many generations wild. | ||
So they're feral bulls. | ||
unidentified | Man.
Yeah, so they sort of were bred to be this domestic thing, and then they got loose, and then they became this wild thing. | ||
And so they look slightly different, like that's what they look like. | ||
That's a scrub bull. | ||
So they're becoming slowly, over the course of many generations, a more wild animal. | ||
So they have these hunts for these scrub bulls. | ||
And if that thing sees you, by the way, that crazy looking bull, they will fuck you up. | ||
They're some of the most dangerous animals that you can encounter in the wild, apparently. | ||
But I have a buddy, my friend Adam. | ||
Yeah, they look different. | ||
Like, look at the hump on its back. | ||
I mean, that looks like some crazy wild African animal. | ||
And it was originally, a long time ago, a regular domestic cow. | ||
But yeah, so it shows just how artificial they are.
If those are the sorts of changes you get over just the course of a few generations.
Natural selection, as opposed to what we're doing with dogs, you know, when we create a bulldog. | ||
I mean, that is, those are the animals that have survived, and they've changed their coloration, their physical structure looks different, you know, over many, many generations. | ||
It's quite, quite fascinating. | ||
It's like we have to figure out where we stand, I think, in terms of the entire ecosystem, because we're certainly not viable. | ||
We can't go out there and live amongst those animals. | ||
I mean, we won't. | ||
We'll get killed. | ||
We'll get eaten. | ||
So we have to stay inside of our homes. | ||
We have to stay inside of our environments. | ||
And then we have to figure out, like, how much of an impact should we have on those things around us? | ||
Should we be like all the other animals, like the wolves and all these other animals, the coyotes that have this impact on the environment? | ||
Or should we try to lessen our footprint to the point where we have zero impact on any of the animals and we just live inside of these sustained areas that grow vegetation? | ||
It's an interesting question because those animals prey on each other. | ||
They all do. | ||
And should we be a part of that? | ||
Should we take part in that? | ||
I definitely don't think we should factory farm. | ||
I definitely think that that was a huge mistake. | ||
And I also definitely think that that huge mistake is what led us to be able to have these gigantic cities. | ||
And I don't think necessarily cities are a huge mistake, but man, trying to figure out how to feed those people... | ||
In the way that they're accustomed to eating right now, that's a massive battle. | ||
Yeah, but I think the question of large populations and how you feed them, that massively tells in favor of lower meat consumption or vegetarianism.
Sure. | ||
Because you've got this 10 to 1 rule where to create a calorie of meat, you need 10 calories or more of grain or soy or whatever you're feeding them. | ||
Unless you're dealing with people just consuming wild pigs. | ||
Yeah, wild pigs or kangaroos or something. | ||
There are exceptions to that as well. | ||
But I don't think there's enough to feed people, though. | ||
That's the other thing. | ||
There's 350 million people in this country. | ||
There's not 350 million wild pigs. | ||
Yeah, exactly. | ||
But it means that in the future, just as populations get larger, then... | ||
Yeah, again, we're just going to need to use land and energy more efficiently. | ||
So this is yet another argument in favor of plant-based diets. | ||
Yeah. | ||
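A minimal sketch of the 10-to-1 feed-conversion point made just above; the 10:1 ratio is the rough figure quoted in the conversation, and the harvest and daily-calorie numbers are arbitrary illustrative assumptions.

```python
# Routing plant calories through meat at a rough 10:1 conversion ratio.
FEED_CALORIES_PER_MEAT_CALORIE = 10  # rough rule of thumb quoted above

def person_days_fed(plant_calories: float,
                    calories_per_person_per_day: float = 2000,
                    via_meat: bool = False) -> float:
    """Person-days of food supported by a given amount of plant calories."""
    if via_meat:
        plant_calories /= FEED_CALORIES_PER_MEAT_CALORIE
    return plant_calories / calories_per_person_per_day

harvest_calories = 20_000_000  # arbitrary illustrative harvest
print(person_days_fed(harvest_calories))                 # eaten directly: 10000.0
print(person_days_fed(harvest_calories, via_meat=True))  # via meat: 1000.0
```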
Well, in America, at least, the majority, the vast majority of the money that goes towards conservation, towards keeping wild animal populations high, is actually from hunting. | ||
unidentified | Hmm.
It's a real strange contradiction that makes people really uncomfortable once they find it out, is that the vast majority of the money that goes to protect habitat, to preserve wild lands, it comes from hunting. | ||
In fact, hunters voluntarily agreed, I believe it was in the 1930s, to give up 10%... I want to make sure that number's right, too.
But that money, a percentage of the sales of hunting equipment, goes directly towards conservation.
Interesting. | ||
Yeah. | ||
There's all these different entities, like the Rocky Mountain Elk Federation, that have repopulated elk into all these areas, but done so specifically so that people can hunt them. | ||
So it gets really weird. | ||
Yeah, yeah. | ||
It might be an uneasy alliance between... | ||
In many people's eyes, but they're the ones that are giving up the money. | ||
The money is not coming from altruistic organizations that just want to preserve these animals so that they can exist in the free, wild way that they did before people got here. | ||
But there's more white-tailed deer in America today than when Columbus landed. | ||
And that's all because of conservation, because of hunting. | ||
So it's another one of those weird things where when you look at the whole picture... | ||
Yeah, there's a solution that I've heard suggested for reducing species loss, which is to allow basically ownership of species. | ||
So you could copyright the panda. | ||
Oh, yeah. | ||
Now, isn't this a weird idea? | ||
But the idea is that now, suddenly, like at the moment, no one has a financial incentive to ensure that pandas don't go extinct. | ||
Whereas if someone were able to say, no, if you want to use a panda in video or so on, you have to pay the owner. | ||
Yeah. | ||
Well, I'm very uncomfortable with the idea of this laboratory-created meat. | ||
As to where that's going to go. | ||
Why are you uncomfortable? | ||
I'm very, well, positive about it. | ||
I mean, the science is kind of tricky, but... | ||
I'm positive about it in that it's not going to be any animal suffering. | ||
It's going to be fascinating in that regard. | ||
But what's going to happen if we find out that... | ||
Well, there's a bunch of different things, right? | ||
First of all, we have to make sure it's healthy. | ||
We have to make sure that it doesn't cause some sort of a weird disease because you're not eating something that's living and moving and when you eat sedentary creatures maybe there's some sort of an adverse impact on our biology because I think there's an adverse impact when you eat protein from an animal that is like weak and sick and they've actually shown There was a study that Dr. Rhonda Patrick sent me recently that showed that animals that eat older, | ||
sick animals die quicker. | ||
They have a shorter lifespan and exhibit less health characteristics, I believe it was, than animals that ate younger animals. | ||
And there seems to be some sort of a direct correlation between eating younger healthy things and having a positive healthy impact on physical life itself, the animal that's consuming it. | ||
And that if you're eating something that never existed in the first place, like... | ||
Unless they're able to recreate the characteristics of a healthy animal, like a strong muscle tissue, like maybe they could do that with electrical impulses, like some sort of electrical muscular stimulation. | ||
Yeah, I don't see why that would be a problem. | ||
I mean, at least you'd think you'd be able to get past that, in fact, where, you know, the meat that we've currently got, stuffed full of antibiotics, you know, there's often viruses that... | ||
Sure. | ||
Yeah, viruses that kind of arise. | ||
Swine flu, avian flu. | ||
Avian flu, exactly. | ||
You could avoid all of them. | ||
All that stuff comes from factory farming. | ||
And in fact, yeah. | ||
And in fact, then, once you start to engineer meat, perhaps you could engineer exactly the healthiest sort of meat. | ||
You've got so much more control over the product. | ||
That would be crazy. | ||
And so, of course, you've got to, yeah, with the development of any new technology, you've got to be cautious about it. | ||
But ultimately... | ||
It seems like we should be able to get to the point where we have tastier, cheaper, more healthy meat that has far less carbon dioxide as a side effect, uses far less land area. | ||
It's going to be better in every single way.
I think the science, it does seem hard, in particular, just to get the costs down low enough. | ||
Well, I think they've got it down pretty low. | ||
I mean, there was a recent article about it where they were talking about the original one was worth hundreds of thousands of dollars, and now they've got it down to like 20 bucks. | ||
Yeah, I think that was misleading, actually. | ||
unidentified | Was it?
Yeah, it's a shame. | ||
Like, it seemed to me that there's been a little bit too much hype around in vitro meat, where, yeah, there's some stories of, like, the costs radically going down.
unidentified | Right.
Whereas really, the cost is definitely still much higher than that.
I think it's definitely still decades away. | ||
Decades? | ||
Yeah, I think so. | ||
Really? | ||
Yeah. | ||
What makes you think decades? | ||
So the argument is that currently the... | ||
So it depends on what you're talking about. | ||
Like egg white, I think, is pretty easy comparatively. | ||
Milk is comparatively easy, but structured meat. | ||
So, you know, steak or chicken, it has a structure. | ||
That's, I think, very difficult. | ||
And I think... | ||
Apparently part of what the difficulty is, there's a certain solution that you need to grow this meat in, and that solution is currently very expensive. | ||
And the key part of the cost, even once we get to the point of being able to develop this, is getting the cost down low enough so it's competitive. You're going to need to take this fluid that currently costs, I don't know how much, but like a thousand dollars a litre, and get it down to the cost of soda.
And we don't currently, it seems, have like a clear kind of scientific path towards that. | ||
It would be the ultimate conundrum. | ||
If they found out that the only way to make that fluid and to make it financially viable was to make it out of ground-up pets that get killed anyway, euthanized pets. | ||
Like, would people be upset if they took euthanized pets and they used it to make the fluid to grow the artificial meat in? | ||
Or would they prefer those euthanized pets just be cremated? | ||
So, at the moment, that fluid does have to come from animals. | ||
There's a certain part of it that is animal-based. | ||
I was just guessing. | ||
So, it's not exactly ground up pets. | ||
Puppy brains. | ||
Only puppy brains. | ||
As I understand, it's still currently not vegan. | ||
But it's interesting. | ||
I think it's going to change. | ||
I mean, I do think given the level of just moral cognitive dissonance that's currently going on between people's attitudes to animals, pets, any animal they can see, and consumption of meat, once you take self-interest out of the equation, once you've got meat that is cheaper and just as tasty, I think just everyone's going to switch. | ||
And then within a generation, people will look back at the current generation and just think, how did anyone ever engage in such... | ||
abominable activity as factory-farmed meat.
Yeah, well, it's probably one of the darkest things that we as a civilized humanity do. | ||
When you think about, other than war, which is obviously the most horrific thing, or one of the most horrific things, I mean, it's arguable that in terms of suffering, it's the next thing. | ||
Because, I mean, it has to be, it is the next thing, right? | ||
Other than poisoning people for profit, you know, other than companies that have polluted environments that have wound up poisoning people. | ||
Yeah, so in terms of animals, so 50 billion animals are killed every year for human consumption. | ||
Worldwide. | ||
Worldwide. | ||
Most of them have kind of short lives, so... | ||
No, broiler hens have six-week lives. | ||
That's crazy. | ||
Six weeks. | ||
So from the time they're incubated to the time they're in an oven. | ||
Six weeks. | ||
That's nuts. | ||
That's the point at which they die. | ||
That's the highlight of their life, in my view. | ||
Because their life is just filled with suffering. | ||
So that means at any one time, there's seven billion animals in factory farms right now. | ||
Living, basically being tortured for the entirety of their short lives. | ||
So the entire population of the human race. | ||
It's basically one-to-one, yeah. | ||
At any one time. | ||
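A back-of-the-envelope check of the figures quoted here, assuming roughly 50 billion animals slaughtered per year and treating the six-week broiler lifespan as a rough average across farmed animals (an assumption; many farmed animals live longer, which would push the number up).

```python
# Standing population = annual slaughter x (average lifespan / one year).
slaughtered_per_year = 50e9        # figure quoted in the conversation
average_lifespan_weeks = 6         # assumption: dominated by broiler chickens

standing_population = slaughtered_per_year * (average_lifespan_weeks / 52)
print(f"{standing_population:.2e}")  # ~5.8e9, the same order as the
                                     # "seven billion at any one time" quoted
```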
Isn't it nuts that that's less than 100 years old? | ||
Yeah, much less than that. | ||
Less than 50 years old, really. | ||
Who's the first crazy asshole that jammed those chickens into those little cages? | ||
Henry Field, I think? | ||
That's the guy? | ||
Is that his name? | ||
So he... it's fascinating, the rise of the free marketeers.
So back in the 50s, I'm going to go on a digression, but it's not as bad as Infinity, I promise you. | ||
That's all right. | ||
They're all awesome. | ||
So back in the 50s, free market economics was just completely dead. | ||
It was just not a mainstream idea at all within academic economics. | ||
But it really rose to prominence across the end of the 60s, certainly the 70s, and then Thatcher and Reagan getting in power. | ||
Huge uptake in this intellectual movement. | ||
And so the question is kind of where did it come from? | ||
And it was actually very significantly driven by a small number of people in the 50s and early 60s, like very deliberately saying, okay, we want this ideology to become really dominant. | ||
And one of the most important first organizations was the Institute for Economic Affairs based in London, a think tank. | ||
And it was funded by the person who brought factory farmed chicken to Britain. | ||
So it's weird because I promote this idea of earning to give as something that young people should consider. | ||
Not as the only thing they should do, but as one of many things they should do, or should consider doing. | ||
If you want to do good, you could go and, like, directly have an impact. | ||
But there is also another option, which is doing something you're really passionate about that maybe has less of a direct impact, earn a lot, and donate what you're making, at least a significant part of that, to the things you think are most important. | ||
And then I think of this, I think it's Henry Fisher, sorry. | ||
This is like the most perverse instantiation of that, where the guy went and became a factory farming entrepreneur, basically on earning-to-give grounds.
Jesus Christ. | ||
Isn't that just so indicative of how humans are so contradictory? | ||
We're so complex and so strange. | ||
Mm-hmm. | ||
In that we will find all these justifications for all these bizarre behaviors, and then we're never, like, totally pure. | ||
Like, there's so many people that are so... | ||
This is the terrible example, but it's the one I use all the time. | ||
Bill Cosby made so many people laugh, and he raped about a hundred, or whatever, allegedly. | ||
You know? | ||
Like, he was... | ||
Helping and putting out so much love to so many people, and then being fucking evil to a bunch of people that he drugged. It's like, yeah, this exists, this duality.
Yeah, think about Nazi Germany. You think about the number of people who were involved in the Holocaust who loved their children, and if you talked to them, you would have had, like, a great conversation.
They would have been very caring and so on. This is, I mean, yeah, it's a very powerful idea, the banality of evil, Hannah Arendt's phrase, where, yeah, the worst crimes committed are not because people are bad.
It's because... | ||
Not bad or evil in the way that you think James Bond villain, like this person's plotting something. | ||
It's just because they have some goal that is... | ||
Some goal on which they are indifferent to suffering, and they cause that as a side effect. | ||
And so it's the same. | ||
If you ask people, do you want animals to suffer horrifically in factory farms?
They'll say, no, of course not. | ||
It's just that I don't care. | ||
Casualties of war. | ||
Exactly. | ||
And casualties of civilization. | ||
Yeah. | ||
And the same insight, actually, when we talk about AI as well, is... | ||
You know, sometimes in the media, people say, oh, the worry about AI is a Terminator that's going to want to kill humans. | ||
But that's not the worry at all, the idea. | ||
Or when you think about Homo sapiens and Neanderthals, again, it's just having some other entity that has goals on which you're just not very important. | ||
Right. | ||
And that means that, yeah. | ||
Well, and they're also going to judge us. | ||
I mean, if they are intelligent and they are superior to us, they're going to judge us based on the entire whole of our behavior. | ||
And they're going to go, look at this messy species. | ||
This fucking species is crazy. | ||
Elon Musk has my most terrifying quote. | ||
His quote is the most terrifying to me, that he thinks that with AI we are summoning the demon. | ||
Summoning the demon, yeah. | ||
I love that quote. | ||
It's just like... | ||
I mean, I worry... | ||
Yeah. | ||
Like, I think a lot of the media attention around AI is, like, has been really unfortunate because it suggests, like, it's coming next year and it's going to control its... | ||
Like, the demon, I think, anthropomorphizes it more than is necessary and so on. | ||
Sort of. | ||
I think... | ||
But if its ultimate goal is the extinction of the human race.
That's very demonic as far as we're concerned.
Yeah. | ||
I mean, it's more... | ||
Indifferent? | ||
Yeah, indifferent. | ||
Sort of the way we think about mollusks? | ||
Yeah. | ||
Yeah, exactly. | ||
Or the way we think of, like... | ||
You know, mosquitoes. | ||
Yeah. | ||
Mosquitoes are my favorite because vegans will slap mosquitoes. | ||
Oh, yeah. | ||
I mean, are mosquitoes sentient or not? | ||
They're alive. | ||
Yeah, but I think clams and mollusks aren't sentient. | ||
Then insects, I'm like... | ||
Well, there's some weird arguments about that then. | ||
I mean, why not eat crickets? | ||
Because cricket protein is excellent. | ||
I've had cricket bars before. | ||
They're covered in chocolate. | ||
They taste really good. | ||
They're high protein. | ||
Yeah. | ||
I mean, many... | ||
I do know many people who do advocate for that. | ||
My view is just like, if you're unsure, then play safe. | ||
And you'd be eating a lot of crickets. | ||
Yeah, but there's a lot of crickets out there to eat. | ||
Well, if you're growing them, like, you'll be hunting crickets with a tiny little spear. | ||
I don't think that's how you do it. | ||
You're a lot more brutal than that. | ||
I mean, I think factory farming for crickets would be a horrific institution. | ||
You know, and you just, what would you do? | ||
Just fucking swarms of them and smash them down to a protein bar. | ||
What I worry about is... what's the current figure for the percentage of species that have ever existed that are now extinct?
It's fucking huge. | ||
It's like 99.99% or something. | ||
Why not us? | ||
Why not us? | ||
And if we do give birth to artificial intelligence, if we are the caterpillar that gives birth to the butterfly that winds up taking over the world, some artificial butterfly. | ||
Yeah, I mean, I think the thing that worries me is that, you know, it's...
AI is kind of its own thing. | ||
And I think, you know, we do... | ||
Because it's, like, potentially extremely beneficial as well. | ||
Right. | ||
Even if supposing it goes well, then it's a huge thing. | ||
We should care about it whether or not we're worried about the extinction risk. | ||
One of the rare cases, I think, where we can really see into the future and think, yes, this is going to be a transformative technology. | ||
We don't know when it's going to happen, but when it does, it's going to be transformative. | ||
And it's going to be very powerful. | ||
And that means we should have some kind of careful thought about it. | ||
But it seems to me there's a variety of ways that the human race could kill itself now. | ||
So novel pathogens being one example. | ||
Large Hadron Collider. | ||
I mean, so my colleague, Toby, actually wrote a paper on the Large Hadron Collider because there was all this, you know, talk about, oh, we could create black holes and so on. | ||
And so he wrote an academic paper where he just talked about the risk analysis that they did. | ||
And they said, oh, the chance of the Large Hadron Collider creating a black hole or something else that was, like, really dangerous is 10 to the power negative 63.
Yeah.
You know what that's not? | ||
Go on. | ||
Zero. | ||
It's not zero. | ||
Firstly, it's not zero. | ||
Those motherfuckers. | ||
The odds were really long. | ||
We didn't know. | ||
But the second thing also is that you shouldn't think that anything's 10 to the negative 63, really, unless you have very, very strong models behind it. | ||
Because what's the chance that you just made some mistake in your calculation? | ||
It's like, you know, maybe it's as low as one in a million. | ||
But that mistake completely swamps the probability. | ||
And so that was the point that he was making. | ||
Just statistical point saying, look, I'm not commenting on whether this is dangerous or not. | ||
It's just that you've made a mistake in your methodology. | ||
With respect to your risk assessment. | ||
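A minimal sketch of the statistical point being made here: a headline probability like 10^-63 only holds if the calculation behind it is right, and even a tiny chance that the model is mistaken swamps it. The one-in-a-million and conditional-risk figures below are illustrative assumptions, not numbers from the paper being discussed.

```python
# Total risk once you account for the chance the risk model itself is wrong.
p_disaster_if_model_correct = 1e-63   # the published headline figure
p_model_is_wrong = 1e-6               # assumed chance of a flawed calculation
p_disaster_if_model_wrong = 1e-3      # assumed risk if the model is flawed

p_disaster = ((1 - p_model_is_wrong) * p_disaster_if_model_correct
              + p_model_is_wrong * p_disaster_if_model_wrong)
print(p_disaster)  # ~1e-09: the "mistake" term completely swamps the 1e-63 term
```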
And so it was really funny, because then he was there, this very calm, sensible, you know, philosopher from Oxford, in a press meeting about the Large Hadron Collider, surrounded by all the, you know, aluminum, like tinfoil...
Aluminum foil, tinfoil hat people? | ||
Yeah, aluminum foil. | ||
Aluminium. | ||
I love the way you guys say aluminum. | ||
I know. | ||
Apparently, I was so annoyed when I found this, but apparently your way is correct. | ||
Of course it is. | ||
We're American. | ||
How dare you? | ||
Tire with an I, really. | ||
And then you start saying things like foyer, click. | ||
Oh, there's a nice click of people in the niche in the foyer. | ||
Oh, niche. | ||
Niche, clique, foyer. | ||
Foyer. | ||
We say foyer, though. | ||
Do people say foyer? | ||
I've definitely heard foyer. | ||
They're Walmart people. | ||
Those are white trash. | ||
Where are you from, originally? | ||
Where am I from? | ||
New Jersey is where I was born. | ||
Interesting, because I thought it was a New York thing, but... | ||
Maybe. | ||
Maybe not. | ||
I didn't really grow up there. | ||
I grew up all over the place. | ||
Boston, mostly. | ||
Yeah, but no, you do so many things wrong.
It's very disgusting. | ||
Oh, how dare you. | ||
It's very disgusting to me. | ||
I'll tell you what we don't do. | ||
We don't do queens. | ||
You guys still have queens. | ||
I love the queen. | ||
Get out of here with that shit. | ||
She's still going. | ||
Ridiculous. | ||
It's so funny. | ||
I feel like every time I go from... | ||
No, I think the queen is hilarious. | ||
unidentified | It's ridiculous.
That's her job. | ||
Her job is to wave in this very particular way. | ||
She doesn't even really wave. | ||
She just kind of rocks her hand back and forth. | ||
Some sort of weird semi-Vulcan stance. | ||
It's kind of funny, yeah, talking to especially some of the kind of progressive friends I have in America, and they're like, you've got a monarchy, isn't this? | ||
Like, isn't everyone talking about it? | ||
But you guys think it's quaint. | ||
Yeah, we're just like, no one really thinks about it. | ||
Well, she doesn't really have power, right? | ||
But she still lives in a fucking castle. | ||
She lives in a castle. | ||
She lives off the dime. | ||
But if you do an economic analysis, she brings in more money than she... | ||
Um, sort of, but she takes a lot. | ||
She's sort of the anti-Will MacAskill, if you ask me.
Yeah, that's right. | ||
I mean, the charities they support. | ||
She just gets all this free money, and that bitch just wears gold and shit and drives around in a limo. | ||
It's kind of ridiculous. | ||
So you could, yeah, you could definitely get the same tourism benefits. | ||
People are mad that I say bitch. | ||
I'm not really meaning the word bitch. | ||
It's like, all due respect, folks. | ||
It's just a figure of speech. | ||
It's a humorous figure of speech. | ||
Okay, well, I appreciate the... | ||
Yeah, I don't want to disparage your ruler. | ||
Yeah, I appreciate the caveat. | ||
Such a strange ruler, though. | ||
Kings and queens and Prince Charles. | ||
It's a really funny part of British culture. | ||
It's so funny because I spend a lot of time in California, but every time I come back, it seems to be on some major event to do with royalty. | ||
So one was the queen's birthday, one was the queen becoming the longest-ever reigning monarch, one was her jubilee.
We don't know about that at all. | ||
We don't know about any of those things. | ||
For you, it's these massive events. | ||
Well, it just means I come off the plane, been in America for a while, and there's just pictures of the Queen everywhere. | ||
Ah, I see. | ||
Okay, yeah, I'm definitely back in Blighty now. | ||
Now, what's going on now? | ||
Anything crazy? | ||
What's happening now? | ||
In the UK, yeah. | ||
Huge news today. | ||
What? | ||
Well, for me, as a Scot, Nicola Sturgeon, the First Minister, so like the leader of Scotland, I kind of think of Scotland to the UK as like state to federal, but it's a little bit different. | ||
Announced there's going to be a second... she's planning a second Scottish independence referendum.
So, because Britain is taking itself out of the European Union, where they expect that announcement to be made Tuesday, or end of the month, but very shortly.
Scotland did not want to leave the EU, voted overwhelmingly in favour of remaining. | ||
So Scotland in general tends to lean a lot further left than the rest of the UK. And previously had an independence referendum, it was very close actually. | ||
52% were in favour of staying part of the union, so they stayed part of the union. | ||
There's now going to be a second referendum, at least this is what Nicola Sturgeon is saying. | ||
And because of the Brexit vote, I think it's much more likely that Scotland will say, yes, we're going to leave, and then they remain part of the European Union, whereas the rest of Britain will leave. | ||
And it's interesting for me because I was very kind of pro the Union against independence in the previous referendum. | ||
Now I'm not sure. | ||
I think I probably am in fact, because I think that Brexit was just such a bad decision that I kind of want them to be punished for it. | ||
And well, I think there's two things. | ||
One is that... | ||
I think that now the case for Scotland being part of the EU but not part of Britain, the economic case makes a bit more sense now than it did in the past. | ||
But then secondly, I would worry: Britain leaves the EU, and does that spark a much larger movement where the EU as a project just breaks down?
And if it's the case, like, well, UK leaves the EU, but as a result, the country just falls apart. | ||
So you wanted that to happen? | ||
You wanted England to fall apart, to be punished for leaving the EU? I mean, I think it would be like a very major signal. | ||
But what if they prospered? | ||
And they were correct. | ||
Oh, yeah. | ||
I mean, then if I was convinced that the Brexit was the right decision, it was actually best for the world, then I would change my mind. | ||
I don't know enough about it, but I do have a friend who's very knowledgeable, and he's from England, and his take on it was the real issue with the EU is that you're dealing with a bunch of people that aren't even elected. | ||
They're just sort of running the European Union. | ||
And he's like, we don't have to tell you when you just look at history what happens when people have a great amount of power and aren't even elected to their position. | ||
And you're allowed to just go to any part of the European Union and move into it. | ||
He's like, that was very detrimental and very bad in terms of the way England's financial structure was set up.
They were like, this would be detrimental to England, but beneficial to other places. | ||
And the idea was that we were supposed to accept the fact that it would be detrimental to England and beneficial to other countries. | ||
And many people in England did not want to do that. | ||
And in making that decision, they were thought to be xenophobic, they were thought to be nationalistic, and that it was racist. | ||
So I think there's, yeah, two things. | ||
I mean, one thing is, yeah, I don't like... | ||
Yeah, I mean, so there's kind of two things. | ||
One, with respect to the kind of sovereignty question. | ||
I mean, like, European Union, like, it has its own parliament and so on. | ||
You can vote on that. | ||
You each get a number of... | ||
And the reason, insofar as it's undemocratic, it's mainly just because people don't care. | ||
They don't care whether or not it's democratic? | ||
As in voters. | ||
So turnout to elections for members of the European Parliament. | ||
The turnout is very low. | ||
I think it's like 30% sort of thing. | ||
Maybe it'll be larger now that they realize the consequences of it. | ||
Well, I mean, there's not going to be any more because it's going to leave. | ||
It's going to implode? | ||
Well, Britain's leaving. | ||
Britain's leaving. | ||
So you're no longer voting for members of European Parliament. | ||
So that's one question. | ||
And then, like, is this good or bad for Britain? | ||
I think, like, the economic case is just incredibly strong for Europe being kind of good for Britain. | ||
The reason being just, like, free trade in general benefits both parties. | ||
You want to really maximize the amount of free trade. | ||
But then the bigger thing for me is just like with respect to unity between countries is like the tail risk, risk of war, which we don't really think about because we haven't had a world war since, you know, the early mid 20th century. | ||
But Europe had had, like, a long period of comparative peacefulness, like before the First World War, people thought, no, it's unthinkable, given the level of interconnectedness between the countries that a world war could break out, and then two did. | ||
Right. | ||
And so, and I think those sorts of things would be, you know, that's the tail outcome, but can be very bad indeed. | ||
And we don't often think about it because it's just this occasional thing. | ||
And so that's why, in general, I'm just almost always more pro-closer relations between countries. | ||
That makes sense to me. | ||
What he said makes sense to me as well, though, when he was saying essentially it was like, think of the United States, but now think of each state being a country. | ||
You're allowed to elect a leader of that country, but you can't elect a leader for the United States. | ||
And so that's essentially how he was looking at the European Union. | ||
He was saying the European Union is, they're not elected, and yet they're controlling all these other elected officials and elected states, all grouped together. | ||
Instead of thinking them as like Germany and thinking it was England, think of them as states. | ||
And think of the European Union and the officials, the people that are in control of the European Union aren't even elected. | ||
Yeah. | ||
So, I mean, you do elect the parliament. | ||
And then it's also the case that the analogy, like the amount of power that Europe has over the remaining, the other countries is like, you know, nothing like the amount of power the federal government has over the states. | ||
You know, of the powers the EU has, one of the things that got lots of attention was bendy bananas.
This became like a real focus for people's ire.
Bendy? | ||
What does that mean? | ||
So according to EU regulations, so EU has a single market, so that means you have just the same standards across all countries. | ||
But then that means you just start to have these standards on for things, like bananas. | ||
And so there was one EU regulation which was that a banana couldn't be too bendy, otherwise it would count as a defective banana. | ||
And so people were like up in outrage about this, like how can the EU dictate to us the shape of our bananas? | ||
But I think the case is like a good one. | ||
It's like, it's really not that important. | ||
It's just a banana. | ||
Why do they even try to regulate it then? | ||
Well, it's because if you want to have like a free, like single market, you need to have common standards across. | ||
But doesn't the market dictate those standards where like the bendy bananas don't sell and then the straighter ones do? | ||
Yeah, I mean, I don't know more of the detail about the bananas. | ||
It seems to me like any time the government steps in on something as fucking ridiculous as the bend in the shape of a banana, they'll be like, hey, fuckface, why don't you go take care of poverty? | ||
You know? | ||
Why don't you handle something real instead of dealing with bendy bananas? | ||
Look, so on the bendy bananas case, yeah, I can't, off the top of my head, think of why you'd want to not allow the sale of overly bendy bananas.
But that's what people worry about when they worry about bureaucracy, when they worry about too much control. | ||
Yeah, so... | ||
That's a great example, in fact, of why people don't want micromanaging of our culture. | ||
Yeah, but then the question is, do we want to leave the EU over bananas? | ||
Well, there's a lot of other factors. | ||
It's not the bananas that caused it, right? | ||
But the thing is, as part of the European Union, the UK has sovereignty over its income taxes and all of its laws, as long as they don't conflict with the UN Declaration of Human Rights, which was first drawn up by the UK anyway; it has control over all of its internal legislation. | ||
It can go to war if it wants, and it did. | ||
So the loss of sovereignty seems pretty mild from my perspective, and I feel like they focus on these examples, which, okay, let's say they're a cost. | ||
Maybe it would be better if Britain could make its own decisions about bananas. | ||
Maybe the banana rule was a bad call. | ||
Well, it definitely doesn't seem like a universal reaction. | ||
I mean, there's a large percentage of the people in England that are very upset about Brexit. | ||
It's a really interesting sort of a divide between people. | ||
Yeah. | ||
I mean, the thing that I find fascinating, and I think this about elections in general because I studied a bunch of voting theory while doing my PhD, is that we make these momentous decisions as a country by getting everyone in the population to go to a specific place and then extracting the smallest possible amount of information from them, which is just a single tick, like yes or no. | ||
Whereas there's so much more you could be doing. | ||
Right. | ||
In the case of a referendum, rather than holding it on one particular date, where the turnout is affected by things like the weather and by what happened in the week before, you could instead have three referenda. | ||
And given the momentousness of the decision, spending more money on actually getting the accurate views of the people is super important. | ||
So instead, yeah, you have three over a period of six months and take the best out of three, basically. | ||
That would be like a more accurate representation of what people think over time. | ||
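To make the "best of three" idea concrete, here is a minimal sketch of how such an aggregation might work. None of this is from the conversation itself: the turnout percentages and the "leave"/"remain" labels are hypothetical, and it simply takes the majority outcome across three polls so that one unrepresentative turnout day cannot decide the question on its own.

```python
# Hypothetical sketch of a "best of three" referendum count.
# The percentages below are made up for illustration only.
referendum_results = [
    {"leave": 0.52, "remain": 0.48},  # e.g. a low-turnout day in bad weather
    {"leave": 0.49, "remain": 0.51},
    {"leave": 0.47, "remain": 0.53},
]

# Tally how many of the three rounds each option wins.
wins = {"leave": 0, "remain": 0}
for result in referendum_results:
    winner = max(result, key=result.get)
    wins[winner] += 1

overall = max(wins, key=wins.get)
print(f"Best-of-three outcome: {overall}")  # "remain", winning 2 of 3 rounds
```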
Sure, but isn't there also a gigantic issue with people not being informed about what they're voting on? | ||
You don't have to be informed about what you're voting on, and you certainly don't have to be accurate. | ||
You could easily be misled. | ||
And the actual hard, provable facts could be completely outside of your grasp, and yet you still make a big decision. | ||
Yeah, I wondered before about having a... | ||
A test? | ||
A test, yeah, you go. | ||
But really, really basic. | ||
I think it would still... | ||
There's this question of just, why do we care about democracy? | ||
What's the point? | ||
Who questions that? | ||
It seems like a really important thing. | ||
Oh yeah, political philosophers talk about this all the time. | ||
So they kind of agree, like... | ||
Democracy seems good. | ||
Other forms of government that we know so far seem terrible or worse. | ||
But why? | ||
Why is democracy good? | ||
Is it just that democracy gives us this way to boot out dictators and the risk of a single person taking power is just really, really bad and so we just need some mechanism to get rid of that? | ||
Is it that it's intrinsically valuable? | ||
Is it that people just have a right to have equal representation and that's just this fundamental thing? | ||
Or is it justified just in terms of the consequences? | ||
Is it because if everybody's able to contribute, then people will make better decisions? | ||
I don't necessarily think it's an either-or. | ||
I think there's also that people like to feel like they play a part. | ||
Like they don't want to feel like they're being ruled over by some monarch. | ||
They want to feel like they have some sort of say in the decision-making. | ||
It's also one of the gross things about Trump winning in this country, how many people gloated, you know, how many people gloat upon victory that their side won, and then you're dealing with this whole team mentality that people adopt when it comes to any sort of issue. | ||
Yeah, well, I mean, this is... | ||
Including Brexit, right? | ||
Yeah, no, in general, one of the things I'm really worried about is increasing levels of partisanship. | ||
This is just this really robust phenomenon that we're seeing. | ||
And it's really worrying because it means that we're just undermining any chance of people changing their minds. | ||
Like, Trump won. | ||
Like, people say, well, of course, it's Comey, etc. | ||
But, like, the vast majority of Trump's votes, and similarly Hillary's votes, were from people who just always vote Republican or always vote Democrat. | ||
Well, not necessarily, because Trump won by so many votes that a good percentage of them had to have voted for Obama, just statistically. | ||
Oh, but I'm still thinking, of Trump's votes, what proportion of people have only ever voted Republican? | ||
That's a good question. | ||
And I would, like, definitely bet greater than 80%. | ||
Really? | ||
Probably better than 90%. | ||
Yeah, that's right. | ||
I mean, if you look at the polls, like, it's always that, in terms of expected number of votes, like, oh, it's only 46% in favor of Trump. | ||
Well, there's also the issue that the independents in the swing states, whether it's Gary Johnson or whether it's Jill Stein, those independents, the amount of votes they got would have swung the other way towards Hillary. | ||
Yeah, I remember looking into this for Jill Stein in particular, and actually it was the case that Hillary would have won the popular vote by even more, but in none of the swing states did Stein get a big enough percentage to change the outcome. | ||
Not just Jill Stein, but Gary Johnson as well. | ||
Yeah, though Gary Johnson, it seemed to me, was split almost evenly between Trump and Hillary. | ||
Right. | ||
But this is an interesting case, so... | ||
The thing that people don't think about so much is the process. We call this a democracy, but one single checkbox every four years is the smallest amount of information you can be getting from people. | ||
And it's susceptible to all sorts of different things. | ||
And this happens on both sides. | ||
So supposing Jill Stein became really popular and took 10% of the vote, she would have just killed Hillary. | ||
Like, absolutely. | ||
Or supposing that Evan McMullin, is that his name? | ||
Who's that? | ||
Yeah, he was a Republican independent. | ||
Okay. | ||
Did well in Utah. | ||
Anyway, supposing a far-right candidate does really well. | ||
Again, takes all of the votes away from Trump. | ||
The fact that that's possible shows that first past the post is a very bad voting system. | ||
It's not accurately representing the will of the people. | ||
And we could do so much better. It would mean that, as a democratic process, you'd be much closer to representing what people actually believe or feel about things. | ||
Because right now, it means that, yeah, you can be influenced by stuff like how much support does a third party get? | ||
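The vote-splitting point here can be shown with a small sketch. This is not from the conversation: the candidate names and ballot counts are hypothetical, and the runoff count stands in for any ranked method, but it illustrates how a plurality (first-past-the-post) count can elect a candidate whom a majority of voters rank last, while transferring the third party's votes does not.

```python
# Hypothetical ballots: 46 voters back A, 45 back B, and 9 back a third
# party whose second choice is B. A majority (54) prefers B over A.
from collections import Counter

ballots = (
    [["A"]] * 46 +          # A's core voters
    [["B", "A"]] * 45 +     # B's core voters
    [["Third", "B"]] * 9    # third-party voters whose second choice is B
)

def first_past_the_post(ballots):
    """Count only first choices; the plurality winner takes it."""
    tallies = Counter(ballot[0] for ballot in ballots)
    return tallies.most_common(1)[0][0]

def instant_runoff(ballots):
    """Eliminate the last-placed candidate each round, transferring ballots
    to the next surviving choice, until someone holds a majority."""
    remaining = {c for ballot in ballots for c in ballot}
    while True:
        tallies = Counter(
            next(c for c in ballot if c in remaining)
            for ballot in ballots
            if any(c in remaining for c in ballot)
        )
        leader, votes = tallies.most_common(1)[0]
        if votes * 2 > sum(tallies.values()):
            return leader
        remaining.remove(min(tallies, key=tallies.get))

print(first_past_the_post(ballots))  # "A" wins with only 46% of first choices
print(instant_runoff(ballots))       # "B" wins once the third party's votes transfer
```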
That's a terrible system. | ||
It's a terrible system. | ||
It lasts too long. | ||
The decisions last for four years. | ||
This person gets locked into position unless you impeach them and then remove them from office. | ||
They're stuck. | ||
It sucks. | ||
I wish I could talk about it more, but I can't. | ||
I've got to get the fuck out of here. | ||
But that's the least interesting thing we talked about today. | ||
But the AI and all the other stuff is just fascinating stuff. | ||
If people want to know more about your effective altruism movement and more about you, where should they go? | ||
They should go to effectivealtruism.org. | ||
That's got tons of information about effective altruism. | ||
If there's one takeaway you really want to act on, if you think, wow, actually, this was kind of cool, I do want to make more of a difference: | ||
We've just launched a set of funds, so it just means you can donate within one of these core areas of global development, animal welfare, or preservation of the long-run future against global catastrophic risks. | ||
You can just donate and be confident that it will go to the very most effective non-profits. | ||
0% overhead, depending on how you donate. | ||
And we don't take any money along the way. | ||
And it just means that, yeah, it's super easy to donate as effectively as possible. | ||
Alright, beautiful. | ||
Thank you, Will. | ||
Appreciate it, man. | ||
It was fun talking to you. | ||
We'll be back tomorrow with Jim Norton. | ||
See ya. | ||
unidentified
|
That was fun, man. |