(dramatic music)
On the Rubin Report each week, we try not to get too caught up in the minutiae of what's going on each day in the news.
While paying attention to current events is obviously important, there's so much happening all of the time, and staying on top of every single news story can be overwhelming and exhausting. | ||
The really important but less sexy issues, like infrastructure and education, often get overlooked or left behind. | ||
Instead, I try to focus our conversations on the philosophical underpinnings that lead my guests to their conclusions rather than discussing every specific occurrence that happened that very day. | ||
Real conversations and clear thinking are a long game, and it's often more important to understand why people think a certain way than how they think about a particular issue. | ||
Unfortunately, this technique is totally lost on almost everyone these days, which is why instead we bounce from one daily outrage to the next. | ||
In an election year, particularly this election year, this concept of bouncing from one outrage to the next has reached a whole new level of insanity. | ||
Before you even have a minute to process what the candidates actually said, media on both sides is telling you how outraged you should be and which hashtag you should use to voice your outrage with. | ||
Every quote, statement and comment is endlessly clipped and analyzed to the point that you can barely figure out what was originally said. | ||
Candidates subtweet each other to see who can come up with the best insult and go on late night talk shows to see who can get the most fake laughs. | ||
The side show has become the main show and we've all got a front row seat. | ||
Recently I said I was going to support libertarian candidate Gary Johnson to see if we could help get him to 15% in the national polls, which would have qualified him for the presidential debates. | ||
Although I explained that Gary isn't even a great candidate, nor a particularly good libertarian, at the very least we needed more voices to be heard on the debate stage than what the Democrats and Republicans have managed to throw at us. | ||
My hope was that if we could just get someone else to join Trump and Hillary for even one debate, that we could spark some new ideas in people which might make our political discourse a little saner. | ||
Unfortunately, Gary didn't make it into the debate, and much of the blame is on Gary himself. | ||
In addition to a generally uninspiring campaign in a year when a huge percentage of Americans would have seriously given a third party a real look, Gary has had a series of gaffes, making me even question my initial support for him. | ||
Just google Gary Johnson Aleppo or Gary Johnson Tongue or Gary Johnson Illegal Immigration. | ||
While we failed to get another candidate into the debates this time around, it's my hope that no matter who wins this election, by 2020 we'll finally get ourselves out of this stifling two party gridlock. | ||
It will be incredibly hard work to make this a reality, but it starts by starving both parties of what they crave most, which of course is money. | ||
Until we get around to putting them on their cash diet, the least we can do is listen to new and interesting voices who are trying to cut through the political clutter. | ||
My guest this week is futurist and journalist Zoltan Istvan, who is running for president as the candidate of the newly formed Transhumanist Party.
The goal of the party, which Zoltan himself created, is to put science, health, and technology at the forefront of American politics. | ||
While I somehow doubt that 2016 is going to be Zoltan's year, I think the issues that he's talking about, like artificial intelligence, designer babies, and chip implants, are things that will be shaping the future of our society, regardless of who's in office. | ||
Whether it's President Clinton or President Trump, technology is coming at us faster than ever, and we should be thinking about the moral and ethical consequences of a world changing at light speed. | ||
You never know, maybe discussing all this stuff now will buy us an extra year or two | ||
before the robots take over, though I for one will gladly welcome our new robot overlords. | ||
[Theme Music]
Joining me this week is an author, a futurist, and the 2016 presidential candidate for the Transhumanist Party, aiming to put science, health, and technology at the forefront of American politics, Zoltan Istvan.
Welcome to The Rubin Report. | ||
Thank you so much for having me. | ||
The Transhumanist Party. | ||
Sounds very Philip K. Dick, Tom Cruise kind of futuristic movie, Minority Report, something something.
What is? | ||
Well, you know, transhumanism, it's really the field of kind of using science and technology to modify the human being. It covers all sorts of crazy technologies, and it's been around for about 30 years, but there's never been a political element to it.
And so about two years ago I founded the Transhumanist Party, and now we have a bunch of advisors. I was nominated to run in the 2016 presidential election, and all of a sudden it's become a bit of a hit. Now we have people worldwide who are very interested in it, and we're putting a political foot forward to support transhumanism and radical science.
Yeah, alright, so we're going to talk some radical science. | ||
Let's start with the radical science that is in your left hand right now. | ||
There is a chip in there. | ||
Right. | ||
You are part cyborg. | ||
Yeah, so there's a tiny chip in here. | ||
It's literally the size of a grain of rice. | ||
It's so small you inject it. | ||
You can self-inject it with a syringe. | ||
And it allows you to do things like start your car. | ||
Or in my case, I have a door opener where I just put my hand to this little screen and it opens up automatically. | ||
And it also actually, if you have the right software, sends out a text that says, "Win in 2016," if I come close enough to your phone.
Really? | ||
It does all sorts of funny little things. | ||
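For readers curious about the mechanics, here is a minimal sketch of how an implant like this typically works: the chip is a passive NFC tag that just stores a small NDEF record, and a phone that reads it on proximity decides what to do with the text. This is my own illustration using the third-party Python package ndeflib, not the actual software on Zoltan's chip, and the payload string is only an assumption.

```python
# Toy sketch (assumed ndeflib package, hypothetical payload), not the implant's real software.
import ndef

# Hypothetical message written to the tag once; the chip then serves it
# passively to any NFC reader that comes close enough.
record = ndef.TextRecord("Win in 2016")

# Encode the record into the raw bytes a writer app would burn onto the tag.
octets = b"".join(ndef.message_encoder([record]))

# A reading phone reverses the process and can then display or send the text.
for decoded in ndef.message_decoder(octets):
    print(decoded.text)
```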
So what is the element of that that connects you, then, to politics?
Like why say, all right, I'm into this stuff and we're gonna get into all the AI stuff | ||
and genetics and all that kind of stuff. | ||
But what makes you say, all right, I'm interested in this. | ||
Why go the political route at the same time? | ||
Well, you know, we have a lot of scientists, we have a lot of engineers, millions of them. | ||
They're doing amazing work. | ||
But scientists and many technologists are not oftentimes the most vocal when it comes to politics. | ||
And yet, if you don't pass certain types of regulations, you can't practice the science. | ||
I mean, we have an endless amount of debate right now about the CRISPR genetic editing technology, about whether we should regulate the field of artificial intelligence. | ||
Well, most scientists and technologists are not stepping up to voice their opinions or sending lobbyists to Washington. | ||
That's really what my campaign is about and what the Transhumanist Party is trying to do. | ||
We're trying to be, you know, that first organization that puts its foot in Washington, D.C. | ||
and says, hey, pay attention to science and technology. | ||
We represent that. | ||
So basically you're trying to just make that leap. | ||
So there's scientists just doing their thing for the good of science or for whatever company they work for or think tank or whatever it is, but they're doing it sort of separate from the way our public policy is made, and you're trying to link it back to the way policy is created.
Yeah, I mean, I'm not a scientist. | ||
I, you know, studied philosophy in college.
I'm someone who's trying to popularize these ideas of science and technology because I love it. | ||
I just really, I'm just fascinated with these fields. | ||
And I find that many scientists, you know, they're in a laboratory in their white coats, bent over doing their experiments.
And I think they need somebody to be out there in Washington saying, how can we get you more, how can we get more from more science and more technology?
Yeah, so as someone that's into science and technology, which usually has a little something to do with facts and reason and things like that, you can't be that thrilled with our political climate at the moment.
No, no, I'm not thrilled with it. | ||
You know, I mean, and you know, not to kind of bash religion or anything like that, | ||
but you know, one of the main things of my candidacy is that I'm running as a non-religious candidate. | ||
We officially have 535 members of Congress, eight Supreme Court justices, and the President, who are all believers. They're not that interested in reason.
They're not that interested in the science that leads, you know, to kind of bionic hearts that can make us live indefinitely and those kinds of things. | ||
They'll all want those things, but they're not, like, drumming the table and saying, let's put funding for these scientists and engineers out there.
And I think if we could convince them to be, you know, less, I guess, conservative, less religious, more open-minded about the future, maybe the government could really actually lead when it comes to some of these amazing technologies.
And these are technologies that are going to make everybody across the country, doesn't matter if you're rich or poor, live better.
So that's really what I'm trying to do: you know, drum up support so that politicians across the country will put much more of America's resources into science and technology.
Yeah, and when we talk about these technologies, whether we're talking about the chip | ||
that's gonna start your car or open your garage door, whether we're talking about designer babies | ||
or any of that stuff, this is stuff that's gonna happen no matter what the politicians do. | ||
So in a way, you're trying to just give a roadmap so that once this stuff comes around, | ||
we have the sort of basic firewalls to make sure we're dealing with them properly? | ||
Is that what you're saying?
Yeah, yeah.
In fact, one of the funny things about my candidacy is I'm not totally like pro-science, like let's do it all. | ||
For example, I don't believe that we should let artificial intelligence develop completely unregulated. | ||
It's very dangerous to think we might have a machine that's as smart as a human being and far smarter than us here in 10 or 20 years. | ||
So in some cases, my campaign says take a step back. | ||
But whatever we do, we want to ask those questions. | ||
And you know, you're going to find that most of the major politicians, like Trump or Clinton, are just not talking about artificial intelligence.
Even though I've worked with the U.S. Navy, I had four U.S. Navy officers in my house a few months ago, I've been consulting with them, and we all know that artificial intelligence in the military is sort of the most important thing you can do in terms of national security over the next 10-year window.
Whoever can get to AI first is going to have such an unprecedented advantage in, you know, hacking codes and rewriting software and these kinds of things. | ||
Things where you could actually, literally, you know, change the global landscape overnight if you were the very first person, you know, the very first country.
For example, imagine if North Korea got AI first and was able to hack into all our systems, stop our electricity, stop our power, stop the water flowing and stuff like that. | ||
And then, you know, take us to the cleaners.
I mean, you could turn off America. | ||
This is the big problem. | ||
And so, you know, the U.S. | ||
Navy and I had been discussing this, saying, well, of course, this is a number one priority of national security moving forward the next 10 years. | ||
Yeah. | ||
So is that the only way to get government to respond to some of this stuff properly? | ||
Like you have to show them a problem that is so big, so connected to national security more than anything else, that they go, all right, well, we gotta get on AI.
It's not just something that we can have some guy with a robot climbing stairs look at and wait for, because this is big stuff. | ||
The answer, sadly, might be yes, that you do need a national security issue. | ||
It's not just, oh, that's fun. | ||
You go have fun with your robots or your chips in your hand. | ||
You know, actually, the real reason the U.S. Navy and I even got in touch in the first place was that when I got my chip implanted, I was on the campaign trail, and there was quite a bit of publicity about that, you know, being the first chipped presidential candidate.
And they, you know, they called me and said, so we have a problem. | ||
Navy personnel, new soldiers essentially, you know, are coming onto both military bases and nuclear-armed vessels, and they have civilian chips in their hands that they've gotten.
How do we deal with that? | ||
We don't know if we want to let, you know, I guess a Navy person onto a submarine if he has a civilian chip and that submarine has nuclear capabilities. We don't know enough about that technology in his body.
Meaning we don't know if they have malicious intent with it, or if someone hacks into it and can then hack in through it.
Exactly. | ||
And that's the real issue at hand is that, so they're actually now creating a bunch of, and this is what their reports were about, they're trying to create policy saying whether you can or you can't go onto these bases with this kind of embedded technology. | ||
And, you know, the funny thing about my chip is I've had it about a year now, but it's basically already obsolete. | ||
Like, I can't pay with my chip, but with the new chips that are coming out right now, you're able to go to Starbucks and swipe your hand and pay with that.
And that's just in one year. | ||
So in three or four years, can you imagine what these chips are going to do? | ||
They're going to have all sorts of crazy things that they're going to do for us. | ||
You might be able to talk through them like a cell phone. | ||
And that presents a huge amount of problems for, I guess, all sorts of authorities, including just flying on an airplane and going through a screening. | ||
Right, so you mentioned that you're not for sort of unfettered scientific pursuit because of what you said about AI, for example, that it could lead to some terrible stuff. | ||
But is there any way to actually control that? | ||
I mean, once technology and science have gotten to a certain point, is there any way to stop, even with the best intentions, somebody coming in and going, no, I'm gonna do the thing that is gonna lead to the robots taking over, or whatever the case may be?
Unfortunately, nobody has discovered that yet. | ||
And part of the reason that AI remains an unregulated industry is that nobody really knows how to regulate it.
Other than just stopping it completely.
And then that's crazy. | ||
What are you gonna do, tell a university you can't study this stuff? | ||
So that's been one of the main issues with the AI industry. | ||
Now, I actually don't wanna necessarily stop it in any way. | ||
I would love to have AI. | ||
I just, I think I worry like everybody else that the very first AI is gonna be at least as important | ||
and big as the Manhattan Project. | ||
You have to see AI in terms of almost like a nuclear weapon. | ||
It could be so smart that it could do all sorts of things, and it could be malicious, and it could be very nice. | ||
So, you know, whatever happens, it's one of the very few times that I sort of, I don't want to say that I become anti-transhumanist, but I do pull back and say, you know, some of the experts, you know, Elon Musk and whatnot, have been warning about this. They're correct to warn.
This is something we've never done before. | ||
We've never introduced a species on planet Earth that's smarter than us. | ||
And we need to be careful with that. | ||
Would you say that's just the next step in human evolution? | ||
Like the whole point is we evolved to get to the point where we could now create something smarter than us. | ||
And we're just sort of going to accelerate what evolution is, while at the same time putting chips in our bodies and creating some other thing, some humanoid type thing. | ||
Yeah. | ||
I think if I did this interview with you in 25 years, probably 20-30% of me would already be machine parts or completely synthetic. | ||
And I have friends right now who are about a year, maybe two years away from amputating a limb and putting on a robotic limb. | ||
The robotic limbs that tie into your neural system now have gotten so sophisticated that you can grab a glass of water, you can write with them. | ||
They'd be doing this by choice? | ||
Yes, that's the whole thing. | ||
They'd be doing it electively. | ||
It's one thing if you've lost your arm because you were in a war zone in a landmine or something like that, but there are biohacker transhumanists out there, young kids mostly, that really want to take it to the next level. | ||
They're also trying to implant chips into their brains so that they can be the very first people to commune with basic artificial intelligence, to have the first telepathic communication with machines.
We now have so many different types of brainwave headsets and brainwave technology that you can actually put a chip in your head and talk to your computer in a very basic way.
It's the beginning of the Matrix.
But this is what some of these biohackers are starting to do. | ||
It's a part of the transhumanist community, but it's sort of the real aggressive kind of young crowd. | ||
And I totally support it. | ||
I think it's wonderful stuff. | ||
But the arm one's very important, because I actually would like an artificial arm, too. | ||
Eventually, probably in 10 or 15 years, you're going to be able to take a football and throw it a mile. | ||
So, I mean, there's going to be advantages at some point about this. | ||
The question is, you know, how will your family member deal with it? | ||
How would my wife deal with it? | ||
She's like, well, if it's some robotic metal arm, I don't like it. | ||
You know, but it might be a warm synthetic arm that looks very similar to what flesh is, and you'd still be able to throw a football a mile.
So we're going there, and I bet in 25 years I might have something like that. | ||
Yeah, so let's back up for a second and just talk about what got you personally involved in this. | ||
What's your background, and what made you say, all right, I'm going to start a political party and talk about some stuff that nobody's talking about on the political front?
Sure. | ||
Well, you know, I'm formerly with the National Geographic Channel. | ||
I worked there as a journalist for about almost four years. | ||
Well, over four years. | ||
And I did a lot of reporting with a video camera, solo reports, three to four minutes long, for their news show called National Geographic Today.
And the reason it's important is that I covered a huge amount of stories in a short amount of time, where I really got a good picture of the world.
But a lot of young journalists actually start off in conflict zones. | ||
And so a lot of the coverage I did was in conflict zones. | ||
So I saw some pretty heavy stuff. | ||
Deaths. | ||
Multiple rape victims. | ||
I saw torture victims. | ||
And it really got me worried about dying. | ||
And, you know, for any of the viewers who don't know, transhumanism's main goal is trying to overcome death with science and technology. | ||
So the more tragedy that I saw, the more I got interested in using technology and science to stop death. | ||
And I guess the war zones really hit it into me. | ||
I had a very close incident with a landmine in Vietnam when I was covering a story there. | ||
And afterwards, I just said, you know, I'm kind of done being a journalist, or at least done covering such things.
I came home and I wrote my novel, The Transhumanist Wager. | ||
The book did really well. | ||
I combined it with my journalism skills, and I've, you know, started writing a lot about the movement, which sort of led me to being a more visible figure in it. | ||
And, you know, that kind of led to the Transhumanist Party and also this ability to, I suppose, you know, run for the presidency, because a lot of it is media interaction.
And I've been an on-camera guy for a long time, so it's been very helpful to take that. | ||
So that's sort of from, you know, 10 years back to, I guess, where we are today. | ||
Yeah. | ||
So it kind of seems to me that for a full party, what you would need is people like you that are into the technology front. | ||
Then you would need some ethicists, some legal people, probably a series of things that I'm not thinking of at the moment. | ||
Do you have that core group of people that you kind of use to say, this is the idea, but now I need different people to look at it from their own particular place of expertise? | ||
We do. | ||
We have an amazing advisory board, but I, just to be honest, I don't actually consult with them that regularly because, frankly, they're all pretty important people and I can't bother them too much.
But certainly, you know, I have consulted with each of them at various points to get, you know, a bigger picture on how to best run a campaign and also how to grow the movement. | ||
You know, the Transhumanist Party at this time, its main aim is not to... you know, it's hard to define what a political party is to begin with.
Because, you know, there are hundreds of political parties in America right now. | ||
And when you start one, it takes years. | ||
I mean, it took the Libertarian Party 40 years to get on all 50 state ballots.
So we're here two years into this, you know, political party. | ||
We'll talk about them in just a second. | ||
We're really just trying to grow the movement itself and hopefully by 2020 maybe the Transhumanist Party would then be recognized enough where we have enough state parties, because we have a couple state parties, but not enough right now to make a real dent in the kind of political way we want. | ||
It's a growth process and it takes time. | ||
Yeah, so you are conceding defeat for 2016. | ||
Yes, yes. | ||
And in fact, I have said that from the very beginning, that I was never running to win because it's really impossible to win the presidency. | ||
I mean, you have to be on all the state ballots, and then you have to actually win these states. | ||
And even if you're somebody really, really popular, actually overturning the Republican and Democrat system is very difficult.
What you can do is spread a message. | ||
And the way media works these days is you can really grow a movement. | ||
We see transhumanism like environmentalism was 20 years ago. | ||
We think... | ||
You know, environmentalism today has three or four billion people. | ||
Everybody, you know, a lot of people consider themselves green. | ||
We hope transhumanism does the same thing, and so we're trying to create a roadmap to that kind of success. | ||
Yeah, so with that in mind, you auditioned, is that the right word, to be Gary Johnson's VP? | ||
Is it an audition? | ||
What would you, what do you call it? | ||
Did they say come in and audition, dance around for us? | ||
What did they say? | ||
It was an interview. | ||
Gary had very early on, you know, declared his intent to run for the presidency, and somebody introduced me to him.
I was at the time interested. | ||
I've, you know, I've always been... My novel, The Transhumanist Wager, is broadly considered a libertarian book.
And I would say that I'm a left libertarian. | ||
What does that mean to you? | ||
A left libertarian means somebody who's sort of on the border between being a Democrat and being a libertarian, right on the edge. | ||
And that's where I fall, because I don't fall squarely in either camp. And I know libertarians say, oh, well, you can't be a libertarian unless you're a full libertarian.
Come on, I mean, there's gray zones to it all. | ||
Are you for driver's licenses? This is my big one with the libertarians, when they're screaming about it.
Yeah. No, I am absolutely for driver's licenses.
I find, like, you just gotta be realistic because, after all, you can't just jump in there and expect to change the world unless you're gonna also compromise a bit. | ||
So I try to compromise quite a bit more. | ||
You know, Gary Johnson invited me to, you know, be a guest at his house in New Mexico.
It was just him. | ||
I, you know, I spent the night. | ||
I interviewed with him. | ||
He made me dinner. | ||
We watched Orphan Black together. | ||
I actually introduced him to it, yeah, so that was great, because he hadn't been introduced to Orphan Black.
But he also introduced me to House of Cards, which I had never actually seen. | ||
I have two kids, so I don't have a TV at home. | ||
And so I've been watching those addictively now. | ||
And then the next morning, we continued our conversation.
And he told me up front, you probably don't have much of a shot, because I'll probably be able to get a governor of some sort, but I'm interested in what you represent because you're young and you have a millennial crowd.
I think Gary's very interested in science and technology. | ||
So for me, it was an honor to literally talk to him for 16, 18 hours over time. | ||
He picked my brain, I picked his brain, and I became a big fan of his, to be honest. | ||
But I didn't get the job. | ||
I wish I had, I wish I had. | ||
I won't ask if you guys smoked pot, but you can just nod appropriately. | ||
No comment. | ||
Fair enough. | ||
It's interesting you mentioned Orphan Black, because have you seen Black Mirror by any chance, which is also on Netflix? | ||
You know, I've seen only one. | ||
So Black Mirror, I've mentioned it on the show a couple times. It's really all about how we're all, I don't have my phone on me now, they took it away from me, but we're all walking around with this device, and it's becoming a mirror of our lives.
And what I love about it is that it's showing the future in only three or four years, how subtle changes will actually change us a lot. | ||
How much do you think just the phone and the technology, the power that we're all walking around with, is changing us without us even realizing it?
I think it's dramatic. | ||
I think, for example, even this election cycle, I've never, no one's seen such hate, no one's seen such trolling. | ||
I think that's a direct result of our ability to be 24 hours a day connected. | ||
We've never been so connected. | ||
It's because of the phones, it's because of the devices. | ||
We're becoming a nation of sort of haters and trolls because we have more content hitting our brains than we've ever had before. | ||
And that may or may not be good. | ||
I think these are the kinds of changes that are happening; you just see them in real life.
You don't even need to make a Hollywood film on it. | ||
So I think it's very important. | ||
But one of the most striking stories, I think, of the last few years that's really hit me is Uber. | ||
Everybody was so excited they had a new chance to have a career as an Uber driver, to make money doing that, and now all of a sudden Uber, literally within a couple of years, is introducing driverless Ubers.
So that entire career that made so many young people excited is almost now gone. | ||
And that's how quickly the acceleration of technology is happening these days. | ||
Nobody realizes that what happened in 10 years is going to happen in the next 5 years, that amount of change. | ||
But then after 5 years, it's 2.5 years, then it's 1.25. | ||
So you go out 10 years, and you're literally making up four or five times the amount of change that's happened in the last decade. | ||
We are not prepared for it. | ||
It's going to be super exciting. | ||
We're going to stumble a lot as a society. | ||
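A rough way to see the arithmetic behind that claim is sketched below. This is my own back-of-the-envelope illustration, not a model Zoltan gives: assume each successive "decade's worth" of change arrives twice as fast as the last, and stop counting once the slices get shorter than half a year, an arbitrary cutoff chosen only to keep the toy count finite.

```python
# Toy illustration of the doubling pace of change described above:
# a decade's worth of change in 5 years, the next in 2.5, the next in 1.25, and so on.
interval = 5.0   # years needed for the next "decade's worth" of change (assumed starting pace)
elapsed = 0.0
decades = 0
while elapsed + interval <= 10.0 and interval >= 0.5:
    elapsed += interval   # that much change has now happened
    interval /= 2.0       # and the pace doubles again
    decades += 1

print(decades, elapsed)   # -> 4 decades' worth packed into roughly 9.4 calendar years
```

Under those assumptions, roughly four past decades' worth of change fit into the next ten calendar years, which is the "four or five times" figure he mentions.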
So is the rapid acceleration of change, is that the biggest risk, more than the technology itself, just the ability for it to change so quickly and evolve so quickly? | ||
Is that sort of scarier to you than what we might come up with? | ||
I don't sense you're scared by any of this, actually, but the risk. | ||
Of course, of course. | ||
Yes, look, I think what's happening is, for example, you know, just even talking about the elections, Hillary and Trump are not addressing issues that are happening here in the modern world like artificial intelligence, gene editing, designer babies, huge things that have already happened and that will continue to happen. | ||
They're still addressing Social Security, taxes, foreign policy, just like they did for the last 50 years.
The problem is that nothing is going to be as important to Social Security as this anti-aging or longevity movement that's afoot right now.
I mean, when everyone lives 150 years, you can just throw out your Social Security. | ||
And that 150 years is already happening. | ||
We might only be a few years from reversing or stopping aging through genetic editing. | ||
So the questions that they're asking and the questions that the media is leading the country to ask are, I think, a lot of times backwards.
You know, things are changing much quicker than before and we need to learn to adjust. | ||
Yeah, you know, it's interesting. | ||
I read a book called 2030 by Albert Brooks. | ||
It was his fiction novel about what he thinks America is going to be like in 2030. | ||
And one of the issues that he thinks we're going to face the most is that our aging population is going to keep staying alive. | ||
And because of that, the economic strain that it's going to put on young people is gonna cause, not like a racial war, but like an age war.
Because the young people are gonna keep paying into a system to keep older people alive who aren't producing as much for society. | ||
And that's just one of the many, that's what popped into my head as you were saying that. | ||
Just there's so many issues. | ||
We're focused on the same five things. | ||
You know, the designer baby thing is another one that's good, because it's like we're always focused on abortion, but there's a lot of other ethical things that are popping up, whether we talk about them or not.
Of course, of course. | ||
And, you know, all of them are interconnected. | ||
You know, the abortion thing is classic. | ||
My wife is an OB-GYN at Planned Parenthood, so I'm, you know, directly connected to a lot of that stuff. | ||
And the designer baby thing is so amazing because, you know, there's a couple of universities around the world working on artificial wombs, and they've actually been able to raise, you know, kind of a goat from a little embryo all the way to basically almost being alive.
And we basically probably have this technology to do this with our children, you know, in 10 years time. | ||
Meaning that you can bypass basically the human body almost entirely. | ||
And you don't even need to have sex. | ||
And then there's other methods too. | ||
You can just take a shot and combine your own DNA. | ||
You know, there's cloning techniques. | ||
So the whole world is changing so dramatically. | ||
And people are still stuck on the concept of, you know, whether, you know, you're pro-life or pro-choice. | ||
When just literally five to ten years from now, there's going to be a huge amount of new technologies that not necessarily make that question obsolete, but change that question so dramatically. | ||
You know, for the pro-lifers, if you can preserve a life and you have this ectogenesis or this artificial womb, do you ask society, do you ask somebody to give up the baby and then raise it there, even if it's only been one week old? | ||
And then there's this other idea that we can now keep... you know, it used to be that the limit was 26 weeks, and that's when a baby could survive, you know, outside the womb. Well, we're going to get it down so a baby can survive at 12 weeks here in the next couple of years.
That changes again because then you're like, well, where is that line? | ||
So technology is making the world crazy. | ||
And, you know, hopefully and obviously in a good way, but what's happening is a lot of ethicists and a lot of the government are not able to keep up with that.
Right, well even that example that you gave is really great. | ||
So if you're against abortion and you wouldn't want someone to have an abortion, but let's say the woman wants to have the abortion, and the technology exists to keep that fetus alive, you know, two weeks after insemination. If something like that exists, well then, is it the state's responsibility to pay for that?
Who has to pay? | ||
So there are real world, it sounds like sci-fi, but there are real world implications that could be happening right around the corner. | ||
And, you know, probably within 10 years we're going to be able to keep, you know, a one-month-old, or even younger, fetus alive.
And that's going to be a very difficult question, I think, for society to face. | ||
The good news, though, is there's also going to come technology, like your phone, that can just scan, or just take a blood test, and within minutes of having sex tell you whether you're pregnant or not, or whether that's happening.
Not minutes, but maybe within 12 hours. | ||
So you'll have better ways, and then probably other ways, like morning-after pills, to terminate it, sort of like, you know, just taking vitamins or whatever, something like that.
Making the whole system a bit easier so there's a counterbalance because | ||
obviously, you know, if technology goes to one side then, you know, you know, maybe | ||
we'll have to actually revisit this entire question as a society back to the | ||
Supreme Court. And the problem is, though, technology changes so quickly that, you | ||
know, even the idea of having babies inside a womb is going to become | ||
something that I think is going to be very controversial. | ||
There's, you know, let's be honest, giving birth is something that's very medically | ||
dangerous. | ||
Oftentimes involves major surgery if you have a c-section, whatnot, and, you know, | ||
I don't want to get myself in too much trouble, but an evolving country, an evolving society would try to make it so that motherhood does not involve that type of medical danger. | ||
And that's where the artificial wombs are going to come in. | ||
It also means that women don't have to take all that time off work.
And now we can kind of get back to the equal pay area and say, is that what caused it? | ||
Or was there really discrimination built into the system? | ||
I think there probably is, but it would be a way to solve it. | ||
But what's most interesting is, again, just thinking about transhumanism, in the next | ||
10 years, all these things are going to be changing because of all these different types | ||
of technologies that we have, and rewriting what our parents taught us was right or wrong, | ||
and saying, well, the rules have changed. | ||
The greatest example that's happening right now is Mothers Against Drunk Driving. | ||
It's one of the largest nonprofits in the country. | ||
I have two kids, who wants to deal with any drunk drivers? | ||
But driverless cars will literally wipe out this tragic problem that the whole world has, because nobody will be driving drunk anymore.
And it's a great example of how a single technology can eliminate a problem that has literally cost hundreds and hundreds, well, millions of American lives at this point.
At the same point of course the counterbalance would be that someone could hack into the system. | ||
Of course. | ||
And he could be drunk in front of a computer. | ||
Right. | ||
So yeah, it gets crazy. | ||
So there really, I mean, you mentioned the Matrix before, but it really, when you open up the Pandora's box of this stuff, the sort of offshoots that you can go to are really endless. | ||
It is, and I think what's very interesting, and this is one of the most difficult challenges, I think, with my candidacy and talking about transhumanism, is how do economics work into it?
Because what capitalism has done is always like, someone moved forward on this end, well then someone, there was a competitor or some other idea here, and so there always was a pretty good balance. | ||
But as we move forward, I'm not sure, like when everyone loses their jobs due to robots and whatnot, whether that balance will continue. And that's when it gets dangerous, because so far technology has been neutral, and it's been the people who have made it bad or good or whatever.
In the future we need to somehow, maybe through regulation or just maybe crossing our fingers, hope that that balance continues and that technology and science continue to make our lives better but don't lead to some kind of dystopic world, as all the sci-fi books, of course, point out.
Right, so it's interesting because you're on one hand, you're describing yourself as a left libertarian, but you're acknowledging that some regulation might be necessary because we'll just sort of, we'll evolve ourselves out of the system at some point if we don't do that. | ||
You know, so I took a funny bus tour across the country for four months on my campaign.
And I, you know, mingled a lot with truck drivers, because they were all on the road and we were hanging out with them quite often at the truck stops.
And, you know, in Europe they've now had 10 big giant semi-rigs drive driverless across the continent.
And America is only a couple years away from having driverless trucks. | ||
We have approximately 4 million truck drivers.
These are grown men. | ||
Many of them like guns. | ||
Many of them would not be able to get other jobs very easily because they've been truck drivers for the last 20 or 30 years. | ||
They're going to potentially be replaced here in a few years. | ||
We're talking four million jobs and grown men. | ||
So what do you do with that? | ||
You can't just say, oh, we're going to give you welfare. | ||
They probably wouldn't settle for that. | ||
These are men that'll say, no, I got a gun. | ||
Don't tell me what to do. | ||
So one of the things I'm trying to do with the party and with my campaign is establish economic parameters that might help people move into the new transhuman stage, you know, either through a universal basic income, there can be a libertarian version, or other different means. But whatever it is, we don't want the government to become Big Brother, but at the same time, we realize that we're not sure capitalism can survive if robots start taking everyone's jobs.
Right, so in that case, you would be for a universal basic income, so that it would really just be a transition, sort of, for some of these people that maybe would just put their hands up and be like, the world's passing me by here, so let me survive. | ||
Well, and the good thing about a universal basic income, which I do support, but I support a kind of a thinner one, and not one that says overnight I want it to happen. | ||
It's gotta be something that's transitioned slowly so that all of the economies don't fall apart. | ||
Yes, at some point I think we need to consolidate welfare, we need to consolidate Obamacare, we need to consolidate Social Security, you know, into one giant package, because we have this other issue that anyone that's born today is almost certainly not going to die, or there are going to be so many different types of technologies out there that can keep you alive indefinitely.
I mean, when you look at the average lifespan, it's doubled in the last century, and now it's going to quadruple.
The reality is that if people live that long, we need to do something for them. | ||
They need to have something so that they can feed themselves, house themselves, you know, be able to live a life where they don't cause civil strife.
And that's really my main worry, because for me, I don't really like the idea of universal basic income from a humanitarian perspective. | ||
I actually like competition. | ||
I think that's good. | ||
but what I really don't like is revolution or civil war. | ||
So that's more my concern, and from a science perspective, the best thing to do is keep the peace.
So a compromise, a nice compromise, is a universal basic income.
Yeah, so is the inherent flaw of all this that all of these technologies are going to be accessible | ||
to rich people or wealthier people or people with more means earlier? | ||
And that ultimately, and poor people, even with a universal basic income, | ||
won't have access to this stuff. | ||
And we're going to almost create a series of superhumans and a series of old-school, non-chip, regular people. | ||
And it'll really be like your human experience will be completely different than probably more than 50% of the world. | ||
You know, the question you just asked is the question I get asked the most because everybody worries about it. | ||
I thought it was a good one. | ||
Yeah, no, it is a great one. | ||
But it's the one that everyone worries about because everyone knows, most people are not part of the 1%. | ||
I'm certainly not part of that 1%, so I worry about the question, too. | ||
And I want to make sure that, you know, that 1% doesn't take that technology and leave me in the dust.
Because, you know, in the same way we're seeing technology accelerating along this J-curve, growing so quickly, rich people's personal improvements might actually skyrocket.
And you might find, for example, through gene editing technology, that someone's child is 30 to 40% smarter than your child because they were able to afford the augmentation technology.
Well, now society's unfair. | ||
Now we're getting into Brave New World type of stuff. | ||
So we need to, as a society, come up with a plan where that 1% doesn't leave us behind. | ||
It's not the same playing field it was before. | ||
The world is changing too quickly. | ||
So even though I want to stay as a libertarian, I understand that if we don't do something different, it could explode in a bad way.
Right. | ||
And I guess you could argue that that's happening now anyway, right? | ||
Because right now, if you're wealthy, you can get your children to the best doctors, you can get them to the best schools, all of those things. | ||
I'm sure every metric there is would show that they're going to have a better chance to succeed in life. | ||
So we're just adding the technological component to it in this discussion. | ||
But this is stuff that's going on already. | ||
No, and you're right. | ||
I mean, when you look at how much longer rich people live than poor people, it's about 25%, you know. | ||
And when you look at how much more educated they are, you know, I mean, the facts speak for themselves. | ||
So, you know, I broadly would like to create a society with much more equality in it.
But I think the issue right now is that that's a very difficult thing to do under all the circumstances.
You know, there's just too much... Right now, inequality seems to be growing.
Technology seems to be accentuating that. | ||
And at some point, we're going to have to step in and say, wow, the world's changing really quickly. | ||
We better take a long, hard look at this. | ||
And that's one of the reasons I actually, like, delivered a Transhumanist Bill of Rights to the US Capitol.
And one of the very most important rights in it is that everybody has a universal basic right to an indefinite lifespan, access to transhumanist technology, because of this very real concern that maybe only 10% are gonna get it. And that change would be so much more dramatic than the 10% even becoming wealthy. I mean, what if the 10% become 500% smarter genetically?
We're no longer dealing with the same species, then, practically. | ||
So we have to be careful we don't go down that path. | ||
I mean, if you think Black Lives Matter is something that is controversial, well, what are you talking about when you say the 1% have now become these gods, practically? | ||
I mean, that's something that I think is going to be far more controversial. | ||
We can't let that happen. | ||
So getting away from the human side of this, or the individual side of this, I think there's an interesting piece of this that's nation-related. | ||
Are there any countries right now that are doing a better job at this stuff, that are really looking at the future of this stuff and the ethics and all the things we're discussing better than the United States right now? | ||
I suspect we're not doing it particularly well. | ||
Well, I mean, you have to kind of go back to what happened with George W. Bush. Essentially, 16, 17 years ago, stem cell technology was one of the most important technologies of the 21st century, and it still is.
And it's doing wonders right now for, you know, giving people mobility back that have been in accidents, and paraplegics and whatnot, and other types of treatments too.
But he created this atmosphere where, when he shut down federal funding for stem cells, literally for seven years during his two terms, it set a precedent that was very anti-transhumanist.
And the bioethicists that he had in the White House were very anti-radical technology. | ||
Can you explain for people that have no knowledge of this whole thing why, what the ethical | ||
argument was over that? | ||
Sure, sure. | ||
Well, you know, a stem cell basically essentially came from aborted fetuses originally. | ||
And then they were taking stem cell technologies, and stem cells are essentially original cells in your body that can become any different type of cells. | ||
But the problem is, in the very beginning, we were getting those stem cells from aborted fetus tissue. | ||
And that was very controversial to the Republicans. | ||
And of course, since we had a Republican in office, he said, no, the federal government is not going to support research on stem cell technology. | ||
And then what happened is Obama came into office and said, listen, stem cell technology is not about abortion or anything like that. | ||
It's about helping people and helping hundreds and hundreds of thousands of people, which is now happening. | ||
So Obama reversed the stem cell issue that George W. Bush had put in, and now there's been all this funding going into it, and all of a sudden stem cell technology is back as one of the great 21st century treatments. | ||
But the problem is that, you know, what George W. Bush did set a precedent, and it's kind of what all the transhumanists around say: that was the very first anti-transhumanist government action that stood in the way of our health.
Because a lot of people think stem cells might even be used for something like heart disease, or it is being used for heart disease, or for cancer treatment, meaning it can be a life extension technology. | ||
We can live longer. | ||
So if a president steps in and says, you can't do that, he's also saying, we're not interested in you living longer. | ||
We have ethical frameworks that we have to work around. | ||
And that's when transhumans were like, whoa, the government has gone too far. | ||
And this is why it's very important that the bioethicists that are in the White House, that are supporting the president, that are supporting the government, are for technology and science. | ||
You know, you can have a scientist who doesn't love science, and believe me, that doesn't help the human race. | ||
That's not good. | ||
So, every week on this show, for one reason or another, I say that the road to hell is paved with good intentions. | ||
So, in a case like this, I would suspect that George W. Bush's decision-making was that he was looking out for the unborn child. | ||
So, I don't think his intentions were malicious. | ||
However, the result of his intentions were to go against what science can allow us to do. | ||
And I guess the bigger issue really is that if they had stopped it completely, you know, no government funding, and we're gonna prosecute you if we find out that you're doing this on your own, someone's still gonna do it.
So doesn't, in a way, that would make it more dangerous? | ||
Because then you'd have only people doing it outside of the system? | ||
Of course, of course. | ||
And so, going back to, you know, the question you asked earlier, this is exactly what sort of happened.
And I agree with you. | ||
I don't think George W. Bush, of course, wanted to end lives early. | ||
I mean, he really just was trying to, you know, follow the wishes of his party and protect the unborn child. | ||
So I know that. | ||
But what happens is that it becomes a national issue, and the end result for many people, including myself, was that my life and the health care system are worse off because of it.
But I think what's happened now is this sets a precedent for genetic editing, which is sort of the next stem cell, the next war that the governments are fighting sort of with scientists right now. | ||
Genetic editing, the CRISPR-Cas9 technology, allows you to sort of take your genetic nature and structure it almost like you would software code.
You can put in different pieces, so we're able to create animals that have much heavier muscles.
I have friends that are already starting to look into trying to grow a third eye on the back of their head. | ||
Why didn't evolution give us a third eye? | ||
I mean, it sounds crazy, but that's something that we can start to think about with CRISPR technology. | ||
It's probably 10 or 15 years out. | ||
It's something that we're going to be able to do. | ||
The idea of growing another set of arms is now 100% possible because of this gene editing | ||
technology. | ||
Now you can see that's very, very controversial. | ||
Wow, you want four arms. | ||
Well, how does that work? | ||
I mean, you know, if you're in the Olympics, is that something different? | ||
Right. | ||
I mean, this is The Island of Doctor Moreau.
Of course. | ||
Of course. | ||
So China, about a year and a half ago, you know, started working on the very first embryo | ||
and modifying it with this genetic technology. | ||
It sent the medical community into shockwaves, like, oh my god, they're creating monsters, this and that.
So immediately, people were arguing to put a moratorium on the technology.
And luckily, that didn't exactly happen, but the American government has stepped in, and now they're considering the issue. | ||
Luckily, there are actually pockets of CRISPR technology that have now been approved, at least with some government consent, to move forward, because it's a technology that potentially can eliminate every single disease as well as do all sorts of other amazing things.
The one that you hear a lot about with CRISPR technology is, should we eradicate the mosquito by creating a mosquito that doesn't have malaria and that breeds with others, and then no mosquitoes will ever have malaria?
That's what CRISPR can do. | ||
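The malaria-proof mosquito he describes relies on what biologists call a gene drive: the edited allele biases its own inheritance, so it spreads through a wild, randomly mating population far faster than ordinary genetics would allow. Below is a toy, deterministic sketch of that spread; the starting frequency and transmission bias are invented numbers for illustration, not figures from the interview.

```python
# Toy gene drive model: track the frequency of the edited allele generation by
# generation under random mating. Homozygous carriers always pass the edited
# allele on; heterozygotes pass it on with probability `bias` instead of the
# Mendelian 0.5, which is what makes a drive "drive".
def drive_frequency(p0, bias, generations):
    freqs = [p0]
    p = p0
    for _ in range(generations):
        p = p * p + 2 * p * (1 - p) * bias  # next generation's allele frequency
        freqs.append(p)
    return freqs

normal = drive_frequency(0.01, bias=0.50, generations=20)  # ordinary allele: stays near 1%
drive = drive_frequency(0.01, bias=0.95, generations=20)   # driven allele: sweeps toward 100%
print(round(normal[-1], 3), round(drive[-1], 3))
```

With a normal 0.5 transmission, a rare allele just sits at its release frequency; with the assumed 0.95 bias, the same 1% release sweeps to near fixation within a couple of dozen generations, which is why a single release could, in principle, make the whole mosquito population malaria-resistant.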
Right. | ||
Wouldn't this then lead to Jurassic World? | ||
We create a dinosaur that's better than every other dinosaur and then it destroys the park, eats all the people, kills all the others? | ||
That's where the ethical quandary is. | ||
And that's why nobody is really sure. | ||
Now, I may not be super supportive of AI technology because I worry about it, but I am 100% supportive of CRISPR. | ||
I think, you know, in the transhumanist community we have this thing called morphological freedom.
It's pretty much the most important fundamental right. | ||
Morphological freedom means the right to do anything with your body so long as it doesn't hurt somebody else. | ||
And that means growing a second head. | ||
As controversial as that is, that's what it means. | ||
It also means, you know, the ability to change yourself gender-wise. | ||
Anything you want to do, you should be able to do it. | ||
If you can do it through that way, it's 100% okay. | ||
Most transhumanists believe in that. | ||
I strongly believe in that. | ||
I think what's going to happen in the future is America and society as a whole is going to have to grapple with this new civil rights era of people using this genetic technology to change themselves so dramatically. | ||
I mean, you talk about skin color. | ||
We're talking about, transhumanists are talking about, the bar on Tatooine in Star Wars.
I mean, we're talking about very different species. | ||
That's what some of my friends are trying to do when they're using this technology. | ||
I have friends right now that are trying to splice in plant DNA so that they can photosynthesize.
And you could solve world hunger that way too.
Yeah, wait, so people are actually testing this stuff on themselves, which I think is kind of fascinating.
So you have the chip in you, but you weren't testing it on yourself.
You knew it was a proven technology, well, basically a proven technology at the time, right? | ||
Yes, although the chipping thing is very controversial. | ||
It's not FDA approved or anything like that. | ||
And in some states it can also be illegal depending on how it's administered because you actually have to get someone to put it in you. | ||
But those things aside, the chip's not a very big deal. But there came out last year a DIY CRISPR kit for around $100 that you can mess with and start taking apart your own genes, mixing them with other genes, and then putting them back inside you to see if anything happens.
Now, it's very rudimentary. | ||
That doesn't work. | ||
Yeah, this sounds incredibly dangerous. | ||
I mean, I'm on board everything you've said here, but it sounds-- | ||
Of course, and this is why this is so controversial. | ||
Now, most of these people have no idea what they're doing and they're just, I don't wanna say they're amateurs, | ||
but they're citizen scientists, but there are some universities doing the exact same thing | ||
on a much bigger level. | ||
I mean, CRISPR was basically born out of MIT, and they're the people | ||
that are literally gonna be changing the world. | ||
And like I said, it's been funny, even the Wall Street Journal, I think New York Times, everybody was running the controversy about using CRISPR to eliminate malaria. | ||
Because here we've had Bill Gates spend so much money on, thankfully, trying to eliminate malaria. | ||
It kills about 750,000 people every year. | ||
And we might have a chance to literally eliminate it entirely through this type of technology. | ||
But should we do it? | ||
What if we eliminate the mosquito and all of a sudden half the populations fall apart because, you know, it's like introducing rabbits somewhere. | ||
We don't know really what happens. | ||
Right, right, right. | ||
But at least it's got the country talking about it, and talking about something other than the third arm.
I think the third arm, like I talk about, is a bit weird. | ||
But the mosquito's a good way to approach this radical technology that you can literally alter the genetic structure of a species to make it so that it's more friendly for humans. | ||
So these people, they buy the kit and they go, alright, I'm going to get some photosynthesis going on my arms so that the sun will re-energize me, or not re-energize me, it'll feed me. | ||
Essentially, yeah. | ||
So the sun's going to feed me. | ||
I mean, this sounds like the beginning of a Marvel comic book, right? | ||
Where the guy then turns into a tree and-- | ||
It is. | ||
And a tree man is now-- | ||
Of course, of course. | ||
And we're entering that age. | ||
I mean, it gets even weirder. | ||
I've even heard of a person trying to grow a second penis, for example. | ||
I mean, it can go really crazy what people wanna do. | ||
And I think though-- | ||
So is that the problem really, when you combine this with politics, that someone would watch, let's say you were polling at 10% right now, right? | ||
Like it was actually making a dent in that way. | ||
The takeaway from this interview would be someone would get you saying second penis. | ||
That would be the thing. | ||
And then they would go, see, he's some kook scientist, saying all this crazy stuff. | ||
And that is sort of the big flaw of all of this, trying to combine cutting-edge science with our antiquated political and media systems. | ||
I know. | ||
And if I do this again in 2020, and I'm running for a bigger party and I have a better shot at it, I won't be saying things like that. | ||
unidentified
|
I have the luxury of a few more interviews where I can say... Yeah, you might regret this then, but yeah. | |
No, you know, but I think we have to take a bigger step. | ||
I think in four years, this technology we're talking about, it's not going to be some far-out science fiction thing. | ||
It's going to be something everybody's already heard of. | ||
We're already going to be using CRISPR technology to cure cancer. | ||
We already have cured some cancers, or at least some people, with it. | ||
Because you can go into their genetic structure and change things. | ||
It's just like coding. | ||
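Purely as an illustrative sketch of the "it's just like coding" analogy above, and not anything described in the conversation: gene editing is often pictured as a targeted find-and-replace on a long string of A/C/G/T letters. The function name and the toy sequence below are made up for illustration; real CRISPR editing is vastly more complex.

```python
# Illustrative analogy only: gene editing pictured as a targeted find-and-replace
# on a DNA string. Real CRISPR involves guide RNA, Cas9 cutting, and cellular
# repair; this is just the "coding" metaphor from the conversation.

def edit_sequence(genome: str, target: str, replacement: str) -> str:
    """Replace the first occurrence of `target` with `replacement`, the way
    CRISPR is often described as 'finding' a sequence and 'rewriting' it."""
    index = genome.find(target)          # the "guide" step: locate the target
    if index == -1:
        return genome                    # target absent: nothing to edit
    return genome[:index] + replacement + genome[index + len(target):]

# Hypothetical toy sequence, not a real gene.
genome = "ATGGCCATTGTAATGGGCCGC"
edited = edit_sequence(genome, target="ATTGTA", replacement="ATTGCA")
print(edited)  # ATGGCCATTGCAATGGGCCGC
```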
And I think at some point people may have to grow up with how fast technology is changing around America and not say, oh, I'm taking a sound bite just so I can sell ads for CNN or Fox or whatever. | ||
And I'm going to actually hear what this person has to say. | ||
Because if they did, they'd say, wow, this guy doesn't want to do that. | ||
But what he's saying is, isn't it amazing what we can actually do with this new technology? | ||
And the thing is, whether they like it or believe in it or not, it's happening. | ||
So you either get on the train or... And it's happening so quickly. | ||
You know, CRISPR came out like 18 months ago, and now we're already talking about eliminating, for example, the mosquito. | ||
I mean, that's 750,000 deaths a year, again. | ||
We're talking about a major humanitarian push for a science that's about 18 months old. | ||
Like unheard of. | ||
We could have gotten major views if you had brought one of those kits and done a little something on me right there. | ||
That would have been something. | ||
What would you say to the people, and I think this is usually people coming from a religious perspective, who would just say don't tamper with, you know, it's really the God versus science argument, don't tamper with our God-given whatever. | ||
Now I get it, you're not a believer, but what do you think is the best way to argue that point for someone that believes that humbly and truly? | ||
You know, I was actually raised a Catholic and have studied the Bible extensively, and I have a degree in philosophy, which is actually partially religion as well. | ||
And I would try to argue from the biblical point of view, which said, you know, God is giving us these technologies. | ||
God, if you believe in God, is giving us the ability to do these things to ourselves. | ||
I think it's something that if we can do it, it was destined by that higher power. | ||
And so I say, you know, why should you be afraid of technology that is part of God's world? | ||
Embrace it. | ||
It's just part of that system. | ||
And actually, this works really well. | ||
I mean, it doesn't work very well when I talk about the chip, because everyone thinks, oh, it's Mark of the Beast, and this and that, and I get the anti-Christ comments so often. | ||
But I think when you're talking about CRISPR genetic technology, if you say to somebody, I'm going to cure your child of leukemia with this technology, and that was God's plan all along to make your life better. | ||
Then I think a lot of people say, you know what, I'm willing to accept that. | ||
I think if you talk about a fourth arm or some kind of horse body, that's when it goes too far. | ||
But if you can use this stuff for the greater good, most people, it doesn't matter how fundamentalist they are, especially when it concerns the health of loved ones, are going to be on board. | ||
And that's good. | ||
Yeah, does that show you how sort of our brains and our hearts are sometimes just not connected? | ||
Which CRISPR is probably working on combining. | ||
Because sometimes I'll see some of these politicians that are totally against a woman's right to choose. | ||
And I am not a lover of abortion, but I would allow the woman to choose what she wants to do. | ||
But they'll be against abortion, and then at the same time, it's like, you know, at the end of the day, if their own daughter got raped, they would figure out a way for their daughter to get an abortion. | ||
So there's a disconnect between what's going on intellectually and what's going on in the heart. | ||
And a lot of what you're talking about I think is combining those two things. | ||
A hundred percent. | ||
And I think, you know, what's happening is we are, as a species, improving our ability to reason. | ||
And that's something that is happening. | ||
We are actually... Are we? | ||
This election year I happen to... Okay, the election year aside. | ||
No, but we are slowly becoming a less religious society and a more reasonable society, I do believe. | ||
And I think when you look at some of the statistics, science and technology are infiltrating our lives enough to make us say, you know, we really can't believe anymore in Noah's Ark, because that just would have been sort of impossible. | ||
I know some people still do, but broadly speaking, most Christians wouldn't say, you know, firmly say, I believe in that anymore. | ||
They still may firmly believe in Jesus and whatnot. | ||
So we are growing, and I think the idea is to continue down that path. | ||
And hope that one day we come to a society that is much more spiritually based, because I like spirituality. | ||
I actually think spirituality is great, without the fundamentalism. | ||
And that way we can see more clearly. | ||
We can have a reason-based spirituality that still incorporates all the elements of faith and whatever, without the ones that kind of create all the chaos in the world. | ||
And that disconnect. | ||
Yeah, it sounds a lot like the book by Sam Harris, who's one of my favorites. | ||
Oh, the book's right there! | ||
Sam Harris Waking Up, A Guide to Spirituality. | ||
I quote it often. | ||
Oh, there you go, A Guide to Spirituality Without Religion. | ||
So have a spiritual life. | ||
You're not against that, for all the logic that you believe in, as long as it's not attached to the dogmatic stuff that can't be proven. | ||
So my last question to you, I think, is the big one about all this. | ||
And I'll slightly quote one of my favorite movies, Contact. | ||
Matthew McConaughey plays a preacher in it, and they're trying to decide whether they're gonna go to, you know, Vega, the distant star, or not, and he's talking about technology, and he was really fighting against it. | ||
He was saying, you know, is all of this stuff making us happier? | ||
At the end of the day, you can eat quicker, and you can talk to someone on the phone, and all this stuff, but is it making us happier? | ||
Do you think that at the end of the day, that our net happiness is actually any higher or lower than it was 200 years ago? | ||
So, you know, I'm a tough person to ask that question to because I just don't put that much, I guess, credibility in happiness. | ||
I put it much more in satisfaction with something else, with what I make of my life. | ||
I really like work. | ||
Work for me is everything. | ||
You know, the passion of what I create with my hands, my career, those are the ways that I satisfy myself. | ||
Happiness is such, for me, a very nebulous term. | ||
Having studied philosophy, there's this great question that says, "If you could be made 100% perfectly happy and put in a box and shot into space for eternity, would you do it?" | ||
And almost nobody says yes to that. | ||
Nobody wants to be perfectly happy. | ||
Because it goes against the core of the value systems that we like to have, which is that there's got to be a bit of pain and suffering and all of this built into the system. | ||
But there are other avenues that we do pursue, like satisfaction with life and with our careers and families and relationships. | ||
Whether we're happy or not isn't the most important question. | ||
I would say we're probably not. | ||
We're probably less happy than we've been before. | ||
Not dramatically so, but I mean, who isn't? | ||
Who, at least in America, doesn't work a gazillion million hours? | ||
I have mortgages and this and that, and the phone never stops going off. | ||
I have kids and diapers to change. | ||
I'm overwhelmed, to be honest with you. | ||
So happiness is not what I'm after. | ||
At the end of the day, I do believe something's being accomplished, and that's a very deep sense of satisfaction. | ||
And I think that's what technology is also doing to us. | ||
We're evolving so far that we're realizing, wow, we are actually becoming more than human. | ||
We're becoming almost godlike. | ||
We're becoming these amazing entities that can not just discover the stars, but maybe even one day merge with the stars, become AI that spreads through the universe. | ||
And that is something that I think is incredibly enticing, and it really speaks to my heart about what's possible for the human being. | ||
Yeah, well I have to tell you, I thoroughly love this conversation because it's such a synthesis of the things that I love most, which are all this science stuff and future stuff and the political part. | ||
I like the science stuff a little more than the political part. | ||
What do you say we do this every two years? | ||
We'll do this. | ||
I assume we can live forever at this point, right? | ||
I think so. | ||
You'll hook me up with some of that stuff. | ||
I'll hook you up. | ||
I'll send you the tech. | ||
Yeah, yeah. | ||
So we'll do this every two years for say a hundred years and see where we're at. | ||
Absolutely. | ||
And go from there. | ||
One day, you know, then we'll do like a little, you know, montage and there'll be like the robot Zolt eventually. | ||
And then one day it's just the little computer in front of you that doesn't even have the body. | ||
I assume we'll just be holograms at one point on different planets. | ||
Yeah, yeah. | ||
No, I would love that. | ||
Yeah, no, that sounds great. | ||
unidentified
|
All right. | |
Very cool. |