Speaker | Time | Text |
---|---|---|
unidentified | | - Joe Rogan podcast, check it out. - The Joe Rogan experience. | |
- Train by day, Joe Rogan podcast by night, all day. - We're gonna just sit here and talk for a long time. | ||
Yeah, we already started right now. | ||
We already started. | ||
It has begun. | ||
Yes. | ||
What was your question, though? | ||
I was gonna ask, you know, like, what if something comes up, you know? | ||
Like what? | ||
You know, you need to, like, pee or something. | ||
Oh, you can totally do that. | ||
Yeah, we'll just pause and just run out and pee. | ||
That happens. | ||
Don't sweat it. | ||
I want you to be comfortable. | ||
Have you ever done a podcast before? | ||
First time. | ||
Really? | ||
First time. | ||
So tell me how, where Signal came from. | ||
What was the impetus? | ||
What was, how did it get started? | ||
It's a long story. | ||
Sorry, we got time. | ||
We got plenty of time. | ||
We got time. | ||
Okay, well, you know, I think ultimately what we're trying to do with Signal is stop mass surveillance to bring some normality to the internet and to explore a different way of developing technology that might ultimately serve all of us better. | ||
We should tell people, maybe people just tuning in, Signal is an app that is... | ||
Explain how it works and what it does. | ||
I use it. | ||
It's a messaging app. | ||
It's a messaging app, yeah. | ||
Fundamentally, it's just a messaging app. | ||
unidentified | | Yes. | |
Explain... | ||
Lofty aspirations. | ||
Yeah. | ||
Yeah, it's a messaging app, but it's somewhat different from the way the rest of technology works because it is encrypted. | ||
So... | ||
Typically, if you want to send somebody a message, I think most people's expectation is that when they write a message and they press send, that the people who can see that message are the person who wrote the message and the intended recipient. | ||
But that's not actually the case. | ||
There's tons of people who are in between, who are monitoring these things, who are collecting data information. | ||
And Signal's different because we've designed it so that we don't have access to that information. | ||
So when you send an SMS, that is the least secure of all messages. | ||
So if you have an Android phone and you use a standard messaging app and you send a message to one of your friends, that is the least secure of all when it comes to security, right? | ||
Yeah, it's a low bar. | ||
That's the low bar. | ||
And then iPhone, what is this? | ||
Signal. | ||
Oh, there you go. | ||
So iPhones use iMessage, which is slightly more secure, but it gets uploaded to the cloud, and it's a part of their iCloud service, so it goes to some servers and then goes to the other person. | ||
It's encrypted along the way, but it's still, it can be intercepted. | ||
Yeah, I mean, okay, so there's... | ||
Like Jeff Bezos' situation. | ||
Yeah, like Jeff Bezos' situation, exactly. | ||
Fundamentally, there's two ways to think about security. | ||
One is computer security, this idea that we'll somehow make computers secure. | ||
We'll put information on the computers, and then we'll prevent other people from accessing those computers. | ||
And that is a losing strategy that people have been losing for 30 years. | ||
Information ends up on a computer somewhere, and it ends up compromised in the end. | ||
The other way to think about security is information security, where you secure the information itself, that you don't have to worry about the security of the computers. | ||
You could have some computers in the cloud somewhere, information's flowing through them, and people can compromise those things and it doesn't really matter because the information itself is encrypted. | ||
And so, you know, things like SMS, you know, the iMessage cloud backups, most other messengers, Facebook Messenger, all that stuff, you know, they're relying on this computer security model. And that ends up disappointing people in the end. | ||
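[Editor's note: the information-security model described above — where the endpoints hold the keys and a compromised relay server learns nothing — can be sketched in a few lines. This is emphatically not the actual Signal Protocol (which adds authentication, forward secrecy via key ratcheting, and much more); it is just a toy Diffie-Hellman key agreement plus a hash-based stream cipher, with a deliberately small demo prime, to make the idea concrete.]

```python
import hashlib
import secrets

# Toy sketch of "information security": the two endpoints agree on a key,
# and the server relaying messages only ever sees ciphertext.
# NOT the Signal Protocol -- illustration only, not safe for real use.

P = 2**127 - 1   # small demo prime (real systems use standardized 2048-bit+ groups)
G = 3            # demo generator

def keypair():
    """Each party keeps `priv` secret and publishes `pub`."""
    priv = secrets.randbelow(P - 2) + 1
    pub = pow(G, priv, P)
    return priv, pub

def shared_key(my_priv, their_pub):
    """Both sides compute G^(ab) mod P and hash it into a 32-byte key."""
    return hashlib.sha256(str(pow(their_pub, my_priv, P)).encode()).digest()

def xor_stream(key, data):
    """Hash-counter keystream XORed with data; the same call encrypts and decrypts."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# Alice and Bob exchange only public values; the "server" relays ciphertext.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

key = shared_key(a_priv, b_pub)
assert key == shared_key(b_priv, a_pub)  # both sides derive the same key

ciphertext = xor_stream(key, b"meet at noon")   # all the server ever sees
assert xor_stream(key, ciphertext) == b"meet at noon"
```

The point of the sketch is the asymmetry: someone who compromises the relay gets `ciphertext` but neither private key, so in this model the security of the computers in the middle stops mattering.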
Why did you guys create it? | ||
What was unsatisfactory about the other options that were available? | ||
Well, because the way the internet works today is insane. | ||
Fundamentally, I feel like private communication is important because I think that change happens in private. | ||
Everything that is fundamentally decent today started out as something that was a socially unacceptable idea at the time. | ||
You look at things like abolition of slavery, legalization of marijuana, legalization of same-sex marriage, even constructing the Declaration of Independence. | ||
Those are all things that required a space for people to process ideas outside the context of everyday life. | ||
Those spaces don't exist on the internet today. | ||
I think it's kind of crazy the way the internet works today. | ||
If you imagined, you know, every moment that you were talking to somebody in real life, there was somebody there just with a clipboard, a stranger, taking notes about what you said. | ||
That would change the character of your conversations. | ||
And I think that in some ways, like, we're living through a shortage of brave or bold or courageous ideas, in part because people don't have the space to process what's happening in their lives outside of the context of everyday interactions, you know? | ||
That's a really good way to put it, because you've got to give people a chance to think things through. | ||
But if you do that publicly, they're not going to. | ||
They're going to sort of like basically what you see on Twitter. | ||
If you stray from what is considered to be the acceptable norm or the current ideology or whatever opinions you're supposed to have on a certain subject, you get attacked, ruthlessly so. | ||
So you see a lot of self-censorship, and you also see a lot of virtue signaling, where people sort of pretend that they espouse a certain series of ideas because that'll get them some social cred. | ||
Yeah, exactly. | ||
I think that communication in those environments is performative. | ||
You're either performing for an angry mob, you're performing for advertisers, you're performing for the governments that are watching. | ||
And I think also the ideas that make it through are kind of tainted as a result. | ||
Did you watch any of the online hearing stuff that was happening over COVID? You know, where city councils and stuff were having their hearings online? | ||
No, I did not. | ||
It was kind of interesting to me because it's like, you know, they can't meet in person, so they're doing it online. | ||
And that means that the public comment period was also online, you know? | ||
And so it used to be that, like, you know, if you go to a city council meeting, they have a period of public comment where, you know, people can just stand up and say what they think, you know? | ||
And, like, ordinarily, it's like, oh, you got to go to city hall, you got to, like, wait in line, you got to sit there, you know? | ||
But then when it's on Zoom, it's just sort of like anyone can just show up on the Zoom thing. | ||
You know, they just dial in and they're just like, here's what I think, you know? | ||
And... | ||
You know, it was kind of interesting because particularly when a lot of the police brutality stuff was happening in Los Angeles, I was watching the city council hearings and people were just calling in, you know, like, fuck you! | ||
I yield the rest of my time, fuck you! | ||
You know, it was just like really brutal and not undeservedly so. | ||
You know, what was interesting to me was just watching the politicians, basically, you know, who just had to sit there, and just, they were just like... | ||
Take it! | ||
And it was just like, you know, you get three minutes, and then there's someone else to get, you know, and they're just like, okay, and now we'll hear from, you know, like... | ||
And, you know, watching that, you sort of realize that it's like, to be a politician, you have to just sort of fundamentally not really care what people think of you, you know? | ||
You have to fundamentally just be comfortable sitting, you know, and having people yell at you, you know, in three minute increments for an hour or whatever, you know. | ||
And so it seems like what we've sort of done is like bred these people who are willing to do that, you know. | ||
And in some ways that's like a useful characteristic, but in other ways that's the characteristic of a psychopath, you know. | ||
Yes. | ||
Yes. | ||
And I think what we're seeing is that that also extends outside of those environments. | ||
To do anything ambitious today requires that you just are comfortable with that kind of feedback. | ||
Like Trump's tweets. | ||
When he tweets, watch what people say. | ||
It's ruthless. | ||
They go crazy. | ||
They go so hard on them. | ||
So I'm assuming he doesn't read them. | ||
I'm assuming he just, or maybe he does and just doesn't say anything, but he doesn't go back and forth with people, at least. | ||
No, but I think, you know, Trump is perfectly capable of just not caring. | ||
You know, just like people, like, you know, Grayson is just like, yeah, whatever, you know, I'm the best, they don't, you know. | ||
And, like, that's, you know, that's politics. | ||
But I think, you know, the danger is when that, you know, to do anything ambitious, you know, outside of politics or whatever, you know, requires that you're capable of just not caring, you know, what people think or whatever, because everything is happening in public. | ||
I think you made a really good point in that change comes from people discussing things privately because you have to be able to take a chance. | ||
You have to be daring and you have to be able to confide in people and you have to be able to say, hey, this is not right and we're going to do something about it. | ||
If you do that publicly, the powers that be that do not want change in any way, shape, or form, they'll come down on you. | ||
This is essentially what Edward Snowden was warning everyone about when he decided to go public with all this NSA information. | ||
He was saying, look, this is not what we signed up for. | ||
Someone's constantly monitoring your emails, constantly listening to phone calls. | ||
This mass surveillance thing is very bad for the culture of free expression, for our ability to have ideas and to be able to share them back and forth and vet them out. | ||
It's very bad. | ||
Yeah. | ||
I think when you look at the history of that kind of surveillance, there are a few interesting inflection points. | ||
At the beginning of the internet as we know it, in the early to mid-90s, there were these DOD efforts to do mass surveillance. | ||
They were sort of open about what they were doing. | ||
One of them was this program called Total Information Awareness. | ||
And they were trying to start this office, I think called the Total Awareness Office or something within the DoD. | ||
And the idea was they're just going to collect information on all Americans and everyone's communication and just stockpile it into these databases and then they would use that to mine those things for information. | ||
It was sort of like their effort to get in on this at the beginning of the information age. | ||
And, you know, it was ridiculous. | ||
You know, it's like they called it Total Information Awareness. | ||
They had a logo that was like, you know, the pyramid with the eye on top of it. | ||
Oh, yeah. | ||
This is their logo. | ||
Oh, God. | ||
The pyramid with the eye, like, casting a beam on the earth. | ||
That bit of Latin there means knowledge is power. | ||
Oh, wow. | ||
And interesting, this program was actually started by John Poindexter, of all people, who was involved in the Iran-Contra stuff, I think. | ||
Really? | ||
Yeah, yeah. | ||
And he, like, went to jail for a second, then was pardoned or something. | ||
So, anyway, you know, they're like... | ||
It's just so fucked up that these people are in charge of anything. | ||
I know, but what's also kind of comical is that they were like, this is what we're going to do. | ||
Look at how crazy this is. | ||
This is our plan. | ||
And people were like, I don't think so. | ||
What year was this? | ||
This was like early, mid-90s. | ||
Look at this. | ||
Authentication, biometric data, face, fingerprints, gait. | ||
Iris, your gait. | ||
So they're going to identify people based on the way they walk? | ||
I guess your gait is that specific? | ||
Yeah. | ||
Then automated virtual data repositories, privacy and security. | ||
This is fascinating. | ||
Because if you look at, I mean, obviously no one thought of cell phones back then. | ||
Exactly, right. | ||
So this is like kind of amateurish, right? | ||
So it's like, they're like, this is what we're going to do, you know? | ||
And people are like, I don't think so. | ||
Even like Congress is like, guys, I don't think we can approve this. | ||
You need a better logo, you know? | ||
Yeah, for sure. | ||
But it's just this whole flow chart. | ||
Is that what this would be? | ||
What do you call something like this? | ||
What is it called? | ||
Flowchart, I guess, sort of. | ||
Designed to dazzle you. | ||
Yeah. | ||
It's like baffling to figure out what it is. | ||
Like, first of all, what are all those little color tubes? | ||
Those little ones? | ||
Those little cylinders? | ||
Those are data silos. | ||
Oh. | ||
That's the universal. | ||
They're all different colors. | ||
There's purple ones. | ||
What's in the purple data? | ||
Well, gait, maybe. | ||
That's where gait lives, yeah. | ||
It's all Prince's information. | ||
Okay, so that, you know, this stuff all sort of got shut down, right? | ||
Yeah. | ||
They're like, okay, we can't do this, you know? | ||
And then instead, what ended up happening was data naturally accumulated in different places. | ||
Back then, what they were trying to do is be like, our proposal is that everyone carry a government-mandated tracking device at all times. | ||
What do you guys think? | ||
It'll make us safer. | ||
And people were like, no, I don't think so. | ||
But instead, everyone ended up just carrying cell phones at all times, which are tracking your location and reporting them into centralized repositories that government has access to. | ||
And so, you know, this sort of like oblique surveillance infrastructure ended up emerging. | ||
And that was what, you know, people sort of knew about, but, you know, didn't really know. | ||
And that's what Snowden revealed. | ||
It was like, you know, we don't have this. | ||
Instead, it's like all of those things are happening naturally, you know. | ||
You know, gait detection, fingerprints, you know, like all this stuff's happening naturally. | ||
It's ending up in these places. | ||
And then... | ||
You know, governments are just going to those places and getting the information. | ||
And then I think, you know, the next inflection point was really Cambridge Analytica. | ||
You know, that was a moment where I think people were like... | ||
Explain that to people, please. | ||
Cambridge Analytica was a firm that was using big data in order to forecast and manipulate people's opinions. | ||
In particular, they were involved in the 2016 election. | ||
It was sort of, you know, so it's like, you know, what Snowden revealed was PRISM, which was the cooperation between the government and these places where data was naturally accumulating, like Facebook, Google, etc., you know, and the phone companies. | ||
And Cambridge Analytica, I think, was the moment that people were like, oh, there's like also sort of like a private version of PRISM, you know, that's like not just governments, but like the data is out there. | ||
And other people who are motivated are using that against us, you know? | ||
And so I think, you know, in the beginning it was sort of like, oh, this could be scary. | ||
And then it was like, oh, but, you know, we're just using these services. | ||
And then people were like, oh, wait, the government is, you know, using the data that we're, you know, sending to these services. | ||
And then people were like, oh, wait, like anybody can use the data against us. | ||
And they were like, oh, you know, it's like, I think things went from like, I don't really have anything to hide to like, wait a second, these people can... | ||
Predict and influence how I'm going to vote based on what kind of jeans I buy? | ||
And then sort of where we are today, where I think people are also beginning to realize that the companies themselves that are doing this kind of data collection are also not necessarily acting in our best interests. | ||
Yeah, for sure. | ||
There's also this weird thing that's happening with these companies that are gathering the data, whether it's Facebook or Google. | ||
I don't think they ever set out to be what they are. | ||
They started out, like Facebook, for example, we were talking about it before. | ||
It was really just sort of like a social networking thing. | ||
And this was in the early days. | ||
It was a business. | ||
I don't think anybody ever thought it was going to be something that influences world elections in a staggering way. | ||
Especially in other parts of the world, where Facebook becomes the sort of de facto messaging app on your phone when you get it. | ||
I mean, it has had massive... | ||
Impact on politics, on shaping culture, on... | ||
I mean, even genocide has been connected to Facebook in certain countries. | ||
You know, it's weird that this thing that is in... | ||
I don't know how many different languages does Facebook operate under? | ||
All of them, yeah. | ||
I mean, that this was just a social app... | ||
It was from Harvard, right? | ||
They were just connecting students together? | ||
Wasn't that initially what the first iteration of it was? | ||
Yeah. | ||
Okay, I mean, I think you can say, like, no one anticipated that these things would be this significant. | ||
But I also think that there's, you know, I think ultimately, like, what we end up seeing again and again is that, like, bad business models produce bad technology, you know? | ||
That, like... | ||
Mark Zuckerberg did not create Facebook because of his deep love of social interactions. | ||
He did not have some deep sense of wanting to connect people and connect the world. | ||
That's not his passion. | ||
Jeff Bezos did not start Amazon because of his deep love of books. | ||
These companies are oriented around profit. | ||
They're trying to make money. | ||
And they're subject to external demands as a result. | ||
They have to grow infinitely, which is insane, but that's the expectation. | ||
And so what we end up seeing is that the technology is not necessarily in our best interest because that's not what it was designed for to begin with. | ||
That is insane that companies are expected to grow infinitely. | ||
What is your expectation? | ||
To take over everything. | ||
To have all the money. | ||
And then more. | ||
Yeah, if we extrapolate, we anticipate we will have all the money. | ||
There will be no other money. | ||
If you keep going, that's what has to happen. | ||
How can you just grow infinitely? | ||
That's bizarre. | ||
Yeah, and that's why, I mean, I think the Silicon Valley obsession with China is a big part of that, where people, they're just like, wow, that's a lot of people there. | ||
Yes, that's a lot of people there. | ||
You can just keep growing. | ||
Yeah, there was a fantastic thing that I was reading this morning. | ||
God, I wish I could remember what the source of it was. | ||
They're essentially talking about how strange it is that there are so many people that are so anti-human trafficking. | ||
They're so pro-human rights. | ||
They're so anti-slavery. | ||
All the powerful values that we ascribe, that we think of when we think of Western civilization, we think of all these beautiful values. | ||
But that almost all of them rely on some form of slavery to get their electronics. | ||
You have eight grams of cobalt in your pocket over there. | ||
Yeah. | ||
Mined by actual child slaves. | ||
Someone had to stick a – like, literally, they're getting it out of the ground, digging into the dirt to get it out of the ground. | ||
We were talking about it on the podcast. | ||
They were like, is there a way that this could – is there a future that you could foresee where you could buy a phone that is guilt-free? | ||
If I buy a pair of shoes, like I bought a pair of boots from my friend Jocko's company. | ||
He's got a company called Origin. | ||
They make handmade boots. | ||
And it's made in a factory in Maine. | ||
You can see a tour of the factory. | ||
These guys are stitching these things together, and it's a real quality boot. | ||
And I'm like, I like that I could buy this. | ||
I know where it came from. | ||
I could see a video of the guys making it. | ||
This is a thing that I could feel like... | ||
I am giving them money. | ||
They're giving me a product. | ||
There's a nice exchange. | ||
It feels good. | ||
I don't feel like that with a phone. | ||
With a phone, I have this bizarre disconnect. | ||
I try to pretend that I'm not buying something that's made in a factory where there's a fucking net around it because so many people jump to their deaths that instead of trying to make things better, they say, we're going to put nets up, catch these fuckers, put them back to work. | ||
Is it possible... | ||
That we would all get together and say, hey, enough of this shit. | ||
Will you make us a goddamn phone that doesn't make me feel like I'm supporting slavery? | ||
Yeah, I mean, I think you're asking... | ||
Too much? | ||
I think you're asking... | ||
I think that's the same as asking, will civilization ever decide that we collectively want to have a sane and sustainable way of living? | ||
Yeah. | ||
Sane and sustainable. | ||
And I hope the answer is yes. | ||
I think a lot of us do. | ||
You do, right? | ||
I do. | ||
You don't want to buy a slave phone, right? | ||
Yeah, I mean, but okay, so, you know, I feel like it's difficult to have this conversation without having a conversation about capitalism, right? | ||
Because, like, ultimately, you know, what we're talking about is, like, externalities, that the prices of things don't incorporate their true cost, you know, that, like, you know, we're destroying the planet for plastic trinkets and reality television, you know, like... | ||
We can have the full conversation if you like. | ||
Let's start with phones, though. | ||
Let's start with... | ||
Because when most people know the actual... | ||
From the origin of the materials, like how they're coming... | ||
How they're getting out of the ground, how they're getting into your phone, how they're getting constructed, how they're getting manufactured and assembled by these poor people... | ||
When most people hear about it, they don't like it. | ||
It makes them very uncomfortable. | ||
But they just sort of go, la la la. | ||
They just plug their ears and keep going and buy the latest iPhone 12 because it's cool. | ||
It's new. | ||
What would they do instead? | ||
Well, if there was an option. | ||
So, like, if you have a car that you know is being made by slaves, or a car that's being made in Detroit by union workers, wouldn't you choose the car, as long as they're both of equal quality? | ||
I think a lot of people would feel good about their choice. | ||
If they could buy something that, well, no, these people are given a very good wage. | ||
They have health insurance and they're taken care of. | ||
They have a pension plan. | ||
There's all these good things that we would like to have ourselves that these workers get. | ||
So you should probably buy that car. | ||
Why isn't there an option like that for a phone? | ||
We looked at this thing called a Fairphone. | ||
We're going over it. | ||
Can't even fucking buy it in America. | ||
Like, no, America has no options for Fairphone. | ||
They only have them in, like, Holland and a couple other European countries. | ||
Yeah. | ||
I mean, I think... | ||
Yeah, maybe it's good to, you know, start with the question of phones. | ||
I think if you really examined, like, most of the things in your everyday life, there is an apocalyptic aspect to them. | ||
Yes. | ||
I mean, you know, even agriculture, you know, it's just like, you know, the sugar you put in your coffee, you know, it's like, I've been to the sugar beet harvest, you know, it's apocalyptic, you know, so I think there's just like an aspect of civilization that we don't usually see or think about. | ||
I mean, conscious capitalism would be the idea that you want to make a profit, but you only want to make a profit if everything works. | ||
Like the idea of me buying my shoes from origin. | ||
Like knowing, okay, these are the guys that make it. | ||
This is how they make it. | ||
This makes me feel good. | ||
I like this. | ||
If there was that with everything... | ||
If you buy a home from a guy who you know built the home, this is the man. | ||
This is the chief construction guy. | ||
These are the carpenters. | ||
This is the architect. | ||
Oh, okay, I get it. | ||
This all makes sense. | ||
Yeah, I mean, and I think that's the image that a lot of companies try to project. | ||
You know what I mean? | ||
Like, you know, even Apple will say, you know, it's like designed by Apple in California. | ||
Sure, designed. | ||
And I think that's the same as like the architect and the builders that you know, you know, but those materials are coming from somewhere. | ||
That's true. | ||
The wood is coming from somewhere. | ||
And it's not just wood. | ||
There's petrochemicals. | ||
That whole supply chain is apocalyptic and you're never going to meet all of those people. | ||
And so I think, sure, they're... | ||
I think it's difficult to be in that market, if you want to be in the market of conscious capitalism or whatever, because it's a market for lemons. | ||
Because it's so easy to just put a green logo on whatever it is that you're creating, and no one will ever see the back of the supply chain. | ||
That's a sad statement about humans. | ||
You know, that we're... | ||
That this is how... | ||
I mean, this is how we always do things if you let us. | ||
If you leave us alone. | ||
If there's a way... | ||
You know, I mean, privacy is so important when it comes to communication with individuals. | ||
And this is why you created Signal. | ||
But when you can sort of hide... | ||
All the various elements that are involved in all these different processes, all these different things that we buy and use. | ||
And then, as you said, they're apocalyptic, which is a great way of describing it. | ||
If you're at the ground watching these kids pull coltan out of the ground in Africa, you'd probably feel really sick about your cell phone. | ||
Yeah, but I don't think... | ||
I think it's a little more complicated than to say that just like humans are terrible or whatever. | ||
No, I don't think humans are terrible. | ||
I think humans are great. | ||
But I think if you put humans together and you give them this diffusion of responsibility that comes from a corporation and then you give them a mandate, you have to make as much money as possible every single year. | ||
And then you have shareholders and you have all these different factors that will allow them to say, well, I just work for the company. | ||
You know, it's not my call. | ||
You know, I just, you know, you got the guy carving up a steak saying, listen, I'm so sorry that we have to use slaves, but look, Apple's worth $5 trillion. | ||
We've done a great job for our shareholders. | ||
Yeah, yeah, yeah. | ||
At the end of the line, follow it all the way down to the beginning, and you literally have slaves. | ||
Yeah, I fundamentally agree, and I think that that's, you know, that's... | ||
Anytime you end up in a situation where, like, most people do not have the agency that they would need in order to direct their life the way that they would want, you know, direct their life so that we're living in a sane and sustainable way, that, yeah, I think is a problem. | ||
And I think that's the situation we're in now, you know. | ||
And honestly, I feel like, you know, the stuff that we were talking about before of, you know, people... | ||
You know, sort of being mean online is a reflection of that. | ||
You know, that That's the only power that people have. | ||
The only thing you can do is call someone a name, you're going to call them a name. | ||
And I think that it's unfortunate, but I think it is also unfortunate that most people have so little agency and control over the way that the world works that that's all they have to do. | ||
And I guess you would say also that the people that do have power, that are running these corporations, don't take into account what it would be like to be the person at the bottom of the line. | ||
To be the person that is... | ||
There's no discussion. | ||
There's no board meetings. | ||
Like, hey guys, what are we doing about slavery? | ||
Well, no, I'm sure that they do talk about that, honestly. | ||
But they've done nothing. | ||
They've probably done what they think is something. | ||
Even the CEO of a company is someone who's just doing their job at the end of the day. | ||
They don't have ultimate control and agency over how it is that a company performs because they are accountable to their shareholders, they're accountable to the board. | ||
I think there is a tendency for people to look at what's happening, particularly with technology today, And think that it's the fault of the people, the leaders of these companies. | ||
I think it goes both ways. | ||
Slavoj Žižek always talks about when you look at the old political speeches, if you look at the fascist leaders, they would give a speech and when there was a moment of applause, they would just sort of stand there and accept the applause because in their ideology, they were responsible for the thing that people were applauding. | ||
And if you watch the old communist leaders, like when Stalin would give a speech and he would say something and there would be a moment of applause, he would also applaud. | ||
Because in their ideology of historical materialism, they were just agents of history. | ||
They were just the tools of the inevitable. | ||
It wasn't them. | ||
You know, they had just sort of been chosen as the agents of this thing that was an inevitable process. | ||
And so they were applauding history, you know. | ||
Sometimes when I see the CEOs of tech companies give speeches and people applaud, I feel like they should also be applauding. | ||
That it's not them. | ||
Technology has its own agency, its own force that they're the tools of, in a way. | ||
That's a very interesting way of looking at it. | ||
Yeah, they are the tools of it. | ||
And at this point, if we look at where we are in 2020, it seems inevitable. | ||
It seems like there's just this unstoppable amount of momentum behind innovation and behind just the process of Creating newer, better technology and constantly putting it out and then dealing with the demand for that newer, better technology and then competing with all the other people that are also putting out newer, better technology. | ||
Look what we're doing. | ||
We are helping the demise of human beings. | ||
Because I feel, and I've said this multiple times and I'm going to say it again, I think that we are the electronic caterpillar that will give way to the butterfly. | ||
We don't know what we're doing. | ||
We are putting together something that's going to take over. | ||
We're putting together some ultimate being, some symbiotic connection between humans and technology, or literally an artificial version of life, not even artificial, a version of life constructed with silicon and wires and things that we're making. | ||
If we keep going the way we're going, we're going to come up with a technology that I think we're a ways away. | ||
Yeah, we're a ways away, but how many ways? | ||
50 years? | ||
The moment that I can put my hand under the automatic sink thing and have the soap come out without waving around, then I'll be worried. | ||
That's simplistic, sir. | ||
How dare you? | ||
Here's a good example. | ||
The Turing test is if someone sat down with, like in Ex Machina, remember, it was one of my all-time favorite movies, where the coder is brought in to talk to the woman, and he falls in love with the robot lady, and she passes the Turing test, because he's in love with her. | ||
I mean, he really can't differentiate, in his mind, that is a woman, that's not a robot. | ||
Was it Alan Turing? | ||
What was the gentleman's name? | ||
Alan Turing. | ||
Alan Turing, that came up with the Turing test. | ||
You know, he was a gay man in England in the 1950s when it was illegal to be gay. | ||
And they chemically castrated him because of that. | ||
And he wound up killing himself. | ||
That's only 70 years ago. | ||
Oh yeah, yeah. | ||
It's fucking insane. | ||
I mean, just think that this man back then was thinking there's going to be a time where we will have some kind of a creation where we imitate life, the current life that we're aware of, where we're going to make a version of it that's going to be indistinguishable from the versions that are biological. | ||
That very guy, by whatever twisted ideas of what human beings should or shouldn't do, whatever expectations of culture at the time, is forced to be chemically castrated and winds up committing suicide. | ||
Just by the hand of humans. | ||
Fucking strange, man. | ||
Like, really strange. | ||
I mean... | ||
Worse than strange. | ||
Oh, yes. | ||
Horrible. | ||
But I mean, so bizarre that this is the guy that comes up with the test of how do we know when something is... | ||
When it passes, when you have an artificial person that passes for a person, and then what kind of rights do we give this person? | ||
What is this? | ||
What is it? | ||
If it has emotions, what if it cries? | ||
Are you allowed to kick it? | ||
You know, like, what do you do? | ||
Like, that's—but I made it. | ||
I turned it on. | ||
I could fucking torture it. | ||
But you can't. | ||
It's screaming. | ||
It's in agony. | ||
Don't do that. | ||
Yeah. | ||
I mean, you know, I don't think about this stuff that often, but it is, you know, it's an empirical test, right? | ||
So it's like, it's a way to avoid having to define what consciousness is, right? | ||
Which is kind of strange. | ||
We're conscious beings and we don't actually really even know what that means. | ||
Right. | ||
And so instead we have this empirical test where it's just sort of like, well, if you can't tell the difference without being able to see it, then we'll just call that. | ||
I think that is really a lot closer than we think. | ||
I think that's 50 years. | ||
I think that if everything goes well, I think I'm going to be a 103-year-old man on my dying bed being taken care of by robots. | ||
And I'm going to feel real fucked up about that. | ||
I'm going to be like, oh my god. | ||
I can't believe this. | ||
I'm gonna leave and then all the people that I knew that are alive, they're the last of the people. | ||
This is it. | ||
The robots are gonna take over. | ||
They're not even gonna be robots. | ||
They're gonna come up with some cool name for them. | ||
Yeah, I mean, most of what I see in the artificial intelligence world right now is not really intelligence, you know, it's just matching. You show a model 10 million images of cats, and then you can show it an image, and it will be like, I predict that this is a cat. | ||
And then you can show it an image of a truck, and it'll be like, I predict that this is not a cat. | ||
I think there's one way of looking at it that's like, well, you just do that with enough things enough times, and that's what intelligence is. | ||
But I kind of hope not. | ||
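The "matching" idea described above can be sketched as a toy nearest-neighbor classifier. This is purely illustrative: the two-number "feature vectors" and labels below are made up, and real image models learn their features from data rather than comparing raw examples like this.

```python
# Toy sketch of classification-as-matching: label a new sample by
# finding the closest previously seen example and copying its label.
# Feature vectors here are invented for illustration, not real image data.

def distance(a, b):
    # Squared Euclidean distance between two feature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b))

def predict(sample, labeled_examples):
    # Pick the label of the nearest known example.
    features, label = min(labeled_examples, key=lambda ex: distance(sample, ex[0]))
    return label

examples = [
    ((0.9, 0.8), "cat"),
    ((0.8, 0.9), "cat"),
    ((0.1, 0.2), "not a cat"),
]

print(predict((0.85, 0.85), examples))  # lands near the "cat" examples
print(predict((0.05, 0.10), examples))  # lands near the "not a cat" example
```

With enough examples and richer features, this kind of matching starts to look like recognition, which is exactly the "is that all intelligence is?" question being raised here.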
The way that it's being approached right now, I think, is also dangerous in a lot of ways, because what we're doing is just feeding information about the world into these models, and that just encodes the existing biases and problems with the world into the things that we're creating. | ||
That, I think, has negative results. | ||
But it's true. | ||
This ecosystem is moving and it's advancing. | ||
The thing that I think is unfortunate is that right now, that ecosystem, this really capital-driven investment startup ecosystem, has a monopoly on groups of young people trying to do something ambitious together in the world. | ||
In the same way that I think it's unfortunate that grad school has a monopoly on groups of people learning things together. | ||
Part of what we're trying to do different with Signal is it's a non-profit because we want to be for something other than profit. | ||
We're trying to explore a different way of groups of people doing something mildly ambitious. | ||
Has anyone come along and go, I know it's a non-profit, but would you like to sell? | ||
Well, you can't do that. | ||
There's nothing to sell. | ||
It's kind of amazing, though, that you guys have figured out a way to create, like, basically a better version of iMessage that you could use on Android. | ||
Because one of the big complaints about Android is the lack of any encrypted messaging services. | ||
Or just good messaging services. | ||
Yeah, they've just recently come out with their own version of iMessage, but it kind of sucks. | ||
You can't do group chats. | ||
There's a lot of things you can't do with it, and it's encrypted. | ||
I don't think it's rolled out everywhere, too, right? | ||
It's not everywhere. | ||
I don't think it's rolled out at all, actually. | ||
Oh, you could get a beta? | ||
Is that what it is? | ||
Yeah, I don't know what the... | ||
Right, so it's like, you know, Android... | ||
So Google, for Android, makes an app called Messages, which is just the standard SMS texting app. | ||
And they put that on the phones that they make, like the Pixel and stuff like that, you know. | ||
And then there's the rest of the ecosystem. | ||
You know, there's like, you know, Samsung devices, Huawei devices, you know, all this stuff. | ||
And it's sort of... | ||
It depends, you know, what's on those things. | ||
And... | ||
So, they've been trying to move from this very old standard called SMS that you mentioned before to this newer thing called RCS, which actually I don't know what that stands for. | ||
I think in my mind I always think of it as standing for too little too late. | ||
But they're trying to move to that. | ||
So they're doing that on the part of the ecosystem that they control, which is the devices that they make and sell. | ||
And they're trying to get other people on board as well. | ||
Originally, RCS didn't have any facility for end-to-end encryption. | ||
And they're actually using our stuff, the Signal Protocol, in the new version of RCS that they're shipping. | ||
So I think they've announced that, but I don't know if it's on or not. | ||
I have two bones to pick with you guys. | ||
Two things that I don't necessarily like. | ||
One, when I downloaded Signal and I joined, basically everyone that I'm friends with who was also on Signal got a message that I'm on Signal. | ||
So you ratted me out. | ||
You ratted me out to all these people that are in my contact list. | ||
Why do you want it to be difficult for people to communicate with you privately? | ||
Well, me personally, because there's a lot of people that have my phone number that I wish didn't have my phone number. | ||
And now all of a sudden they got a message from me that I'm on Signal. | ||
And then they send me a message. | ||
Hey, I'd like this from you. | ||
I want you to do that for me. | ||
How about call me about this? | ||
I got a project. | ||
So I just wish you didn't rat me out. | ||
I wish there was a way that you could say, do you want everyone to know that you just joined Signal? | ||
Yes or no? | ||
I'd say no! | ||
Another one. | ||
Those little dot dot dots, the ellipsis. | ||
Yeah. | ||
Can you shut that off? | ||
Because I don't want anybody to know that I'm responding to a text. | ||
You can turn it off. | ||
Can you turn that off? | ||
Oh, okay. | ||
So it's in the settings? | ||
Yeah, privacy settings. | ||
Typing indicators, you can turn it off. | ||
Read receipts, you can turn it off. | ||
That's a big problem with iMessage. | ||
People get mad at you. | ||
They see the dot, dot, dots, and then there's no message. | ||
Like, hey, you were going to respond, and then you didn't. | ||
Why don't you just relax? | ||
Just go about your life and pretend that I didn't text you back yet. | ||
Because I will. | ||
But it's not like the dot, dot, dots. | ||
People are like, oh, it's coming. | ||
Here comes the message. | ||
And then there's no message! | ||
Yeah, you can turn that off. | ||
You can also turn off read receipts so people don't even know if you've read their message. | ||
Yes, that's good, too. | ||
Yeah. | ||
My friend Sagar has it set up so that if he texts you, you have 30 minutes, bitch, and then they all disappear. | ||
All the messages disappear. | ||
Oh, oh, they disappear. | ||
Yeah, yeah, yeah. | ||
That's kind of a sweet move. | ||
I like that. | ||
With the discovery question of you don't want people to know that you're on Signal, it's kind of... So, we're working on it, but it's a more difficult problem than you might imagine, because you want some people to know that you're on... | ||
I'll text them! | ||
So you want nobody to know? | ||
Well, me personally, I have a unique set of problems that comes with anything that I do, like with messaging and stuff. | ||
I change my number once a year, and I have multiple phone numbers. | ||
I got a lot of problems. | ||
But this is a unique problem with me. | ||
All of a sudden, I'm like, how the fuck does he know? | ||
And then I had to ask someone. | ||
They go, oh no, when you sign up, it sends everybody on your contact list that's on Signal a message that says you're on Signal. | ||
I'm like, oh! | ||
Well, we don't send that, actually. | ||
I know you don't care, but we don't actually know who your contacts are. | ||
Signal does, though. | ||
The app does. | ||
The app on your phone does, and it doesn't even send a message to those people. | ||
It's just that those people know your phone number, and that app now knows that that phone number is on Signal. | ||
Did you do that just to get more people to use Signal? | ||
Why, when you sign up for Signal, does it send all the other people in your contact list on Signal a message? | ||
A lot of people like it. | ||
So a lot of people like knowing who they can communicate with. | ||
And the other thing is we try to square the actual technology with the way that it appears to work to people. | ||
So right now, with most technology, it seems like you send a message and the person who can see it is the person who received the message. | ||
You sent the message to the intended recipient, you know? | ||
And that's not how it actually works. | ||
And so, like, a lot of what we're trying to do is actually just square the way the technology actually works with what it is that people perceive. | ||
And so, like, fundamentally, right now, you know, Signal is based on phone numbers. | ||
If you register with your phone number, like, people are going to know that they can contact you on Signal. | ||
It's very difficult to make it so that they can't. You know, like, if we didn't do that, they could just hit the compose button and see that they could send you a message. | ||
They would just see you in the list of contacts that they can send messages to. | ||
And then if we didn't display that, they could just try and send you a message and see whether a message goes through. | ||
It's always possible to detect whether you're on Signal, the way that things are currently designed. | ||
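The detectability point being made here can be sketched in a few lines. The server state and probe function below are hypothetical, not Signal's actual API, but any service keyed on phone numbers faces the same issue: a client that knows your number can always probe whether the service considers it registered, even if no "X joined" notice is ever shown.

```python
# Hypothetical sketch: why registration on a phone-number-keyed service
# is hard to hide. `registered` stands in for server-side state; the
# probe stands in for "try to send a message and see if it goes through".

registered = {"+15551230001", "+15551230002"}  # hypothetical server state

def can_deliver(number: str) -> bool:
    # Stand-in for requesting a recipient's keys or attempting delivery
    # and observing whether it succeeds.
    return number in registered

# Even with every notification suppressed, anyone holding a number
# can run this probe themselves:
print(can_deliver("+15551230001"))  # registered number looks reachable
print(can_deliver("+15559999999"))  # unregistered number does not
```

This is why suppressing the contact list entry or the compose button, as described above, only hides the signal from honest clients; the probe itself still reveals registration.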
It's interesting also how it works so much differently with Android than it does with iMessage. | ||
With Android, it'll also send an SMS. I noticed that I can use Signal as my main messaging app on Android. | ||
And it'll send SMS or it'll send a Signal message. | ||
It doesn't do that with iPhones. | ||
Yeah, Apple doesn't let you. | ||
Yeah, I found that pretty interesting. | ||
Because I tried to send people messages. | ||
I thought it would just send it as an SMS and it didn't. | ||
We would if we could, but Apple doesn't allow it. | ||
It doesn't allow it. | ||
Interesting. | ||
Because Apple's scared of you. | ||
Say it! | ||
Say it! | ||
They're fucking scared! | ||
No, I mean... | ||
They should be. | ||
Apple is... | ||
It's a better version of what they've got. | ||
How about that? | ||
I agree, but yeah, I mean, they have a much more complicated answer, but maybe you can distill it down to them. | ||
You guys need to just develop your own version of AirDrop, and then no one will need Apple ever again. | ||
That's what's holding people back, like a universal airdrop. | ||
Airdrop keeps a lot of fucking people on Apple. | ||
It's the best. | ||
You make a video, like a long video, a couple minutes long, and you can just airdrop it to me. | ||
Whereas if you text it to me, especially if I have an Android phone, oh, it becomes this disgusting version. | ||
It'll downsample it. | ||
It looks terrible. | ||
Yeah, no, that's true. | ||
That's true. | ||
Yeah, photographs are not too bad. | ||
I think it down-samples photographs as well, but not too bad. | ||
It's like, you could look at it, it looks like a good photograph. | ||
But video is just god-awful. | ||
It's embarrassing when someone sends you a video and you have it on an Android phone. | ||
You're like, what the fuck did you send me? | ||
This is terrible. | ||
What did you take this with? | ||
A flip phone from the 90s? | ||
It's so bad. | ||
But I mean, a lot of that is like, I think the reason why it is that way is kind of interesting to me, which is, you know, it's like these are protocol, you know, it's like when you're just using a normal SMS message on Android, you know, that was like this... | ||
agreement that phone carriers made with each other in like, you know... | ||
2002? | ||
No, before that. | ||
Really? | ||
96? | ||
Yeah, exactly. | ||
And then they've been unable to change the way that it works since then because you have to get everyone to agree. | ||
Right. | ||
And is Apple holding back some sort of a universal standard? | ||
Because if they did have a universal standard, then everyone would have this option to use. | ||
You could use a Samsung phone or a Google phone. | ||
You could use anything, and everybody would be able to message you clearly, without a problem. | ||
Like, one of the things that holds people back is if you switch from an iPhone to an Android phone, you lose all those iMessages. | ||
Sure, sure, sure. | ||
Yeah. | ||
Yeah, they're probably doing that intentionally because they... | ||
Fucking weasels. | ||
Don't they have enough money? | ||
Like, Jesus Christ. | ||
There's never enough. | ||
That's the problem. | ||
That is the problem, right? | ||
Yeah. | ||
And I think, I mean, it's like, I think the thing that everyone's worried about right now with Apple is like, you know, Apple, you know what I said before of like bad business models produce bad technology. | ||
You know, thus far, Apple's business model is much better than, you know, Google or Facebook or Amazon. Their business is predicated on selling phones, selling hardware. | ||
And that means that they can think a little bit more thoughtfully about the way that their software works than other people. | ||
And I think what people are concerned about is that that business model is going to change. | ||
They're approaching an asymptote of how many phones they can sell. | ||
And so now they're looking at software. | ||
They're like, what if we had our own search engine? | ||
What if we had our own thing? | ||
And the moment that that starts to happen, then they're sort of moving in the direction of the rest of big tech. | ||
Which, you know, who knows how they do it, but that's what I think people are concerned about. | ||
They've done a better job at protecting your privacy, though, in terms of, like, particularly Apple Maps. | ||
Like, their Map app is far superior in terms of sharing your information than, say, like, the Google Maps. | ||
But the argument you could make is that Google Maps is a superior product because they share that information. | ||
Google Maps is also Waze now, right? | ||
They bought Waze, which is fantastic. | ||
It lets you know where the cops are, there's an accident up ahead, all kinds of shit, right? | ||
But Apple Maps is not that good. | ||
I use it because I like the ethic behind it. | ||
I like their idea behind it. | ||
They delete all the information after you make... | ||
If you go to a destination, it's not saving it, sending it to a server, and making sure it knows what was there and what wasn't there and how well you traveled and sharing information. | ||
They're not doing that. | ||
They're not sharing your information. | ||
unidentified
|
Right? | |
We don't know. | ||
I'm sure that they have a policy. | ||
I haven't read the policy, and maybe the policy says that. | ||
Supposedly. | ||
You're still in the world of trying to make computers secure. | ||
There's probably data, the data is probably accumulating somewhere, and maybe people can compromise those places. | ||
Yeah. | ||
We don't know. | ||
For sure, the intent behind the software that they have constructed, I think, has been much better than a lot of the other players in Big Tech. | ||
I think the concern is just that as that software becomes a larger part of their bottom line, that that might change. | ||
I wonder if they can figure out a way to have an I don't give a fuck phone or I care phone. | ||
Like, you want to have an I don't give a fuck phone? | ||
This phone is like, who knows what's making it? | ||
But look, it's really good. | ||
It's got a 100 megapixel camera and all this jazz. | ||
And a 5,000 milliamp battery. | ||
And then you've got an I care phone. | ||
unidentified
|
And the I care phone, it's like an iPhone X. But what's different about the iCareFone? | |
The iCareFone, you get a clear line of distinction. | ||
You get a real clear path. | ||
This is where we got our materials. | ||
These are the people that are making it. | ||
This is how much they're getting paid. | ||
Everyone is unionized. | ||
They're all getting healthcare. | ||
This is... | ||
They'll have 401k plans. | ||
It costs a little bit more. | ||
It's not as good. | ||
If you truly encapsulated all of the social costs with producing that phone, I think it would cost more than a little bit more. | ||
How much more do you think it would cost? | ||
I think some astronomical number. | ||
I'm sure Apple would prefer not to have child slaves mining cobalt for the batteries that are in their phone. | ||
Is that a thing you can say when a company is worth as much as most countries? | ||
They have so much cash. | ||
Can you really say that they would rather not use slaves? | ||
Can you imagine? | ||
I don't want to go broke. | ||
I only have $14 trillion. | ||
What am I going to do? | ||
What am I going to do? | ||
I need slaves. | ||
I need someone to dig the coltan out of the Congo. | ||
What would I do if I was them? | ||
Well, first of all, it could never be them. | ||
It would never work. | ||
But if I was, I would say, hey, why don't we open up a factory in America? | ||
And why don't we... | ||
But you've got to mine the cobalt, and it isn't in America. | ||
Right. | ||
Why don't we get all of our cobalt from recycled phones? | ||
Is that possible? | ||
Who's going to recycle them? | ||
That's a good question. | ||
I think that's what the Fairphone is trying to do, right? | ||
Aren't they using all recycled materials? | ||
unidentified
|
No. | |
Yeah, I mean, I don't... | ||
Any image I've seen of electronic recycling is equally apocalyptic. | ||
You know, there's just piles of shit, like, next to a lake in China where people are... | ||
You're bumming me out, man. | ||
How about we do... | ||
But I think if you were the CEO of Apple and you were like, this is a priority, we're going to spend, you know, however many trillions of dollars it takes to do this... | ||
Your shareholders go, hey, fuckface. | ||
You're fired. | ||
Out! | ||
You would have to be the grand poobah of Apple. | ||
You'd have to be the ultimate ruler. | ||
But it's not like, even then, if you were just like, you know, I'm willing to take the hit, you know, I'm going to do, no one can oust me or whatever. | ||
I'm the grand poobah, you know? | ||
Then it's like your share price plummets, which means that your employee retention plummets because those people are also working for the equity. | ||
Right. | ||
Stock options. | ||
And then they get poached away by these other companies. | ||
Dirty companies come and steal your clean employees. | ||
This is what Apple's website says now. | ||
It says they're committed to one day sourcing 100%. | ||
Look at this. | ||
Completely recycled, every bit as advanced. | ||
One day. | ||
We're committed to one day sourcing. | ||
One day. | ||
We're planning on the year 30,000. | ||
I mean, you know, it's like, I don't... | ||
They're not like sitting around twirling their mustaches. | ||
You know what I mean? | ||
It's just like, everyone likes good things and not bad things. | ||
Maybe they are. | ||
Let me read that again, Jamie. | ||
It says, 100% recyclable and renewable materials across all of our products and packaging because making doesn't have to mean taking from the planet. | ||
Oh, come on. | ||
You guys... | ||
It's like Nike. | ||
It's the same thing too, right? | ||
They're all committed to Black Lives Matter and all these social justice causes and they're using slave labor too. | ||
You know, aren't they? | ||
In China, they're using slave labor to make Nikes. | ||
Probably. | ||
So go back to that thing. | ||
What are they trying to do? | ||
I remember seeing a robot they have that can take the pieces out of it at a much faster rate than human hands probably can. | ||
Oh, okay. | ||
So that's why I was trying to dig through here, but I found that. | ||
Well, that would be good. | ||
I think that's the robot. | ||
That's the piece-taking robot? | ||
Daisy, it's good. | ||
This is Daisy. | ||
Don't name her. | ||
Name her, you've got a problem. | ||
There you go, 23rd. | ||
Right? | ||
Entirely clean energy, which isn't quite as... | ||
It's, you know... | ||
2030 means transitioning hundreds of our manufacturing suppliers to 100% renewable sources of electricity. | ||
Well, that's interesting. | ||
If they can actually do that, 100% recycled, if they can figure out a way to do that, and to have recyclable materials and have all renewable electricity, whether it's wind or solar, if they could really figure out how to do that, I think that would be pretty amazing. | ||
But who's going to put it together? | ||
Are they going to still use slaves to put it together? | ||
I mean, I guess the people that are working at Foxconn aren't technically slaves, but would you want your child to work there? | ||
You know? | ||
Yeah, I mean, I think you can say that about a lot of the aspects of our economy, though. | ||
You know, who would willingly go into a coal mine? | ||
Yes. | ||
Right. | ||
unidentified
|
Yeah. | |
You know, there's some element of coercion to a lot of what keeps the world spinning. | ||
Right. | ||
And that's the, when you get into these insidious arguments about, or conversations about conspiracies, like conspiracies to keep people impoverished, they're like, well, why would you want to keep people impoverished? | ||
Well, who's going to work in the coal mines? | ||
You're not going to get wealthy, highly educated people to work in the coal mines. | ||
You need someone to work in the coal mines. | ||
So what do you do? | ||
What you do is you don't help anybody get out of these situations. | ||
So you'll always have the ability to draw from these impoverished communities, these poor people that live in Appalachia or wherever their coal miners are coming from. | ||
There's not a whole lot of ways out. | ||
Like, I have a friend who came from Kentucky, and he's like, the way he described it to me, he goes, man, you've never seen poverty like that. | ||
Like, people don't want to concentrate on those people because it's not as glamorous as some other forms of poverty. | ||
He goes, but those communities are so poor. | ||
Yeah. | ||
40 million Americans, right? | ||
Yeah. | ||
40 million Americans are living in poverty. | ||
Yeah. | ||
I mean, I don't know if that conspiracy is accurate, but that's the one that people always want to draw from, right? | ||
They always want to... | ||
I mean, I don't think you need a conspiracy. | ||
You know, you just... | ||
You have... | ||
You have poor people. | ||
Structural forces, you know, that are like... | ||
Yeah. | ||
unidentified
|
That's... | |
That's why it's rare that a company comes along and has a business plan like Signal where they're like, we're going to be non-profit. | ||
We're going to create something that we think is of extreme value to human beings, just to civilization in general, the ability to communicate anonymously or at least privately. | ||
It's a very rare thing that you guys have done, that we decided to do this and to do it in a non-profit way. | ||
What was the decision that led up to that? | ||
How many people were involved? | ||
Now there's 20-something people. | ||
Which do you think that's a lot or a little? | ||
Um, I think that's a little, okay? I think it's always interesting talking to people. | ||
A lot of times I'll meet somebody and they're like, oh yeah, you're the person who did Signal or something. | ||
I'm like, oh yeah, yeah, yeah. | ||
They're like, okay, cool. | ||
What are you doing now? | ||
I'm like, oh, I'm still working on Signal. | ||
They're like, oh, is there another Signal that you're going to do? | ||
You're going to do Signal 2? | ||
I think it's hard for people to understand that software is never finished. | ||
There's this... | ||
Which is something that I really envy about, like, the kind of creative work that someone like you does. | ||
You know, that, like, I envy artists, musicians, writers, poets, painters, you know, people who can create something and be done. | ||
You know, that, like, you can record an album today, and 20 years later, you can listen to that album, and it'll be just good, you know? | ||
It's like, software's never finished. | ||
And if you stop, it'll just, like, float away like dandelions. | ||
What happens if you stop? | ||
Because software is not... | ||
It's very hard to explain this. | ||
It doesn't exist in isolation. | ||
It's a part of the ecosystem of all software. | ||
And that ecosystem is moving, and it's moving really fast. | ||
There's a lot of money behind it, a lot of energy in it. | ||
And if you aren't moving with it, it will just... | ||
Stop working. | ||
And also, it's like, you know, a project like this is not just the software that runs on your phone, but the service of, like, you know, moving the messages around on the internet, and that requires a little bit of care and attention, and if you're not doing that, then it will dissipate. | ||
And if you're doing something non-profit, the way you're doing it, how do you pay everybody? | ||
Like, how does it work? | ||
Yeah, well, okay, so, you know, the history of this was, um, I think before the internet really took over our lives in the way that it has, there were the kind of social spaces for people to experiment with different ideas outside of the context of their everyday lives, you know, like art projects, punk rendezvous, experimental gatherings. | ||
The embers of art movements. | ||
These spaces existed and were things that I found myself in and a part of. | ||
And they were important to me in my life. | ||
You look like a dude who'd go to Burning Man. | ||
Actually, I'm not a dude that goes to Burning Man. | ||
Maybe you're missing it. | ||
I've been once. | ||
I went in 2000, I think. | ||
Early adopter. | ||
Well, it's funny because at the time that I went, people were like, oh man, it's not like it used to be. | ||
And now people are like, have you been? | ||
I was like, I went once in 2000. And they're like, wow, wow, that's when it was, like, the real deal. | ||
I'm like, I don't think so. | ||
It's one of those things where it's like, you know, there's like day one and then on day two, they're like, ah, it's not like day one. | ||
Right, of course, of course. | ||
But yeah, I don't know. | ||
Those things, those spaces were important to me and like an important part of my life. | ||
And as more of our life started to be taken over by technology, me and my friends felt like those spaces were missing online. | ||
We wanted to demonstrate that it was possible to create spaces like that. | ||
There had been a history of people thinking about cryptography in particular, which is kind of funny in hindsight. | ||
So, the history of cryptography is actually not long, at least outside of the military, you know? | ||
It really starts in the 70s. | ||
There were some really important things that happened then. | ||
In the 80s, there was this person who was this lone maniac who was writing a bunch of papers about cryptography during a time when it wasn't actually that relevant because there was no internet. | ||
The applications for these things were harder to imagine. | ||
And then in the late 80s, there was this guy, a retired engineer, who discovered the papers that this maniac, David Chaum, had been writing and was really... | ||
Was he doing this in isolation or was he a part of a project or anything? | ||
No, I think David Chaum was... | ||
I think he's an academic. | ||
I'm embarrassed that I don't know. | ||
But he did a lot of the notable work on using the primitives that had already been developed. | ||
And he had a lot of interesting ideas and... | ||
There's this guy who was a retired engineer, his name was Tim May, who was kind of a weird character. | ||
And he found these papers by David Chaum, was really enchanted by what they could represent for a future. | ||
And he wanted to write like a sci-fi novel that was sort of predicated on a world where cryptography existed and there was a future where the internet was developed. | ||
And so he wrote some notes about this novel, and he titled the notes The Crypto Anarchist Manifesto. | ||
And he published the notes online, and people got really into the notes. | ||
And then he started a mailing list in the early 90s called the Cypherpunks mailing list. | ||
And all these people started, you know, joined the mailing list and they started communicating about, you know, what the future was going to be like and how, you know, they needed to develop cryptography to live their, you know, crypto-anarchy future. | ||
And at the time, it's strange to think about now, but cryptography was somewhat illegal. | ||
It was regulated as a munition. | ||
unidentified
|
Really? | |
Yeah. | ||
So if you wrote a little bit of crypto code and you sent it to your friend in Canada, that was the same as, like, shipping Stinger missiles across the border to Canada. | ||
unidentified
|
Wow! | |
So did people actually go to jail for cryptography? | ||
There were some high-profile legal cases. | ||
I don't know of any situations where people were tracked down as munitions dealers or whatever, but it really hampered what people were capable of doing. | ||
So people got really creative. | ||
There were some people who wrote some crypto software called Pretty Good Privacy, PGP. And they printed it in a book, like an MIT Press book, in a machine-readable font. | ||
And then they're like, this is speech. | ||
This is a book. | ||
I have my First Amendment right to print this book and to distribute it. | ||
And then they shipped the books to Canada and other countries and stuff, and then people in those places scanned it back in. | ||
To computers. | ||
And they were able to make the case that they were legally allowed to do this because of their First Amendment rights. | ||
And other people moved to Anguilla and started writing code in Anguilla and shipping it around the world. | ||
There were a lot of people who were fervently interested. | ||
Why Anguilla? | ||
Because it's close to the United States and there were no laws there about producing cryptography. | ||
I think that was something people thought about. | ||
They have like three cases of COVID there ever. | ||
Oh, really? | ||
Yeah, it's a really interesting place. | ||
Yeah, I used to work down there. | ||
Really? | ||
Okay, International Traffic in Arms Regulations.
It's a United States regulatory regime to restrict and control the export of defense and military-related technologies to safeguard U.S. national security and further U.S. foreign policy objectives. | ||
ITAR. Yeah, Anguilla was closed until like November.
They wouldn't let anybody in. | ||
And yeah, if you want to go there, they have like, I was reading all these crazy restrictions. | ||
You have to get COVID tested and you have to apply. | ||
And then when you get there, they test you when you get there. | ||
Because they have no deaths. | ||
Yeah, yeah, yeah. | ||
That's cool. | ||
Yeah, I like Anguilla.
It's an interesting place. | ||
Yeah, this is what I was reading. | ||
They're inviting companies to come move here. | ||
Like, come work here. | ||
Oh, interesting. | ||
Come, we'll test the shit out of you. | ||
You can't go anywhere, but come here. | ||
It's beautiful. | ||
It is beautiful. | ||
I used to work on boats down there. | ||
Yeah? | ||
What'd you do on boats? | ||
I was like really... | ||
I don't know. | ||
I, for a while, was really into sailing and I had a commercial license and I was moving boats around and stuff. | ||
My parents lived in a sailboat for a while. | ||
Oh, really? | ||
Yeah. | ||
Yeah, they just decided to just check out. | ||
And this was like... | ||
I want to say early 2000s, somewhere around then. | ||
I lived on a sailboat for a few years until my mom got tired of it. | ||
They go around the world? | ||
They were in the Bahamas. | ||
They were all around that part of the world. | ||
They were in California for a little while on their boat. | ||
They just decided, let's just live on a boat for a while. | ||
Yeah, it's pretty crazy. | ||
I discovered sailing by accident where I was like... | ||
Working on a project with a friend in the early 2000s, and we were looking on Craigslist for something unrelated, and we saw a boat that was for sale for $4,000. | ||
And I thought a boat was like a million dollars or something. | ||
I was just like, what? | ||
The sailboats are $4,000? | ||
And this is just some listing. | ||
There's probably even cheaper boats, you know? | ||
And so we got really into it, and we discovered that you can go to any marina in North America and get a boat for free. | ||
You know, that like every marina has a lien sale dock on it where people have stopped paying their slip fees, and the boats are just derelict and abandoned, and they've, you know, put them on these stocks.
Really? | ||
Yeah. | ||
You get a boat for free? | ||
Yeah. | ||
They have an auction. | ||
There's usually like a minimum bid of, you know... | ||
50 bucks? | ||
50 bucks or whatever, you know. | ||
And most times it doesn't get bid on and they chop the boat up and throw it away. | ||
Really? | ||
And if you show up... | ||
So a functional boat? | ||
Oh, functional. | ||
Oh, that's the problem, right? | ||
You know... | ||
You gotta maintain the shit out of boats. | ||
Yeah, so, you know, if you put some work into it, though, you can get it going. | ||
And so we started doing that. | ||
We were, like, you know, getting boats, fixing them up, sailing them as far as we could. | ||
And then eventually I got a commercial license and started sailing other people's boats. | ||
unidentified
|
Wow! | |
All this on a whim of, how much does a boat cost? | ||
You can get a boat for four grand? | ||
Holy shit! | ||
Next thing you know, you're working on boats. | ||
Yeah, yeah. | ||
I mean, I was... | ||
It's a really... | ||
It's a whole world, you know? | ||
It's just like, you know, finding that link on Craigslist was like, you know, opening a door to another reality, right? | ||
Where it's like... | ||
Yeah. | ||
Because it's pretty amazing, you know, me and some friends used to sail around the Caribbean and... | ||
You know, the feeling of, like, you know, you pull up an anchor, and then you sail, like, you know, 500 miles to some other country or whatever, and you get there, and you drop the anchor, and you're just like, we... | ||
It was just the wind. | ||
The wind that took, you know, like, there was no engine, there was no fuel. | ||
It was just the wind, you know, and you catch fish, and, you know, it's just like... | ||
If you want to go real old school, you've got to use one of them... | ||
What are those fucking sextants?
Sextants, of course.
Do you use one of those? | ||
No, you didn't! | ||
Did you really? | ||
unidentified
|
Yeah. | |
I was like really into like, you know, no electronics, like it's just complicated, you know, they're expensive or whatever. | ||
So we had a taffrail log.
It's like a little propeller on a string that you connect to a gauge. | ||
And as it turns, the gauge keeps track of how far you've traveled. | ||
unidentified
|
What? | |
Yeah, so it's like... | ||
A propeller on a string? | ||
So it's just a thing that turns a string at a constant rate depending on how fast you're moving. | ||
So it can gauge how much distance you've traveled. | ||
So is the string marked? | ||
No, no, no. | ||
It's just a constant length. | ||
It's always spinning, and it's always turning the gauge. | ||
And then it reads a number? | ||
So it says how many miles? | ||
So it's just like a dial in the number of how many nautical miles you've traveled. | ||
unidentified
|
Wow. | |
So then you're just like, okay, well, we started here, and then we headed on this heading, and we did that, and we traveled 10 miles, so we must be here. | ||
And then once a day, you can take a sight with your sextant, and then you can do some dead reckoning with a compass. | ||
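The navigation he's describing is simple enough to sketch: the taffrail log gives the distance run, the compass gives the heading, and you advance your last estimated position by that vector. A minimal flat-Earth version in Python (the function name and starting coordinates here are illustrative, not from the conversation):

```python
import math

# 1 nautical mile is about 1 minute of latitude, so 60 nm is about 1 degree.
NM_PER_DEGREE = 60.0

def dead_reckon(lat_deg, lon_deg, heading_deg, distance_nm):
    """Advance an estimated position along a true heading for a given
    distance (flat-Earth approximation -- fine for short daily legs)."""
    heading = math.radians(heading_deg)
    dlat = distance_nm * math.cos(heading) / NM_PER_DEGREE
    # Meridians converge toward the poles, so a nautical mile spans more
    # degrees of longitude the farther you are from the equator.
    dlon = (distance_nm * math.sin(heading)
            / (NM_PER_DEGREE * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon

# One day's run: start near the Bahamas, steer due east,
# taffrail log reads 60 nautical miles traveled.
lat, lon = dead_reckon(25.0, -80.0, 90.0, 60.0)
```

The once-a-day sextant sight then corrects the accumulated error in this estimate, which (as he notes later) can easily be tens of miles.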
Wow! | ||
unidentified
|
Wow! | |
Dude, you went old school. | ||
Yeah, I once had a job, actually. | ||
Who did you do this with? | ||
Just friends, yeah. | ||
And you gotta have some fucking committed friends. | ||
Because, like, the friends had to be, you know, you had to be all on the same page. | ||
Because they could be like, hey man, let's get a fucking GPS. You guys are assholes. | ||
I don't want to die. | ||
I'm not going to get eaten by a shark. | ||
How much food do we have? | ||
People die out here, man. | ||
This is the ocean. | ||
unidentified
|
Yeah. | |
We didn't really have any money, so it wasn't much of a decision. | ||
To put things in perspective, we took a trip through the Caribbean once from Florida. | ||
The way that we got to Florida was riding freight trains. | ||
We hopped trains to get there. | ||
This was low-budget traveling. | ||
You guys were hobos. | ||
No. | ||
That's a hobo move. | ||
It was low-bagger, for sure. | ||
But, like, yeah, I was also, like, just weirdly ideological about it, where, like, I had a job once in the Caribbean that was, like, I was almost like a camp counselor, basically, where there was this camp that was like a sailing camp, but it was, like, 13 teenagers, mostly from North America. | ||
Showed up in St. Martin and then got on a boat with me and another woman my age. | ||
And we were like the adults. | ||
And it was just like we sailed from St. Martin to Trinidad over the course of six weeks with these like 13 kids on a 50-foot sailboat. | ||
Who left their kids with you? | ||
That's what I want to know, man. | ||
It was like... | ||
Is this you? | ||
Me and my friends made a video called Hold Fast that was trying to demystify sailing. | ||
Bro, you've been rocking this wacky hair for a long time. | ||
Dude, I know. | ||
You know, pandemic. | ||
unidentified
|
Wow. | |
Whoa, you had tornadoes out there? | ||
Yeah. | ||
And you caught fish? | ||
Yeah, yeah. | ||
So you lived off the fish that you caught, basically? | ||
Yeah. | ||
Yeah, fish, conch, seaweed.
Wow, seaweed? | ||
Yeah. | ||
So when you prepare seaweed, what do you do? | ||
You boil it? | ||
You're going to sharpen your fucking knife, son. | ||
unidentified
|
I know. | |
That's ridiculous. | ||
What are you using, a pencil to try to kill that poor fish? | ||
This whole video is embarrassing. | ||
So thank you for that, James. | ||
Because you kind of didn't know what you were doing? | ||
And here's you with... | ||
What are you doing here? | ||
You're mapping out where you're at? | ||
This is Dead Reckoning, yeah. | ||
Dead Reckoning. | ||
That position was 50 miles off. | ||
50 miles off? | ||
So where you thought you were versus where you actually were was 50 miles difference? | ||
unidentified
|
Yeah. | |
And you're going how many miles an hour? | ||
Very slow. | ||
If you're doing really well, you know you're making five knots. | ||
Five nautical miles an hour. | ||
Five miles an hour. | ||
Jesus Christ. | ||
So you're walking. | ||
You're basically walking on the ocean. | ||
Yeah. | ||
Not walking. | ||
It's slow going. | ||
But you never stop. | ||
That's the thing. | ||
You can sail all night. | ||
You can just keep going. | ||
You're a light jog. | ||
You're jogging on the ocean. | ||
Anyway, I was a tyrant with these kids. | ||
We had a nice boat and I disabled all of the electronics. | ||
I disabled the electric anchor windlass. | ||
How long was this boat? | ||
How long was this boat? | ||
This was 50 feet. | ||
50 feet with 14 kids, you said? | ||
I think 13.
50 is a big boat.
That's actually a big boat. | ||
Yeah, but it doesn't seem like a lot of room for all these kids. | ||
Yeah, people are like sleeping on the deck. | ||
Oh my god, that's insane. | ||
Did you feel weird? | ||
I mean, you're responsible for their food? | ||
You're responsible for making sure they don't fight with each other? | ||
Yeah, I mean, I actually enjoyed it. | ||
I think it was fun. | ||
Yeah? | ||
Well, it seemed like it. | ||
You have to make it work. | ||
There's no other solution. | ||
You're on this boat with these kids. | ||
unidentified
|
Yeah, that's true. | |
Do you still keep in touch with those kids? | ||
No, that was sort of like pre-social media. | ||
Right. | ||
They're going to reach out to you now. | ||
Man, I remember that. | ||
That was fucking crazy. | ||
I can't believe my parents left me with you. | ||
I can't believe they did either. | ||
So did you have to sign any paperwork or anything? | ||
How did you take care of these kids? | ||
I'm sure I had to sign something. | ||
I don't remember. | ||
You don't remember? | ||
Yeah. | ||
unidentified
|
Wow. | |
Was there any time where you were like halfway into this trip? | ||
You're like, what have I signed up for? | ||
Oh, sure. | ||
All the time. | ||
Yeah. | ||
But I was... | ||
You know... | ||
I had never really been in a situation like that either. | ||
And... | ||
Who has? | ||
I don't know. | ||
It's like I didn't even have siblings. | ||
You know? | ||
Like it's like... | ||
Oh, really? | ||
Yeah. | ||
unidentified
|
So... | |
But I... And I was pretty... | ||
You know, it was interesting. | ||
I feel like I learned a lot. | ||
And it was... | ||
But I was pretty tyrannical in a lot of ways. | ||
But in a way that I was trying to encourage. | ||
It was fun to see particularly teenagers who had a really North American affect about how to be. | ||
Just let all of that go over a few weeks on the ocean where it's just like, you know, it's just us. | ||
We're here. | ||
There's nobody else watching. | ||
You know, we're sleeping next to each other. | ||
You know, it's like the kids just getting comfortable with themselves, you know? | ||
And, you know, I would try and like, so I was like, I am really into rock, paper, scissors. | ||
How into it are you? | ||
I'm undefeated. | ||
How is that possible? | ||
So whenever they wanted anything, I would be like, all right, rock, paper, scissors. | ||
You know, they were like, can we like do this thing? | ||
I'd be like, all right, we'll do rock, paper, scissors. | ||
If you win, you can do this thing. | ||
If I win, and then I would like pick the thing that was like their sort of deepest fear, you know, it's like the really shy person had to like write a haiku about every day and then read it aloud at dinner. | ||
You know, like the, you know, the person who was like really into like having like a manicure, like wasn't allowed to shave her legs for the rest of the, you know, like that kind of thing. | ||
Wow. | ||
And so then by the end of it, it was just like, you know, everyone had lost, you know, so everyone was like reading the haiku at dinner and doing, you know. | ||
How are you so good at rock, paper, scissors? | ||
It's just, you know, skill, muscle, intuition. | ||
Intuition. | ||
Can we play right now? | ||
You want to play? | ||
Yes. | ||
But I only play for stakes. | ||
Okay. | ||
What do you want to play for? | ||
Okay. | ||
How about... | ||
If I win, I do the programming on your show for a week. | ||
No. | ||
That's worth a lot of money. | ||
You can fuck off. | ||
What kind of money? | ||
I'm not saying the ads or whatever. | ||
Programming. | ||
Who's going to be on? | ||
That's not possible. | ||
We're booked up months and months in advance. | ||
You were so confident until just now. | ||
That's ridiculous to flip a coin on that. | ||
There's no chance. | ||
I mean, what would be... | ||
Because then you'd make me have... | ||
Listen, the whole reason why this show works is because I talk to people that I want to talk to. | ||
That's why it works. | ||
The only way... | ||
You've got to do something to play this game. | ||
That's not a risk. | ||
That's just one week of your life. | ||
No, that's abandoning the show. | ||
That's one week of your life. | ||
No, you could bring some assholes on here that I don't want to talk to and then I'm like, what am I doing? | ||
No, no, no. | ||
Impossible. | ||
unidentified
|
Alright. | |
Well, do you think that there's something of equivalent value? | ||
No. | ||
Of that? | ||
Nothing that I can do. | ||
unidentified
|
No. | |
There's nothing that you could give me that would be worth a week of programming on the show? | ||
What are you going to give me? | ||
What about a day of programming? | ||
You'd have to give me a spectacular amount of money. | ||
I sent you a... | ||
We can't make this about money. | ||
But that's the only way I would... | ||
The only way, if you ever put a monetary equivalent to that, it would have to be a spectacular amount of money for me to let someone else program the show. | ||
I've never let anybody do that before. | ||
Not even for one day? | ||
unidentified
|
No! | |
That was one of the big things about doing this show on Spotify. | ||
They could have no impact at all on who gets on, no suggestions, no nothing. | ||
The only way it works... | ||
What was up with that dude in the suit outside with the clipboard that was telling me he's from Spotify?
Oh, he's from the government. | ||
He's from the CIA. There's no one out there. | ||
He's joking. | ||
But the only way the show works, I think, the way it works, is I have to be interested in talking to the people. | ||
That's it. | ||
So it has to be, I get a, I have like all these suggestions for guests. | ||
I go, oh, that kind of seems cool. | ||
Oh, that might be interesting. | ||
Let me read up on this guy. | ||
What if it's like for a week, I give you the list of suggestions? | ||
No. | ||
No. | ||
No input. | ||
No? | ||
No. | ||
It's not. | ||
That's ridiculous.
Stand real. | ||
Stand real. | ||
Okay. | ||
Okay. | ||
unidentified
|
All right. | |
Impossible. | ||
In any case. | ||
How about five bucks? | ||
No. | ||
No? | ||
No, it's gotta be stakes. | ||
Come on, man. | ||
20 bucks? | ||
unidentified
|
20 bucks. | |
I got 20 bucks in my pocket. | ||
Money is off the table. | ||
We can't do money. | ||
Money's off the table? | ||
I forget that. | ||
All right. | ||
Sounds like someone's scared to lose at Rock, Paper, Scissors. | ||
It sounds like someone else is scared to lose at Rock, Paper, Scissors. | ||
No, you're asking me for something that's ridiculous. | ||
You don't have anything. | ||
You don't have anything that's worth a week of programming on this show. | ||
You don't have it. | ||
That's rough. | ||
It doesn't exist. | ||
That's rough. | ||
No, it literally doesn't exist. | ||
There's nothing that you can have that you could offer me that I couldn't buy myself. | ||
I'll make your... | ||
No, no, no. | ||
It'll be interesting. | ||
No, no, no. | ||
You can't. | ||
No. | ||
All right, fine. | ||
But that doesn't do anything for me. | ||
That does something for you. | ||
That does zero for me. | ||
Of course, you would have, if you win, you would name your stake.
I don't have a stake.
There's nothing I want from you. | ||
What you ask from me is a crazy thing. | ||
Yeah, we can't play Rock, Paper, Scissors now, huh? | ||
Interesting. | ||
Anyway, we were talking about something else before all of this.
We're talking about the evolution of cryptography. | ||
Sailing with children. | ||
Sailing with children. | ||
Well, at first we were talking about Anguilla and the fact that people are moving to Anguilla. | ||
Yeah. | ||
So how did you learn how to do all this stuff? | ||
Was it trial by fire when you were learning how to use all this, I mean, I don't want to call it ancient equipment, but mechanical equipment to figure out how to... | ||
Yeah. | ||
Yeah. | ||
Secret is to begin. | ||
To start... | ||
Like a sextant. | ||
Where the fuck does one learn how to operate a sextant and then navigate in the ocean? | ||
Uh, yeah, just, I would, you know, I started, uh, you know, me and some friends got a boat and, um, we started fixing it up and making a lot of mistakes and then, you know, started taking some trips and then... | ||
Getting lost? | ||
Yeah, I got lost a bunch. | ||
I took a solo trip from San Francisco to Mexico and back, uh, on a little 27 foot boat with no engine and... | ||
Whoa! | ||
How long did that take? | ||
unidentified
|
Ah... | |
A few months. | ||
And the way you did it, did you stay close so I could see the shore? | ||
So if everything fucks up, I can kind of swim. | ||
Yeah, well, no, you can't swim. | ||
I learned that lesson, too. | ||
No? | ||
unidentified
|
Why? | |
I mean, the closest I ever came to death in my life was just in the bay. | ||
In the San Francisco Bay, I was on a boat that capsized, and I was probably 2,000 yards away from shore, and I almost drowned. | ||
I mean, I didn't make it to shore. | ||
Yeah, it's just the water's so cold, you know? | ||
You didn't make it to shore? | ||
Yeah, it's a long story. | ||
I was like... | ||
A friend of mine was living in San Francisco and he wanted to learn how to sail. | ||
And I was like, you know, what you should do is you should get like a little boat, like a little sailing thingy, you know, and then you can just anchor it like off the shore in this area that no one cares about. | ||
And, you know, you could just sort of experiment with this little boat. | ||
And so he started looking on Craigslist and he found this boat that was for sale for 500 bucks up in the North Bay. | ||
And every time we called the phone number, we got an answering machine that was like, hello, you've reached Dr. Ken Thompson, honorary. | ||
I'm unable to take your call, you know? | ||
And we were like, what is that? | ||
Like, honorary? | ||
It's a fake doctor. | ||
Is he like a judge? | ||
unidentified
|
Chiropractor. | |
You know, like, what is it? | ||
And so finally we got in touch with this guy. | ||
We go up there, and it's the kind of situation where, like, we pull up, and there's, like, the trailer that the boat's supposed to go on, and it's just full of scrap metal. | ||
Oh, boy. | ||
And, you know, this guy comes out. | ||
He's like, oh, yeah, this is the trailer. | ||
unidentified
|
We were going to do a metal run, but if you want the boat, you know, we'll take the metal off, you know? | |
And we're like, okay, you know, and he's like taking us around. | ||
He's like, okay, the mast is over here. | ||
And it's like under some leaves, you know, it's like, and then, you know, the hull is in the water here.
And he has like a dock behind his house, and the tide is all the way out. | ||
So they're both just sitting in the mud, you know. | ||
And I'm like, well, how do we get this out of here? | ||
He's like, oh, you'd have to come back at a different time, you know, and then you take it over there. | ||
And we're like, you told us to come now, like at this time, you know. | ||
Anyway, so we go through all this thing, and my friend, who knows nothing about boats, is like, all right, Moxie, what do you think?
Should I get this? | ||
And I was like, okay. | ||
Oh, and we were like, so what's a doctor of what? | ||
He's like, oh, self-declared. | ||
We're like, oh, okay. | ||
He's a self-declared doctor? | ||
Honorary. | ||
Honorary self-declared doctor. | ||
You can do that? | ||
I guess so. | ||
Why not? | ||
It's just an answer. | ||
Jamie? | ||
Yes. | ||
unidentified
|
Doctor? | |
Yes. | ||
I think we should become doctors. | ||
I just became one. | ||
I tried that for a while, actually. | ||
Yeah, did you really? | ||
Yeah, I don't know. | ||
I mean, I never went to college, so... | ||
Did Hunter S. Thompson ever get an honorary degree, or did he just call himself Dr. Hunter S. Thompson? | ||
Because he was calling himself Dr. Hunter S. Thompson for a while. | ||
I was quickly looking up how fast he could do this legally. | ||
Well, Bill Cosby became a doctor for a little bit. | ||
They took it back, though. | ||
That's when you know he fucked up. | ||
Yeah, yeah. | ||
They take back your fake doctor degree. | ||
Yeah, yeah. | ||
So this guy was like, you know, my friend's like, what do you think, Moxie?
I'm like, all right, Dr. Ken. | ||
I would have to consider. | ||
I'm not sure that I would do it, but I would consider taking this boat for free. | ||
I'd have to think about it, but I would consider that, you know? | ||
And he's like... | ||
I might be amenable to that. | ||
So we've gone from $500 to free. | ||
And so we got this boat, and we had to deal with the metal and all that stuff. | ||
We got the boat, and we were just trying to anchor it. | ||
Did you bring life vests? | ||
Yeah, I was wearing a PFD, a Type II PFD, and we took it to this boat ramp, and it was the end of the day, and the wind was blowing kind of hard, and the conditions weren't that good, but I was like, oh, we're just doing this little thing, this little maneuver, and we were in two boats.
I built this little wooden rowing boat, and my friend was going to go out in that with one anchor, and I was going to sail out this boat. | ||
You built it? | ||
Yeah, out of plywood. | ||
It's stitch-and-glue construction.
But not the sturdiest vessel. | ||
So he's going to go out in this little rowboat, and I was going to sail out this little catamaran. | ||
And we had two anchors, and we're going to anchor it, and then we're going to get in the rowboat and row back. | ||
And it seemed a little windy, and I got in the boat first, and I got out around this pier and was hit by the full force of the wind and realized that it was blowing like 20 knots. | ||
It was way, way too much for what we were trying to do. | ||
But I had misrigged part of the boat, so it took me a while to get it turned around. | ||
And by the time I got it turned around, my friend had rowed out around the pier, and he got hit by the force of the wind and just got blown out into the bay. | ||
So he's rowing directly into the wind and moving backwards. | ||
Oh, shit! | ||
And I was like, fuck. | ||
And I'm on this little Hobie Cat, and it was moving so fast. | ||
It was way too windy to be sailing this thing. | ||
I've got just my clothes on. | ||
I don't have a wetsuit on or anything like that. | ||
I have a life jacket and just my clothes. | ||
And we don't have a radio. | ||
We're unprepared. | ||
It's starting to get dark. | ||
We don't have a light. | ||
And I'm sailing back and forth trying to help my friend. | ||
And it got to the point where I was like, all right, I'm just going to tack over. | ||
I'm going to sail up to this boat that was called the Sea Louse. | ||
Sail up to the Sea Louse. | ||
I'm going to get my friend off of it. | ||
We're just going to abandon it. | ||
And then we're going to sail this Hobie Cat back. | ||
If we can. | ||
And so I go to turn around, and right as I'm turning around, a gust of wind hit the boat and capsized it before I could even know that it was happening. | ||
It's one moment, you're on the boat, and the next moment you're in the water. | ||
And the water is like 50 degrees. | ||
It's a shock when it hits you. | ||
And the boat was a little messed up in a way where I couldn't right it.
It had capsized, and then it capsized all the way and then sank. | ||
So it was floating like three feet underwater, basically. | ||
And so I'm in the water, but I'm still a little bit out of the water, but in the water. | ||
And I had a cell phone that just immediately was busted. | ||
And I look at my friend, and he's a ways away now. | ||
He didn't see me, and I was yelling as loud as I could, but the wind is blowing 20 knots, and you can't hear each other. | ||
It just takes your voice away. | ||
I was screaming, I was waving, he wasn't wearing his glasses, and he just very slowly rowed away.
Oh my god! | ||
And so then I was just like floating there. | ||
It was starting to get dark.
He rowed away?
Did he notice that your boat had capsized? | ||
No, he didn't even see me. | ||
He thought that I just sailed somewhere else. | ||
Because in his mind, I was the person with the experience. | ||
Do you still talk to this dude? | ||
Yeah, all the time. | ||
I'd be like, you motherfucker. | ||
I don't blame him. | ||
In his mind, he was the person that was in trouble. | ||
Right. | ||
I understand. | ||
And he thought I just sailed somewhere else. | ||
That's crazy. | ||
Yeah. | ||
Sailed out of vision. | ||
Yeah, and then, you know, it basically got dark. | ||
I could see the shore. | ||
I wasn't far away. | ||
There's nobody on shore. | ||
There's nobody around. | ||
And the wind was blowing directly offshore. | ||
So you have to swim, you know, swim into the wind and into the wind wave and all that stuff. | ||
And eventually I tried swimming and I swam, you know, directly upwind. | ||
And I was because I was I was like, OK, like if I get separated from this boat and I don't make it to shore, then I'm definitely dead. | ||
You know, like there's just no saving me. | ||
So I was trying to go directly upwind so that if I felt like I couldn't make it, I would float back down when it hit the boat again. | ||
And so I tried, you know, I swam for probably like 20 minutes upwind and made no progress. | ||
It didn't feel like any progress. | ||
You know, in 50 degrees, you have 30 to 60 minutes before you black out. | ||
My arms were just, you know, it's like I consider myself a strong swimmer. | ||
Like I free dive, you know, all this stuff. | ||
And I just, you know, it's like you read these stories about... | ||
How people die. | ||
They succumb to hypothermia on a local hike or they drown in the bay. | ||
And the story's always like, well, Timmy was a strong swimmer. | ||
And you're like, really? | ||
Was Timmy really a strong swimmer? | ||
Because he drowned in the bay. | ||
And floating there, it just all came to me. | ||
I'm like, wow, this is how this happens. | ||
You just make a series of pretty dumb, small decisions until you find yourself floating in the dark in the bay. | ||
There's no one around. | ||
And it's a really slow process, too. | ||
You just come to terms with the idea that you're not going to make it. | ||
And it's not sudden. | ||
It's not like someone shot you or you got hit by a bus or something like that. | ||
It's like this hour-long thing that you're getting dragged through all alone. | ||
And you realize that no one will ever even know what this was.
You know, how this happened? | ||
And you think about all the people like Joshua Slocum, Jim Gray, people who were lost at sea, and you realize they all had this thing that they went through, you know, this hour-long ordeal of just floating alone, and no one will even ever know what that was or what that was like, you know? | ||
And eventually, I realized I wasn't going to make it ashore. | ||
I looked back. | ||
The boat was, like, way far away from me. | ||
I started, you know, drifting back towards it. | ||
I was still trying to swim. | ||
I realized at some point that I wasn't going to hit it. | ||
I wasn't going to hit the boat on the way back downwind. | ||
And I had to just give it all that I had to try to connect with the boat, you know, to stop myself from getting blown past it. | ||
And in that moment, too, you realize that, like... | ||
Uncertainty is the most unendurable condition. | ||
You imagine yourself making it to shore and relaxing, just knowing that it's resolved. | ||
And in that moment of like, I might not make it back to this boat, you're tempted to give up because it's the same resolution. | ||
It's the feeling of just knowing that the uncertainties have been resolved. | ||
And you have to really remind yourself that it's not the same. | ||
You have to give it everything you have in order to survive. | ||
That feeling that you're longing for is not actually the feeling that you want. | ||
And I just barely got the end of a rope that was trailing off the back of the hull. | ||
Pulled myself back on it. | ||
Almost threw up. | ||
Then I had to... | ||
Then I was just floating there with the hull three feet underwater.
I tied myself to it. | ||
I started to get tunnel vision. | ||
And really, at the last minute, a tugboat started coming through the area. | ||
And it was coming straight at me, actually. | ||
And I realized that it probably just wouldn't even see me. | ||
It would just run me over and not even know that... | ||
I had been there. | ||
It's totally possible. | ||
I was trying to wave. | ||
I could barely lift my arm. | ||
I was trying to scream. | ||
I could barely make any noise. | ||
Somehow they saw me. | ||
It took them 15 minutes to get a rope around me. | ||
They started pulling me up the side of the boat. | ||
Lining every tugboat is tires. | ||
Tires, usually. | ||
It has a fender. | ||
I got wedged in the tires as they were pulling me up. | ||
And I knew what was happening. | ||
And I was like, all I have to do is stick my leg out and push against the hull of the boat to go around the tires. | ||
And I couldn't do it. | ||
And I could barely see. | ||
And they swung me around and eventually pulled me up. | ||
They put me in next to the engines in the engine room. | ||
I couldn't even feel the heat. | ||
And they called the Coast Guard. | ||
And the Coast Guard came and got me. | ||
It was really embarrassing. | ||
And the Coast Guard guy, he's got all these blankets over me and he's trying to talk to me to keep me alert. | ||
And he's like, so is this your first time sailing? | ||
And I have a commercial, like a 250-ton master's license. | ||
You need 600 days at sea to get this license. | ||
And I was like, no, I have a master's license. | ||
And he was like, what? | ||
He's like, you're a fucking idiot, man. | ||
Everything changed. | ||
The tone totally changed. | ||
Oh, my God, dude. | ||
That's insane. | ||
Yeah. | ||
Did that change your appreciation for comfort and safety and just life in general? | ||
Did it like... | ||
Yeah, totally. | ||
I mean, it changed. | ||
Well, you know, for sure, the next day, I was like, you know, it's just like any near-death experience. | ||
I feel like you're just like, what are we doing here? | ||
You know, like, what's the... | ||
Why are we wasting our time with this? | ||
You know, at the time, I was working on Twitter. | ||
And, you know, your co-workers are like, oh, we got this problem with the slave lag on the database. | ||
And you're just like, what are we doing, man? | ||
You know, shouldn't we be doing something else? | ||
You know, like... | ||
But you can't, I feel like, you can't live like that for long. | ||
The what are we doing, man? | ||
You know, it's like, it's impossible. | ||
The world will, like, suck you back into it. | ||
unidentified
|
Yeah. | |
Unless you go to Anguilla. | ||
unidentified
|
Yeah. | |
I mean, a lot of those early crypto people are actually still in Anguilla. | ||
Really? | ||
Yeah, it's funny. | ||
Yeah, that's why we were talking about selling Anguilla. | ||
So the people who moved to Anguilla were part of this moment of like... | ||
How much did that shift your direction in your life though? | ||
Did it change like the way like it seems almost I mean I haven't had a near-death experience but I've had a lot of psychedelic experiences and in some ways I think they're kind of similar and that life shifts to the point where whatever you thought of life before that experience is almost like oh come on that's nonsense Yeah, I mean, it changes your perspective, or it did for me. | ||
And, you know, because also in that moment, you know, it's like, you know, I think you go through this sort of embarrassing set of things where you're like, oh, I had these things I was going to do tomorrow. | ||
Like, I'm not going to be able to do them. | ||
And then you're like, wait, why is that the thing that I'm concerned about? | ||
Trivial things. | ||
Yeah, trivial. | ||
We're just like, oh, I was going to see that person tomorrow. | ||
I'm not going to see that. | ||
I feel like I remember I was supposed to meet somebody the next day. | ||
I remember being worried that they would think that I stood them up or something. | ||
You have the awesomest excuse ever. | ||
I mean, just tell them that story, the way you just told it to me, and they're going to be like, dude, we're good. | ||
Shit. | ||
Fuck. | ||
Glad you're alright. | ||
unidentified
|
Yeah. | |
My God. | ||
That kind of stuff. | ||
And then you get more into the... | ||
Yeah, it changes the way you think about things. | ||
And certainly, I was working at Twitter at the time, and I think it made me think about how I was spending my life. | ||
I remember the first day that I was at Twitter. | ||
At the time, the most popular person on Twitter was Justin Bieber. | ||
He had more followers than any other person. | ||
Was that when you guys were trying to rig it so that he wasn't trending number one always? | ||
Because they did do that, right? | ||
I don't remember that. | ||
Conveniently. | ||
Because Jamie and I were talking about that one day. | ||
Because they had to do something because if they didn't do something, Justin Bieber would be the number one topic every day, no matter what was happening in the world. | ||
I can believe that they wanted to change that because the problem was, at the time, Twitter was held together with bubblegum and dental floss. | ||
Every time Bieber would tweet, the lights would dim and the building would shake a little bit. | ||
Here it goes. | ||
So they blocked me from trending. | ||
This is 2010. I'm actually honored. | ||
Not even Matt. | ||
He's also 12. Then I get on and see, yet again, my fans are unstoppable. | ||
Love you. | ||
Okay, so there's, you know, people talk about, like, invisible labor. | ||
Like, the invisible labor behind that tweet is just kind of comical, because it's like, when he did that, you know, people, like, you know, it's like my first day there, you know, it's like he tweeted something, and, you know, the building's, like, kind of shaking, and, like, alarms are going off. | ||
People are, like, scrambling around, you know? | ||
And it was just this... | ||
You know, it's like this realization where you're just like, never in my life did I think that anything Justin Bieber did would like really affect me in any like deep way, you know? | ||
And then here I am just like scrambling around to like facilitate. | ||
What are your thoughts on curating what trends and what doesn't trend and whether or not social media should have any sort of obligation in terms of... | ||
How things, whether or not people see things, like shadow banning and things along those lines. | ||
I'm very torn on this stuff because I think that things should just be. | ||
And if you have a situation where Justin Bieber is the most popular thing on the internet, that's just what it is. | ||
It is what it is. | ||
But I also get it. | ||
I get how you would say, well, this is going to fuck up our whole program, like what we're trying to do with this thing. | ||
What do you mean, fuck up our whole program? | ||
Well, what you're trying to do with Twitter, I mean, I would assume what you're trying to do is give people a place where they could share important information and, you know, have people, you know... | ||
I mean, Twitter has been used successfully to overturn governments. | ||
I mean, Twitter has been used to... | ||
Break news on very important events and alert people to danger. | ||
There's so many positive things about Twitter. | ||
If it's overwhelmed by Justin Bieber and Justin Bieber fan accounts, if it's overwhelmed, then the top ten things that are trending are all nonsense. | ||
I could see how someone would think we're going to do a good thing by suppressing that. | ||
Yeah, I see what you're saying. | ||
Why do you think they did suppress that? | ||
What do you think? | ||
You worked there. | ||
Why do you think they kept him from trending? | ||
Well, I mean, I don't know about that specific situation. | ||
I mean, I think, you know, looking at the larger picture, right, like... | ||
In a way, you know, it's like, if you think about, like, 20 years ago, whenever anybody talked about, like, society, you know, everyone would always say, like, the problem is the media. | ||
It's like the media, man. | ||
You know, if only we could change the media. | ||
And a lot of people in who were interested in, like, a better and brighter future were really focused on self-publishing. | ||
There was a whole conference about it, the Underground Publishing Conference, now the Allied Media Conference. | ||
People were writing zines. | ||
People were, you know, getting their own printing presses. | ||
We were convinced that if we made publishing more equitable, if everybody had the equal ability to produce and consume content, that the world would change. | ||
In some ways, what we have today is the fantasy of those dreams from 20 years ago. | ||
In a couple ways. | ||
One, it was the dream that if a cop killed some random person in the suburbs of St. Louis, that everyone would know about it. | ||
Everyone knows. | ||
And also, that anybody could share their weird ideas about the world. | ||
And I think, in some ways, we were wrong. | ||
You know, that we thought, like, you know, the world we got today is like, yeah, like, if a cop kills somebody in the suburbs of St. Louis, like, everybody knows about it. | ||
I think we overestimated how much that would matter. | ||
And I think we also believed that the things that everyone would be sharing were, like, our weird ideas about the world. | ||
And instead, we got, like, you know, Flat Earth and, like, you know, anti-vax and, like, you know, all this stuff, right? | ||
And so it's, like, in a sense, like, I'm glad that those things exist because they're, like, they're sort of what we wanted, you know? | ||
But I think what we did, what we underestimated is, like, how important the medium is. | ||
Like, the medium is the message kind of thing. | ||
And that, like, What we were doing at the time of writing zines and sharing information, I don't think we understood how much that was predicated on actually building community and relationships with each other. | ||
Like, what we didn't want was just, like, more channels on the television. | ||
And that's sort of what we got, you know? | ||
And so I think, you know, it's like everyone is, like, on YouTube trying to monetize their content, whatever, you know? | ||
And that, it's the same thing. | ||
Like, bad business models produce, like, bad technology and bad outcomes. | ||
And so I think there's concern about that. | ||
But I think... | ||
I think, like, you know, now that there's, like, you know, these two simultaneous truths that everyone seems to believe that are in contradiction with each other. | ||
You know, like, one is that, like, everything is relative. | ||
Everyone is entitled to their own opinion. | ||
All opinions are equally valid. | ||
And two, like... | ||
Our democracy is impossible without a shared understanding of what is true and what is false. | ||
The information that we share needs to be verified by our most trusted institutions. | ||
People seem to simultaneously believe both of these things, and I think they're in direct contradiction with each other. | ||
So in some ways, I think most of the questions about social media in our time are about trying to resolve those contradictions. | ||
But I think it's way more complicated than the way that the social media companies are trying to portray it. | ||
Yeah, I think there's simplistic methods that they're using to handle complex realities. | ||
Like, for instance, banning QAnon. | ||
This is a big one, right? | ||
QAnon's got these wacky theories and they're like, Jesus Christ, these are weaponizing all these nutbags. | ||
We're just going to ban QAnon. | ||
Because you think what they're saying is not true and not correct. | ||
How far do you go with that? | ||
You've sort of set a precedent. | ||
And where does that end? | ||
Because, you know, are we going to ban JFK theories? | ||
Because JFK murders are probably still relevant today. | ||
Some of those people are still alive. | ||
Do we ban... | ||
There's theories about the Challenger, the space shuttle Challenger. | ||
There's a lot of wacky conspiracy theories about space being fake. | ||
Have you ever seen hashtag space is fake? | ||
Yeah. | ||
Go on there if you want to really fucking just lose all faith in humanity. | ||
Look up space is fake. | ||
Oh my god, there's so many people. | ||
Yeah, and I think that people get something out of that. | ||
Yeah, they do. | ||
Well, people get something out of mysteries and maybe being on the inside and knowing things where the rest of the world is asleep. | ||
This is the reason why people love the idea of red-pilled. | ||
Somebody even suggested I call this room the red pill. | ||
My friend Radio Rahim said, call it the red pill. | ||
I'm like, ah... | ||
There's a lot riding on that term. | ||
Too bad because I'm a giant fan of the Matrix. | ||
But that term has been co-opted forever. | ||
But this idea that you're just going to ban people from discussing stupid ideas, where does that end? | ||
Does it end with Flat Earth? | ||
Are you going to ban that? | ||
They're going to go, oh, they're suppressing us. | ||
And then they're going to find these... | ||
That's the thing about all these weird alternative sources of social media, whether it's Parler or Gab, they become... | ||
Shitfests. | ||
If you go to those, especially Gab, it's just like, God damn, what have you guys done? | ||
It's not even what have they done. | ||
It's what have the people done that have all been kicked out of all these other places. | ||
And then if you have a place that says, we're not going to kick you out. | ||
And then all these fucking cretins come piling into these places. | ||
And I'm sure there's a lot of good people on Gab. | ||
Don't get me wrong. | ||
There's a lot of people that are... | ||
They just didn't want to be suppressed by social media. | ||
Parler doesn't seem to be nearly as bad. | ||
I've looked at that as well. | ||
It's more like just super right-wing information type stuff. | ||
And there's some reasonable people on Parler. | ||
unidentified
|
But... | |
But I think that there's a subtle thing there because I don't know how those things work. | ||
But I think part of what... | ||
If you set aside all of the takedown stuff, all the deplatforming stuff, if you say, okay, Facebook, Twitter, these companies, they don't do that anymore. | ||
They've never done that. | ||
They're still moderating content. | ||
They have an algorithm that decides what is seen and what isn't. | ||
Right. | ||
But how is that algorithm programmed? | ||
For Facebook and for YouTube and a lot of these things, it's done to encourage viewership. | ||
It's done to encourage interaction, right? | ||
It's done to encourage time spent looking at the screen. | ||
Yeah, so that's how they monetize it. | ||
They want more clicks and more ad views and all that jazz. | ||
But when it becomes an ideological moderation, that's when things get a little weird, right? | ||
But it is by definition an ideological moderation. | ||
If you optimize for time spent looking at the screen, you're going to be encouraging certain kinds of content and not others. | ||
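The dynamic described here, that a feed which optimizes for screen time is implicitly an editorial choice, can be sketched as a toy ranking function. Every field name and weight below is hypothetical, purely for illustration; this is not any platform's actual algorithm:

```python
# Toy sketch: engagement-optimized feed ranking vs. plain chronological order.
# All field names and weights are made up for illustration only.

def engagement_score(post):
    """A proxy for predicted time-on-screen; high-interaction content wins."""
    return (2.0 * post["predicted_watch_seconds"]
            + 1.5 * post["predicted_comments"]
            + 1.0 * post["predicted_shares"])

def rank_feed(posts, objective="engagement"):
    if objective == "engagement":
        # Optimizing for screen time implicitly promotes whatever maximizes it.
        return sorted(posts, key=engagement_score, reverse=True)
    # Chronological: no weighting at all, newest first.
    return sorted(posts, key=lambda p: p["timestamp"], reverse=True)

posts = [
    {"id": "calm_news",    "timestamp": 3, "predicted_watch_seconds": 20,
     "predicted_comments": 2,  "predicted_shares": 1},
    {"id": "outrage_bait", "timestamp": 1, "predicted_watch_seconds": 90,
     "predicted_comments": 40, "predicted_shares": 25},
]

print([p["id"] for p in rank_feed(posts)])
print([p["id"] for p in rank_feed(posts, "chronological")])
```

Under the engagement objective the high-interaction post jumps ahead of the newer, calmer one; the "neutral" objective function is itself deciding what gets seen.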
Okay, but that's not always true. | ||
I'll give you an example for us. | ||
We did a podcast with Kanye West. | ||
Kanye West was running for president, right? | ||
And if you were running for president and you were outside the... | ||
Like, for instance, Twitter banned Brett Weinstein's... | ||
Brett Weinstein had a... | ||
He had a Twitter account that was set up for... | ||
It was Unity 2020. And the idea was, like, instead of looking at this in terms of left versus right, Republican versus Democrat, let's get reasonable people from both sides, like a Tulsi Gabbard and a Dan Crenshaw. | ||
Bring them together and perhaps maybe put into people's minds the idea that, like, this idea, this concept of it has to be a Republican vice president or a Republican president. | ||
Maybe that's nonsense. | ||
And maybe it would be better if we had reasonable, intelligent people together. | ||
What is this? | ||
Third video. | ||
There's their video? | ||
Yeah. | ||
Well, it's a very rational perspective. | ||
It's not conspiracy theory driven. | ||
They got banned from Twitter. | ||
unidentified
|
For what? | |
For nothing. | ||
Just because they were promoting a third party. | ||
Because they were trying to come up with some alternative. | ||
The idea was this could siphon off votes from Biden. | ||
We want Biden to win because Trump is bad. | ||
This is the narrative, right? | ||
Yeah, I mean, I think that there's... | ||
Man, there's a lot here, but the... | ||
unidentified
|
Yeah. | |
But I was going to say, I got sidetracked, I'm sorry. | ||
Let me finish. | ||
Yeah. | ||
The Kanye West thing. | ||
So we had a podcast with Kanye West. | ||
It got, I don't know how many millions of views, but it was a lot. | ||
But it wasn't trending. | ||
And so, Jamie, you contacted the people at YouTube and asked them why it wasn't trending. | ||
What was their answer? | ||
It's not trending. | ||
Um... | ||
Like it didn't meet the qualifications they decided for trending or something like that? | ||
No, like, it did include everything you would assume, like you just said, all the interactivity, comments. | ||
It had more comments than any video we had. | ||
That's what I mean. | ||
Massive amounts of comments, massive amounts of views, but yet nowhere to be seen on the trending. | ||
But I don't think there was a person involved. | ||
Like, there was an algorithm involved that was trying to optimize for certain things. | ||
Not in this case. | ||
This specific case, yeah. | ||
There's a team there. | ||
There's separate teams at YouTube, from my understanding. | ||
Yeah, and this separate team had made a distinction. | ||
And I don't even know if they told the person who told me that what it was. | ||
So that person may not know either. | ||
So they just decided this is not worthy of trending. | ||
So you have arbitrary decisions that are being made by people, most likely because they feel that ideologically Kanye West is not aligned with... | ||
I mean, he was wearing the MAGA hat for a while. | ||
So they just decided this is not trending. | ||
But it is trending. | ||
It's clearly trending. | ||
You've got millions and millions and millions of people who are watching it. | ||
But I think this is the point, you know: whether it's people, whether it's algorithms, there are forces that are making decisions about what people see and what people don't see, and they're based on certain objectives that I think are most often business objectives. | ||
But not in this case. | ||
In this case, the business objective was if they wanted to get more eyeballs on it, they would want it to be trending. | ||
And people say, oh shit, Kanye West is on the JRE? Do people that like Kanye click on ads or not? | ||
There's a lot in there that we don't know. | ||
Oh, that's horseshit. | ||
Come on, bro. | ||
I don't know. | ||
Maybe they're making an ideological decision. | ||
Millions and millions. | ||
When you have a video that's being viewed by that many people, there's going to be a lot of goddamn people clicking on ads. | ||
No matter what. | ||
The other thing that these platforms want is for the content to be ad safe. | ||
It's like maybe advertisers don't... | ||
I don't know. | ||
But I think actually focusing on the outlying cases of this person was deplatformed, this person was intentionally, ideologically not promoted or de-emphasized or whatever. | ||
Shadow banning. | ||
I think that that, like, obfuscates or, you know, draws attention away from the larger thing that's happening, which is that, like, those things are happening just implicitly all the time. | ||
And that, like, it almost, like, serves to the advantage of these platforms to... | ||
Highlight the times when they remove somebody because what they're trying to do is reframe this is like, okay, well, yeah, we've got these algorithms or whatever. | ||
Don't talk about that. | ||
The problem is there's just these bad people, you know, and we have to decide there's a bad content from bad people and we have to decide, you know, what to do about this bad content and these bad people. | ||
And I think that distracts people from the fact that like the platforms are at every moment making a decision about what you see and what you don't see. | ||
I see what you're saying. | ||
So, there's more than one problem. | ||
There's a problem of deplatforming, because in many ways, deplatforming decisions are being made based on ideology. | ||
It's a certain specific ideology that the people that are deplatforming the other folks have that doesn't align with the people that are being deplatformed. | ||
These people that are being de-platformed, they have ideas that these people find offensive or they don't agree with. | ||
So they say, we're going to take you off. | ||
Yeah. | ||
Or sometimes they just find themselves in a trap, you know? | ||
A trap. | ||
Well, I think that there's a tendency for a lot of these platforms to try to define some policy about what it is that they want and they don't want. | ||
I feel like that's sort of a throwback to this modernist view of science and how science works and we can objectively and rigorously define these things. | ||
I just don't think that's actually how the world works. | ||
What do you mean? | ||
How so? | ||
I feel like we're just past that. | ||
That it's not like... | ||
First of all, I think science is not about truth. | ||
It's just not. | ||
It's about utility. | ||
unidentified
|
What do you mean? | |
Okay. | ||
It's like... | ||
I was taught Newtonian physics in high school. | ||
unidentified
|
Why? | |
It's not true. | ||
That's not how the universe works. | ||
But it's still useful, and that's why it's taught. | ||
Because you can use it to predict motion outcomes, that kind of thing. | ||
What's incorrect about Newtonian physics in the sense that they shouldn't be teaching it? | ||
I mean, today, you know, people believe that the truth is that, you know, there's like, you know, relativity, like gravity is not a force. | ||
There's like, you know, these planes and stuff, whatever, you know, that like there are other models to describe how the universe works. | ||
And Newtonian physics is considered outmoded. | ||
But it still has utility in the fact that you can use it to predict the... | ||
So you're talking about in terms of quantum physics and string theory and a lot of these more... | ||
Yeah, it's like relativity at the large scale, quantum physics at the small scale. | ||
And even those things are most likely not true in the sense that they aren't consistent with each other and people are trying to unify them and find something that does make sense at both of those scales. | ||
The history of science is a history of things that weren't actually true. | ||
You know, Bohr's model of the atom, Newtonian physics. | ||
People have these, you know, Copernicus's model of the solar system. | ||
People have these ideas of how things work. | ||
And the reason that people are drawn to them is because they actually have utility. | ||
That it's like, oh, we can use this to predict the motion of the planets. | ||
Oh, we can use this to send a rocket into space. | ||
Oh, we can use this to, you know, have better outcomes, you know, for some medical procedure or whatever. | ||
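The utility point can be made concrete: Newtonian mechanics may be superseded as a description of the universe, but its formulas still predict everyday motion to within measurement error. A minimal sketch using the standard drag-free kinematics result:

```python
import math

# Newtonian kinematics: not the universe's final word, but accurate enough
# to predict ordinary motion -- which is why it is still taught and used.

def projectile_range(speed_m_s, angle_deg, g=9.81):
    """Horizontal range of a projectile on flat ground, ignoring air drag."""
    angle = math.radians(angle_deg)
    return speed_m_s ** 2 * math.sin(2 * angle) / g

# A 20 m/s launch at 45 degrees: Newton predicts roughly 40.8 meters,
# and a tape measure would agree to well within the error from air drag.
print(round(projectile_range(20, 45), 1))
```

Relativity gives a "truer" picture, but for a thrown ball the Newtonian answer is indistinguishable in practice, which is exactly the utility being described.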
But it's not actually... | ||
I don't... | ||
I think it's not actually truth. | ||
Like, the point of it isn't truth. | ||
The point of it is that, like, we have some utility that we find in these things. | ||
And I think that... | ||
When you look at the emergence of science and people conceiving of it as a truth, it became this new authority that everyone was trying to appeal to. | ||
If you look at all of the 19th century political philosophy, I mean, okay, I think the question of truth is, like, you know, it's even a little squishy with the hard sciences, right? | ||
But once you get into, like, soft sciences, like social science, psychology, like, then it's even squishier, you know, that, like, these things are really not about truth. | ||
They're about, like, some kind of utility. | ||
And when you're talking about utility, the important question is, like, useful for what and to whom? | ||
You know? | ||
And I think that's just always the important question to be asking, right? | ||
Because, you know, when you look at, like, all the 19th century political writing, it's all trying to frame things in terms of science in this way that it just seems laughable now. | ||
But, you know, like, at the time, they were just like, we're going to prove that communism is, like, the most true, like, social economic system in the world, you know? | ||
Like, there are whole disciplines of that. | ||
People in... | ||
You know, people had like PhDs in that, you know, their whole research departments in the Soviet Union, people doing that. | ||
And we laugh about that now, but I don't think it's that different than like social science in the West, you know? | ||
And so I think, you know, it's like if you lose sight of that, then you try to, like, frame social questions in terms of truths. | ||
It's like, this is the kind of content that we want, and we can rigorously define that, and we can define why that's going to have the outcomes that we want it to. | ||
But once you get on that road, you're like, okay, well, terrorist stuff. | ||
We don't like terrorist stuff, so we're going to rigorously define that, and then we have a policy, no terrorist stuff. | ||
And then China shows up, and they're like, we've got this problem with terrorists, the Uyghurs. | ||
We see you have a policy. | ||
I think if people from the beginning acknowledged that all of objectivity is just a particular worldview and that we're not going to rigorously define these things in a way of what is true and what isn't, then I think we would have better outcomes. | ||
That's my weird take. | ||
I mean, I think, you know, from the perspective of Signal, you know, it's like, do you know what's trending on Signal right now? | ||
No. | ||
unidentified
|
Nothing. | |
Neither do I. No. | ||
Okay. | ||
But that's, it's on a social media platform. | ||
But isn't it, there's a weird thing when you decide that you have one particular ideology that's being supported and another particular ideology that is being suppressed. | ||
And this is what conservative people feel when they're on social media platforms. | ||
Almost all of them, other than the ones we talked about before, Parler and Gab and the alternative ones, they're all very left-wing in terms of the ideology that they support. | ||
The things that can get you in trouble on Twitter. | ||
What did you say? | ||
But then the President of the United States just constantly violated every policy that they had. | ||
But he's ridiculous. | ||
That's a ridiculous example, right? | ||
Because he's one person, and they've actually discussed this, that he and his tweets are more important. | ||
It's more important that they allow these tweets to get out. | ||
First of all, you can understand how fucking crazy this guy is. | ||
And second of all, it's newsworthy. | ||
He's the leader of the... | ||
And also, it would be very costly from a business perspective. | ||
Yes. | ||
Very likely. | ||
And kind of amazing that he didn't do anything along the way while he was witnessing people get deplatformed, and particularly this sort of bias towards people on the left and this discrimination against people on the right. | ||
There's people on the right that have been banned and shadow banned and blocked from posting things. | ||
You run into this situation where you wonder what exactly is a social media platform. | ||
It's just a small private company and maybe you have some sort of a video platform and there's only a few thousand people on it and you only want videos that align with your perspective. | ||
Okay, you're a private company. | ||
You can do whatever you want. | ||
But when you're the biggest video platform on earth like YouTube and you decide that you are going to take down anything that disagrees with your perspective on how COVID should be handled... | ||
Including doctors. | ||
This is one of the things that happened. | ||
Doctors that were stating, look, there's more danger in lockdowns. | ||
There's more danger in this than there is in the way we're handling it. | ||
There's more danger in the negative aspects of the decisions that are being made than it would be to let people go to work with masks on. | ||
Those videos just get deleted. | ||
Those videos get blocked. | ||
There's people that are opposed to current strategies with all sorts of different things, and those videos get blocked. | ||
So there's an ideological basis in censorship. | ||
And so you have to make a decision like, what are these platforms? | ||
Are these platforms simply just a private company, or is it a town hall? | ||
Is it the way that people get to express ideas? | ||
And isn't the best way to express ideas to allow people to decide, based on the better argument, what is correct and what's incorrect? | ||
Like, this is what freedom of speech is supposed to be about. | ||
It's supposed to be about, you have an idea, I have an idea, these two ideas come together, and then the observers get to go, hmm, okay, well, this guy's got a lot of facts behind him. | ||
This is objective reality. | ||
This is provable. | ||
And this other guy is just a crazy person who thinks the world's hollow. | ||
Okay? | ||
This is the correct one. | ||
There's going to be some people that go, no, there's a suppression of hollow earth and hollow earth is the truth and hollow earth facts and hollow earth theory. | ||
But you've got to kind of let that happen. | ||
You gotta kind of have people that are crazy. | ||
Remember the old dude that used to stand on the corners with the placards on, the world is ending tomorrow? | ||
They're still there. | ||
Yeah, but those are on Twitter now, right? | ||
But those people, no one said, you gotta get rid of that guy. | ||
You would drive by and go, look at this crazy fuck. | ||
Those crazy fucks making YouTube videos, those videos get deleted. | ||
I don't know if that's good. | ||
I kind of think that you should let those crazy fucks do that. | ||
Because it's not going to influence you. | ||
It's not going to influence me. | ||
It's going to influence people that are easily influenced. | ||
And the question is, who are we protecting and why are we protecting these people? | ||
Well, okay, but I think, in my mind, what's going on is, like, the problem is that it used to be that some person with very strange ideas about the world wearing a sign on the street corner shouting was just a person with very strange ideas about the world wearing a sign on the street corner shouting. | ||
Now, there's somebody, you know, with very strange ideas about the world, and those ideas are being amplified by a billion-dollar company, because there are algorithms that amplify that. | ||
And what I'm saying is that instead of actually talking about that, instead of addressing that problem, those companies are trying to distract us from that discussion by saying... | ||
Would the correct way to handle it be to make algorithms illegal in that respect? | ||
Like to not be able to amplify or detract? | ||
To not be able to ban, shadow ban, or just to have whatever trends trend. | ||
Whatever is popular, popular. | ||
Whatever people like, let them like it. | ||
And say, listen, this thing that you've done by creating an algorithm that encourages people to interact, encourages people to interact on Facebook, encourages people to spend more time on the computer, what you've done is you've kind of distorted what is valuable to people. | ||
You've changed it and guided it in a way that is ultimately, perhaps arguably, detrimental to society. | ||
So we are going to ban algorithms. | ||
You cannot use algorithms to dictate what people see or not see. | ||
You give them a fucking search bar, and if they want to look up UFOs, let them look up UFOs. | ||
But don't shove it down their throat because you know they're a UFO nut. | ||
Don't curate their content feed. | ||
Yeah, I mean, I think it's okay. | ||
It's complicated because, one, I have no faith in, like when you say ban or make it illegal or whatever, I have zero faith in the government being able to handle this. | ||
Yeah, nor do I. Every time I see a cookie warning on a website, I'm like, okay, these are not the people that are going to get this right. | ||
This is what they've given us after all this time. | ||
These people are not going to solve this for us. | ||
And also, I think a lot of the dissatisfaction that people feel, the discomfort that people feel, and the concern that people have is a concern about power. | ||
That right now, these tech companies have a lot of power. | ||
And I think that the concern that is coming from government is the concern for their power. | ||
The right has made such a big deal about deplatforming. | ||
And I think it's because they're trying to put these companies on notice. | ||
It's like, fuck with us. | ||
We will take power. | ||
But they've done nothing about it. | ||
Don't you think that they've actually made a big deal about deplatforming because the right has been disproportionately deplatformed? | ||
I think the right is, like, doing fine. | ||
How so? | ||
I don't know. | ||
I don't know what the numbers are, but I feel like it's, like... | ||
Did you say that, though, because you're on the left? | ||
Yeah, but that's Trump. | ||
That's Trump. | ||
He's an anomaly. | ||
You can't really, you know... | ||
Okay, I think, I guess maybe, let me just reframe this to say that, like, I think it's interesting that we are, we've hit an inflection point, right? | ||
Where, like, the era of utopianism with regards to technology is over. | ||
Yeah. | ||
That it's just like, you know, after 2016, it was just like, big tech has zero allies anymore. | ||
You know, on the left, everyone's just like, you just gave the election to Trump, you know? | ||
And on the right, they're just like, you just removed somebody from YouTube for calling gay people an abomination. | ||
Fuck you. | ||
You know, like, They have no allies. | ||
No one believes in the better and brighter. | ||
No one believes that Google is organizing the world's information. | ||
No one believes that Facebook is connecting the world. | ||
And I think that there's an opportunity there. | ||
That we're in a better situation than we were before. | ||
All the cards are on the table. | ||
People are more and more understanding how it is that these systems function. | ||
I think we increasingly see that people understand that this is really about power, it's about authority, and that we should be trying to build things that limit the power that people have. | ||
If you had your wish, if you could let these social media platforms, whether it's video platforms like YouTube or Facebook or Twitter... if you had the call, if they called you up and said, Moxie, we're going to let you make the call. | ||
What should we do? | ||
How should we curate this information? | ||
Should we have algorithms? | ||
Should we allow people? | ||
Should we just leave it open to everything? | ||
Everything and anybody. | ||
What should we do? | ||
Well, I mean, this is what we're trying to do with Signal. | ||
But it's different, right? | ||
Because you're just a messaging app. | ||
We're just a messaging app. | ||
No, I don't say that. | ||
It's a very good messaging app that I use. | ||
No, I understand what you're saying. | ||
But I think the way that messaging apps are going, there's a trajectory where a project like Signal becomes more of a social experience. | ||
And that, like, the things that we're building extend beyond just, like, you know, sending messages. | ||
Particularly, I think, as more and more communication moves into group chats and things like that. | ||
And, you know, the foundation that we're building it on is a foundation where we know nothing. | ||
You know, it's like, if I looked up your Signal account record right now of, like, all the information that we had about you on Signal, there's only two pieces of information. | ||
The date that you created the account and the date that you last used Signal. | ||
That's it. | ||
That's all we know. | ||
If you looked on any other platform, your mind would be blown. | ||
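Moxie's description of Signal's server-side account record can be sketched roughly like this (a hypothetical illustration for clarity, not Signal's actual server code; the type and field names are invented):

```python
from dataclasses import dataclass, fields
from datetime import date

@dataclass(frozen=True)
class AccountRecord:
    """Hypothetical sketch of the only per-account data described above:
    the date the account was created and the date it last connected.
    No contacts, no messages, no record of who talks to whom."""
    created: date    # when the account was registered
    last_seen: date  # when the account last connected to the service

record = AccountRecord(created=date(2019, 3, 1), last_seen=date(2020, 11, 5))
print([f.name for f in fields(record)])  # ['created', 'last_seen']
```

The point of the sketch is what is absent: there is simply no field for message content, social graph, or activity history, so a subpoena (or a hack) can only ever surface those two dates.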
No, it's admirable what you're doing, and it's one of the reasons why I wanted to talk to you. | ||
I think that foundation gives us... | ||
Now that we have that foundation, there's a lot that we can build on it. | ||
Would you do a social media app? | ||
Well, I think, you know, some of the stuff that we're working on now of just like moving away from phone numbers, you can have like, you know, a username so that you can like post that more publicly. | ||
And then, you know, we have groups, now you have group links. | ||
And then, you know, maybe we can do something with events. | ||
And we can, you know, that's like, we're sort of moving in the direction from, like, an app that's good for communicating with connections you already have to an app that's also good for creating new connections. | ||
Would you think that social media would be better served with the algorithms that are in place and with the mechanisms for determining what's trending in place and for their trust and safety or whatever their content monitoring policy they have now or have it wide open? | ||
Wild West. | ||
I mean, I think it depends on when you say like better, you know, better for what, right? | ||
Better for humanity. | ||
Yeah, no. | ||
I think... | ||
Censorship is better? | ||
No, no, no. | ||
I think the problem... | ||
I think bad business models create bad technology, which has bad outcomes. | ||
You know, that's the problem we have today, right? | ||
So the problem is that there's a financial incentive for them to... | ||
That if we, you know, if you look at, like, the metrics, you know, that we talked about, like, you know, what Facebook cares about is just, like, time that you spent looking at the screen on Facebook, you know? | ||
Like, if we were to have metrics, if Signal were to have metrics, you know, our metrics would be, like, what we want is for you to use the app as little as possible, for you to actually have the app open as little as possible, but for the velocity of information to be as high as possible. | ||
So it's like you're getting maximum utility. | ||
You're spending as little time possible looking at this thing while getting as much out of it as you can. | ||
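The metric Moxie describes, maximum utility per unit of screen time, could be written as a simple ratio (a hypothetical formulation for illustration; Signal has not published such a metric, and `utility_ratio` is an invented name):

```python
def utility_ratio(messages_exchanged: int, seconds_in_app: float) -> float:
    """Hypothetical 'velocity of information' metric: useful units of
    communication divided by time spent looking at the app.
    Higher is better -- the inverse goal of an engagement metric,
    which rewards raw time on screen."""
    if seconds_in_app <= 0:
        raise ValueError("seconds_in_app must be positive")
    return messages_exchanged / seconds_in_app

# Same 30 messages: two minutes in-app scores higher than twenty
# minutes of scrolling to exchange the same information.
print(utility_ratio(30, 120))   # 0.25
print(utility_ratio(30, 1200))  # 0.025
```

Optimizing this ratio pushes a product toward notifications and fast message delivery rather than infinite feeds, which is the design difference being described.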
How could that be engineered, do you think? | ||
That's what we're trying to do. | ||
So you're trying to do that with a social media app as well? | ||
Well, I mean, you know, we're sort of moving in that direction, right? | ||
And it's like, and I think once you start from the principle of like, well, we don't have to have infinite growth. | ||
We don't actually have to have profit. | ||
We don't have to return. | ||
We're not accountable to investors. | ||
We don't have to, you know, satisfy public markets. | ||
We also don't have to build a pyramid scheme where we have like, you know, 2 billion users so that we can monetize them to like, you know, a few hundred thousand advertisers so that we can, you know, like we don't have to do any of that. | ||
And so We have the freedom to pick the metrics that we think are the ways that we think technology should work, that we think will better serve all of us. | ||
So what would better be served is a bunch of wild hippies like yourself that don't want to make any money at all, put together a social media app. | ||
If you work at Signal, you get paid. | ||
Oh yeah, I'm sure. | ||
But I don't mean the company itself as a corporation. | ||
You get paid, but that's it. | ||
And how do you generate the income? | ||
Well, you know, we do it by, like, tying ourselves to a community of users instead of advertisers, right? | ||
So, where's the money coming from, though? | ||
From people who use Signal. | ||
So, similar to, like... | ||
Do they pay for it? | ||
No, no, it's, like, donation-based. | ||
It's similar to, like, Wikimedia. | ||
Oh, okay. | ||
You know, it's, like, you know, Wikipedia exists. | ||
There's no company. | ||
There's no... | ||
Well, that would be great if they could figure out a way to develop some sort of social media platform that just operated on donations and could rival the ones that are operating on advertising revenue. | ||
Because I agree with you that that creates a giant problem. | ||
And that's what we're working on, slowly. | ||
So you just look at it in terms of bad business model equals bad outcome. | ||
That's how you look at all these. | ||
And it's also, by the way, why we have people mining cobalt in Congo. | ||
And you don't think that they can regulate their way out of this situation. | ||
With technology, I'm not super optimistic, yeah. | ||
Just based on, you know, and even the hearings, you know, it's just like amateur hour. | ||
So do you think that... yes, the hearings were amateur hour, and there were some ridiculous questions. | ||
Yeah, I mean, it's just like, they're talking to the wrong people. | ||
They don't understand how stuff works. | ||
unidentified
|
They don't know that it's not Google, that's Apple. | |
You've been prepped for this. | ||
Don't you have a team of people who... | ||
Yeah. | ||
Come on. | ||
Yeah. | ||
It's fascinating to watch, right? | ||
It's like your dad who doesn't know how to... | ||
How do I get the email? | ||
It's like these people are not going to save us, man. | ||
You know, and it's like anything that they do will probably just make things worse. | ||
Do you think that it's a valid argument that conservatives have though? | ||
That they're being censored and that their voice is not being heard? | ||
I know what you said in terms of, you know, that if someone had something on YouTube that said that gay people are an abomination and should be abolished, and they banned and deleted that video. | ||
I get that perspective. | ||
But I think there's other perspectives, like the Unity 2020 perspective, which is not in any way negative. | ||
Yeah, I mean, I don't know what happened with that, but I feel like what I... I think it could be a part of this thing of just like, well, we create this policy and we have these... | ||
You know, we define things this way, and then a lot of stuff just gets caught up in it. | ||
You know, where it's just like, now you're like taking down content about the Uyghurs because you wanted to do something else. | ||
You know, that if people would just be more honest about, like, there is not really an objectivity, and, you know, said, we're looking for these specific outcomes and this is why, then I think, you know, maybe we would have better results. | ||
Well, how does one fix this, though? | ||
How does one, like, you worked at Twitter, you kind of understand these better than most, these social media platforms. | ||
How would one fix this? | ||
If they hired you, if they said, hey, Moxie, we're kind of fucked. | ||
We don't know how to fix this. | ||
Well, uh... | ||
Is there a way? | ||
Because it seems like they make so much money. | ||
Yeah. | ||
If you came along and said, yeah, well, you gotta stop making money. | ||
They'd be like, get rid of that fucking nut. | ||
Exactly, exactly. | ||
Look at him, goddamn sailor. | ||
Yeah. | ||
What's he talking about? | ||
What is he talking about? | ||
Fuck out of here. | ||
Stop making money. | ||
Yeah. | ||
What, you wanna play rock, paper, scissors? | ||
unidentified
|
Yeah. | |
You're crazy, man. | ||
How do you fix this? | ||
I mean, one thing I'm actually a little encouraged by is the organizing unionization stuff that's been happening in the tech industry. | ||
So there's been a couple of walkouts and there's some increased communication among tech workers. | ||
Normally you think about... | ||
I'm not totally aware of this. | ||
What have they been organizing and unionization about? | ||
Well, normally you think about unionization as a process for improving material conditions for workers. | ||
And there's some aspect of this in the organizing that's been happening. | ||
Where have they been doing this? | ||
Google is the big, where a lot of the activity has happened, but it's happening across the industry. | ||
What are their objectives at Google? | ||
At Google, there were some walkouts. | ||
The objectives... | ||
You should talk to Meredith Whitaker about this, actually. | ||
She's really smart and has a lot to say. | ||
Shout out to Meredith. | ||
She and other people working there and... | ||
They were organizing for, one, trying to apply the protections that full-time workers and benefits of full-time workers there had to a lot of the temporary workers, like the people who work in security, the people who are working in the cafeteria, the people who are driving buses and stuff like that, who are living a lot more precariously. | ||
But also for creative control over how the technology that they're producing is used. | ||
So Google was involved in some military contracts that were pretty sketch. | ||
Like applying machine learning AI stuff to military technology. | ||
And then finally, there had been a lot of high profile sexual harassment incidents at Google where the perpetrators of sexual harassment were usually paid large severances in order to leave. | ||
And so they had a list of demands. | ||
And they, like a lot of people walked out. | ||
I don't know what the numbers were, but a lot of people, they managed to organize internally and walked out. | ||
And I think stuff like that is encouraging because, you know, it's like we look at the hearings and it's like the people in Congress don't even know who's the right person to talk to. | ||
You know, it's like, you know, old people talking about... | ||
But isn't that another issue where you're going to have people who have an ideological perspective? | ||
And that may be opposed to people that have a different ideological perspective, but they're sort of disproportionately represented on the left in these social media corporations. | ||
When you get kids that come out of school, they have degrees in tech, or they're interested in tech, they tend to almost universally lean left. | ||
Maybe, but I think most... | ||
Like, when it comes to the technology, I don't think people are... | ||
I think what almost everyone can agree on is that the amount of money and resources that we're putting into surveillance, into ad tech, into these algorithms that are just about increasing engagement, that they're just not good for the world. | ||
And if you put a different CEO in charge... | ||
That person's just going to get fired. | ||
But if the entire company organizes together and says, no, this is what we want. | ||
This is how we want to allocate resources. | ||
This is how we want to create the world, then you can't fire all those people. | ||
I understand what you're saying. | ||
So they'd have to get together and unionize and have a very distinct mandate, very clear that we want to go back to do no evil or whatever the fuck it used to be. | ||
Right. | ||
Yeah, where they don't really have that as a big sign anymore. | ||
Do you think that would really have an impact, though? | ||
I mean, it seems like the amount of money, when you find out the amount of money that's being generated by Google and Facebook and YouTube, the numbers are so staggering that to shut that valve off, to like... | ||
To shut that spout, good luck. | ||
It's almost like it had to have been engineered from the beginning, like what you're doing at Signal. | ||
Like someone had to look at it from the beginning and go, you know what, if we rely on advertiser revenue, we're going to have a real problem. | ||
Yeah. | ||
And I think, but I think it's, yeah, exactly. | ||
I mean, you know, I think you're right. | ||
And there's, you know, part of the problem with just relying on tech workers to organize themselves is that they are shareholders of these companies. | ||
You know, they have a financial stake in their outcome. | ||
And so that influences the way that they think about things. | ||
But... | ||
I think another aspect to all of this is that I think people underestimate just how expensive it is to make software. | ||
And another thing that I think would really improve things is making software cheaper. | ||
Right now, it's moving in the opposite direction. | ||
It's getting harder, more expensive to produce software. | ||
How so? | ||
It used to be that if you wrote a piece of software, you just wrote it once for the computer. | ||
And then that was your software. | ||
Now if you want to write a piece of software, you have to write it at least three times. | ||
You have to write it for iPhone. | ||
You have to write it for Android. | ||
You have to write it for the web. | ||
Maybe you need a desktop client. | ||
So it's like you need three or four times the energy that you used to have. | ||
And the way that software works... | ||
Not worth going into. | ||
But it's getting more expensive. | ||
What do you personally use? | ||
Are you one of those minimalist dudes? | ||
I notice you have a little tiny notebook here. | ||
Oh, yeah, yeah. | ||
And then you have two phones. | ||
Yeah, I'm like... | ||
I have to have... | ||
I try to be like... | ||
I just want to... | ||
You're also one of those renegades with no phone case. | ||
Oh, yeah, man. | ||
I feel like that's like... | ||
You and Jamie should get together and talk about it. | ||
He's radical. | ||
I mean, it's like people... | ||
You know, it's like industrial designers put all of that effort into creating that thing. | ||
Yeah, it's made of fucking glass and it costs a thousand bucks. | ||
If you drop it with this thing on it, it doesn't get hurt. | ||
And see this? | ||
This little thing right here? | ||
See, I stick my finger in there and then I can use it. | ||
I can text better. | ||
Really good. | ||
Yeah. | ||
And then also, if I want to watch a video, that'll prop it up. | ||
unidentified
|
You know? | |
You know how that works? | ||
Isn't that better than no case? | ||
I mean, some things I actually want to make more difficult for myself. | ||
But I have two phones just because I'm trying to... | ||
I always just want to keep tabs on how everything works everywhere. | ||
So you have an Android and an iPhone? | ||
Do you keep things stripped down? | ||
No, I'm pretty... | ||
I mean, I don't actually use... | ||
TikTok? | ||
Well, okay, my problem is that, like, I spend all day. | ||
I think, you know, sometimes I go through this thing where, like, cryptography will be, like, in the news or something. | ||
There'll be some, like, geopolitical thing that's happening. | ||
And someone like, you know, Vice or something will get in touch with me. | ||
And they'll be like, hey, we want to do a thing, like a video where, like, we follow you around for a day, like a day in the life. | ||
You know, it's because it's so exciting. | ||
Sounds good for them. | ||
Annoying for you. | ||
Well, the thing I'll usually write back is, like, okay, here's the video. | ||
Me sitting in front of a computer for eight hours. | ||
And they're like, oh, we can't make that video. | ||
No one would want to watch that. | ||
Yeah, what we need to do is take you to a yoga class and you go to an organic food store and you talk to people about their rights and then... | ||
Yeah, exactly. | ||
Unfortunately, I don't even want to watch the movie in my own life. | ||
So that is my life. | ||
I spend so much time looking at a computer for work that I... It's hard for me to continue looking at screens and stuff. | ||
Yeah, I can only imagine. | ||
But I try to be, like, a normal... | ||
Like, there's this, like, in the history of people who are, like, doing... | ||
Like, building cryptography, stuff like that, there was this period of time where the thesis was basically, like, all right, what we're going to do is develop really powerful tools for ourselves, and then we're going to teach everyone to be like us, you know? | ||
And that didn't work because, you know, we didn't really anticipate the way that computers were going. | ||
So I try to be, like, as normal as possible. | ||
You know, I just, like, have, like, a normal setup. | ||
I'm not, like, you know, I haven't... | ||
I used to have a cell phone where I soldered the microphone differently so there was a hard switch that you could turn it off. | ||
Really? | ||
You did that? | ||
Yeah, because you start thinking about how all this stuff works. | ||
Do you ever fuck around with Linux phones or anything like that? | ||
No. | ||
I try to be normal. | ||
I still do run Linux on a desktop just because I've been doing that forever. | ||
And you keep a moleskin for what? | ||
Just notes. | ||
You don't put them on your phone? | ||
Sometimes I do, but I like writing more, I guess. | ||
Okay, so you just do it just because you enjoy it. | ||
Yeah, but I guess, you know, you're right. | ||
Maybe, you know, like, I feel the forces of darkness are not going to compromise. | ||
Yeah. | ||
Do you feel like you have extra scrutiny on you because of the fact that you're involved in this messaging application that Glenn Greenwald and Edward Snowden and a bunch of other people that are seriously concerned with... | ||
security and privacy... that maybe people are upset at you? | ||
That you've created something that allows people to share encrypted messages? | ||
I mean, maybe. | ||
I mean, I think... | ||
Because you've kind of cut out the middleman, right? | ||
You've cut out the third-party door. | ||
Yeah. | ||
And I think... | ||
But in some ways, that means that there's less pressure on me because, you know, it's like if you're the creator of Facebook Messenger and your computer gets hacked, like, that's everyone's Facebook messages are, you know, gone. | ||
Yeah. | ||
And, you know, for me, if, like, my computer gets hacked, I can't access anyone's Signal messages whether I get hacked or not, you know? | ||
Right. | ||
And so I have sort of less liability in that sense. | ||
There was, like, a weird period of time where it was very difficult for me to fly commercially, like, on an airplane. | ||
And I don't know why. | ||
I think it had something to do with a talk that someone gave about WikiLeaks once, and they mentioned my name. | ||
And after that... | ||
You were getting flagged? | ||
Yeah, it was very annoying. | ||
I would go to the airport and I wouldn't be able to print a ticket at the kiosk. | ||
I had to go talk to a person. | ||
They had to call some phone number that would appear on their screen and then wait on hold for like 45 minutes to an hour to get approval. | ||
And then they would get approval to print the ticket. | ||
So you had to anticipate this when you travel? | ||
unidentified
|
Yeah. | |
So you had to get there way in advance? | ||
Way in advance. | ||
And then anytime I traveled internationally, on the way back through customs, they would seize all of the electronics that I had. | ||
Jeez. | ||
The US government would do this? | ||
Yeah. | ||
Customs and Border Protection. | ||
They would seize your shit and would you get it back? | ||
They would eventually send it back, but you just had to throw it out because it's not... | ||
Who knows what they did to it, you know? | ||
I would want to give it to someone and go, hey, tell me what they did. | ||
Could you do that? | ||
Is it possible to back-engineer whatever? | ||
I never spent time on it. | ||
How much time did they have your shit for? | ||
It would be like weeks. | ||
unidentified
|
Weeks?! | |
Weeks! | ||
Did you have to give them passwords and everything? | ||
Well, that's the thing. | ||
They would stop you, and they would be like, hey, we just need you to type in your password here so that we can get through the full disk encryption. | ||
And I would be like, no. | ||
And they would be like, well, if you don't do that, we're going to take this, and we're going to send it to our lab, and they're going to get it anyway. | ||
And I would be like, no, they're not. | ||
And they would be like, all right, we're going to take it. | ||
You're not going to have your stuff for a while. | ||
You sure you don't want to type in your password? | ||
I would be like, nope. | ||
And then it would disappear, and it would come back weeks later, and then it's like... | ||
How bizarre. | ||
Yeah. | ||
And with... | ||
There was no... | ||
They didn't have like a motive. | ||
There was no... | ||
That's the thing. | ||
You never know why. | ||
No, but I'm saying they didn't say, hey, you were... | ||
You're thought to have done this or there's some... | ||
No, they would always just be like, oh no, this is just random or whatever. | ||
But there would be two people at the exit of the plane with photographs of me, you know, waiting for me to step off the plane and they would escort. | ||
They wouldn't even wait for me to get to the... | ||
So did you have to have like a burner laptop? | ||
I just wouldn't travel with electronics, you know, because it was just... | ||
Even your phone? | ||
unidentified
|
Yeah. | |
Yeah, I gave them my phone. | ||
Oh, fuck. | ||
Wow. | ||
That was only internationally, though, because they can't do that domestically. | ||
So domestically, you just had long waits, and then they'd eventually give you a ticket? | ||
Yeah, they would eventually give you a ticket and then you'd get the selective screening where they would take all the stuff out of your bag and like, you know, filter out your car. | ||
They'd touch your dick too, right? | ||
And then at every connection, the TSA would come to the gate of the connecting thing, even though you're already behind security, and do it again at the connection. | ||
unidentified
|
Really? | |
Yeah, I don't know. | ||
It was weird. | ||
It was just like a... | ||
Connections too? | ||
Yeah, yeah. | ||
So they're trying to fuck with you? | ||
I think so, yeah. | ||
I don't know what that was. | ||
And how long did that last for? | ||
That was a couple years. | ||
Yeah. | ||
And when did it go away? | ||
The day it went away, were you like, ooh? | ||
Yep. | ||
Yeah, one day it just stopped. | ||
It really did change the game. | ||
What year did it go away? | ||
When Trump got into office? | ||
No, it was way before that. | ||
Yeah, I forget. | ||
Yeah, I forget. | ||
Yeah, I was thinking, actually, I was thinking on the way here, it's funny how, like, I remember after the last election, everyone was talking about, like, California leaving the United States. | ||
Like, California seceding. | ||
You remember that? | ||
Yeah, hilarious. | ||
And now everyone's talking about leaving California, like, after this election. | ||
Yeah, imagine that. | ||
President Newsom. | ||
Yeah. | ||
Locked down in a communist state. | ||
But do you remember... | ||
People discovered that the CalExit, the whole CalExit movement was started by a guy that lived in Russia. | ||
Oh, it was one of those IRA things? | ||
Internet Research Agency scams? | ||
But it wasn't. | ||
I actually tracked the guy down in Moscow one time. | ||
You tracked him down? | ||
He was just some guy. | ||
Did he do it for goof? | ||
No, he, like, really believes. | ||
That California should leave? | ||
Yeah, he, like, he lived in California and had been, for years, like, trying to foment this CalExit thing. | ||
And he has all the stats on, you know, why it would be better for California and all this stuff, you know. | ||
And then he sort of thought, well, this isn't working. | ||
And he really liked Russia for some reason. | ||
So he moved to Russia just before the election, not knowing what was going to happen. | ||
And then when Trump won, people were like, wait a second, fuck this. | ||
Maybe California should get out of here. | ||
And they just found this... | ||
Like, a campaign that already existed, and everyone sort of got behind it, and he was just like, oh shit, and he lives in Russia now, you know. But he, like, didn't really understand optics, I think. The way that everyone found out that he lived in Russia was that he opened a California embassy in Moscow. So they, like, announced, you know, CalExit has opened the first California embassy in a foreign country, but it was in Moscow, and this was right as all the Russian stuff was happening, you know. So | ||
if you're conspiratorially minded, you'd have drawn some incorrect conclusions. | ||
Yeah, he was just... | ||
I met with him. | ||
I, like, hung out with him for a day. | ||
I think he really genuinely just... | ||
So what was your motivation to hang out with this guy for a whole day? | ||
I mean, I was just fascinated, you know, because here's this guy who's, like, doing this kind of ambitious thing, and it just, the optics seem so bad, you know? | ||
Yeah. | ||
I think he reminded me of, like, the Hannah Arendt quote that's like, you know, if the essence of power is deceit, does that mean that the essence of impotence is truth? | ||
You know, that, like... | ||
He sort of believed that just, like, the facts were enough. | ||
You know, it's just, like, the stats of just, like, yeah, we spend this much money on, like, defense spending. | ||
If we, you know, if we stopped, you know, it's like we would have, like... | ||
So much money if California was a country. | ||
And we would still have, like, the fourth largest military in the world. | ||
And, you know, we would have, like... | ||
You know, it's just, like, the numbers actually are compelling, you know? | ||
And it was just sort of, like, that's... | ||
You know, people will just see the truth, you know? | ||
And I was like, dude, I think maybe you should, like, not live in Russia anymore, you know? | ||
It was... | ||
Dude. | ||
Why did he go to Russia? | ||
I don't know. | ||
He had been teaching English, and I think he just sort of ended up liking Russia. | ||
And so, yeah, he just decided to move there. | ||
I was on the way with a friend to Abkhazia. | ||
Have you ever heard of that place? | ||
No. | ||
It's an autonomous region of the country of Georgia. | ||
And it's kind of interesting. | ||
There's all these autonomous regions in the world that are essentially their own countries, you know, but they're not recognized by the UN or other countries, you know. | ||
Like Texas. | ||
You're in one right now. | ||
I mean, these places are like, you know, militarized border, like they have their own, you know. | ||
But they're not recognized by the UN. Yeah. | ||
And so they all recognize each other. | ||
And it's like, if you want to be a country, it's kind of interesting. | ||
You need a lot of stuff. | ||
You need a flag. | ||
You need a national bird. | ||
You need an anthem or whatever. | ||
And you need a soccer team. | ||
You definitely have to have a soccer team. | ||
Interesting. | ||
So these countries all have their own soccer teams, but they can't play in FIFA because they're not recognized by the UN. So FIFA can't recognize them. | ||
So they have their own league. | ||
It's like the league of unrecognized states and stateless peoples. | ||
And they have their own World Cup. | ||
And they have the World Cup in Abkhazia. | ||
How many different countries are there that are like this? | ||
There are a lot. | ||
How many? | ||
I mean, I don't know. | ||
I don't know how many teams are in this league, called CONIFA. | ||
I mean, it's 20 plus. | ||
So there's 20-plus unrecognized countries or autonomous regions. | ||
And also stateless people, so like the Kurds. | ||
People from the Chagos Islands, who were basically evicted for a U.S. military base and are now in diaspora. | ||
You know, places like Somaliland, Transnistria, South Ossetia, Lapalandia, you know, like, it's kind of interesting. | ||
So, I went with a friend to Abkhazia for the World Cup of all the unrecognized states. | ||
How was that? | ||
It was awesome. | ||
unidentified
|
Yeah? | |
It was like, yeah, it was really interesting. | ||
I mean... | ||
The smile on your face. | ||
This is the biggest smile you've had the entire show. | ||
It sounds like it was a great time. | ||
I mean, it just is so fascinating to me. | ||
And... | ||
I think it's, like, an interesting, you know, it's, like, in a way that I feel like, you know, society moves by, like, pushing at the edges, you know, that, like, it's the fringes that end up moving the center. | ||
I feel like, you know, looking at the margins of the way politics works is an interesting view of, like, how everything else works, you know, that, like, going to Abkhazia, it was so crazy getting there, you know, it's like, You know, we travel all through Russia. | ||
We get to this, like, militarized border. | ||
You go through these three checkpoints that aren't supposed to exist, but obviously exist. | ||
You know, you get to the other side, and it's just the same as where you just were. | ||
These guys fought a brutal civil war, you know, with, like, genocide, like, full-on crazy shit. | ||
unidentified
|
And... | |
It's just kind of the same. | ||
Was it worth it? | ||
What's the deal? | ||
I feel like it's this thing you see again and again: the institutions that we're familiar with, in the world that exists, are the institutions of kings. | ||
It's like police, military, a legal apparatus, tax collectors. | ||
Every moment in history since then has been about trying to change ownership of those institutions. | ||
Hmm. | ||
And it's always sort of dissatisfying, you know? | ||
And, like, you know, just seeing that happen again and again. | ||
And just, like, you know, realizing that it's like maybe what we should be doing is actually trying to get rid of these institutions or change these institutions in some way, you know? | ||
Don't you think there's a very slow rate of progress, but ultimately progress? | ||
If you follow Pinker's work, it looks at all the various metrics like murder, rape, racism, crime, all these different things. | ||
Over time, we're clearly moving in a better direction. | ||
Do you think it's just like... | ||
You know, I was listening to this podcast today. | ||
They were talking about religion, and it was discussing the Bible, and they were talking about all the different stories that are in the Bible, many of them hundreds of years apart, that were collected and put into it. | ||
Just stop and think about a book that was written literally before the Constitution was drafted, and that book is being introduced today as gospel. | ||
And that there's a new book that's going to be written 200 years from now, and that will be attached to the new version of the Bible as well. | ||
And then one day someone will come across this, and it will all be interpreted as the will and the words of God that all came about. | ||
Yeah. | ||
Yeah, yeah, yeah. | ||
But today, the spans of time are far shorter, like going from Alan Turing in 1952, being chemically castrated for being gay, to, in my lifetime, seeing gay marriage go from something that was very fringe when I was a boy living in San Francisco to universal across the United States today, | ||
at least mostly accepted by the populace, right? | ||
That this is a very short amount of time where a big change has happened. | ||
And that these changes are coming quicker and quicker and quicker. | ||
I would hope that this is a trend that is moving in the correct direction. | ||
Yeah, certainly there are some things that are getting better, yeah. | ||
And I feel like, to me, it's important to, you know, for a lot of those things, like the things you mentioned, like gay marriage, I think it's important to realize that, like, a lot of those, a lot of that progress would not have happened without the ability to break the law, honestly. | ||
Right, right. | ||
How would anyone have known that we wanted to allow same-sex marriage if no one had been able to have a same-sex relationship because sodomy laws had been perfectly enforced? | ||
How would we know that we want to legalize marijuana if no one had ever been able to consume marijuana? | ||
So I think a lot of the fear around increased surveillance is that that space dissipates. | ||
Yes. | ||
Yeah. | ||
But, you know, on the other hand, you know, it's like we're living in the apocalypse, you know, that it's like if you took someone from 200 years ago who used to be able to just walk up to the Klamath River and dump a bucket in the water and pull out, you know, 12 salmon and that was, you know, their food. | ||
And you were like, oh, yeah, the way it works today is you go to Whole Foods and it's $20 a pound and it's, you know, pretty good. | ||
You know, they'd be like, what have you done? | ||
Oh, my God. | ||
You used to be able to walk across the backs of the salmon, you know, across the whole river. | ||
Well, we're trying to avoid slipping even further into that apocalypse. | ||
I don't know if you've followed what's going on in the Bristol Bay of Alaska with the Pebble Mine. | ||
No. | ||
It's crazy. | ||
They're trying to... | ||
According to what Joe Biden said when he was running for office, when he's in office, that will not happen. | ||
But they're trying to do essentially the biggest mine in the world, one that would destroy the salmon population. | ||
It would destroy it. | ||
It would literally wipe out a gigantic, not just a gigantic industry, but a gigantic chunk of the salmon. | ||
I think it's... | ||
I forget which kind of salmon it is. | ||
I don't want to say it's Chinook. | ||
I forget what kind of salmon it is, but it's the biggest population. | ||
Sockeye, thank you. | ||
The biggest population of them, certainly in America, but I think in the world. | ||
I think it's responsible for an enormous number of jobs. | ||
Apparently there's fucking billions of dollars worth of gold and copper down there. | ||
Yeah. | ||
Earthworks. | ||
What's at stake? | ||
An average of 40 to 50 million wild salmon make the epic migration from the ocean to the headwaters of the Bristol Bay every year. | ||
Like, no place on Earth. | ||
The Bristol Bay watershed. | ||
They've been working to try to make this mine a reality for, I think, a couple of decades now. | ||
And people have been fighting tirelessly to educate people on what a devastating impact this is going to have on the ecology of that area and the fact that the environment will be permanently devastated. | ||
There's no way of bringing this back and there's no way of doing this without destroying the environment. | ||
Because the specific style of mining that they have to employ in order to pull that copper and gold out of the ground involves going deep, deep into the earth to find these reservoirs of gold and copper, and there's sulfur they have to go through, and then they have to remove the waste. | ||
And mining companies have invested hundreds of millions of dollars in this and then abandoned it. | ||
So they were like, we can't. | ||
We can't fucking do this. | ||
And then people are like, we can do it. | ||
And then they've got... | ||
And it's other companies that are... | ||
I don't believe the company that's currently involved in this is even an American company. | ||
I think it's a... | ||
It's a foreign company that's trying to... | ||
I think they're from Canada that are trying to do this spectacular... | ||
I don't know which company it is, but it's... | ||
My friend Steve Rinella from the Meat Eater podcast. | ||
I want to recommend this podcast because he's got a particular episode on that where he talks about it. | ||
Let me find it real quick. | ||
Because it's... | ||
It's pretty epic where he talks to this one guy who's dedicated the last 20 years of his life trying to fight this. | ||
Let me just find it real quick because it's pretty intense. | ||
And it's terrifying when you see how close it's come to actually being implemented and how if it happens, there's no way you pull that back. | ||
Once they do it... | ||
It's like all that Standing Rock shit. | ||
They were like, no, the pipeline's going to be fine. | ||
No way that it leaks into the water or whatever. | ||
It's like, sure enough. | ||
Exactly. | ||
unidentified
|
Unfortunately, I've already listened to it, so I'm having a hard time finding it in this app. | |
These motherfuckers. | ||
Did you find it? | ||
Yeah, previously played... | ||
Yeah, Half-Life of Never. | ||
It's the October 5th episode. | ||
That's a good title. | ||
Yeah, and the gentleman's name is Tim Bristol, which is kind of crazy. | ||
That is his birth name. | ||
His name is Tim Bristol, and he's dealing with his Bristol Bay situation. | ||
I mean, it's just a random coincidence. | ||
Yeah, and you read all that shit about the... | ||
Episode 241. Like when they were building all the dams in California, and it's just like the salmon just bashed themselves to death. | ||
They had to set them on fire. | ||
Seattle, yeah. | ||
Same thing that happened up in Seattle. | ||
These knuckleheads, they didn't understand the migration. | ||
These salmon won't go anywhere else. | ||
They have one specific river where they were born, and that's where they will die and spawn. | ||
Ugh, it's crazy. | ||
But these assholes that just want copper and gold are willing to do this. | ||
And there was this one politician in particular that has a gigantic windfall, if he can pull this off, or a lobbyist or whatever the fuck he is. | ||
But he stands to make, I think they said $14 million if he can actually get the shovels into the ground. | ||
That's how much he earns. | ||
So what are we going to do about it? | ||
Kill that guy! | ||
Assassination politics. | ||
Yes. | ||
Kill them all. | ||
No. | ||
I'm kidding. | ||
Don't get me in trouble. | ||
You can get banned off of YouTube for saying something like that. | ||
I'm joking. | ||
What should we do? | ||
We should make people aware of it, and make people aware that there are real consequences to allowing politicians to make decisions that will literally affect human beings for the rest of eternity. Because you will never have that population of salmon coming to that particular location, salmon that have been going there for millions and millions of years. And the reason why you won't have them there is because someone is greedy. | ||
It's really that simple. And I mean, we are getting along fine without that copper and without that gold. We are using the resource of the salmon, and people are employed that are enjoying that resource, and they're also able to go there and see the bears eating the salmon and see this incredible wild place. | ||
Alaska is one of the few really, truly wild spots in this country. | ||
Yeah. | ||
And someone might fuck that up. | ||
And if you get enough greedy assholes together, and they can figure out a way to make this a reality, and with the wrong people in positions of power, that's 100% possible. | ||
Yeah, you might even say we've organized the entire world economy to fuck that up. | ||
Yeah, yeah. | ||
But I think the question of agency, of how we affect these processes, is tough. | ||
Well, I was joking, obviously, about killing that person, but there was a recent one of the Iranian scientists was assassinated, and this brought up this gigantic ethical debate. | ||
And we don't know who did it, whether it was the Israelis. Mossad held a press conference to say, we didn't do it, while wearing t-shirts that said, we definitely did it. | ||
Assassinated Iranian nuclear scientists shot with remote-controlled machine gun. | ||
Holy fuck! | ||
unidentified
|
Holy fuck! | |
It was in broad daylight, which is what I was hearing about. | ||
Oh my god! | ||
Dude, we're killing people with robots now, right? | ||
That was the other Iranian guy that got killed, Soleimani, who was also killed with a drone. | ||
I mean, essentially... | ||
It says out of another car, but whatever. | ||
Oh, so a car was driving by and there was a remote-controlled machine gun? | ||
unidentified
|
Mm-hmm. | |
Fuck. | ||
It says he was in a bulletproof car, too. | ||
Wow, I don't know. | ||
He was in a bulletproof... | ||
Like, they knew they were going to kill this guy. | ||
Yeah, they did, man. | ||
Damn. | ||
So, this is the question. | ||
Oh, he got out of the car. | ||
unidentified
|
Oh. | |
Well, there you go. | ||
You fucked up. | ||
Stay in that bulletproof car. | ||
If you know that a man is going to... | ||
Like, what if someone did that to Oppenheimer? | ||
You know, what if someone said, hey, we see where this is going... | ||
And we need to find that Oppenheimer gentleman, and we need to prevent Little Boy from dropping down and killing how many people? | ||
Half a million people. | ||
What? | ||
He got shot by that remote. | ||
It was 164 yards away. | ||
Shot him and his bodyguard, and then the car they were in exploded. | ||
It lasted for three minutes. | ||
The whole thing was three minutes long. | ||
Wow. | ||
So there's this ethical dilemma. | ||
Like, if someone is actively trying to acquire nuclear weapons, and we think that those people are going to use those nuclear weapons, is it ethical to kill that person? | ||
And if that person's a scientist, they're not a... | ||
Yeah, I mean, I think the causality stuff is really hard to figure out, you know. | ||
But I think most of the time it's not about the one person, you know, that it's not, you know, maybe sometimes it is, but I think most, it's just like, I feel like assassination politics in the tech arena does not work, you know, that it's like you can get rid of all the people at the top of these companies and that's not what's going to do it, you know, that there are like these structural reasons why these things keep happening over and over again. | ||
Yeah, I think they're trying to slow it down, though, right? | ||
Like, this is the reason why... | ||
Do you remember when they deployed Stuxnet? | ||
Stuxnet, yeah, yeah. | ||
You know, I mean, that was for the same reason, right? | ||
They were trying to disarm the Iranian nuclear capabilities. | ||
Yeah, yeah, yeah. | ||
That was the same thing, where they... | ||
But that was kind of crazy. | ||
They were like, we didn't do it while wearing t-shirts. | ||
They were like, we definitely did this. | ||
unidentified
|
Yeah. | |
But they did that with a computer virus, right? | ||
Which is pretty fascinating. | ||
Yeah. | ||
And people didn't have a problem with that. | ||
Well, I think some people had a problem with that, obviously. | ||
Well, Iranians. | ||
Yeah, but also just like, okay... | ||
You know, you go down that road and, you know, where things can happen too. | ||
A great example is, so one of the things that came out in a lot of the documents that Snowden released was that the NSA had worked with a standards body called NIST in order to produce a random number generator that was backdoored. | ||
So random numbers are very important in cryptography, and if you can predict what the random numbers are going to be, then you win. | ||
And so the NSA had produced this random number generator that allowed them to predict what the random numbers would be because they knew of this one constant that was in there. | ||
They knew a reciprocal value that you can't derive just by looking at it, but they know because they created it. | ||
And they had what they called a nobody but us backdoor. | ||
NOBUS. | ||
Nobody but us backdoor. | ||
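As a rough illustration of how a "nobody but us" trapdoor can work: the real construction at issue, Dual_EC_DRBG, uses elliptic-curve points, but the same algebra can be sketched with ordinary modular exponentiation. Every constant, seed, and modulus below is a made-up toy value chosen so the code runs instantly; none of this is the actual NIST parameter set, and the real standard also truncates its outputs, which is part of why the trapdoor was hard to detect.

```python
# Toy analogue of a Dual_EC-style "nobody but us" backdoor, using
# exponentiation mod a small prime instead of elliptic-curve points.
# All values here are illustrative; real parameters are enormous.

p = 2_147_483_647          # public modulus (a Mersenne prime)
Q = 5                      # public constant
e = 123_456                # the designer's secret: P = Q^e mod p
P = pow(Q, e, p)           # the second public constant

def rng_step(state):
    """One step of the toy generator: emit an output, advance the state."""
    output = pow(Q, state, p)      # what the consumer of the RNG sees
    next_state = pow(P, state, p)  # kept secret inside the generator
    return output, next_state

def backdoor_predict(output):
    """Knowing e, one public output reveals the next internal state:
    output^e = Q^(s*e) = (Q^e)^s = P^s = next_state."""
    return pow(output, e, p)

state = 31337                      # the generator's secret seed
out1, state = rng_step(state)      # the attacker observes out1 only

# The attacker recovers the new internal state from the public output...
recovered = backdoor_predict(out1)
assert recovered == state

# ...and can now predict every future output.
predicted, _ = rng_step(recovered)
actual, state = rng_step(state)
assert predicted == actual
```

This is also why swapping the one constant matters, as described below for Juniper: whoever replaces `P` with their own `P' = Q^e'` inherits the same prediction power for their own secret `e'`.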
And they got NIST to standardize this thing, and then they got a company called Jupyter, who makes routers and VPNs and stuff like that. | ||
Juniper, sorry. | ||
To include this in their products. | ||
And so the idea was that, like, the NSA would have these capabilities, they had developed, you know, these vulnerabilities that they could exploit in situations like this, you know, that they could, like, take advantage of foreign powers and stuff like that in ways that wouldn't boomerang back at them. | ||
But what happened was, in, I think, the early 2010s, Juniper got hacked, and somebody secretly changed that one parameter, basically swapping the backdoor for a different one that they knew the reciprocal value to. | ||
And it's most likely China or Russia that did this. | ||
And then what's kind of interesting is there was a big incident where the OPM, the Office of Personnel Management, I think, was compromised. | ||
And they have records on, you know, foreign intelligence assets and stuff like that. | ||
Their systems were compromised, it seems like, maybe by China. | ||
And what's sort of interesting is that they were running the Juniper networking gear that had been, you know, hacked in this one specific way. | ||
And so it's kind of possible that, like, you know, the NSA developed this backdoor that they were going to use for situations like this, you know, against foreign adversaries or whatever, and that the whole thing just boomeranged back at them, and the OPM was compromised as a result. | ||
Wow. | ||
But this is like, I don't know, I think it's, You know, it's easy to look at things like Stuxnet and stuff like that and just be like, yeah, this is harm reduction or whatever, you know, but like in the end, it can have real-world consequences. | ||
And this is also why people are so hesitant when, you know, the government is always like, well, why don't you develop a form of cryptography where it works except for us, you know, weaken the crypto. | ||
And it's like, well, this is why. | ||
Because if you can access it, if anybody can access it, somehow that's going to boomerang back at you. | ||
Well, I remember when there was a terrorist attack in Bakersfield, California. | ||
Is that where it was? | ||
I think it was Bakersfield. | ||
Yeah. | ||
San Bernardino. | ||
San Bernardino, thank you, yeah. | ||
And there was an iPhone involved, and Apple wouldn't open it for them. | ||
It wouldn't allow the FBI to have access to it, and people were furious. | ||
And they were like, if this starts here, this does not end well. | ||
And I kind of saw their point, but I kind of saw the FBI's point too. | ||
Like, did you just open this one? | ||
This guy's clearly a murderer, has killed a ton of people, and created this terrorist incident. | ||
Yeah, but I mean, it was a little disingenuous too, right? | ||
Where it's like, the FBI had their entire iCloud backup for this device. | ||
The only thing they didn't have was the previous two hours or something like that. | ||
And the reason they didn't have it is because they fucked up and approached it in the wrong way and got themselves locked out of it. | ||
Oh, really? | ||
It was their own mistake that led to the situation where they didn't have the iCloud backup. | ||
So then it's like, what are you really going to get off this phone? | ||
The actual possibility of what was there was extremely marginal. | ||
So do you think what they really want is the tools to be able to get into other people's phones? | ||
They've just been waiting for the moment of like, okay, here we go. | ||
We got terrorists. | ||
That makes sense. | ||
What did you think like when the State Department or whoever it was banned Huawei phones? | ||
Yeah. | ||
Did you think there was... | ||
I mean, yeah, it's mostly political, right? | ||
Like it's... | ||
It's complicated, right? | ||
Because there's, like, you know, companies like Huawei and, you know, the people who make TikTok. | ||
Like, they're, yeah, they're doing, like, all the sketchy shit. | ||
But it's the same sketchy shit that, like, all of Silicon Valley is doing, you know? | ||
Like, it's not... | ||
Is it really? | ||
Is that a valid comparison to what they're doing in Silicon Valley? | ||
Like, Huawei did have routers that had third-party access, apparently, and they were shown that information was going to a third party that was not supposed to be, right? | ||
Wasn't that part of the issue? | ||
Am I reading this wrong? | ||
Well, okay, I think there's a couple... | ||
There have been incidents where it's like, yeah, there's data collection that's happening. | ||
Well, there's data collection happening in all Western products, too. | ||
And actually, the way the Western products are designed are really scary. | ||
In the telecommunications space, there's a legal requirement called CALEA, the Communications Assistance for Law Enforcement Act, or something like that, that requires telecommunications equipment to have... | ||
To have eavesdropping, like surveillance stuff built into it, like when you produce the hardware, in order to sell it in the United States, you have to have... | ||
Like which hardware? | ||
Like phone switches and stuff. | ||
It's like when you make a normal phone call, it has to have... | ||
I forget what they call it. | ||
The ability to tap. | ||
Yeah, they call it something else. | ||
But it has to have this ability to record conversations, intercept... | ||
Lawful intercept, that's what it's called. | ||
How does a signal call work? | ||
So signal calls work not using the traditional telecommunications infrastructure. | ||
It is routing data over the internet. | ||
And that data is end-to-end encrypted, so nobody can eavesdrop on those calls, including us. | ||
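The end-to-end property can be sketched in a few lines. This is emphatically not the actual Signal protocol, which layers X3DH key agreement and the Double Ratchet on top of this basic idea; it's a bare Diffie-Hellman toy with made-up parameters and a throwaway XOR cipher, only to show why a server relaying the traffic sees nothing it can decrypt.

```python
import hashlib
import secrets

# Toy group: 2**521 - 1 is prime, but this is NOT a vetted parameter
# choice; real systems use standardized groups or elliptic curves.
p = 2**521 - 1
g = 3

def keypair():
    """Pick a private exponent; publish g^priv mod p."""
    priv = secrets.randbelow(p - 2) + 1
    return priv, pow(g, priv, p)

def shared_key(my_priv, their_pub):
    """Both sides compute g^(a*b) mod p and hash it into a symmetric key."""
    s = pow(their_pub, my_priv, p)
    return hashlib.sha256(s.to_bytes(66, "big")).digest()

def xor_stream(key, data):
    """Throwaway cipher: XOR with a SHA-256-derived keystream."""
    stream, counter = b"", 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

a_priv, a_pub = keypair()          # one endpoint
b_priv, b_pub = keypair()          # the other endpoint

# Only the public values cross the network, so a server relaying the
# exchange never learns the key.
key_a = shared_key(a_priv, b_pub)
key_b = shared_key(b_priv, a_pub)
assert key_a == key_b

packet = xor_stream(key_a, b"voice frame 0001")
assert xor_stream(key_b, packet) == b"voice frame 0001"
```

The point of the sketch is the asymmetry: the relay sees `a_pub`, `b_pub`, and `packet`, but deriving `key_a` from the public values alone is the discrete-log problem.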
But so communication equipment that is produced in the United States has to have this so-called lawful intercept capability. | ||
But what's crazy about that is that's the same, you know, it's like these are U.S. companies and they're selling that all around the world. | ||
So that's the shit that gets shipped to UAE. Yeah. | ||
You know, so it's like it's the secondary effect thing of like the United States government was like, well, we're going to be responsible with this or whatever. | ||
We're going to have warrants or whatever. | ||
And even that's not true. | ||
And then that same equipment gets shipped to tyrants and repressive regimes all over the place. | ||
And they just got a ready-made thing to surveil everyone's phone calls. | ||
So it's like, I don't know, it's hard to indict Huawei for acting substantially different than the way, than, you know, whatever, the US industry acts. | ||
It's just, certainly they have a different political environment and, you know, they are much more willing to use that information to do really brutal stuff. | ||
Well, it wasn't just that they banned Huawei devices, but they also banned them from using Google. | ||
That's when I thought, like, wow, this is really... | ||
Like, what do they know? | ||
Or what has been... | ||
Oh, Google... | ||
Yeah. | ||
Well, Google has... | ||
No. | ||
So, you know, Android... | ||
You're talking about, like, so-called Android devices. | ||
They can't use the Android operating system anymore. | ||
They have to now... | ||
They've developed their own operating system, and now they have their own ecosystem, they have their own app store, the whole deal. | ||
Yeah. | ||
But that's also... | ||
That's a business thing, you know, where it's, like, Google's control over... | ||
Google is producing this software, Android, and it's just free. | ||
They're releasing it. | ||
But they want to maintain some control over the ecosystem because it's their thing that they're producing. | ||
And so they have a lot of requirements. | ||
It's like, okay, you can run Android. | ||
Oh, you want all this other stuff that we make that's not part of just the stock-free thing, like Play Services and all the Google stuff. | ||
Increasingly more and more of Android is just getting shoved into this proprietary bit. | ||
And they're like, okay, you want access to this? | ||
Then it's going to cost you in these ways. | ||
And I think it probably got to the point where Huawei was just like... | ||
We're not willing to pay, you know, even either monetarily or through whatever compromise they would have to make, and they were just like, we're gonna do our own thing. | ||
I thought it was because of the State Department's boycott. | ||
Oh, it could have also been that there was a legal requirement that they stopped doing it, yeah. | ||
Yeah, I think I might be... | ||
Jamie will find out. | ||
I think I might be right, but I'm not sure though. | ||
But it just made me think, like, I understand that there's a sort of connection that can't be broken between business and government in China, and that business and government are united. | ||
It's not like, you know, like Apple and the FBI, right? | ||
Yeah. | ||
In China, they would just give them the phone. | ||
Oh yeah, of course. | ||
They developed the phone. | ||
They would already have the tools to get into it. | ||
They wouldn't have to have this conversation. | ||
Yeah, exactly. | ||
They just send it directly to the people. | ||
What we're terrified of is that these relationships that business and government have in this country, they're getting tighter and tighter intertwined. | ||
And we look at a country like China that does have this sort of inexorable connection between business and government, and we're terrified that we're going to be like that someday. | ||
Yeah. | ||
Yeah. | ||
Is it just it? | ||
It is what it is? | ||
Yeah. | ||
I mean, and that's, I think, you know, a lot of what Snowden was revealing. | ||
Yes. | ||
It was like, you know, that there are already these relationships, you know. | ||
You know, the NSA called it PRISM. And, you know, tech companies just called it, like, the consoles or whatever they had built for these, you know, for these requests. | ||
But it's... | ||
That's... | ||
Yeah, it's happening. | ||
And I don't... | ||
Also, you know, it's sort of like... | ||
I think a lot of people, a lot of nations look at China and are envious, right? | ||
Where it's like, they've done this thing where they just, you know, they built like the Great Firewall of China, and that has served them in a lot of ways. | ||
You know, one, surveillance, obviously, like they have total control of everything that appears on the internet. | ||
So not just surveillance, but also content moderation, propaganda. But then also, it allows them to have their own internet economy. | ||
China is large enough that they can have their own ecosystem. | ||
People don't use Google there. | ||
They have their own chat apps. | ||
They have their own social networks. | ||
They have their own everything. | ||
And I think a lot of nations look at China and they're just like, huh, that was kind of smart. | ||
It's like you have your own ecosystem, your own infrastructure that you control, and you have the ability to do content moderation, and you have the ability to do surveillance. | ||
And so I think the fear is that there's going to be a balkanization of the internet where Russia will be next and then every country that has an economy large enough will go down the same road. | ||
Was it, Jamie? | ||
There's a couple things that happened that are what you're saying, but directly seems to be related to this. | ||
Sweeping crackdown on facial recognition tech. | ||
House and Senate Democrats on Tuesday rolled out legislation to halt federal use of facial recognition software and require state and local authorities to pause any use of the technology to receive federal funding. | ||
The Facial Recognition and Biometric Technology Moratorium Act introduced Thursday. | ||
It marks one of the most ambitious crackdowns on facial recognition technology. | ||
This has to do with that? | ||
It said it was part of this boycott that had to do with Google's, like, antitrust suit. | ||
That also had to do with Facebook, and they were looking into it. | ||
This was from, like, a month ago. | ||
I mean, I think this is connected to what you're saying, just in the sense that, like... | ||
You know, the people who are producing that facial recognition technology, it's not the government. | ||
It's, you know, Valenti or whoever sells services to the government. | ||
And then, you know, the government is then deploying this technology that they're getting from industry and in kind of crazy ways. | ||
Like, there's the story of the Black Lives Matter protester who the police, you know, the NYPD, not the FBI, the NYPD, tracked to his house using facial recognition technology. | ||
And so, yeah. | ||
How did they do that? | ||
There's a story about it. | ||
I've been finding stories. | ||
No one knows what these things are. | ||
There's things supposedly all over New York City and Manhattan that are tracking everybody's face as soon as they go in there. | ||
I've watched news videos from local New York, local media, asking people, have you seen these? | ||
What are they? | ||
They get no answers. | ||
Well, here's what's hilarious. | ||
Crime has never been higher. | ||
New York City crime right now is insane. | ||
That shit's not doing anything. | ||
Yeah. | ||
Well, everyone's wearing a mask, too. | ||
That's also part of the problem. | ||
But I think the fear is that there's this circle of industry-producing technology that is going into government. | ||
Stuff like facial recognition technology just makes existing power structures much more difficult to contest. | ||
Do you use facial recognition on your phone? | ||
No. | ||
I don't have any apps or anything that use it. | ||
You don't know with your iPhone? | ||
unidentified
|
Oh, no. | |
I just have a PIN. | ||
Oh, you don't use it. | ||
What's going on, Jeremy? | ||
New York City Police Department uses facial recognition software to track down a Black Lives Matter activist accused of assault after allegedly shouting into a police officer's ear with a bullhorn. | ||
That's it? | ||
What about that guy who punched Rick Moranis, you fucks? | ||
They found him. | ||
They did? | ||
Yeah. | ||
Like last week. | ||
Right in jail. | ||
But they did find him. | ||
How'd they find him? | ||
They have facial recognition, Joe. | ||
But he wore a mask. | ||
I don't know. | ||
Anyway. | ||
Listen, I think what you're doing is very important. | ||
And I love the fact that you approach things the way you do. | ||
And that you really are this idealistic person that's not trying to make money off of this stuff. | ||
And you're doing it because you think it's the right thing to do. | ||
If there is a resistance, people like you are very important. | ||
What you've done by creating Signal, it's very important. | ||
There's not a lot of other options, and there's no other options that I think are as secure or as viable. | ||
Thank you. | ||
Thanks. | ||
I appreciate you saying that. | ||
And I support it, and I try to tell other people to use it as well. | ||
Last word. | ||
Do you have anything to say to everybody before we wrap this up? | ||
That's a lot of pressure. | ||
Sorry. | ||
Can I put out a public plea for a project I'm trying to work on? | ||
Sure. | ||
Okay, I'm vaguely obsessed with this thing that happened in the 60s. | ||
Are you familiar with the Soviet space dogs? | ||
So the first animal in space was a dog named Laika. | ||
Laika died in space, sadly. | ||
The second animal in space was a dog called Strelka. | ||
Strelka went to space, made it back to Earth, and had puppies. | ||
Whoa, those puppies can read minds. | ||
When Khrushchev came to visit JFK in 1961, he brought with him the ultimate insult gift, which was one of the puppies. | ||
That's an insult? | ||
Oh, dude. | ||
It's like, oh, do you have anything that's been to space? | ||
We have extra puppies. | ||
You know, do you want one? | ||
You know? | ||
That's an insult? | ||
Dude, it's the ultimate insult gift. | ||
Like, the United States had no space program, had never been, the Soviet Union was, like, way ahead of them. | ||
They're like, oh, we've just got extra animals that have been to space. | ||
Like, here, have one, you know? | ||
It's a puppy. | ||
Stop being so personal. | ||
That's what I would tell Kennedy. | ||
Just take the puppy, bro. | ||
Well, Kennedy took the puppy. | ||
Kennedy took the puppy. | ||
The puppy had a Cold War romance with one of Kennedy's dogs, and they had puppies. | ||
Oh, snap! | ||
That the Kennedys called the Pupnicks. | ||
And the Pupnicks captivated the imagination of children across America. | ||
Because Jackie Kennedy said something, she was like, I don't know what we're going to do with the dogs, you know? | ||
And that ignited a spontaneous letter-writing campaign from children across America who all requested one of the puppies. | ||
Jackie Kennedy selected two children in America whose names were Mark Bruce and Karen House. | ||
And she delivered one of the puppies to each of these children. | ||
One of them lived in Missouri, the other lived in Illinois. | ||
And I have sort of been obsessed with the idea that those puppies had puppies, and that those puppies had puppies, and that somewhere in the American Midwest today are the descendants of the original animals in space. | ||
The first animal to go to space and survive. | ||
They've probably been watered down so heavily. | ||
Maybe, but like... | ||
Chihuahuas and German Shepherds and shit. | ||
Well, they were all... | ||
There they are, right there. | ||
They were mutts. | ||
They were random dogs that they found from around the, like, spaceport. | ||
Because they thought that they would be, like, tougher. | ||
unidentified
|
Yeah. | |
Oh, wow. | ||
But they were small. | ||
And so, yeah, I've been obsessed with the idea that these dogs could still be out there, and I've been trying to find the dogs. | ||
So I've been trying to track down these two people, notably Karen House, because she got the female dog. | ||
And I think she's still alive, and I think she lives in the Chicago area, but I can't get in touch with her because I'm not, I don't know, I'm not an investigative journalist. | ||
I, like, don't know how to do this or whatever. | ||
So, if anybody knows anything about the whereabouts of Karen House or the descendants of the Soviet space dogs, I'm very interested. | ||
My goal is just to meet one, you know? | ||
How should someone get in touch with you? | ||
I'm on the internet. | ||
unidentified
|
Okay. | |
Moxie. | ||
Just like that. | ||
I'm on the internet. | ||
My name is Moxie. | ||
I love it. | ||
Thanks, man. | ||
I really appreciate it. | ||
I really enjoyed our conversation. | ||
unidentified
|
Thank you. |