Moxie Marlinspike, Signal’s creator, argues that end-to-end encryption is the only way to fight mass surveillance after decades of failed computer security, citing the WhatsApp hack of Jeff Bezos’ phone and Snowden’s revelations. He traces crypto-anarchy’s roots to 1980s-90s pioneers like David Chaum and Tim May, part of a community that circumvented ITAR restrictions by printing code in books or relocating to Anguilla. Marlinspike critiques Silicon Valley’s profit-driven tech, from Facebook’s global influence to China’s Great Firewall, warning that bad business models enable surveillance even as U.S. laws like CALEA expand state power. Signal remains a rare ethical alternative amid growing digital control and fragmentation. [Automatically generated summary]
Okay, well, you know, I think ultimately what we're trying to do with Signal is stop mass surveillance to bring some normality to the internet and to explore a different way of developing technology that might ultimately serve all of us better.
Yeah, it's a messaging app, but it's somewhat different from the way the rest of technology works because it is encrypted.
So...
Typically, if you want to send somebody a message, I think most people's expectation is that when they write a message and they press send, that the people who can see that message are the person who wrote the message and the intended recipient.
But that's not actually the case.
There's tons of people who are in between, who are monitoring these things, who are collecting data information.
And Signal's different because we've designed it so that we don't have access to that information.
So when you send an SMS, that is the least secure of all messages.
So if you have an Android phone and you use a standard messaging app and you send a message to one of your friends, that is the least secure of all, right?
So iPhones use iMessage, which is slightly more secure, but it gets uploaded to the cloud, and it's a part of their iCloud service, so it goes to some servers and then goes to the other person.
It's encrypted along the way, but it can still be intercepted.
Fundamentally, there's two ways to think about security.
One is computer security, this idea that we'll somehow make computers secure.
We'll put information on the computers, and then we'll prevent other people from accessing those computers.
And that is a losing strategy that people have been losing for 30 years.
Information ends up on a computer somewhere, and it ends up compromised in the end.
The other way to think about security is information security, where you secure the information itself, that you don't have to worry about the security of the computers.
You could have some computers in the cloud somewhere, information's flowing through them, and people can compromise those things and it doesn't really matter because the information itself is encrypted.
And so things like SMS, the iMessage cloud backups, most other messengers, Facebook Messenger, all that stuff: they're relying on this computer security model, and that ends up disappointing people in the end.
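The two models Marlinspike contrasts can be sketched with a toy one-time pad: the relay in the middle only ever handles ciphertext, so compromising it reveals nothing. This is an illustration of the principle only; Signal itself uses the far more elaborate Signal Protocol, not a raw one-time pad.

```python
import secrets

def encrypt(message: bytes, key: bytes) -> bytes:
    # One-time pad: XOR each message byte with a random key byte.
    return bytes(m ^ k for m, k in zip(message, key))

decrypt = encrypt  # XOR is its own inverse

# Sender and recipient share a random key; the relay server never sees it.
message = b"meet at noon"
key = secrets.token_bytes(len(message))

ciphertext = encrypt(message, key)          # this is all the server relays
assert ciphertext != message                # the server sees only noise
assert decrypt(ciphertext, key) == message  # the recipient recovers the text
```

The point is the middle assertion: even a fully compromised server holds nothing but bytes indistinguishable from random noise, which is the "information security" model in miniature.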
Well, because the way the internet works today is insane.
Fundamentally, I feel like private communication is important because I think that change happens in private.
Everything that is fundamentally decent today started out as something that was a socially unacceptable idea at the time.
You look at things like abolition of slavery, legalization of marijuana, legalization of same-sex marriage, even constructing the Declaration of Independence.
Those are all things that required a space for people to process ideas outside the context of everyday life.
Those spaces don't exist on the internet today.
I think it's kind of crazy the way the internet works today.
If you imagined that every moment you were talking to somebody in real life, there was somebody there with a clipboard, a stranger, taking notes about what you said.
That would change the character of your conversations.
And I think that in some ways, like, we're living through a shortage of brave or bold or courageous ideas, in part because people don't have the space to process what's happening in their lives outside of the context of everyday interactions, you know?
That's a really good way to put it, because you've got to give people a chance to think things through.
But if you do that publicly, they're not going to.
They're going to do basically what you see on Twitter.
If you stray from what is considered to be the acceptable norm, or the current ideology, or whatever opinions you're supposed to have on a certain subject, you get attacked, ruthlessly so.
So you see a lot of self-censorship, and you also see a lot of virtue signaling, where people sort of pretend that they espouse a certain series of ideas because that'll get them some social cred.
I think that communication in those environments is performative.
You're either performing for an angry mob, you're performing for advertisers, you're performing for the governments that are watching.
And I think also the ideas that make it through are kind of tainted as a result.
Did you watch any of the online hearing stuff that was happening over COVID? You know, where city councils and stuff were having their hearings online?
It was kind of interesting to me because it's like, you know, they can't meet in person, so they're doing it online.
And that means that the public comment period was also online, you know?
And so it used to be that, like, you know, if you go to a city council meeting, they have a period of public comment where, you know, people can just stand up and say what they think, you know?
And, like, ordinarily, it's like, oh, you got to go to city hall, you got to, like, wait in line, you got to sit there, you know?
But then when it's on Zoom, it's just sort of like anyone can just show up on the Zoom thing.
You know, they just dial in and they're just like, here's what I think, you know?
And...
You know, it was kind of interesting because particularly when a lot of the police brutality still was happening in Los Angeles, I was watching the city council hearings and people were just like, you know, they were just calling, you know, like, fuck you!
I yield the rest of my time, fuck you!
You know, it was just like really brutal and not undeservedly so.
You know, what was interesting to me was just watching the politicians, basically, you know, who just had to sit there, and just, they were just like...
And it was just like, you know, you get three minutes, and then there's someone else to get, you know, and they're just like, okay, and now we'll hear from, you know, like...
And, you know, watching that, you sort of realize that it's like, to be a politician, you have to just sort of fundamentally not really care what people think of you, you know?
You have to fundamentally just be comfortable sitting, you know, and having people yell at you, you know, in three minute increments for an hour or whatever, you know.
And so it seems like what we've sort of done is like bred these people who are willing to do that, you know.
And in some ways that's like a useful characteristic, but in other ways that's the characteristic of a psychopath, you know.
No, but I think Trump is perfectly capable of just not caring.
People like that are just like, yeah, whatever, I'm the best.
And that's politics.
But I think the danger is that doing anything ambitious outside of politics now also requires that you're capable of just not caring what people think, because everything is happening in public.
I think you made a really good point in that change comes from people discussing things privately because you have to be able to take a chance.
You have to be daring and you have to be able to confide in people and you have to be able to say, hey, this is not right and we're going to do something about it.
If you do that publicly, the powers that be that do not want change in any way, shape, or form, they'll come down on you.
This is essentially what Edward Snowden was warning everyone about when he decided to go public with all this NSA information.
We're saying, look, this is not what we signed up for: someone constantly monitoring your emails, constantly listening to phone calls.
This mass surveillance thing is very bad for the culture of free expression, for our ability to have ideas and to share them back and forth and vet them out.
I think when you look at the history of that kind of surveillance, there are a few interesting inflection points.
At the beginning of the internet as we know it, in the early 2000s, there were these DoD efforts to do mass surveillance.
They were sort of open about what they were doing.
One of them was this program called Total Information Awareness.
And they were trying to start this office within the DoD called the Information Awareness Office.
And the idea was they're just going to collect information on all Americans and everyone's communication and just stockpile it into these databases and then they would use that to mine those things for information.
It was sort of like their effort to get in on this at the beginning of the information age.
And, you know, it was ridiculous.
You know, it's like they called it Total Information Awareness.
They had a logo that was like, you know, the pyramid with the eye on top of it.
And then instead, what ended up happening was data naturally accumulated in different places.
Back then, what they were trying to do is be like, our proposal is that everyone carry a government-mandated tracking device at all times.
What do you guys think?
It'll make us safer.
And people were like, no, I don't think so.
But instead, everyone ended up just carrying cell phones at all times, which are tracking your location and reporting them into centralized repositories that government has access to.
And so, you know, this sort of like oblique surveillance infrastructure ended up emerging.
And that was what, you know, people sort of knew about, but, you know, didn't really know.
And that's what Snowden revealed.
It was like, we never got that official program.
Instead, all of those things are happening naturally.
Gait detection, fingerprinting, all this stuff is happening naturally.
It's ending up in these places.
And then...
You know, governments are just going to those places and getting the information.
And then I think, you know, the next inflection point was really Cambridge Analytica.
You know, that was a moment where I think people were like...
Cambridge Analytica was a firm that was using big data in order to forecast and manipulate people's opinions.
In particular, they were involved in the 2016 election.
So it's like, what Snowden revealed was PRISM, which was the cooperation between the government and these places where data was naturally accumulating: Facebook, Google, the phone companies, etc.
And Cambridge Analytica, I think, was the moment that people were like, oh, there's like also sort of like a private version of PRISM, you know, that's like not just governments, but like the data is out there.
And other people who are motivated are using that against us, you know?
And so I think, you know, in the beginning it was sort of like, oh, this could be scary.
And then it was like, oh, but, you know, we're just using these services.
And then people were like, oh, wait, the government is, you know, using the data that we're, you know, sending to these services.
And then people were like, oh, wait, like anybody can use the data against us.
I think things went from, I don't really have anything to hide, to, wait a second, these people can predict and influence how I'm going to vote based on what kind of jeans I buy?
And then sort of where we are today, where I think people are also beginning to realize that the companies themselves that are doing this kind of data collection are also not necessarily acting in our best interests.
Okay, I mean, I think you can say, like, no one anticipated that these things would be this significant.
But I also think that there's, you know, I think ultimately, like, what we end up seeing again and again is that, like, bad business models produce bad technology, you know?
That, like...
Mark Zuckerberg did not create Facebook because of his deep love of social interactions.
He did not have some deep sense of wanting to connect people and connect the world.
That's not his passion.
Jeff Bezos did not start Amazon because of his deep love of books.
These companies are oriented around profit.
They're trying to make money.
And they're subject to external demands as a result.
They have to grow infinitely, which is insane, but that's the expectation.
And so what we end up seeing is that the technology is not necessarily in our best interest because that's not what it was designed for to begin with.
Yeah, and that's why, I mean, I think the Silicon Valley obsession with China is a big part of that, where people, they're just like, wow, that's a lot of people there.
Someone had to dig it out of the ground. Like, literally, they're getting it out of the ground, digging into the dirt to get it out of the ground.
We were talking about it on the podcast.
They were like, is there a way that this could – is there a future that you could foresee where you could buy a phone that is guilt-free?
If I buy a pair of shoes, like I bought a pair of boots from my friend Jocko's company.
He's got a company called Origin.
They make handmade boots.
And it's made in a factory in Maine.
You can see a tour of the factory.
These guys are stitching these things together, and it's a real quality boot.
And I'm like, I like that I could buy this.
I know where it came from.
I could see a video of the guys making it.
This is a thing that I could feel like...
I am giving them money.
They're giving me a product.
There's a nice exchange.
It feels good.
I don't feel like that with a phone.
With a phone, I have this bizarre disconnect.
I try to pretend that I'm not buying something that's made in a factory with a fucking net around it, because so many people jumped to their deaths that, instead of trying to make things better, they said: we're going to put nets up, catch these fuckers, and put them back to work.
Is it possible...
That we would all get together and say, hey, enough of this shit.
Will you make us a goddamn phone that doesn't make me feel like I'm supporting slavery?
Yeah, I mean, but okay, so, you know, I feel like it's difficult to have this conversation without having a conversation about capitalism, right?
Because, like, ultimately, you know, what we're talking about is, like, externalities, that the prices of things don't incorporate their true cost, you know, that, like, you know, we're destroying the planet for plastic trinkets and reality television, you know, like...
From the origin of the materials, like how they're coming...
How they're getting out of the ground, how they're getting into your phone, how they're getting constructed, how they're getting manufactured and assembled by these poor people...
When most people hear about it, they don't like it.
It makes them very uncomfortable.
But they just sort of go, la la la.
They just plug their ears and keep going and buy the latest iPhone 12 because it's cool.
So, like, if you have a car that you know is being made by slaves, or a car that's being made in Detroit by union workers, wouldn't you choose the union-made one, as long as they're both of equal quality?
I think a lot of people would feel good about their choice.
If they could buy something that, well, no, these people are given a very good wage.
They have health insurance and they're taken care of.
They have a pension plan.
There's all these good things that we would like to have ourselves that these workers get.
So you should probably buy that car.
Why isn't there an option like that for a phone?
We looked at this thing called a Fairphone.
We're going over it.
Can't even fucking buy it in America.
Like, no, America has no option for the Fairphone.
They only have them in, like, Holland and a couple other European countries.
I mean, even agriculture, you know, it's like, the sugar you put in your coffee: I've been to the sugar beet harvest, and it's apocalyptic. So I think there's just an aspect of civilization that we don't usually see or think about.
Not non-conscious, but I mean conscious capitalism would be the idea that you want to make a profit, but you only want to make a profit if everything works.
Like the idea of me buying my shoes from origin.
Like knowing, okay, these are the guys that make it.
This is how they make it.
This makes me feel good.
I like this.
If there was that with everything...
If you buy a home from a guy who you know built the home, this is the man.
But I think if you put humans together and you give them this diffusion of responsibility that comes from a corporation and then you give them a mandate, you have to make as much money as possible every single year.
And then you have shareholders and you have all these different factors that will allow them to say, well, I just work for the company.
You know, it's not my call.
You know, I just... you got the guy carving up a steak saying, listen, I'm so sorry that we have to use slaves, but look, Apple's worth $2 trillion.
Yeah, I fundamentally agree, and I think that that's, you know, that's...
Anytime you end up in a situation where, like, most people do not have the agency that they would need in order to direct their life the way that they would want, you know, direct their life so that we're living in a sane and sustainable way, that, yeah, I think is a problem.
And I think that's the situation we're in now, you know.
And honestly, I feel like, you know, the stuff that we were talking about before of, you know, people...
You know, sort of being mean online is a reflection of that.
You know, that's the only power that people have.
If the only thing you can do is call someone a name, you're going to call them a name.
And I think that it's unfortunate, but I think it is also unfortunate that most people have so little agency and control over the way that the world works that that's all they have to do.
And I guess you would say also that the people that do have power, that are running these corporations, don't take into account what it would be like to be the person at the bottom of the line.
They've probably done what they think is something.
Even the CEO of a company is someone who's just doing their job at the end of the day.
They don't have ultimate control and agency over how it is that a company performs because they are accountable to their shareholders, they're accountable to the board.
I think there is a tendency for people to look at what's happening, particularly with technology today, And think that it's the fault of the people, the leaders of these companies.
I think it goes both ways.
Slavoj Žižek always talks about when you look at the old political speeches, if you look at the fascist leaders, they would give a speech and when there was a moment of applause, they would just sort of stand there and accept the applause because in their ideology, they were responsible for the thing that people were applauding.
And if you watch the old communist leaders, like when Stalin would give a speech and he would say something and there would be a moment of applause, he would also applaud.
Because in their ideology of historical materialism, they were just agents of history.
They were just the tools of the inevitable.
It wasn't them.
You know, they had just sort of been chosen as the agents of this thing that was an inevitable process.
And so they were applauding history, you know.
Sometimes when I see the CEOs of tech companies give speeches and people applaud, I feel like they should also be applauding.
That it's not them.
Technology has its own agency, its own force that they're the tools of, in a way.
And at this point, if we look at where we are in 2020, it seems inevitable.
It seems like there's just this unstoppable momentum behind innovation, behind the process of creating newer, better technology and constantly putting it out, then dealing with the demand for that newer, better technology, and competing with all the other people who are also putting out newer, better technology.
Look what we're doing.
We are helping the demise of human beings.
Because I feel, and I've said this multiple times and I'm going to say it again, I think that we are the electronic caterpillar that will give way to the butterfly.
We don't know what we're doing.
We are putting together something that's going to take over.
We're putting together some ultimate being, some symbiotic connection between humans and technology, or literally an artificial version of life, not even artificial, a version of life constructed with silicon and wires and things that we're making.
If we keep going the way we're going, we're going to come up with a technology that passes the Turing test, though I think we're a ways away.
The Turing test is, like in Ex Machina, remember, one of my all-time favorite movies, where the coder is brought in to talk to the woman, and he falls in love with the robot lady, and she passes the Turing test because he's in love with her.
I mean, he really can't differentiate; in his mind, that is a woman, that's not a robot.
I mean, just think that this man back then was thinking there's going to be a time where we will have some kind of a creation where we imitate life, the current life that we're aware of, where we're going to make a version of it that's going to be indistinguishable from the versions that are biological.
That very guy, because of whatever twisted ideas of what human beings should or shouldn't do, whatever the expectations of the culture at the time, was forced to be chemically castrated and wound up committing suicide.
I mean, you know, I don't think about this stuff that often, but it is, you know, it's an empirical test, right?
So it's like, it's a way to avoid having to define what consciousness is, right?
Which is kind of strange.
We're conscious beings and we don't actually really even know what that means.
Right.
And so instead we have this empirical test where it's just sort of like, well, if you can't tell the difference without being able to see it, then we'll just call it that.
Yeah, I mean, I think most of what I see in the artificial intelligence world right now is not really intelligence. It's just matching. You show a model 10 million images of cats, and then you can show it a new image, and it will be like, I predict that this is a cat.
And then you can show it an image of a truck, and it'll be like, I predict that this is not a cat.
I think there's one way of looking at it that's like, well, you just do that with enough things enough times, and that's what intelligence is.
But I kind of hope not.
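The "matching" Marlinspike describes can be made concrete with a toy nearest-neighbor classifier: it labels a new point purely by proximity to labeled examples, with no understanding involved. The features and data points here are invented for illustration.

```python
import math

# Invented 2D "features" (say, ear pointiness and whisker length) with labels.
examples = [((0.9, 0.8), "cat"), ((0.8, 0.9), "cat"),
            ((0.1, 0.2), "not cat"), ((0.2, 0.1), "not cat")]

def predict(point):
    # 1-nearest neighbor: copy the label of the closest known example.
    _, label = min(examples, key=lambda ex: math.dist(point, ex[0]))
    return label

print(predict((0.85, 0.9)))  # near the cat-like examples -> "cat"
print(predict((0.15, 0.1)))  # near the others -> "not cat"
```

Nothing here "knows" what a cat is; the prediction is just distance to memorized examples, which is the worry about encoding existing data (and its biases) straight into the model.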
The way that it's being approached right now, I think, is also dangerous in a lot of ways, because what we're doing is just feeding information about the world into these models, and that just encodes the existing biases and problems with the world into the things that we're creating.
That, I think, has negative results.
But it's true.
This ecosystem is moving and it's advancing.
The thing that I think is unfortunate is that right now, that ecosystem, this really capital-driven investment startup ecosystem, has a monopoly on groups of young people trying to do something ambitious together in the world.
In the same way that I think it's unfortunate that grad school has a monopoly on groups of people learning things together.
Part of what we're trying to do different with Signal is it's a non-profit because we want to be for something other than profit.
We're trying to explore a different way of groups of people doing something mildly ambitious.
It's kind of amazing, though, that you guys have figured out a way to create, like, basically a better version of iMessage that you could use on Android.
Because one of the big complaints about Android is the lack of any encrypted messaging services.
So Google, for Android, makes an app called Messages, which is just the standard SMS texting app.
And they put that on the phones that they make, like the Pixel and stuff like that, you know.
And then there's the rest of the ecosystem.
You know, there's like, you know, Samsung devices, Huawei devices, you know, all this stuff.
And it's sort of...
It depends, you know, what's on those things.
And...
So, they've been trying to move from this very old standard called SMS that you mentioned before to this newer thing called RCS, which stands for Rich Communication Services.
I think in my mind I always think of it as standing for too little too late.
But they're trying to move to that.
So they're doing that on the part of the ecosystem that they control, which is the devices that they make and sell.
And they're trying to get other people on board as well.
Originally, RCS didn't have any facility for end-to-end encryption.
And they're actually using our stuff, the Signal Protocol, in the new version of RCS that they're shipping.
So I think they've announced that, but I don't know if it's on or not.
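The basic trick that makes end-to-end encryption possible over an untrusted carrier can be sketched with textbook Diffie-Hellman key agreement. The real Signal Protocol layers much more on top of this (identity keys, ratcheting), and the small parameters below are purely illustrative; production systems use elliptic curves or far larger primes.

```python
import secrets

# Textbook Diffie-Hellman with a small demo prime (2**64 - 59).
p = 0xFFFFFFFFFFFFFFC5
g = 5

alice_secret = secrets.randbelow(p - 2) + 1
bob_secret = secrets.randbelow(p - 2) + 1

# Only these public values cross the network; the carrier can see them.
alice_public = pow(g, alice_secret, p)
bob_public = pow(g, bob_secret, p)

# Each side combines its own secret with the other's public value,
# arriving at g**(a*b) mod p without ever transmitting it.
alice_key = pow(bob_public, alice_secret, p)
bob_key = pow(alice_public, bob_secret, p)

assert alice_key == bob_key  # a shared key the carrier never saw
```

This is why a carrier can adopt the Signal Protocol for RCS and still be unable to read the messages it delivers: the encryption key never exists on its servers.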
With the discovery question, where you don't want people to know that you're on Signal: we're working on it, but it's a more difficult problem than you might imagine, because you do want some people to know that you're on Signal.
So a lot of people like knowing who they can communicate with.
And the other thing is we try to square the actual technology with the way that it appears to work to people.
So right now, with most technology, it seems like you send a message and the person who can see it is the person who received the message.
You sent the message to the intended recipient, you know?
And that's not how it actually works.
And so, like, a lot of what we're trying to do is actually just square the way the technology actually works with what it is that people perceive.
And so, like, fundamentally, right now, you know, Signal is based on phone numbers.
If you register with your phone number, like, people are going to know that they can contact you on Signal.
It's very difficult to make it so that they can't. If we didn't do that, they could just hit the compose button and see that they could send you a message.
They would just see you in the list of contacts that they can send messages to.
And then if we didn't display that, they could just try and send you a message and see whether a message goes through.
It's always possible to detect whether it is that you're on Signal the way that things are currently designed.
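Why registration is hard to hide can be sketched concretely: even a service that stores only hashed phone numbers can be probed, because the phone-number space is small enough to enumerate. The registry and numbers below are invented, and Signal's actual private contact discovery is considerably more involved than this.

```python
import hashlib

def h(number: str) -> str:
    # Hash a phone number, as a naive service might before storing it.
    return hashlib.sha256(number.encode()).hexdigest()

# Hypothetical server-side registry of hashed registered numbers.
registered = {h("+15551234567"), h("+15559876543")}

def is_on_service(number: str) -> bool:
    # Anyone who can query the registry can probe a specific number...
    return h(number) in registered

assert is_on_service("+15551234567")
assert not is_on_service("+15550000000")

# ...or sweep the number space to enumerate users (shown on a tiny demo
# range; the full 10-digit space is also feasible to sweep).
found = [f"+1555{n:07d}" for n in range(1234560, 1234570)
         if is_on_service(f"+1555{n:07d}")]
print(found)  # the registered number turns up: ['+15551234567']
```

Hashing doesn't help because phone numbers are low-entropy: there are only about ten billion of them, so an attacker can hash them all and compare, which is exactly the "try and see whether a message goes through" problem in another form.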
But I mean, a lot of that, the reason why it is that way, is kind of interesting to me, which is that these are protocols. When you're just using a normal SMS message on Android, that was this agreement that phone carriers made with each other in, like, you know...
And I think the thing that everyone's worried about right now with Apple is, you know what I said before, that bad business models produce bad technology.
Thus far, Apple's business model has been much better than Google's or Facebook's or Amazon's. Their business is predicated on selling phones, selling hardware.
And that means that they can think a little bit more thoughtfully about the way that their software works than other people.
And I think what people are concerned about is that that business model is going to change.
They're approaching an asymptote of how many phones they can sell.
And so now they're looking at software.
They're like, what if we had our own search engine?
What if we had our own thing?
And the moment that that starts to happen, then they're sort of moving in the direction of the rest of big tech.
Which, you know, who knows how they do it, but that's what I think people are concerned about.
They've done a better job at protecting your privacy, though, in terms of, like, particularly Apple Maps.
Like, their Map app is far superior in terms of sharing your information than, say, like, the Google Maps.
But the argument you could make is that Google Maps is a superior product because they share that information.
Google Maps is also Waze now, right?
They bought Waze, which is fantastic.
It lets you know where the cops are, there's an accident up ahead, all kinds of shit, right?
But Apple Maps is not that good.
I use it because I like the ethic behind it.
I like their idea behind it.
They delete all the information after you make...
If you go to a destination, it's not saving it, sending it to a server, and making sure it knows what was there and what wasn't there and how well you traveled and sharing information.
But I think if you were the CEO of Apple and you were like, this is a priority, we're going to spend, you know, however many trillions of dollars it takes to do this...
2030 means transitioning hundreds of our manufacturing suppliers to 100% renewable sources of electricity.
Well, that's interesting.
If they can actually do that, 100% renewable, if they can figure out a way to do that, and to have recyclable materials and have all renewable electricity, whether it's wind or solar, if they could really figure out how to do that, I think that would be pretty amazing.
But who's going to put it together?
Are they going to still use slaves to put it together?
I mean, I guess the people that are working at Foxconn aren't technically slaves, but would you want your child to work there?
And that's the, when you get into these insidious arguments about, or conversations about conspiracies, like conspiracies to keep people impoverished, they're like, well, why would you want to keep people impoverished?
Well, who's going to work in the coal mines?
You're not going to get wealthy, highly educated people to work in the coal mines.
You need someone to work in the coal mines.
So what do you do?
What you do is you don't help anybody get out of these situations.
So you'll always have the ability to draw from these impoverished communities, these poor people that live in Appalachia or wherever their coal miners are coming from.
There's not a whole lot of ways out.
Like, I have a friend who came from Kentucky, and he's like, the way he described it to me, he goes, man, you've never seen poverty like that.
Like, people don't want to concentrate on those people because it's not as glamorous as some other forms of poverty.
That's why it's rare that a company comes along and has a business plan like Signal where they're like, we're going to be non-profit.
We're going to create something that we think is of extreme value to human beings, just to civilization in general, the ability to communicate anonymously or at least privately.
It's a very rare thing that you guys have done, that we decided to do this and to do it in a non-profit way.
And that ecosystem is moving, and it's moving really fast.
There's a lot of money behind it, a lot of energy in it.
And if you aren't moving with it, it will just... stop working.
And also, it's like, you know, a project like this is not just the software that runs on your phone, but the service of, like, you know, moving the messages around on the internet, and that requires a little bit of care and attention, and if you're not doing that, then it will dissipate.
Yeah, well, okay, so, you know, the history of this was, um, I think before the internet really took over our lives in the way that it has, there were the kind of social spaces for people to experiment with different ideas outside of the context of their everyday lives, you know, like art projects, punk rendezvous, experimental gatherings.
The embers of art movements.
These spaces existed and were things that I found myself in and a part of.
Well, it's funny because at the time that I went, people were like, oh man, it's not like it used to be.
And now people are like, have you been?
I was like, I went once, in 2000. And they're like, wow, wow, that's when it was like the real deal.
I'm like, I don't think so.
It's one of those things where it's like, you know, there's like day one and then on day two, they're like, ah, it's not like day one.
Right, of course, of course.
But yeah, I don't know.
Those things, those spaces were important to me and like an important part of my life.
And as more of our life started to be taken over by technology, me and my friends felt like those spaces were missing online.
We wanted to demonstrate that it was possible to create spaces like that.
There had been a history of people thinking about cryptography in particular.
Which is kind of funny in hindsight.
So, in like the eighties... the history of cryptography is actually not long, at least outside of the military, you know?
It really starts in the 70s.
There were some really important things that happened then.
In the 80s, there was this person who was this lone maniac who was writing a bunch of papers about cryptography during a time when it wasn't actually that relevant because there was no internet.
The applications for these things were harder to imagine.
And then in the late 80s, there was this guy, a retired engineer, who discovered the papers that this maniac, David Chaum, had been writing and was really...
But he did a lot of the notable work on using the primitives that had already been developed.
And he had a lot of interesting ideas and...
There's this guy who was a retired engineer, his name was Tim May, who was kind of a weird character.
And he found these papers by David Chaum, and was really enchanted by what they could represent for a future.
And he wanted to write like a sci-fi novel that was sort of predicated on a world where cryptography existed and there was a future where the internet was developed.
And so he wrote some notes about this novel, and he titled the notes The Crypto Anarchy Manifesto.
And he published the notes online, and people got really into the notes.
And then he started a mailing list in the early 90s called the Cypherpunks mailing list.
And all these people started, you know, joined the mailing list and they started communicating about, you know, what the future was going to be like and how, you know, they needed to develop cryptography to live their, you know, crypto-anarchy future.
And at the time, it's strange to think about now, but cryptography was somewhat illegal.
So if you wrote a little bit of crypto code and you sent it to your friend in Canada, that was the same as, like, shipping Stinger missiles across the border to Canada.
I don't know of any situations where people were tracked down as munitions dealers or whatever, but it really hampered what people were capable of doing.
So people got really creative.
There were some people who wrote some crypto software called Pretty Good Privacy, PGP. And they printed it in a book, like an MIT Press book, in a machine-readable font.
And then they're like, this is speech.
This is a book.
I have my First Amendment right to print this book and to distribute it.
And then they shipped the books to Canada and other countries and stuff, and then people in those places scanned it back in.
To computers.
And they were able to make the case that they were legally allowed to do this because of their First Amendment rights.
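For a sense of how little code was actually at stake in those export fights: famously, a working RSA implementation was short enough to print on a t-shirt, which under ITAR arguably made the shirt a munition. Here is a toy, deliberately insecure sketch of textbook RSA, purely illustrative (the tiny primes and variable names are my own, not the PGP source):

```python
# Textbook RSA with toy parameters -- illustrative only, never use for real security.
def egcd(a, b):
    """Extended Euclid: returns (g, x, y) with a*x + b*y == g == gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    """Modular inverse of a mod m (assumes gcd(a, m) == 1)."""
    g, x, _ = egcd(a, m)
    return x % m

p, q = 61, 53             # two small primes (real keys use primes hundreds of digits long)
n = p * q                 # public modulus
phi = (p - 1) * (q - 1)   # Euler's totient of n
e = 17                    # public exponent, coprime to phi
d = modinv(e, phi)        # private exponent

m = 42                    # a "message" encoded as a number < n
c = pow(m, e, n)          # encrypt: c = m^e mod n
assert pow(c, d, n) == m  # decrypt: m = c^d mod n recovers the message
```

A screenful of arithmetic like this was, legally, in the same category as a Stinger missile, which is what made the printed-book workaround so pointed.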
And other people moved to Anguilla and started writing code in Anguilla and shipping it around the world.
There were a lot of people who were fervently interested.
It's a United States regulatory regime to restrict and control the export of defense and military-related technologies to safeguard U.S. national security and further U.S. foreign policy objectives.
ITAR. Yeah, they were closed. Anguilla was closed until like November.
They wouldn't let anybody in.
And yeah, if you want to go there, they have like, I was reading all these crazy restrictions.
You have to get COVID tested and you have to apply.
And then when you get there, they test you when you get there.
I discovered sailing by accident where I was like...
Working on a project with a friend in the early 2000s, and we were looking on Craigslist for something unrelated, and we saw a boat that was for sale for $4,000.
And I thought a boat was like a million dollars or something.
I was just like, what?
The sailboats are $4,000?
And this is just some listing.
There's probably even cheaper boats, you know?
And so we got really into it, and we discovered that you can go to any marina in North America and get a boat for free.
You know, like every marina has a lien sale dock on it where people have stopped paying their slip fees, and the boats are just derelict and abandoned, and they've, you know, put them on these stocks.
Because it's pretty amazing, you know, me and some friends used to sail around the Caribbean and...
You know, the feeling of, like, you know, you pull up an anchor, and then you sail, like, you know, 500 miles to some other country or whatever, and you get there, and you drop the anchor, and you're just like, we...
It was just the wind.
The wind that took, you know, like, there was no engine, there was no fuel.
It was just the wind, you know, and you catch fish, and, you know, it's just like...
So then you're just like, okay, well, we started here, and then we headed on this heading, and we did that, and we traveled 10 miles, so we must be here.
And then once a day, you can take a sight with your sextant, and then you can do some dead reckoning with a compass.
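The dead-reckoning arithmetic he's describing (start point, heading, distance traveled) can be sketched like this; the function name and the flat-earth approximation are my own simplifications, fine for short legs but not how you'd navigate an ocean passage:

```python
import math

def dead_reckon(lat, lon, heading_deg, distance_nm):
    """Advance a lat/lon position by a compass heading and a distance
    in nautical miles. Flat-earth approximation: one nautical mile is
    one minute of latitude; longitude minutes shrink with cos(latitude)."""
    dlat = distance_nm * math.cos(math.radians(heading_deg)) / 60.0
    dlon = (distance_nm * math.sin(math.radians(heading_deg))
            / (60.0 * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon

# "We started here, headed on this heading, traveled 10 miles, so we must be here":
# 10 nm due east from a point near St. Martin (coordinates are illustrative).
lat, lon = dead_reckon(18.0, -63.0, 90.0, 10.0)
```

The once-a-day sextant sight then corrects the accumulated error in this running estimate.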
But, like, yeah, I was also, like, just weirdly ideological about it, where, like, I had a job once in the Caribbean that was, like, I was almost like a camp counselor, basically, where there was this camp that was like a sailing camp, but it was, like, 13 teenagers, mostly from North America.
Showed up in St. Martin and then got on a boat with me and another woman my age.
And we were like the adults.
And it was just like we sailed from St. Martin to Trinidad over the course of six weeks with these like 13 kids on a 50-foot sailboat.
So whenever they wanted anything, I would be like, all right, rock, paper, scissors.
You know, they were like, can we like do this thing?
I'd be like, all right, we'll do rock, paper, scissors.
If you win, you can do this thing.
If I win, and then I would like pick the thing that was like their sort of deepest fear, you know, it's like the really shy person had to like write a haiku about every day and then read it aloud at dinner.
You know, like the, you know, the person who was like really into like having like a manicure, like wasn't allowed to shave her legs for the rest of the, you know, like that kind of thing.
And so then by the end of it, it was just like, you know, everyone had lost, you know, so everyone was like reading the haiku at dinner and doing, you know.
The only way, if you ever put a monetary equivalent to that, it would have to be a spectacular amount of money for me to let someone else program the show.
Was it trial by fire when you were learning how to use all this, I mean, I don't want to call it ancient equipment, but mechanical equipment to figure out how to...
Uh, yeah, just, I would, you know, I started, uh, you know, me and some friends got a boat and, um, we started fixing it up and making a lot of mistakes and then, you know, started taking some trips and then...
A friend of mine was living in San Francisco and he wanted to learn how to sail.
And I was like, you know, what you should do is you should get like a little boat, like a little sailing thingy, you know, and then you can just anchor it like off the shore in this area that no one cares about.
And, you know, you could just sort of experiment with this little boat.
And so he started looking on Craigslist and he found this boat that was for sale for 500 bucks up in the North Bay.
And every time we called the phone number, we got an answering machine that was like, hello, you've reached Dr. Ken Thompson, honorary.
We go up there, and it's the kind of situation where, like, we pull up, and there's, like, the trailer that the boat's supposed to go on, and it's just full of scrap metal.
Oh, boy.
And, you know, this guy comes out.
He's like, oh, yeah, this is the trailer.
We were going to do a metal run, but if you want the boat, you know, we'll take the metal off, you know?
Yeah, I was wearing a PFD, a Type 2 PFD, and we took it to this boat ramp, and it was the end of the day, and the wind was blowing kind of hard, and the conditions weren't that good, but I was like, oh, we're just doing this little thing, this little maneuver, and we were in two boats.
I built this little wooden rowing boat, and my friend was going to go out in that with one anchor, and I was going to sail out this boat.
So he's going to go out in this little rowboat, and I was going to sail out this little catamaran.
And we had two anchors, and we're going to anchor it, and then we're going to get in the rowboat and row back.
And it seemed a little windy, and I got in the boat first, and I got out around this pier and was hit by the full force of the wind and realized that it was blowing like 20 knots.
It was way, way too much for what we were trying to do.
But I had misrigged part of the boat, so it took me a while to get it turned around.
And by the time I got it turned around, my friend had rowed out around the pier, and he got hit by the force of the wind and just got blown out into the bay.
So he's rowing directly into the wind and moving backwards.
And I'm on this little Hobie Cat, and it was moving so fast.
It was way too windy to be sailing this thing.
I've got just my clothes on.
I don't have a wetsuit on or anything like that.
I have a life jacket and just my clothes.
And we don't have a radio.
We're unprepared.
It's starting to get dark.
We don't have a light.
And I'm sailing back and forth trying to help my friend.
And it got to the point where I was like, all right, I'm just going to tack over.
I'm going to sail up to this boat that was called the Sea Louse.
Sail up to the Sea Louse.
I'm going to get my friend off of it.
We're just going to abandon it.
And then we're going to sail this Hobie Cat back.
If we can.
And so I go to turn around, and right as I'm turning around, a gust of wind hit the boat and capsized it before I could even know that it was happening.
It's one moment, you're on the boat, and the next moment you're in the water.
And the water is like 50 degrees.
It's a shock when it hits you.
And the boat was a little messed up in a way where I couldn't right it.
It had capsized, and then it capsized all the way and then sank.
So it was floating like three feet underwater, basically.
And so I'm in the water, but I'm still a little bit out of the water, but in the water.
And I had a cell phone that just immediately was busted.
And I look at my friend, and he's a ways away now.
He didn't see me, and I was yelling as loud as I could, but the wind is blowing 20 knots, and you can't hear each other.
It just takes your voice away.
I was screaming, I was waving, he wasn't wearing his glasses, and he just very slowly rowed away.
So you have to swim, you know, swim into the wind and into the wind wave and all that stuff.
And eventually I tried swimming and I swam, you know, directly upwind.
And I was because I was I was like, OK, like if I get separated from this boat and I don't make it to shore, then I'm definitely dead.
You know, like there's just no saving me.
So I was trying to go directly upwind so that if I felt like I couldn't make it, I would float back down when it hit the boat again.
And so I tried, you know, I swam for probably like 20 minutes upwind and made no progress.
It didn't feel like any progress.
You know, in 50 degrees, you have 30 to 60 minutes before you black out.
My arms were just, you know, it's like I consider myself a strong swimmer.
Like I free dive, you know, all this stuff.
And I just, you know, it's like you read these stories about...
How people die.
They succumb to hypothermia on a local hike or they drown in the bay.
And the story's always like, well, Timmy was a strong swimmer.
And you're like, really?
Was Timmy really a strong swimmer?
Because he drowned in the bay.
And floating there, it just all came to me.
I'm like, wow, this is how this happens.
You just make a series of pretty dumb, small decisions until you find yourself floating in the dark in the bay.
There's no one around.
And it's a really slow process, too.
You just come to terms with the idea that you're not going to make it.
And it's not sudden.
It's not like someone shot you or you got hit by a bus or something like that.
It's like this hour-long thing that you're getting dragged through all alone.
And you realize that no one will ever even know what this was.
You know, how this happened.
And you think about all the people like Joshua Slocum, Jim Gray, people who were lost at sea, and you realize they all had this thing that they went through, you know, this hour-long ordeal of just floating alone, and no one will even ever know what that was or what that was like, you know?
And eventually, I realized I wasn't going to make it ashore.
I looked back.
The boat was, like, way far away from me.
I started, you know, drifting back towards it.
I was still trying to swim.
I realized at some point that I wasn't going to hit it.
I wasn't going to hit the boat on the way back downwind.
And I had to just give it all that I had to try to connect with the boat, you know, to stop myself from getting blown past it.
And in that moment, too, you realize that, like...
Uncertainty is the most unendurable condition.
You imagine yourself making it to shore and relaxing, just knowing that it's resolved.
And in that moment of like, I might not make it back to this boat, you're tempted to give up because it's the same resolution.
It's the feeling of just knowing that the uncertainties have been resolved.
And you have to really remind yourself that it's not the same.
You have to give it everything you have in order to survive.
That feeling that you're longing for is not actually the feeling that you want.
And I just barely got the end of a rope that was trailing off the back of the hull.
Pulled myself back on it.
Almost threw up.
Then I had to...
Then I was just floating there with the hull three feet underwater.
I tied myself to it.
I started to get tunnel vision.
And really, at the last minute, a tugboat started coming through the area.
And it was coming straight at me, actually.
And I realized that it probably just wouldn't even see me.
It would just run me over and not even know that...
I had been there.
It's totally possible.
I was trying to wave.
I could barely lift my arm.
I was trying to scream.
I could barely make any noise.
Somehow they saw me.
It took them 15 minutes to get a rope around me.
They started pulling me up the side of the boat.
Lining every tugboat is tires.
Tires, usually.
It has a fender.
I got wedged in the tires as they were pulling me up.
And I knew what was happening.
And I was like, all I have to do is stick my leg out and push against the hull of the boat to go around the tires.
And I couldn't do it.
And I could barely see.
And they swung me around and eventually pulled me up.
They put me in next to the engines in the engine room.
I couldn't even feel the heat.
And they called the Coast Guard.
And the Coast Guard came and got me.
It was really embarrassing.
And the Coast Guard guy's like, he's got all these blankets over me, and he's trying to talk to me to keep me alert.
And he's like, so, is this your first time sailing?
And I have a commercial, like a 250-ton master's license.
How much did that shift your direction in your life though?
Did it change, like, the way... It seems almost, I mean, I haven't had a near-death experience, but I've had a lot of psychedelic experiences, and in some ways I think they're kind of similar, in that life shifts to the point where whatever you thought of life before that experience is almost like, oh, come on, that's nonsense. Yeah, I mean, it changes your perspective, or it did for me.
And, you know, because also in that moment, you know, it's like, you know, I think you go through this sort of embarrassing set of things where you're like, oh, I had these things I was going to do tomorrow.
Like, I'm not going to be able to do them.
And then you're like, wait, why is that the thing that I'm concerned about?
Because Jamie and I were talking about that one day.
Because they had to do something because if they didn't do something, Justin Bieber would be the number one topic every day, no matter what was happening in the world.
Okay, so there's, you know, people talk about, like, invisible labor.
Like, the invisible labor behind that tweet is just kind of comical, because it's like, when he did that, you know, people, like, you know, it's like my first day there, you know, it's like he tweeted something, and, you know, the building's, like, kind of shaking, and, like, alarms are going off.
People are, like, scrambling around, you know?
And it was just this...
You know, it's like this realization where you're just like, never in my life did I think that anything Justin Bieber did would like really affect me in any like deep way, you know?
And then here I am just like scrambling around to like facilitate.
What are your thoughts on curating what trends and what doesn't trend and whether or not social media should have any sort of obligation in terms of...
How things, whether or not people see things, like shadow banning and things along those lines.
I'm very torn on this stuff because I think that things should just be.
And if you have a situation where Justin Bieber is the most popular thing on the internet, that's just what it is.
It is what it is.
But I also get it.
I get how you would say, well, this is going to fuck up our whole program, like what we're trying to do with this thing.
Well, what you're trying to do with Twitter, I mean, I would assume what you're trying to do is give people a place where they could share important information and, you know, have people, you know...
I mean, Twitter has been used successfully to overturn governments.
I mean, Twitter has been used to...
Break news on very important events and alert people to danger.
There's so many positive things about Twitter.
If it's overwhelmed by Justin Bieber and Justin Bieber fan accounts, if it's overwhelmed, then the top ten things that are trending are all nonsense.
I could see how someone would think we're going to do a good thing by suppressing that.
Well, I mean, I don't know about that specific situation.
I mean, I think, you know, looking at the larger picture, right, like...
In a way, you know, it's like, if you think about, like, 20 years ago, whenever anybody talked about, like, society, you know, everyone would always say, like, the problem is the media.
It's like the media, man.
You know, if only we could change the media.
And a lot of people in who were interested in, like, a better and brighter future were really focused on self-publishing.
There was a whole conference about it, an underground publishing conference, now the Allied Media Conference.
People were writing zines.
People were, you know, getting their own printing presses.
We were convinced that if we made publishing more equitable, if everybody had the equal ability to produce and consume content, that the world would change.
In some ways, what we have today is the fantasy of those dreams from 20 years ago.
In a couple ways.
One, it was the dream that if a cop killed some random person in the suburbs of St. Louis, that everyone would know about it.
Everyone knows.
And also, that anybody could share their weird ideas about the world.
And I think, in some ways, we were wrong.
You know, that we thought, like, you know, the world we got today is like, yeah, like, if a cop kills somebody in the suburbs of St. Louis, like, everybody knows about it.
I think we overestimated how much that would matter.
And I think we also believed that the things that everyone would be sharing were, like, our weird ideas about the world.
And instead, we got, like, you know, Flat Earth and, like, you know, anti-vax and, like, you know, all this stuff, right?
And so it's, like, in a sense, like, I'm glad that those things exist because they're, like, they're sort of what we wanted, you know?
But I think what we did, what we underestimated is, like, how important the medium is.
Like, the medium is the message kind of thing.
And that, like, What we were doing at the time of writing zines and sharing information, I don't think we understood how much that was predicated on actually building community and relationships with each other.
Like, what we didn't want was just, like, more channels on the television.
And that's sort of what we got, you know?
And so I think, you know, it's like everyone is, like, on YouTube trying to monetize their content, whatever, you know?
And that, it's the same thing.
Like, bad business models produce, like, bad technology and bad outcomes.
And so I think there's concern about that.
But I think...
I think, like, you know, now that there's, like, you know, these two simultaneous truths that everyone seems to believe that are in contradiction with each other.
You know, like, one is that, like, everything is relative.
Everyone is entitled to their own opinion.
All opinions are equally valid.
And two, like...
Our democracy is impossible without a shared understanding of what is true and what is false.
The information that we share needs to be verified by our most trusted institutions.
People seem to simultaneously believe both of these things, and I think they're in direct contradiction with each other.
So in some ways, I think most of the questions about social media in our time are about trying to resolve those contradictions.
But I think it's way more complicated than the way that the social media companies are trying to portray it.
But I think that there's a subtle thing there because I don't know how those things work.
But I think part of what...
If you set aside all of the takedown stuff, all the deplatforming stuff, if you say, okay, Facebook, Twitter, these companies, they don't do that anymore.
They've never done that.
They're still moderating content.
They have an algorithm that decides what is seen and what isn't.
And if you were running for president and you were outside the...
Like, for instance, Twitter banned Brett Weinstein's...
Brett Weinstein had a...
He had a Twitter account that was set up for...
It was Unity 2020. And the idea was, like, instead of looking at this in terms of left versus right, Republican versus Democrat, let's get reasonable people from both sides, like a Tulsi Gabbard and a Dan Crenshaw.
Bring them together and perhaps maybe put into people's minds the idea that, like, this idea, this concept of it has to be a Republican vice president or a Republican president.
Maybe that's nonsense.
And maybe it would be better if we had reasonable, intelligent people together.
Whether it's people or whether it's algorithms, I think this is the point: there are forces that are making decisions about what people see and what people don't see, and they're based on certain objectives that I think are most often business objectives.
The other thing that these platforms want is for the content to be ad safe.
It's like maybe advertisers don't...
I don't know.
But I think actually focusing on the outlying cases of this person was deplatformed, this person was intentionally, ideologically not promoted or de-emphasized or whatever.
I think that that, like, obfuscates or, you know, draws attention away from the larger thing that's happening, which is that, like, those things are happening just implicitly all the time.
And that, like, it almost, like, serves to the advantage of these platforms to...
Highlight the times when they remove somebody because what they're trying to do is reframe this is like, okay, well, yeah, we've got these algorithms or whatever.
Don't talk about that.
The problem is there's just these bad people, you know, and we have to decide there's a bad content from bad people and we have to decide, you know, what to do about this bad content and these bad people.
And I think that distracts people from the fact that like the platforms are at every moment making a decision about what you see and what you don't see.
There's a problem of deplatforming, because in many ways, deplatforming decisions are being made based on ideology.
It's a certain specific ideology that the people that are deplatforming the other folks have that doesn't align with the people that are being de-platformed.
These people that are being de-platformed, they have ideas that these people find offensive or they don't agree with.
Or sometimes they just find themselves in a trap, you know?
A trap.
Well, I think that there's a tendency for a lot of these platforms to try to define some policy about what it is that they want and they don't want.
I feel like that's sort of a throwback to this modernist view of science and how science works and we can objectively and rigorously define these things.
I just don't think that's actually how the world works.
Yeah, it's like relativity at the large scale, quantum physics at the small scale.
And even those things are most likely not true in the sense that they aren't consistent with each other and people are trying to unify them and find something that does make sense at both of those scales.
The history of science is a history of things that weren't actually true.
You know, Bohr's model of the atom, Newtonian physics.
People have these, you know, Copernicus's model of the solar system.
People have these ideas of how things work.
And the reason that people are drawn to them is because they actually have utility.
That it's like, oh, we can use this to predict the motion of the planets.
Oh, we can use this to send a rocket into space.
Oh, we can use this to, you know, have better outcomes, you know, for some medical procedure or whatever.
But it's not actually...
I don't...
I think it's not actually truth.
Like, the point of it isn't truth.
The point of it is that, like, we have some utility that we find in these things.
And I think that...
When you look at the emergence of science and people conceiving of it as a truth, it became this new authority that everyone was trying to appeal to.
If you look at all of the 19th century political philosophy, I mean, okay, I think the question of truth is, like, you know, it's even a little squishy with the hard sciences, right?
But once you get into, like, soft sciences, like social science, psychology, like, then it's even squishier, you know, that, like, these things are really not about truth.
They're about, like, some kind of utility.
And when you're talking about utility, the important question is, like, useful for what and to whom?
You know?
And I think that's just always the important question to be asking, right?
Because, you know, when you look at, like, all the 19th century political writing, it's all trying to frame things in terms of science in this way that it just seems laughable now.
But, you know, like, at the time, they were just like, we're going to prove that communism is, like, the most true, like, social economic system in the world, you know?
Like, there are whole disciplines of that.
People in...
You know, people had like PhDs in that, you know, their whole research departments in the Soviet Union, people doing that.
And we laugh about that now, but I don't think it's that different than like social science in the West, you know?
And so I think, you know, it's like if you lose sight of that, then you can try, then you try to like frame Social questions in terms of truths.
It's like, this is the kind of content that we want, and we can rigorously define that, and we can define why that's going to have the outcomes that we want it to.
But once you get on that road, you're like, okay, well, terrorist stuff.
We don't like terrorist stuff, so we're going to rigorously define that, and then we have a policy, no terrorist stuff.
And then China shows up, and they're like, we've got this problem with terrorists, the Uyghurs.
We see you have a policy.
I think if people from the beginning acknowledged that all of objectivity is just a particular worldview, and that we're not going to rigorously define these things in a way of what is true and what isn't, then I think we would have better outcomes.
That's my weird take.
I mean, I think, you know, from the perspective of Signal, you know, it's like, do you know what's trending on Signal right now?
But isn't it, there's a weird thing when you decide that you have one particular ideology that's being supported and another particular ideology that is being suppressed.
And this is what conservative people feel when they're on social media platforms.
Almost all of them, other than the ones we talked about before, Parler and Gab and the alternative ones, they're all very left-wing in terms of the ideology that they support.
The things that can get you in trouble on Twitter.
And kind of amazing that he didn't do anything along the way while he was witnessing people get deplatformed, and particularly this sort of bias towards people on the left and this discrimination against people on the right.
There's people on the right that have been banned and shadow banned and blocked from posting things.
You run into this situation where you wonder what exactly is a social media platform.
It's just a small private company and maybe you have some sort of a video platform and there's only a few thousand people on it and you only want videos that align with your perspective.
Okay, you're a private company.
You can do whatever you want.
But when you're the biggest video platform on earth like YouTube and you decide that you are going to take down anything that disagrees with your perspective on how COVID should be handled...
Including doctors.
This is one of the things that happened.
Doctors that were stating, look, there's more danger in lockdowns.
There's more danger in this than there is in the way we're handling it.
There's more danger in the negative aspects of the decisions that are being made than it would be to let people go to work with masks on.
Those videos just get deleted.
Those videos get blocked.
There's people that are opposed to current strategies with all sorts of different things, and those videos get blocked.
So there's an ideological basis in censorship.
And so you have to make a decision like, what are these platforms?
Are these platforms simply just a private company, or is it a town hall?
Is it the way that people get to express ideas?
And isn't the best way to express ideas to allow people to decide, based on the better argument, what is correct and what's incorrect?
Like, this is what freedom of speech is supposed to be about.
It's supposed to be about, you have an idea, I have an idea, these two ideas come together, and then the observers get to go, hmm, okay, well, this guy's got a lot of facts behind him.
This is objective reality.
This is provable.
And this other guy is just a crazy person who thinks the world's hollow.
Okay?
This is the correct one.
There's going to be some people that go, no, there's a suppression of hollow earth and hollow earth is the truth and hollow earth facts and hollow earth theory.
But you've got to kind of let that happen.
You gotta kind of have people that are crazy.
Remember the old dude that used to stand on the corners with the placards on, the world is ending tomorrow?
Well, okay, but I think, in my mind, what's going on is, like, the problem is that it used to be that some person with very strange ideas about the world wearing a sign on the street corner shouting was just a person with very strange ideas about the world wearing a sign on the street corner shouting.
Now, there's somebody, you know, with very strange ideas about the world, and those ideas are being amplified by a billion-dollar company, because there are algorithms that amplify that.
And what I'm saying is that instead of actually talking about that, instead of addressing that problem, those companies are trying to distract us from that discussion by saying...
Would the correct way to handle it be to make algorithms illegal in that respect?
Like, to not be able to amplify or detract?
To not be able to ban or shadow ban, or just to have whatever trends trend.
Whatever is popular, let it be popular.
Whatever people like, let them like it.
And say, listen, this thing that you've done by creating an algorithm that encourages people to interact, encourages people to interact on Facebook, encourages people to spend more time on the computer, what you've done is you've kind of distorted what is valuable to people.
You've changed it and guided it in a way that is ultimately, perhaps arguably, detrimental to society.
So we are going to ban algorithms.
You cannot use algorithms to dictate what people see or not see.
You give them a fucking search bar, and if they want to look up UFOs, let them look up UFOs.
But don't shove it down their throat because you know they're a UFO nut.
It's complicated because, one, I have no faith in, like when you say ban or make it illegal or whatever, I have zero faith in the government being able to handle this.
Yeah, nor do I. Every time I see a cookie warning on a website, I'm like, okay, these people are not the people that are good.
This is what they've given us after all this time.
These people are not going to solve this for us.
And also, I think a lot of the dissatisfaction that people feel, the discomfort that people feel, and the concern that people have is a concern about power.
That right now, these tech companies have a lot of power.
And I think that the concern that is coming from government is the concern for their power.
The right has made such a big deal about deplatforming.
And I think it's because they're trying to put these companies on notice.
That it's just like, you know, after 2016, it was just like, big tech has zero allies anymore.
You know, on the left, everyone's just like, you just gave the election to Trump, you know?
And on the right, they're just like, you just removed somebody from YouTube for calling gay people an abomination.
Fuck you.
You know, like, they have no allies.
No one believes in the better and brighter.
No one believes that Google is organizing the world's information.
No one believes that Facebook is connecting the world.
And I think that there's an opportunity there.
That we're in a better situation than we were before.
All the cards are on the table.
People more and more understanding how it is that these systems function.
I think we increasingly see that people understand that this is really about power, it's about authority, and that we should be trying to build things that limit the power that people have.
If you had your wish, if you could let these social media platforms, whether it's video platforms like YouTube or Facebook or Twitter... if you had the call, if they called you up and said, Moxie, we're going to let you make the call.
But I think the way that messaging apps are going, there's a trajectory where a project like Signal becomes more of a social experience.
And that, like, the things that we're building extend beyond just, like, you know, sending messages.
Particularly, I think, as more and more communication moves into group chats and things like that.
And, you know, the foundation that we're building it on is a foundation where we know nothing.
You know, it's like, if I looked up your Signal account record right now of, like, all the information that we had about you on Signal, there's only two pieces of information.
The date that you created the account and the date that you last used Signal.
That's it.
That's all we know.
If you looked on any other platform, your mind would be blown.
Well, I think, you know, some of the stuff that we're working on now of just like moving away from phone numbers, you can have like, you know, a username so that you can like post that more publicly.
And then, you know, we have groups, now you have group links.
And then, you know, maybe we can do something with events.
And we can, you know, that's like, we're sort of moving in the direction of like, an app that's good for communicating with connections you already have to an app that's also good for creating new connections.
Would you think that social media would be better served with the algorithms that are in place and with the mechanisms for determining what's trending in place and for their trust and safety or whatever their content monitoring policy they have now or have it wide open?
That if we, you know, if you look at, like, the metrics, you know, that we talked about, like, you know, what Facebook cares about is just, like, time that you spent looking at the screen on Facebook, you know?
Like, if we were to have metrics, if Signal were to have metrics, you know, our metrics would be, like, what we want is for you to use the app as little as possible, for you to actually have the app open as little as possible, but for the velocity of information to be as high as possible.
So it's like you're getting maximum utility.
You're spending as little time possible looking at this thing while getting as much out of it as you can.
Well, I mean, you know, we're sort of moving in that direction, right?
And it's like, and I think once you start from the principle of like, well, we don't have to have infinite growth.
We don't actually have to have profit.
We don't have to return.
We're not accountable to investors.
We don't have to, you know, satisfy public markets.
We also don't have to build a pyramid scheme where we have like, you know, 2 billion users so that we can monetize them to like, you know, a few hundred thousand advertisers so that we can, you know, like we don't have to do any of that.
And so We have the freedom to pick the metrics that we think are the ways that we think technology should work, that we think will better serve all of us.
Well, that would be great if they could figure out a way to develop some sort of social media platform that just operated on donations and could rival the ones that are operating on advertising revenue.
Because I agree with you that that creates a giant problem.
Do you think that it's a valid argument that conservatives have though?
That they're being censored and that their voice is not being heard?
I know what you said in terms of, you know, that if someone had something on YouTube that said that gay people are unhuman and they should be abolished and banned and delete that video.
I get that perspective.
But I think there's other perspectives, like the Unity 2020 perspective, which is not in any way negative.
Yeah, I mean, I don't know what happened with that, but I feel like what I... I think it could be a part of this thing of just like, well, we create this policy and we have these...
You know, we define things this way, and then a lot of stuff just gets caught up in it.
You know, where it's just like, now you're like taking down content about the Uyghurs because you wanted to do something else.
You know, that if people would just be more honest about, like, there is not really an objectivity...
And, you know, we're looking for these specific outcomes and this is why that I think, you know, maybe we would have better results.
They were organizing for, one, trying to apply the protections and benefits that full-time workers there had to a lot of the temporary workers, like the people who work in security, the people who are working in the cafeteria, the people who are driving buses and stuff like that, who are living a lot more precariously.
But also for creative control over how the technology that they're producing is used.
So Google was involved in some military contracts that were pretty sketch.
Like applying machine learning AI stuff to military technology.
And then finally, there had been a lot of high-profile sexual harassment incidents at Google where the perpetrators of sexual harassment were usually paid large severances in order to leave.
And so they had a list of demands.
And they, like a lot of people walked out.
I don't know what the numbers were, but a lot of people, they managed to organize internally and walked out.
And I think stuff like that is encouraging because, you know, it's like we look at the hearings and it's like the people in Congress don't even know who's the right person to talk to.
You know, it's like, you know, old people talking about...
But isn't that another issue where you're going to have people who have an ideological perspective?
And that may be opposed to people that have a different ideological perspective, but they're sort of disproportionately represented on the left in these social media corporations.
When you get kids that come out of school, they have degrees in tech, or they're interested in tech, they tend to almost universally lean left.
Like, when it comes to the technology, I don't think people are...
I think what almost everyone can agree on is that the amount of money and resources that we're putting into surveillance, into ad tech, into these algorithms that are just about increasing engagement, is just not good for the world.
And if you put a different CEO in charge...
That person's just going to get fired.
But if the entire company organizes together and says, no, this is what we want.
This is how we want to allocate resources.
This is how we want to create the world, then you can't fire all those people.
So they'd have to get together and unionize and have a very distinct mandate, very clear that we want to go back to do no evil or whatever the fuck it used to be.
Yeah, where they don't really have that as a big sign anymore.
Do you think that would really have an impact, though?
I mean, it seems like the amount of money, when you find out the amount of money that's being generated by Google and Facebook and YouTube, the numbers are so staggering that to shut that valve off, to like...
To shut that spout, good luck.
It's almost like it had to have been engineered from the beginning, like what you're doing at Signal.
Like someone had to look at it from the beginning and go, you know what, if we rely on advertiser revenue, we're going to have a real problem.
Like, there's this, like, in the history of people who are, like, doing...
Like, building cryptography, stuff like that, there was this period of time where the thesis was basically, like, all right, what we're going to do is develop really powerful tools for ourselves, and then we're going to teach everyone to be like us, you know?
And that didn't work because, you know, we didn't really anticipate the way that computers were going.
So I try to be, like, as normal as possible.
You know, I just, like, have, like, a normal setup.
I'm not, like, you know, I haven't...
I used to have a cell phone where I soldered the microphone differently so there was a hard switch that you could turn it off.
Do you feel like you have extra scrutiny on you because of the fact that you're involved in this messaging application that Glenn Greenwald and Edward Snowden and a bunch of other people that are seriously concerned with...
Security and privacy that maybe people are upset at you?
That you've created something that allows people to share encrypted messages?
But in some ways, that means that there's less pressure on me because, you know, it's like if you're the creator of Facebook Messenger and your computer gets hacked, like, that's everyone's Facebook messages are, you know, gone.
Yeah.
And, you know, for me, if, like, my computer gets hacked, I can't access anyone's Signal messages whether I get hacked or not, you know?
They would stop you, and they would be like, hey, we just need you to type in your password here so that we can get through the full disk encryption.
And I would be like, no.
And they would be like, well, if you don't do that, we're going to take this, and we're going to send it to our lab, and they're going to get it anyway.
And I would be like, no, they're not.
And they would be like, all right, we're going to take it.
You're not going to have your stuff for a while.
You sure you don't want to type in your password?
I would be like, nope.
And then it would disappear, and it would come back weeks later, and then it's like, How bizarre.
Yeah, they would eventually give you a ticket and then you'd get the selective screening where they would take all the stuff out of your bag and like, you know, filter out your car.
And then at every connection, the TSA would come to the gate of the connecting thing, even though you're already behind security, and do it again at the connection.
Yeah, I was thinking, actually, I was thinking on the way here, it's funny how, like, I remember after the last election, everyone was talking about, like, California leaving the United States.
Yeah, he, like, he lived in California and had been, for years, like, trying to foment this CalExit thing.
And he has all the stats on, you know, why it would be better for California and all this stuff, you know.
And then he sort of thought, well, this isn't working.
And he really liked Russia for some reason.
So he moved to Russia just before the election, not knowing what was going to happen.
And then when Trump won, people were like, wait a second, fuck this.
Maybe California should get out of here.
And they just found this campaign that already existed, and everyone sort of got behind it, and he was just like, oh shit, and he lives in Russia now, you know?
But he, like, didn't really understand optics, I think. The way everyone found out that he lived in Russia was that he opened a California embassy in Moscow. So they, like, announced, you know, CalExit has opened the first California embassy in a foreign country, but it was in Moscow, and this was right as all the, like, Russian stuff was happening, you know?
I mean, I was just fascinated, you know, because here's this guy who's, like, doing this kind of ambitious thing, and it just, the optics seem so bad, you know?
I think he reminded me of, like, the Hannah Arendt quote that's like, you know, if the essence of power is deceit, does that mean that the essence of impotence is truth?
You know, that, like...
He sort of believed that just, like, the facts were enough.
You know, it's just, like, the stats of just, like, yeah, we spend this much money on, like, defense spending.
If we, you know, if we stopped, you know, it's like we would have, like...
It's an autonomous region of the country of Georgia.
And it's kind of interesting.
There's all these autonomous regions in the world that are essentially their own countries, you know, but they're not recognized by the UN or other countries, you know.
And it's like, if you want to be a country, it's kind of interesting.
You need a lot of stuff.
You need a flag.
You need a national bird.
You need an anthem or whatever.
And you need a soccer team.
You definitely have to have a soccer team.
Interesting.
So these countries all have their own soccer teams, but they can't play in FIFA because they're not recognized by the UN. So FIFA can't recognize them.
So they have their own league.
It's like the league of unrecognized states and stateless peoples.
I think it's, like, an interesting, you know, it's, like, in a way that I feel like, you know, society moves by, like, pushing at the edges, you know, that, like, it's the fringes that end up moving the center.
I feel like, you know, looking at the margins of the way politics works is an interesting view of, like, how everything else works, you know, that, like, going to Abkhazia, it was so crazy getting there, you know, it's like, You know, we travel all through Russia.
We get to this, like, militarized border.
You go through these three checkpoints that aren't supposed to exist, but obviously exist.
You know, you get to the other side, and it's just the same as where you just were.
You know, you guys fought a brutal civil war, you know, with, like, genocide, like, full-on, you know, like, crazy shit.
I feel like it's this thing you see again and again of the institutions that we're familiar with in the world that exists are the institutions of kings.
It's like police, military, illegal apparatus, tax collectors.
Every moment in history since then has been about trying to change ownership of those institutions.
And, like, you know, just seeing that happen again and again.
And just, like, you know, realizing that it's like maybe what we should be doing is actually trying to get rid of these institutions or change these institutions in some way, you know?
We were talking about religion, and it was discussing the Bible, and they were talking about all the different stories that are in the Bible, many of them that are hundreds of years apart, that were collected and put into that.
Just stop and think about a book that was written literally before the Constitution was drafted, and that book is being introduced today as gospel.
And that there's a new book that's going to be written 200 years from now, and that will be attached to the new version of the Bible as well.
And then one day someone will come across this, and it will all be interpreted as the will and the words of God that all came about.
Yeah.
Yeah, yeah, yeah.
But today, the spans of time are far shorter, like going from Alan Turing in 1950, being chemically castrated for being gay, to, in my lifetime, seeing gay marriage go from something that was very fringe when I was a boy living in San Francisco, to universal across the United States today.
at least mostly accepted by the populace, right?
That this is a very short amount of time where a big change has happened.
And that these changes are coming quicker and quicker and quicker.
I would hope that this is a trend that is moving in the correct direction.
Yeah, certainly there are some things that are getting better, yeah.
And I feel like, to me, it's important to, you know, for a lot of those things, like the things you mentioned, like gay marriage, I think it's important to realize that, like, a lot of those, a lot of that progress would not have happened without the ability to break the law, honestly.
Right, right.
How would anyone have known that we wanted to allow same-sex marriage if no one had been able to have a same-sex relationship because sodomy laws had been perfectly enforced?
How would we know that we want to legalize marijuana if no one had ever been able to consume marijuana?
So I think a lot of the fear around increased surveillance is that that space dissipates.
But, you know, on the other hand, you know, it's like we're living in the apocalypse, you know, that it's like if you took someone from 200 years ago who used to be able to just walk up to the Klamath River and dump a bucket in the water and pull out, you know, 12 salmon and that was, you know, their food.
And you were like, oh, yeah, the way it works today is you go to Whole Foods and it's $20 a pound and it's, you know, pretty good.
You know, they'd be like, what have you done?
Oh, my God.
You used to be able to walk across the backs of the salmon, you know, across the whole river.
An average of 40 to 50 million wild salmon make the epic migration from the ocean to the headwaters of the Bristol Bay every year.
Like, no place on Earth.
The Bristol Bay watershed.
They've been working to try to make this mine a reality for, I think, a couple of decades now.
And people have been fighting tirelessly to educate people on what a devastating impact this is going to have on the ecology of that area and the fact that the environment will be permanently devastated.
There's no way of bringing this back and there's no way of doing this without destroying the environment.
Because the specific style of mining that they have to employ in order to pull that copper and gold out of the ground involves going deep, deep into the earth to find these reservoirs of gold and copper, and there's sulfur they have to go through, and then they have to remove the waste.
And mining companies have invested hundreds of millions of dollars in this and then abandoned it.
So they were like, we can't.
We can't fucking do this.
And then people are like, we can do it.
And then they've got...
And it's other companies that are...
I don't believe the company that's currently involved in this is even an American company.
I think it's a...
It's a foreign company that's trying to...
I think they're from Canada that are trying to do this spectacular...
I don't know which company it is, but it's...
My friend Steve Rinella from the Meat Eater podcast.
I want to recommend this podcast because he's got a particular episode on that where he talks about it.
Let me find it real quick.
Because it's...
It's pretty epic where he talks to this one guy who's dedicated the last 20 years of his life trying to fight this.
Let me just find it real quick because it's pretty intense.
And it's terrifying when you see how close it's come to actually being implemented and how if it happens, there's no way you pull that back.
You can get banned off of YouTube for saying something like that.
I'm joking.
What should we do?
We should make people aware of it, and make people aware that there are real consequences to allowing politicians to make decisions that will literally affect human beings for the rest of eternity. Because you will never have that population of salmon coming to that particular location, salmon that have been going there for millions and millions of years. And the reason why you won't have them there is because someone is greedy.
It's really that simple. And I mean, we are getting along fine without that copper and without that gold, and we are using the resource of the salmon, and people are employed that are enjoying that resource, and they're also able to go there and see the bears eating the salmon and seeing this incredible wild place.
Alaska is one of the few really, truly wild spots in this country.
And if you get enough greedy assholes together, and they can figure out a way to make this a reality, and with the wrong people in positions of power, that's 100% possible.
Well, I was joking, obviously, about killing that person, but there was a recent one of the Iranian scientists was assassinated, and this brought up this gigantic ethical debate.
And we don't know who did it, whether it was the Israeli army... Mossad held a press conference to say, we didn't do it, while wearing t-shirts that said, we definitely did it.
Like, if someone is actively trying to acquire nuclear weapons, and we think that those people are going to use those nuclear weapons, is it ethical to kill that person?
And if that person's a scientist, they're not a...
Yeah, I mean, I think the causality stuff is really hard to figure out, you know.
But I think most of the time it's not about the one person, you know, that it's not, you know, maybe sometimes it is, but I think most, it's just like, I feel like assassination politics in the tech arena does not work, you know, that it's like you can get rid of all the people at the top of these companies and that's not what's going to do it, you know, that there are like these structural reasons why these things keep happening over and over again.
You know, you go down that road and, you know, where things can happen too.
A great example is, so one of the things that came out in a lot of the documents that Snowden released was that the NSA had worked with a standards body called NIST in order to produce a random number generator that was backdoored.
So random numbers are very important in cryptography, and if you can predict what the random numbers are going to be, then you win.
And so the NSA had produced this random number generator that allowed them to predict what the random numbers would be because they knew of this one constant that was in there.
They knew a reciprocal value that you can't derive just by looking at it, but they know because they created it.
And they had what they called a nobody-but-us backdoor.
NOBUS.
Nobody but us.
And they got NIST to standardize this thing, and then they got a company called Jupyter, who makes routers and VPNs and stuff like that...
Juniper, sorry.
...to include this in their products.
And so the idea was that, like, the NSA would have these capabilities, they had developed, you know, these vulnerabilities that they could exploit in situations like this, you know, that they could, like, take advantage of foreign powers and stuff like that in ways that wouldn't boomerang back at them.
But what happened was, in, I think, the early 2010s, Juniper got hacked, and somebody secretly changed that one parameter, the one that was basically the backdoor, to a different one that they knew the reciprocal value to.
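The mechanism he's describing can be sketched in a few lines. This is a toy analogue, not Dual_EC_DRBG itself (the real backdoor used elliptic-curve math, with a public point secretly related to a private value), and every name here is hypothetical. The point is that whoever knows the one secret behind the shipped constant can recover the generator's state from a single output and predict everything after it:

```python
import hashlib

class BackdooredRNG:
    # Toy analogue of a NOBUS backdoor (illustrative only; the real
    # case, Dual_EC_DRBG, used elliptic-curve points, not hashes).
    # The designer ships this generator while privately keeping SECRET.
    SECRET = b"known-only-to-the-designer"

    def __init__(self, seed: bytes):
        self.state = hashlib.sha256(seed).digest()

    def next(self) -> bytes:
        # Each output is the internal state, masked only by a value
        # derived from the designer's secret.
        mask = hashlib.sha256(self.SECRET).digest()
        out = bytes(a ^ b for a, b in zip(self.state, mask))
        self.state = hashlib.sha256(self.state).digest()
        return out

def predict_future(one_output: bytes, n: int) -> list:
    # Anyone who knows SECRET unmasks the state from a single output
    # and runs the generator forward, predicting all "random" values.
    mask = hashlib.sha256(BackdooredRNG.SECRET).digest()
    state = bytes(a ^ b for a, b in zip(one_output, mask))
    future = []
    for _ in range(n):
        state = hashlib.sha256(state).digest()
        future.append(bytes(a ^ b for a, b in zip(state, mask)))
    return future
```

And swapping `SECRET` for a different value, which is effectively what the Juniper intruders did with the standardized parameter, transfers that predict-everything capability to whoever made the swap.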
And it's most likely China or Russia that did this.
And then what's kind of interesting is there was a big incident where the OPM, the Office of Personnel Management, I think, was compromised.
And they have records on, you know, foreign intelligence assets and stuff like that.
Their systems were compromised, it seems like, maybe by China.
And what's sort of interesting is that they were running the Juniper networking gear that had been, you know, hacked in this one specific way.
And so it's kind of possible that, like, you know, the NSA developed this backdoor that they were going to use for situations like this, you know, against foreign adversaries or whatever, and that the whole thing just boomeranged back at them, and the OPM was compromised as a result.
But this is like, I don't know, I think it's, You know, it's easy to look at things like Stuxnet and stuff like that and just be like, yeah, this is harm reduction or whatever, you know, but like in the end, it can have real-world consequences.
And this is also why people are so hesitant about, you know, like, the government is always like, well, why don't you develop a form of cryptography where it, like, works except for us, you know, weaken the crypto, you know.
And it's like, well, this is why.
Because if you can access it, if anybody can access it, somehow that's going to boomerang back at you.
Is that a valid comparison to what they're doing in Silicon Valley?
Like, Huawei did have routers that had third-party access, apparently, and they were shown that information was going to a third party that was not supposed to be, right?
There have been incidents where it's like, yeah, there's data collection that's happening.
Well, there's data collection happening in all Western products, too.
And actually, the way the Western products are designed are really scary.
In the telecommunications space, there's a legal requirement called CALEA, the Communications Assistance for Law Enforcement Act, that requires telecommunications equipment to have eavesdropping, like surveillance stuff, built into it. When you produce the hardware, in order to sell it in the United States, you have to have that.
So Signal calls work not using the traditional telecommunications infrastructure.
It is routing data over the internet.
And that data is end-to-end encrypted, so nobody can eavesdrop on those calls, including us.
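That relay-only design can be sketched with a toy example. Signal's real protocol stack (X3DH key agreement plus the Double Ratchet) is far more elaborate, and the cipher below is a made-up, unauthenticated stand-in with hypothetical names, not anything to actually use; the point is only that the server handles opaque bytes it has no key for:

```python
import hashlib
import os

def _stream(key: bytes, nonce: bytes, n: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. Illustrative only,
    # not a vetted cipher, and with no message authentication.
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    ks = _stream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, _stream(key, nonce, len(ct))))

class RelayServer:
    # The relay only ever sees ciphertext. Holding no key material,
    # it has nothing meaningful to log, sell, or hand over.
    def __init__(self):
        self.mailbox = []

    def forward(self, blob: bytes) -> None:
        self.mailbox.append(blob)
```

Only the two endpoints, which share the key, can turn the relayed blob back into the message.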
But so communication equipment that is produced in the United States has to have this so-called lawful intercept capability.
But what's crazy about that is that's the same, you know, it's like these are U.S. companies and they're selling that all around the world.
So that's the shit that gets shipped to UAE. Yeah.
You know, so it's like it's the secondary effect thing of like the United States government was like, well, we're going to be responsible with this or whatever.
We're going to have warrants or whatever.
And even that's not true.
And then that same equipment gets shipped to tyrants and repressive regimes all over the place.
And they just got a ready-made thing to just surveil everyone's phone calls.
So it's like, I don't know, it's hard to indict Huawei for acting substantially different than the way, than, you know, whatever, the US industry acts.
It's just, certainly they have a different political environment and, you know, they are much more willing to use that information to do really brutal stuff.
That's a business thing, you know, where it's, like, Google's control over...
Google is producing this software, Android, and it's just free.
They're releasing it.
But they want to maintain some control over the ecosystem because it's their thing that they're producing.
And so they have a lot of requirements.
It's like, okay, you can run Android.
Oh, you want all this other stuff that we make that's not part of just the stock-free thing, like Play Services and all the Google stuff.
Increasingly more and more of Android is just getting shoved into this proprietary bit.
And they're like, okay, you want access to this?
Then it's going to cost you in these ways.
And I think it probably got to the point where Huawei was just like...
We're not willing to pay, you know, even either monetarily or through whatever compromise they would have to make, and they were just like, we're gonna do our own thing.
I think I might be right, but I'm not sure though.
But it just made me think, like, I understand that there's a sort of connection that can't be broken between business and government in China, and that business and government are united.
It's not like, you know, like Apple and the FBI, right?
What we're terrified of is that these relationships that business and government have in this country, they're getting tighter and tighter intertwined.
And we look at a country like China that does have this sort of inexorable connection between business and government, and we're terrified that we're going to be like that someday.
It was like, you know, that there are already these relationships, you know.
You know, the NSA called it PRISM. And, you know, tech companies just called it, like, the consoles or whatever they had built for these, you know, for these requests.
But it's...
That's...
Yeah, it's happening.
And I don't...
Also, you know, it's sort of like...
I think a lot of people, a lot of nations look at China and are envious, right?
Where it's like, they've done this thing where they just, you know, they built like the Great Firewall of China, and that has served them in a lot of ways.
You know, one, surveillance, obviously, like they have total control of everything that appears on the internet.
So not just surveillance, but also content moderation, propaganda, but then also, it allows them to have their own internet economy.
China is large enough that they can have their own ecosystem.
People don't use Google there.
They have their own chat apps.
They have their own social networks.
They have their own everything.
And I think a lot of nations look at China and they're just like, huh, that was kind of smart.
It's like you have your own ecosystem, your own infrastructure that you control, and you have the ability to do content moderation, and you have the ability to do surveillance.
And so I think the fear is that there's going to be a balkanization of the internet where Russia will be next and then every country that has an economy large enough will go down the same road.
House and Senate Democrats on Tuesday rolled out legislation to halt federal use of facial recognition software and require state and local authorities to pause any use of the technology to receive federal funding.
The Facial Recognition and Biometric Technology Moratorium Act, introduced Thursday, marks one of the most ambitious crackdowns on facial recognition technology.
I mean, I think this is connected to what you're saying, just in the sense that, like...
You know, the people who are producing that facial recognition technology, it's not the government.
It's, you know, Valenti or whoever sells services to the government.
And then, you know, the government is then deploying this technology that they're getting from industry and in kind of crazy ways.
Like, there's the story of the Black Lives Matter protester who the police, the NYPD, you know, not the FBI, tracked to his house using facial recognition technology.
New York City Police Department uses facial recognition software to track down a Black Lives Matter activist accused of assault after allegedly shouting into a police officer's ear with a bullhorn.
That's it?
What about that guy who punched Rick Moranis, you fucks?
And the Pupnicks captivated the imagination of children across America.
Because Jackie Kennedy said something, she was like, I don't know what we're going to do with the dogs, you know?
And that ignited a spontaneous letter-writing campaign from children across America who all requested one of the puppies.
Jackie Kennedy selected two children in America whose names were Mark Bruce and Karen House.
And she delivered a puppy to each of them.
One of them lived in Missouri, the other lived in Illinois.
And I have sort of been obsessed with the idea that those puppies had puppies, and that those puppies had puppies, and that somewhere in the American Midwest today are the descendants of the original animals in space.
And so, yeah, I've been obsessed with the idea that these dogs could still be out there, and I've been trying to find the dogs.
So I've been trying to track down these two people, notably Karen House, because she got the female dog.
And I think she's still alive, and I think she lives in the Chicago area, but I can't get in touch with her because, I don't know, I'm not an investigative journalist.
I, like, don't know how to do this or whatever.
So, if anybody knows anything about the whereabouts of Karen House or the descendants of the Soviet space dogs, I'm very interested.