All Episodes Plain Text
Dec. 8, 2022 - True Anon Truth Feed
01:16:01
Episode 256: Programmed to Kill

Brace Belden and Liz Franczak dissect the rise of armed police robots, from San Francisco's 2022 reversed lethal-force policy to Oakland's deceptive "PAN disruptor" gun-robot, tracing their lineage to WWI's Kettering Bug and the Nazi Goliath. They expose how militarized tech, like Turkey's 2020 autonomous Kargu-2 drone in Libya, blurs ethical lines, with Silicon Valley complicit in weaponizing AI despite its lack of moral agency. Legal vacuums and civilian casualties (from Vietnam's "smart bombs" to U.S. drone strikes) underscore a slippery slope where machines replace human judgment, leaving accountability, and humanity, in the dust. [Automatically generated summary]


Armed Robots Used By Police 00:04:50
Hey everyone, welcome to another episode of True Anon.
I'm Brace Belden and I'm 5'4".
And I'm Liz Franczak.
And let's not forget our producer, Young Chomsky.
That's right.
We've got Young Chomsky in the control room, making sure we sound our best.
Ready to roll.
That's right.
And today we're talking about a really timely and controversial topic.
Armed robots used by the police.
Yeah, this is definitely one of the more pressing issues that our society is facing today.
We've seen reports of robots being used in different contexts, but this is the first time they've been used in this way.
Definitely.
And it's a really interesting topic because it raises a lot of questions about the implications of technology on our society.
Right.
And it's not just about the technology itself, but about the people in the systems that are using it.
What does it mean for citizens when police are using robots?
Yeah.
And it also raises questions about the ethics of using robots in this way.
We're seeing a lot of mixed reactions, from people who think it's a great idea to those who are freaking flipping out about it.
Absolutely.
So let's jump in and take a closer look at the issue.
We'll be talking to experts, activists, and more to get different perspectives on the issue.
And we'll also be talking to people who are directly affected by the use of armed robots.
So stay tuned and get ready for an interesting and thought-provoking episode.
Okay, so I want to actually take a little bit of issue with there, Liz.
Yes.
With you, excuse me.
I specifically, when I was working with my partner in the AI program to write this intro for the episode, I specified that I wanted you to use the phrase freaking flipping.
Did you really?
Yeah.
Unfortunately, you decided to sort of editorialize and kind of add your own twist on that by using freaking flipping out.
And I put a little spin on it, a little funk.
I'm just like, I want to be clear with you.
Like, I get what you're doing.
Like, I admire your craft.
And I think that you know what you're doing podcasting, but like, you know, I wrote this script for like a pretty specific reason.
Actually, you didn't write the script the robot did.
But this is, to our audience, a perfect example of the difference between robot and human.
Hello, everyone.
I'm Liz.
And to illustrate the difference even more, in contrast to that, I'm Brace.
And of course, we are joined by producer Young Chomsky.
And the podcast is called True Anon.
There we go.
We're back.
We're back.
Hello.
Hello.
Yeah.
I got to tell you.
So, you know, I fed the, we've been using this ChatGPT.
Well, to be clear, I have not been using it because I think it's godless and I shall not be interacting with any kind of robot.
No, Liz's funky ass got locked out of the MacBook.
However, the boys have been going pretty freaky deeky on it in the old GC in the group chat, TA group chat, spending some quality time with the old chat bot.
Let me tell you, this thing can write a hell of a rap song.
We should save that to the end.
It's too good.
You want me to, because one idea that I did have on our tour was that I want to start.
I can't really do this now because I kind of like outed myself, but like I wanted to come to you guys with an idea that I wanted to do a rap song, like a really seriously, like, hey, like, guys, I've really been thinking about this.
Like, I think this would be good for the show.
I think, you know, I could kind of get into some music stuff.
Sure.
And then put out like one of the, you know, under the, of course, the trade name Arudite, put out probably one of the worst eight-minute-long songs, probably the funkiest a white boy has ever gotten.
Here's the thing.
You're going to, you're going to bring us that and be like, I'm going to play little tricks on them.
They're going to be like, this is a terrible idea.
Me?
I'm flipping the script.
I'm saying, let's do it.
In a heartbeat, I would.
In a heartbeat.
You know what?
It's going to be Arudite by Brace Belden.
And I'm going to have it.
That record will be next to your name for the rest of your life.
Okay, so it's bad to have something that's been cold and circular.
Like this.
Oh, that's fine.
What's the first sentence?
He was a really nice person.
Brace Belden was killed today while being bored to death by Liz Franczak, who is also mean.
That's me.
We are talking about something that I think longtime listeners of the show will know that I hate.
Robots and Military-Style Gear 00:10:19
Robots.
The vile robot.
I feel like I used to talk about this a lot more.
Robots?
Robots.
I hate robots.
What's your favorite robot?
That's the crazy.
That's like being like, hey, so like, you know, I know you just got out of Dachau, but like, who's your favorite?
Like, of the officers there, like, who do you think was?
I know, but like, we're talking pop culture robots, the pop culture robot pantheon.
If you had to pick one, what do you got?
You know what?
Hard to get away from the Noble Terminator.
Terminator.
Yeah.
You go Terminator over RoboCop?
RoboCop's not a robot.
He's an android.
That's, "RoboCop" is sort of a colloquial thing.
I know.
He's more, he's cop, too.
He's not just Robocop.
Well, we'll get into this more when we discuss what actually is a robot, which is, of course, the backbone of almost all sci-fi.
Who's your favorite robot?
I gotta go.
What's her name from the Jetsons?
Rosie.
Rosie.
She's classico.
I like her.
So, you know, nice and thick with it.
Sometimes I feel like for a guy who grew up like I did, robots were my role models in media.
Not a lot of robots in movies these days.
I don't know.
I mean, what?
Yeah, they got to make more robot movies.
It has robots in it?
Like, it's like, what's going to have robots?
Avatar?
Too?
Westworld robots?
That was all about robots.
You know what's going on?
Takesies backsies, mea culpa.
Yeah, I don't know.
I don't really know what's going on with movies these days, but definitely I assume there's probably, since T2 Rise of the Machines, I don't think there's been a lot of like that.
They don't make robot movies anymore because it's too close to reality.
That's true.
I know it wasn't.
And look at that.
That was actually, that was a classic, that was a classic Blade Runner Voight-Kampff-style test to see if you are a bit of a robot yourself.
Wow.
Certainly exhibiting some robot-like behavior here.
Anyways, they just tried to make a suicide bomb robot in San Francisco.
Yeah, so that's why we're talking about robots, not because we're going to talk about robot movies.
All right, so let me see if I can do this.
I did not.
You know what?
A little behind the scenes here.
I did put the timestamps of the video that I wanted played right here.
Unfortunately, did not put the link to the video, so I have to refind it.
It just says audio around 34 seconds.
Let's see if I can just do this freestyle.
Check this out.
How did San Francisco and its mayor London Breed go from this in July of 2020?
Breed also announced a major $120 million budget cut to the San Francisco police and sheriff's departments.
That money instead to be spent on addressing disparities in the city's black community.
To this, just one week ago here in 2022.
Now it's seven police robots with the potential for deadly force, the SFPD policy just considered by San Francisco supervisors, allowing robots to kill suspects in certain critical situations.
You like that?
That sounded good.
Yeah, we're trying to get on our iHeartRadio style.
Real podcasting.
Yeah, so last week, I believe it was last week, yes, the San Francisco Board of Supervisors voted to approve an SFPD policy that would use, quote, ground-based robots, aka killer robots.
They had a second vote on it, which came down yesterday, which a little behind the scenes, that's Tuesday.
And the Board of Supervisors actually voted it down, which was a twist, and we were not expecting when we were writing this episode.
However, the episode shall go on.
Now, the thing to understand about this is that California now requires police departments to take stock of all current military-style inventory, and that's like a technical term, military-style, and scope out what it has and what it will, could, is authorized to use all that stuff for.
And then all of that then has to be approved by local authorities, right?
So our police departments, all throughout the country, they get insane military-grade weapons, not just drones, stun grenades, all that kind of shit, but now apparently also now killer robots.
Yeah.
And, you know, it's funny because like, you know, you mentioned here like their assault rifles, which is, you know, AR-15, that's what that stands for.
Are not actually included in that?
Those are not included in the military-style rifles.
No.
The military-style weaponry.
Well, it is technically a military-style weapon.
It justifies, it basically, SFPD has like a shit ton, hundreds, I think, of assault rifles in its inventory, but it doesn't include those in the inventory list because it says that assault rifles are standard issue service weapons, which is completely and totally insane.
I thought that the famous shotgun in the back of the police car, or like in the front of the police car, but in the back of that area, was like the big artillery they were supposed to use.
Yeah, I mean, you don't see them walking, you don't see cops in SF yet walking around with AR-15s.
Like, it's just like not, that's not a cop in San Francisco walking around at all.
But also, more to the point, like, actually, that definition is a legal definition, right?
That's not something the SFPD gets to like footsie around with.
So, but putting that aside, I do want to say that the reason why our police departments are getting all these military-style weapons is because the DOD has a program.
It's called the 1033 program, where they just, it enables DOD to unload like as much excess equipment as they want to state and local authorities.
And that's how they all get it, right?
And 1033, for those listening, angel number.
Is that?
It is.
Oh.
I don't want any angel numbers around.
Yeah, I'm not a fan of that.
I think that's dark matter.
It leads to these robots.
Okay, so this initially came to public attention when Aaron Peskin, who is the chair of the Board of Supervisors, I think the Rules Committee, he is also deliriously short.
I mean, my God, the man makes me look like, I don't know who's a tall guy.
Think of whatever 5'8, 5'9 celebrity you have in your head and picture him.
I mean, the guy is a teeny little fella, almost as big as the bottles of liquor that he drinks nightly.
Little just, you know, little scuttlebutt around town.
The man's a lush.
But anyways, he fucking, he takes out for the SFPD's proposal here, the following sentence.
Robots shall not be used as a use of force against any person.
That's like, it seems like a normal thing to add in, to just say robots shouldn't be used as a use of force.
No, they should be used as the classic things of girlfriends and slaves.
That's basically the use of a robot.
Here's the thing, though.
SFPD got that suggestion back and then they're like, no, and they crossed it out.
Yeah, and this actually, like, this was, it was sort of funny to see actually the reaction to this because this did get voted on last week.
And I believe it was three against, eight for.
There's 11 supervisors in San Francisco.
The people who are against it were Dean Preston, classic D5 supervisor.
And has no relation or interest or any connection to this podcast whatsoever.
Also not related to me, to be clear about that.
Is that a thing?
A lot of people seem to think that he is of some relation to me.
What?
I got to say, lately he's been looking his haircut maybe.
Oh, this is like an ancestry.com.
We don't actually look the same.
People just think any two guys with glasses people think look the same.
It's one of the biggest discriminatory movies.
He's got kind of a big nose, too.
He's got a big nose, but I just have a different facial structure.
He's much taller.
Yeah, he's like 5'6, 5'7.
So Dean Preston votes against it.
Hillary Ronen votes against it.
And Shamann Walton votes against it.
Everybody else votes for it.
And you actually saw a lot of defenses, including from Rafael Mandelman, who's like, listen, a lot of people are saying that like, you know, there's going to be killer robots, but like, you know, they're not actually killer robots.
I'm sorry.
You got a robot that's specifically designed to kill somebody?
I call a spade a spade.
That's a killer robot, my man.
Yeah.
When SFPD struck out that suggestion, they replaced it with this sentence.
Robots will only be used as deadly force option when risk of loss of life to members of the public or officers are imminent and outweigh any other force option available to SFPD, which is a pretty crazy statement.
And we can get into some of the legal implications of that maybe a little bit later in the show.
But there was, it's funny because even though this got voted down, you know, there was a similar controversy in Oakland.
Their policy on, it was just like a couple months ago, I think.
It was very recent.
Their policy on police robots, which is just, I'm sorry.
I know that, look, we get everyone that's like, no, everyone's like, oh, you guys are boomers, blah, blah, blah.
I'm sorry, saying police robots as a thing that we have to discuss and do an episode on is fucking crazy.
My thing is, you want to make this sound like a, like, you want to make me not sound like a boomer?
Don't call it a robot, man.
Yeah.
That's, that's an old-fashioned thing.
They probably won't in the future because of some legal stuff we're talking about.
With Oakland PD, they were really looking at this one device, which was called a PAN disruptor.
You know what?
I've been called a bit of a pan disruptor myself.
But that actually stands for percussion-actuated non-electric disruptor.
The disruptor stands for disruptor.
Yeah.
This is a, quote, gun-shaped device that gets attached to a remote-controlled robot, and then it uses blank shotgun shells to disable a bomb by shooting it with pressurized water.
And basically, the city council was looking at this, and they were like, huh.
And they were talking to the PD about it.
Justification For Killer Robots 00:10:34
And they were like, all right, but can't, like, could this ever be used in any other situation?
And Oakland PD, like, basically begrudgingly, after like a series of like huffing and puffing in like very cartoon police way, like imagine cartoon pig in cartoon police outfit.
So charming, yeah.
So charming.
He's huffing and puffing.
He's like, ugh.
Yes, you can replace those blanks with live rounds.
Now, see, I read about this and I was wondering, like, if I was this guy, because he actually, he not only says that, like, he actually starts like over-explaining, which, by the way, TrueAnon tip.
Never over-explain.
Yes or no suffices.
Yeah.
Especially in a question and answer environment.
You don't want to be doing, you're in front of like a committee of some sort.
Yeah.
Look, you're a cartoon pig.
You already have a hard time talking.
Or anticipate what the objections might be and then seek to undercut those.
But he's basically like, well, they ask him, like, could this, you know, could this be used against a person?
He's like, no.
Well, yes.
And then he proceeds to give like a variety of situations that he's clearly thought about where that could absolutely be used against a human being.
Yes.
So, I mean, one of the things he says, I mean, is it possible?
We have an active shooter in a place we can't get to, and he's fortified inside a house, or we're trying to get to a person?
Interesting that he says that because there's obviously an incident at the top of his mind.
Cut to Dallas, 2016.
Bet you didn't think we were going to say that, did you?
No, no, no, folks.
A robot did not kill JFK.
Ooh, that's a good one.
You know what?
Actually, yeah, it could like a, you know, something people haven't considered is what if Lee Harvey Oswald actually came back from a future time period where JFK had turned into some sort of Hitler type figure.
Interesting.
And he could have killed him before killed baby JFK.
But he was in a, he did have.
Lee Harvey Oswald's like, I can't kill a baby.
It's an exterminator film.
Yeah, yeah, exactly.
So back in Dallas, 2016, right?
We're having a hot summer that year.
BLM protests are in full force, in many ways, sort of mirroring the much hotter but much sadder summer of 2020.
Somewhat masked.
Very masked.
Well, somewhat masked summer of 2020.
You can't actually catch coronavirus at BLM protests.
So there was actually no reason for the mask.
But anyways, so Dallas 2016 had been a pretty crazy week that week.
There was, I believe that was the same week that Alton Sterling had been executed in Baton Rouge?
Baton Rouge?
I like Baton Rouge.
Baton Rouge.
And of course, the video of Philando Castile, actually a live stream of him being killed.
It's a really tough video to watch.
Those had both recently happened and those had come out, right?
And so there was going to be this protest in downtown Dallas, actually pretty much near where the JFK robot assassination had happened.
Micah Johnson, Army vet.
One article from NBC has his former squad mates calling him klutzy and goofy, which I got to say, insane adjectives to use.
Yes.
When they served together in the, no shit, 420th Engineering Battalion.
He was a mason and a carpenter in the army.
Anyways, deployed to Afghanistan, you know, basically like a veteran, but not like some like crazy PTSD or the guy who shot American sniper style veteran, but also American sniper himself.
So the guy says, I mean, maybe I have too many details here, but you know, he's, this is pretty much a forgotten shooting, even though it was a pretty major one.
I think one of the biggest attacks specifically on police officers in our great nation's history.
But sorry, Liz gave me a look after I said that, but you know what?
I'm going to keep rocking through it.
So big protest happens, right?
It's sort of towards the end.
We're getting towards the evening, the night, and a car pulls up.
This is right near a college downtown campus in Dallas.
And a guy gets out, talks to a few different police officers, and then shoots them all.
And over the next several hours, gets in a running gun battle with the police through the college campus until basically he is cornered and is sort of stuck in, well, he's stuck, but he's also got the, I guess, the Dallas police officer stuck in down like a long hallway where he's firing from.
So there's also a famous move during this fight that I should mention here because those of you who studied urban combat will no doubt have seen this, where Micah Johnson is sort of like engaged in a pretty close quarters shootout with a police officer around these pillars outside of a building in downtown Dallas.
Does he go behind the pillar?
Well, they're both behind pillars sort of shooting at each other.
He goes behind, shoots at the ground to distract the officer in one direction, and then goes behind the cop in the other direction, bam, blows him away.
Are you telling me this is a video game move?
I'm telling you, you know what?
I don't even know if video games possess the technology where you could do a move this advanced.
It's a pretty crazy move.
Anyways, he ends up shooting, I think it's like maybe nine cops, killing nine cops, I believe.
But the, you know, which is, police do not like when one of them get killed, let alone nine of them.
Yeah.
Pretty deadly day for the force.
Wounds, of course, a bunch of other ones.
At the end of this little chase of the college, he sort of holed up at the end of this hallway, blasting back at officers.
They're shooting at him.
One of the big things is they say, like, oh, we expended 200 rounds shooting at this guy.
What?
I don't know.
That's not that many rounds.
Okay.
I mean, you know, because half the time people are shooting these things.
They're kind of just popping off.
Yeah, yeah.
They're just going all the way.
These aren't like named shots.
Well, it's Dallas, too.
Crazy Cowboy cops.
Exactly.
Well, they do only have six-shooters, which is why he very much outmatched them with his assault rifle.
But police, you know, report him as saying black power, you know, all this stuff.
This is, there's a lot to get into for maybe a different episode about sort of the story around this.
But, you know, they were sort of implying that he was a member of a black nationalist group, although it never actually came out that he was a member of any sort of organized group or anything like that.
Anyways, two hours pass.
The police are impatient.
They're being taunted by Johnson, who is saying classic quips like, how many of you guys did I kill?
Or I think I might kill some more of you guys.
So they say.
But there's absolutely no way that they're going to take this guy alive.
I mean, you shoot a cop, your chances of getting out of whatever situation you're in alive, it goes down by quite a bit.
You shoot nine cops, you're at 0%.
Yeah, totally.
So the police eventually send in a robot behind a wall near where Johnson is located.
Now, this is a Remotec F5A bomb disposal robot sold by Northrop Grumman with a pound of C4 strapped to it.
They detonate the bomb, and Johnson dies in the blast.
And the robot.
You know what?
Actually, the robot does not die, which is something that should be mentioned.
These EOD robots are famously very resistant to bombs.
The robot was damaged, but I believe, after, of course, recuperating from his wounds with his wife and newborn child, was probably put back into active duty.
But no, the robot actually survived the blast.
Fascinating.
But this was the first case of an unmanned vehicle killing a U.S. citizen in the United States.
In fact, not even just an unmanned vehicle.
I mean, listen, who among us hasn't put an apple down on the fucking gas pedal and had that fucking guy go down a hill in the streets of San Francisco?
How many of us haven't died, you know, have died in Tesla crashes?
True.
Well, it's manned.
You're just not really in control.
But this is actually the first case of a robot being used to assassinate a human being that we know of in the United States.
The justification from the police was that the police could possibly face injury or death, which I got to say, yeah, I mean, that's kind of a big part of the job.
Well, that's how the police forces justify the use of these things is they say, OK, this is going to keep the safety of officers.
Like, you know, this will keep them more safe and this will make the officers more comfortable in doing their job or whatever, whatever, whatever.
And we'll see that as we kind of go through this, even more so when they kind of describe future uses for them.
So like you could see a robot now stopping you, you know, coming up to inspect your vehicle rather than the cop is staying in the car, for example.
But it's always, always, always justified by saying that, you know, it's taking care of the officers.
Yeah, yeah.
And even an even more broader justification for this, because, you know, I think even the police, as pig-headed as they can be sometimes, realize that that's not always the best justification for the general public to accept things like this, is that it saves lives.
That's a huge reason that EOD robots are so popular and that their use was so that like, you know, when they were first deployed in Northern Ireland, which we'll get to in a second, was justified to the general public is that like, well, these are saving lives.
Well, they're saving the officers' lives, maybe, or they're saving like, maybe, maybe they are.
But the general thing is that they're protecting civilians.
That's how you're supposed to think of these things.
And really, we see that justification the whole way down from like EOD robots to drones.
But specifically, I do want to get into how did we get to the era, basically, where the SFPD, presumably using the example of the 2016 attack in Dallas, is basically trying to justify the use of what are essentially explosive killer robots.
Yes.
So since the popular imagination, as people like Liz tend to call it, but as I would call it, since man first began to dream about robots or automatons, there's been a lot of space taken up by thinking about how are these things going to fucking eventually turn on us and kill us, right?
Essentially Explosive Killer Robots 00:17:58
I call this, I really got to think of a pithy name.
Terminated mind.
That's my annoying Verso book about it.
But there's basically been attempts at suicide robots ever since we got robots to function a little bit like you would think they would in like the modern day, right?
So one of the first ones was called the Kettering Bug.
Now, if we called robots bugs, way cuter.
What about Roomba?
Kettering bug sounds very cute.
Sounds like it's going to do a dance.
You know what?
I'll be honest.
Before we get into any of this history stuff, what's your take on the Roomba?
I don't like Roomba.
I don't like Roombas either.
First of all, too loud.
Hugely noisy.
Very loud.
Yeah.
And I don't like that it just does its thing.
I don't like the, first of all, it's mapping my house, which I got to be honest with you.
People always say that.
I'm like, what is your house?
I mean, I could map anyone's house if I walked in there in two seconds.
What you have a one-bedroom and a bathroom.
Yeah, but maybe a little kitchen.
You're not selling the map of your house to like 85 billion advertisers.
No, I sell it to our highest patrons.
Yeah, but you know, but I don't like Roombas.
I feel like they steal jobs from boyfriends who like maybe don't know how to do a ton of like around the house stuff, except they do know how to vacuum, stuff like that.
I feel like that should be maybe illegal on that, just for that alone.
I think it's too loud.
And I don't like, you know, here's the other thing.
Me, I'm not, I don't like the element of surprise.
I love the element of surprise.
I don't like surprise on me.
I don't like surprise on me.
Fair enough.
Yeah.
So when I see something moving in the corner of my eye, I don't like that.
In my own home?
No way.
I don't like it either.
I can't get used to that.
Well, the company that makes the Roomba iRobot also makes.
You can't call your robot company iRobot.
Well, you clearly can't, although I do say that was a very famous.
Will Smith?
Well, I actually haven't seen the movie.
A much more famous book than the movie.
I believe the movie is not very well known.
But yeah, but iRobot itself was the.
Wait, they just named their company after famous books.
By Isaac Asimov.
Yeah, you can't do that.
That's like where they develop the rules of like whether robots should be able to kill you and all that kind of stuff.
Famously, that they can't kill you.
But that's disproved in the book.
I mean, that's the thing is they come up with the rules and then both reality and the very book that those rules surface in really just go and show us the opposite, right?
So anyways, the Kettering Bug, which was basically like, you know, this is happening during World War I. They're inventing it.
So it's like a biplane with a bunch of bombs in it.
It's supposed to be, I think it was radio controlled.
Back then, people were flipping all sorts of switches and making them do things.
A lot of knobs and switches have.
A little closer to a guided missile, really, than a plane, or excuse me, than like a robot or anything.
But we're sort of seeing where things would end up.
The French, you know, tinkerers that they are, a gnomish race, also had an unmanned explosive vehicle they employed around the same time as tanks, which sort of looked like a tank a little bit.
That was also an unmanned explosive vehicle.
But the actual tank filled with those brave boys from Alsace, those were actually, those could have overshadowed it.
I don't think they actually ever deployed their little fake robot into battle.
So the French, in the interwar period, when the rest of the country was busy, the French were doing a lot of weird shit back then.
They were trying to be like, the French attempts at fascism, some of the most pathetic things you could ever point to anywhere in history.
I think they do a pretty good job of it nowadays.
But exactly, but it looks nothing like it did.
It's quite modern.
Yeah, exactly.
So while everyone else was doing all those wacky shit in politics, some French tinkerers were working on a small remote-controlled demolition vehicle.
So this is, I believe, based on the attempt that they made during World War I to make this little fucking, you know, driving bomb.
Anyways, famously, the Germans, get this, guys, invade France.
What?
Take that son of a bitch over.
Wow, I totally missed that book.
Yeah, they're like, you know what, we need to put in charge here?
One of the oldest men in history.
Philippe Pétain.
I mean, let's say, you guys, they actually put the Germans, so respectful of French culture, actually put the oldest man in charge of France at that time that they could find.
Anyways, they find this little prototype, and Germans that they are, thieves and rats, all of them.
Not you if you're listening to this.
They take that thing back, they tinker with it themselves and create a device called the Goliath, which is a little bit ironic because it's a tiny little guy, you know, maybe the size of a hug.
And it is like a little tracked vehicle that's filled with bombs and controlled by wires.
And you drive it up to your little, like, I don't know, I guess they're the Nazis, so probably some people who really didn't deserve it.
And then you blow them up with it.
And so they created bigger and bigger versions of this.
And they actually employed it quite a lot, you know, of this.
Basically, a remote-controlled car filled with explosives.
This is, I believe, the genesis of the bomb that killed Micah Johnson and the ones that the SFPD would like to use on people in San Francisco today.
Yeah, they just finally went wireless.
It's the Nazi Goliath is what the SFPD wants to unleash.
And I'm not even kidding.
It's essentially the same thing.
So, okay, we've got your little, you know, we've got our little suicide vehicles here.
But I think we also kind of have to, if we're getting to where we actually get today, I think we have to take some other things into consideration here, right?
So World War II ends.
Oh, no, it's so scary.
Everyone has robot fever now, right?
We're all crazy about the golden age of science fiction happening.
So America is going crazy for this stuff.
You know, DARPA's out there, MIT is out there.
Basically, every nerd in America who, by the way, if all these people had been locked up or killed by either the Soviet Union or some sort of devastating brain virus throughout this period, we'd been in a much better place today.
Anyways, they invent this thing called Shaky the Robot.
Shaky?
Yes.
This was invented at Stanford, at least.
I think in the late 50s.
And it's got like a little camera on it.
It's basically the first modern day thing that we would recognize as like a robot.
So the early 1970s also saw the introduction of the first explosive ordnance disposal, or EOD, robots onto the battlefield.
This happened in Northern Ireland, where the Irish were battling the British over who could be less Jewish.
And the first thing that they come up with is called the wheelbarrow.
And I think they actually still use versions of the wheelbarrow today.
You know, the IRA was getting pretty good at these bombs, these fucking car bombs, these regular bombs, these street bombs, all kinds of bombs.
And so the British were constantly using, in fact, employing these EOD robots that they had just invented throughout the rest of the time of the Troubles.
And it's actually, you can see some pretty, I don't want to say funny, but they're definitely funny pictures of, like, this little WALL-E-looking robot, like, you know, going through the streets of Northern Ireland while these British people are, like, hiding behind the walls with their guns.
So EOD robots famously constantly used the whole world over, right?
I mean, this is kind of, a guy comes up to me now at the cop bars that I hang out at to meet boyfriends, and he tells me, oh, I work at EOD.
I'm like, so what?
You control a little robot?
Like, suck my dick, dude.
You don't do anything.
EOD robots used in like every fucking police department, used on every battlefield.
I mean, they're probably the most prominent use of what we would call, like, the robot in American police departments, except for, of course, the New Orleans Police Department, which does have a C-3PO.
I'm going to regret this.
So, okay.
Elsewhere in the 1970s, actually the late 1960s, in Vietnam, the U.S. Army, one of the most benevolent organizations in human history, was deploying laser-guided munitions and so-called, like the early, basically the earliest instances of so-called smart munitions.
So, this is also really an introduction where we hear a lot about how precision attacks save lives, right?
And of course, like the classic, I mean, a classic American tactic of, like, terror bombing and, like, just fully unloading dumb bombs constantly, like, in B-52s over, you know, cities and battlefields.
Like, you know, they do kill a lot of people.
But I think the contrast here between the so-called smart munitions saving lives and sort of the dumb bombs, like, killing people or whatever, they both kill people.
Right.
And I think it really, it's an obfuscatory, whatever you pronounce it, tactic basically used by propagandists for, you know, the war machine.
Anyways, we also see the introduction of UAVs, which is unmanned aerial vehicles, which have technically been in use in some form for like a very long time.
But the ones that we know essentially as drones really started with the, you know, similar thing to how like the guided missiles started with like these planes that they were kind of launching out of airfields during the First World War.
World War II, they kind of create this even crazier one, which is like an old B-17.
It's like they actually have a pilot up there for a second.
He takes off, arms the bombs, he jumps out.
The rest of the way is controlled by remote control.
In fact, John F. Kennedy, later to be killed by robots, his older brother, he's up there.
He's one of these pilots.
He's like, I'm taking off.
This thing's heading to Germany.
I'm going to fucking set this.
I'm going to take off.
I'm going to get level.
I'm going to turn the bomb on.
And then I'm going to parachute out like I'm supposed to do.
Not so fast, young man.
Bomb goes off early.
What could have been?
Oh, my God.
J.K. Jr.
J.K. Jr., I know.
Joseph.
I wonder if it's Joseph P. Kennedy Jr.
I actually don't know his middle name, but Joe Kennedy Sr., real piece of shit, by the way.
Anyways, technology advance, drones get improved upon.
I mean, drones, as they are, get invented.
And in addition to their use as reconnaissance vehicles, which is what they've traditionally been used for, they actually begin carrying deadly payloads as well.
Israel is sort of the first pioneers of this.
I believe somewhere I read that Iran had one that was like armed with an RPG in the 80s, which is kind of cool to think about.
But drone as drone didn't really come into its own until, of course, the classic Afghanistan, right?
That would be America.
That would be Afghanistan.
Not the Taliban.
Afghanistan invented these things in the 1960s during their push for industrialization.
No, so.
No, the Predator drone.
Everyone knows the Predator drone.
That's like the symbol of, I would say, like the Bush years would be the Predator drone.
To me, the real Predator drone is, hey, how are you?
My name is Brace.
I play in an indie rock band.
Oh, you look so old.
So yeah, the Predator drone saw its first combat use in Afghanistan, like, I think just like a few, maybe a week after the invasion started.
I mean, we've been flying unmanned, like unarmed rather, drones over Afghanistan for a while before that.
But we really started rocking people with the hellfires right after 9-11, and we kind of haven't stopped.
Yeah.
I think Predator drone, very clunky.
It's got kind of a bulbous Jacobian forehead.
It's like not, it's, you know, it's definitely like a big boy.
It's a fatty.
It's not.
It's not sleek.
It's not sleek.
I mean, those things are big.
And have you seen the little cages they keep these little freaks who drive them in?
Yeah.
Good God, man.
Do you imagine what those things smell like?
If there is a drone operator on this earth who doesn't stink like shit, then I will eat my own shoe.
My God.
So anyways, now it is very normal, actually.
And sort of people don't even really think about it.
But there are literally an unknown amount of American robots that fly over various countries we are not at war with that are ready to rain down destruction, including, in some cases, this giant flying knife at people, which is a new kind of bomb they came up with, at a moment's notice.
So we have essentially these flying terminators all around the world that are ready to basically off anyone we don't like, no matter what.
You know, it's fitting too that the last U.S. drone strike in Afghanistan, when we were still in Afghanistan, was actually, of course, all civilian.
I mean, this is, I mean, a huge topic that has been covered extensively, but obviously the death toll from these, you know, these precision-guided munitions, these drones that are supposed to be, you know, preventing combat deaths or preventing excess deaths in general have often been used on totally civilian targets.
I mean, it is astounding how many just regular people they kill, a lot of the times on purpose, you know, to seed terror in these countries.
I have also had a grenade dropped on me from a drone.
Not on me, on a building I was in, but sort of smaller drones, even like civilian drones essentially, are often repurposed in combat now by, let's say, non-state actors to serve as sort of crude bomb-dropping devices.
Recycled.
Exactly.
Again, though, almost all of the things that we've described here are controlled by a human operator, right?
That's a human operator controlling the robot, but it's a human operator involved to begin with.
So in 2020, during a failed offensive by the Libyan warlord Khalifa Haftar, Turkey, which was backing his rivals in the Government of National Accord, of course, had them on the pod as well, kind of tried to do like a Rogan-style, like, you know, guys, smoking a little weed here.
That did not go well in that episode.
It will never be released.
Anyways, Turkey used autonomous drones to harry and harass Haftar's forces while they were retreating.
So according to a UN report on the subject, logistics convoys and retreating Haftar-affiliated forces were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapon systems, such as the STM Kargu-2, and other loitering munitions.
The lethal autonomous weapon systems were programmed to attack targets without requiring data connectivity between the operator and the munition.
In effect, a true fire, forget, and find capability.
So what that's saying there is killer robots.
Yeah, that's a robot.
That is.
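The "fire, forget, and find" capability the report describes can be caricatured in a few lines. A minimal sketch, assuming an entirely invented `Contact` type, signature, and threshold; none of this reflects any real weapon's logic, it only shows where the human disappears from the loop:

```python
# Loose sketch of "fire, forget and find" versus operator-in-the-loop.
# Every name and threshold here is invented for illustration.

from dataclasses import dataclass

@dataclass
class Contact:
    signature: str    # what the onboard classifier thinks it is seeing
    confidence: float

def operator_in_the_loop(contact, approve):
    """The older model: the machine proposes, a human disposes."""
    return approve(contact)

def fire_forget_and_find(contacts, target_signature, threshold=0.9):
    """The new model: once launched, there is no data link back to an
    operator, so the munition engages whatever matches its stored
    signature entirely on its own."""
    return [
        c for c in contacts
        if c.signature == target_signature and c.confidence >= threshold
    ]
```

The entire ethical debate lives in the difference between those two functions: in one, a person says yes or no; in the other, a stored signature does.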
I was just going to say, so it's like up until, like we said, up until this point, I think what we were talking about, even with drones, people don't really think of those as robots, right?
Some of that, I think, is because we've become so used to this technology that because of that, our understanding of kind of what this stuff is feels like it's like more and more normalized.
And so we don't think of it as something like so future sounding as like robot.
For sure.
Yeah.
On the other hand, like you mentioned, all of this stuff previous to when we got to Libya was remote controlled by operators.
And so they weren't ever making decisions themselves.
They weren't, you know, they were just sort of like, I, you know, they're a little thing that can deploy something, but a person has to say yes, no, whatever, based on whatever reconnaissance or not the thing gives it, right?
Whatever data it feeds.
Okay.
What we're saying here is that in 2020, that changed.
And we saw an actual autonomous, I don't know, robot.
It's a robot.
Scary, it's a killer robot.
Kill somebody.
Yeah, yeah.
I mean, here's the thing.
Like, we're not robot experts.
We're not drone experts.
I would say we're laymen on the subject, right?
Gladly so.
I will never take the side of the robot.
But this is what it looks like to me.
I'm not a, I'm not, you know, I'm not a dumb guy.
I'm not a smart guy.
I'm a regular guy.
And I'm seeing this stuff here.
And I'm seeing a robot killing a human being in the desert with no human input.
And that to me raises my hackles.
As it should.
I mean, I think, okay, you know, with all robot and artificial intelligence content, and we can talk about the difference between those two if there is or what or whatever, if we want.
But it brings up a ton of interesting kind of like legal, ethical, you know, debates, questions, whatever.
Like in the intro, what the chatbot said.
You know what I'm saying?
You know, it's like, like we kind of detailed, in the Dallas shooting, the robot lacked any ability to make any kind of independent decision.
Yeah.
Right.
It was an unmanned vehicle, and the decision was made by the police, right, to end the standoff with the detonation.
But it wasn't the robot that decided that.
Yeah.
Which is an important distinction.
And technically, that robot was, now I'm just, let's, you know, taking a step back.
Technically, the robot wasn't designed to harm people.
No.
And it didn't have any ability to make any independent decision.
Right.
So it's sort of, like, ramshackle, you know, kind of a makeshift use.
Yeah, they strapped a pound of C4 to a robot whose actual purpose was bomb disposal.
Big human intervention there.
Yes.
Totally.
So I think in general, when we think of robot, at least when I think of robot, because I want to talk about this, we don't think of just like a vehicle without a person in it.
No.
Right?
I mean, that's just like, I don't know.
It just, it doesn't feel like a robot yet.
No, no, no, no.
Legal Classification of Robots 00:15:20
We think of AI or what's also known as artificial intelligence.
We're thinking of an R2-D2 or a C-3PO.
Yeah, basically, look, there's no agreed upon definition of what a robot is, which is good for the sci-fi genre, but bad for the legal system.
I think there's a lot of debate over what actually is artificial intelligence.
We don't have to get into the weeds about that, although I'm curious what you think a robot is.
You know, it's funny.
I actually.
It's one of those things with a lot with me is I know it when I see it.
But I'm saying a little metal guy, whether it's an arm or a whole guy or even something that doesn't look like a guy, but I'm going to say even these little crawlers, and an R2-D2 type robot as opposed to a C-3PO type robot.
And I think, and I'm not even like saying this for effect on the show or whatever.
I think they should all be extremely illegal.
No, but like, what is, but like, what do you think constitutes robot?
Like, when does it go from unarmed, sorry, unmanned vehicle to robot?
Oh, like, when does it go?
When does it go from being like an RC car to a robot?
Yeah.
I'm saying RC cars are little robots, but they have, they're more androids because they have a man connected to them.
Pause.
But I would say, I don't know, actually.
I don't exactly know.
So I think, I mean, it's an interesting debate.
Like, we could get stoned and go to a freshman dorm or whatever and get freaky with it.
We actually couldn't because that would affect my sobriety, but that's okay if you think that.
I hate to quote this guy.
Sebastian Thrun, who's the director of the AI lab at Stanford, I think has like a pretty decent like definition, which is that, you know, for artificial intelligence, which I do think a robot requires.
Interesting.
I don't, but go on.
I think that when we think of robot, yeah, I think decision-making is important.
But he says that it's the ability of a machine to perceive something complex and make appropriate decisions.
Although I might quibble with his use of appropriate there.
But I do think that that kind of gets us closer to maybe something.
So I think where we maybe differ on our definitions is I don't think a robot needs AI.
I think a robot can have either a simple or complex algorithm, which is what you often see called AI.
Right, okay, just a machine thing that does something.
Totally, right?
So I'm not sure.
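The definitional split being argued here, a fixed pre-programmed sequence versus Thrun's "perceive something complex and make appropriate decisions," can be caricatured like this. A loose sketch with invented names, and pointedly, both versions are still just algorithms:

```python
# Caricature of the definitional debate: fixed sequence vs. perceive-and-decide.
# All function names, steps, and rules are invented for illustration.

def preprogrammed_machine(order):
    """No perception, no decision: the same fixed steps every time,
    like a kiosk robot arm making a drink."""
    return ["grind", "steam", "pour", f"serve {order}"]

def perceive_and_decide(percept):
    """Thrun-style sketch: map a perceived situation to an action.
    Note this is still just a lookup, which is the point being argued:
    'robot' does not have to mean 'AI'."""
    rules = {"cup_empty": "pour", "cup_full": "serve", "no_cup": "wait"}
    return rules.get(percept, "wait")
```

Whether the second function deserves the word "decision" at all is exactly the freshman-dorm question being dodged.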
So on our way back from our wonderful shows in San Francisco, right?
Our way back here to, oh, ever so cold, New York, shitty.
I was at SFO, of course, denied entry to the Admiral's Lounge due to the incident.
Wonderful airport.
Great airport.
It's the best airport in the country.
It's fantastic airport.
Calm.
Calm.
Well, here's me calmly trying to get my damn matcha latte.
And what do I see but a robot arm behind a kiosk, right?
And this thing, I've seen videos of it before.
I don't like it.
But I'm like, you know what?
I'm going to try for one of the nastiest drinks I can get.
A matcha latte.
And so I put my card.
I actually don't have a tap card.
I do have a debit card with about a third of the stripe that fell off after I dropped on the ground.
What's going on in your wallet?
You always have like broken cards and like things getting demagnetized.
Yeah, yeah, that's true.
That's because I carry a lot of dangerous shit in this.
Anyways, I pop that in.
It accepts.
And it makes me one of the worst drinks I've ever had in my life.
But there's a crowd of sort of like gaping onlookers.
Lookie-loos.
And you know, this thing doesn't, it's not AI, right?
Like it's pre-programmed to do a certain number of tasks.
But there is no job that I've ever had in my life, including, as we saw from that fucking ChatGPT bullshit, podcaster, that could not be replaced by AI or, excuse me, by even a complex algorithm for the most part.
And so my thing with this is all of it.
I don't like any of it.
Even a simple, complex algorithm, whatever.
If a man can do it, a man should do it.
You know, it's listen, these things are job killers.
Yes, they definitely are.
I mean, they're going to kill just the simple predator drone.
I mean, because AI is definitely coming to drone warfare.
100%.
It's funny.
There's like a lot has been made in like pop, there's been all these like movies in the past like decade or so about like the perils of being a drone operator, which I was kind of like thinking back on when we were kind of going through the notes for this.
The tech exists to push this even further and put those hardworking drone operators out of work.
You know, the tech actually exists right now to put those hardworking drone operators into a six-foot hole.
In 2020, the Army equipped an MQ-1C Gray Eagle drone, which I just, just as a side note, I fucking hate our names for weapons.
Yeah.
Predator, I got to be honest with you.
Predators are pretty sick names.
Okay, I'll give you predators.
Hellfire, also a sick name.
Gray Eagle sucks.
Gray Eagle is awful.
Patriot?
Horrible.
God.
I hate it all.
Anyway, whatever.
It's just corny.
Okay.
So DOD, or excuse me, the Army, they equipped this drone with the Maven Smart System and an algorithmic inference platform.
It uses sensor data and turns it into target info.
And then, based on that info, selects the best weapon for response to a given threat.
All of that, fully autonomous, right?
Now, they put on a big show with this thing.
And the big show was that, you know, they're like, okay, this is AI.
This is machine learning.
You know, you've got the timeline.
You've got data collection.
Then the weapon system is engaged.
All of that was shortened from what used to take 20 minutes to 20 seconds.
Yeah.
So the tech exists.
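The sensor-to-target-to-weapon chain described in that demo can be sketched as a toy pipeline. All stage names and lookup tables below are invented; the actual Maven system is not public, and this only illustrates the shape of the claim, that there is no human step between the stages:

```python
# Cartoon of the sensor-to-shooter pipeline described for the Gray Eagle
# demo: sensor data in, target info out, a weapon recommended. Stages and
# names are invented for illustration.

def extract_targets(sensor_frames):
    """Stage 1: turn raw sensor data into candidate target info."""
    return [f["object"] for f in sensor_frames if f.get("moving")]

def recommend_weapon(target):
    """Stage 2: select a response for a given threat (a pure lookup here)."""
    table = {"vehicle": "missile", "structure": "bomb"}
    return table.get(target, "hold")

def sensor_to_shooter(sensor_frames):
    """The full chain, fully automated end to end. Collapsing this from a
    staffed 20-minute process to 20 seconds is the advertised win."""
    return [(t, recommend_weapon(t)) for t in extract_targets(sensor_frames)]
```

The speed-up comes precisely from deleting the deliberation between the stages, which is what makes the "human in the loop" reassurance ring hollow.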
There was a, I found this, like, horrifying quote from Brigadier General Ross Coffman, naturally friend of the show.
He said, so obviously the technology exists to remove the human, right?
The technology exists.
But the United States Army, and I swear to God, this is a direct quote, but the United States Army, an ethical-based organization, that's not going to remove a human from the loop to make decisions of life or death on the battlefield, right?
We understand that.
I'm not sure we understand that.
I don't think we understand that.
Here's the thing.
Here's the thing about all of this, Liz, is this is a slippery fucking slope, right?
Yes.
There actually, there have been various attempts at making international laws to basically ban what would be AI-controlled robot warfare, but I guarantee you this, that will not happen.
Well, the federal government is already spending a shit ton of money on robotics research.
Everyone is researching robotics.
You know, inexpensive and sophisticated tech is going to be extremely attractive to police forces, like just as it has been to the military, you know what I mean?
Regardless of the intent and design.
And I think that's like an important thing, right?
It's like, you know, I'm not a tech person.
Okay.
I'm not as, I swear I'm not as stupid as sometimes I play on the podcast about this stuff.
But I do think that there is a bit of naivete or let's say shirking of responsibility in the tech sector on the development of some of this stuff and then its possible use cases, right?
Absolutely.
And part of that is, you know, complicated because, you know, especially with a lot of the computational technology, a lot of it's like understanding how it will be used comes in then feedback from the user, right?
I understand that.
But that's very tricky and sticky when the user is police forces or the U.S. military.
You know what I mean?
And this is like a direct quote from the U.S. government on this.
They said, it really started with this idea that commercial industry, Silicon Valley, had come up with techniques going back to 2008, 2009, when the sort of deep learning kind of revolution happened.
The idea was, let's bring in industry, let's sort of translate what you're doing in Silicon Valley in commercial.
We want to give you military scenarios to see what you can do with that.
And that's where all of this is converging, right?
And it's really, you know, you start working through a lot of this stuff, especially when we're talking about specifically the police force, right?
Like how is, you know, you mentioned, you know, kind of the legal area, right?
When we think about, you know, to get a little heady here for a second, it's like when you think about what is, you know, theoretically in intent, you know, reaching for a fully automated system, right?
Yeah.
What does then due process mean in a fully automated system?
How is that even possible?
Yeah.
You know, theoretically.
And obviously, you know, it's like there's a lot of sci-fi that has covered all of it.
I'm not trying to blow anyone's mind, but I want to bring this back specifically to police forces.
You know what I'm saying?
Like a lot of this will come down to how we classify robots, which sounds insane, like under the law.
Like this is totally uncharted territory.
You know what I mean?
Like is machine learning technology?
Is it something else?
Is it property?
Is it human?
Is it another legal category?
Is it animal?
Is it slave?
Is it child?
I know this sounds insane.
Yeah.
But like putting this stuff in the framework of policing is super interesting, right?
Because, you know, human police, which sounds weird to say, but I'm not sure.
Well, there's dog police.
Yes, but I'm talking about human police here.
You know, they're legally allowed to, I mean, you know, give me, be generous with me when I'm talking about this, okay?
They're legally allowed to resort to force, right?
Including deadly force, which we all know.
And claims of excessive force against police, those get judged under the legal standard of, quote, objective reasonableness, which is, you know, under the Fourth Amendment, right?
Okay.
So the Supreme Court has really shied away from defining out in any kind of, like, list what factors could be used in determining what constitutes reasonableness in any given scenario in these, like, excessive force cases.
And so instead, what's kind of been given as direction is a really, like, mucky, confusing legal doctrine that's notoriously difficult to articulate.
And it's all done on a case-by-case basis.
Okay.
We all know this, right?
We all know that the police are given, like, extreme generosity from the courts in determining what is, like, reasonable.
But for the most part, you know, courts have maintained that police are forced to make split-second decisions about the amount of force required in any given scenario.
As an aside, I think there's some pretty interesting, there's some interesting like people who have problematized just that idea.
The idea that excessive force relies on split-second decisions by the police, I think, is very easily contested.
And that would completely undermine a lot of these Fourth Amendment cases.
Anyway, but basically, that's all to say.
The courts have given the cops a lot of deference when they testify and they believe that their lives were in danger, right?
Very much so, yeah.
Even if and when those fears turn out to be mistaken, which is an important thing to note.
Okay, so what does that have to do with robots?
Now, there's a couple things to think through here, right?
Like I said, what is a robot?
Like legally, what category does the robot fall under, right?
You know, the thing is, the categories that, you know, when this gets decided inevitably, right?
It's going to get carved out in the coming decades as these cases appear before the courts.
And the courts are going to use analogies in order to understand robots and justify those rules, right?
So how the law compares them to children, slaves, computers, property, animals, humans, right, is going to determine whether or not a robot could be liable for using excessive force, right?
And you can like, you know, if robots, let's say, let's push it out, right?
If robots are property, right, let's just say that, okay?
If robots are property, then there's no legal basis for them to defend themselves against a human attack, even if they're operating a policing role.
So you can see how, from a legal basis, from the legal standard, if it's a piece of property, it cannot defend itself, right, against a human attack.
So for instance, this is like, I mean, it would be like the same thing as me shooting a police car rather than shooting a police officer.
Exactly.
And so a police car isn't allowed to have a machine gun on top of it that comes out through sort of like separating plates and then scans the area and then shoots me.
And so why should a robot?
I mean, you can, you know, let's use a different analogy, right?
Let's say it's an animal.
Well, then, is cruelty to robots going to be criminalized, right?
These are interesting questions.
I mean, there was a case where a robot, there was like, do you guys remember there was like that robot?
I'm just forgetting.
I don't have it here in front of me, but it was like a robot that was found dismembered in like a mountain.
Like, it was like a hiking robot.
And like all these people like dismembered it in some kind of like, I don't know, maybe it was like a Donner Party LARPing situation.
But, you know, a lot of people in response to that pushed back because they were like, oh, this is showing our, you know, I mean, a lot of reason why we have animal cruelty laws and other sorts of, you know, they're kind of like to map out what we as a society say or like, you know, it's not a utilitarian thing, right?
It's because this is, you know, this goes against a fundamental, like, I don't know, moral principle that we have as a society.
And so many people were pointing to that as, you know, with the robot, the hiking robot or whatever, dismemberment, saying, look, this shows that we need to, you know, protect the robots because it shows something so ugly about our own nature, right?
Now, if they're similar to animals, then how does that shift when it comes to police robots?
Are they then police dogs?
That you could see that kind of scenario.
What about damage?
What about hacks when these things inevitably malfunction, right?
Who bears responsibility for the mistakes of the robot?
You know, if it causes a threat to be misjudged because of a hack or because of a malfunction, which will inevitably happen, right?
And then it uses disproportionate force, who's liable?
Learning in the Autonomous Realm 00:06:43
I mean, these are all like really I mean, and this isn't even like scratching the surface.
We haven't even talked about what about when there's criminal robots.
So now you've got police robots fighting criminal robots.
What about the use of excessive force there?
If a police robot isn't a human, then a criminal robot isn't a human.
And so there cannot be any excessive force.
Well, it's like, but then you get also onto like, what about when there's, you know, robot wars that are fought.
Yeah, I know.
It sounds so insane.
But it's all, all of this is, you know, this is very long and kind of weedy, I guess.
But like, all of it's to say that these like legal, this legal regime is completely and totally uncharted.
There are no laws governing any of this stuff.
And so what happens, you know, in these cases, in the case of Dallas, or if we see more and more forces, like police forces, deploying these robots, and I mean, this stuff is going to happen in the next 10 years.
We're going to start seeing cases.
I mean, that's the thing is like this stuff can seem like, I don't know, all, you know, so futuristic and so like, oh, I'm just smoking weed and having this theoretical thing in the college dorm.
But like the fact of the matter is, right?
There have been kills, confirmed kills by autonomous weapons, okay?
And, you know, there was autonomous weapon.
I talked about the drone too, but like Germany, I know, invented like an autonomous gun that can shoot down projectiles from the sky.
Like a big thing that is used by, let's say, non-state actors, non-state forces in warfare is these small swarms of suicide drones, also used by state actors as well.
But which can be very effective, right?
And they were using them against like air bases in Afghanistan.
Germany had a gun that could just like automatically track and shoot down these things.
Obviously, those are shooting down projectiles.
They're not killing people.
But like this, this sort of technology is advancing at a pretty rapid rate, right?
And we're seeing the actual era where this becomes a possibility.
Like the use of all these things, again, I should remind you, like, you are not taking time out of your day to, like, trip out at the fact that there are literal fucking flying robots all around the world.
I sound like fucking Louis C.K. here.
Like it's so incredible.
Like there are actual flying robots, like, around the world that can just blow a guy up at a fucking moment's notice, a push of a button from a fucking gamer setup in Nevada.
Right?
I mean, this shit is, this is not, we are leaving the realm of the theoretical and entering the realm of the practical.
And it's best to like actually figure out this stuff before it actually happens.
Because a big, a big, you know, thing that, you know, from the Silicon Valley types is move fast and break things and like progress.
And that any technological advancement is necessarily synonymous with progress.
Right.
And like, you know, call me fucking crazy, but like you see all this progress that we've made with like the, you know, the internet and technology in the past 20 years.
It has made the world an actively significantly worse place, right?
And it's like, imagine that, you know, that sort of same sort of technological leaps and bounds, but to creating artificial intelligence or fucking, you know, or robots or better robots or robots that have artificial intelligence.
I mean, I'll tell you one thing is, is the reason like, you know, we don't want fucking people killing animals, right?
We don't want like, you know, people abusing animals, killing animals, right?
Is because like, you know, first of all, that says something about our moral and ethical duty to like other living beings, right?
And like, you know, the thing is with a cop, you know, I dealt with a lot of cops.
I actually have had the pleasure of both being, you know, droned, not, but not in a, probably the smallest sense of it, but, you know, having had a drone drop explosives.
And also I've had a gun pointed at me by the SFPD in the alley across from the Mexican gay bar on fucking 16th.
And I have been subject to both of those things.
A man pulls a gun at me, even if that man is a piece of shit, that is still a man, right?
On some level, still a man.
And a man that I can reason with, I can talk to.
Generally not if they're a cop, but you know, there is that possibility there.
They were born a human being with a soul.
And like an AI or a robot or whatever, you know, specifically, I guess this would be AI or an algorithmically driven robot, whatever.
You know, these things, you know, maybe I'm sure that they'll eventually create some AI that has like less accidental, not accidental, but like unnecessary use of force incidents than like a flesh and blood police officer, right?
And eventually, depending on what administration's in power, which way the public wind is blowing, they're going to be like, look, we've made an anti-racist police officer, a robot cop, right?
Or they're going to use something like that.
I'm sure that these tests will be done and they'll do these kind of things.
It's like at the end of the day, though, like if someone's going to pull the trigger on me, it's like, I want a beating heart somewhere behind that gun, you know?
And like, or if someone's going to pull out a gun at me at all.
And it's like a robot can be programmed or fed data that gives it a simulacrum of an emotion or a heart or a moral compass.
It can have its moral, it can have an implanted moral compass, but it can never actually have a real one.
And it's like, and it can learn these things.
And I'm sure these fucking nerds will be like, well, it learns them in the same way that we all learn them.
First of all, I fucking despise you.
But it's not, it does not have, like, the same, like, flesh and blood connection to the earth and to the people upon it.
And so, like, it is, I'm, I gotta say, they're our fucking enemy.
That's why we say no robots here.
No robots.
No robots.
No more robots.
That's the true TrueAnon.
No more robots.
You know what, Liz?
I can guarantee you this, my sweet little moscher.
That some fucking nerd who listens to this show will be like, well, this is a robot.
Listen, we know what the robots are.
Yeah, you know when you see it.
We know when you see it, you know?
I don't need to get into this.
We're not Lex Friedman over here, right?
We're human beings.
I hate this little guy.
I'm going to tell you what.
Lex Friedman Theory 00:03:40
Let me talk about this Lex Friedman guy for a second.
Now, that is an AI if I've ever seen one.
So much has happened, Liz.
I know.
With us and with the world, but also just in the, you know, we've been on tour, but it's so nice to be back.
It's so nice to be back.
Hear about this Kanye West guy?
No, who's that?
Well, it's an interesting little fella.
He's like a sort of like an... How tall is Kanye?
He is huge.
Like 5'10, 5'11?
Wait, how tall is he?
Actually, how tall is he?
5'8.
He's really 5'8?
Yeah, he's got to be shorter than 50.
He's 5'8?
No, so that's internet 5'8?
Means he's real life 5'7.
I'm towering over this motherfucker at a hard 5'8.25 inches.
Yeah, he is.
You see Alex Jones and Steve Crowder.
Steve Crowder.
Steve Crowder turned.
Yeah, that's what I call him.
Steven Crowder.
They turned against Nick Fuentes.
I gotta tell you, this past two weeks has been a disaster for us kind, gentle, loving people who are trying to rehabilitate Adolf Hitler.
Oh, my God.
I mean, my God, 50 years of work I've been engaging in.
Down the drain, this charlatan.
That's the real psyop.
Yeah, yeah, yeah.
That should be the new, that should be the new like crazy right-wing theory, that all of those guys are a psyop.
Is that their own?
That literally is the theory on this.
Yes, it's like they're like, they're trying to shitcoat Trump or whatever.
Here's the thing.
I gotta tell you.
I forgot about Trump.
Is he still running?
You know, it's funny because he declared, right?
Yeah.
And then just, we didn't really hear much after that.
No.
Yeah.
Whimper.
Did he declare or was it just rumored he was going to declare?
No, I think he did.
He did declare, right?
It's just been such a nothing after that.
You know what?
I think they should have to go, I do declare.
And then we would know.
I do declare that I'm running for president of, well, the Southern States of America.
Yeah, he is, you know, well, he had, you know, he had dinner with like Kanye and Fuentes and Trump did.
Yeah, and that's a good fucking cocksucker.
You got to get back in the game.
I mean, I don't think he's very much.
I don't think he's in the game.
I think Ron DeSanctimonious is.
It's weird to be running your re-election campaign from the state where the other guy is governor.
Yeah, he's, I don't like him.
I mean, except obviously.
But I mean, not just for the obvious things.
He looks so clammy.
Ron DeSantis?
He looks like he's very.
Just two hours ago, Vincent Martini said the same thing to me.
No way.
He's just like, Ron DeSantis.
He's a very wet man.
Yeah.
Doesn't he seem very wet?
He does seem like a moist fellow.
I think he's going to Marco Rubio himself.
Gay?
No.
Like, he's going to do a classic.
Everyone's like, this is my guy.
This is the guy.
And it's like, it's crazy people were really like, Marco Rubio is our guy.
They're still trying to make that one happen.
My God.
Every, you know, every time.
What did Trump call him?
Little Marco.
Little Marco.
That's a.
See, that's so much better.
Little Marco is crazy.
Yeah.
Compare that to fucking Ron DeSanctimonious.
That doesn't work.
You know why?
Because I don't think Trump wrote that.
Oh, yeah.
You know, this is your theory that I get behind.
Trump's not.
No, no.
Patriots are not in control right now.
No.
There's so many ghostwriters.
Yeah.
Yeah.
Well, I think they just made the one.
He's just, just call him Clammy Ron.
Clammy Ron is good.
Yeah.
No one's going to vote for Clammy Ron.
You know, it's crazy.
It's crazy to see the battle lines getting drawn here, but you know what?
I mean, fucking egg on our faces.
We were Kamala first, people, and now it's like, I don't even know if they're going to run her on her own ticket.
I'm still Kamala first.
You know what?
Let's go full lady.
Dual-Wielding Middle Fingers, Liz 00:06:32
Because look, we here at TrueAnon, we're straight shooters.
We are hardcore still remain Jill Stein voters.
We're Stein all the way down.
We're Stein all the way down.
So what we're saying is, Kamala, let's reprise this role as VP.
You've done such a great job.
But get behind Stein.
Yeah.
Yeah.
You got to get behind Stein.
Hey, get behind Stein.
She's Jewish?
I think so.
She doesn't look like it.
I don't know anything about her.
I got to say, Kamala, very funny.
And she's having a renaissance right now.
Yeah, everything's coming up, Kamala.
I don't know if I would go that far at all.
You're laughing at her probably in greater numbers than ever before.
This is like what I call, okay, so every NBA season, the beginning of every NBA season, you can kind of catch the vibe of like where everyone's going in terms of like low-key, let's call them the hipster teams.
Oh.
Oh.
You know what I'm saying?
Yeah.
Of like, I'm rooting for these guys.
These are the fun-to-watch League Pass teams.
These are like League Pass, what I would call League Pass sleepers, which is, you know, no one thinks that they want to watch this, but this is, you know, classic.
Yeah, yeah.
In two to three years, everyone's going to be like, man, this squad is fucking fantastic, right?
I feel that afoot with Kamala.
And I have for about the past six months in that, it's really been eight months, where everyone's just like, God, she's so funny.
We misjudged her.
It's ironic, of course.
But she's, you know, she's got some ironic power that I think she could tap into, except the team is going to ruin it forever.
That little girl was me.
Yeah.
That little girl was me.
If the team gets wind of the ironic power of Kamala, her internal team, the whole thing's going to be dead in the water.
You think her and Doug Emhoff are fucking?
You guys, it's great to be back.
So good.
It's so good to have to do podcasting.
And we are so excited to announce that this was the last episode that Brace and I will be doing because now chatbots will be for like another minute.
We'll be no.
All right, fine.
You know what I'm going to say when you do that?
Then I just like freeze up and I can't.
Let me ask you a question.
Why don't we just tell him to pause recording and we'll start?
No, we're podcasting right now.
No, the whole thing of podcasts can get edited.
It's not live radio.
Okay, pause here.
All right.
Look at the end of the document, the very end.
Oh, no.
No, you didn't put it in yet.
Oh, fuck.
Wait, I got to switch up the verse chorus structure.
Oh, my God.
Hold on, hold on.
I'm getting there.
This is how the fucking, we never know.
We never do the live editing while we're going here.
We can switch off verses if you'd like.
Oh, my God.
But I think it's time to start.
Okay, let's switch off?
Okay.
Are you going to start?
No, you are.
Oh.
No.
Okay, I'll start.
No, it's fine.
Oh, God.
Liz Franczak is a real cool host, inviting her guests.
She's never boast.
But when Brace Belden came to town, she had a plan that would turn the tables around.
She offered him, well, this is not how this happened, but okay.
She offered him a co-host of the TrueAnon podcast, but only if he'd admit to shaving his legs.
I don't do that.
Brace knew that it was a test of his will, but he refused, ready to remain standing still.
Which is ironic because actually the reason that my legs don't have any hair on them right now is due to aerodynamics, but I didn't shave them.
It just fell off.
Oh, this is the chorus.
Oh, Liz Franczak, the hostess with the mostest, offering to co-host a rat in exchange for Brace Belden's admission.
It was a deal that seemed too good to be true, but it didn't stop Liz from trying to make it through.
The rat was a special one, Young Chomsky by name, living in a hole in Central Park, he'd claim.
He was excited at the offer made by the host and said he'd be delighted if Brace just said toast.
But Brace held his ground, his pride in the way.
He refused to admit that he shaves his legs that day.
So Young Chomsky stayed in his hole and Liz stayed host and TrueAnon is still going, so that's not a total loss.
Oh, Liz.
This doesn't really make any sense.
Oh, Liz Franczak, the hostess with the mostest, offering a co-host, a rat in exchange for Brace Belden's admission.
It was a deal that seemed too good to be true, but it didn't stop Liz from trying to make it through.
It's a funny thing, this deal Liz made.
She said she'd give the co-host to a rat, unafraid.
But Brace Belden kept his pride and said no.
So Young Chomsky stayed in his hole in Central Park, not ready to go.
But Liz still hosts the show, her voice ringing out strong, inviting guests and conversations that go on and on.
TrueAnon's still going, and it's worth a listen.
And Liz Franczak is still the one to thank for that decision.
Okay, I think it ends well.
I think so too.
And you know what?
I got to tell you, I've created some other raps.
I didn't think that the other raps would be okay for us to read on here.
Yeah, they aren't.
They aren't okay.
But I feel like this one was appropriate.
Yeah.
One second.
I'm actually going to come up with a quick way for us to say goodbye.
Okay.
Can you just read this?
Can you read this?
No, I can't.
You can't read this.
Young Chomsky, could you give her a hint?
Could you maybe see what you're reading here?
He's dual-wielding.
Dual-wielding middle fingers, Liz.
Dual-wielding.
And that is why I'll never replace Brace with a robot.
But I would gladly replace you with an even shorter woman that I could tower over and act like a tyrant in front of.
Now, I'm hungry.
I don't want to do the episode.
I'm going to be four hours late today.
You know what?
The rap studio next door to us literally stopped playing beats really loud right as we were recording the show.
I know.
Which is.
But you know what?
We pushed through it.
We pushed through it.
You know what?
We did.
And we're back.
We're podcasting.
We're back in the seat.
We've got some great episodes coming up for you guys.
We're doing an episode.
I'm not here to do this.
I'm not going to lie.
Wait, tell me.
I'm not going to lie.
No, no, no, no.
You'll see when she shows up.
Oh, my God.
I'm Liz.
My name is Brace, the robot Belden.
And of course, we are joined by producer Young Chomsky and our faithful friend, C3PO.
Don't get technical with me.
And this has been TrueAnon.
We'll see you next time.