Feb. 6, 2026 - The David Knight Show
20:40
Flashback Interview: Autonomous Killer Robots & The Death Of Accountability

AI and robotics professor Noel Sharkey reveals how a deep dive into U.S. military roadmaps shattered his faith in “benevolent” robotics, exposing a global rush toward fully autonomous killer machines with zero accountability. He details DARPA’s push for armed humanoid robots, autonomous drone jets, and ground-based weapons platforms—sold to the public as rescue tools while quietly designed for assassination and domestic control.

Money should have intrinsic value AND transactional privacy: Go to https://davidknight.gold/ for great deals on physical gold/silver
For 10% off Gerald Celente's prescient Trends Journal, go to https://trendsjournal.com/ and enter the code KNIGHT
Find out more about the show and where you can watch it at TheDavidKnightShow.com
If you would like to support the show and our family please consider subscribing monthly here: SubscribeStar https://www.subscribestar.com/the-david-knight-show
Or you can send a donation through Mail: David Knight POB 994 Kodak, TN 37764
Zelle: @DavidKnightShow@protonmail.com
Cash App at: $davidknightshow
BTC to: bc1qkuec29hkuye4xse9unh7nptvu3y9qmv24vanh7


Drones And The Robot Gap 00:15:09
Now, we're joined right now with Dr. Noel Sharkey.
We were concerned that we might not be able to get him.
He is traveling the globe trying to wake people up to the dangers and the abuses of robotics technology.
He's actually a professor of artificial intelligence and robotics.
And he has now, after he had kind of an epiphany, we're going to let him tell you about it.
I won't try to describe it.
His core research interest is now in the ethical applications of robotics and artificial intelligence.
Dr. Sharkey, can you hear me?
I can hear you very clearly, David.
Hello.
Great, great.
So you found Wi-Fi in that monastery?
Yes, I did.
That's great.
That's great.
Tell us a little bit about your epiphany.
When you went from, you were working in robotics, you were very much interested in the technology of it.
But like so many of us who work in engineering or technology, you didn't realize exactly some of the dark ways it was being used.
No, I didn't.
I had my head in a bag kind of thing, just getting on with my research.
And I was at a press conference in London about a government report on robots, worrying about housing benefits for robots of the future.
It was kind of a nutty report, and it was the first report our government had put out about robotics.
And at the press conference, some journalists said to me, Well, can you tell us a little bit about military robots?
And of course, I knew nothing about it, apart from a little bit about bomb disposal, because I'm from Northern Ireland and we know about bomb disposal.
So I thought I'd go off for the evening and just have a quick look at the internet, you know, an hour or so, sort of answer questions.
Seven months later, I finished that little look at the internet, having read through all the U.S. plans from 2001 right up to the present at the time: five or six years of plans for all the U.S. forces, the roadmaps.
And they were all talking about the application of autonomous robots for killing people.
And I just thought, this is ridiculous the way they were talking about it.
It was like science fiction.
They didn't seem to have an idea about the limitations.
And so I wrote an article for the Guardian newspaper in the UK then, 2007, and it started there, and it's been a whirlwind since.
I find it very alarming because I look at every week we see some kind of new robotic technology.
They're spinning it in the media as saying that these are the Pentagon's rescue robots.
I don't think that DARPA is really funding rescue operations.
I don't think the Pentagon's really funding that.
If anybody thinks that after looking at the way they're using drones to assassinate people all over the world, they need to have their head examined.
That's some of the most obvious propaganda I've ever seen.
The media should be ashamed of itself for promoting these potential killer robots, these robotic projects being put out by the Defense Advanced Research Projects Agency, DARPA.
DARPA has a research budget that is higher than the entire budget of North Korea.
We're supposed to be concerned about North Korea, yet North Korea's entire economy is about the same size as the DARPA research budget.
And this seems to be DARPA's primary focus.
Well, DARPA aren't accountable.
So their thing, and it's a good research agenda, their agenda is you just keep funding everything that's even slightly crazy.
And as long as one of them comes up trumps, then you're okay.
But I was laughing there because one of the reports I read from DARPA about these humanoid robots that are going to, they're going to carry wrenches, and they talk about them going ahead of the forces.
Well, why would you have a pile of humanoid robots going ahead of the forces with wrenches?
To fix people's water pipes?
To fix the broken tanks?
Yeah, here's the headline.
This six-foot, 330-pound robot may one day save your life.
That's talking about the Atlas robot.
And the troubling thing is that when the Atlas robot came out, I looked at it.
I thought it was very frightening the way that this thing is able to move around.
And the fact that it has these two long arms that essentially look like cylinders, where I can imagine them being guns of some sort.
And Ray Kurzweil comes out and says, that's great.
Now all he needs is a brain so he can act autonomously.
Oh, God.
Yeah, that's bad.
I think you're right, though.
I can see the cylinders are going to become machine guns.
I would say, you know, it's no doubt about that at all.
Maybe not for a while, but that would be the general plan, I would think.
Yes.
Why would DARPA be spending money on rescuing people?
When have they ever been in the business of rescue?
And then we have these articles that are coming out.
Now, this was about a year ago.
This is from Wired's Danger Room.
It says, the Pentagon doesn't trust its own robots.
They say that there's a cloud of distrust and misunderstanding hovering over robots that the Pentagon already has.
So they're looking at this and they're evaluating this and they don't really like it.
So we've got to conduct a media campaign to change the public's mind to change the military's mind.
You're pushing back in the opposite direction.
We understand that the military industrial complex wants this very badly.
They see this as a brand new profit center.
I'm very, very concerned about the lack of accountability that is going to happen with these killer robots.
Can you address that?
Yes.
I mean, you're right about the profit.
I mean, just in terms of the drones, which we know about already, I mean, Israel made 4.8 billion in the last five years profit on those.
And this competition is so stiff.
So the idea is that you've got all the drones.
How do you compete in that big market?
And the way you compete is let's add autonomy to it.
Let's make them work on their own.
And then we can make a lot more money.
And what we're trying to do is we're trying to preemptively ban them.
That means ban them before they get out and before there's too much investment.
Because once billions of dollars have been spent on it, it's going to be very difficult.
Very difficult.
And we're thinking, we're making very good headway at the UN, actually.
Well, that's good.
I'm glad that you're going around and you're addressing the press.
You're addressing governments.
You're pointing out to them where this could head.
I see this.
You know, when we're talking about nuclear war, if some government presses the button and launches a bunch of nuclear weapons on someone, there's going to be literally hell to pay for that.
There's going to be accountability for that.
But if they take a bunch of killer robots, as we've already seen with drones, we've seen how they can go in with drones.
They can destroy a village and there's no accountability.
Nobody is held responsible for that.
And it'll even be easier for them once these robots have self-autonomy.
They can basically say that they've got plausible deniability.
They can say, well, the software went wrong on it, or it was a hardware bug.
It wasn't me.
It wasn't the president.
It wasn't the general.
It wasn't the lieutenant.
It was just this crazy technology.
So we'll fix it.
Meanwhile, they've killed a lot of people.
And I can see them using this domestically as well as abroad.
And think about this: historically, governments have killed more of their own people than they've killed in other countries.
It's called democide.
So if you think this is something that is simply going to be used against these people on the other side of the world that you don't like, you need to wake up about that as well.
I think they really do.
And the thing is that the United States and the high-tech nations like the UK have a problem understanding that this stuff will proliferate and everyone will have it.
And then where are we going to be?
And that's my biggest worry is this idea that we'll keep ahead of the technology.
We'll be the ones who'll use it without thinking at all that everybody else is going to have it and they will all interact together and then it's going to be a real mess.
We've already seen countries like Iran setting up their own drones.
They're not that technologically advanced.
It didn't take them that long to get drones.
It's not going to take them long to get robots.
And what we're concerned about here, I have to say, Dr. Sharkey, is a little bit different. The U.S. is unique in the sense that now we have this massive program of bringing home weapons that have been used in Iraq and putting them in police departments here in America.
And we're already concerned about the fact that humans themselves are not being held accountable for their actions when they fire on a child with a toy gun.
What's going to happen when a robot does that?
Do you think there's going to be any accountability for that?
Do you really think the United States police will start using them?
I mean, there's certainly talk about arming robots with tasers.
That would be the start for it.
But tasers are kind of deadly.
I mean, 150 people died in the U.S. within a period of two years from taser abuse.
So if you start arming robots with tasers, then I think it's only a short step to arming them with weapons, really.
Yes.
They're very horrible, quote-unquote "non-lethal" weapons, very horrible ones.
It's not just tasers.
They have all kinds of things that they can beam at you to make you sick, to blind you, that sort of thing.
You talked about how they kill people.
We just had a teenager right here, just south of Austin, who was tasered by a cop in a school. He was standing around a fight and didn't get out of the way in time.
The cop tasered him; he fell down and cracked his head.
He's now in a coma.
They believe he may die.
That happens all the time with tasers.
So when you talk about non-lethal force, you talk about shooting people with rubber bullets.
You talk about using these gases on people.
That's not something that we're looking forward to seeing happen in terms of crowd control.
That can be very, very oppressive.
Yes, but you don't have to worry too much, because we're going to stop it.
Good, good.
So tell us about your campaign.
Tell us, you're going around and you're talking to different governments.
I guess you're having public speeches where you're trying to inform the public.
How is your press coverage?
Well, press coverage has been very good, but we had a breakthrough recently at the United Nations.
There's a committee of the United Nations called the CCW.
And that's the place where poisonous gas has got banned, biological weapons got banned, chemical weapons.
So that's the place for prohibiting new weapons.
And we spoke to the, the French have just taken up the presidency, and we spoke to the French ambassador.
Then we spoke to the U.S. delegation, and they've agreed with us that they would put this forward as a mandate for discussion in the UN.
And last Friday, Friday two weeks ago, 117 nations from the CCW met, and any one of those nations could have vetoed it.
Russia were there, China were there.
There was a massive discussion, and they accepted the mandate.
So next year, the CCW committee are setting up an expert workshop to take this on board and discuss it.
Well, that's great.
And your campaign is stopkillerrobots.org.
They can learn about that on the internet.
www.stopkillerrobots.org.
It's a campaign to stop killer robots.
Now, you've taken this to the UN.
You've taken it to other countries.
I'm a little bit more skeptical of the UN and these other countries.
Hopefully they will ban this and hopefully these other countries won't secretly develop this on their own, worried that they're going to have some kind of a robot gap or something like that.
Like they've done with weapons before, developing them defensively because they're afraid the other guy might do it.
My big concern, and I think your campaign will be very effective at this as well, and that is to address the engineers and the scientists to try to get them to understand, because that, I believe, is where we really need to go.
As long as there's an engineer or a scientist who will develop this kind of stuff for pay, who doesn't look at the ethics of their work, you're going to find some kind of a politician or a dictator somewhere who will do it, who will use those people, who will pay them large amounts of money, and who will break any treaty that comes up, won't you?
Well, the problem is it's mainly the U.S. because they're a bit more cautious in Europe.
In the U.S., it's very difficult to not get funded by the military.
Most of the robotics labs are run by the military, are funded by the military.
And people aren't necessarily making weapons and things.
But when you're funded by the military, it stops you speaking up.
Yes.
It stops you speaking up against it.
I mean, there's hope.
I mean, there's a professional magazine in the U.S. called The Engineer.
And they ran a poll of the readership asking how many people would go for a complete ban or how many people would just go for trying to make the weapons more perfect over time.
Only 3% of the readership said they would try to make the weapons more perfect over time.
And 73% said there should be a total ban.
So we're beginning to see a consciousness of it.
That's very hopeful, because so many times we'll talk directly, Alex will talk to soldiers and policemen over the radio and say: understand that the kind of society that you allow to happen, if you allow and are part of this kind of abuse that we see happening in the streets, that's going to be the society that you live in.
That could be your family that is brought under that.
And historically, we've seen that always does happen.
It can't be contained and only limited to people who don't work for the government.
But we need to have that kind of awareness with engineers.
And as you pointed out, it's very difficult in America to get a job if you're an engineer that doesn't involve the military-industrial complex.
I know when I got out, it took me a while to find a job where I wasn't working for the military.
That was my degree initially.
So I understand that.
And it's good that people privately say that.
And I think I really applaud you for what you're doing because getting the information out there, people don't even believe that this could happen in so many cases.
They don't believe that the technology is there.
They think it's just the science fiction fantasy of the Terminator.
And we've been criticized for that here, that we're talking about some wild thing.
Talk about how the technology is approaching rapidly.
Well, we've had 44 nations of the UN now speak up with concern.
So people better believe me now.
It's not science fiction anymore when you have major countries like the U.S., Germany, Pakistan even, you know, all speaking up against killer robots or about them.
So it's definitely on course to happen.
There's no question of that unless we stop it.
But you have to be very worried in the U.S. as well about your privacy.
I mean, you've got the NSA there really doing a lot of nasty things, surveilling people, but you're starting to get a lot of drones.
And once the drones become autonomous, not even armed, they can be everywhere.
They're getting very small.
And what worries me is not the current government you have, but these are legacy systems.
So what does the next government do with them?
What does the next government do with them?
Because they just inherit them directly.
And if you ever want to create an authoritarian regime, you're going to have the right tools to do it.
And that is a great concern to me as well.
Unmanned Aircraft Warfare 00:03:36
You hit the nail on the head.
What we are building in the United States right now is a perfect infrastructure for tyranny.
And whether our current leadership uses it or it's the next or second administration down, that's what we're concerned about.
Dr. Sharkey, what I would like for you to talk about in this segment is the state of the art in robotics.
People don't understand how imminent this problem is, how rapidly it's coming upon us.
Could you address that?
Yes, certainly.
Well, we've got lots of armed robots on the ground and in the air, as you know, that are remote controlled.
But what's happening now is a very rapid development of platforms that will carry the weapons.
The more we talk in the campaign about banning these weapon systems, the more we're driving them underground, it seems.
There's less talk about it now.
But for instance, in the United States, you've got three devices I'd like to mention to you.
One is called the X-47B.
It's a fighter jet, an unmanned combat aircraft, fast subsonic, just beneath the speed of sound.
It can land on an aircraft carrier.
It can take off from an aircraft carrier.
And it's going to be used in the Pacific.
It's got 10 times the range of one of your F-35 fighter jets.
So that's really productive for them.
And it's just been tested two weeks ago in very windy conditions.
And that's working very well.
That will be weaponized and used.
So it's like a super drone, but you don't need people involved at all in controlling it.
You've also got a prototype system called the Crusher.
And that's developed by Carnegie Mellon and DARPA again, of course.
Of course they are.
The Crusher is a seven and a half ton truck.
And you can see it on, if anybody wants to go on YouTube and just do Crusher CMU, they will find it and they'll see it crushing Cadillacs and it's got a big gun up there on top of it.
Now, the other device that you're making in the United States is on the HTV-2 program, again, DARPA.
And they've got an aircraft called the Falcon.
And the idea is to be able to get an unmanned combat aircraft anywhere on the planet within a one-hour window.
So this thing has been tested at 13,000 miles an hour.
So that's just the United States.
In the UK, we have the Taranis, which is actually supersonic, so that's even faster.
That's an intercontinental unmanned combat aircraft.
Fully autonomous, means that there's no human controlling it.
And that's been tested in Australia just in the last couple of months.
So these are progressing very quickly.
The Chinese have one called the Invisible Sword, and that's being designed and built for air-to-air combat.
Again, no human controlling it.
The Russians have the Skat.
The Israelis are using one called the Guardium.
And at the beginning, when they developed it, they talked a lot about it being fully autonomous, to patrol routes and to fire on people at the borders between Palestine and Israel.
But now I spoke to an Israeli colleague the other day who was very excited because I had an early picture of it with guns on it.
And now they can't find any pictures of them with guns.
So I'm afraid we're driving these people underground a bit, but they're still doing it.
Well, you know, we can understand very quickly the implications of something like the Crusher that's going to crush vehicles domestically.
But even when it comes to these supersonic jets taking off from aircraft carriers, these unmanned aerial vehicles that are remotely controlled, there's a lot of danger, in the sense that it's going to make our already aggressive government start wars everywhere for very little justification.
They Want to Know Everything 00:01:53
And without congressional authorization, that's going to make them even more likely to get involved in these wars.
And there's this thing called blowback.
You know, once you start a war somewhere else, it will come home to you one way or the other eventually.
It may be asymmetric warfare.
It may be terrorism with people coming to your country and blowing people up in shopping malls, which will then be used to send out the crusher robots.
But thank you so much, Professor Sharkey, for joining us.
We're out of time.
Good luck on your campaign.
Keep trying to educate people about these dangers.
It's a very important thing you're doing.
Thank you for having me on, spreading the word for us.
Thank you very much.
Thank you very much.
That's StopKillerRobots.org.
The Common Man.
They created common core to dumb down our children.
They created CommonPass to track and control us.
Their Commons Project to make sure the commoners own nothing.
And the communist future.
They see the common man as simple, unsophisticated, ordinary.
But each of us has worth and dignity created in the image of God.
That is what we have in common.
That is what they want to take away.
Their most powerful weapons are isolation, deception, intimidation.
They desire to know everything about us while they hide everything from us.
It's time to turn that around and expose what they want to hide.
Please share the information and links you'll find at TheDavidKnightShow.com.
Thank you for listening.
Thank you for sharing.
If you can't support us financially, please keep us in your prayers.