Human Rights Watch Raises Concerns Over Autonomous Weapons

SCOTT SIMON, HOST:

Elon Musk, the CEO of Tesla, signed a letter earlier this week that got our attention. It was addressed to the United Nations Convention on Certain Conventional Weapons, and it outlined concerns that Musk and another 100 robotics and artificial intelligence experts have over the development of fully autonomous weapons - think self-driving cars but as lethal machines. Human Rights Watch is asking for a pre-emptive ban on the development of these weapons. Bonnie Docherty is the senior arms researcher at Human Rights Watch and joins us in our studios. Ms. Docherty, thanks so much for being with us.

BONNIE DOCHERTY: Thank you for having me.

SIMON: What are your concerns?

DOCHERTY: Well, fully autonomous weapons raise a host of concerns - moral, legal, security, technological. From the moral perspective, for example, many people find it objectionable, even outrageous, that a weapon could make a life-and-death decision on the battlefield or in law enforcement. From a legal perspective, there are a lot of concerns about whether they could follow international law and adequately protect civilians in war. And there are also issues of accountability. If a fully autonomous weapon killed civilians, who would you hold responsible? There are problems with holding commanders, manufacturers and producers responsible, so they would escape liability.

SIMON: A lot of people will note that human beings already seem to be pretty gifted at figuring out ways to kill each other without the benefit of machines, don't they?

DOCHERTY: Well, they do. And so what we want to do is avoid exacerbating the situation by giving humans new tools and, in some ways, delegating that responsibility to others.

SIMON: People who undertake drone warfare for the United States and other societies say that contrary to what we might think, drones actually can be more selective and they can observe the international rules of conduct and engagement in a way people often don't. You don't think robots could do that?

DOCHERTY: Well, the weapons we're concerned about in this particular campaign are a step beyond drones. Currently, drones have the ability to fly autonomously and even to select targets, but a human is still ultimately the one who pushes the button and chooses to kill. What we're concerned about is when we go beyond that - when the robot itself is making the determination to kill. That's when you cross a moral red line and run a lot of risks in terms of accountability and violations of international law.

SIMON: Haven't we learned from North Korea in recent experience - and for that matter, the rearmament of Germany in the 1930s - that people will sign any piece of paper, but when the crunch comes, they will develop the weapons that they have agreed not to because they think it's in their self-interest?

DOCHERTY: There are limits to international law, as there are to any national law. But creating a stigma through international law - through a treaty or a set of norms - will help limit that. Obviously, no law is perfect. I often use the analogy that laws against murder do not prevent all murder, but that doesn't mean we should not ban murder. I think this is similar - the law still has a deterrent effect. And it can affect even states outside a treaty, because they'll be considered a pariah by the international community if they behave that way.

SIMON: Do you know of a single nation that's expressed support so far?

DOCHERTY: There are actually 19 states that have spoken out in favor of a ban. And at the Convention on Conventional Weapons meetings, there have been a number of others who have spoken in favor of requiring meaningful human control over the decision to kill on the battlefield or in law enforcement.

SIMON: Well, let me put it this way. Have the United States, Russia, China or any country in the European Union expressed support?

DOCHERTY: Some European countries have spoken in favor of a requirement for human control. The U.S., Russia, China and others have not been in favor of a ban, but some of them have been willing to continue discussions. They recognize these weapons would dramatically revolutionize warfare, and they want something to be done. It's a question of how far they're willing to go. One thing we encourage countries to do, even if they're not at this point willing to go for an international ban, is to develop national policies that would restrict the use of these weapons at a national level. And the U.S. has done that.

SIMON: Bonnie Docherty is senior arms researcher at Human Rights Watch. Thanks so much for being with us.

DOCHERTY: Thank you very much for having me.

Transcript provided by NPR, Copyright NPR.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.
