Ban ‘Killer Robots’ Before It’s Too Late
an article by Human Rights Watch (abridged)
Governments should pre-emptively ban fully autonomous weapons because of the danger they pose to civilians in armed conflict, Human Rights Watch said in a report released today [November 19]. These future weapons, sometimes called “killer robots,” would be able to choose and fire on targets without human intervention.
© 2012 Russell Christian for Human Rights Watch
The 50-page report, “Losing Humanity: The Case Against Killer Robots,” outlines concerns about these fully autonomous weapons, which would inherently lack human qualities that provide legal and non-legal checks on the killing of civilians. . .
“Losing Humanity” is the first major publication about fully autonomous weapons by a nongovernmental organization and is based on extensive research into the law, technology, and ethics of these proposed weapons. It is jointly published by Human Rights Watch and the Harvard Law School International Human Rights Clinic.
Human Rights Watch and the International Human Rights Clinic called for an international treaty that would absolutely prohibit the development, production, and use of fully autonomous weapons. They also called on individual nations to pass laws and adopt policies as important measures to prevent development, production, and use of such weapons at the domestic level.
Fully autonomous weapons do not yet exist, and major powers, including the United States, have not made a decision to deploy them. But high-tech militaries are developing or have already deployed precursors that illustrate the push toward greater autonomy for machines on the battlefield. The United States is a leader in this technological development. Several other countries – including China, Germany, Israel, South Korea, Russia, and the United Kingdom – have also been involved. Many experts predict that full autonomy for weapons could be achieved in 20 to 30 years, and some think even sooner.
“It is essential to stop the development of killer robots before they show up in national arsenals,” said Steve Goose, Arms Division director at Human Rights Watch. “As countries become more invested in this technology, it will become harder to persuade them to give it up.”
Fully autonomous weapons could not meet the requirements of international humanitarian law, Human Rights Watch and the Harvard clinic said. They would be unable to distinguish adequately between soldiers and civilians on the battlefield or apply the human judgment necessary to evaluate the proportionality of an attack – whether civilian harm outweighs military advantage.
These robots would also undermine non-legal checks on the killing of civilians. Fully autonomous weapons could not show human compassion for their victims, and autocrats could abuse them by directing them against their own people. While replacing human troops with machines could save military lives, it could also make going to war easier, which would shift the burden of armed conflict onto civilians.
Finally, the use of fully autonomous weapons would create an accountability gap. Trying to hold the commander, programmer, or manufacturer legally responsible for a robot’s actions presents significant challenges. The lack of accountability would undercut the ability to deter violations of international law and to provide victims meaningful retributive justice. . .
“Action is needed now, before killer robots cross the line from science fiction to feasibility,” Goose said.
DISCUSSION
Question(s) related to this article:
Drones (unmanned bombers): should they be outlawed?
* * * * *
Most recent comment:
Finally a Drone Report Done Right
By David Swanson
The U.N. and Human Rights Watch and Amnesty International recently released a flurry of deeply flawed reports on drone murders. According to the U.N.'s special rapporteur, whose day job is as law partner of Tony Blair's wife, and according to two major human rights groups deeply embedded in U.S. exceptionalism, murdering people with drones is sometimes legal and sometimes not legal, but almost always it's too hard to tell which is which, unless the White House rewrites the law in enough detail and makes its new legal regime public.
When I read these reports I was ignorant of the existence of a human rights organization called Alkarama, and of the fact that it had just released a report titled License to Kill: Why the American Drone War on Yemen Violates International Law. While Human Rights Watch looked at six drone murders in Yemen and found two of them illegal and four of them indeterminate, Alkarama looked in more detail and with better context at the whole campaign of drone war on Yemen, detailing 10 cases. As you may have guessed from the report's title, this group finds the entire practice of murdering people with flying robots to be illegal.
Alkarama makes this finding, not out of ignorance of the endless intricacies deployed by the likes of Human Rights Watch and Amnesty International. Rather, Alkarama adopts the same dialect and considers the same scenarios: Is it legal if it's a war, if it's not a war? Is it discriminate, necessary, proportionate? Et cetera. But the conclusion is that the practice is illegal no matter which way you slice it.
This agrees with Pakistan's courts, Yemen's National Dialogue, Yemen's Human Rights Ministry, statements by large numbers of well-known figures in Yemen, and the popular movement in Yemen protesting the slaughter. . .