"[Arms Control Today] has become indispensable! I think it is the combination of the critical period we are in and the quality of the product. I found myself reading the May issue from cover to cover."
More Scrutiny for Autonomous Weapons
December 2014
Autonomous weapons systems will receive more scrutiny at a UN conference next year after stark divisions emerged among the states-parties attending the annual meeting of the Convention on Certain Conventional Weapons (CCW) in Geneva Nov. 11-14.
At the end of the meeting, conference president Remigiusz Henczel of Poland announced “an informal meeting of experts” next April 13-17 to discuss questions related to the emerging technologies that policymakers call “lethal autonomous weapons systems” and headline writers have dubbed “killer robots.”
The issue raised by these new weapons technologies was first taken up by the United Nations after the April 2013 report by Christof Heyns, the organization’s special rapporteur on extrajudicial, summary, or arbitrary executions, called for a moratorium on the development of weapons that can target and kill without human intervention.
The first multinational conference dedicated exclusively to robotic warfare took place in Geneva last May. (See ACT, June 2014.)
Several nongovernmental organizations and a handful of countries renewed their calls for a pre-emptive ban on lethal autonomous weapons systems. But, as happened last May, the only consensus that the 80-plus governments in attendance could muster was that autonomous weapons need more study.
“Important questions still remain,” the European Union said in a Nov. 13 statement. The Campaign to Stop Killer Robots, a coalition of nongovernmental groups, lamented the lack of a “sense of urgency” in the scheduling of more study. The campaign called on delegates to adopt the standard that all weapons systems must have “meaningful human control” or else be prohibited.
Michael Biontino, a German representative at the conference, concurred. “We regard the retention of human control over the decision about life and death as indispensable,” he said.
Autonomous Systems Versus Drones
One key point of disagreement among the speakers was whether lethal autonomous systems are already in use.
Delegates from Cuba and the Palestinian territories suggested that U.S. and Israeli drones deployed in armed conflicts in Pakistan and Gaza qualify as autonomous or semiautonomous weapons.
“Drones with lethal autonomous munitions have been extensively used to target Palestinians,” Palestinian representative Adel Atieh said at the meeting. He alleged that observation towers around Gaza “are equipped with automatic gun machines with lethal autonomous capabilities.” Atieh said several Palestinian farmers had been killed by the machines.
The Israel Defense Forces did not respond to requests for comment.
Cuba is concerned about the use of semiautonomous military technologies such as unmanned aerial vehicles, said Anayansi Rodriguez Camejo, the country’s representative to the UN in Geneva.
Vatican representative Silvano Tomasi said further study of lethal autonomous weapons systems “does not dispense the CCW from discussing in an appropriate manner the complex question of use of armed drones.”
Other countries made a distinction between drones and lethal autonomous systems, saying the latter are not yet a reality. “We are not speaking here about existing weapons systems,” said a UK government statement.
Russian representative Vladimir Yermakov said there are no working models of lethal autonomous systems. Discussion of “international legal regulation of virtual technology that presently has no functioning models seems to be doubtful,” he said in a statement.
What Is to Be Studied?
One of the issues likely to be debated at the meeting next April, say independent experts, is whether the use of autonomous weapons should be governed by international humanitarian law or international human rights law.
Humanitarian law applies “in situations of armed conflict whereas human rights laws, or at least some of them, protect the individual at all times, in war and peace alike,” according to the website of the International Committee of the Red Cross.
At last May’s meeting on robotic warfare, the International Committee for Robot Arms Control, a group of academic experts seeking a pre-emptive ban on autonomous systems, said, “It is not enough to consider only armed conflict or international humanitarian laws when discussing autonomous weapons.” States should consider the human rights implications “in any situation,” the committee said.
At the November meeting, Venkatesh Varma of India urged the attending nations to go beyond legal questions and assess the impact of the weapons on international security “if there is dissemination of such weapon systems.”
The autonomous features of existing weapons systems need further study, said Laura Boillot, a project manager at Article 36, a London-based human rights group that seeks an international treaty to pre-emptively ban weapons that are fully autonomous.
“Understanding the controls that are considered acceptable in existing systems should help to work out what is acceptable in future systems,” Boillot wrote in a Nov. 18 e-mail to Arms Control Today.
“We have a number of experts’ individual views,” Jean-Hugues Simon-Michel of France, chair of the May experts meeting, said in a Nov. 21 e-mail to Arms Control Today. “We need now to move towards official national positions. It is a necessary step that we cannot avoid.”