"[Arms Control Today] has become indispensable! I think it is the combination of the critical period we are in and the quality of the product. I found myself reading the May issue from cover to cover."
Autonomous Weapons Stir Geneva Debate
The first multinational conference dedicated exclusively to robotic warfare took place May 13-16 at the UN Office at Geneva as governments around the world confront the emerging technologies that policymakers call “lethal autonomous weapons systems” and headline writers have dubbed “killer robots.”
The four-day meeting featured diplomats, scholars, and activists debating the implications of new weapons that could automatically target and kill people without human control. Although few such weapons exist now, revolutionary developments in sensors and robotics have stoked fears in some quarters that these systems could make warfare less risky for the attacker and therefore more indiscriminate, while raising hopes in others that they might reduce civilian casualties.
“Delegations underlined the fact that this meeting had contributed to forming common understandings, but that questions still remained,” Jean-Hugues Simon-Michel of France, who chaired the meeting, said in his report. “Many delegations expressed the view that the process should be continued.”
Representatives from 87 countries and a dozen civil society groups attended the conclave amid high media interest. Multiple news outlets reported on the meeting of the parties to the Convention on Certain Conventional Weapons (CCW), which bans weapons such as blinding lasers.
The Geneva meeting left its 100-plus attendees with a mixture of urgency and uncertainty.
“All too often, international law only responds to atrocities and suffering” after they have happened, Michael Møller, acting director-general of the UN Office, said in welcoming the conference participants. “You have the opportunity to take pre-emptive action and ensure that the decision to end life remains firmly under human control.”
Five states—Cuba, Ecuador, Egypt, Pakistan, and the Vatican—submitted statements calling for a ban on lethal autonomous weapons systems. “Experiences have shown that it is best to ban a weapon system that is deemed excessively injurious or has an indiscriminate effect before it is being deployed [rather] than afterwards,” said Walid Abdelnasser, head of the Egyptian delegation, in Egypt’s statement.
That call was echoed by most of the civil society organizations in attendance, including the Nobel Women’s Initiative, which submitted an open letter signed by 20 Nobel Peace Prize winners endorsing a ban.
U.S. officials said talk of a ban or any other specific policy measure was “premature” and often based on an inaccurate conception of the weapons involved.
“Too often the phrase [‘lethal autonomous weapons systems’] appears still to evoke the idea of a humanoid machine independently selecting targets for engagement and operating in a dynamic and complex urban environment,” Stephen Townley, the State Department official who headed the eight-person U.S. delegation, said in his opening statement. “But that is a far cry from what we should be focusing on, which is the likely trajectory of technological development, not images from popular culture.”
Mary Cummings, director of the Humans and Autonomy Lab at Duke University, also called for “a debate based on facts, not fear.” In a May 21 interview, she emphasized the need for a “much better understanding of the technology.”
‘Meaningful Human Control’
“We are beginning to develop a shared understanding internationally regarding the issues surrounding this new class of weapons,” said Ron Arkin, a roboticist from the Georgia Institute of Technology, in a May 19 e-mail. Nevertheless, he said, “[t]here remains a long way to go even in terms of shared definitions and terminology regarding autonomy and ‘meaningful human control,’” a concept endorsed by many delegations as a prerequisite for the use of any lethal weapon.
In two sessions on international law, experts debated whether existing legal instruments could control new kinds of weapons.
“Many states recognized that existing international law, including international humanitarian law, already provides a robust framework for dealing with new weapon technologies, even if autonomous weapons—like many new technologies—pose some challenges,” Matthew Waxman, a former official in the George W. Bush administration, said in a May 19 e-mail. Waxman, a law professor at Columbia University, spoke at the Geneva meeting.
Christof Heyns, a South African jurist who serves as UN special rapporteur on extrajudicial, summary, or arbitrary executions, argued that new legal concepts will be necessary. Heyns’ April 2013 report to the United Nations on lethal autonomous weapons called for a moratorium on such weapons, after which civil society groups urged the parties to the CCW to convene the informal experts meeting in Geneva.
“I think there was a lot of interest at the meeting in the notion that it will be necessary to ensure that humans retain meaningful human control over each attack, and this concept—in addition to other concepts—has to be developed further,” Heyns said in a May 21 e-mail.
Stephen Goose, director of Human Rights Watch’s arms division, welcomed the emphasis on meaningful human control, saying in a May 20 interview that it represented a step toward expanding the CCW to cover lethal autonomous systems. But in a May 21 e-mail, a U.S. official countered, “It is premature to consider where the discussions may be leading.”
The debate about robotic weapons will continue in November when the parties to the CCW will decide at their annual meeting in Geneva whether to renew the mandate to study the issue in 2015.
“The interest shown in Geneva shows that killer robots need to go to the top of the arms control and disarmament agenda,” Goose said.