"The Arms Control Association’s work is an important resource to legislators and policymakers when contemplating a new policy direction or decision."
U.S., Russia Impede Steps to Ban ‘Killer Robots’
October 2018
By Michael Klare
The latest effort toward imposing binding international restrictions on so-called killer robots was thwarted by the United States and Russia, pushing off the divisive issue to a November meeting of states-parties to the Convention on Certain Conventional Weapons (CCW).
For five years, officials representing member-states of the CCW, a 1980 treaty that seeks to outlaw the use of especially injurious or pernicious weapons, have been investigating whether to adopt a ban on lethal autonomous weapons systems. In late August, a group of governmental experts established by the CCW met to assess the issue, but it failed to reach consensus at a Geneva meeting and called instead for further discussions.
The impasse reflects the tensions over an advancing set of technologies, including artificial intelligence and robotics, that will make possible systems capable of identifying targets and attacking them without human intervention.
Opponents insist that such weapons can never be made intelligent enough to comply with the laws of war and international humanitarian law. Advocates say autonomous weapons, as they develop, can play a useful role in warfare without violating those laws.
Concern over the potential battlefield use of fully autonomous weapons systems has been growing rapidly in recent years as the pace of their development has accelerated and the legal and humanitarian consequences of using them in combat have become more apparent. Such systems typically combine advanced sensors and kill mechanisms with unmanned ships, planes, or ground vehicles.
Theoretically, fully autonomous weapons of this sort can be programmed to search within a predesignated area for certain types of threats—tanks, radars, ships, aircraft, and individual combatants—and engage them with onboard guns, bombs, and missiles on their own if communications are lost with their human operators. This prospect has raised the question whether these weapons, if used in a fully autonomous manner, will be able to distinguish between legitimate targets, such as armed combatants, and noncombatant civilians trapped in the zone of battle. Likewise, will they be able to distinguish between enemy combatants still posing a threat and those no longer capable of fighting because of injury or illness?
Humans possess the innate capacity to make such distinctions on a split-second basis, but many analysts doubt that machines can ever be programmed to make such fine distinctions and argue that such weapons should therefore be banned from use.
Under the terms of the CCW, the 120 signatory states, which include China, Russia, and the United States, can negotiate additional protocols prohibiting or restricting specific classes of weapons. So far, five such protocols have been adopted, including measures restricting landmines and incendiary weapons and banning blinding lasers.
Beginning in 2014, some member states sought to initiate negotiations leading to a similar protocol that would ban the development and use of fully autonomous lethal weapons. Others resisted moving directly toward negotiations but agreed to a high-level investigation of the issue. For that purpose, CCW member states established the experts group, composed largely of officials from those states, to assess the implications of fielding autonomous weapons and to determine whether negotiations on a protocol were justified.
In the discussions that followed, several distinctive positions emerged. About two dozen countries, including Argentina, Austria, Brazil, Chile, China, Egypt, and Mexico, advocated for a legally binding prohibition on use of such weapons. A number of civil society organizations, loosely allied through the Campaign to Stop Killer Robots, also urged such a measure.
Another group of states, led by France and Germany, opposes a legally binding measure but supports a political declaration affirming the necessity of maintaining human control over the use of deadly force.
Wherever they stand on the issue of a binding measure, nearly every country represented in the experts group at the August meeting expressed opposition to the deployment of fully autonomous weapons. Nevertheless, a small group of countries, including Israel, Russia, South Korea, and the United States, rejected both a legal prohibition and a political declaration, saying more research and discussion are necessary.
For the United States, the resistance to a declaration or binding measure on autonomous weapons can be read as instinctive hostility toward any international measure that might constrain U.S. freedom of maneuver, a stance visible in the Trump administration’s animosity toward other multilateral agreements, such as the Iran nuclear deal.
Further, U.S. opposition stems from another impulse: many senior U.S. officials believe that leadership in advanced technology, especially artificial intelligence, cyberoperations, hypersonics, and robotics, is essential for ensuring U.S. success in a geopolitical contest with China and Russia. “Long-term strategic competition, not terrorism, is now the primary focus of U.S. national security,” Defense Secretary Jim Mattis told the Senate Armed Services Committee on April 26.
“Our military remains capable, but our competitive edge has eroded in every domain of warfare,” he said. To reclaim that edge, the United States must restore its advantage in all areas of military competency, including through “research into advanced autonomous systems, artificial intelligence, and hypersonics.”
U.S. policy requires that a human operator be “in the loop” when making decisions before a weapons system, such as a missile-carrying drone, fires at a target.
Still, the determination to ensure U.S. dominance in artificial intelligence and robotics virtually guaranteed U.S. opposition to any outcome of the experts group that may hinder progress in developing military applications of machine autonomy. “We believe it is premature to enter into negotiations on a legally binding instrument, a political declaration, a code of conduct, or other similar instrument, and we cannot support a mandate to enter into such negotiations,” Joshua Dorosin, deputy legal adviser at the State Department, said at the experts group meeting Aug. 29.
Because decisions of the group are made by consensus, U.S. opposition, mirrored by Russia and a few other countries, prevented it from reaching any conclusion at its meeting other than a recommendation to keep talking.
Follow-up steps will be determined by CCW states-parties. They are due to meet in Geneva on Nov. 21–23, although it is unlikely they will reach consensus on anything beyond continuing discussions.
Member organizations of the Campaign to Stop Killer Robots are lobbying participating delegations to act more vigorously and to consider a variety of other pathways to banning the development of fully autonomous weapons systems, perhaps outside the CCW framework.