"[Arms Control Today] has become indispensable! I think it is the combination of the critical period we are in and the quality of the product. I found myself reading the May issue from cover to cover."
Dueling Views on AI, Autonomous Weapons
April 2023
By Michael Klare
The international debate over controlling artificial intelligence (AI) and autonomous weapons systems, often called “killer robots” by critics, is heating up, with the contending approaches generally falling into two camps.
One approach, favored by the United States, the United Kingdom, and many of their allies, calls for the adoption of voluntary codes of conduct to govern the military use of AI and autonomous systems in warfare. The other approach, advocated by Austria, most Latin American countries, and members of the Non-Aligned Movement, supports the adoption of a legally binding international ban on autonomous weapons or a set of restrictions on their deployment and use.
These contending views were brought into sharp focus at three recent international meetings that considered proposals for regulating the military use of AI and autonomous weapons systems.
The first meeting, held in Belén, Costa Rica, on Feb. 23–24, was attended by government officials from nearly every country in Latin America and the Caribbean, as well as officials from the United States and 12 other countries. Civil society organizations, including the Campaign to Stop Killer Robots, also were strongly represented.
After hearing from government officials and civil society representatives about the risks posed by the deployment of autonomous weapons systems, the Latin American and Caribbean officials adopted a statement, entitled the Belén Communiqué, calling for further international efforts “to promote the urgent negotiation of an international legally-binding instrument with prohibitions and regulations with regard to autonomy in weapons systems.”
The second gathering, the Responsible AI in the Military Domain summit, took place in The Hague on Feb. 15–16, a week before the Costa Rica conference. Co-hosted by the Netherlands and South Korea, the event favored an alternative approach. Rather than advocate for a ban on the military use of AI and autonomous weapons systems, summit participants, including the United States, called for voluntary measures allowing for their use in a safe, responsible manner.
This approach was outlined in a keynote address by Bonnie Jenkins, U.S. undersecretary of state for arms control and international security. She argued that military use of AI could produce positive outcomes, aiding combat operations and enhancing compliance with international humanitarian law. But she acknowledged that because its use also poses a risk of malfunction and unintended consequences, it must be subject to rigorous controls and oversight.
“AI capabilities will increase accuracy and precision in the use of force, which will also help strengthen implementation of international humanitarian law’s protections for civilians,” Jenkins said. “But we must do this safely and responsibly.”
Underscoring this view, Jenkins released a U.S.-crafted “political declaration” on the responsible military use of AI and autonomous weapons systems. Drawing on guidelines issued by the U.S. Department of Defense in revised directive 3000.09, “Autonomy in Weapons Systems” (see ACT, March 2023), the declaration calls on states to employ AI and autonomous systems in a safe and principled manner.
“[The] military use of AI can and should be ethical, responsible, and enhance international security,” the declaration affirms. To advance this outcome, states are urged to adopt best practices in the design, development, testing, and fielding of AI-enabled systems, including measures to ensure compliance with international humanitarian law and to “minimize unintended bias in military AI capabilities.” The declaration makes clear, however, that such measures are entirely voluntary and subject only to domestic laws, where they exist.
The competing approaches were given an extensive airing at a meeting of the group of governmental experts (GGE) convened under the auspices of the Convention on Certain Conventional Weapons (CCW) in Geneva on March 6–10.
For several years, the GGE has been considering proposals for an additional protocol to the CCW that would prohibit or strictly regulate the deployment of autonomous weapons systems. As the GGE operates by consensus and some states-parties to the CCW, including Russia and the United States, oppose such a measure, the group has been unable to forward a draft protocol to the full CCW membership. Nevertheless, GGE meetings have provided an important forum for proponents of contending approaches to articulate and defend their positions, and the March meeting was no exception.
The United States, joined by Australia, Canada, Japan, and the UK, submitted a draft proposal that draws heavily on the political declaration released by Jenkins. It asserts that the use of autonomous weapons systems should be deemed lawful as long as the states using them have taken effective measures to ensure that their use will not result in violations of international humanitarian law. If employed in such a manner, the joint proposal states, “these technologies could be used to improve the protection of civilians.”
An entirely different approach was put forward in papers submitted by Austria, Pakistan, member states of the Non-Aligned Movement, and representatives of civil society. These participants disputed the notion that autonomous weapons systems can be employed in accordance with humanitarian law and prove useful in protecting civilians. They said such systems pose an inescapable risk of causing battlefield calamities and harming civilians unnecessarily.
For states that adhere to this view, nothing is acceptable short of a complete ban on autonomous weapons systems or a set of binding regulations that would severely circumscribe their use. As noted in the Austrian paper, autonomous weapons systems that are not under effective human control at all times and that “select and engage persons as targets in a manner that violates the dignity and worth of the human person” must be considered unacceptable and must be prohibited.
Given the wide gap between these two contending approaches, it is unlikely that a common strategy will be devised at the next GGE meeting, scheduled for Geneva in May. As a result, no draft proposal for a legally binding ban or set of regulatory controls on autonomous weapons systems is likely to be submitted to the CCW states-parties when they next meet.
A stalemate on the issue gives autonomous weapons developers time to hone new technologies and commercialize them. It also could lead to a dual approach to controlling such devices, with some states adopting voluntary rules and others pursuing a legally binding measure outside the CCW process. One possible venue for the latter option is the UN General Assembly, where passage would require only a majority vote rather than a consensus decision.