Geopolitics and the Regulation of Autonomous Weapons Systems

January/February 2025
By Alexander Kmentt

Advancements in artificial intelligence (AI) promise to benefit humanity, making tasks faster, easier, and more accessible. Yet, like all innovations, AI poses serious risks, especially in life-and-death decisions, such as those involving autonomous weapons systems. Many experts and developers urge caution, calling for reflection and regulatory measures. Integrating AI into weaponry will alter fundamentally how conflicts are fought and call into question the role of humans within them. Early examples in Ukraine and Gaza reveal this shift, but the future holds even greater change. Militaries want these systems for their speed, efficiency, and ability to minimize soldier casualties, fueling global investment. Their widespread adoption appears imminent, with costs expected to drop and their use likely to proliferate across conflicts worldwide.

Alexander Kmentt, director of the Disarmament, Arms Control and Non-Proliferation Department of the Austrian Ministry for Foreign Affairs, calls the present time a “critical juncture [that] demands urgent political leadership” to regulate lethal autonomous weapons. (Photo by Alex Halada/AFP via Getty Images)

This trajectory raises urgent ethical questions and presents challenges to compliance with international humanitarian law and human rights law. Experts warn of an arms race in autonomous weapons systems, especially amid rising geopolitical tensions, with significant risks of proliferation, unwanted escalation, and difficult-to-predict shifts in global power dynamics. Although AI offers extraordinary potential, the arrival of these systems demands immediate action to ensure legal norms, ethical boundaries, and global security.

There is no agreed definition of autonomous weapons systems, but in essence these are weapons systems that, once activated by a human, select and engage targets without further human intervention. These weapons systems make “decisions” over life and death based on preprogrammed algorithms. They might be built on AI elements that cannot be understood fully by their human operators, meaning that their actions cannot be predicted by those who should bear responsibility for them. This makes the arrival of autonomous weapons systems a watershed moment for our entire species.

This critical juncture demands urgent political leadership to create regulations and safeguards. Yet, the contrast between the rapid development of AI-powered weapons and the slow pace of regulatory discussions is stark. Mistrust among states and a potentially flawed confidence in technological solutions hinder progress. If the global community fails to act swiftly, the opportunity to establish legal guardrails will be lost before autonomous weapons systems become widely deployed, potentially leading to devastating consequences.

A Decade of Discussions

The issue of autonomous weapons systems first formally appeared on the agenda of the international community in a report by Christof Heyns, the UN special rapporteur on extrajudicial, summary or arbitrary executions, to the Human Rights Council in 2013.1 In 2016 the Convention on Certain Conventional Weapons (CCW) established a group of governmental experts to explore regulatory options. The group has met regularly since 2017, fostering greater understanding of the challenges posed by these systems, largely thanks to the contributions from civil society and academia.

In 2019, the CCW agreed on 11 guiding principles, including the full applicability of international humanitarian law to these systems and the need to retain human responsibility for decisions on the use of these systems and human accountability across their entire life cycle.2 Discussions within the expert group have focused largely on ensuring compliance with international humanitarian law, a body of law that was designed around humans and human decision-making. Autonomous weapons systems challenge the legal requirements to distinguish between civilians and combatants, assess proportionality, and take necessary precautions in attacks. Systems that are not understood sufficiently, whose behaviors cannot be predicted reliably, or that cannot be limited contextually pose challenges for ensuring the level of control required for compliance with international humanitarian law as well as for retaining human legal accountability in the decision-making processes of increasingly autonomous systems.

One proposed response to these problems is the so-called two-tier approach, which has garnered growing support albeit no consensus yet. This framework combines prohibitions on systems that cannot be used in accordance with international humanitarian law with strict regulations for other systems to ensure adherence to legal standards. Central to this debate is defining the nature and level of “meaningful human control” and “appropriate human judgement.” Defining adequate human involvement, however, is contentious. Disagreements persist over what constitutes adequate predictability, understanding, and control and, consequently, over where any lines of prohibition should be drawn. A small group of states would prefer not to draw any lines at all.

In 2023, the experts group’s mandate was extended until 2026, with a slightly expanded focus to develop by consensus “a set of elements of an instrument, without prejudging its nature.” The group chair, Robert In den Bosch of the Netherlands, compiled a “rolling text” summarizing possible regulatory elements and offering a foundation for potential agreement. The group is engaged in these discussions, and if the rolling text3 from November 2024 is the measure, progress appears to have been made on substance.

Yet, beneath the surface, the picture is less encouraging. The scope of the group discussions is limited, hampered by substantive disagreements and ineffective working methods imposed by CCW rules. Most importantly, the geopolitical backdrop has stymied decisive action. Rising tensions, mistrust, and diverging priorities among states have resulted in alarmingly slow progress.

Despite a decade of discussions, the global community has little to show in terms of concrete outcomes. The mismatch between the rapid development of autonomous weapons systems technologies and the sluggish pace of international regulation is troubling. Strong political leadership, guided by ethical principles and a commitment to international humanitarian law, is essential to meet this unprecedented challenge.

The Challenge of Comprehensive Regulation

The Martens clause of international humanitarian law calls for the public conscience to guide consideration of what is acceptable and unacceptable in warfare. The law needs to be responsive to wider societal needs. The regulation of autonomous weapons systems is demanded by factors and considerations that extend beyond the law and armed conflict. Although international humanitarian law remains central, broader concerns related to human rights, ethics, and global security also need attention.

These systems raise critical questions under international human rights law, particularly regarding the “right to life” and legal accountability. Their development and use likely will extend beyond warfare, where international humanitarian law rules apply, to encompass law enforcement and border control. New rules that might be sufficient from an international humanitarian law perspective are not aligned necessarily with human rights law requirements. The potential for systems to make lethal decisions without human oversight amplifies concerns about who bears responsibility for errors or unlawful killings. Ethically, the automation of force challenges core principles of human dignity.

One of the most pressing areas of ethical concern is the direct targeting of individuals. This risks depriving humans of their dignity, dehumanizing the use of force, and violating the right to nondiscrimination. Key protections, such as the need to distinguish combatants from civilians or to protect soldiers wounded in combat, may be lost if they are inconvenient to the drive for automation. Systems programmed to distinguish between demographic groups risk having societal biases embedded in their datasets, disproportionately affecting marginalized communities. Errors stemming from such biases could lead to severe violations of rights and exacerbate inequalities.

In addition to presenting moral hazards at the level of human targeting, the systems may pose risks to global peace and stability. Where they reduce the risks to a state’s own soldiers, they may reduce the political threshold for deploying or using force. They promise operational speed, but this often comes at the expense of meaningful human control. These factors can heighten the risk of conflict escalation through rapid, machine-driven interactions. The systems are likely to feature prominently in asymmetric conflicts, potentially becoming weapons of choice for terrorists and nonstate actors. Controlling their proliferation will require coordinated legal, political, and technical measures.

All this is to say that there are important aspects of the issue that go beyond the international humanitarian law dimension that has been the focus of discussions thus far. Even if the experts group rapidly fulfills its international humanitarian law-focused mandate, only one set of challenges from autonomous weapons systems is likely to be addressed. The wider ethical challenges of targeting people and the proper limits on systems in border security or policing, for example, look unlikely to be fully explored or addressed. Many states have voiced the need to expand discussions, but opposition from certain actors has prevented meaningful action.

The 2016 decision to address autonomous weapons systems within the CCW framework, emphasizing international humanitarian law, reflected the urgency of tackling these legal challenges swiftly. It also was promoted by a few states that know that the CCW working method can be exploited to block progress. After nearly a decade of discussions and rapid technological development, it is fair to question the wisdom of this decision. The experts group has yet to transition from discussions to negotiations, and its narrow mandate has curtailed the comprehensive approach required to address challenges posed by autonomous weapons systems. The current mandate presents an important test.

A key issue is consensus-based CCW decision-making, which allows individual states to block progress. Russia, for example, has employed procedural tactics to delay discussions and reject progressive proposals or expressions of urgency in the group reports. Other states actively developing systems also have shown limited interest in advancing negotiations of a legally binding agreement, leaving the experts group in a cycle of annual discussions without substantive outcomes.

This failure reflects a troubling procrastination. As technology evolves, the gap between regulation and reality widens further. Comprehensively regulating autonomous weapons systems requires a holistic approach that integrates international humanitarian law, human rights, ethics, and security considerations. Yet, within the CCW, the group is blocked from effectively addressing even the international humanitarian law considerations alone, and there is not yet a sufficient partnership for taking up broader discussions elsewhere.

The political landscape of the group risks perpetuating a Groundhog Day dynamic of stalled discussions. States must now prioritize building shared recognition of what a legal instrument could look like in the near term and expanding the scope of dialogue to promote a comprehensive, long-term response. It is a global priority to ensure that the international legal framework develops in step with technological advancements; the procedural barriers are, in the end, excuses for political choices.

Geopolitical Rule

The unsatisfactory state of affairs is frustrating for those who recognize the urgency of this issue. Yet, states developing such systems, along with their allies, vigorously defend the experts group process as the only suitable venue for discussions. Proposals to broaden the focus beyond international humanitarian law or involve other forums are met with skepticism. A common argument is that all militarily significant states must participate to ensure effective regulation. Although there is general agreement on the need for rules governing these systems, geopolitical tensions and eroded trust make many states reluctant to adopt regulations that their competitors might ignore.

A “Miloš” tracked combat robot, an unmanned ground vehicle that can be armed with machine guns, M11 grenade launchers and other armaments, being towed in 2019 by a Serbian Land Rover. Serbia, among many countries developing or acquiring lethal autonomous weapons, presented its first unmanned ground vehicle in 2009. (Photo by Srđan Popović via Wikipedia, available under public license: creativecommons.org/licenses/by-sa/4.0/)

This hesitancy is particularly evident among certain Western states, which share frustrations about the lack of progress but remain wary of initiatives that they fear could exclude major powers and lead to legal instruments that do not align with their perceived military interests. This caution, however understandable on the surface, is probably shortsighted.

First, the experts group’s proposed two-tier approach aims to prohibit autonomous weapons systems that do not align with the requirements of international humanitarian law and to regulate the use of others to enable compliance with that law, emphasizing understandability, predictability, and spatial-temporal limits on use in order to exert human control. The military advantage of systems beyond human judgment and control remains unclear at best. Many would likely argue that there is none; after all, military interests have been vocally represented throughout more than 10 years of CCW discussions of these rules. At a practical level, clear international legal rules also provide clarity and direction for operational military commanders and private sector stakeholders.

From a legal perspective, the corollary of refusing to accept rules unless all states participate is accepting, for themselves or their adversaries, systems that they believe would violate international humanitarian law. Can states committed to international humanitarian law and universal values agree to delegate life-and-death decisions to systems lacking capacity for compliance, control, and accountability? Arguably, these principles are vital and must be upheld, regardless of universal participation, to maintain international humanitarian law and shared values. At a time when international rules and norms are in critical condition, ceding the agenda to the most militarized states is a political failure.

Relying on universal consensus before taking regulatory steps is ultimately a disingenuous stance that invites inertia. Few international issues today enjoy universal agreement, and insisting on unanimity in regulating autonomous weapons systems effectively guarantees standstill. This approach allows states to appear committed to multilateralism while shifting blame for inaction to others.

Such procrastination ignores the narrowing window of opportunity to establish legal and normative safeguards before these systems become widespread. The longer regulatory efforts are delayed, the more difficult it will be to reverse course once these systems proliferate globally. Democratic states, in particular, must recognize that this delay is against their own long-term interests and the broader interests of humanity.

The Road Ahead

The immediate priority should be to bring the experts group’s work to a successful conclusion. The group must recommend the initiation of negotiations for a legally binding instrument based on the two-tier approach, encompassing prohibitions and regulations for autonomous weapons systems. Although it does not tackle concerns regarding antipersonnel systems effectively, the group chair’s rolling text of November 2024 shows that many states are close to agreeing on the key rules needed to preserve human control.

The Sea Hunter, a prototype submarine-hunting drone ship that can cross the open seas without a human crew for months at a time, is among the autonomous weapons systems being tested by the U.S. Navy. (Photo by U.S. Navy)

The group mandate extends to 2026, with the CCW review conference set as the deadline for a final report, but progress should be accelerated to conclude by the end of 2025. Substantive recommendations on prohibitions and regulations are on the table. Continuing discussions ad infinitum without committing to action would represent a failure of political responsibility, especially for states that claim to uphold international humanitarian law and universal values.

In parallel, efforts to generate political momentum for regulating these systems must intensify. Participation in the experts group remains limited. Many countries of the Global South are notably absent, despite the inevitable future global impact of these systems once they become cheap and accessible. Moreover, despite active civil society participation and input, the group’s work receives very limited public attention, and there is little critical focus on those blocking progress.

Encouragingly, there are some signs of change. UN Secretary-General António Guterres and the president of the International Committee of the Red Cross, Mirjana Spoljaric-Egger, have called urgently for international rules to regulate autonomous weapons systems to be negotiated by 2026.4 Regional conferences in Costa Rica,5 the Philippines, Trinidad and Tobago,6 and Sierra Leone7 have consolidated positions in support of regulation. Austria hosted a large international conference on these systems in April 2024.8 The chair’s summary9 of that conference has garnered endorsements from 40 states, signaling growing support for action.

In 2023, Austria, on behalf of 27 co-sponsoring states, introduced the first UN General Assembly resolution on lethal autonomous weapons systems,10 backed by 164 states. The resolution mandated the secretary-general to compile a comprehensive report on the issue, incorporating input from states, international organizations, civil society, and the private sector. Published in 2024 after a record number of states provided submissions, the report underscored the pressing humanitarian, legal, security, technological, and ethical challenges posed by autonomous weapons systems.11 It also highlighted widespread support for a legally binding instrument and called for the experts group to fulfill its mandate. A follow-up resolution in December 2024 again received strong support, with 166 states in favor.12 In 2025, the General Assembly will hold informal discussions to further address the issue.

The quest for regulating autonomous weapons systems continues to be hindered by geopolitical tensions, procrastination, and insufficient political momentum. Although there is growing recognition of the need for action, the necessary leadership to navigate these challenges remains elusive. The situation has been described as an “Oppenheimer moment.” There are indeed parallels to the post-1945 era when Manhattan Project scientists advocated for nuclear weapons regulation. Their warnings were overshadowed and ultimately ignored because of Cold War politics, resulting in a protracted nuclear arms race and an existential threat enduring to this day.

Despite loud warnings from experts, geopolitical tensions again are thwarting vital international action to regulate these systems, thus risking a new arms race and posing an existential challenge for humanity. Negotiation of international rules and limits on autonomous weapons systems is more urgent than ever. States cannot allow geopolitical rivalries to continue obstructing these efforts. They need to continue building a cross-regional partnership of states, international organizations, and civil society with the confidence to break out of the procedural deadlock because a greater good demands it.

ENDNOTES

1. See UN General Assembly, “Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Christof Heyns,” A/HRC/23/47, April 9, 2013.

2. Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, “Report of the 2019 Session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems,” CCW/GGE.1/2019/3, September 25, 2019, p. 13.

3. See “GGE on LAWS, Rolling Text, Status Date: 8 November 2024,” n.d.

4. International Committee of the Red Cross, “Joint Call by the United Nations Secretary-General and the President of the International Committee of the Red Cross for States to Establish New Prohibitions and Restrictions on Autonomous Weapon Systems,” News release, October 5, 2023.

5. “Communiqué of the Latin American and Caribbean Conference of Social and Humanitarian Impact of Autonomous Weapons,” Ministerio de Relaciones Exteriores y Culto, Costa Rica, n.d.

6. “CARICOM Declaration on Autonomous Weapons Systems,” Government of the Republic of Trinidad and Tobago, n.d.

7. “Submission of the Government of Sierra Leone Towards the UN Secretary-General’s Call Outlined in Resolution 78/241 on ‘Lethal Autonomous Weapons Systems,’ Adopted by the General Assembly in December 2023, Drawing From a Conference of Member States of Economic Community of West African States on the Peace and Security Aspect of Autonomous Weapons Systems, 17-18 April 2024,” May 23, 2024.

8. Austrian Ministry for Foreign Affairs, “2024 Vienna Conference on Autonomous Weapons Systems,” n.d., (accessed December 18, 2024).

9. See “Humanity at the Crossroads: Autonomous Weapons Systems and the Challenge of Regulation; Chair’s Summary,” April 30, 2024.

10. UN General Assembly, “Lethal Autonomous Weapons Systems,” A/RES/78/241, December 28, 2023.

11. UN General Assembly, “Lethal Autonomous Weapons Systems: Report of the Secretary-General,” A/79/88, July 1, 2024.

12. UN General Assembly, “Austria, Belgium, Brazil, Costa Rica, Guatemala, Ireland, Kiribati, Liechtenstein, Malta, Mexico, New Zealand, Philippines, Sierra Leone, Sri Lanka, Switzerland and Trinidad and Tobago: Draft Resolution; Lethal Autonomous Weapons Systems,” A/C.1/79/L.77, October 18, 2024.


Alexander Kmentt is the director of the Disarmament, Arms Control and Nonproliferation Department at the Austrian Ministry for Foreign Affairs. The views in this article reflect those of the author and do not necessarily represent the position of the Austrian government.