US: New Policy On Autonomous Weapons Flawed, Says HRW

A new United States Department of Defense directive concerning the development of autonomous weapons systems is an inadequate response to the threats posed by removing human control from the use of force, Human Rights Watch said in a report. Instead of creating adequate controls on these weapons, the directive could facilitate their development.

“The US pursuit of autonomous weapons systems without binding legal rules to explicitly address the dangers is a recipe for disaster,” said Mary Wareham, arms advocacy director at Human Rights Watch. “National policy and legislation are urgently needed to address the risks and challenges raised by removing human control from the use of force.”

The United States, as well as Australia, China, India, Iran, Israel, South Korea, Russia, Turkey, and the United Kingdom, is investing heavily in the military applications of artificial intelligence (AI) and related technologies to develop air, land, and sea-based autonomous weapons systems. At diplomatic talks held since 2014, these countries have consistently resisted growing calls to negotiate a new legally binding instrument on autonomous weapons systems.

Human Rights Watch and the Harvard Law School International Human Rights Clinic reviewed the content of Directive 3000.09 on Autonomy in Weapon Systems, issued on January 25, 2023. The directive, which defines an autonomous weapon as “a weapon system that, once activated, can select and engage targets without further intervention by an operator [person],” closely follows a previous Defense Department directive issued in November 2012.

As with the original directive, the 2023 directive applies only to the Department of Defense. There is still no US government-wide policy on autonomous weapons systems, nor any policy governing their use in law enforcement, border control, or other circumstances outside of armed conflict. The 2023 directive also does not apply to the CIA, which has played an active role in the use of armed drones abroad.

The 2023 directive contains some of the same significant loopholes and provisions as the original directive. Like the earlier directive, it requires that “autonomous and semi-autonomous weapon systems will be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.” That language recognizes the value of human judgment, which is essential for ensuring compliance with international humanitarian law. However, neither the previous nor the current directive is clear on what constitutes an “appropriate level” of human judgment.

The new directive retains similar requirements, such as testing and review, that must be met before autonomous weapons systems that deliver lethal force can be developed or used. That review requirement can be waived by high-level officials “in cases of urgent military need.” There is little transparency, but it appears that the review has never been conducted.

Like its predecessor, the 2023 directive does nothing to curb the proliferation of autonomous weapons systems as it allows for “international sales and transfers” as long as they are “approved in accordance with existing technology security and foreign disclosure requirements and processes.”

One problematic revision is the deletion of several references to concern for “loss of control.” The US has repeatedly objected to the use of the word “control” in multilateral meetings. Human “control” is an appropriate word to use because it encompasses both the mental judgment and physical act needed to prevent autonomous weapons systems from posing moral, ethical, legal, and other threats, Human Rights Watch and the Harvard clinic said.

The 2023 policy is also out of step with widely supported international proposals for a new treaty to prohibit and regulate autonomous weapons systems.

The US has proposed that countries agree to voluntary commitments such as a “code of conduct” to guide development and use of autonomous weapons systems. It recommends adoption of a political declaration aimed at ensuring responsible use of weapons systems that incorporate AI capabilities.

By contrast, more than 70 countries as well as nongovernmental organizations and the International Committee of the Red Cross regard a new treaty with prohibitions and restrictions as necessary, urgent, and achievable. United Nations Secretary-General António Guterres has called for “internationally agreed limits” on weapons systems that could, by themselves, target and attack human beings, describing such weapons as “morally repugnant and politically unacceptable.”

Most treaty proponents call for prohibitions on autonomous systems that by their nature operate without meaningful human control or that target people, and for regulations ensuring that all other autonomous weapons systems cannot be used without meaningful human control. The US directive does not meet this bar.

Human Rights Watch is a cofounder of Stop Killer Robots, the coalition of more than 190 nongovernmental organizations in 67 countries that is working for new international law on autonomy in weapons systems.

“Now is not the time for the US to tinker with incremental measures that pave the way for a future of automated killing,” Wareham said. “To protect humanity, the US should support the negotiation of new international law to prohibit and restrict autonomous weapons systems.”
