ISSN 2330-717X

First Use Of ‘Killer Robot’ In US History Raises Ethical And Legal Questions – Analysis

By Scott N. Romaniuk and Emeka T. Njoku

The use of remotely operated vehicles (ROVs) for purposes related to national security and public order is not new. Armies and navies have used ROVs for these purposes, including the handling and removal of dangerous objects in hazardous situations, for decades. Lately, they have been sensationalised as “killer robots.” Police forces in Dallas, TX, used an armed robot after the deadliest attack on law enforcement since al-Qaeda’s attacks of 9/11. Following a targeted attack that left five Dallas officers dead during a protest against police shootings, the response by the Dallas police has itself come under attack.

The use of an armed robot to kill the suspected shooter in Dallas is arguably a serious and pioneering move, one that could lead to an increase in the use of armed robots as well as to further “police militarization.” These changes raise concerns about human rights and the constitution in much the same way the Obama administration’s killing of Osama bin Laden in May 2011 did. It is the first time in US history that law enforcement has employed a robot in this manner.

Public concerns stem from the negative perception cultivated by media coverage and images of robots used in war zones. It is troubling to many that practices from areas of armed conflict have crossed into the realm of domestic security. Concerns about this crossover, however, have been misinformed and relate only mildly to the employment of “drones” in battles overseas.

The small remotely operated machine used to kill Micah Xavier Johnson in Dallas carried an explosive device. The mode of delivery was much different from what the US Army and the CIA have done in countries like Pakistan, Afghanistan, and Iraq. Although this represents the first time police in the US have used a robot in this way, it is neither the first time a bomb has been used to eliminate someone who posed a threat to public safety, nor the first time police have used “drones” to track criminals and help law enforcement officers complete their work.

Although police used a bomb against black liberationists in Philadelphia nearly 30 years ago, the use of the armed robot in Dallas cannot be labeled a turning point. Peter Asaro at The New School argued that the event could be a forerunner to future actions, leading to standard practice and eventually new police behavior. Along such a course, the risk is that future events will be characterized by deep grey zones of human rights neglect and constitutional ambiguity. “I hope they don’t start designing a whole series of police-armed robots,” opined Asaro, who stated further that, “[o]nce it becomes standard practice, it’ll be used in other instances that aren’t as cut and dry as this one.”

9/11 did not mark a turning point for the use of robotic weapons and sophisticated technology by the US military alone. Many different types of ROVs have been employed by law enforcement for decades. Police forces have acquired newer and more sophisticated machines from the government after their use in Operation Enduring Freedom (OEF) in Afghanistan from 2002 and Operation Iraqi Freedom (OIF) in Iraq from 2003.

Companies like Endeavor Robotics, Remotec, and RoboteX have been supplying police forces with thousands of machines designed to enhance the performance of law enforcement personnel. The use of sophisticated technology by the police is part of the larger phenomenon of technology diffusion, popularized by Everett Rogers over five decades ago. As new technologies continue to emerge and show promise in serving the needs of militaries and police forces around the world, turning to their application in the interest of protecting the lives of military and law enforcement officers is only natural.

ROVs successfully defused bombs planted around a police station in Dallas in 2014 and were also used to search the suspect’s vehicle for further devices. The lives of police officers would probably have been lost had the robots not been used. In the 2014 incident one bomb did explode, but fortunately the police had been attempting to defuse it remotely via an ROV.

The basic idea of technological innovation and application is to make life easier, safer, and better overall. The 2016 use of innovative technology served that purpose while fitting into a steady and so far positive trend in application. A sniper’s bullet presents a scenario that is not even modestly different. The use of a projectile like a bullet – which, once it leaves the chamber, essentially becomes an “unmanned” weapon – is no different. In both cases a decision to use lethal force in the interest of public safety and security was made. The outcome of that decision does not change from one scenario to another. Technology itself remains a mere condition of the action that takes place. The result would have been the same had the police thrown a grenade at the shooter or hit him with a hail of bullets.

Still, it is important to keep in mind that the robot did not do the killing. In any of the scenarios described, the police made the conscious decision to carry out the targeted killing and to use deadly force. The robot does provide more oversight: it can carry a camera, giving the police and legal officials who might want to review the footage during and after the event sufficient coverage of the event as it takes place. Jay Stanley of the American Civil Liberties Union (2015) stated clearly that, “there is a broad consensus that armed domestic drones are beyond the pale.”

Cyber-security researcher Jonathan Zdziarski has offered a further detraction from the use of such robots, claiming that a wireless connection can be compromised too easily, resulting in “a robot loose with a bomb”. He asks, “What controls are in place to make sure that nobody can hijack that connection?” The ethical issues appear to rest in hypothetical scenarios of the kind we already face with any type of technological application, including the handguns that police officers carry, the issue of safety mechanisms, and decision-making skills such as those called into question regarding the killings that sparked the protests in the first place. Concerns about connection security conjure up memories of a US military drone feed being hacked by Iranian-backed Shiite militants. Militants also gained control of US-operated RCAVs in Afghanistan through unprotected communications links between operators and their machines.

However, the essence of technological innovation is to make life better, not more brutish. Specifically, caution in deploying this technology in non-conflict areas needs to be re-emphasized. Questions such as what conditions warrant the deployment of a “killer robot,” who may use it, and what mechanisms exist to check for abuse need to be highlighted; left unanswered, they could only worsen the already battered relations between the police and the people. This view has been aptly expressed by the President of RoboteX, Eric Ivers, who conveyed his shock that the Dallas police had used a robot in such a way, stating, “[t]hat is the absolute opposite of what these robots are used for. They’re used to save lives on both sides.” Yet the robot achieved precisely what Ivers says such robots are for: saving lives. Moreover, Professor Emerita Marjorie Cohn at the Thomas Jefferson School of Law stated that law enforcement is moving in “the wrong direction.”

Cohn remarked further that, “the fact that the police have a weapon like this, and other weapons like drones and tanks, is an example of the militarization of the police and law enforcement—and goes in the wrong direction […] We should see the police using humane techniques, interacting on a more humane level with the community, and although certainly the police officers did not deserve to die, this is an indication of something much deeper in the society, and that’s the racism that permeates the police departments across the country. It is a real tragedy.” The police therefore need to purge themselves of the frequent abuses of power seen on the streets, particularly in their interactions with the black community. If they hope to deploy these technologies to guarantee security within the US homeland, the onus is on them to prove that the technologies can be deployed objectively, and this objectivity must be shown in their relations with the society they swore to protect. This could go a long way toward defusing the rising moral debate over the use of this new technology in law enforcement.

The types of weapons Cohn and others speak out against have been around for decades, and the process that leads to the killing of a target – a process known as the “loop,” one that never entirely removes the human element – is not at all novel. This first and so-called watershed moment in law enforcement activity, one in which lives were most likely saved from Johnson and his armed aggression, remains cast against countless instances in which actions involving a person have led to immoral and even illegal deaths.

If, as detractors of armed ROVs claim, it is in the best interest of public safety and security to maintain a human element in police action at all times, then we are somehow ascribing a moral acuity to the weapons themselves while simultaneously arguing the exact opposite. There will always be human involvement in autonomous weapon systems, and ROVs are no exception; and if we look to war zones for the constraints on the use of those weapons that many people are either critical or afraid of, we do not have to look as far as we might think. All violence remains rational.

This article was published by Geopolitical Monitor.com



Scott N. Romaniuk

Dr. Scott N. Romaniuk completed his PhD at the School of International Studies, University of Trento. He holds an MRes in Political Research, an MA in Terrorism, Crime and Global Security, and an MA in Military Studies (Joint Warfare). His teaching and research specializations include International Relations, Military and Strategic Studies, Security Studies, Terrorism and Political Violence, and Research Methods. He is a Senior Research Affiliate with the Canadian Network for Research on Terrorism Security and Society (TSAS) and a member of the Conflict, Terrorism and Development (CTD) Collaboratory at Michigan State University.
