Israel Has Allowed AI To Become Judge, Jury And Executioner – OpEd

By Chris Doyle

Hollywood has long fantasized about a world where artificial intelligence rules, where wars are fought by computers and robots, and where machines dominate man. Yet, as the world wakes up to the potential positives and negatives of AI in our lives, the Israeli onslaught on Palestinians in Gaza is a chilling harbinger of the future of war.

Much of this is not new, but it is more advanced. The use of drones has already changed the conduct of warfare, as we have seen in Ukraine. Israel has been using drones since the 1982 Lebanon War and developed its first attack drone in 1989. It remains at the cutting edge of drone technology.

Israeli forces use every type, from small drones that can search tunnels and buildings or place explosives to larger models that can drop massive ordnance. Xtender, for example, is a small drone designed for indoor and underground operations, making it well suited to the buildings and tunnels of places such as Gaza City. Nonstate actors, including Hamas, also use drones, as Gaza has shown.

But new evidence of the use of advanced Israeli tech is far more chilling. The brilliant Israeli magazine +972 has unearthed, in a series of probing investigations, two major AI targeting programs, named “The Gospel” and “Lavender.”

Back in November, +972 unearthed The Gospel program, also known by its Hebrew name “Habsora.” This system has allowed Israeli forces to expand their lists of potential targets, that is, the buildings that can be designated as legitimate for bombing. These include “public buildings, infrastructure, and high-rise blocks, which sources say the army defines as ‘power targets.’” Commanders know in advance from intelligence how many civilians may be killed in a bombing. Yet, as one Israeli source stated: “The numbers increased from dozens of civilian deaths (permitted) as collateral damage as part of an attack on a senior official in previous operations, to hundreds of civilian deaths as collateral damage.”

Another source was equally chilling: “These are not random rockets. Everything is intentional. We know exactly how much collateral damage there is in every home.” This tallies with the evidence gathered on the ground so far, including human rights reports.

The Lavender system, by contrast, identifies individuals rather than buildings as targets. +972 last week reported that it “is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad, including low-ranking ones, as potential bombing targets.” Essentially, it is a system to tag Palestinians. Lavender reportedly has information on 90 percent of the Palestinian population in Gaza (one wonders if the same is true for the West Bank). Each person is given a rating from one to 100 based on the likelihood of their being a militant, and the evidence indicates that children were also marked.

The magazine’s investigation determined that, within the first few weeks after Oct. 7, Lavender had identified 37,000 Palestinians as suspected “militants” and potential targets. This equates to Israel’s publicly stated estimate of the number of Hamas and Palestinian Islamic Jihad operatives in Gaza. One Hamas commander was rated so highly that up to 300 Palestinian civilian deaths were assessed to be acceptable collateral damage.

What makes it worse is that Lavender tends to locate individuals in their homes. Many of the subsequent attacks have taken place at night, so the bombing would also kill members of the target’s family. “We were not interested in killing (Hamas) operatives only when they were in a military building or engaged in a military activity,” said one Israeli intelligence officer.

Local commanders were even encouraged to use Lavender’s “kill lists.” One Israeli military source noted that the humans were just there to rubber-stamp the decisions, a process that took about “20 seconds.”

Advocates can argue that machines and AI may be more effective than humans. Yet the known margin of error is about 10 percent; applied to 37,000 names, that implies thousands of people wrongly tagged. It is hardly foolproof.

The reality is that Israel has killed at least 33,000 Palestinians in six months, including 14,000 children, because its open-fire regulations are so loose, a laxity that these dangerous AI systems frequently facilitate. Israel has not even come close to adhering to the international law principle of proportionality.

The international reaction has been minimal, at least in public. One of the few to speak out has been UN Secretary-General Antonio Guterres. On Israel’s use of AI, he said on Friday: “No part of life and death decisions which impact entire families should be delegated to the cold calculation of algorithms.”

Even in the media, few mainstream outlets have run with the story. One has to ask why.

Will there be any chance of halting this march, of returning to a world where such life-and-death decisions are made by humans rather than machines? Will it not be so much easier for a machine to do the killing, devoid of any emotional or moral inhibitions that, hopefully, humans still have? Will it not be so much easier in the future to blame a computer rather than a person? You cannot take a computer to court. A computer does not pay compensation or have to apologize. There is no transparency in this process. Will any Palestinian parent know whether their child was bombed because of The Gospel or Lavender, or because an actual human took the decision?

This matters far beyond the carnage of the Eastern Mediterranean. It shows how massive amounts of data on a population can be abused for the deadliest of purposes. Israel has such information due to its 57 years of intense military occupation. But where else will these systems be adopted? What will be the international legal repercussions?

Moreover, where is the high-level debate? Should such systems be banned? If not, what restrictions should be imposed? Are the companies that make this practice possible complicit in the killings?

The world needs to wake up. Machines have become judge, jury and executioner. The human-machine link in war is shifting toward being dominated by the machine. Computers cannot determine whether a person is a terrorist. The weakening or even bypassing of any moral agency is terrifying. This is what is being done to Palestinians in Gaza right now. Unless action is taken, this will be the future of war. Hollywood will need some new scripts.

  • Chris Doyle is director of the Council for Arab-British Understanding in London. X: @Doylech

Arab News

