ISRAEL’S AI SYSTEM ‘LAVENDER’ DECIDES WHO LIVES AND DIES IN GAZA


File – IDF Chief of Staff Lt. Gen. Herzi Halevi meets with female surveillance soldiers at an army base on the Gaza border, March 12, 2024. (Israel Defense Forces)

Thu 04 April 2024:

The Israeli military has designated tens of thousands of people in Gaza as assassination targets, relying on an AI targeting system — Lavender — with minimal human oversight and a permissive approach to civilian casualties, according to two Israel-based news sites.

Lavender has been instrumental in the unprecedented bombing of Palestinians in Gaza, particularly in the initial phases of the war, news sites +972 Magazine and Local Call reported, adding that the system’s influence on military operations was so significant that its output was essentially treated “as if it were a human decision.”

The AI system conducted extensive surveillance in Gaza, resulting in a list of 37,000 bombing targets, including numerous low-level alleged Hamas operatives who wouldn’t typically be targeted in bombing operations, according to the report.

The revelations stem from interviews with six Israeli intelligence officers who served during Tel Aviv’s ongoing war on Gaza and were involved in using AI to identify and destroy targets.

One intelligence officer remarked on the stark contrast between human emotion and the cold efficiency of the machine: “Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”

Another officer questioned the significance of human involvement in the target selection process, noting that their role often amounted to little more than a rubber stamp.

“I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added value as a human, apart from being a stamp of approval. It saved a lot of time,” the officer admitted.

A relative mourns a baby killed in Israeli attacks and brought to the morgue of Nasser Hospital, as Israeli air strikes continue for a 19th day in Khan Yunis, Gaza, October 25, 2023. (AA)

Dumb bombs and floor bombs

Developed by the Israeli military’s elite intelligence division, Unit 8200, Lavender transformed target identification, rapidly sifting through data to identify potential threats. At one stage, the database listed tens of thousands of Palestinian men allegedly linked to Palestinian resistance groups, streamlining the targeting process.

The accounts revealed how the Israeli military, for certain categories of targets, established pre-authorised allowances for civilian casualties, permitting air strikes to proceed even if they resulted in significant collateral damage.

This leniency, particularly evident in the early weeks of the conflict, allowed for the targeting of low-ranking fighters using unguided munitions, resulting in the destruction of entire homes and the loss of civilian lives.

The staggering death toll of some 33,000 Palestinians during the six-month conflict, with hundreds of families suffering multiple losses, underscores the grim reality of the war’s impact on civilian populations.

The testimonies paint a picture of a military under immense pressure to produce results, with commanders demanding a continuous stream of targets to intensify the war against besieged Palestinians.

The accounts also reveal fluctuations in the threshold for acceptable civilian casualties, with the Israeli military reportedly authorising strikes that could result in the deaths of over 100 civilians in targeting top-ranking Hamas officials.

This permissive approach to collateral damage, especially troubling in the case of strikes on lower-ranking militants, raises significant legal and ethical concerns.

Three intelligence sources told +972 and Local Call that junior Hamas operatives marked by Lavender were assassinated only with unguided “dumb bombs,” in the interest of saving more expensive armaments.

The implication, one source explained, was that the army would not strike a junior target if he lived in a high-rise building, because the army did not want to spend a more precise and expensive “floor bomb” [with more limited collateral effect] to kill him.

But if a junior target lived in a building with only a few floors, the army was authorised to kill him and everyone in the building with a dumb bomb, according to the investigation.


War crimes of ‘disproportionate attacks’ with AI system

“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” one of the anonymous officers told the publications.

“On the contrary, the IDF [Israeli military] bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

“An independent examination by an [intelligence] analyst is required, which verifies that the identified targets are legitimate targets for attack, in accordance with the conditions set forth in IDF directives and international law,” the Israeli army told the outlets in response to the investigation.

UN rapporteur Ben Saul warned of potential war crimes over Israeli military decisions.

The UN special rapporteur for human rights and counterterrorism said that, if reports were true that the Israeli army had decided it was permissible to kill 15 or 20 Palestinians for every low-level Hamas operative, such strikes could constitute “war crimes.”

“If true, many Israeli strikes in Gaza would constitute the war crimes of launching disproportionate attacks,” said Saul in a post on social media.
