
Israel Lets AI Decide Who Dies in Gaza

by Will Porter

The new system has generated sweeping kill lists marking tens of thousands of Palestinians for death, part of the IDF's growing reliance on AI to plan lethal strikes.

Citing six Israeli intelligence officers, the Tel Aviv-based magazine said the previously undisclosed AI system, dubbed 'Lavender,' has played a "central role in the unprecedented bombing" of Gaza since last October, with the military effectively treating its output "as if it were a human decision."

"Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets," the outlet reported, adding that "during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants—and their homes—for possible air strikes."

However, while thousands have been killed in the resulting air raids, the majority were "women and children or people who were not involved in the fighting," the officers told the magazine, which noted that Israeli field commanders often act on the AI system's output without consulting more substantial intelligence.

"Human personnel often served only as a 'rubber stamp' for the machine's decisions," one source said, adding that many commanders spend a mere "20 seconds" reviewing targets before approving strikes—"just to make sure the Lavender-marked target is male."

Human input has been relegated to such a minor role in the decision-making process that Lavender's conclusions are often treated as "an order" by Israeli troops, "with no requirement to independently check why the machine made that choice."