Lavender, "The Gospel", and the Scale of AI War
By Alex
Earlier this month, Israeli journalist Yuval Abraham and +972mag published a harrowing report about a system called Lavender, a technology the Israel Defense Forces are using to assassinate Hamas and Palestinian Islamic Jihad targets in the Gaza Strip. The whole piece is worth a read, but it is stomach-churning, with each sentence, as one Twitter user described it, harder to read than the last. The weight of the piece also signals how the technology is being used as both an excuse for wanton death and a means to scale violence.
Abraham and the publication reported in November 2023 on another system used by the IDF called Habsora, or "The Gospel", the purpose of which was to expand the kinds of targets the Israeli military was allowed to bomb in its offensive in Gaza. In the vengeful fit post-October 7, the IDF expanded its bombing to so-called "power targets"--which included universities, banks, and government offices--to exert "civil pressure" on Hamas (never mind that it is appalling these were ever on the table, and that there are no more universities in Gaza after the current offensive), and to "operative homes", strikes aimed at the families of suspected Hamas members, another appalling admission of the targeting of civilians.
The reporting on Lavender indicates that the system has expanded Israel's range of targets to include "junior operatives"; Abraham's source, a senior officer identified only as "B.", says this made the program's "target generation [go] crazy".
Many others on social media have noted that the takeaway from the article isn't necessarily that the Lavender tool is being used at all, nor its inherent inaccuracy (that 10 percent of the targets were not Hamas members, or that the only human verification of the system's output was a check of whether the target was male). The details of the system are sparse: whether it is truly "AI", machine learning, or another system of mass pattern matching is beside the point.
The larger conceit is the blatant plausible deniability: the ability of military forces to say that they are justified in eliminating whole families from the Earth, and that they can point to such a tool to ground their decimation. Shifting a hyperparameter or giving the error bars a little more slack does not absolve the artifice of genocide and the dehumanization which makes this all possible. B. says as much outright: "Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier."
While the details outlined by +972mag are gut-wrenching, one part worth highlighting is the description of the ideological foundations of the project, contained within a book written by Yossi Sariel, the current commander of Unit 8200, the Israeli equivalent of the U.S. National Security Agency.
“The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World” was released in English under the pen name “Brigadier General Y.S.” In it, the author — a man who we confirmed to be the current commander of the elite Israeli intelligence unit 8200 — makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential “targets” for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a “human bottleneck for both locating the new targets and decision-making to approve the targets.” [emphasis added]
Journalist Abraham continues later with a description of the book:
Describing human personnel as a “bottleneck” that limits the army’s capacity during a military operation, the commander laments: “We [humans] cannot process so much information. It doesn’t matter how many people you have tasked to produce targets during the war — you still cannot produce enough targets per day.”
As if the deaths of over 40,000 Palestinians had not been enough fodder for a never-ending list of targets. B., again, echoes the themes of the book: "We were constantly being pressured: ‘Bring us more targets.’ They really shouted at us. We finished [killing] our targets very quickly."
The technology is not the issue. If anything, it is synecdoche, an entrée into the longstanding violence of colonialism, imperialism, and militarized borders. The technology could be logistic regression, object recognition, or a simple network map. But analytically, it also points to the different junctures of institutions (and possible points of resistance) imbricated in the process of war-making; as TIME recently highlighted and as No Tech for Apartheid organizers already knew, Google and Amazon are part of the web of military contracts and war profiteers from the IDF's 6+ month incursion into Gaza. Cloud services will be the locus of the data stores on which Lavender is hosted, the storerooms for virtual Hollerith machines.
The +972mag article should reveal a possibly banal truth of the AI-ness of the current era: whatever the grist, there is a need for scale, whether that's synthetic text exchanged between email agents, extruded images splashed on junk webpages, or, most grimly, targets for a regime engaged in genocide.