By Daniel Harvey
Yuval Abraham, an Israeli journalist with the independent online non-profit +972 Magazine, has revealed how extensively AI systems are being used by Israel in its genocidal war in Gaza. The systems (Lavender, Where’s Daddy?, and Habsora, “The Gospel” in English) respectively create kill lists algorithmically, track targets to their homes, and identify cultural institutions marked for destruction. Working in tandem, these systems, amorally lenient policies designed for lethality at scale, and dumb bombs are responsible for the unprecedented death and devastation Israel has caused since Operation Swords of Iron began on October 7th, 2023.
Before we discuss the technology, we need to understand the policies shaping its use.
In international law there’s a notion called the principle of proportionality. The International Committee of the Red Cross defines it as:
The principle of proportionality prohibits attacks against military objectives which are “expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated”. In other words, the principle of proportionality seeks to limit damage caused by military operations by requiring that the effects of the means and methods of warfare used must not be disproportionate to the military advantage sought.
That means nations that claim to adhere to the “rules-based international order” should resort to significant collateral damage only when it yields a significant military advantage. Historically, in practice, that meant Israel would allegedly target only high-ranking Hamas or Palestinian Islamic Jihad (PIJ) officers, and only if it could guarantee minimal loss of civilian life.
Since at least 2014, during Operation Protective Edge, Israel has been walking away from such guarantees (if it ever truly maintained them in the first place). With Swords of Iron that walk has turned into a full-on sprint, first by casting a wider net over whom it would target, and then by sending the so-called collateral damage coefficient skyrocketing:
“In the past, we did not regularly mark the homes of junior Hamas members for bombing,” said a security official who participated in attacking targets during previous operations. “In my time, if the house I was working on was marked Collateral Damage 5, it would not always be approved [for attack].” Such approval, he said, would only be received if a senior Hamas commander was known to be living in the home.
“To my understanding, today they can mark all the houses of [any Hamas military operative regardless of rank],” the source continued. “That is a lot of houses. Hamas members who don’t really matter for anything live in homes across Gaza. So they mark the home and bomb the house and kill everyone there.”
So what caused the policy shift? Abraham’s sources within the IDF describe the atmosphere in the early days of the war as “painful and vindictive,” and say “there was hysteria in the professional ranks.” One source, identified simply as A., expanded on this:
According to A., who was an officer in a target operation room in the current war, the army’s international law department has never before given such “sweeping approval” for such a high collateral damage degree. “It’s not just that you can kill any person who is a Hamas soldier, which is clearly permitted and legitimate in terms of international law,” A. said. “But they directly tell you: ‘You are allowed to kill them along with many civilians.’
A. also used the word “revenge” to describe the atmosphere inside the army after October 7. “No one thought about what to do afterward, when the war is over, or how it will be possible to live in Gaza and what they will do with it,” A. said. “We were told: now we have to fuck up Hamas, no matter what the cost. Whatever you can, you bomb.”
Emphasis mine.
This change in policy presented two fundamental challenges for the target operations teams in the IDF and the technologists in Unit 8200, Israel’s equivalent of the NSA:
Israel had historically done a poor job of tracking lower-ranking members of Hamas
Issuing assassination orders (a process known as “incrimination”) at scale would be too time-consuming to keep a human in the loop in any meaningful way
The new policy also posed a technical problem for Israeli intelligence. In previous wars, in order to authorize the assassination of a single human target, an officer had to go through a complex and lengthy “incrimination” process: cross-check evidence that the person was indeed a senior member of Hamas’ military wing, find out where he lived, his contact information, and finally know when he was home in real time. When the list of targets numbered only a few dozen senior operatives, intelligence personnel could individually handle the work involved in incriminating and locating them.
However, once the list was expanded to include tens of thousands of lower-ranking operatives, the Israeli army figured it had to rely on automated software and artificial intelligence. The result, the sources testify, was that the role of human personnel in incriminating Palestinians as military operatives was pushed aside, and AI did most of the work instead. According to four of the sources who spoke to +972 and Local Call, Lavender — which was developed to create human targets in the current war — has marked some 37,000 Palestinians as suspected “Hamas militants,” most of them junior, for assassination (the IDF Spokesperson denied the existence of such a kill list in a statement to +972 and Local Call).
Unit 8200 and its mysterious commander “Colonel Yoav” had been working on this exact problem for years. Under the pen name “Brigadier General Y.S.,” “Yoav” authored a 2021 book on the subject called “The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World.” He also delivered a lecture to Unit 8200’s Data Science & AI Center around the same time, laying out the foundations of how the program worked during its inaugural run (also in 2021).
In brief, it works a lot like the graph databases that power the friend recommendation algorithms of most social media companies:
It’s trained on data gleaned from mass surveillance of Hamas operatives
It analyses the common features and traits of the members of that original training set
It then compares those patterns against all 2.3 million Palestinians living in Gaza, rating how closely each person’s behavioural patterns match those of known Hamas militants
Each person is then issued a score from 1 to 100
If your score is at or above the threshold, you’ve become a human target
According to Abraham’s sources, some 37,000 Palestinians were marked for death in this way in the early days of the war.
And like the collateral damage coefficient, the threshold for inclusion on the kill list was flexible. On any given day a kill list could be generated with a lower or higher threshold depending on how many targets the IDF wanted to process.
What sorts of features or behaviours got people onto this kill list? In the book “Yoav” cited examples like “being in a Whatsapp group with a known militant, changing cell phone every few months, and changing addresses frequently.”
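To make the mechanics above concrete, here is a minimal, purely illustrative sketch in Python of a feature-weighted scoring model with an adjustable threshold. Every feature name, weight, and number below is hypothetical; Lavender’s actual implementation has never been published, and this only demonstrates the general pattern Abraham’s sources and “Yoav’s” book describe.

```python
# Purely illustrative sketch of a feature-weighted scoring system with an
# adjustable threshold. All feature names, weights, and numbers are
# hypothetical; they do not reflect any real system.

# Hypothetical behavioural signals, loosely echoing the examples cited in the
# book (group membership, frequent phone and address changes).
FEATURE_WEIGHTS = {
    "in_group_with_known_militant": 40,
    "changes_phone_frequently": 25,
    "changes_address_frequently": 20,
    "device_previously_linked_to_operative": 15,
}

def score(person_features: set[str]) -> int:
    """Sum the weights of observed signals and clamp to a 1-100 scale."""
    raw = sum(FEATURE_WEIGHTS.get(f, 0) for f in person_features)
    return max(1, min(100, raw))

def mark_targets(population: dict[str, set[str]], threshold: int) -> list[str]:
    """Return everyone whose score meets or exceeds the day's threshold.

    Lowering `threshold` expands the list; raising it shrinks the list,
    which is the flexibility Abraham's sources describe.
    """
    return [pid for pid, feats in population.items() if score(feats) >= threshold]

# Example: the same population yields different lists at different thresholds.
population = {
    "person_a": {"in_group_with_known_militant", "changes_phone_frequently"},
    "person_b": {"changes_address_frequently"},
}
print(mark_targets(population, threshold=60))  # ['person_a']
print(mark_targets(population, threshold=20))  # ['person_a', 'person_b']
```

Even in this toy form the design choice is plain: whoever sets the day’s threshold, not the analyst glancing at an individual name, decides how many people end up marked.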
Within the Israeli war machine, Lavender has been deemed 90% accurate, despite the army knowing full well that the traits it keyed on often scooped up innocents: “police and civil defense workers, militants’ relatives, residents who happened to have a name and nickname identical to that of an operative, and Gazans who used a device that once belonged to a Hamas operative.”
In short order, the only check the “humans in the loop” performed before approving a name for the kill list was verifying whether the target was a man or a woman (on the grounds that there are no women in the military wings of Hamas or PIJ).
To conduct the male/female check, B. claimed that in the current war, “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added value as a human, apart from being a stamp of approval. It saved a lot of time. If [the operative] came up in the automated mechanism, and I checked that he was a man, there would be permission to bomb him, subject to an examination of collateral damage.”
Lavender’s kill lists were then fed into a program called Where’s Daddy?, which tracked targets using drone footage, intercepted communications, and other mass surveillance techniques, and pinged IDF forces to bomb a target only once they were believed to be at home. This was to ensure as much collateral damage as possible.
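Purely as an illustration of the trigger logic the sources describe, and not of any real system: the behaviour amounts to watching a stream of location pings and raising an alert only when a tracked person’s observed location matches the address on file as their home. The names and data structures below are invented.

```python
# Illustrative-only sketch of the "alert when the target is believed to be
# home" trigger described by Abraham's sources. All names and data are invented.
from dataclasses import dataclass

@dataclass
class LocationPing:
    person_id: str
    location: str  # e.g. a building identifier derived from surveillance feeds

def home_alerts(pings: list[LocationPing],
                registered_homes: dict[str, str]) -> list[str]:
    """Return the IDs of tracked people whose latest ping matches the address
    on file as their home, the condition the sources say triggered strikes."""
    return [p.person_id for p in pings
            if registered_homes.get(p.person_id) == p.location]

# Example: only the person observed at their registered home is flagged.
pings = [LocationPing("person_a", "building_17"), LocationPing("person_b", "building_4")]
homes = {"person_a": "building_17", "person_b": "building_9"}
print(home_alerts(pings, homes))  # ['person_a']
```

Everything here hinges on the home-address mapping being accurate and current; as the sources explain further below, it frequently was neither.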
“You put hundreds [of targets] into the system and wait to see who you can kill,” said one source with knowledge of the system. “It’s called broad hunting: you copy-paste from the lists that the target system produces.”
“One day, totally of my own accord, I added something like 1,200 new targets to the [tracking] system, because the number of attacks [we were conducting] decreased,” the source said. “That made sense to me. In retrospect, it seems like a serious decision I made. And such decisions were not made at high levels.”
According to Abraham’s sources, it wasn’t long before everyone marked by Lavender was being tracked by systems like Where’s Daddy?
Lavender and systems like Where’s Daddy? were thus combined with deadly effect, killing entire families, sources said. By adding a name from the Lavender-generated lists to the Where’s Daddy? home tracking system, A. explained, the marked person would be placed under ongoing surveillance, and could be attacked as soon as they set foot in their home, collapsing the house on everyone inside.
“Let’s say you calculate [that there is one] Hamas [operative] plus 10 [civilians in the house],” A. said. “Usually, these 10 will be women and children. So absurdly, it turns out that most of the people you killed were women and children.”
Emphasis mine. Within the first month of the war, the numbers spoke for themselves:
Evidence of this policy is also clear from the data: during the first month of the war, more than half of the fatalities — 6,120 people — belonged to 1,340 families, many of which were completely wiped out while inside their homes, according to UN figures. The proportion of entire families bombed in their houses in the current war is much higher than in the 2014 Israeli operation in Gaza (which was previously Israel’s deadliest war on the Strip), further suggesting the prominence of this policy. (emphasis mine)
Given that Israel’s policies pushed for more targets and more collateral damage, the weapon of choice became clear: dumb bombs. Not only do they cause more destruction than guided or smart bombs, but they’re also much cheaper to manufacture and purchase. CNN reported that by December 2023, 45% of the munitions used were dumb bombs.
Three intelligence sources, however, told +972 and Local Call that junior operatives marked by Lavender were assassinated only with dumb bombs, in the interest of saving more expensive armaments. The implication, one source explained, was that the army would not strike a junior target if they lived in a high-rise building, because the army did not want to spend a more precise and expensive “floor bomb” (with more limited collateral effect) to kill him. But if a junior target lived in a building with only a few floors, the army was authorized to kill him and everyone in the building with a dumb bomb.
“It was like that with all the junior targets,” testified C., who used various automated programs in the current war. “The only question was, is it possible to attack the building in terms of collateral damage? Because we usually carried out the attacks with dumb bombs, and that meant literally destroying the whole house on top of its occupants. But even if an attack is averted, you don’t care — you immediately move on to the next target. Because of the system, the targets never end. You have another 36,000 waiting.”
Emphasis mine. Other factors compounded the bloodshed. Firstly, collateral damage estimates were based on outdated data:
“This model was not connected to reality,” claimed one source. “There was no connection between those who were in the home now, during the war, and those who were listed as living there prior to the war. [On one occasion] we bombed a house without knowing that there were several families inside, hiding together.”
The source said that although the army knew that such errors could occur, this imprecise model was adopted nonetheless, because it was faster. As such, the source said, “the collateral damage calculation was completely automatic and statistical” — even producing figures that were not whole numbers.
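The detail about figures “that were not whole numbers” is what you would expect if the estimate were a simple statistical product rather than an actual headcount. The sketch below is hypothetical, with invented inputs and factors; it only shows how such a model yields fractional casualty estimates while knowing nothing about who is actually sheltering in a building during the war.

```python
# Hypothetical illustration of a purely statistical occupancy estimate.
# The adjustment factor and inputs are invented; the point is only that a
# model like this produces fractional figures and is blind to wartime
# displacement and to families sheltering together.
def estimated_occupants(prewar_registered_residents: int,
                        assumed_fraction_still_present: float) -> float:
    return prewar_registered_residents * assumed_fraction_still_present

# e.g. 15 registered residents, an assumed 50% still present:
print(estimated_occupants(15, 0.5))  # 7.5, not a whole number
```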
Secondly, there were often lag times between receiving a ping from Where’s Daddy? and the IDF actually executing the kill command, and the presence of the target was never confirmed in real time:
“Sometimes [the target] was at home earlier, and then at night he went to sleep somewhere else, say underground, and you didn’t know about it,” one of the sources said. “There are times when you double-check the location, and there are times when you just say, ‘Okay, he was in the house in the last few hours, so you can just bomb.’”
Another source described a similar incident that affected him and made him want to be interviewed for this investigation. “We understood that the target was home at 8 p.m. In the end, the air force bombed the house at 3 a.m. Then we found out [in that span of time] he had managed to move himself to another house with his family. There were two other families with children in the building we bombed.”
Another pre-October 7th practice scrapped in the interests of speed and scale, at least for these AI-generated targets, was confirming the kill at all:
In previous wars in Gaza, after the assassination of human targets, Israeli intelligence would carry out bomb damage assessment (BDA) procedures — a routine post-strike check to see if the senior commander was killed and how many civilians were killed along with him. As revealed in a previous +972 and Local Call investigation, this involved listening in to phone calls of relatives who lost their loved ones. In the current war, however, at least in relation to junior militants marked using AI, sources say this procedure was abolished in order to save time. The sources said they did not know how many civilians were actually killed in each strike, and for the low-ranking suspected Hamas and PIJ operatives marked by AI, they did not even know whether the target himself was killed.
Habsora appears to be the architectural equivalent of Lavender: where Lavender marks people, Habsora marks buildings and structures. According to Abraham’s reporting, it’s used by Israel to generate a variety of target types:
Tactical targets–like weapons warehouses, military headquarters, etc.
Underground tunnels
Power targets–public cultural, commercial, and governmental institutions as well as private high-rises and residential towers
Family homes
In the early stages of the current war, the Israeli army appears to have given particular attention to the third and fourth categories of targets. According to statements on Oct. 11 by the IDF Spokesperson, during the first five days of fighting, half of the targets bombed — 1,329 out of a total 2,687 — were deemed power targets.
Emphasis mine. As you might expect, protocols have changed in the current war. Previously, the demolition of power targets was said to be permitted only if the buildings were empty of residents and civilians. That is yet another policy abandoned in the current bloodbath.
The massive attacks on power targets and private residences came at the same time as the Israeli army, on Oct. 13, called on the 1.1 million residents of the northern Gaza Strip — most of them residing in Gaza City — to leave their homes and move to the south of the Strip. By that date, a record number of power targets had already been bombed, and more than 1,000 Palestinians had already been killed, including hundreds of children.
Emphasis mine. As with Lavender, Habsora was test-piloted in Gaza in 2021:
Gen. Aviv Kochavi, the former IDF Chief of Staff, reflected on Habsora's use in the 11-day 2021 Israeli bombing of Gaza. "You see, in the past there were times in Gaza when we would create 50 targets per year," Kochavi said. "And here the machine produced 100 targets in one day."
Rolling back further still, the notion of power targets stems from the “Dahiya Doctrine” first proposed during the Second Lebanon War in 2006:
According to the doctrine — developed by former IDF Chief of Staff Gadi Eizenkot, who is now a Knesset member and part of the current war cabinet — in a war against guerrilla groups such as Hamas or Hezbollah, Israel must use disproportionate and overwhelming force while targeting civilian and government infrastructure in order to establish deterrence and force the civilian population to pressure the groups to end their attacks. The concept of “power targets” seems to have emanated from this same logic.
If that sounds like the U.S. concept of shock and awe, you’re not alone. Spencer Ackerman, the Pulitzer Prize-winning American journalist, noted the similarities:
Abraham writes that his sources, to one degree or another, understood that inflicting such "damage to civil society" is "the real purpose of these attacks." The point of bombing "Power Targets" is to shock Palestinians into "put[ting] pressure on Hamas," one of his sources said. That sounds awfully reminiscent of the U.S. airpower concept of "Shock and Awe" made famous during the Iraq War: inflicting devastation so psychologically overwhelming as to break Iraqis' will to resist. If you remember the Iraq War, you'll remember it didn't exactly turn out that way. The scale promised by military AI is like a combination of Shock And Awe and the CIA's War on Terror "Signature Strikes." Those strikes killed people whose observed "patterns of life" fit analyst-presumed patterns of militant activity, like being a man between the ages of 18 and 50 who carried a gun. The CIA didn't even have to know their names to kill them.
Existing alongside the "Power Target" concept for the IDF are decisions about acceptable civilian deaths. Abraham reminds us that those have changed over time. In 2002, Israel reaped outrage when it bombed the home of Hamas military commander Salah Shehadeh, killing Shehadeh, his aide and 13 of his relatives and neighbors, including eight children. Twenty-one years later, Israel considers it acceptable to bomb the residences of even low-level Hamas fighters. Once that choice has been made, Israel is necessarily going to kill scores of Palestinians with no ties to Hamas beyond their misfortune to live in the same apartment building or on the same street as someone who does. Strikes on such targets, in such an environment—one of the most densely packed areas on earth, remember—operate as a "means that allows damage to civil society," one of Abraham's IDF sources said. At that point, Habsora might be better understood as a war-crimes engine.
Emphasis mine.
Fantasists have been raising the alarm about military applications of AI for years. Examples abound, but the one I always took issue with was “Slaughterbots”. In this grim portent of future war, “miniature, flying lethal autonomous weapons become real weapons of mass destruction”. Hacked by bad actors, they lay waste to university students and congress critters across various ideological fault lines. While I praise the production values and support the arms-control advocacy of the project, it always seemed wrong on one key point.
Precision.
Precision is a myth. The naked war of aggression in Gaza by Israel today should make that abundantly clear. War doesn’t care about precision. War doesn’t care about accuracy. War only cares about scale.
If you’d like to read more about this, I recommend Abraham’s original exposés and Spencer Ackerman’s CliffsNotes coverage of them:
‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza By Yuval Abraham at 972mag.com
‘A mass assassination factory’: Inside Israel’s calculated bombing of Gaza By Yuval Abraham at 972mag.com
Kill Lists In The Age of Artificial Intelligence By Spencer Ackerman at Forever Wars
The Future of Warfare Is Happening In Gaza By Spencer Ackerman at Forever Wars