
The Blind Technology: How Israel Used Artificial Intelligence in the Wars in Gaza and Lebanon

Artificial intelligence (AI) has become a powerful presence on the battlefield, playing significant roles in military operations worldwide. Its applications range from analyzing intelligence data to carrying out targeted strikes using drones and conducting reconnaissance through military robots. However, in its conflicts against Hamas and Hezbollah, Israel employed AI in a markedly different manner.

Traditionally, human operators identify military targets for attack or assassination, and AI is tasked with executing the mission; a drone, for example, may carry out the assassination of a target designated by military leadership. But when the pool of potential targets broadens dramatically, human operators may struggle to manually identify, assess, and review them all within the limited time available, and time rarely favors the combat forces.

Because Israel designated nearly all members of Hamas and Hezbollah as targets, it adopted a strategy based on collaboration between humans and AI: the AI took on the responsibility of identifying, reviewing, and tracking targets, while human operators focused on executing strikes via aerial bombardment.

To implement this strategy effectively, Israel utilized several AI systems, including “The Gospel,” “Lavender,” and “Where’s Daddy?” These systems were responsible for target identification, with humans handling the execution of attacks.

The “Gospel” system identifies buildings believed to be used by Hamas and Hezbollah members. The “Lavender” system identifies, prioritizes, and tracks human targets. Meanwhile, “Where’s Daddy?” tracks these individuals and flags the moments when they are in their family homes, marking them for bombardment.

According to a report published by the Israeli outlet +972 Magazine, these systems help Israeli forces determine who should be targeted, with minimal human intervention in the decision-making process.

This analysis will shed light on these three systems based on available information, while reviewing the strategy employed by Israel to achieve integration between these systems and its military forces in the wars in Gaza and Lebanon.

The “Human-Machine Team” Strategy:

In 2021, a notable book titled “The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World” was published, authored by “Brigadier General Y.S.,” believed to be the commander of Israel’s elite intelligence Unit 8200. The book describes a revolutionary machine capable of processing vast amounts of data rapidly to generate thousands of potential military targets with exceptional speed and accuracy, overcoming the bottleneck that human operators face when identifying targets and making critical decisions in wartime.

The machine described in the book is no distant prospect; it has already been deployed in the Gaza conflict. Several Israeli and Western reports indicate that Israel used various AI systems, tightly integrated with its military forces, to identify military targets, human and structural alike, for automated destruction.

This process relies on AI-driven systems that analyze big data to select targets for destruction or assassination and issue automated decisions based on real-time, continuously updated information. The technology dramatically accelerates the identification and striking of targets, functioning, in the words of the book’s author, like a “production line for targets” that removes the “human bottleneck” in identifying targets and in the rapid decision-making needed to authorize their destruction.

To put this strategy into practice, Israel established a new, classified military intelligence unit within the Military Intelligence Directorate in 2019, known as the “Target Unit.” The unit played a pivotal role in Israel’s response to Hamas’s attack on October 7, 2023, using AI technologies to rapidly identify targets and send engagement recommendations to military forces through a smart application called “Pillar of Fire,” used by military operations commanders.

The effectiveness of this strategy is evident in the tempo of Israeli airstrikes, which at times exceeded a thousand targets per day, alongside numerous precision assassinations of military leaders. Operating at this level demands enormous intelligence effort, yet intelligence agencies alone could not have produced such a vast number of targets in so short a timeframe, and with that degree of accuracy, without a comprehensive strategy integrating human effort and artificial intelligence in full synergy between man and machine.

“Gospel”: A Building and Structure Targeting System

On November 2, 2023, a brief statement on the Israeli army’s website highlighted the use of the “Gospel” system in the war against the Gaza Strip for rapid target generation. The primary function of “Gospel” is to automatically identify potential military strike targets, particularly buildings and structures that may have military use. It is designed to assist the Israeli army in automating the process of target identification and decision-making, thereby reducing the need for human intervention.

While detailed information on how “Gospel” operates is scarce, such systems typically analyze satellite imagery, drone and surveillance camera footage, intercepted communications, and information derived from monitoring the movements and behavior patterns of individuals and large groups, along with intelligence data. All of this data is fed into the system, which then establishes relationships among the information to identify buildings used by Hamas operatives or other potential military targets.

“Gospel” utilizes several indicators or features to pinpoint these areas, such as buildings with additional fortifications, concrete reinforcements, or those frequented by large groups regularly, or that have undergone structural changes when compared to older images. The system also relies on multispectral image analysis to identify the types of materials used in fortifications and to distinguish between natural terrains and artificial structures, such as tunnel openings or concealed trenches beneath the surface. Furthermore, it analyzes logistical movements in potential sites to identify military activity, such as the transfer of weapons or heavy equipment, which typically signals preparations for military operations or the presence of a military facility.

The “Gospel” system integrates data from various sources, including satellite imagery, radar, thermal sensing, and other intelligence such as field reports or communications. This ultimately helps build an accurate picture of Hamas locations and activities.
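The reporting above describes, in general terms, a rule-based fusion of physical indicators into a suspicion score for a structure. As a purely illustrative sketch, the pattern might look like the following; the feature names, weights, and values are invented for this article, since nothing about the actual “Gospel” system’s internals is public.

```python
# Toy illustration of multi-indicator scoring for structures.
# All feature names and weights are hypothetical; this reflects only
# the general pattern described in the reporting, not the real system.
from dataclasses import dataclass

@dataclass
class StructureObservation:
    reinforced: bool          # extra fortification visible in imagery
    structural_change: bool   # differs from older reference images
    heavy_logistics: bool     # repeated heavy-equipment movement nearby
    crowd_pattern: bool       # regular gatherings of large groups

# Hypothetical weights for how strongly each indicator counts.
WEIGHTS = {
    "reinforced": 0.35,
    "structural_change": 0.25,
    "heavy_logistics": 0.25,
    "crowd_pattern": 0.15,
}

def score(obs: StructureObservation) -> float:
    """Combine the boolean indicators into a 0-1 suspicion score."""
    return sum(w for name, w in WEIGHTS.items() if getattr(obs, name))

obs = StructureObservation(True, True, False, True)
print(round(score(obs), 2))  # 0.75
```

Real systems would add many more signals (multispectral analysis, intercepts, field reports) and far more complex models, but the basic move is the same: convert heterogeneous observations into one ranked score per structure.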

“Gospel” has played a critical role in compiling Israel’s target lists. Aviv Kochavi, a former chief of staff of the Israeli army, said of the machine’s activation during Israel’s 11-day war with Hamas in May 2021: “In the past we would produce 50 targets in Gaza per year. Now, this machine produces 100 targets in a single day.”

According to the official Israeli army website, the “Gospel” system was developed by the renowned intelligence Unit 8200, known for its intelligence and cyber technologies; the unit is reported to have targeted the Iranian nuclear program in 2009 with the “Stuxnet” worm. “Gospel” is relatively new, first mentioned on the official site in 2020 as one of the projects awarded the Israeli army’s innovation prize.

“Lavender”: An Individual Targeting System

The “Lavender” system has played a significant role in building a database of human targets associated with Hamas, resulting in an unprecedented escalation in the bombing of Palestinians, especially during the initial phases of the current war. This system is designed to identify all suspected individuals within the military wings of Hamas and Islamic Jihad, including low-ranking members, as direct targets for airstrikes.

The system is fed data on known Hamas members and trained on that information, analyzing individual patterns and behaviors to learn their general characteristics and extrapolate those traits to other potential suspects. It then searches surveillance data covering almost every individual in Gaza, including images, phone contacts, and other information, to assess the likelihood that a person is a militant. Factors such as belonging to a WhatsApp group that includes a known militant, frequently changing mobile phones, or regularly changing addresses are taken into account. Palestinians are then ranked by their similarity to known militants, and any individual exhibiting incriminating features becomes a target for assassination, with minimal human intervention, according to an investigative report by +972 Magazine.

In the initial weeks of the Gaza war, “Lavender” flagged approximately 37,000 Palestinians as suspected militants and potential targets for airstrikes. It did so by analyzing information collected on most of the Gaza Strip’s 2.3 million residents through a mass surveillance system, assigning each individual a score from 1 to 100 indicating the estimated likelihood that they were active in the military wing of Hamas or Islamic Jihad.
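The reported 1-100 index is, in machine-learning terms, a probability estimate rescaled to a fixed range. The toy sketch below illustrates only that generic idea; the feature names, coefficients, and bias are invented here, and the actual “Lavender” model is not public.

```python
# Toy sketch: map behavioral features to a 1-100 likelihood index via a
# logistic model. Feature names and coefficients are invented for
# illustration only; they do not come from the actual system.
import math

# Hypothetical log-odds contribution of each observed feature.
COEFFS = {
    "group_with_known_militant": 1.2,
    "frequent_phone_changes": 0.8,
    "frequent_address_changes": 0.6,
}
BIAS = -2.0  # baseline log-odds: absent any feature, the score is low

def likelihood_index(features: set[str]) -> int:
    """Return a 1-100 score from a set of observed feature names."""
    z = BIAS + sum(c for name, c in COEFFS.items() if name in features)
    p = 1.0 / (1.0 + math.exp(-z))           # logistic squashing to (0, 1)
    return min(100, max(1, round(p * 100)))  # clamp into the 1-100 index

print(likelihood_index(set()))        # no features observed: low baseline
print(likelihood_index(set(COEFFS)))  # all features observed: much higher
```

A model of this shape makes the article’s later point concrete: anyone whose behavior superficially resembles the training examples (shared contacts, borrowed phones) gets a high score, whether or not they are actually a militant.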

If “Lavender” determines that an individual is a Hamas militant, that person becomes a target for bombing, without any independent verification of why the system selected them or examination of the underlying intelligence data. The magazine, citing soldiers who worked with the system, reported that “Lavender” occasionally misidentifies people as targets simply because their communication patterns resemble those of Hamas or Islamic Jihad operatives: police officers, civil defense workers, relatives of militants, residents who happen to share a name or surname with a militant, or people using a device that once belonged to a Hamas member. Because the system is trained on communication profiles, it is prone to mistakenly selecting civilians when its algorithms are applied to the general population. A common error occurs when a Hamas member lends his phone to his son, older brother, or another man, leading to that person being bombed in his home along with his family.

“Where’s Daddy?”: A Home Targeting System

The “Where’s Daddy?” system represents the third phase of targeting. It was designed specifically to track targeted individuals and trigger airstrikes when they enter their family homes, rather than attacking them during military activity. From an intelligence perspective, the likelihood of hitting these individuals at home is far higher than in military areas, which are often fortified. This method also means that entire families are struck rather than a single Hamas operative, further increasing civilian casualties.

Everyone in Gaza has a personal home that can be linked to them, allowing Israeli army surveillance systems to automatically associate individuals with their family residences. Programs were developed to track thousands of individuals in real time; when a target is detected returning home, an automatic alert is sent to the targeting officer, who then orders the bombing of the house and everyone inside. The combination of the “Lavender” and “Where’s Daddy?” systems has thus proved deadly.
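In software terms, the alert mechanism described above is essentially a geofence check over a stream of location events. The sketch below is a generic illustration of that pattern only; the identifiers, coordinates, and proximity threshold are invented, and it reflects nothing of the real system’s implementation.

```python
# Generic geofence-style alert over a stream of (id, position) events:
# emit an alert when a tracked ID is observed at its registered location.
# All data here is invented for illustration.

HOME_OF = {"id_17": (31.50, 34.47)}  # hypothetical id -> home coordinates

def near(a: tuple, b: tuple, eps: float = 1e-3) -> bool:
    """Crude proximity test on (lat, lon) pairs."""
    return abs(a[0] - b[0]) < eps and abs(a[1] - b[1]) < eps

def alerts(stream):
    """Yield an alert message whenever a tracked ID appears at its home."""
    for person_id, position in stream:
        home = HOME_OF.get(person_id)
        if home is not None and near(position, home):
            yield f"ALERT: {person_id} at home"

events = [("id_17", (31.40, 34.40)),   # tracked, but away from home
          ("id_99", (31.50, 34.47)),   # at those coordinates, not tracked
          ("id_17", (31.50, 34.47))]   # tracked and at home: alert
print(list(alerts(events)))
```

The triviality of the code is the point the article goes on to make: once identity, home address, and live location are all in one database, automating the lethal step requires almost no additional sophistication.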

Fatal Errors

These technologies have made killing more automated and more devastating, blurring any distinction between civilian and military targets. Soldiers receive a computer-generated list of targets without necessarily knowing the basis on which those targets were selected. Are the machine’s recommendations truly accurate, or do they carry a margin of error? Are all those it identifies genuinely the intended targets, or do some carry lower confidence scores that were overlooked in the rush to hit anyone associated with Hamas? And what of the quality of the data used to train the system, algorithmic biases, or errors in assessing the level of threat?

All of these errors are highly likely in intelligent systems, and when a system’s primary function is killing, they raise the probability of striking wrong or unwanted targets. Destructive power disproportionate to the size of the target can produce very high numbers of unintended casualties. These machines thus become blind rather than intelligent: their output is comprehensive destruction rather than precise target selection. The result has been an alarming death toll, surpassing 41,000 in Gaza alone.

Mohamed SAKHRI

I’m Mohamed Sakhri, the founder of World Policy Hub. I hold a Bachelor’s degree in Political Science and International Relations and a Master’s in International Security Studies. My academic journey has given me a strong foundation in political theory, global affairs, and strategic studies, allowing me to analyze the complex challenges that confront nations and political institutions today.
