Khaberni - Experts have confirmed that the artificial intelligence tools used by the Israeli occupation in its military arsenal cannot distinguish between civilians and military personnel, and are designed to make mass-killing decisions without human review.
The military and strategic expert Elias Hanna explained that the Israeli occupation forces employ artificial intelligence technologies at every level - political, strategic, operational, and tactical - altering the nature of decision-making on the battlefield.
He said that the role of artificial intelligence involves compiling and analyzing large datasets, then proposing readily actionable targets to expedite decision-making, "and when the primary target bank is depleted, the technical system begins generating new targets from the information it possesses."
Hanna clarified that this system trains on massive datasets and connects them to open-source information to refresh and suggest more targets.
Hanna described the strategy used by the occupation forces as a "digital proxy war," as though a mandate has been handed to a digital system that collects, verifies, and analyzes information, then makes decisions without returning to a human.
The military expert spoke of the "bias" of these models toward issuing targets that serve Tel Aviv's interest, listing several artificial intelligence systems and software used by the army in its operations in the Gaza Strip:
- Fire Factory: For scheduling targets and estimating the amount of explosives needed to destroy a specific target.
- Depth of Wisdom: For linking and tracking information on tunnels in Gaza, and developing strategies to destroy them.
- Alchemist: An alert system that warns combat forces of risks during tactical missions.
- Lavender: An artificial intelligence model developed by the army to analyze data from phones, images, and social media. Hanna asserts that the "Lavender" system is entirely biased toward Israeli interests, because the data and maps it is trained on are skewed in favor of Tel Aviv.
Hanna also pointed to the use of explosive-laden robots carrying tons of explosives, and mentioned the "Arbel" system, which improves firing accuracy through combat equipment such as rifles and night sights.
He said that a bomb like the "GBU-32," when dropped on a residential complex, inevitably causes civilian casualties even if the building is said to be empty, and that the "Lavender" model permits its user to accept a specific number of civilian casualties depending on the military rank of the target.
Minimal Human Supervision
From the perspective of legal and ethical responsibility, Jessica Dorsey, an expert in the responsible use of artificial intelligence, stated that the real disaster lies in the generation of target lists that include civilians with minimal or insufficient human supervision over automated decisions, which amounts to a breach of international targeting rules.
The expert accused the Israeli government of causing civilian casualties despite relying on artificial intelligence tools whose precision in target identification is supposed to protect civilians.
Hanna warned of the risks of this complete reliance on artificial intelligence systems, noting that they do not reliably distinguish between military and civilian targets, especially in densely populated areas like the Gaza Strip, thereby increasing the likelihood of civilian casualties.
Hanna gave a clear example of targeting policies that may allow a military commander of a given rank to cause "civilian casualties" up to specified thresholds.
According to Hanna, a commander of the second rank is allowed to authorize the killing of 20 civilians, while a commander of the first rank is permitted to cause 100 civilian deaths.
Dorsey emphasized the importance of keeping human judgment behind the machine, to monitor its actions and correct its output before it commits to a decision and strikes unintended targets.
The artificial intelligence expert pointed to the role of international tech companies like "Microsoft" in supplying the Israeli military with technology products used in this context.
How Did Hamas Gain the Upper Hand?
Despite its overwhelming technological superiority, Hanna explained, the occupation's failure to achieve its strategic objectives over the past two years stems from failures in managing the war inside the enclave, and from the complex and flexible strategy of the Islamic Resistance Movement (Hamas).
He said that the fighters from the resistance "prepared the battlefield in an unmatched manner, utilizing tactics and a combat environment that made the use of greater technological force—such as tanks and advanced munitions—less effective in urban clusters and tunnels."
Hanna argued that advanced weapons and software alone do not guarantee victory; the absence of ethical human oversight and of suitable field strategies, together with the characteristics of the battlefield in Gaza, reduced the effectiveness of these technologies and produced outcomes contrary to Israeli interests.