The terrible human toll in Gaza has many causes.
A chilling investigation by +972 Magazine highlights the drive for efficiency:
An engineer: “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed.”
The AI outputs "100 targets a day", like a factory delivering murder:
"According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”"
"The third is “power targets,” which includes high-rises and residential towers in the heart of cities, and public buildings such as universities, banks, and government offices."
Behind any aircraft that takes off for an attack, there are thousands of soldiers, men and women, who make the information accessible to the pilot. "They produce the targets and make the targets accessible. To set a target, it’s a process with lots of factors that need to be approved. The achievement, the collateral damage and the level of accuracy. For that, you have to interconnect intelligence, (weapon) fire, C4I [an integrated military communications system, including the interaction of troops, intelligence and communication equipment] and more," said Nati Cohen, currently a reservist in the Exercises Division of the C4I Division of the army.
"The unit is engaged in the same kind of AI work that the world’s biggest tech companies, like Google, Facebook and China’s Baidu are doing in a race to apply machine learning to such functions as self-driving cars, analysis of salespeople’s telephone pitches and cybersecurity — or to fight Israel’s next war more intelligently."
“I’ve always loved algorithms. I was already involved with them in high school and worked in the field. When I [was] drafted I wanted to combine the technology with a combat,” Maj. Sefi Cohen, 34, recalls.
The unit’s only female member left recently, so for the moment it’s an all-male team. Cohen says: “Everyone who’s here is the tops.”
Lt.-Col. Nurit Cohen Inger has overseen data at the Israeli military’s Computer Service Directorate. She expressed her enthusiasm for AI in warfare to JNS.org in 2017:
“The top level in this field of big data is to have a system that makes recommendations on what to do, based on the data. We are there.”
In theory, such a system could determine where to direct strikes to achieve maximum damage.
Inger said AI “can influence every step and small decision in a conflict, and the entire conflict itself.”
“For this system to work, it has to function at a very high level,” she added. “AI is a machine that has the intelligence characteristics of a person—in this case, by giving recommendations.”
Human commanders will still make the final decisions, Inger said, but they will receive “very precise and relevant recommendations. This is happening, and it will happen much more.”
It was easier to locate the individuals in their private homes:
“We were not interested in killing operatives only when they were in a military building or engaged in a military activity. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”