Press release: Lowering the inhibition threshold through the use of artificial intelligence – #Lavender and similar systems must be classified as war crimes (#Kriegsverbrechen)
We demand that the practice of "targeted killing" supported by AI systems be classified as a war crime. The current use of intelligent systems and human-machine interfaces in intelligence and military security apparatuses must be stopped.
To this end, we are presenting a position paper (#Positionspapier) intended to inform the public and to serve as a basis for further discussion and political demands.
It is thanks to #Habsora that the Israeli army is today able to carry out large-scale attacks on the houses where Hamas members live. But the air raids also strike other homes where no militants reside, killing entire families of civilians.
The terrible human toll in Gaza has many causes.
A chilling investigation by +972 Magazine highlights how the pursuit of efficiency drives the targeting process:
An engineer: “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed.”
An AI generates "100 targets a day", like an assembly line whose product is killing:
"According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”"
"The third is “power targets,” which includes high-rises and residential towers in the heart of cities, and public buildings such as universities, banks, and government offices."
#AI: #Habsora estimates in advance how many innocents will be killed for each "generated" bombing target:
"Five different sources confirmed that the number of civilians who may be killed in attacks on private residences is known in advance to Israeli intelligence, and appears clearly in the target file under the category of “collateral damage.”
According to these sources, there are degrees of collateral damage, according to which the army determines whether it is possible to attack a target inside a private residence. “When the general directive becomes ‘Collateral Damage 5,’ that means we are authorized to strike all targets that will kill five or less civilians — we can act on all target files that are five or less,” said one of the sources."
A person who took part in previous Israeli offensives in Gaza said:
“If they would tell the whole world that the [Islamic Jihad] offices on the 10th floor are not important as a target, but that its existence is a justification to bring down the entire high-rise with the aim of pressuring civilian families who live in it in order to put pressure on terrorist organizations, this would itself be seen as terrorism. So they do not say it.”
#IDI stands for the Intelligence Division of the Israeli army. Here is some praise of its technology usage:
The May 2021 operation "is the first time that the intelligence services have played such a transformative role at the tactical level.
This is the result of a strategic shift made by the IDI [in] recent years. Revisiting its role in military operations, it established a comprehensive, “one-stop-shop” intelligence war machine, gathering all relevant players in intelligence planning and direction, collection, processing and exploitation, analysis and production, and dissemination process (PCPAD)".
"The unit is engaged in the same kind of AI work that the world’s biggest tech companies, like Google, Facebook and China’s Baidu are doing in a race to apply machine learning to such functions as self-driving cars, analysis of salespeople’s telephone pitches and cybersecurity — or to fight Israel’s next war more intelligently."
“I’ve always loved algorithms. I was already involved with them in high school and worked in the field. When I [was] drafted I wanted to combine the technology with a combat,” Maj. Sefi Cohen, 34, recalls.
The unit’s only female member left recently, so for the moment it’s an all-male team. Cohen says: “Everyone who’s here is the tops.”
Lt.-Col. Nurit Cohen Inger has overseen #Data at the Israeli #military’s Computer Service Directorate. In 2017 she expressed her enthusiasm for #AIWar to JNS.org:
“The top level in this field of big data is to have a system that makes recommendations on what to do, based on the data. We are there.”
In theory, such a system could determine where to direct strikes to achieve maximum damage.
Inger said AI “can influence every step and small decision in a conflict, and the entire conflict itself.”
“For this system to work, it has to function at a very high level,” she added. “AI is a machine that has the intelligence characteristics of a person—in this case, by giving recommendations.”
Human commanders will still make the final decisions, Inger said, but they will receive “very precise and relevant recommendations. This is happening, and it will happen much more.”
According to the investigation, it was easier to locate the individuals in their private houses:
“We were not interested in killing operatives only when they were in a military building or engaged in a military activity. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”