My iPad is showing only 1 featured photo 🔍 . I could have posted one of my many waterfalls, snowy mountains, seascapes, everything, but it is tiny lavender (what are the odds).
Which is also amazing, by the way: it smells lovely and looks beautiful.
Discover the VerdantGent Organic Beard Oil! Perfect for maintaining a healthy, luscious beard. This natural blend nourishes both hair and skin, promoting growth and reducing itch.
Experience the calming scent of Lavender essential oil in this variant.
"The #Israeli #Lavender system, supported by artificial intelligence, identifies Palestinians by tracking their communications via WhatsApp or the groups they join."
I finished the first module of the diploma course 🫶🏻✨. And in the end the lavender turned into a cute little bag; I still don't know what to turn the chamomile embroidery into 🤔.
Press release: Lowering the threshold for violence through the use of artificial intelligence. #Lavender and the like must be classified as war crimes (#Kriegsverbrechen).
We demand that the practice of "targeted killing" with supporting AI systems be classified as a war crime. The current use of intelligent systems and human-machine interfaces in intelligence and military security apparatuses must be stopped.
To that end we are presenting a position paper (#Positionspapier), intended to inform the public and to serve as a basis for further discussion and political demands.
»Autopilot technology: software for kamikaze drones, made in Switzerland.
Cheap mini-drones are now a fixture of modern warfare, thanks in part to Swiss software.«
Unfortunately, technology is always also misused by militaries and deployed in inhumane ways, even when it is researched and developed in good conscience.
As mentioned above, the AI is being used by militaries, and in my opinion this is not merely a test lab but fully intentional, no matter by whom.
»The limits of warfare: Israel's war in the Gaza Strip as a test lab for artificial intelligence.
Military programs like "Gospel" and "Lavender" promise high efficiency on the offensive side, but they are controversial because of the high casualty numbers. […]«
Quds News Network:
"Israel's army kills the eldest daughter of professor Refat Al Areer, Shaima'a, along with her husband and newborn baby, in a strike on an apartment in Al Rimal neighbourhood in #Gaza"
The AI-assisted #genocide: with #Lavender the #zionists set their objective regardless of casualties, women, children, non-combatants.
They know exactly where and when there are children. They know it.
We now understand why permissive #licensing is bad for #FOSS.
#Redis taught us why the #GPL is important, and why #MIT, #Apache, #BSD, etc. allow corporations to enclose and steal our contributions.
#Israel's use of #Lavender for targeting in #Gaza, which may also use the code we donated to the commons, shows that we need to be more restrictive if we want to avoid assisting war crimes and probable #genocide.
I hope some lawyers are on this, and will help us add exclusions to protect from such use.
@patkane On point! Mechanised slaughter is also my impression when I read about #AI, and also #drones for war. And it's more awful than the industrialisation of war we have had since WWI. Imagine soldiers with joy(!)sticks, dissociated from reality as in a computer-game attack. Imagine war AI trained to be as racist and biased as generative AI ...
It is infuriating and certainly not funny.
Some citizens say that the army has to go periodically "mow the lawn". It reflects an awful colonial mentality.
Now the society has taken another step and decided on #ethnicCleansing.
A cynical Frenchman would choose an image of eradication: #Lavender.
What kind of developers are they who build software that selects people to be killed? Can you imagine working on such a team? Do they work with user stories on Kanban boards?
It’s horrifying that we have gotten so close to that seemingly far away future where AI kills actual humans.
SRF writes in an infobox: "Enormous efficiency gain through the use of AI"
Efficiency gain = #KI (AI) can identify more attack targets (than the Israeli military itself could).
The high collateral damage, a.k.a. civilian casualties, is not factored in there.
About an AI system called #Lavender, the magazine +972 writes:
"This was despite knowing that the system makes what are regarded as “errors” in approximately 10 percent of cases, (...)"
"For every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians; in the event that the target was a senior Hamas official with the rank of battalion or brigade commander, the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander."
The terrible human toll in Gaza has many causes.
A chilling investigation by +972 highlights efficiency:
An engineer: “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed.”
An AI outputs "100 targets a day". Like a factory, with murder as its deliverable:
"According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”"
"The third is “power targets,” which includes high-rises and residential towers in the heart of cities, and public buildings such as universities, banks, and government offices."
It was easier to locate the individuals in their private houses.
“We were not interested in killing operatives only when they were in a military building or engaged in a military activity. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
“The #protocol was that even if you don’t know for sure that the machine is right, you know that statistically it’s fine. So you go for it,” said a source who used #Lavender.
“It has proven itself,” said B., the senior officer. “There’s something about the statistical approach that sets you to a certain norm and standard. There has been an illogical amount of [bombings] in this operation. This is unparalleled, in my memory. And I have much more trust in a statistical mechanism than a soldier who lost a friend two days ago. Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”
Another intelligence source said: “In war, there is no time to incriminate every target. So you’re willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it.”