tness16, to ai
@tness16@mastodon.social

How serious is the threat of AI-driven wars? | News

AI’s importance for the military is growing rapidly – even though there’s no international framework on its ethical use yet. Now China and the US have started bilateral talks on the matter.

https://www.youtube.com/watch?v=NGJ0I7-POeU

remixtures, to ai Portuguese
@remixtures@tldr.nettime.org

"A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as “Lavender,” unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine “as if it were a human decision.”

Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. (...) During the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes.

During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing — just to make sure the Lavender-marked target is male. This was despite knowing that the system makes what are regarded as “errors” in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all." https://www.972mag.com/lavender-ai-israeli-army-gaza/

alex, to random

Horrid new report from @972mag on "AI" systems called Lavender and "Where's Daddy?" used by the IDF. Lavender has a 10% error rate, and acceptable "collateral" ranged from 15 to 100 civilians per target.

This is sick and the future of AI warfare for US Empire.

https://www.972mag.com/lavender-ai-israeli-army-gaza/

NatureMC,
@NatureMC@mastodon.online

@alex "Their accounts (Israeli Intelligence Officers) were shared exclusively with the Guardian in advance of publication."

The Guardian, which partnered with +972 Magazine on the investigation, is now reporting on it further. https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai-database-hamas-airstrikes

remixtures, to ai Portuguese
@remixtures@tldr.nettime.org

"Less than four years after that milestone, America’s use of AI in warfare is no longer theoretical. In the past several weeks, computer vision algorithms that form part of the US Department of Defense’s flagship AI effort, Project Maven, have located rocket launchers in Yemen and surface vessels in the Red Sea, and helped narrow targets for strikes in Iraq and Syria, according to Schuyler Moore, the chief technology officer of US Central Command. The US isn’t the only country making this leap: Israel’s military has said it’s using AI to make targeting recommendations in Gaza, and Ukraine is employing AI software in its effort to turn back Russia's invasion.

Navigating AI’s transition from the laboratory into combat is one of the thorniest issues facing military leaders. Advocates for its rapid adoption are convinced that combat will soon take place at a speed faster than the human brain can follow. But technologists fret that the American military’s networks and data aren’t yet good enough to cope; frontline troops are reluctant to entrust their lives to software they aren’t sure works; and ethicists worry about the dystopian prospect of leaving potentially fatal decisions to machines. Meanwhile, some in Congress and hawkish think tanks are pushing the Pentagon to move faster, alarmed that the US could be falling behind China, which has a national strategy to become “the world’s primary AI innovation center” by 2030."

https://www.bloomberg.com/features/2024-ai-warfare-project-maven/

remixtures, to ai Portuguese
@remixtures@tldr.nettime.org

"OpenAI this week quietly deleted language expressly prohibiting the use of its technology for military purposes from its usage policy, which seeks to dictate how powerful and immensely popular tools like ChatGPT can be used.

Up until January 10, OpenAI’s “usage policies” page included a ban on “activity that has high risk of physical harm, including,” specifically, “weapons development” and “military and warfare.” That plainly worded prohibition against military applications would seemingly rule out any official, and extremely lucrative, use by the Department of Defense or any other state military. The new policy retains an injunction not to “use our service to harm yourself or others” and gives “develop or use weapons” as an example, but the blanket ban on “military and warfare” use has vanished.

The unannounced redaction is part of a major rewrite of the policy page, which the company said was intended to make the document “clearer” and “more readable,” and which includes many other substantial language and formatting changes."

https://theintercept.com/2024/01/12/open-ai-military-ban-chatgpt/

remixtures, to ai Portuguese
@remixtures@tldr.nettime.org

"One former intelligence officer explained that the Habsora system enables the army to run a “mass assassination factory,” in which the “emphasis is on quantity and not on quality.” A human eye “will go over the targets before each attack, but it need not spend a lot of time on them.” Since Israel estimates that there are approximately 30,000 Hamas members in Gaza, and they are all marked for death, the number of potential targets is enormous.

In 2019, the Israeli army created a new center aimed at using AI to accelerate target generation. “The Targets Administrative Division is a unit that includes hundreds of officers and soldiers, and is based on AI capabilities,” said former IDF Chief of Staff Aviv Kochavi in an in-depth interview with Ynet earlier this year."

https://www.972mag.com/mass-assassination-factory-israel-calculated-bombing-gaza/

remixtures, to ai Portuguese
@remixtures@tldr.nettime.org

"The IDF said that “through the rapid and automatic extraction of intelligence”, the Gospel produced targeting recommendations for its researchers “with the goal of a complete match between the recommendation of the machine and the identification carried out by a person”.

Multiple sources familiar with the IDF’s targeting processes confirmed the existence of the Gospel to +972/Local Call, saying it had been used to produce automated recommendations for attacking targets, such as the private homes of individuals suspected of being Hamas or Islamic Jihad operatives.

In recent years, the target division has helped the IDF build a database of what sources said was between 30,000 and 40,000 suspected militants. Systems such as the Gospel, they said, had played a critical role in building lists of individuals authorised to be assassinated."

https://www.theguardian.com/world/2023/dec/01/the-gospel-how-israel-uses-ai-to-select-bombing-targets

estelle, to random
@estelle@techhub.social

The terrible human toll in Gaza has many causes.
A chilling investigation by +972 highlights efficiency:

  1. An engineer: “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed.”

  2. An AI outputs "100 targets a day". Like a factory delivering murder:

"According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”"

  3. "The third is “power targets,” which includes high-rises and residential towers in the heart of cities, and public buildings such as universities, banks, and government offices."

🧶

estelle,
@estelle@techhub.social

The first AI war was in May 2021.

IDI stands for the Intelligence Division of the Israeli army. Here is some praise of its use of technology:

May 2021 "is the first time that the intelligence services have played such a transformative role at the tactical level.

This is the result of a strategic shift made by the IDI [in] recent years. Revisiting its role in military operations, it established a comprehensive, “one-stop-shop” intelligence war machine, gathering all relevant players in intelligence planning and direction, collection, processing and exploitation, analysis and production, and dissemination process (PCPAD)".

Avi Kalo: https://www.frost.com/frost-perspectives/ai-enhanced-military-intelligence-warfare-precedent-lessons-from-idfs-operation-guardian-of-the-walls/

(to be continued) 🧶

estelle,
@estelle@techhub.social

Behind any aircraft that takes off for an attack, there are thousands of soldiers, men and women, who make the information accessible to the pilot. "They produce the targets and make the targets accessible. To set a target, it’s a process with lots of factors that need to be approved. The achievement, the collateral damage and the level of accuracy. For that, you have to interconnect intelligence, (weapon) fire, C4I [an integrated military communications system, including the interaction of troops, intelligence and communication equipment] and more," said Nati Cohen, currently a reservist in the Exercises Division of the C4I Division of the army.

Published in 2021 in a security magazine: https://israeldefense.co.il/en/node/50155 @military

estelle,
@estelle@techhub.social

“Levy describes a system that has almost reached perfection. The political echelon wants to maintain the status quo, and the military provides it with legitimacy in exchange for funds and status.”

“Levy points out the gradual withdrawal of the old Ashkenazi middle class from the ranks of the combat forces[…]:
• the military’s complete reliance on technology as a decisive factor in warfare;
• the adoption of the concept […] of an army that is “small and lethal”;
• the obsession with the idea of , which is supposed to negate the other side’s will to fight; and
• the complete addiction to the status quo as the only possible and desirable state of affairs.”

https://www.972mag.com/yagil-levy-army-middle-class/ @israel @ethics @military @idf

estelle,
@estelle@techhub.social

It was easier to locate the individuals in their private houses.

“We were not interested in killing operatives only when they were in a military building or engaged in a military activity. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

Yuval Abraham reports: https://www.972mag.com/lavender-ai-israeli-army-gaza/

(to follow) 🧶 @palestine @israel @ethics @military @idf @terrorism
