this post was submitted on 03 Apr 2024

In 2021, a book titled “The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World” was released in English under the pen name “Brigadier General Y.S.” In it, the author — a man who we confirmed to be the current commander of the elite Israeli intelligence unit 8200 — makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential “targets” for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a “human bottleneck for both locating the new targets and decision-making to approve the targets.”

Such a machine, it turns out, actually exists. A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as “Lavender,” unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine “as if it were a human decision.”

[email protected] · 7 months ago

There appear to be a number of systems here.

The previously reported AI system "The Gospel" tracked buildings presumed to contain targets. "Lavender," the newly revealed AI system, instead tracked individuals whose characteristics resembled those of known targets. Among other mistakes, it tended to flag non-combatant civil servants of the Hamas government, yet based on a manually checked sample the army treated it as roughly 90% accurate. The threshold for what counted as a valid target shifted from day to day, depending on how many targets higher command wanted. A third system, "Where's Daddy?", tracked the individuals fed into it so that strikes could be timed to find and kill them at home, along with their families, children, and any other uninvolved families who happened to be present. For the IDF, this appears to have been a matter of convenience.

Intelligence personnel tended to simply copy and paste the (90% accurate) Lavender output directly into the "Where's Daddy?" system. Typically the only check before strike authorization was confirming the target was male; this ignored the fact that most of the other people in the targeted building would be women and children, and in some instances the intended target had already fled or left the area by the time the strike occurred.


The Nazi Party kept better records and had more oversight of their systematic genocide campaign 80 years ago.