this post was submitted on 18 Jul 2024
480 points (96.5% liked)
I really have a hard time deciding whether this is the scandal the article makes it out to be (although there is some backpedaling going on). The crucial point is: 8% of the decisions turn out to be wrong or misjudged. The article seems to want us to think that the use of the algorithm is to blame. Yet, is it? Is there evidence that a human would have judged those cases differently? Is there evidence that the algorithm does a worse job than humans? If not, then the article devolves into blatant fear mongering, and the message turns from "algorithm is to blame for deaths" into "algorithm unable to predict the future in 100% of cases", which of course it can't...
Could a human have judged it better? Maybe not. I think a better question to ask is, "Should anyone be sent back into a violent domestic situation with no additional protection, no matter the calculated risk?" And as someone who has been on the receiving end of that conversation and later narrowly escaped a total-family-annihilation situation, I would say no... no one should be told that, even though they were in a terrifying, life-threatening situation, they will not be provided protection, and no further steps will be taken to keep them from being injured again, or from being killed next time. But even without algorithms, that happens constantly... the only thing the algorithm accomplishes is that the investigator / social worker / etc. doesn't have to have any kind of personal connection with the victim, so they don't have to feel any kind of way about giving an innocent person a death sentence, because they were just doing what the computer told them to.
Final thought: When you pair this practice with the ongoing conversation around the legality of women seeking divorce without their husband's consent, you have a terrifying and consistently deadly situation.
Yep. The ones who manage to slip notes to their veterinarian to help them get away are the exception.
Reading stuff like this makes me sick. All is not well with the world.