this post was submitted on 02 Oct 2023
275 points (97.6% liked)

[–] [email protected] 100 points 1 year ago (3 children)

This is neither novel nor morally acceptable. People who do this work usually end up traumatized for life because of the fucked up shit they often have to look at. Prisoners are not really in a position to negotiate, meaning you can push this work on them in a sort of nonconsensual way that falls below what modern society should strive for.

[–] [email protected] 34 points 1 year ago (1 children)

If you actually read the article, you'll see she's parsing real estate news articles.

Most AI jobs do not involve CP.

[–] [email protected] 8 points 1 year ago* (last edited 1 year ago)

Rule number 1 on Reddit is: “never read the article”

I guess that still applies here.

Rule 2: “disagree with everyone”

Rule 3: “You’re always right”

Rule 4: “everyone else is always wrong”

I’m sure there are lots of other rules, but that should get anyone started on modern social media.

[–] [email protected] 32 points 1 year ago (2 children)

Training an AI is not traumatizing. What you're thinking of is moderating public networks.

[–] [email protected] 13 points 1 year ago

Unfortunately, one major sector of image machine learning is CSAM scanning, which was also recently revealed as one of the major funding parties behind the planned EU legislation intended to allow scanning of all private communication. But generally I agree: most of what they will see might not be too bad by itself, but it's still a job no human really wants to do of their own free will. Those who do decide to do it either lack other options or don't know what they are getting themselves into.

[–] [email protected] 8 points 1 year ago

It depends.

Around the world, millions of so-called “clickworkers” train artificial intelligence models, teaching machines the difference between pedestrians and palm trees, or what combination of words describe violence or sexual abuse.

[–] [email protected] 13 points 1 year ago (1 children)

Prisoners are not really in a position to negotiate, meaning you can push this work on them in a sort of nonconsensual way that falls below what modern society should strive for

Well, the article does mention that the prisoner "Marmalade" was not forced to do any of this.
In fact, it mentions that she could instead have spent her time in her cell, doing online courses, or doing chores for the prison for a little cash. The fact that Wired managed to simply book an interview with the prisoner also makes it quite risky for the company to subject prisoners to any traumatizing material.

The only problem I really see with this is that it doesn't really prepare prisoners for life outside prison in any way.