A new tool lets artists add invisible changes to the pixels in their art before they upload it online so that if it’s scraped into an AI training set, it can cause the resulting model to break in chaotic and unpredictable ways.

The tool, called Nightshade, is intended as a way to fight back against AI companies that use artists’ work to train their models without the creator’s permission.
[...]
Zhao’s team also developed Glaze, a tool that allows artists to “mask” their own personal style to prevent it from being scraped by AI companies. It works in a similar way to Nightshade: by changing the pixels of images in subtle ways that are invisible to the human eye but manipulate machine-learning models to interpret the image as something different from what it actually shows.
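
Neither the excerpt nor the thread spells out the actual Glaze or Nightshade algorithms, but the mechanism described, small pixel changes invisible to a human that steer a model toward interpreting the image as something else, is the classic adversarial-perturbation idea. Here is a minimal sketch of that general technique, assuming a stock PyTorch ImageNet classifier as a stand-in model; the `perturb` function, the `eps` budget, and the target label are illustrative, not Nightshade's published method:

```python
# Sketch of an imperceptible, targeted pixel perturbation (FGSM-style).
# NOT Nightshade's or Glaze's actual algorithm, which the excerpt does
# not detail; this only illustrates the general mechanism it describes.
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

model = resnet18(weights="IMAGENET1K_V1").eval()

def perturb(image: torch.Tensor, wrong_label: int, eps: float = 2 / 255) -> torch.Tensor:
    """Nudge `image` (shape 1x3xHxW, values in [0, 1]) toward being
    classified as `wrong_label`, changing each pixel by at most `eps`."""
    image = image.detach().clone().requires_grad_(True)
    # Loss measures how far the model is from the *wrong* label.
    loss = F.cross_entropy(model(image), torch.tensor([wrong_label]))
    loss.backward()
    # Step against the gradient so the prediction moves toward the
    # wrong label; clamp to keep valid pixel values.
    poisoned = image - eps * image.grad.sign()
    return poisoned.clamp(0, 1).detach()

# Usage (hypothetical): x = torch.rand(1, 3, 224, 224)
# x_poisoned = perturb(x, wrong_label=283)  # 283 = "Persian cat"
```

Bounding the change to a small `eps` per pixel is what keeps the edit invisible to the eye while still shifting the model's output, which matches the article's description of "subtle" changes that manipulate how the model interprets the image.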

[–] [email protected] 1 points 1 year ago (4 children)

I know how it's used in a non-psychiatric way; I brought up that it can be used colloquially. That doesn't diminish the way it can be used to harm and stigmatize an already stigmatized group of people. There are other terms that could be used, but this one is used because people want to humanize AI and do not care about dehumanizing people who have psychotic disorders.

The fact of the matter remains that AI creators are not people who specialize in human brains, yet they act as if computers and human brains are one and the same. Similarity doesn't mean the processes are the same. They could choose different language, but they do not. They could call it a processing error, a glitch, a distortion. All would be accurate, but no, they chose a term that is harmful to a minority group, because no one cares about stigmatizing them.

[–] [email protected] 1 points 1 year ago (3 children)

Look at 2 and 3: https://www.merriam-webster.com/dictionary/hallucination

And I just do not see how that can stigmatize a group of people. It is like saying that the use of the word "headache" in non-medical contexts (e.g., "this homework is a headache") stigmatizes people with migraines. It just does not.

[–] [email protected] 1 points 1 year ago (2 children)

Listen, I live in a state where, when someone commits a violent crime, before they even catch the person the police say, "he was hallucinating, they were hearing voices." In other words, mental illness is blamed, as a way to take away more rights. Also, in this state, if you are in a conservatorship for mental illness you are legally barred from voting. How can you say hallucination is not a loaded term? It is different from headache because people are not stigmatized for migraines. No one is taking away your voting rights over migraines. No one is calling you a murderer over migraines.
