this post was submitted on 05 Aug 2024
467 points (96.8% liked)

Technology

you are viewing a single comment's thread
[–] [email protected] 6 points 3 months ago (1 children)

I am not equating humans with computers. These businesses are not selling people's data when doing AI training (unlike actual data brokers). You can't say something AI-generated is a clone of the original any more than you can say parody is.

[–] [email protected] -1 points 3 months ago* (last edited 3 months ago) (1 children)

I absolutely can. Parody is an art form, something that can only be created by human beings. AI is an art-laundering service, not an artist.

The law should reflect that these companies must first be granted permission by rights holders to use their work in datasets, and that Creative Commons licensors need an opportunity to opt out of being crawled for those datasets. Anything else is wrong. Machines are not humans, and Creative Commons licensing was not written with the concept of machines as "consumers." These companies took advantage of the sudden emergence of these models and of the law's delay in holding their hunger for data in check. They need to be held accountable for their theft.

[–] [email protected] 4 points 3 months ago* (last edited 3 months ago)

There are already anti-AI licenses out there. If you didn't license your work with that in mind, that's on you. Deep learning models have been around far longer than GPT-3 or anything in the current news cycle, and they have needed training data that whole time. It was predictable that something like this would happen eventually, and if you didn't notice in time, it's because you haven't been paying attention.

You don't get to dictate what's right and wrong. As far as I'm concerned, all copyright is wrong and dumb, but the law is what the law is. Obviously not everyone shares my opinion, and not everyone shares yours.

Whether an artist is involved or not, it's still a transformative use.