I was just talking with a friend who is a software dev (I'm a Linux engineer, so I do software as part of my job, just not as my main focus), and we were commiserating about how 75-80% of the world doesn't understand that "AI" just regurgitates information it has collected. It isn't Jarvis or Skynet, and it doesn't think for itself.
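To illustrate what I mean by "regurgitating," here's a toy sketch (my own illustration, not how any real model is actually built, and the sample corpus is made up): a bigram model that can only emit word sequences it has statistically absorbed from its training text. Real LLMs are enormously bigger next-token predictors with far more context, but the basic point stands: the output is driven by patterns in collected data, not by independent reasoning.

```python
# Toy bigram "language model": it can only re-emit word patterns it has
# already seen in its training text. Purely illustrative.
import random
from collections import defaultdict

# Placeholder training text (made up for this example).
corpus = "the model predicts the next word the model saw before".split()

# Count which words follow which in the training text.
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def generate(start: str, length: int = 8) -> str:
    """Sample a continuation purely from observed word-to-word statistics."""
    word, out = start, [start]
    for _ in range(length):
        options = follows.get(word)
        if not options:  # nothing ever followed this word in the corpus
            break
        word = random.choice(options)
        out.append(word)
    return " ".join(out)

print(generate("the"))
```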
I agree that the term "sexual abuse" is misleading; "sexual exploitation" fits better. I also agree that it's fundamentally no different than face swapping, but the difference now is that it's far easier for the general public to do than it was 5 or 10 years ago. It's also pretty fucked that a fake image of you could put you in hot water years down the road, and you have zero control over it.
While I definitely hate the "AI bubble" that has grown tremendously over the past 2-3 years, we need to figure out how to place limits on it before shit really gets out of hand in another year or two. The problem is that anyone who knows anything about this stuff doesn't work in or for the government. The woman in the article who said this needs to be regulated at every point doesn't work in tech, of course; she works for some rights organization 🤦♂️