
AI girlfriend bots are already flooding OpenAI's GPT store: OpenAI's store rules are already being broken, illustrating that GPTs could be hard to regulate

[–] [email protected] 8 points 10 months ago (2 children)

If we get wiped out by AI girlfriends, we deserve it. If the only reason a person never reproduced is that they had a chatbot, they really should not reproduce.

[–] [email protected] 7 points 10 months ago (2 children)

I was trying to dream up a justification for this rule that wasn't about mitigating the ick factor, and fell short… I guess if the machines learn how to beguile us by forming relationships, they could be used to manipulate people, honeypot style?

Honestly, the only point I set out to make was that people had probably been working on virtual girlfriends for weeks (months?) before they were banned. They had probably already been submitted to the store, and the article was trying to drum up panic.

[–] [email protected] 3 points 10 months ago

Sure, which, you know, we can already do. Honeypots are a thing, and a thing so old the Bible mentions them. Delilah, anyone? It isn't that cough...hard...cough to pretend to be interested enough in a guy to make him fall for you. Sure, if the tech keeps growing, which it will, you can imagine more and more complex cons. Stuff that could even have webcam chats with the marks.

I suggest we treat this the same way we currently treat humans doing it: warn users, block offending accounts, and criminally prosecute.

[–] [email protected] 1 points 10 months ago* (last edited 10 months ago)

It's a hard question to answer. There is a good reason, but it's several paragraphs long, and I likely have gaps in my knowledge and am misguided in places. The reduced idea: it's about being emotionally open (no emotional guarding or sandboxing/RPing) with a creature that lacks many of the traits required to take on that responsibility. The model is pretrained to perform gestures that make us happy, with no internal state to ask itself whether it would enjoy garlic bread given its experience with garlic. It's an advanced tape recorder, pre-populated with an answer. Or it lies and picks something, because saying "I don't know" is the wrong response. As opposed to a creature that has some kind of consistent external world and a memory system. Firehosing it with data means less room for artistic intent.

If you're sandboxing/roleplaying, there's no problem.

[–] [email protected] 2 points 10 months ago* (last edited 10 months ago)

Interesting idea. We could effectively practice eugenics in a way that won't make people so mad. They'll have to contend with ideas like free will and personal responsibility before they can go after our program.

Let's make a list of all the "asocials" we want removed from the gene pool and we can get started.