this post was submitted on 14 Jan 2024
264 points (95.5% liked)
Technology
Not that I'm really interested in one, but what's actually wrong with making an AI gf app?
It encourages the dehumanization of women and gives men even more unrealistic expectations about relationships and sex. But if they take themselves out of the gene pool this way, then it could end up being a win.
As if dehumanization of men wasn't just as bad.
You know, I recently saw a pic posted somewhere saying something like she doesn't like bodybuilders and unrealistically cool guys; the ones she likes are absolutely normal and casual, like the guys in the picture.
And the guys in the picture are Hollywood actors, LOL, in very good shape, with no signs of sleep deprivation or tiredness, with winning smiles, and the photos are likely edited on top of that.
And the totally realistic and normal expectation many women have of men is that if a woman has a moment of weakness and pain, then it's a part of her personality to be proud of, but if a man has one, then he should accept being dumped over that moment alone, as a man.
I actually think it absolutely mirrors the dehumanization of women. All the same things.
You are losing sight of the discussion by framing it as a "men vs women" thing. This will also feed into the dehumanization of men, because it will also generate "ideal", impossible men.
N-nah. But if we get back to the root of this discussion - I've read lots of fanfiction in my life. Mostly written by girls for girls. Tons of imagined, idealized men right there.
And as for imagined, idealized women - men write fanfiction (and other fiction) too.
So I just don't see how such bots are bad, except that they are not real.
The difference is that, as far as fanfic goes, you cannot escape the fact that they aren't real; it's static text on a screen, and even people roleplaying are liable to get a "dude wtf" response if they have no notion of what appropriate expectations and behavior are. But an AI will go along with whatever it's told to and try to appear like a person while doing it. It will validate and reward even the user's wildest expectations.
If there are people so lost in their fantasies that they will convince themselves they are in love with some scripted, basic visual novel character, imagine what AI bots will do to them.
To be fair, I don't think this is "downfall of society" material, but I think it's a given that some people will go absolutely nuts because of them, and it might affect how they treat real human beings around them. The internet has enough unhinged people even when they are capable of interacting with each other. Imagine when we are dealing with people whose main practice of conversation is getting sexted by AI bots they treat like trash.
It's even easier to lose yourself in fantasies over a real woman who differs just a bit from what you imagine, and that little difference changes everything.
Yes.
I have a good imagination, so I didn't need any bots to go down that path.
Over time they'll realize that it's more enjoyable to get a pat on the head from a real woman you strongly like than to get all the kinds of sexual talk an LLM bot can produce.
You know, I've noticed over my 40+ years that the vast majority of men are unattractive. Men like to rate women on a scale, but I just do a yes or no, and 97% are a no. But they still get girlfriends, get married, and have kids. Ignore the women who care about looks, because they seem to be a tiny minority.
I didn't have to do that anyway; my problems in this area result mostly from my own mistakes, but one can't just abruptly stop making them.
Though I think I actually got something right: after the dust settles, I still rather like (as people) everybody for whom I felt something.
They could train it however they want; it wouldn't have to be dehumanizing (though admittedly it probably wouldn't be as successful). Hell, maybe they could disguise a therapy AI as a gf AI and trick them into getting their shit together.
Side question: how do you feel about romance movies/novels that give unrealistic expectations of men? Should those be banned as well?
I never said it should be banned, just that I don't like it.
People will train it in all kinds of ways. Let's take sex out of the equation and say a Nazi trains a home automation/personal assistant AI as a house removed bot. Still cool?
I’m not sure there’s anything wrong with it; that’s just what the article reported on, as though it were some sort of harbinger of doom… It felt like my smarmy retorts would be slightly less punchy if I had opened up a side discussion regarding appropriate uses for AI. I suppose part of my motivation was that it seemed incredibly innocent, relatively speaking.
OpenAI claims to be in this to save humanity from Skynet; this seems like a fairly pathetic attempt to keep their store from filling up with “disreputable” content before… what exactly, I don’t know. The killer app for AI that would be magically devoid of controversy?
Are people really wargaming this? Planning on making anti-skynets to defend humanity from skynets? I can't decide if that is a massive waste of time or a vital use of it.
I’m not an authority on the subject, but that was my understanding from the reporting surrounding OpenAI’s recent kerfuffle: that their complex management structure was part of some elaborate strategy to promote the development of ethical AI.
Sounded a bit sus to me, but clearly smarter folks think it’s a good way to spend money.