this post was submitted on 06 Dec 2023
253 points (95.7% liked)
Technology
So a Board member wrote a paper about putting safety above profit in AI development. Sam Altman did not take kindly to that and started pushing to fire her (and may or may not have lied to other Board members to split them up in the process). Sam then got fired for trying to fire someone for putting safety over profit. Everything exploded, and now profit is firmly at the head of the table.
I like nothing about this version of events either.
I feel like this isn't surprising, knowing about all the other stuff Altman has done. Seems like yet another loss for the greater good in the name of profit.
Now what would the company do if the AI model started putting safety above profit (i.e., refusing to lie to the user for profit, thereby reducing market value)? How fucked are we if they create an AGI that puts profit above safety?
Entirely. We all die. The light cone is turned into the maximum amount of "profit" possible.
This is still better than a torment maximizer, which may come as some comfort to the tiny dollar bills made of the atoms that used to be you.