Alright, this is a good way to show why AI is so dangerous! I usually look at thumbnails before reading article titles, so when I read the title, my brain auto replaced butter with peanut butter.
That's almost as good as this one I got months back
I also got that article recommendation and had a good laugh.
Also unhelpful, since, if you look at my comment history, I was already aware of the Costco butter fiasco and had made a snarky comment about it.
Are you insinuating Google is reading your Lemmy comments and pushing you news recommendations based on those?
Considering they aren't even capable of removing recommendations for videos you literally just finished watching on YouTube, I doubt it.
I get that this is supposed to be a joke, but just because they can't get a bug fixed doesn't mean they aren't tracking you and recommending stuff based on your web activity.
Which is probably what they spend the most time and energy perfecting
If they have an Android-based phone, they probably are.
"Allergy butter" is my new name for Jif. Thanks.
it’s pronounced “jif”
Thanks Satan
This is when you turn off notifications from that app. Use the app when you want to, not when someone else or a machine decides for you.
AI creates fiction that sometimes intersects with reality, in the same way that Legends & Lattes has a few real-world things like coffee shops and lattes, but the things like orcs, ratkin, succubi, and magic that comprise the rest of the details are still currently fiction.
People just need to learn to assume LLMs are always writing fiction with a handful of details borrowed from real life.