this post was submitted on 12 Nov 2024
1059 points (96.6% liked)

Technology


This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; to ask if your bot can be added, please contact us.
  9. Check for duplicates before posting; duplicates may be removed.

founded 2 years ago
[–] [email protected] 253 points 1 month ago (8 children)

Is "dragged" the new "slammed"?

[–] [email protected] 67 points 1 month ago (1 children)

Gen Z journalism entered the chat?

[–] [email protected] 87 points 1 month ago (3 children)

Reporters threw Elon Musk off Hell In A Cell, and plummeted 16 ft through an announcer's table after his chatbot admitted he spread lies.

[–] [email protected] 39 points 1 month ago* (last edited 1 month ago) (1 children)

Holy fuck. I miss shittymorph just for his creative responses using this.

[–] [email protected] 26 points 1 month ago

Yeah man. Those were the good ol’ days, when X was called Twitter lol. Musk was absolutely spreading misinformation when it was still called Twitter too, before he owned it. I remember when he started talking complete rubbish about Dogecoin, making its price oscillate all over the place that whole week. One of his fanboys bought in…like hard. A 30-something year old, and he put his whole life savings into Doge at its peak, only to lose it all the night it was revealed that in 1998, The Undertaker threw Mankind off Hell In A Cell, and plummeted 16 ft through an announcer's table.

[–] [email protected] 14 points 1 month ago (4 children)

Chat, is Elon cooked? No cap?

[–] [email protected] 21 points 1 month ago

Where I'm from, "dragged" means to be removed against your will.

You know, like "the pitcher got dragged after the first inning".

[–] [email protected] 19 points 1 month ago

It's a refreshing change of pace

[–] [email protected] 14 points 1 month ago (2 children)

I was hoping a horse was involved.

[–] [email protected] 13 points 1 month ago* (last edited 1 month ago) (4 children)

Yeah, you know, like “Dragon Deez”

[–] [email protected] 11 points 1 month ago (5 children)

I feel like "dragged" predates "slammed" as slang, but it definitely wasn't popular headline material

[–] [email protected] 156 points 1 month ago (40 children)

Implying he gives a shit. The thing about people who lack any empathy is they're immune to embarrassment even when they're the most embarrassing human on the planet.

[–] [email protected] 69 points 1 month ago (5 children)

misinformation? just call it lies. reads easier and just as accurate.

[–] [email protected] 36 points 1 month ago (3 children)

Even more accurately: it's bullshit.

"Lie" implies that the person knows the truth and is deliberately saying something that conflicts with it. However, the sort of people who spread misinfo don't really care about what's true or false; they only care about what reinforces their claims.

[–] [email protected] 56 points 1 month ago (2 children)

Chatbots can't "admit" things. They regurgitate text that just happens to be information a lot of the time.

That said, the irony is ironclad.

[–] [email protected] 56 points 1 month ago (9 children)

The ultra powerful see us as NPCs, and nothing more.

Your anger is barely a pop up window on the game they're playing.

[–] [email protected] 32 points 1 month ago (16 children)

Well then they will have to train their AI with incorrect information... politically incorrect, scientifically incorrect, etc.... which renders the outputs useless.

Scientifically accurate and as close to the truth as possible never equals conservative talking points... because they are scientifically wrong.

[–] [email protected] 23 points 1 month ago (13 children)

And we have to ask ourselves WHY he'd want to spread misinformation. What is he trying to do?

[–] [email protected] 22 points 1 month ago

He lies to assert power. In his company, yes-men say yes because he pays their checks. To the rest of us he generally looks like a loon.

It's obvious even to a daft AI.

[–] [email protected] 20 points 1 month ago (2 children)

In Texas, we call this lying... I don't know when the goalposts got moved, but these parasites have always been lying to us peons.

Why do peasants accept or listen to these clowns? They are your enemy, treat them as such.

But now... pleb has his daddy who is good, and other pleb's daddy is bad 🤡

"me daddy strong, me daddy kick ur daddy ass"

ADULT FUCKING PEOPLE IN 2024

[–] [email protected] 18 points 1 month ago (3 children)

> I don't know when the goal post got moved

January 22nd, 2017, when Kellyanne Conway used the term "alternative facts".

[–] [email protected] 15 points 1 month ago* (last edited 1 month ago) (6 children)

This is an article about a tweet with a screenshot of an LLM prompt and response. This is rock fucking bottom content generation. Look, I can do this too:

Headline: ChatGPT criticizes OpenAI

[–] [email protected] 11 points 1 month ago* (last edited 1 month ago) (7 children)

To add to this:

All LLMs absolutely have a sycophancy bias. It's what the model is built to do. Even wildly unhinged local ones tend to 'agree' or hedge, generally speaking, if they have any instruction tuning.

Base models can be better in this respect, as their only goal is ostensibly "complete this paragraph", like a naive improv actor, but even that's kinda diminished now because so much ChatGPT is leaking into training data. And users aren't exposed to base models unless they are local LLM nerds.

[–] [email protected] 15 points 1 month ago (1 children)

Come on guys, this was clearly the work of the Demtards hacking his AI and making it call him names. We all know his superior intellect will totally save the world and make it a better place, you just gotta let him go completely unchecked to do it.

/s

[–] [email protected] 14 points 1 month ago (2 children)

Damn, that's hard. And Melon Husk will soon be the new Chief of NASA!

[–] [email protected] 13 points 1 month ago (4 children)

Actually they made a new department of "Government Oversight" for him...

Which sounds scummy, but it's basically just a department that looks for places to cut the budget and reduce waste... not a bad idea, except it's Right Wingers running it, so "Food" would be an example of frivolous spending and "Planes that don't fly" would be what they're looking to keep the cash flowing on
