this post was submitted on 07 Mar 2024
486 points (97.5% liked)


Trust in AI technology and the companies that develop it is dropping, in both the U.S. and around the world, according to new data from Edelman shared first with Axios.

Why it matters: The move comes as regulators around the world are deciding what rules should apply to the fast-growing industry. "Trust is the currency of the AI era, yet, as it stands, our innovation account is dangerously overdrawn," Edelman global technology chair Justin Westcott told Axios in an email. "Companies must move beyond the mere mechanics of AI to address its true cost and value — the 'why' and 'for whom.'"

[–] [email protected] 6 points 9 months ago (3 children)

LLMs absolutely should not be used for things like customer support; that's the easiest way to give customers wrong info and aggravate them. For reviewing documents, LLMs have been abysmally bad.

For grammar it can be useful, but what it is actually best at is, for example, biochemistry, for things like molecular analysis and predicting protein structures.

I work in an office job that has tried to incorporate AI, but so far it has been a miserable failure except for analysing trends in statistics.
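For what it's worth, the kind of "trend analysis" that works reliably in an office setting doesn't need an LLM at all. A minimal sketch, with made-up monthly figures, of fitting a trend line by ordinary least squares:

```python
# Hypothetical illustration: classical trend analysis without any LLM.
# Fit a least-squares trend line over twelve months of made-up figures.
import numpy as np

months = np.arange(12)  # 0..11
sales = np.array([10, 12, 11, 14, 15, 15,
                  17, 18, 20, 19, 22, 23], dtype=float)

# Fit y = slope * x + intercept via ordinary least squares.
slope, intercept = np.polyfit(months, sales, deg=1)
print(f"trend: {slope:+.2f} units/month, baseline {intercept:.2f}")
```

A deterministic fit like this is auditable and never hallucinates, which is exactly the property the comment above is asking for.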

[–] [email protected] 2 points 9 months ago (1 children)

I agree about customer support, but in the end it's going to come down to the number of cases like this and how much they cost, versus the cost of a room full of paid employees answering them.

It's going to take actual laws forbidding it to make them stop.

[–] [email protected] 5 points 9 months ago

Oh, yeah, of course companies will take advantage of this to replace a ton of people with a zero-cost alternative. I'm just saying that's not where it should be used, as it's terrible at those tasks.

[–] [email protected] 2 points 9 months ago (1 children)

An LLM is terrible for molecular analysis; other kinds of AI can be used, but not an LLM.

[–] [email protected] -2 points 9 months ago (1 children)

AI doesn't exist currently; "AI" is just what LLMs are currently called. Also, they have been successfully used for this and show great promise so far, unlike the hallucinating chatbots.

[–] [email protected] 3 points 9 months ago (1 children)

AGI (Artificial General Intelligence) doesn't exist; that is what people imagine from sci-fi, like Data or HAL. LLMs (Large Language Models) like ChatGPT are the hallucinating chatbots; they are just more convincing than the previous generations. There are lots of other AI models that have been used for years to solve large-data problems.
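To make the distinction concrete: a sketch of one of those "other AI models" that long predates LLMs, namely k-means clustering, written in plain NumPy on synthetic data (all numbers here are made up for illustration):

```python
# Hypothetical illustration: k-means clustering, a decades-old non-LLM
# AI technique for structuring large data, sketched in plain NumPy.
import numpy as np

rng = np.random.default_rng(0)
# Two synthetic 2-D blobs: one around (0, 0), one around (5, 5).
data = np.vstack([rng.normal(0, 0.5, (50, 2)),
                  rng.normal(5, 0.5, (50, 2))])

# Deterministic init: one seed point taken from each blob.
centers = np.array([data[0], data[50]])
for _ in range(10):
    # Assign each point to its nearest center, then recompute centers.
    dists = np.linalg.norm(data[:, None] - centers, axis=2)
    labels = np.argmin(dists, axis=1)
    centers = np.array([data[labels == k].mean(axis=0) for k in range(2)])

print(np.round(np.sort(centers[:, 0]), 1))
```

Nothing here generates text or hallucinates; it just partitions data, which is the sort of statistical workhorse the comment is pointing at.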

[–] [email protected] 1 points 9 months ago

Pretty much everything Google gives me says they are using deep-learning LLMs in biology.