this post was submitted on 07 Oct 2023
342 points (96.5% liked)

Technology

[–] [email protected] 68 points 1 year ago (7 children)

Doubt. These large language models can't produce anything outside their training data; everything they do is derivative, pretty much by definition. They can mix and match things they were trained on, but at the end of the day they're text predictors, like an advanced version of the autocomplete on your phone. If the information needed to solve your problem isn't in their dataset, they can't help, just like all those cheap Indian call centers operating off a script. It's just a bigger script. They'll still need people for the outlier problems. All this does is add another layer of annoying, unhelpful bullshit between a person with a problem and the person who can actually help them, which just makes people more pissed off and abusive. At best it's an upgrade to their shit automated call systems.
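The "advanced autocomplete" framing above can be sketched in a few lines. This is a toy bigram predictor over a made-up corpus, not how production LLMs work (those learn billions of parameters over token sequences), but it illustrates the commenter's point: a pure next-token predictor has nothing to say about inputs it never saw in training.

```python
# Toy next-token prediction: count which word follows each word in a
# tiny hand-built corpus, then "autocomplete" by picking the most
# frequent successor. The corpus and names here are hypothetical.
from collections import Counter

corpus = "the cat sat on the mat the cat ate the fish".split()

# Build a bigram table: word -> Counter of words seen after it.
bigrams = {}
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams.setdefault(prev, Counter())[nxt] += 1

def predict_next(word):
    """Return the most frequent next word from training, or None."""
    counts = bigrams.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" (seen twice after "the")
print(predict_next("dog"))  # None: "dog" is outside the training data
```

The `None` case is the crux of the argument: a model like this can only remix what it was trained on, though whether scaled-up LLMs are meaningfully more than this is exactly what the replies below dispute.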

[–] [email protected] 26 points 1 year ago

Most call centers have multi-level teams, where the lower tiers are just reading off a script and make up the majority of staff. You don't have to replace every single one to implement AI. It's going to be the same for a lot of other jobs as well, and many people will lose theirs.

[–] [email protected] 9 points 1 year ago (2 children)

I know how AI works under the hood. AI isn't going to completely replace such things, yes, but it will also be the end of said cheap Indian call centers.

[–] [email protected] 7 points 1 year ago

Who also don’t have the information or data that I need.

[–] [email protected] 1 points 1 year ago

It isn't going to completely replace whole business departments, only 90% of them, right now.

In five years it's going to be 100%.

[–] [email protected] 8 points 1 year ago

I'd say at best it's an upgrade to scripted customer service. A lot of the scripted services are slower than AI, and agents with stronger accents can make it harder for the customer to understand the script entry being read back to them, leading to more frustration.

If your problem falls outside the realm of the script, I just hope it recognises that the script isn't solving the issue and redirects you to a human. Often I've noticed ChatGPT not learning from the current conversation (if you ask it about this, it will deny doing it). In that scenario it just regurgitates the same three scripts back to me when I tell it it's wrong. For me that isn't so bad, since I can just turn to a search engine, but in a customer service scenario it would be extremely frustrating.
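The hoped-for behaviour above, escalating when the bot keeps regurgitating the same scripted answers, can be sketched as a simple routing wrapper. Everything here (function names, the escalation marker, the repeat threshold) is hypothetical, just one way such a guardrail could be bolted onto a support bot.

```python
# Sketch of an escalation guardrail: if the bot produces the same reply
# too many times in one conversation, hand off to a human instead of
# looping. All names and thresholds here are made up for illustration.
def route_reply(bot_reply, history, max_repeats=2):
    """Return the bot's reply, or an escalation marker once the bot has
    already given this exact reply max_repeats times in this session."""
    repeats = history.count(bot_reply)
    history.append(bot_reply)
    if repeats >= max_repeats:
        return "ESCALATE_TO_HUMAN"
    return bot_reply

history = []
print(route_reply("Try restarting your router.", history))  # scripted answer
print(route_reply("Try restarting your router.", history))  # same answer again
print(route_reply("Try restarting your router.", history))  # third time: hand off
```

Exact-match counting is deliberately crude; a real system would compare replies semantically and also watch for user-side frustration signals, but the principle (detect the loop, stop wasting the customer's time) is the same.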

[–] [email protected] 7 points 1 year ago

Check out this recent paper that finds some evidence that LLMs aren't just stochastic parrots. They actually develop internal models of things.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)

This isn't true, provided the training set is large enough. The models are stochastic, and with enough parameters and a large enough training set they can generate genuinely novel content. For example, I strongly doubt you'd be able to find anything remotely resembling the following anywhere, ever (look up what the movie is about, and watch it, to understand the absurdity of my request), and yet it was generated by ChatGPT:

https://chat.openai.com/share/803f2633-8682-45f0-b999-3bede5c02c21

If you read interviews from the development of these models, you'll see the creators saying what is clear from the link above: with a large enough training set, these models start to learn something about the organization of language itself, and how to generate novel content.

The model architecture these things are based on is loosely inspired by how our brains work, and the process by which they learn language isn't unlike how we learn it.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

Your description of AI limitations sounds a lot like the human limitations of the reps we deal with every day. Sure, if an outlier situation comes up it has to go to a human, but let's be honest: those calls are usually going to a manager anyway, so I'm not seeing your argument. An escalation is an escalation. The article itself says this isn't a literal 100% replacement of humans.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago)

You can doubt it all you want; the fact of the matter is that AI is demonstrably capable of taking over human roles in many areas of work, and it already does.