this post was submitted on 13 Nov 2024
624 points (95.0% liked)

Technology
[–] [email protected] 10 points 1 day ago (14 children)

As I use Copilot to write software, I have a hard time seeing how it'll get better than it already is. The fundamental problem of all machine learning is that the training data has to be good enough to solve the problem. So the problems I run into make sense, like:

  1. Copilot can't read my mind and figure out what I'm trying to do.
  2. I'm working on an uncommon problem where the typical solutions don't work.
  3. Copilot is unable to tell when it doesn't "know" the answer, because of course it's just simulating communication and doesn't really know anything.

Problems 2 and 3 could be alleviated, though probably not solved completely, with more and better data or engineering changes. But AI developers obviously started by training the models on the most useful data and the strategies they thought would work best. Problem 1 seems fundamentally unsolvable.

I think there could be some more advances in finding more and better use cases, but I'm a pessimist when it comes to any serious advances in the underlying technology.

[–] [email protected] 1 points 20 hours ago (8 children)

So you use other people's open source code without crediting the authors or respecting their license conditions? Good for you, parasite.

[–] [email protected] 1 points 18 hours ago (1 children)

Ahh right, so when I use Copilot to autocomplete the creation of more tests in exactly the same style as the tests I manually created with my own conscious thought, you're saying that it's really just copying what someone else wrote? If you really believe that, then you clearly don't understand how LLMs work.
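For concreteness, this is the kind of completion being described: a hand-written test establishing a local pattern, which an autocompleter then continues with new cases in the same style. The function and test names here are invented for illustration, not taken from any real project.

```python
def slugify(title: str) -> str:
    """Toy function under test: trim, lowercase, replace spaces with hyphens."""
    return title.strip().lower().replace(" ", "-")


# Written by hand, establishing the pattern:
def test_slugify_basic():
    assert slugify("Hello World") == "hello-world"


# Completions in the same style, derived from the local pattern above
# rather than copied from someone else's repository:
def test_slugify_strips_whitespace():
    assert slugify("  Hello World  ") == "hello-world"


def test_slugify_already_lowercase():
    assert slugify("hello") == "hello"
```

Whether continuing a local pattern like this still constitutes derivation from the training corpus is, of course, exactly what the two commenters disagree about.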

[–] [email protected] -1 points 11 hours ago

It would appear I understand LLM mechanisms better than you do, and my point is not so weak that I need to fabricate a strawman, claim it's what you said, and then argue against that instead.

Using LLMs trained on other people's source code is parasitic behaviour and violates copyrights and licenses.
