this post was submitted on 01 Apr 2024
199 points (97.2% liked)

Technology

Several big businesses have published source code that incorporates a software package previously hallucinated by generative AI.

Not only that, but someone, having spotted this recurring hallucination, turned the made-up dependency into a real package, which was then downloaded and installed thousands of times by developers acting on the AI's bad advice, we've learned. Had the package been laced with actual malware, rather than being a benign test, the results could have been disastrous.
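One simple mitigation is to vet dependencies against an allowlist before installing anything an AI suggests. The sketch below is a minimal illustration of that idea; the `VETTED` set, the requirements text, and the package name flagged in it are all hypothetical examples, not taken from the article.

```python
# Sketch: flag dependencies that aren't on a vetted allowlist before installing.
# The allowlist and requirements content here are hypothetical examples.

VETTED = {"requests", "numpy", "flask"}  # packages your team has reviewed


def unvetted_deps(requirements_text: str) -> list[str]:
    """Return requirement names not on the allowlist (ignoring comments/blanks)."""
    flagged = []
    for line in requirements_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        # Take the bare name before any version specifier.
        name = line.split("==")[0].split(">=")[0].split("<=")[0].strip().lower()
        if name not in VETTED:
            flagged.append(name)
    return flagged


reqs = """
requests==2.31.0
some-hallucinated-pkg==0.1
numpy>=1.26
"""
print(unvetted_deps(reqs))  # -> ['some-hallucinated-pkg']
```

A pre-install check like this catches a plausible-sounding but unreviewed name before it ever reaches `pip install`, which is exactly the gap the squatted package exploited.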

[–] [email protected] 18 points 7 months ago (1 child)

I find that if I write one or two tests on my own and then tell Copilot to complete the rest, it's about 90% correct.

Still not great but at least it saves me typing a bunch of otherwise boilerplate unit tests.
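The workflow described above can be sketched like this: a couple of hand-written "seed" tests establish the naming and assertion pattern, and the assistant is then asked to continue it. The `slugify` function and its test cases are hypothetical stand-ins, not code from the thread.

```python
import unittest


def slugify(title: str) -> str:
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())


class TestSlugify(unittest.TestCase):
    # Hand-written seed tests establish the pattern...
    def test_lowercases_words(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_collapses_whitespace(self):
        self.assertEqual(slugify("  spaced   out  "), "spaced-out")

    # ...an assistant prompted to "complete the rest" tends to continue
    # in the same style (empty string, single word, punctuation, etc.).


if __name__ == "__main__":
    unittest.main()
```

Giving the model two concrete examples to imitate constrains both the structure and the edge cases it generates, which is plausibly why the completions land closer to correct than asking for tests from scratch.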

[–] [email protected] 8 points 7 months ago

I actually haven't tried it that way. I just asked it to write the tests for whatever class I was working on, and it started spitting stuff at me. I'll try your way and see.