this post was submitted on 08 Dec 2024
459 points (94.6% liked)
Technology
We poors are going to have to organize and make the best use of our human intelligence to form an effective resistance against corporate rule. Or we can just sit back and see where this goes.
The thing I'm heartened by is that there is a fundamental misunderstanding of LLMs among the MBA/"leadership" group. They actually think these models are intelligent. I've heard people say, "Well, just ask the AI," meaning ask ChatGPT. Anyone who actually does that and thinks it gives them a leg up is kidding themselves. If they outsource their thinking and coding to an LLM, they might get ahead quickly, but they will fall behind just as quickly, because the quality will be middling at best. They don't understand how to best use the technology, and they will end up hanging themselves with it.
At the end of the day, all AI is just stupid number tricks. They're very fancy, impressive number tricks, but they're still number tricks that happen to be useful. Relying solely on AI will lead to the downfall of an organization.
As a programmer, I have yet to see evidence that LLMs can achieve even that. So far, everything they produce is a mess that needs significant effort to fix before it even does what was originally asked of the LLM, unless we are talking about programs that have literally been written thousands of times already (like Hello World or Fibonacci generators, ...).
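For illustration, this is the kind of snippet meant here: a Fibonacci generator has been written so many times that a model can reproduce the canonical Python version more or less verbatim.

```python
def fib():
    """Yield the Fibonacci numbers 0, 1, 1, 2, 3, 5, ... indefinitely."""
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b
```

Anything this well-worn comes out clean; the trouble starts with code that hasn't already been written a thousand times.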
I've seen a junior developer use it to get a quicker start on things like boilerplate code, configuration, or just as a starting point for implementing an algorithm. It's kind of like a souped-up version of piecing together Stack Overflow code snippets. Just like with SO, the result needs tweaking, and someone who relies too much on either SO or AI will never develop the skills to do that tweaking themselves.
I find LLMs great for creating shorter snippets of code. They can also be a good starting point when you're getting into something you're not familiar with.
Even asking for an example of how to use a specific API has failed for me about 50% of the time; the model tends to hallucinate entire parts of the API, or even entire libraries, that don't exist.
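One cheap defense against that failure mode: before trusting an LLM-suggested call, check that the module and attribute actually exist. This is a minimal hypothetical helper (the name `call_exists` is made up for this sketch), not a library API.

```python
import importlib


def call_exists(module_name, attr):
    """Return True only if module_name imports and exposes attr.

    Catches both hallucination modes: a library that doesn't exist
    (ImportError) and a function that doesn't exist on a real library
    (hasattr check).
    """
    try:
        mod = importlib.import_module(module_name)
    except ImportError:
        return False
    return hasattr(mod, attr)
```

It obviously can't tell you whether the call does what the LLM claims, only that it isn't pure invention.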
I'm not a programmer, more of a data scientist, and I use LLMs all day. I write my shitty but pretty specific code, check that it works, and then pass it to the LLM asking for refactoring and optimization. Sometimes its version saves me 2 seconds on a 30-second script; other times it saves me 35 minutes on a 36-minute script. It's also pretty good at helping you make graphics.
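A made-up example of the kind of rewrite that produces those big wins: the function names here are hypothetical, but the pattern (a working pure-Python loop handed to the LLM, which suggests a vectorized NumPy equivalent) is typical.

```python
import numpy as np


# The "shitty but pretty specific" original: a pure-Python loop
# computing Euclidean distances from an origin, row by row.
def distances_loop(points, origin):
    out = []
    for p in points:
        out.append(((p[0] - origin[0]) ** 2 + (p[1] - origin[1]) ** 2) ** 0.5)
    return out


# The kind of refactor an LLM might suggest: same result, computed
# in one vectorized pass over a contiguous NumPy array.
def distances_vectorized(points, origin):
    pts = np.asarray(points, dtype=float)
    return np.sqrt(((pts - np.asarray(origin, dtype=float)) ** 2).sum(axis=1))
```

The key step is still yours: you verified the slow version works first, so you can check the fast version against it.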