this post was submitted on 24 Oct 2023
875 points (93.1% liked)
Programmer Humor
you are viewing a single comment's thread
For the love of God, if you're a junior programmer, you're overestimating your understanding if you keep relying on ChatGPT thinking 'of course I'll spot the errors'. You will, until you won't, and you'll end up dropping the company database or deleting everything in root.
All ChatGPT is doing is guessing the next word. And it's trained on the bullshit coding blogs that litter the internet, half of which are now ChatGPT-written (without any validation, of course).
If you can't take 10-30 minutes to search for, read, and comprehend information on Stack Overflow or the docs, then programming (or problem solving) just isn't for you. The junior end of this field is really getting clogged with people who want to get rich quick without doing any of the legwork behind learning how to be good at this job, and ChatGPT is really exacerbating the problem.
A lot of the time, though, this is just looking up syntax: you know what you want to do, and it's simple, but it's gated behind busywork. To me this is the most useful part of ChatGPT: it knows all the syntax and will write it out for you and answer clarifying questions, so you can stay in the mental state of thinking about the actual problem instead of digging through piles of junk for one bit of information.
ChatGPT cannot explain, because it doesn't understand. It simply strings together a likely sequence of characters. I've tried to use it multiple times for programming tasks and found each time that it doesn't save much time compared to an IDE. ChatGPT regularly makes up methods or entire libraries. I do like it for drafting longer texts that I then polish manually, but any LLM is awful for factual information.
I think that when it is doing that, it is normally a sign that what you are asking for does not exist and you are on the wrong track.
I often get good explanations that seem to reflect understanding, and which would often be difficult to look up otherwise. For example, when I asked about the generated code `{myVariable}` and how it could be a valid function parameter in JavaScript, it responded that it is the equivalent of `{"myVariable": myVariable}`, and that "When using object literal property value shorthand, if you're setting a property value to a variable of the same name, you can simply use the variable name."

If ChatGPT gives you correct information you're either lucky or just didn't realize it was making shit up. That's a simple fact. LLMs absolutely have their uses, but facts ain't one of them.