ChatGPT regularly makes up methods or entire libraries
I think that when it does that, it is usually a sign that what you are asking for does not exist and you are on the wrong track.
ChatGPT cannot explain, because it doesn’t understand
I often get good explanations that seem to reflect understanding, and that would otherwise be difficult to look up. For example, when I asked about generated code containing {myVariable} and how it could be a valid function parameter in JavaScript, it responded that it is equivalent to {"myVariable": myVariable}, and that "When using object literal property value shorthand, if you're setting a property value to a variable of the same name, you can simply use the variable name."
What I can run on my hardware just isn't comparable yet, but I fully support the sentiment and plan to switch over once it gets a bit better.