I use ChatGPT to ask programming questions. It's not always correct, but neither is Stack Overflow nowadays. At least it will point me in the right direction.
SkunkWorkz
It’s because people want a big boot. In Europe, hatchbacks/crossovers are favored over sedans for that reason. And people just don’t like the look of a station wagon/estate car. Only the luxury brands still make sedans.
Pronounced as Aitch or Haitch?
You forgot his baptismal name. It’s Jesus H Roosevelt Mary Christ
Affinity got bought up by Canva. It’s only a matter of time before it gets enshittified. They are already giving non-profit and education subscribers free access to Affinity. Bet they will phase out perpetual licensing in the future.
Next time just pirate it. The Affinity people already got their fat cheque.
There is such a thing as HDMI Ethernet (HEC, the HDMI Ethernet Channel). If you connect some sort of Android box to your TV, it might establish an Ethernet connection with it and thus connect to the internet.
Ruby is literally Japanese. It was invented there, and a Danish guy popularized it outside of Japan. Like how weebs spurred interest in Japan and the Japanese language outside Japan.
It’s not a bug, just a negative side effect of the algorithm. This is what happens when the LLM doesn’t have enough data points to answer the prompt correctly.
It can’t be programmed out like a bug; rather, a human needs to intervene and flag the answer as false, or the LLM needs more data to train on. Those dozens of articles this guy wrote aren’t enough for the LLM to get that he’s just a reporter. The LLM needs data that explicitly says that this guy is a reporter who reported on those trials. And since no reporter starts their articles with “Hi, I’m John Smith the reporter, and today I’m reporting on…”, that data is missing. LLMs can’t draw conclusions from the context.
Never buy a product for a nonexistent feature that’s promised in an update.