SkunkWorkz

joined 1 year ago
[–] [email protected] 48 points 6 days ago (4 children)

Never buy a product for a non-existent feature that's only promised in an update.

[–] [email protected] 6 points 1 week ago (2 children)

I use ChatGPT to ask programming questions. It's not always correct, but neither is Stack Overflow nowadays. At least it will point me in the right direction.

[–] [email protected] 1 points 1 week ago

Google Chrome /s

[–] [email protected] 1 points 2 weeks ago

It’s because people want a big boot. In Europe, hatchbacks and crossovers are favored over sedans for that reason, and people just don’t like the look of a station wagon/estate car. Only the luxury brands still make sedans.

[–] [email protected] 2 points 3 weeks ago (1 children)

Pronounced as Aitch or Haitch?

[–] [email protected] 2 points 3 weeks ago

You forgot his baptismal name. It’s Jesus H Roosevelt Mary Christ

[–] [email protected] 9 points 1 month ago (1 children)

Affinity got bought up by Canva, so it’s only a matter of time before it gets enshittified. They are already giving non-profit and education subscribers free access to Affinity. I bet they will phase out perpetual licensing in the future.

Next time just pirate it. The Affinity people already got their fat cheque.

[–] [email protected] 15 points 1 month ago (5 children)

There is such a thing as HDMI Ethernet (the HDMI Ethernet Channel, HEC). If you connect some sort of Android box to your TV, it might establish an Ethernet connection with it and thus connect to the internet.
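If you want to check whether a box has quietly brought up a wired link like that, here's a minimal sketch. It assumes a Linux-based box where you can get a shell (e.g. via adb); the /sys/class/net path is standard Linux, but the interface names it prints will vary per device.

```python
#!/usr/bin/env python3
# Minimal sketch: list network interfaces and their link state on a
# Linux-based box (e.g. an Android TV box reached over `adb shell`),
# to spot an unexpected wired link such as one negotiated over the
# HDMI Ethernet Channel. Interface names are device-specific.
import os

SYS_NET = "/sys/class/net"

def interfaces():
    """Yield (name, operstate) for every interface the kernel exposes."""
    for name in sorted(os.listdir(SYS_NET)):
        state_path = os.path.join(SYS_NET, name, "operstate")
        try:
            with open(state_path) as f:
                state = f.read().strip()
        except OSError:
            state = "unknown"
        yield name, state

if __name__ == "__main__":
    for name, state in interfaces():
        flag = "  <-- link is up" if state == "up" else ""
        print(f"{name}: {state}{flag}")
```

Any interface reporting "up" that you didn't plug in yourself is worth a closer look.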

[–] [email protected] 4 points 1 month ago

Ruby is literally Japanese. It was invented there, and a Danish guy popularized it outside of Japan, much like how weebs spurred interest in Japan and the Japanese language outside Japan.

[–] [email protected] 10 points 1 month ago* (last edited 1 month ago)

It’s not a bug, just a negative side effect of the algorithm. This is what happens when the LLM doesn’t have enough data points to answer the prompt correctly.

It can’t be programmed out like a bug; rather, a human needs to intervene and flag the answer as false, or the LLM needs more data to train on. Those dozens of articles this guy wrote aren’t enough for the LLM to get that he’s just a reporter. The LLM needs data that explicitly says this guy is a reporter who reported on those trials. And since no reporter starts their articles with “Hi, I’m John Smith the reporter and today I’m reporting on…”, that data is missing. LLMs can’t draw conclusions from the context.
