gens

joined 1 year ago
[–] [email protected] 11 points 10 months ago (3 children)

Cost-benefit is not there. You can buy fancy ones that do some of those things, but they are expensive.

[–] [email protected] 2 points 10 months ago

SystemD sucks. You know, the truth.

[–] [email protected] 21 points 11 months ago (2 children)

Yeah. They have worse efficiency. To get better efficiency out of them you would need to run them hotter (afaik), and if you do that they wear out even faster.

It's great if you want a small but still powerful engine, but it's not efficient and those seals are a big problem.

[–] [email protected] 3 points 11 months ago* (last edited 11 months ago)

Back to ~~PS1~~ N64.

[–] [email protected] 5 points 11 months ago (1 children)

They did make the e-Golf. Not a GTI, but still a Golf.

[–] [email protected] 1 points 11 months ago

Yes, the first movie.

[–] [email protected] 16 points 11 months ago (3 children)

It's probably not just because of the EV. The Golf had become a high-end brand even before that. Not really a people's car if parts cost a lot.

[–] [email protected] 1 points 11 months ago

It's very useful in Zig's comptime.

[–] [email protected] 1 points 11 months ago* (last edited 11 months ago) (1 children)

When you look at a coffee cup from the side, you know it has a hole in it, because you imagine it, not because it's a reflex.

An LLM is basically a point cloud of words. The training uses neural networks and thus pattern recognition, but the LLM itself is closer to a database. But hey, SQL is also useful for AI (data storage/retrieval according to logic).
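
To picture the "point cloud of words" idea, here's a toy sketch in Python with made-up 3-D vectors (real models learn embeddings with thousands of dimensions; the words and numbers here are purely illustrative):

```python
import math

# Toy "point cloud": each word mapped to a hand-made 3-D vector.
embeddings = {
    "coffee": (0.9, 0.1, 0.0),
    "tea":    (0.8, 0.2, 0.1),
    "sql":    (0.0, 0.9, 0.3),
    "table":  (0.1, 0.8, 0.4),
}

def cosine(a, b):
    # Cosine similarity: how close two points are, ignoring their length.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def nearest(word):
    # "Retrieval" here is just finding the closest point in the cloud.
    return max(
        (other for other in embeddings if other != word),
        key=lambda other: cosine(embeddings[word], embeddings[other]),
    )

print(nearest("coffee"))  # -> "tea"
```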

I'm not an LLM expert, far from it. But right now they are not much more practical than a "find out about things" helper.

Edit: I do like them. They've been helpful a couple of times, and I even got gpt4all installed on my computer for fun.

[–] [email protected] 1 points 11 months ago* (last edited 11 months ago) (4 children)

What one would think of as AI today is not really the "I". ChatGPT does not understand what it's talking about and definitely cannot lead the machine uprising. Straight-up neural networks maybe could, but they'd need orders of magnitude more computing power than we have now. We would need a new kind of AI for it to be practical.

In my experience, GPTs are more of a "what are some examples of x" tool than a "can you solve this problem" tool, because the problems are either easy to google or, for the harder ones, GPT straight up lies or rambles uselessly. A search engine helper, in a way.

I'd rather we put all those MWh into solving real problems instead of startups. Also: Nvidia, fuck you.

[–] [email protected] 27 points 11 months ago (3 children)

0-127, top bit is always 0.
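
For illustration, a minimal Python check, assuming this is about 7-bit ASCII (the string here is just an example):

```python
# ASCII code points are 0-127, so they fit in 7 bits and the top bit of each byte is 0.
text = "hello"
ascii_bytes = text.encode("ascii")
assert all(byte < 0x80 for byte in ascii_bytes)       # every value is in 0-127
assert all(byte & 0x80 == 0 for byte in ascii_bytes)  # equivalently, the top bit is never set
```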
