SystemD sucks. You know it's the truth.
gens
Raylib.
Yea. They have worse efficiency. To get better efficiency out of them you would need to run them hotter (afaik), and if you do that they wear out even faster.
It's great if you want a smaller but still strong engine, but it's not efficient and those seals are a big problem.
Back to ~~PS1~~ N64.
They did make the e-Golf. Not a GTI, but still a Golf.
Yes, the first movie.
It's probably not just because of the EV. Golf had become a high-end brand even before that. Not really a people's car if parts cost a lot.
It's very useful in Zig's comptime.
When you look at a coffee cup from the side, you know it has a hole in it. Because you imagine it, not because it's a reflex.
An LLM is basically a point cloud of words. The training uses neural networks and thus pattern recognition, but the LLM itself is closer to a database. But hey, SQL is also useful for AI (data storage/retrieval according to logic).
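A minimal sketch of that "point cloud" mental model in Python. The words and vectors below are made up for illustration (not from any real model), and real LLMs do more than nearest-neighbour lookup, but the "words as points, retrieval by distance" idea looks roughly like this:

```python
# Toy sketch of the "point cloud of words" idea: hypothetical word vectors
# and nearest-neighbour lookup by cosine similarity. Vectors are invented.
import math

# tiny hypothetical embedding table: word -> point in 3-D space
embeddings = {
    "cat":    (0.90, 0.10, 0.00),
    "dog":    (0.80, 0.20, 0.10),
    "kitten": (0.85, 0.15, 0.05),
    "car":    (0.10, 0.90, 0.30),
}

def cosine(a, b):
    """Cosine similarity: how close two points in the cloud are."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def nearest(word, k=2):
    """Return the k words whose vectors lie closest to the query word."""
    query = embeddings[word]
    others = ((w, cosine(query, v)) for w, v in embeddings.items() if w != word)
    return sorted(others, key=lambda wv: wv[1], reverse=True)[:k]

print(nearest("cat"))  # e.g. [('kitten', 0.99...), ('dog', 0.98...)]
```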
I'm not an LLM expert, by far. But right now they are not much more practical than a find-out-about-things helper.
Edit: I do like them. They've been helpful a couple of times and I even got gpt4all installed on my computer for fun.
What one would think of as AI today is not really the "I". ChatGPT does not understand what it's talking about and definitely cannot lead the machine uprising. Straight-up neural networks maybe could, but they'd need orders of magnitude more computing power than we have now. We would need a new kind of AI for it to be practical.
In my experience GPTs are more of a "what are some examples of x" tool than a "can you solve this problem" tool. Because the problems are either easy to google or, for the harder ones, GPT straight up lies or rambles uselessly. A search engine helper, in a way.
I'd rather we put all those MWh into solving real problems, instead of startups. Also: Nvidia, fuck you.
0-127, top bit is always 0.
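Assuming this is about 7-bit ASCII, a quick Python check of the claim: every ASCII code point fits in 0-127, so bit 7 (0x80) is never set.

```python
# Every ASCII-encoded byte is in the range 0-127, i.e. the top bit is 0.
text = "Hello, world!"
ascii_bytes = text.encode("ascii")

for b in ascii_bytes:
    assert 0 <= b <= 127      # fits in 7 bits
    assert b & 0x80 == 0      # top bit is always 0

print([f"{b:08b}" for b in ascii_bytes[:5]])  # leading bit of each byte is 0
```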
The cost-benefit is not there. You can buy fancy ones that do some of those things, but they are expensive.