HakFoo

joined 1 year ago
[–] [email protected] 11 points 1 week ago (3 children)

Or that there's a huge amount of legitimate demand for mature-node chips and it makes sense to own that supply.

The 5,000 microcontrollers you interact with each day, by and large, do not need 5 nm processes.

We saw a few years ago how relatively cheap, commodity-grade, low-complexity chips suddenly become vital when you can't get them and unfinished cars start piling up at the assembly plant.

[–] [email protected] 9 points 2 weeks ago (1 children)

I suspect Intel has a broader product range than AMD to justify the headcount, but I'm not sure where the extra resources should go.

Their networking chipsets were the gold standard in the 100-megabit and gigabit eras, but their 2.5G stuff is spotty to the point that Realtek is considered the legit option.

They've pulled back from flash, SSDs and Optane.

There must be some other rich product lines that Intel covers and AMD doesn't.

[–] [email protected] 15 points 3 weeks ago

It can also throw things against the wall with no concern for fitness for purpose. See "none pizza with left beef".

[–] [email protected] 18 points 3 weeks ago

So they're saying they have plenty of licenses for the use case, but somehow people are still pirating?

Maybe their license management paradigm is just garbage. This could be the vendor, but also poor IT policy if the users can't requisition what they need.

As usual, service problem.

So much licensing fuckery-- dealing with floating or reissuing licenses, users needing to move to different machines-- could be solved via affordable site licensing. But that might leave dollars on the table if users don't overbuy.

[–] [email protected] 2 points 1 month ago* (last edited 1 month ago) (1 children)

I guess the assumption is more that for me, a fresh install is often about decluttering as much as anything-- the five Wayland compositors, three music players, and six pseudo-IDEs I tried and didn't like don't need to follow me to the next build.

In a conventional install, that just means "don't check the checkbox in the installer next time". In a Nix-style system, this is a conscious process of actively deciding to remove things from the stored configuration, no?

I suppose the closest I've gotten was recently migrating my setup from a desktop to a new laptop. Mostly copying over some config from my home directory, but even then, I wanted enough different stuff-- removing tools I don't use on the laptop, adding things like battery monitoring and Wi-Fi control-- that it involved some reconfiguration.

[–] [email protected] 8 points 1 month ago (3 children)

I suspect the tooling isn't quite there yet for desktop use cases.

If I were to try to replicate my current desktop in an immutable model, it would involve a lot of manual labour in scripting or checkpointing every time I installed or configured something, to save a few hours of labour in two years' time when I get a new drive or do a full install.

The case is easier for defined workload servers and dev environments that are regularly spun up fresh.

[–] [email protected] 9 points 1 month ago (4 children)

I wonder now about the social norms of the "default" versions of things in replicators.

Would you go on one ship and "soft drink" is Pepsi and another Coke, like venues today? How is it negotiated?

Or do you spend your life building and refining a profile? How is that carried around?

[–] [email protected] 37 points 1 month ago (1 children)

No, this is a general practice-- I see it a lot with third-party vendors who want you to integrate with their services. They'll expire the documentation-portal password after 90 days, but the actual user-facing service still accepts the same "password123" that's been set since 2004.

I suspect the pattern is to protect the vendors from developer scrutiny: by the time you've jumped through enough hoops to read the docs and realize it's trash, the execs have signed the contracts and the sunk costs are too high to bail out.

Also add another 6 months to actually get the credentials for the test environment.


(Alt: The Drake meme. Upper panel shows him hiding his face from "Securing Customer Data". Lower panel shows him smirking at "Securing Public API Documentation")

[–] [email protected] 15 points 1 month ago (3 children)

They've got a guy at work whose job title is basically AI Evangelist. This is terrifying in that it's a financial tech firm handling twelve figures a year of business-- the last place where people will put up with "plausible bullshit" in their products.

I grudgingly installed the Copilot plugin, but I'm not sure what it can do for me better than a snippet library.

I asked it to generate a test suite for a function as a rudimentary exercise. It was able to identify "yes, there are n return values, so write n test cases" and "you're going to actually have to CALL the function under test", but it was unable to figure out how to build the object being fed in to trigger any of those cases; doing so would require grokking much of the code base. I didn't need to burn half a barrel of oil for that.
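To illustrate (a purely hypothetical sketch in pytest-- the function, the Transaction type, and the branch logic are invented for this comment, not the actual work code): the skeleton below is the easy half it managed. The hard half, which it couldn't do, is knowing the code base well enough to construct a `tx` that actually drives each branch.

```python
import pytest
from dataclasses import dataclass

# Hypothetical stand-ins for the real code under test.
@dataclass
class Transaction:
    amount: float
    currency: str
    flagged: bool

def classify_transaction(tx: Transaction) -> str:
    """Toy function with three return values, one per branch."""
    if tx.flagged:
        return "review"
    if tx.currency != "USD":
        return "convert"
    return "settle"

# "n return values, so n test cases" and "actually CALL the function" --
# the two things the assistant figured out. Building each Transaction so
# the right branch fires is the part that needs codebase knowledge.
@pytest.mark.parametrize(
    "tx, expected",
    [
        (Transaction(10.0, "USD", flagged=True), "review"),
        (Transaction(10.0, "EUR", flagged=False), "convert"),
        (Transaction(10.0, "USD", flagged=False), "settle"),
    ],
)
def test_classify_transaction(tx, expected):
    assert classify_transaction(tx) == expected
```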

I'd be hesitant to trust it with "summarize this obtuse spec document" when half the time said documents are self-contradictory or downright wrong. Again, plausible bullshit isn't suitable.

Maybe the problem is that I'm too close to the specific problem. AI tooling might be better for open-ended or free-association "why not try glue on pizza" type discussions, but when you already know "send exactly 4-7-Q-unicorn emoji in this field or the transaction is converted from USD to KPW" having to coax the machine to come to that conclusion 100% of the time is harder than just doing it yourself.

I can see marketing and sales people loving it, maybe customer service too: click one button and turn one coherent "here's why it's broken" sentence into 500 words of flowery, says-nothing prose. But I demand better from my machine overlords.

Tell me when Stable Diffusion figures out that "carrying battleaxe" doesn't mean "katana randomly jutting out from forearms"; maybe at that point AI will be good enough for code.

[–] [email protected] 4 points 1 month ago (1 children)
[–] [email protected] 3 points 2 months ago

I suppose the weird surprise lesson of the Windows 8 fiasco is that no matter how badly they bollixed it up, they wouldn't lose enough customers for it to matter-- they could afford to break a lot more of the user experience than they ever originally thought.

Even Vista, while people had issues*, still provided a largely familiar interface and didn't go out of its way to break muscle memory and traditional workflows.

*IMO, Vista wasn't as bad as is commonly held. A lot of the problem was that it was more resource-intensive than previous systems-- it really asked for a decent graphics card and 2 GB of memory, but they sold a lot of cheap machines with 512 MB and crappy shared-memory chipsets that only qualified as "Vista Basic Capable" so that the manufacturers wouldn't have to formally declare them obsolete. Some drivers had teething trouble, but the switch to 64-bit was going to have growing pains anyway.

[–] [email protected] 4 points 2 months ago (1 children)

Source?

I'm more willing to forgive not getting Baizhu for the promise of unlimited cheap energy...
