Wirlocke

joined 1 year ago
[–] [email protected] 8 points 5 months ago* (last edited 5 months ago)

Ironically, the business people are terrible at business. I genuinely think LLMs (despite their economic evils) are stunning pieces of technology.

But they are money sinks, and the only plans for profit are subscriptions or advertisements. It's the Social Media/Streaming/Tech Startup panicked hype investing all over again. Subscriptions and advertising simply do not pay the bills for huge server and GPU farms.

But sustainability isn't what they want, is it? They want the stock to go up so they can cash out right before it falls. sigh

[–] [email protected] 8 points 5 months ago (1 children)

Microsoft's bread and butter has been selling to and servicing businesses.

So with that in mind, what the hell are they thinking? Windows 10's end of life guarantees that businesses specifically will have to switch. Then the next option in line is one that will, by default, vacuum up all your proprietary information to feed into an AI, effectively "copyright laundering" it.

Even if there are ways to deactivate the feature, the non-tech-savvy managers will just go off the headlines, and the tech-savvy ones will recognize the security risk. And government/healthcare computers might just end up on a fork of Linux that isn't open source.

Ironically, it feels like they're focusing too much on consumers (on extorting them) and shooting themselves in the foot with their business clientele.

[–] [email protected] 23 points 6 months ago (8 children)

It's a little funny how everyone sobered up from perpetually investing in unprofitable free social media, then dove right back into perpetually investing in LLMs with no real plan for sustainable profit.

[–] [email protected] 7 points 6 months ago* (last edited 6 months ago) (2 children)

In terms of LLM hallucination, it feels like the name very aptly describes the behavior and severity. It doesn't downplay what's happening because it's generally accepted that having a source of information hallucinate is bad.

I feel like the alternatives would downplay the problem. A "glitch" is generic and common, "lying" is just inaccurate since that implies intent to deceive, and just being "wrong" doesn't get across how elaborately wrong an LLM can be.

Hallucination fits pretty well and is also pretty evocative. I doubt AI promoters want to effectively call their product schizophrenic, which is what most people think of when they hear "hallucination."

Ultimately, all the sciences are full of analogy-based names that make conversations easier; it's not always marketing. It's no different than when physicists say particles have "spin" or "color," or that spacetime is a "fabric," or [insert entirety of String theory]...

[–] [email protected] 1 points 6 months ago* (last edited 6 months ago)

On Discord, though, there's a lot of unchecked predation. Theoretically, if this were implemented, it would let them see the most suspicious users, the ones interacting with an unusual number of children, and review whether the messages are inappropriate.

But all that's unlikely, because if they actually cared they'd implement other, simpler solutions first. So this is all hypothetical, and not ideal anyway.

[–] [email protected] 56 points 6 months ago (11 children)

I'm a bit annoyed at all the people being pedantic about the term hallucinate.

Programmers use preexisting concepts as allegory for computer concepts all the time.

Your file isn't really a file, your desktop isn't a desk, your recycling bin isn't a recycling bin.

[Insert the entirety of Object Oriented Programming here]

Neural networks aren't really neurons, genetic algorithms aren't really genetics, and the LLM isn't really hallucinating.

But it easily conveys what the bug is. It only personifies the LLM because the English language almost always personifies the subject. The moment you apply a verb to an object, you imply it performed an action, unless you limit yourself to esoteric words/acronyms or use several words to overexplain every time.

[–] [email protected] 8 points 6 months ago (6 children)

The gender thing is creepy, but if they could predict age groups then in a perfect world they could analyze adult users talking to children and shut that down.

In a perfect world though, I doubt they'd put effort into making their app safer, heavens no.

[–] [email protected] 3 points 6 months ago (2 children)

Unfortunately, the spam arms race has destroyed any chance of search going back to the good ole days. SEO and AI content farms mean we'll need a whole new system to categorize webpages, as well as filter out human-sounding but low-effort spam.

Point being, it's no longer enough to find a page that's relevant to the topic; it has to be relevant and actually deliver information, and currently the only feasible tech that can differentiate the two is LLMs.

[–] [email protected] 18 points 6 months ago (1 children)

I hate that its links are "incompatible" with Firefox, even though if you trick it into thinking it's Chrome, it works just fine.

[–] [email protected] 1 points 6 months ago (4 children)

I've seen those trucks with a bunch of cars packed on top; something like that (minus the truck) could totally fit in a train cargo container.

[–] [email protected] 14 points 6 months ago (8 children)

In fact, I think there's a missed opportunity for EVs to partner with long-distance public transit.

The main limitation of electric cars is range, but if people knew they could comfortably go across the state, or several states, without their car, they might be more willing to take an electric car for city driving.

[–] [email protected] 9 points 6 months ago (3 children)

Yeah, I love Foundry, but I'm convinced the DM needs technical knowledge to use it. I ran a server for a non-tech-savvy DM and it was like working customer service.

With enough investment you can get the tabletop to be almost exactly what you want it to be, and for a popular system like 5e you can make it as automated as a Baldur's Gate game. You just need to download a lot of modules and customize a lot of settings to get there. Without that, it's just a less intuitive Roll20.

And I must stress from experience, never offer to host/troubleshoot a server for someone else, especially if the DM likes to complain or can't handle minor technical setbacks.
