aniki

joined 1 year ago
[–] [email protected] 9 points 5 months ago (1 children)
[–] [email protected] 6 points 5 months ago

That's what I was thinking. It's mighty convenient...

[–] [email protected] 18 points 5 months ago* (last edited 5 months ago)

Runs locally, mirrors remotely.

To ensure a seamless customer experience when their hardware isn't capable of running the model locally or if there is a problem with the local instance.

microsoft, probably.

[–] [email protected] 13 points 5 months ago (4 children)

Decades ago it was a funny joke. Now it's the most popular handheld OS on the planet by a huge margin. Linux is damn near EVERYWHERE except the desktop now, and it's only a matter of time.

[–] [email protected] 7 points 5 months ago (5 children)

Best to avoid Microsoft altogether in the future. There's nothing of value left.

[–] [email protected] 1 point 5 months ago

That's not an excuse.

[–] [email protected] 5 points 5 months ago

So I won't buy a yacht anytime soon. Good.

[–] [email protected] 1 point 5 months ago

Plenty of other things to do in a city that size, then.

[–] [email protected] 1 point 5 months ago

Not LN/Ticketmaster shows.

[–] [email protected] 8 points 5 months ago

WTF? You can make your very own private, locally run AI assistant on a Raspberry Pi, and build your own interface with an ESP32. Right now.
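For the curious: a minimal sketch of what the Pi side of that could look like. This assumes one common (but not the only) local stack, an Ollama server on its default port with a small model pulled; the model name here is just an example, and nothing is sent until you actually call it against a running server.

```python
import json
import urllib.request

# Assumed setup: an Ollama server running locally on the Pi at its
# default port. "tinyllama" is an example small model, not a recommendation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_prompt_request(prompt: str, model: str = "tinyllama") -> urllib.request.Request:
    """Build the HTTP request a tiny local assistant would send."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete JSON response, not a stream
    }).encode()
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

# Usage (only works once a local server is actually running):
# with urllib.request.urlopen(build_prompt_request("What time is it?")) as r:
#     print(json.loads(r.read())["response"])
```

An ESP32 "interface" would then just be a microphone/button/display client that sends text to this endpoint over the LAN and shows the response.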

[–] [email protected] 5 points 5 months ago

Exactly this. We need to figure out how to make machines that can reason first, and then we can have THEM sort the data and decide what to feed the data pool.

But if we have a computer that can reason, we don't need LLMs at all.
