Big Brother.
That's what I was thinking. It's mighty convenient...
Runs locally, mirrors remotely.
To ensure a seamless customer experience when their hardware can't run the model locally, or if there's a problem with the local instance.
microsoft, probably.
Decades ago it was a funny joke. Now it's the most popular handheld OS on the planet by a huge margin. Linux is damn EVERYWHERE except the desktop now, and it's only a matter of time.
Best to avoid Microsoft altogether in the future. There's nothing of value left.
That's not an excuse.
So I won't buy a yacht anytime soon. Good.
Plenty of other things to do in a city that size, then.
Not LN/Ticketmaster shows.
WTF? You can make your very own private, locally run AI assistant on a Raspberry Pi, and make your own interface with an ESP32. Right now.
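As a rough sketch of what that looks like, here's a minimal Python loop that talks to an Ollama server running on the Pi itself (assumes Ollama is installed and some small model has been pulled; the model name below is just an example). An ESP32 front end could hit the same local HTTP endpoint over Wi-Fi.

# Minimal local assistant loop: sends prompts to an Ollama server running on
# the same Raspberry Pi and prints the reply. Assumes Ollama is installed and
# a small model has been pulled, e.g. `ollama pull tinyllama` (example model,
# swap in whatever you actually have).
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint
MODEL = "tinyllama"  # example small model that fits on a Pi

def ask(prompt: str) -> str:
    # Non-streaming request: Ollama returns one JSON object with a "response" field.
    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    while True:
        try:
            question = input("you> ")
        except EOFError:
            break
        print("assistant>", ask(question))

Nothing leaves the device; no cloud, no account, no mirroring.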
Exactly this. We need to figure out how to make machines that can reason first, and then we can have THEM sort the data and decide what goes into the data pool.
But if we have a computer that can reason, we don't need LLMs at all.