Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec
(www.businessinsider.com)
You're right. Run an LLM locally, adjacent to your application sandboxes and local user apps, and your office will lower its heating bills.
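For anyone curious, "run an LLM locally" can be as simple as pointing a few lines of code at a model server on your own machine. Here's a minimal sketch assuming a local server that speaks Ollama's `/api/generate` HTTP API on its default port; the URL and model name are assumptions, swap in whatever you actually run.

```python
import json
import urllib.request

# Assumed: a local Ollama-compatible server on the default port.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON payload for a single non-streaming completion."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(prompt: str, model: str = "llama3") -> str:
    """POST the prompt to the local server and return the generated text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_request(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

No per-query cloud bill, and your GPU doubles as the space heater.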