AI one-percenters seizing power forever is the real doomsday scenario, warns AI godfather
(www.businessinsider.com)
I have Llama 2 running on localhost; you need a fairly powerful GPU, but it can totally be done.
I’ve run one of the smaller models on my i7-3770 with no GPU acceleration. It’s painfully slow, but not unusably so.
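For anyone who wants to try the same thing, here's a rough sketch of what CPU-only inference can look like with the llama-cpp-python bindings and a quantized GGUF model. The model path, prompt, and thread count are placeholders for illustration, not something from the comments above.

```python
# Rough sketch: CPU-only Llama 2 inference via llama-cpp-python.
# Assumes you've already downloaded a quantized GGUF build of one of
# the smaller models; the path below is just a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-2-7b-chat.Q4_K_M.gguf",  # placeholder path
    n_ctx=2048,        # context window
    n_gpu_layers=0,    # 0 = run entirely on the CPU
    n_threads=8,       # e.g. an i7-3770 exposes 4 cores / 8 threads
)

output = llm(
    "Q: Why run an LLM locally instead of using a hosted API? A:",
    max_tokens=128,
    stop=["Q:"],
)
print(output["choices"][0]["text"])
```

Quantized 4-bit weights are what make this tolerable on an old CPU; expect a few tokens per second at best, which matches the "painfully slow but not unusably slow" experience.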