If they're local, they'd be basically useless due to a lack of computing power and a potential lack of indexing for a search-engine chatbot, so I doubt it. It would also have to be so polished that it wouldn't require further user knowledge / input, and that's just not a thing with any local LLM I've come across. Mozilla can gladly prove me wrong, though. I certainly wouldn't mind if they could generally make the whole process of running local LLMs easier and more viable.
The requirements to run good local LLMs have really been shrinking this past year… I have a lot of faith that there is a generally useful yet tiny AI tool within the grasp of Mozilla.
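For a sense of scale: a small 4-bit quantized model can already run on an ordinary laptop CPU. Here's a minimal sketch using the llama-cpp-python bindings; the model file, path, and settings are placeholders I picked for illustration, not anything Mozilla has announced.

```python
# Minimal sketch: running a small quantized GGUF model on CPU via
# llama-cpp-python. The model path is hypothetical; any small
# quantized model (a few GB or less) would work similarly.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/tiny-model.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=2048,    # modest context window keeps memory use low
    n_threads=4,   # plain laptop CPU, no GPU required
)

result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize this page in one sentence: ..."}],
    max_tokens=128,
)
print(result["choices"][0]["message"]["content"])
```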
I can understand your thinking, but it could be as simple as giving the user the option to outsource the computation to a secure something or other, if their machine can’t handle it.
And yeah, the requirements are still quite high, but they are being reduced somewhat steadily, so I wouldn’t be surprised if average hardware could manage it in the long term.
Edit: For the record, Mozilla is one of the only companies I would trust if they said "the secure something or other is actually secure." And they'd likely show actual proof and provide an explanation as to how.
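To make the "outsource it if the machine can't handle it" idea concrete, here's a rough sketch of a client that tries a local model first and only hands the prompt to a trusted remote service as a fallback. The endpoint URL, the capability check, and the helper names are all made up for illustration; nothing here is an actual Mozilla API.

```python
# Sketch: prefer local inference, fall back to a trusted remote endpoint
# when the hardware is too weak. Everything here is hypothetical.
import os

import requests  # assumed available; any HTTP client works

REMOTE_ENDPOINT = "https://example.org/v1/chat"  # hypothetical trusted service


def machine_can_run_locally() -> bool:
    """Crude capability check; a real one would look at RAM, GPU, battery, etc."""
    return (os.cpu_count() or 1) >= 4


def run_local_model(prompt: str) -> str:
    """Placeholder for the local path, e.g. the llama.cpp sketch above."""
    return f"(local model reply to: {prompt})"


def ask(prompt: str) -> str:
    if machine_can_run_locally():
        return run_local_model(prompt)
    # Fallback branch: this is exactly where the "is it actually secure?"
    # question from the thread comes in.
    resp = requests.post(REMOTE_ENDPOINT, json={"prompt": prompt}, timeout=30)
    resp.raise_for_status()
    return resp.json()["reply"]
```

The trust question only arises in the fallback branch, which is why it matters who runs that remote service and whether they can show it's secure.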