Yes, I did. And yes, it is possible. It's terribly slow in comparison, which makes it far less useful. It very quickly devolves into random mumbling or gets stuck in weird loops. It also hogs resources that are needed by whatever else you're doing at the time.
I mainly test dev AI solutions, and moving from 1B to 7B models made them vastly more relevant. Moving from a CPU implementation (Ryzen 7 3700X) to a GPU (RTX 3080 Ti) made them fast enough to serve as quick completion and immediate suggestions without breaking my workflow, while also freeing resources for the IDE, build tools, and the actual software being run. On the CPU there was a multi-second delay, which made this use case completely useless.
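
For anyone wanting to try the same comparison, here's a minimal sketch using llama-cpp-python with a quantized 7B GGUF model. The model path and file name are placeholders, and the GPU path assumes a CUDA-enabled build; this is just one common way to do it, not necessarily the setup described above.

```python
# Rough CPU-vs-GPU timing sketch with llama-cpp-python.
# Assumes a local quantized 7B GGUF file (path below is a placeholder)
# and, for the GPU case, a CUDA-enabled build of llama-cpp-python.
import time
from llama_cpp import Llama

MODEL = "models/mistral-7b-instruct.Q4_K_M.gguf"  # placeholder path
PROMPT = "def fibonacci(n):"

# CPU-only: keep every layer on the CPU.
cpu_llm = Llama(model_path=MODEL, n_gpu_layers=0, n_ctx=2048, verbose=False)
# GPU: offload all layers to the card.
gpu_llm = Llama(model_path=MODEL, n_gpu_layers=-1, n_ctx=2048, verbose=False)

for name, llm in [("CPU", cpu_llm), ("GPU", gpu_llm)]:
    start = time.time()
    out = llm(PROMPT, max_tokens=64, temperature=0.2)
    print(f"{name}: {time.time() - start:.2f}s")
    print(out["choices"][0]["text"])
```

The point of the `n_gpu_layers` knob is exactly the trade-off above: at 0 everything runs on the CPU and competes with your IDE and builds, while at -1 the whole model sits in VRAM and completions come back fast enough to feel like ordinary autocomplete.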