this post was submitted on 07 Feb 2024
218 points (98.7% liked)
Technology
BTW, are you running that locally?
You bet your ollama I am.
If you have good enough hardware, this is a rabbit hole you could explore: https://github.com/oobabooga/text-generation-webui/
Naah. I think this model needs a crazy amount of VRAM to run. I'm stuck with 4 GB :(
Did you use a specific website to use Mixtral? I want to try, but the system requirements are crazy.
huggingface.co/chat
You can run it locally with an RTX 3090 or less (as long as you have enough RAM), but there's a speed tradeoff the more of it you push into system RAM instead of VRAM.
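For a ballpark on why a 3090's 24 GB is borderline: Mixtral 8x7B has roughly 47B total parameters, so at 4-bit quantization the weights alone land in the low twenties of GB, before any KV cache. A back-of-the-envelope sketch (the 47B figure and the 4-bit quant are assumptions; real quantized files carry extra overhead and vary by quant type):

```shell
# Rough weight-memory estimate for Mixtral 8x7B at 4-bit quantization.
# 47B total parameters is an approximate figure; treat the result as a floor,
# since quantized files also store scales, embeddings, etc.
PARAMS_B=47          # total parameters, in billions (assumed)
BITS_PER_PARAM=4     # 4-bit quantization (assumed)
SIZE_GB=$(( PARAMS_B * BITS_PER_PARAM / 8 ))
echo "Approx. weights: ${SIZE_GB} GB"
```

Since that already brushes up against 24 GB of VRAM, runners like llama.cpp let you split the model: some layers stay on the GPU (via `--n-gpu-layers`) and the rest run from system RAM, which is exactly where the speed tradeoff comes from.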