this post was submitted on 27 Oct 2023
524 points (94.9% liked)

[–] [email protected] 1 points 1 year ago (1 children)

You can, but things as good as ChatGPT can't be run on local hardware yet. My main obstacle is language support other than English.

[–] [email protected] 2 points 1 year ago (1 children)

They're getting pretty close. You only need 10GB VRAM to run Hermes Llama2 13B. That's within the reach of consumers.
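For anyone wondering why 13B parameters squeeze into 10GB, here's a back-of-the-envelope sketch (my own arithmetic, not from this thread; the 4-bit quantization and ~20% overhead figures are assumptions):

```python
# Rough VRAM estimate for running a quantized LLM locally.
# Assumptions (mine): 4-bit quantized weights, plus ~20% extra
# for the KV cache and activations.

def vram_estimate_gb(n_params: float, bits_per_weight: int,
                     overhead: float = 0.20) -> float:
    """Approximate VRAM needed, in gigabytes."""
    weights_gb = n_params * bits_per_weight / 8 / 1e9
    return weights_gb * (1 + overhead)

# A 13B-parameter model at 4-bit quantization:
print(round(vram_estimate_gb(13e9, 4), 1))   # ~7.8 GB, under 10GB
# The same model at full fp16 would need far more:
print(round(vram_estimate_gb(13e9, 16), 1))  # ~31.2 GB
```

So quantization is doing the heavy lifting here; an unquantized 13B model would still be out of reach for most consumer cards.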

[–] [email protected] 1 points 1 year ago

Nice to see! I'm not following the scene as much anymore (last time I played around with it was Wizard Mega 30B). It's definitely a big improvement, but as much as I hate to say it, I'll stick with ChatGPT for the time being. It's just better on more niche questions and does some things plain better (GPT-4 can do maths (mostly) without hallucinating).