silverlose

joined 2 weeks ago
[–] [email protected] 1 points 1 day ago

True. Honestly, Apple's software is just getting worse by the day. It's sad.

It's a version with 12 GB of VRAM. I use it for gaming, though. If you want a real GPU for this, I hear the Tesla P40 is the best.

[–] [email protected] 1 points 5 days ago* (last edited 5 days ago) (2 children)

FWIW speech to text works really well on Apple stuff.

I'm not exactly sure what info you're looking for, but: my gaming PC is headless and sits in a closet. I run ollama on that, and I connect to it using a client called "ChatBox". It's got an RTX 3060, which fits the whole model in VRAM, so it's reasonably fast. I've tried the 32b model and it does work, but slowly.

Honestly, ollama was so easy to set up that if you have any experience with computers, I recommend giving it a shot. (Could be a great excuse to get a new GPU 😉)
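For anyone curious, here's a rough sketch of what a headless setup like mine looks like. This assumes ollama is already installed on the gaming PC; the hostname `gaming-pc.local`, the model tag, and the bind address are just examples, swap in your own:

```shell
# On the headless gaming PC: make ollama listen on the LAN instead of
# just localhost (0.0.0.0 binds all interfaces; 11434 is the default port)
OLLAMA_HOST=0.0.0.0:11434 ollama serve &

# Pull a model that fits in 12 GB of VRAM
ollama pull deepseek-r1:14b

# From another machine (or a client like ChatBox), point at the server:
curl http://gaming-pc.local:11434/api/generate \
  -d '{"model": "deepseek-r1:14b", "prompt": "Hello", "stream": false}'
```

ChatBox just needs the server URL (`http://gaming-pc.local:11434` in this sketch) in its ollama settings and it'll list whatever models you've pulled.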

[–] [email protected] 1 points 5 days ago (4 children)

Have you heard of ollama? You can run DeepSeek and other models locally super easily. I know it's not a complete replacement, but it feels nice to use an LLM guilt-free. I compared the 14b distilled DeepSeek model against the paid version of ChatGPT, and it made me cancel my account.
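If you want to try it, this is basically the whole process (assuming you've installed ollama from their site; the model tag below is the 14b DeepSeek-R1 distill I mentioned):

```shell
# Downloads the model on first run (~9 GB), then drops you into a chat;
# you can also pass a one-off prompt directly:
ollama run deepseek-r1:14b "Explain what VRAM is in one sentence."
```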

[–] [email protected] 1 points 1 week ago

It blows my mind every day.

[–] [email protected] 12 points 1 week ago (2 children)

I'm trying to convince people to check out Lemmy, but it's hard. People are so stuck in their ways. I'll keep trying, though.

[–] [email protected] -1 points 1 week ago

War romance??? You think people want war? Wtf is wrong with you
