rs137

joined 1 year ago
[–] [email protected] 1 points 9 months ago

Llama 2 70B with 8-bit quantization takes around 80 GB of VRAM, if I remember correctly. I tested it a while ago.
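(For anyone curious, the ~80 GB figure lines up with a quick back-of-envelope estimate: 70B parameters at 1 byte each is ~70 GB for the weights, plus some headroom for the KV cache and activations. A minimal sketch below; the 15% overhead fraction is just an illustrative assumption, not a measured number.)

```python
# Rough VRAM estimate for an 8-bit quantized LLM.
# overhead is an assumed fraction for KV cache / activations / runtime buffers.

def estimate_vram_gb(n_params: float, bits_per_param: int, overhead: float = 0.15) -> float:
    """Return an approximate VRAM requirement in GB (1 GB = 1e9 bytes)."""
    weight_bytes = n_params * bits_per_param / 8
    return weight_bytes * (1 + overhead) / 1e9

if __name__ == "__main__":
    # Llama 2 70B at 8 bits per parameter -> roughly 80 GB
    print(f"~{estimate_vram_gb(70e9, 8):.0f} GB")
```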

[–] [email protected] 4 points 9 months ago (1 children)

Get fucked for what? That I’m from the EU? That I send donations to the country at war? That I have a gf?

[–] [email protected] 11 points 9 months ago (3 children)

Not everyone! I’m one of those. From time to time some American reminds me, but my brain filters it out as completely useless information and I forget it.