MalReynolds

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

Same, easy as... Plus you know who's leaking your email to spammers and can just turn off the tap, it's glorious. Combined with temp mail (a browser add-on) for truly disposable addresses, I get no spam. They'll probably find their way through in time, but for now, golden.

[–] [email protected] 1 points 1 year ago

Look into the temp mail browser add-on.

[–] [email protected] 2 points 1 year ago

Fair cop, Godspeed!

[–] [email protected] 2 points 1 year ago (2 children)

I was thinking more of training the base models, LLaMA (2), and more topically GPT-4 etc. You're doing LoRA or augmenting with a local corpus of documents, no?

[–] [email protected] 3 points 1 year ago (4 children)

Akshually, while training models requires (at the moment) massive parallelization and consequently stacks of A100s, inference can be distributed pretty well (see Petals, for example). A pirate 'ChatGPT' network of people sharing consumer graphics cards could probably indeed work if the data were sourced. It bears thinking about. It really does.
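To make the distributed-inference idea concrete, here's a toy sketch of the core trick Petals uses: pipeline parallelism, where each volunteer node holds a contiguous slice of the model's layers and activations get relayed node to node. Everything here is made up for illustration (the `Node`/`shard` names and the affine "layers" standing in for transformer blocks are not the real Petals API, which lives in the `petals` client library):

```python
# Toy pipeline-parallel inference: each "volunteer" node holds a slice
# of the model's layers; activations are passed node to node. All names
# are illustrative, not the real Petals API.

class Node:
    """A volunteer holding a contiguous slice of the model's layers."""
    def __init__(self, layers):
        self.layers = layers  # each layer: activation -> activation

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

def shard(layers, n_nodes):
    """Split the layer list into n_nodes contiguous slices."""
    k, r = divmod(len(layers), n_nodes)
    nodes, i = [], 0
    for j in range(n_nodes):
        size = k + (1 if j < r else 0)
        nodes.append(Node(layers[i:i + size]))
        i += size
    return nodes

def distributed_forward(nodes, x):
    """Run inference by relaying activations across the swarm."""
    for node in nodes:
        x = node.forward(x)
    return x

# "Model": 8 affine layers standing in for transformer blocks.
layers = [lambda x, a=a: 2 * x + a for a in range(8)]
swarm = shard(layers, n_nodes=3)

# Same result whether run on one box or across the swarm.
local = 1.0
for layer in layers:
    local = layer(local)
assert distributed_forward(swarm, 1.0) == local
```

The point is that each node only needs memory for its own slice, which is why a swarm of consumer cards can serve a model none of them could hold alone; the cost is that every token's activations make a round trip across the network.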
