Sure, some of those links are different. But you have to admit, even if you are interested in this story, 5 times is a bit excessive.
Since you asked, here are the other four times it was posted.
- https://lemmy.world/post/17906460
- https://lemmy.world/post/17913261
- https://lemmy.world/post/17930528
- https://lemmy.world/post/17949956
There was a fifth one, but that one has since been removed.
How many times is this going to be posted? I've seen this several times now over the past few days.
Right now, sure. But remember that 10 years ago, neural net generated images were putting eyes everywhere, and wouldn't create anything close to a believable photo. I wouldn't be surprised if 10 years from now, videos will have made a similar leap.
On the other hand, I do hope that between now and then, some laws will have been put in place that only allow training on ethically sourced datasets - which will slow down progress, but is fairer to the creators.
In a similar vein, I tried out a Garmin smartwatch for a while, and at some point it warned me I was getting stressed.
I wasn't, though - I was excited about a project I had been working on coming together. But apparently the watch could only interpret it as a negative mood.
For that, and for other privacy and usability reasons, I decided to return it and go back to my non-heart-rate-tracking Pebble Time Steel.
I suppose that's one way to generalize an entire country.
Oh, sure. For the 405B model it's absolutely infeasible to host it yourself. But for the smaller models (70B and 8B), it can work.
I was mostly replying to the part where they claimed Meta can take it away from you at any point - which is simply not true.
Oof - not on my 12GB 3060 it doesn't :/ Even at 48k context and Q4_K quantization, ollama is doing a lot of offloading to the CPU. What kind of hardware are you running it on?
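For reference, this is roughly how I'm loading it, as a sketch with the ollama Python client - the exact model tag is just an example, use whatever shows up in `ollama list`:

```python
# Rough sketch of loading a model with a large context via the ollama Python client.
# The model tag below is an example - substitute whichever Q4_K build you pulled.
import ollama

response = ollama.chat(
    model="llama3.1:70b",                 # example tag, check `ollama list`
    messages=[{"role": "user", "content": "Say hi"}],
    options={"num_ctx": 48 * 1024},       # the 48k context that eats the VRAM
)
print(response["message"]["content"])
```

Running `ollama ps` while it's loaded should show how the model got split between CPU and GPU.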
WAKE UP!
It works offline. When you use it with ollama, you don't have to register or agree to anything.
Once you have downloaded it, it will keep on working; Meta can't shut it down.
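A minimal sketch of what that looks like in practice with the ollama Python client - the host is just ollama's default local address, and the model tag is an example of something you've already pulled:

```python
# Minimal sketch: this only talks to the ollama daemon on your own machine,
# so it keeps working with the network unplugged once the weights are pulled.
from ollama import Client

client = Client(host="http://localhost:11434")  # local daemon, no external service
reply = client.chat(
    model="llama3.1:8b",  # example tag - any model you've already pulled
    messages=[{"role": "user", "content": "Still working offline?"}],
)
print(reply["message"]["content"])
```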
Technically correct (tm)
Before you get your hopes up: Anyone can download it, but very few will be able to actually run it.
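A rough back-of-envelope in Python, counting only the weights at about half a byte per parameter for a 4-bit quant (KV cache and runtime overhead come on top, so treat these as lower bounds):

```python
# Rough lower-bound memory estimate: parameter count x bytes per parameter.
# A 4-bit quant is ~0.5 bytes/param; KV cache and overhead are not included.
def min_weight_gb(params_billion: float, bytes_per_param: float = 0.5) -> float:
    return params_billion * 1e9 * bytes_per_param / 1024**3

for size in (8, 70, 405):
    print(f"{size}B @ 4-bit: ~{min_weight_gb(size):.0f} GB just for the weights")
```

Even at 4-bit, the 405B model needs on the order of 190 GB just for the weights.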
Ah, that's a wonderful use case. One of my favourite models has a storytelling LoRA applied to it - maybe that would be useful to you too?
At any rate, if you'd end up publishing your model, I'd love to hear about it.
Such a lovely post, a nice distraction from all the doom scrolling articles! I wish we had more of this.
...
I should write a happy news moderator bot for my instance.