[–] [email protected] 24 points 3 months ago* (last edited 3 months ago)

Such a lovely post, a nice distraction from all the doom scrolling articles! I wish we had more of this.

...

I should write a happy news moderator bot for my instance.

[–] [email protected] 3 points 3 months ago

Sure, some of those links are different. But you have to admit, even if you are interested in this story, 5 times is a bit excessive.

[–] [email protected] 5 points 3 months ago* (last edited 3 months ago) (2 children)

Since you asked, here are the other four times it was posted.

There was a fifth one, but that one has since been removed.

[–] [email protected] -3 points 3 months ago (4 children)

How many times is this going to be posted? I've seen this several times now over the past few days.

[–] [email protected] 4 points 3 months ago (5 children)

Right now, sure. But remember that 10 years ago, neural-net-generated images were putting eyes everywhere, and wouldn't create anything close to a believable photo. I wouldn't be surprised if 10 years from now, videos will have made a similar leap.

On the other hand, I do hope that between now and then, some laws will have been put in place to only train on ethically sourced datasets - which will slow down progress, but is more fair to the creators.

[–] [email protected] 2 points 3 months ago

In a similar vein, I tried out a Garmin smartwatch for a while, and at some point it warned me I was getting stressed.

I wasn't though - I was excited about a project that I had been working on coming together. But apparently the watch could only think in negative moods.

For that, and other privacy and usability reasons, I decided to return it and go back to my non-heart-rate-tracking Pebble Time Steel.

[–] [email protected] -2 points 3 months ago (1 children)

I suppose that's one way to generalize an entire country.

[–] [email protected] 9 points 3 months ago

Oh, sure. For the 405B model it's absolutely infeasible to host it yourself. But for the smaller models (70B and 8B), it can work.

I was mostly replying to the part where they claimed Meta can take it away from you at any point - which is simply not true.

[–] [email protected] 1 points 3 months ago (1 children)

Oof - not on my 12 GB 3060 it doesn't :/ Even at 48k context and Q4_K quantization, ollama is doing a lot of offloading to the CPU. What kind of hardware are you running it on?

[–] [email protected] 33 points 3 months ago* (last edited 3 months ago) (2 children)

WAKE UP!

It works offline. When you use it with ollama, you don't have to register or agree to anything.

Once you have downloaded it, it will keep on working; Meta can't shut it down.
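To illustrate (a sketch with the real ollama CLI - the model tag `llama3.1:8b` is just an example, use whichever size you actually pulled):

```shell
# One-time download while online; the weights are cached locally
ollama pull llama3.1:8b

# After that, this works with no network, no account, no agreement:
ollama run llama3.1:8b "Explain what a quantized model is."
```

Once the pull finishes, the weights sit on your own disk, so nothing upstream can revoke them.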

[–] [email protected] 96 points 3 months ago (35 children)

Technically correct (tm)

Before you get your hopes up: Anyone can download it, but very few will be able to actually run it.

[–] [email protected] 1 points 3 months ago (3 children)

Ah, that's a wonderful use case. One of my favourite models has a storytelling LoRA applied to it, maybe that would be useful to you too?

At any rate, if you'd end up publishing your model, I'd love to hear about it.
