catty

joined 2 days ago
[–] [email protected] 1 points 1 day ago* (last edited 1 day ago) (1 children)

But won't this end up as a mish-mash of different Docker containers and projects, creating an installation, dependency, and upgrade nightmare?

[–] [email protected] 2 points 1 day ago (1 children)

But its website is in Chinese. Also, where's the GitHub repo?

[–] [email protected] 3 points 1 day ago (2 children)

This looks interesting - do you have any experience with it? How reliable / efficient is it?

[–] [email protected] 1 points 1 day ago* (last edited 1 day ago)

Try the beta on the github repo, and use a smaller model!

[–] [email protected] 4 points 2 days ago

I'm getting very near real-time performance on my old laptop, maybe a 1-2 s delay whilst it creates the response.

[–] [email protected] 1 points 2 days ago* (last edited 2 days ago)

I agree. It looks nice, explains the models fairly well, hides the model settings away nicely, and even recommends some low-requirement models to get started with. I like the concept of plugins, but I haven't yet found a way to, for example, run Python code it creates and display the output in the window.
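Just to sketch the kind of thing I'm after (this is my own throwaway helper, not anything built into Jan as far as I know): grab the fenced code block out of the model's reply, run it in a separate Python process, and show whatever it prints.

```python
# Throwaway sketch: pull the first fenced code block out of a model reply and
# run it with the local interpreter, capturing whatever it prints.
import re
import subprocess
import sys

FENCE = "`" * 3  # written this way so the backticks don't clash with this post's own code block

def run_generated_code(reply: str, timeout: int = 30) -> str:
    match = re.search(FENCE + r"(?:python)?\n(.*?)" + FENCE, reply, re.DOTALL)
    if not match:
        return "(no code block found in the reply)"
    result = subprocess.run(
        [sys.executable, "-c", match.group(1)],  # run in a separate process, not inside the app
        capture_output=True, text=True, timeout=timeout,
    )
    return result.stdout if result.returncode == 0 else result.stderr

# quick test with a fake model reply
reply = "Sure, here you go:\n" + FENCE + "python\nprint(2 + 2)\n" + FENCE
print(run_generated_code(reply))   # -> 4
```

Obviously running model-generated code blind is risky, so a timeout (and ideally a proper sandbox) seems sensible.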

[–] [email protected] 10 points 2 days ago* (last edited 1 day ago) (4 children)

I've discovered jan.ai, which is far faster than GPT4All and visually a little nicer.

EDIT: After using it for an hour or so, it seems to crash all the time. I keep having to reset it, and at the moment it freezes for no apparent reason.

 

I was looking back at some old Lemmy posts and came across GPT4All. I didn't get much sleep last night because it's awesome, even on my old (10-year-old) laptop with a Compute 5.0 Nvidia card.

Still, I'm after more. I'd like image creation that I can view in the conversation, and if it generates Python code, the ability to run it (I'm using Debian and have a default Python env set up). Local file analysis would also be useful. CUDA Compute 5.0 / Vulkan compatibility is needed too, with the option to use some of the smaller models (1-3B, for example). A local API would also be nice for my own Python experiments.
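For the API part, I believe both GPT4All and Jan can expose an OpenAI-compatible server locally, so something along these lines is what I have in mind. This is only a sketch: the port, the model name, and the server being switched on in the app's settings are all assumptions on my part.

```python
# Rough sketch, assuming the app's local OpenAI-compatible server is enabled.
# The base URL and model name are placeholders; check the app's server settings.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4891/v1",  # GPT4All's default port, I think; Jan uses its own
    api_key="not-needed",                 # local servers generally ignore the key
)

resp = client.chat.completions.create(
    model="Llama 3.2 1B Instruct",        # whatever small model is loaded locally
    messages=[{"role": "user", "content": "Give me a one-line Python hello world."}],
)
print(resp.choices[0].message.content)
```

The nice part of an OpenAI-compatible endpoint is that the same script should work against whichever app exposes it; only the base URL and model name need to change.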

Is there anything that ticks those boxes, even if I have to hop between apps or models for some of the features? I'd prefer a desktop client application over a Docker container running in the background.

[–] [email protected] 1 points 2 days ago* (last edited 2 days ago)

It gets the clamps. Also, loads more gold, so gold prices plummet.