Free AI chat bots (lemmy.world)
submitted 4 days ago* (last edited 4 days ago) by [email protected] to c/[email protected]
 

I am seeking recommendations for alternative free and privacy-focused AI chatbots that can be accessed via a web browser or utilized as offline software with a user-friendly installation process, not requiring extensive coding knowledge.

Specifically, I am interested in exploring options similar to DuckDuckGo or Venice, which prioritize user data protection and anonymity. Additionally, I would appreciate suggestions for offline AI chatbots that can be easily installed on various operating systems without requiring technical expertise.

It is notable that while there are numerous resources available for evaluating proxies and privacy-focused software, there appears to be a lack of comprehensive lists or reviews specifically focused on free AI chatbots that prioritize user privacy. If such resources exist, I would appreciate any guidance or recommendations.

top 15 comments
[–] [email protected] 14 points 4 days ago

For me, I decided to reduce AI usage, as it starts to hurt my real intelligence :)

[–] [email protected] 2 points 2 days ago

I don't use it, but I have a self-hosted Llama instance that works alright.

[–] [email protected] 8 points 4 days ago (1 children)

Probably a bit too technical, but I wrote (and keep updating) https://fabien.benetou.fr/Content/SelfHostingArtificialIntelligence which does include some of those mentioned here by others, e.g. GPT4All, LM Studio, tlm, LocalAI, Ollama, etc.

If I can somehow clarify this to help you, please do ask.
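For readers wondering what the Ollama route from that list looks like in practice, here is a minimal Python sketch; the port is Ollama's documented default and the model name is a placeholder, not a recommendation from the comment above.

```python
# Minimal sketch: query a local Ollama server over its HTTP API.
# Assumes Ollama is running on its default port (11434) and that a model
# such as "llama3.2" has already been pulled with `ollama pull llama3.2`;
# both names are placeholders.
import requests

def ask_ollama(prompt: str, model: str = "llama3.2") -> str:
    resp = requests.post(
        "http://localhost:11434/api/chat",
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,  # return one JSON object instead of a token stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]

if __name__ == "__main__":
    print(ask_ollama("In one sentence, why run an LLM locally?"))
```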

[–] [email protected] 2 points 3 days ago (2 children)

LM Studio is what I use; it's extremely simple and runs well.

[–] [email protected] 1 points 3 days ago* (last edited 3 days ago)

~~I'm trying to switch to this from Ollama after seeing the benchmarks, so much faster. But it has given me nothing but issues with CUDA incompatibility where Ollama runs smooth as butter. Hopefully I get some feedback on my repo discussion. Same docker setup as working Ollama, but Ollama has a lot more detailed docs.~~

Ignore that, thought you said LMDeploy.

[–] [email protected] 1 points 3 days ago

Indeed, very convenient. Just noticed they also now provide a JS/TS way to access models (https://github.com/lmstudio-ai/lmstudio.js), so I might try that soon, especially if they conveniently support RAG.
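For anyone who'd rather not pull in the SDK, LM Studio's built-in local server speaks an OpenAI-compatible HTTP API, so any HTTP client works. A minimal Python sketch, assuming the server has been started from the LM Studio UI on its default port 1234 with a model already loaded (the model identifier below is a placeholder):

```python
# Minimal sketch: call LM Studio's local OpenAI-compatible server from Python.
# Assumes the server is running on its default port (1234) with a model loaded;
# the "model" value is a placeholder, check LM Studio's server page for the
# exact identifier.
import requests

def ask_lmstudio(prompt: str, model: str = "local-model") -> str:
    resp = requests.post(
        "http://localhost:1234/v1/chat/completions",
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "temperature": 0.7,
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_lmstudio("Summarise why local LLMs help with privacy."))
```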

[–] [email protected] 3 points 3 days ago

Get a llamafile.
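For context, a llamafile is a single executable that bundles the model weights with a llama.cpp server, so after downloading one and marking it executable you can chat with it over HTTP. A minimal Python sketch, where the port and endpoint are assumptions taken from llamafile's README:

```python
# Minimal sketch: once a llamafile has been marked executable and started
# (e.g. `chmod +x foo.llamafile && ./foo.llamafile`), it runs a local
# llama.cpp server. The port (8080) and the OpenAI-style chat endpoint are
# assumptions based on llamafile's README, so double-check the README for
# the file you downloaded.
import requests

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "LLaMA_CPP",  # placeholder name used in llamafile's examples
        "messages": [{"role": "user", "content": "Hello from a llamafile"}],
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```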

[–] [email protected] 4 points 4 days ago (1 children)

> I would appreciate suggestions for offline AI chatbots that can be easily installed on various operating systems without requiring technical expertise.

The path you're on will likely require you to upskill at some point.

Good luck!

But there are a decent number of options for a non-technical person to run a local LLM; unless you've got a good gaming PC, i.e. high-end graphics with plenty of RAM, it isn't that usable.

A CPU/RAM setup is too slow for chatbot functionality... Maybe Apple Silicon could work, but I'm not sure; it does have better memory bandwidth than traditional PC architectures.

[–] [email protected] 2 points 4 days ago

I can confirm that Apple Silicon works for running the largest Llama models. 64 GB of RAM. Dunno if it would work with less, as I haven't tried. It's the M1 Max chip, too. Dunno how it'd do on the vanilla chips.

[–] [email protected] 4 points 4 days ago

Look up LM Studio. It's free software that lets you easily install and use local LLMs. Note that you need a good graphics card and a lot of RAM for it to be useful.

[–] [email protected] 2 points 4 days ago

I've been very happy with GPT4All. It's open source and privacy-focused, since it runs on your own hardware. It provides a clean GUI for downloading various LLMs to chat with.
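GPT4All also ships Python bindings alongside the GUI, in case you ever want to script it. A minimal sketch, where the model filename is just an example from GPT4All's catalogue and everything runs locally:

```python
# Minimal sketch of GPT4All without the GUI, via its Python bindings
# (`pip install gpt4all`). The model filename is an example/placeholder;
# it gets downloaded on first use if not already present locally.
from gpt4all import GPT4All

model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")  # example model name
with model.chat_session():
    reply = model.generate("Name one benefit of running an LLM locally.", max_tokens=100)
    print(reply)
```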

[–] [email protected] 2 points 4 days ago

Take a look at [email protected] ... they have lots of info in the sidebar and it's moderately active so people will probably answer questions.