Diabolo96

joined 1 year ago
[–] [email protected] 15 points 6 months ago (2 children)

It's AI and your voice won't be used for training if you use a local model.

Use Whisper STT. It runs on your computer, so nothing leaves your machine. You can pick the model size based on how powerful your computer is; the bigger the model, the better it transcribes.
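For example, a minimal sketch with the open-source `whisper` Python package (the model name and audio file are placeholders; pick a size your hardware can handle):

```python
# pip install openai-whisper -- runs fully offline after the one-time model download
# (needs ffmpeg on PATH for audio decoding)
import whisper

# Sizes: tiny, base, small, medium, large -- bigger = better transcription, slower
model = whisper.load_model("base")
result = model.transcribe("recording.wav")  # placeholder audio file
print(result["text"])
```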

[–] [email protected] 2 points 6 months ago* (last edited 6 months ago) (2 children)

Yeah, it's not a potato, but it's not that powerful either. Nonetheless, it should run 7B/8B/9B and maybe 13B models easily.

running them in Python with Huggingface's Transformers library (from local models

That's your problem right here. Python is great for building LLMs but horrible at running them. With a computer as weak as yours, every bit of performance counts.

Just try Ollama or llama.cpp. Their GitHub repos are also a goldmine for other projects you could try.

llama.cpp can offload part of the model to the GPU for way faster inference.
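If you want to drive llama.cpp from Python anyway, here's a hedged sketch using the llama-cpp-python bindings (just one convenient way to do it, not the only one); the model path and layer count are placeholders you'd tune to your VRAM:

```python
# pip install llama-cpp-python (built with GPU support enabled for offloading)
from llama_cpp import Llama

llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # placeholder GGUF file
    n_gpu_layers=20,  # layers offloaded to the GPU; 0 = pure CPU
)

out = llm("Q: Why is local inference nice? A:", max_tokens=64)
print(out["choices"][0]["text"])
```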

Piper is a pretty decent, very lightweight TTS engine that can run directly on your CPU if you want to add TTS capabilities to your setup.

Good luck and happy tinkering!

[–] [email protected] 11 points 6 months ago* (last edited 6 months ago) (2 children)

Teach kids programming by making games with them. Find a simple-to-make, "one-tap, easy to control but hard to master" game like Flappy Bird on the Play Store, then try remaking it with the kid.
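To make that concrete, here's a minimal Flappy-Bird-style skeleton in Python with pygame: just gravity plus a flap on tap, with pipes, collisions, and scoring left for the kid to add:

```python
# pip install pygame
import pygame

pygame.init()
screen = pygame.display.set_mode((400, 600))
clock = pygame.time.Clock()

bird_y = 300.0   # bird's vertical position
velocity = 0.0   # current vertical speed
GRAVITY = 0.5    # pulls the bird down every frame
FLAP = -8.0      # one tap pushes the bird up

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type in (pygame.KEYDOWN, pygame.MOUSEBUTTONDOWN):
            velocity = FLAP  # one tap = one flap

    velocity += GRAVITY
    bird_y += velocity

    screen.fill((135, 206, 235))  # sky
    pygame.draw.circle(screen, (255, 220, 0), (100, int(bird_y)), 15)  # the "bird"
    pygame.display.flip()
    clock.tick(60)

pygame.quit()
```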

[–] [email protected] 1 points 6 months ago* (last edited 6 months ago)

One last thing that power users would certainly adore (I certainly do):

When you long-press an app, a menu pops up with different icons: remove the shortcut, widgets, uninstall the app, open the app info in the phone settings (a must-have), open in the Play Store, edit the icon.

The "short" and "buscar" (search) buttons are app-specific and are just shortcuts to the app's activities (I think that's the right word for it). A camera app would probably have "take photo" and "take video" instead, etc.

[–] [email protected] 1 points 6 months ago (4 children)

Specs? Try Mistral with llama.cpp.

[–] [email protected] 2 points 6 months ago (6 children)

That shouldn't happen with an 8B model. Even on CPU, it's supposed to be decently fast. There's definitely something wrong here.

[–] [email protected] 3 points 6 months ago (8 children)

Sadly, I can't really help you much. I have a potato PC, and the biggest model I've run on it was Microsoft's Phi-2 using the Candle framework. I used to tinker with llama.cpp on Colab, but it seems it doesn't handle Llama 3 yet. Ollama says it does, but I've never tried it. As for the speed, it's kind of expected for a 70B model to be really slow on a CPU. How slow is too slow? I don't really know...

You can always try the 8B model. People say it's really great and has even replaced the 70B models they'd been using.
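If you want to poke at it from Python, a minimal sketch with the `ollama` client package (assuming Ollama is installed and you've pulled the model with `ollama pull llama3`, which gets the 8B variant by default):

```python
# pip install ollama -- talks to a locally running Ollama server
import ollama

response = ollama.chat(
    model="llama3",  # the 8B variant by default
    messages=[{"role": "user", "content": "Summarize why local LLMs matter."}],
)
print(response["message"]["content"])
```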

[–] [email protected] 4 points 6 months ago (2 children)

Looks great, but it doesn't look like it fits my needs.

Your search box is close to what I described. Here's a picture:

I also like having my less frequently used apps organized inside folders on the home screen for quick access. Like so:

[–] [email protected] 18 points 6 months ago* (last edited 6 months ago) (14 children)

Run 70B Llama 3 on one and you have a 100% local, GPT-4-level home assistant. Hook it up with Coqui.ai's XTTSv2 for mind-blowingly natural speech (100% local too) that can imitate anyone's voice. Now you've got yourself Jarvis from Iron Man.
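For the voice part, a hedged sketch using Coqui's TTS Python package to drive XTTSv2 (the reference clip and output paths are placeholders; a few seconds of clean audio of the voice to imitate is enough):

```python
# pip install TTS -- downloads the XTTSv2 weights on first run, then works offline
from TTS.api import TTS

tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")
tts.tts_to_file(
    text="Good morning. All systems are online.",
    speaker_wav="my_voice.wav",  # placeholder reference clip of the voice to clone
    language="en",
    file_path="jarvis_reply.wav",
)
```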

Edit: I thought they were some kind of beast machines with 192 GB of RAM and such. They're just regular mid-to-low-tier PCs.

[–] [email protected] 19 points 6 months ago (5 children)

A few pictures would be great. Will it be free/donationware or paid? Open source or closed source?

I've been on an old version of Nova Launcher for several years now due to a simple feature that is somehow unavailable anywhere else: when you open the search widget, it also shows a small three-tab drawer with the last 12 recently installed/updated, recently opened, and frequently opened apps. This makes handling apps far easier.

[–] [email protected] 3 points 6 months ago* (last edited 6 months ago) (1 children)

Yeah, I saw a documentary about Huawei being the biggest pork supplier in China, among other things. I meant technologically speaking. Huawei was really big here, so I followed what happened to them after the US cut off their supply of ARM chips. Chinese chips weren't the best at the time, so even with the big internal market, they had to put their technology sector into hibernation until they started making good-enough chips, and it paid off. Their tech is already on par with the Snapdragon 8 Gen 1. The US sanctioning China, forcing it to become a real competitor in the chip-making industry, ended up being a good thing for the freedom of the market.

[–] [email protected] 0 points 6 months ago (5 children)

Not even a competitor, but a company that almost died a few years ago because it was blocked from buying chips for its products.
