[–] [email protected] -5 points 10 months ago* (last edited 10 months ago) (5 children)

You can run your own open source large language models at home about as well as you can run Bethesda’s Starfield on a same spec’d PC

...

Yes, you can download an executable of a chatbot lol.

That's different from running something remotely, the way even OpenAI does.

The more it has to reference, the more the system scales up. Not just storage, but everything else.

Like, in your example of video games, it would be more like stripping a PS5 game of all its assets, then playing it on an NES at one frame every five minutes.

You're not only wildly overestimating chatbots' abilities, you're doing it while drastically underestimating the resources needed.

Edit:

I think you literally don't know what people are talking about.

Do you think people are talking about AI image generators?

No one else is...

[–] [email protected] 2 points 10 months ago* (last edited 10 months ago) (2 children)

I am talking about generative AI; be it text or image, both have a challenge with copyrighted material.

"executable of a chatbot" lol, ain't you cute

"example of video games"

Are you referring to my joke?

I am far from overestimating capacity. Starfield runs mediocrely on a modern gaming system compared to other games; the Vicuna 13B LLM likewise runs mediocrely on the same system compared with GPT-3.5. To date there is no local model I would trust for professional use, and ChatGPT 3.5 doesn't hit that level either.
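For what it's worth, "running your own model" isn't mystical. Here is a minimal sketch of local inference with the Hugging Face transformers library; the Vicuna checkpoint name is only an example, and it assumes PyTorch, accelerate, and enough memory for a 13B model:

    # Minimal local-inference sketch using Hugging Face transformers.
    # The checkpoint name is just an example of a Vicuna-style model;
    # point it at whatever model you actually have downloaded. Assumes
    # PyTorch and accelerate are installed and there is enough VRAM/RAM
    # for a 13B-parameter model.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "lmsys/vicuna-13b-v1.5"  # example checkpoint

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        device_map="auto",   # spread layers across GPU/CPU automatically
        torch_dtype="auto",  # keep the checkpoint's native precision
    )

    prompt = "Summarise the trade-offs of running a 13B language model locally."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=200)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

The point isn't that this matches GPT-3.5; it's that local inference is an ordinary, well-supported workflow.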

But it remains a very interesting, rapidly evolving technology that I hope receives as much open-source support as possible going forward.

As for "I think you literally don't know what people are talking about": I hate to break it to you, but you're embarrassing yourself.

I presume you must believe the following Lemmy community and resources were typed up by a group of children; either that, or you're just naive.

https://lemmy.world/c/fosai

https://www.fosai.xyz/

https://github.com/huggingface/transformers

https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard

https://huggingface.co/microsoft/phi-2 & https://www.microsoft.com/en-us/research/blog/phi-2-the-surprising-power-of-small-language-models/

https://www.theguardian.com/technology/2023/may/05/google-engineer-open-source-technology-ai-openai-chatgpt

[–] [email protected] -4 points 10 months ago (1 children)

Or...

I could just block some of the people who are really, really into chatbots but don't understand them in the slightest.

I think that might be more productive than reading a bunch of stuff from other people who don't understand it.

[–] [email protected] 4 points 10 months ago

HOT TAKE: Hugging Face is run by people who are really into chatbots but don't understand them in the slightest.

I have been patient and friendly so far, but your tone has been nothing but dismissive.

You cannot have a nuanced conversation about AI while excluding the entire open-source field within it. That's simply unreasonable, and I implore you to ask others, because I know you won't take my word for it.

Farewell
