this post was submitted on 21 Nov 2023
44 points (92.3% liked)

Technology


ChatGPT is down, the new CEO is threatening to quit, and the majority of employees are doing the same... so I decided to look up some alternatives

The name Anthropic came up a few times, but I'm not familiar with them. I've seen Claude mentioned in the past.

Anthropic PBC is an American artificial intelligence (AI) startup company, founded by former members of OpenAI.[3][4] Anthropic develops general AI systems and large language models.[5] It is a public-benefit corporation, and has been connected to the effective altruism movement.

As of July 2023, Anthropic had raised US$1.5 billion in funding. In September, Amazon announced an investment of up to US$4 billion, followed by a $2 billion commitment from Google the next month.

https://en.wikipedia.org/wiki/Anthropic

So are they more ethical / committed to AI safety, or more of the same? Does the product compare well to ChatGPT?

[–] [email protected] 8 points 1 year ago (1 children)

It refuses more from what I've seen. Personally, I don't think it's a good idea to become dependent on any commercially hosted model. Open models are a bit behind, but they're getting there.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

The problem with open models is that you basically have to run them on your own hardware, and that hardware is not only expensive, it's also nearly impossible to obtain.

H100 GPUs are sold by scalpers for $50k with no warranty, and worse, that's an obsolete model. H200 GPUs just can't be purchased at all unless you're buying enough to fill a datacentre.

[–] [email protected] 3 points 1 year ago (1 children)

You can run ollama on a regular laptop.
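
For anyone curious, here's a minimal sketch of what that looks like from Python. It assumes Ollama is installed and running locally (it listens on port 11434 by default) and that you've already pulled a model, e.g. with `ollama pull mistral`; the model name here is just an example:

```python
# Minimal sketch: query a locally running Ollama server from Python.
# Assumes Ollama is running (default port 11434) and a model has
# already been pulled, e.g. `ollama pull mistral`.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local endpoint


def ask(prompt: str, model: str = "mistral") -> str:
    payload = json.dumps({
        "model": model,    # any model you've pulled locally
        "prompt": prompt,
        "stream": False,   # return one JSON object instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(ask("Explain what a public-benefit corporation is in one sentence."))
```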

[–] [email protected] 1 points 1 year ago

It's also insufferably slow, and the answers are ... well ... not exactly up to GPT-4 level, to say the least.