this post was submitted on 08 Dec 2024
459 points (94.6% liked)

The GPT Era Is Already Ending (www.theatlantic.com)
submitted 2 weeks ago* (last edited 2 weeks ago) by [email protected] to c/[email protected]
 

If this is the way to superintelligence, it remains a bizarre one. “This is back to a million monkeys typing for a million years generating the works of Shakespeare,” Emily Bender told me. But OpenAI’s technology effectively crunches those years down to seconds. A company blog boasts that an o1 model scored better than most humans on a recent coding test that allowed participants to submit 50 possible solutions to each problem—but only when o1 was allowed 10,000 submissions instead. No human could come up with that many possibilities in a reasonable length of time, which is exactly the point. To OpenAI, unlimited time and resources are an advantage that its hardware-grounded models have over biology. Not even two weeks after the launch of the o1 preview, the start-up presented plans to build data centers that would each require the power generated by approximately five large nuclear reactors, enough for almost 3 million homes.

https://archive.is/xUJMG

[–] [email protected] 1 points 1 week ago* (last edited 1 week ago)

I'm glad you appreciate it; it was as much an excuse for me to unload that rant as anything else :)

But we actually get into trouble when our models of reality are poor. Our nature isn't self-destructive at all: look at how many times we've been at the brink of nuclear annihilation and someone said, "actually, don't", sometimes in defiance of entrenched power structures that punished them for it.

We've had that world-ending button for most of the last century, and we've never used it. If we really were self-destructive on an instinctual level, we never would've evolved.

I think the real problem is the power structures that dominate us, and how we allow them to. They are aberrant, like tumours. They have an endless growth strategy, which, just as in malignant tumours, tends to kill the host. If they're destroyed, the host can go on to live a complete life.

And things can change fast; these structures are tenacious but fragile. Look at the UHC assassination: claims immediately started getting approved. After decades of entrenched screwing people over, they flipped on their backs the moment they were threatened. How many other seemingly intractable problems could be cut out tomorrow if we applied the right kind of pressure?