this post was submitted on 01 Jun 2024
105 points (88.9% liked)

Technology

59174 readers
2122 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] [email protected] 19 points 5 months ago* (last edited 5 months ago) (3 children)

So, first, that's just a reduction. But set that aside, and let's talk big picture here.

My GPU can use something like 400 watts.

A human is about 100 watts constant power consumption.

So even setting aside all other costs of a human and only paying attention to direct energy costs, if an LLM running on my GPU can do something in under a quarter the time I can, then it's more energy-efficient.

I won't say that that's true for all things, but there are definitely things that Stable Diffusion or the like can do today in a whole lot less than a quarter the time it would take me.
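The break-even arithmetic in the comment above can be sketched in a few lines. The wattage figures (~400 W GPU under load, ~100 W human) are the commenter's assumed numbers, not measurements:

```python
# Back-of-the-envelope energy comparison, using the comment's assumed figures:
# a GPU drawing ~400 W under load vs. a human at ~100 W constant draw.
GPU_WATTS = 400
HUMAN_WATTS = 100

def energy_wh(watts: float, hours: float) -> float:
    """Energy in watt-hours for a given power draw and duration."""
    return watts * hours

# A task a human finishes in 1 hour vs. the LLM in a quarter of that time:
human_task = energy_wh(HUMAN_WATTS, 1.0)   # 100 Wh
gpu_task = energy_wh(GPU_WATTS, 0.25)      # 100 Wh -- the break-even point
print(human_task, gpu_task)
```

At exactly a quarter of the human's time the two come out equal, which is why the comment uses "under a quarter the time" as the threshold.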

[–] [email protected] 25 points 5 months ago

That said, the LLM isn't running an array of bonus functions like breathing, wondering why you said that stupid thing to your aunt's cousin 15 years ago, and keeping tabs on ambient noise for possible phone calls from that nice boy who promised to call you back.

[–] [email protected] 8 points 5 months ago

ChatGPT can output an article in a much shorter time than it'd take me to write one, but people would probably like mine more.

[–] [email protected] 4 points 5 months ago (2 children)

The problem is that using those tools, no matter how energy-efficient they are, will add to the total amount of energy humans use, because even if an AI generates an image faster than a human could, the human still needs 100 W constantly.

This doesn't mean that we shouldn't make it more efficient, but let's be honest: more energy-efficient AI just means we'd use even more AI everywhere.
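This counterpoint is also simple arithmetic, reusing the same assumed wattages as above: the human's ~100 W baseline doesn't disappear when the AI takes over the task, so the GPU's draw is additional rather than a substitute:

```python
# Total energy over the same one-hour window, assumed figures as before:
# the human idles (or does something else) at ~100 W while the GPU works.
HUMAN_WATTS = 100
GPU_WATTS = 400

task_hours = 1.0   # time the human would have spent on the task (assumed)
ai_hours = 0.25    # time the AI takes for the same task (assumed)

human_only = HUMAN_WATTS * task_hours                       # 100 Wh
with_ai = HUMAN_WATTS * task_hours + GPU_WATTS * ai_hours   # 200 Wh
print(human_only, with_ai)
```

Under these assumptions, adding the AI doubles the energy spent over that hour, even though the AI alone was "more efficient" at the task.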

[–] [email protected] 4 points 5 months ago* (last edited 5 months ago) (1 children)

Solution: remove human

That's what a lot of news sites are doing: getting rid of large parts of their workforce and having the remaining employees do the same work with LLMs. If you burn the no-longer-needed employees as an alternative heating solution, your energy usage effectively drops to zero.

[–] [email protected] 1 points 5 months ago

True, but it's still not what I meant unless they kill those humans. The employees who did that work before still need the 100 W. They might now be able to do something else (or just be unemployed), but the net energy usage isn't going down.

[–] [email protected] 2 points 5 months ago (1 children)

Speaking of efficiency, though, a human can do other useful tasks while the AI is crunching numbers. But what counts as useful is very subjective.

[–] [email protected] 2 points 5 months ago* (last edited 5 months ago) (1 children)

It depends on what you mean by useful. Most humans are (at least at the moment) more versatile than even the most advanced AI we have. But keep in mind that there are jobs with pretty mundane tasks where you don't really need the intelligence and versatility of a human.

[–] [email protected] 1 points 5 months ago

That's what I meant: keep the tasks separate, and let each do what it does better than the other.