AI is a rounding error in terms of energy use. Creating ChatGPT-4 plus a whole year of worldwide usage comes out to less than 1% of the energy Americans burn driving in one day.
I think I'll go with Yale over 'person on the Internet who ignored the water part.'
https://e360.yale.edu/features/artificial-intelligence-climate-energy-emissions
From that article:
Forgive me for not trusting an article that says that AI will use a petawatt within the next two years. Either the person who wrote it doesn't understand the difference between energy and power or they are very sloppy.
ChatGPT took 50 GWh to train (source).
Americans burn 355 million gallons of gasoline a day (source), and at 33.5 kWh per gallon (source) that comes out to roughly 12,000 GWh per day burnt as gasoline.
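If you want to check my arithmetic, here it is as a quick Python sketch. The only inputs are the figures cited above; everything else is unit conversion, so treat it as a back-of-the-envelope check, not a precise accounting:

```python
# Rough sanity check of the numbers above.
gallons_per_day = 355e6   # US gasoline burned per day, in gallons (cited above)
kwh_per_gallon = 33.5     # energy content of gasoline, kWh per gallon (cited above)
training_gwh = 50         # cited ChatGPT training energy, in GWh

gasoline_gwh_per_day = gallons_per_day * kwh_per_gallon / 1e6  # kWh -> GWh
print(f"Gasoline energy: {gasoline_gwh_per_day:,.0f} GWh per day")                   # ~11,900 GWh
print(f"Training vs one day of driving: {training_gwh / gasoline_gwh_per_day:.2%}")  # ~0.42%
```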
Water usage is more balanced: depending on where the data centres are, it can either be a significant problem or not a problem at all. The water doesn't vanish, it just goes back into the air, but that can be problematic if it is a significant draw on local freshwater sources. E.g. using river water just before it flows into the sea: no issue; drawing down a groundwater aquifer in a desert: big problem.
Training is already over. This has nothing to do with training, so that is irrelevant. This is about how much power is needed as it is used more and more. I think you know that.
Also, I'm not sure why you think that just because cars emit a lot of CO2, other sources that emit a lot of CO2 (but less than cars) are somehow a good thing.
Cool, tell that to all the people who rely on glaciers for their fresh water. That's only a huge percentage of the people in India and China.
But really, what you're telling me is that studies and scientists are wrong and you're right. Cool. Good luck convincing people of that.
This New Yorker article estimates GPT usage at 0.5 GWh a day, which comes out to roughly 0.004% of the energy burnt just as vehicle gasoline per day in the USA (and that figure is for worldwide ChatGPT usage).
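Same arithmetic for the usage figure, in case anyone wants to run it themselves (again, just the cited numbers multiplied together):

```python
# Worldwide GPT usage (New Yorker estimate) as a share of daily US gasoline energy.
gpt_usage_gwh_per_day = 0.5
gasoline_gwh_per_day = 355e6 * 33.5 / 1e6   # ~11,900 GWh per day, as above
print(f"{gpt_usage_gwh_per_day / gasoline_gwh_per_day:.4%}")  # ~0.0042%
```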
I'm not asking you to trust me at all; I've listed my sources. If you disagree with any of them, or with multiplying three numbers together, that's fine.
Yes, if you read my last reply, I answered that directly. Water usage can be a big issue, or it can be a non-issue; it's locale-dependent.
What New Yorker article? You didn't link to one. I, however, linked to Yale University which has a slightly better track record on science than The New Yorker.
And, again, you are arguing that emitting less CO2 is a good thing. It is not.
And if water can be a big issue, why is AI a good thing when it uses it up? You can say "people shouldn't build data centers in those locations," but they are. And the world doesn't run on "shouldn't."
Edit: Now you linked to it. It's paywalled, which means I can't read it and I doubt you did either.
Apologies, I didn't post the link, it's edited now.
If you want to take issue with all energy usage, that's fine; it's a position to take. But it's quite a fringe one, given that harnessing energy is what gives us the quality of life we have. Thankfully electricity is one of the easiest forms of energy to decarbonise, and that is already happening rapidly with solar and wind power; we need to transition more of our energy usage to it in order to reduce fossil fuel usage. My main point is that this railing against AI energy usage is akin to the plastic straw ban: mostly performative, and distracting from the places where truly vast amounts of fossil fuels are burnt that need to be tackled urgently.
I'm 100% behind forcing data centres to use sustainable water sources or other methods of cooling. But that is a far cry from AI energy consumption being a major threat; the vast majority of data centre usage isn't AI anyway, it's serving websites like the one we're talking on right now.
Why can't we analyze AI on its own merits? We don't base our decisions on whether an idea is more or less polluting than automobiles. We can look at what we are getting for what's being put into it.
The big tech companies could scrap their AI tech today and it wouldn't change most people's lives.
Yes, and it's paywalled, so I can't read it. I think you knew that. It could say anything.
Cool, good luck with that happening.
A different subject from water. You keep trying to get away from the water issue. I also think you know why you're doing that.
Also, define threat. It contributes to climate change. It gets rid of potable water. I'd call that a threat.
By the way, there is nowhere in the U.S. where water is not going to be a problem soon.
https://geographical.co.uk/science-environment/us-groundwater-reserves-being-depleted-at-alarming-rate
But hey, we can just move the servers to the ocean, right? Or maybe outer space! It's cold!
OK, you just want to shout, not discuss, so I won't engage any further.
That's a nice cop-out, since nothing I said could remotely be considered shouting, and your New Yorker article in no way supported your point.
Whole article for reference, since you can't access it for whatever reason (it's not very nice assuming bad faith like that, btw).
Your link is just about Google's energy use, still says it uses a vast amount of energy, and says that AI is partially responsible for climate change.
It even quotes that moron Altman saying that there's not enough energy to meet their needs and something new needs to be developed.
I have no idea why you think this supports your point at all.
That was the only bit I was referring to, as the source for the 0.5 GWh per day energy usage for GPT. I agree what Altman says is worthless, or worse, deliberately manipulative to keep the VC money flowing into OpenAI.
I see, so if we ignore the rest of the article entirely, your point is supported. What an odd way of trying to prove a point.
Also, I guess this was a lie:
Although since it was a lie, I'd love you to tell me what you think I was shouting about.
They aren't just taking water no one was using.