Despite its name, the infrastructure used by the “cloud” accounts for more global greenhouse emissions than commercial flights. In 2018, for instance, the 5bn YouTube hits for the viral song Despacito used the same amount of energy it would take to heat 40,000 US homes annually.
Large language models such as ChatGPT are some of the most energy-guzzling technologies of all. Research suggests, for instance, that about 700,000 litres of water could have been used to cool the machines that trained GPT-3 at Microsoft’s data facilities.
Additionally, as these companies aim to reduce their reliance on fossil fuels, they may opt to base their datacentres in regions with cheaper electricity, such as the southern US, potentially exacerbating water consumption issues in drier parts of the world.
Furthermore, while minerals such as lithium and cobalt are most commonly associated with batteries in the motor sector, they are also crucial for the batteries used in datacentres. The extraction process often involves significant water usage and can lead to pollution, undermining water security. The extraction of these minerals is also often linked to human rights violations and poor labour standards. Trying to achieve one climate goal of limiting our dependence on fossil fuels can compromise another goal, of ensuring everyone has a safe and accessible water supply.
Moreover, when significant energy resources are allocated to tech-related endeavours, it can lead to energy shortages for essential needs such as residential power supply. Recent data from the UK shows that the country’s outdated electricity network is holding back affordable housing projects.
In other words, policy needs to be designed not to pick sectors or technologies as “winners”, but to pick the willing by providing support that is conditional on companies moving in the right direction. Making disclosure of environmental practices and impacts a condition for government support could ensure greater transparency and accountability.
This article may as well be trying to argue that we're wasting resources by using "cloud gaming" or even by gaming on your own PC.
Gaming actually provides a real benefit for people, and resources spent on it mostly linearly provide that benefit (yes, some people are addicted, etc., but people need enriching activities and gaming can be such an activity in moderation).
AI doesn't provide much benefit yet, outside of very narrow uses, and its usefulness is mostly predicated on its continued growth in ability. The problem is that pretrained transformers have stopped seeing linear growth from injections of resources, so either the people in charge admit it's all a sham, or they push non-linear amounts of resources at it, hoping to fake growing ability long enough to achieve a new actual breakthrough.
Lol
I don't understand how you can argue that gaming provides a real benefit, but AI doesn't.
If gaming's benefit is entertainment, why not acknowledge that AI can be used for the same purpose?
There are other benefits as well -- LLMs can be useful study tools, and can help with some aspects of coding (e.g., boilerplate/template code, troubleshooting, etc).
If you don't know what they can be used for, that doesn't mean they don't have a use.
LLMs help with coding? In any meaningful way? That's a great giveaway that you've never actually produced and released any real software.
FWIW I do that all the time, it's helpful for me too.
I gave up on ChatGPT for help with coding.
But a local model that's been fine-tuned for coding? Perfection.
It's not that you use the LLM to do everything, but it's excellent for pseudo code. You can quickly get a useful response back for most of the same questions you would search for on Stack Overflow (but tailored to your own code). It's also useful when you're delving into a newer programming language and trying to port over some code, or when you're looking at different ways of achieving the same result.
It's just another tool in your belt, nothing that we should rely on to do everything.