
The US grid can’t handle the drain of growing AI energy use


The growth of generative AI requires huge amounts of water and energy, and the US grid is struggling to cope.

As AI has taken off in the last few years, new data centers are popping up across the country to support its rapid acceleration. These facilities provide the vast computing resources required to train and deploy complex machine learning models and algorithms.

However, data centers also require huge amounts of power to run and maintain, as well as water to cool the servers inside. Concerns are rising about whether the US power grid can generate enough electricity for the growing number of data centers needed. And while AI has been helping to improve sustainability in some fields, if the technology itself isn't sustainable, those gains risk being cancelled out.

What can be done to save energy in the AI space?

Those working in AI production, at both the hardware and software levels, are trying to mitigate these drains on energy and resources, as well as investing in sustainable energy sources.

“If we don’t start thinking about this power problem differently now, we’re never going to see this dream we have,” Dipti Vachani, head of automotive at Arm, told CNBC. The chip company’s low-power processors are being used more and more by huge companies like Google, Microsoft, Oracle and Amazon because they can help to reduce power use by up to 15% in data centers.

Work is also being done to reduce how much energy AI models need in the first place. For example, Nvidia’s latest AI chip, Grace Blackwell, incorporates Arm-based CPUs that can supposedly run generative AI models on 25 times less power than the previous generation.

“Saving every last bit of power is going to be a fundamentally different design than when you’re trying to maximize the performance,” Vachani said.

Nonetheless, concerns remain that these measures won't be enough. After all, one ChatGPT query uses nearly 10 times as much energy as a typical Google search, while generating an AI image can use as much power as fully charging a smartphone.
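To put that comparison in perspective, here is a back-of-envelope sketch using commonly cited per-query estimates (roughly 0.3 Wh per Google search and about 2.9 Wh per ChatGPT query — both are outside assumptions, not figures from this article):

```python
# Illustrative energy comparison using commonly cited estimates,
# not figures from this article.
GOOGLE_SEARCH_WH = 0.3   # assumed: ~0.3 Wh per traditional web search
CHATGPT_QUERY_WH = 2.9   # assumed: ~2.9 Wh per generative AI query

# Ratio of per-query energy use
ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"One AI query ~ {ratio:.1f}x the energy of a web search")

# Scaled to one billion queries per day, converted Wh -> MWh
queries_per_day = 1_000_000_000
daily_mwh = queries_per_day * CHATGPT_QUERY_WH / 1_000_000
print(f"1B queries/day ~ {daily_mwh:,.0f} MWh/day")
```

Even with these rough assumptions, the ratio lands near the "nearly 10 times" figure, and the aggregate load at internet scale is why grid capacity has become the bottleneck.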

The effects are being seen most clearly at the larger companies. Google's latest environmental report showed greenhouse gas emissions rising nearly 50% between 2019 and 2023, in part because of data center energy consumption — despite Google's claim that its data centers are 1.8 times as energy-efficient as a typical data center. Similarly, Microsoft's emissions rose nearly 30% from 2020 to 2024, also due partly to data centers.

Featured image: Unsplash



