AI is an energy hog. It’s a strain on the power grid
Data centers’ high electricity demand is raising concerns that AI will hasten climate change
Google AI takes 10 times as much electricity to process a result as a regular Google search, according to one analysis. Data centers, or large buildings filled with computer servers, already accounted for about 4% of energy use in the U.S. in 2022, with consumption expected to hit 6% by 2026, an increase driven in part by the boom in AI use.
And there is a push to build more in California and across the country. Pacific Gas & Electric revealed in June that it had received 26 applications for new data centers that would use 3.5 gigawatts in total, Times reporter Melody Petersen reported. That amount of power could supply nearly 5 million homes.
Meeting that demand will put pressure on the U.S.’s aging electrical grid, and building the facilities carries a climate cost. Microsoft reported a nearly 30% increase in carbon emissions since 2020 due to the construction of data centers. The International Energy Agency estimates that in 2021, Amazon, Microsoft, Google and Meta collectively used 72 trillion watt-hours of electricity, more than double the amount they used in 2017, and that number is expected to keep rising.
Data centers run about 100,000 servers on average and often need to be located near power plants. There are concerns that these facilities could strain local power supplies and cause rolling blackouts. California is particularly vulnerable: the state ranks 49th out of 50 in its ability to avoid blackouts by keeping more electricity available than homes and businesses need at peak hours. And the strain isn’t limited to the grid; servers produce heat, and data centers use large amounts of water to cool them down.
Given the environmental impacts, there has been increasing resistance from communities over plans to expand data centers. Lawmakers in Washington, Virginia, Georgia and other states have pushed for studies of data centers’ energy use and the effects on grid reliability, ProPublica reported. That’s a good start. We have to make sure this rapidly expanding technology doesn’t undermine climate goals.
Jesse Dodge, a senior research scientist at the Allen Institute for AI, said that before the recent wave of consumer AI, artificial intelligence models were mostly used by researchers for academic purposes and designed with efficiency and sustainability in mind. The large tech companies developing AI for the marketplace instead prioritize bigger models that use more energy.
Tech companies are not required to disclose how they trained their newer models, but revealing that information could help reduce energy demands. Often, when a company updates a model, it trains the new version from scratch instead of building on a model it has already completed. If companies released their models, others could build on that finished work, eliminating some of the repetition seen throughout generative AI and reducing wasted energy and water, Dodge said.
Shaolei Ren, an assistant professor of electrical and computer engineering at UC Riverside, has been conducting research on ways Big Tech can be responsible when creating new AI models, and he believes that these companies are more than capable of operating sustainably.
“Theoretically, they could physically be at carbon zero by routing the workloads around the world. There are data centers all over the world. Since California has solar energy they can put the workload [during the day] here and at night they can move the computing to Europe. They could do this, but they do not because there is a lot of risk,” Ren said.
Big Tech has the resources to curb its energy demands, but so far it has chosen not to. For example, there is no way for users to opt out of Google’s AI-generated search results, forcing them to receive the energy-intensive results whether they want them or not.
That has to change. AI is only going to grow, and the companies behind the boom have a responsibility to ensure their technology doesn’t slow progress in fighting climate change.