A recent study has brought a glimmer of hope to the tech industry, which has been grappling with the prospect of an unprecedented surge in electricity demand driven by the growth of artificial intelligence (AI). According to the research, data center operators and other heavy electricity users can unlock a substantial amount of capacity by curtailing their power use, even if only slightly.
The study proposes that by limiting power drawn from the grid to 90% of the maximum for a couple of hours at a time – equivalent to about a day per year – new users could unlock 76 gigawatts of capacity in the United States. To put this number into perspective, it's more than the total power used by all data centers globally, according to Goldman Sachs, and approximately 10% of peak demand in the U.S.
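As a rough back-of-envelope check on those figures, using only the numbers cited above plus the assumption that a facility would otherwise be drawing its maximum around the clock, the trade looks like this:

```python
# Back-of-envelope check on the study's framing. Only the figures cited in
# the article are used; the "flat-out" baseline is an assumption.
HOURS_PER_YEAR = 8760

curtailed_hours = 24          # "about a day per year" of curtailment
curtailment_depth = 0.10      # draw capped at 90% of maximum, i.e. up to a 10% cut

share_of_year = curtailed_hours / HOURS_PER_YEAR
energy_foregone = share_of_year * curtailment_depth   # upper bound, as a share of annual energy

unlocked_gw = 76              # capacity the study says curtailment frees up
us_peak_share = 0.10          # the article pegs this at roughly 10% of U.S. peak demand
implied_us_peak_gw = unlocked_gw / us_peak_share

print(f"Curtailment covers {share_of_year:.2%} of the year")       # ~0.27%
print(f"Energy given up: at most {energy_foregone:.3%} of annual use")  # ~0.03%
print(f"Implied U.S. peak demand: ~{implied_us_peak_gw:.0f} GW")    # ~760 GW
```

The capacity unlocked is large while the energy actually given up is a rounding error, which is why even slight curtailment looks like such a large win.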
This approach is not entirely new: for decades, utilities have encouraged big electricity users like shopping malls, universities, and factories to curtail their use during peak-demand periods, such as hot summer days, in exchange for credits on their bills. Data centers, however, have largely opted out of such programs, prioritizing uptime and performance for their customers.
The study argues that data centers are ideal candidates for demand-response programs due to their potential flexibility. There are several ways data centers can trim their power use, including temporal flexibility, where computing tasks are shifted to times of lower demand. For instance, AI model training could be rescheduled to accommodate brief curtailments. Another approach is spatial flexibility, where companies shift their computational tasks to regions with lower demand. Data center operators can also consolidate loads and shut down a portion of their servers.
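To make temporal flexibility concrete, here is a minimal sketch of how a scheduler might decide which workloads to pause during a curtailment window. The `Job` and `CurtailmentWindow` types, their fields, and the illustrative numbers are assumptions made for the example; they are not drawn from the study or from any real scheduler.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Job:
    name: str
    power_mw: float      # estimated draw while running
    deferrable: bool     # e.g. checkpointable training runs vs. live inference

@dataclass
class CurtailmentWindow:
    start: datetime
    end: datetime
    target_cut_mw: float  # how much load the site must shed

def plan_curtailment(jobs: list[Job], window: CurtailmentWindow) -> list[Job]:
    """Pick deferrable jobs to pause during the window until the target cut is met.

    Largest deferrable loads are paused first; anything mission-critical
    (deferrable=False) keeps running and has to be covered some other way,
    for example by on-site batteries.
    """
    shed = 0.0
    to_pause = []
    for job in sorted(jobs, key=lambda j: j.power_mw, reverse=True):
        if shed >= window.target_cut_mw:
            break
        if job.deferrable:
            to_pause.append(job)
            shed += job.power_mw
    return to_pause

# Illustrative numbers only.
jobs = [
    Job("llm-pretraining", power_mw=6.0, deferrable=True),
    Job("batch-analytics", power_mw=1.5, deferrable=True),
    Job("inference-serving", power_mw=3.0, deferrable=False),
]
window = CurtailmentWindow(
    start=datetime(2025, 7, 21, 17, 0),
    end=datetime(2025, 7, 21, 19, 0),
    target_cut_mw=5.0,
)
for job in plan_curtailment(jobs, window):
    print(f"Pause {job.name} ({job.power_mw} MW) until {window.end}")
```

Pausing the largest deferrable jobs first keeps the number of interrupted workloads small; whatever is marked non-deferrable is left for backup power sources, as discussed next.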
In cases where tasks are mission-critical and cannot be delayed or shifted, data center operators can turn to alternative power sources to cover the curtailment. Batteries are particularly well suited to this role: they respond almost instantaneously, and even modestly sized installations can supply power for several hours.
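For a sense of scale, sizing a battery to ride through one of these events is a one-line calculation. The facility size and event length below are assumptions for illustration, not figures from the study.

```python
# Hypothetical example: a 100 MW facility asked to shed 10% of its draw
# for a 2-hour curtailment event. All figures are illustrative.
facility_mw = 100.0
curtailment_fraction = 0.10
event_hours = 2.0

battery_mwh = facility_mw * curtailment_fraction * event_hours
print(f"Battery energy needed: {battery_mwh:.0f} MWh")   # 20 MWh
```

Twenty megawatt-hours is modest next to the facility it backs up, which is why brief curtailments are plausible to bridge with on-site storage.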
Some companies have already experimented with ad hoc versions of these demand-response programs. Google, for example, has used its carbon-aware computing platform to enable demand response, while Enel X has collaborated with data centers to tap the batteries in their uninterruptible power supplies (UPS) to help stabilize the grid. PG&E is offering to connect data centers to the grid more quickly if operators agree to participate in a demand-response program.
While these tweaks won't entirely eliminate the need for new sources of power, they could transform a potentially catastrophic situation, one in which half of all new AI servers can't get the power they need, into one that's more manageable. As the tech industry grapples with the implications of AI-driven growth, the research offers a promising way to ease the looming strain on electricity supply.
The study's findings highlight the role data centers could play in easing the coming electricity capacity crunch. By adopting demand-response strategies, operators can reduce their environmental footprint while contributing to a more stable and resilient grid. As the industry moves forward, putting these strategies into practice will be essential to keeping the power supply reliable and efficient.