Can data centers evolve as AI drives energy demand?

Recently I sat down with the editor of Illuminaire, a publication focused on AI deployment in industry, to discuss how the growth in AI applications is driving a surge in electricity demand from data centers.

Here are my thoughts on the main points we discussed.

Unlike traditional web searches, which consume fractions of a watt-hour, AI queries can be tens or even thousands of times more power intensive. A single ChatGPT request is estimated to consume around 2.9 watt-hours of electricity, significantly more than a standard Google search at around 0.3 watt-hours.

Multiply that across billions of queries, and this creates an infrastructure burden that is mostly invisible to us as end users, but not to utility (and data center) operators.
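To put rough numbers on that multiplication, here is a back-of-the-envelope calculation using the per-query estimates above. The daily query volume is a hypothetical round number chosen for illustration, not a measured figure:

```python
# Back-of-the-envelope scale check using the per-query estimates above.
AI_QUERY_WH = 2.9        # estimated watt-hours per ChatGPT request
SEARCH_QUERY_WH = 0.3    # estimated watt-hours per standard Google search
QUERIES_PER_DAY = 1e9    # hypothetical: one billion queries per day

daily_ai_mwh = AI_QUERY_WH * QUERIES_PER_DAY / 1e6        # Wh -> MWh
daily_search_mwh = SEARCH_QUERY_WH * QUERIES_PER_DAY / 1e6

print(f"AI queries:     {daily_ai_mwh:,.0f} MWh/day")
print(f"Web searches:   {daily_search_mwh:,.0f} MWh/day")
print(f"Extra demand:   {daily_ai_mwh - daily_search_mwh:,.0f} MWh/day")
```

At one billion queries a day, the AI workload alone lands in the thousands of megawatt-hours per day, which is why the burden is invisible to individual users but very visible to grid operators.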

This is not just an increase in energy consumption; it’s a massive shift in how energy is used, demanded, and distributed. The challenge is not just availability, but also timing, location and resilience.

The question is: can we make enough energy available to cope with this growth in demand?

We know that microgrids, virtual power plants (VPPs), and distributed energy systems are already operational, enabling data centers to manage their energy demands intelligently, integrate renewables, and send power back to the grid. A simple solar system and battery can already form a basic microgrid, but if you add a management layer that can handle electric vehicle (EV) chargers and appliances, you effectively have a VPP.
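The "management layer" described above boils down to a dispatch decision repeated every interval: store, export, discharge, or import. A minimal sketch of one such decision, with illustrative thresholds (the 95% and 10% state-of-charge limits are assumptions, not standards):

```python
# Minimal sketch of a VPP-style dispatch rule for a single time step.
# Thresholds and return labels are illustrative assumptions.
def dispatch(solar_kw: float, load_kw: float, battery_soc: float) -> str:
    """Decide what to do with the net power balance for one interval.

    battery_soc is the battery state of charge as a fraction (0.0-1.0).
    """
    surplus_kw = solar_kw - load_kw
    if surplus_kw > 0:
        # Generation exceeds load: store it if there is room, else export.
        return "charge_battery" if battery_soc < 0.95 else "export_to_grid"
    # Load exceeds generation: discharge if charged enough, else import.
    return "discharge_battery" if battery_soc > 0.10 else "import_from_grid"

print(dispatch(solar_kw=120.0, load_kw=80.0, battery_soc=0.50))  # charge_battery
print(dispatch(solar_kw=10.0, load_kw=80.0, battery_soc=0.05))   # import_from_grid
```

A real VPP layers forecasting, tariffs, and EV-charger scheduling on top of this core loop, but the shape of the decision is the same.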

Imagine scaling that up for a data center.

We are already seeing this at scale to some extent. The Ajax football stadium in Amsterdam is powered by wind, solar panels, and second-life batteries from Nissan Leafs. Energy peaks are managed intelligently during matches, and the system dynamically offsets demand. The same model could be applied to data center infrastructure.

A data center with renewable generation and storage capacity is not just environmentally responsible; it becomes an active grid participant, exporting excess power during off-peak hours and stabilizing demand during high-load periods.

What’s missing is not the technology, but the integration.

The components exist, as do standards for devices, interfaces, and protocols. But without a systems-level definition of how they interact, interoperability becomes a bottleneck.

While the growth in AI infrastructure is creating the energy demand, AI itself could be the solution to managing it. Work is already underway to create an AI-powered operating system for buildings.

This would sit above the device layer, enabling natural language commands and automated decision-making across diverse systems and standards.

You tell the system what you want, and the AI manages the building accordingly. It becomes the orchestrator, integrating solar, battery, HVAC, EV chargers, and so on, without the need for humans to worry about which protocol connects to what. This could be transformative for data centers, allowing AI to manage its own energy needs.
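One way to picture that orchestration layer is as a thin adapter interface that hides protocol differences from the decision logic. Every class and method name below is hypothetical, a sketch of the pattern rather than any actual product API:

```python
# Illustrative sketch: an orchestrator dispatches setpoints through one
# common interface, so its logic never touches protocol details.
# All names here are hypothetical.
from abc import ABC, abstractmethod


class Device(ABC):
    @abstractmethod
    def set_power_kw(self, kw: float) -> None: ...


class ModbusBattery(Device):
    def set_power_kw(self, kw: float) -> None:
        print(f"[modbus] battery setpoint {kw} kW")


class OcppEvCharger(Device):
    def set_power_kw(self, kw: float) -> None:
        print(f"[ocpp] charger limited to {kw} kW")


class Orchestrator:
    def __init__(self, devices: dict[str, Device]):
        self.devices = devices

    def apply_plan(self, plan: dict[str, float]) -> None:
        # The planning step (AI or rules) produced per-device setpoints;
        # the orchestrator only has to dispatch them.
        for name, kw in plan.items():
            self.devices[name].set_power_kw(kw)


orch = Orchestrator({"battery": ModbusBattery(), "chargers": OcppEvCharger()})
orch.apply_plan({"battery": -50.0, "chargers": 22.0})
```

The point of the design is that adding a new device type (HVAC, solar inverter) means writing one adapter, not rewiring the decision logic.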

But there’s a disconnect between data center operators and utilities, with not enough cross-pollination. Even within the utilities themselves, different departments are siloed.

We have started conversations with a few data center players who are looking at flexibility, but we need more open channels.

If data centers are to become energy producers and part of a decentralized power system, the conversation must evolve to become technical, collaborative, and, most importantly, aligned.

Rolf Bienert spoke to Mark Venables, editor at Illuminaire; you can read the full interview: The intelligent grid needs intelligent alignment.
