
How Ultra-Fast Batteries Enable Next-Gen Data AI Centres

Imagine expecting a grandma to take part in a HIIT class on her 90th birthday. Well, that’s how the poor energy grid feels with all the focus on artificial intelligence (AI) and high-performance compute. It’s no secret that, like your grandma’s last hip replacement, the energy grid could do with an upgrade. AI data centres can each consume as much electricity as 100,000 households, global data centre energy consumption is set to match the total electricity usage of Japan by 2030, and grid interconnection queues are stretching to decades. In this piece, we explore how batteries can help data centre companies reduce their strain on the grid, potentially speeding up deployment times and making projects more attractive to regulators.

The demand for AI data centres

The demand for AI is growing massively, with McKinsey estimating that around 70% of data centre capacity by 2030 will be specifically for AI workloads. The issue is that AI uses far more energy than traditional compute. That’s mainly down to what AI is used for. While traditional compute executes predictable tasks like running websites, spreadsheets, or payment systems, AI learns from large volumes of data to recognise patterns and make predictions, like understanding speech or recommending content. In simple terms, one executes instructions; the other learns from experience.

Power hungry AI

To put this into perspective, traditional compute uses central processing units (CPUs), the first being Intel’s 4004 chip, which consumed roughly 0.5 watts of power. In stark contrast, NVIDIA’s latest GPU, the Vera Rubin, can draw 1.8 to over 2.3 kilowatts per chip. And that’s just a single chip. Zoom out to the data centre scale, and the change is staggering: ten years ago, a 20 MW data centre was considered large. Last year, OpenAI announced a 1.2 GW site in Texas, showing just how much demand for compute power has surged.
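Some back-of-the-envelope arithmetic, using only the figures quoted above, shows just how steep the jump is (the per-chip and site-level numbers are the article's; the calculation itself is illustrative):

```python
# Rough scale comparison using the figures cited above.
cpu_w = 0.5        # Intel 4004: roughly 0.5 W per chip
gpu_w = 2_300.0    # upper end of the cited 1.8-2.3 kW per AI GPU

per_chip_ratio = gpu_w / cpu_w
print(per_chip_ratio)            # -> 4600.0, i.e. ~4,600x more power per chip

# Data-centre scale: a "large" 20 MW site ten years ago vs a 1.2 GW site today
site_ratio = 1_200e6 / 20e6
print(site_ratio)                # -> 60.0, i.e. 60x larger
```

Even taking the lower 1.8 kW figure, a single modern AI accelerator draws thousands of times the power of an early CPU, and a single announced campus now dwarfs what counted as a large facility a decade ago.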

Pushing the grid to the edge

Using more power isn’t the only challenge with AI; it’s also how that power is consumed. Traditional computing is relatively steady and predictable. AI workloads, on the other hand, are highly variable and arrive in bursts. A data centre might spike for model training or inference, then drop sharply once the task finishes. These spikes are extremely challenging for the grid. And again, you wouldn’t expect your grandma to perform well in a Hyrox class. That’s why access to power is a major hurdle for many data centre companies. Grid operators are wary of such volatile demand, which can knock grid frequency off balance and put nearby homes at risk of outages.

Even when projects are approved, grid capacity may be limited. An original plan for a 100 MW data centre could end up being approved for just 50 MW. But if a provider can show grid operators they can reduce their load during times of stress, it makes the development far more attractive, helping them maintain capacity or speed up interconnection. A good example is Aligned Data Centers, which recently announced a 31 MW battery energy storage system (BESS) for its Pacific Northwest campus, giving the regional utility confidence to approve interconnection faster.

Ultra-fast batteries to the rescue

Battery energy storage systems were once used mainly for backup power. Today, they are increasingly deployed to reduce reliance on local grids through peak shaving, where batteries supply power during periods of high demand or sudden load spikes, stabilising consumption and reducing the risk of grid disruption. But AI workloads can spike and drop in milliseconds, faster than most batteries can follow. A battery’s C-rate measures how quickly it can charge or discharge relative to its capacity; if a battery is pushed to track spikes beyond its rated C-rate, it degrades rapidly and its lifespan shortens.

Luckily, battery chemistries and technology are evolving fast. New ultra-fast charging batteries with high C-rates are equipped to handle AI’s demand spikes. Companies like Nyobolt lead in this space. Their batteries include dynamic response systems that provide stable power during spikes and a built-in energy reserve if the grid drops. Novel chemistries mean their batteries can retain 80% capacity after 25,000 cycles, compared to roughly 1,000 for standard lithium-ion.
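The peak-shaving idea above can be sketched in a few lines. This is a deliberately simplified model (the load profile, grid cap, battery size, and C-rate are invented for illustration, not taken from any real site): the battery discharges to cover whatever load exceeds the agreed grid import cap, and its maximum discharge power is set by its C-rate times its capacity.

```python
def peak_shave(load_kw, grid_cap_kw, capacity_kwh, c_rate, dt_h=1 / 60):
    """Simulate peak shaving over a load profile (one sample per dt_h hours).

    The battery covers load above grid_cap_kw, limited by its C-rate
    (max power in kW = c_rate * capacity in kWh) and remaining energy.
    Returns per-step grid draw, battery discharge, and state of charge.
    """
    max_batt_kw = c_rate * capacity_kwh  # power limit implied by the C-rate
    soc = capacity_kwh                   # start fully charged
    grid, batt, socs = [], [], []
    for p in load_kw:
        excess = max(0.0, p - grid_cap_kw)
        b = min(excess, max_batt_kw, soc / dt_h)  # what the battery can cover
        soc -= b * dt_h
        grid.append(p - b)
        batt.append(b)
        socs.append(soc)
    return grid, batt, socs

# A bursty AI-style profile: steady 40 MW with a 5-minute, 30 MW training spike
load = [40_000] * 10 + [70_000] * 5 + [40_000] * 10   # kW, 1-minute steps
grid, batt, soc = peak_shave(load, grid_cap_kw=50_000,
                             capacity_kwh=10_000, c_rate=4)
print(max(grid))   # -> 50000.0: grid draw never exceeds the 50 MW cap
```

The C-rate term is what separates ordinary and ultra-fast batteries here: with a low C-rate, `max_batt_kw` shrinks and the spike spills past the cap onto the grid; a high C-rate lets a relatively small battery absorb a large, short burst.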

The bigger picture

As enablers of data centre expansion, batteries could give AI systems the power they need when workloads surge. By reducing reliance on the grid and supporting overall stability, they allow developers to move faster through interconnection queues and get projects approved. While it sometimes feels like AI’s main output is endless online content, the potential gains are enormous: it could help discover new materials, develop medicines, and even design breakthrough energy storage solutions. Batteries could be the key to accelerating these discoveries.
