Google’s DeepMind in National Grid discussions for AI-based load-balancing

Speaking to the Financial Times this week, DeepMind founder Demis Hassabis hinted that Google’s AI wing is in talks with National Grid in the UK to provide AI-based load-balancing. National Grid, the UK’s publicly listed grid infrastructure company, could reduce its energy consumption by 10%, according to Hassabis, although the exec would go no further than to say the pair were in talks over a “possible partnership.”

When contacted for a statement, National Grid said that it was in “very early discussions” with DeepMind. The cautious tone was accompanied by a more general platitude that National Grid is “excited about how the latest advances in technology can bring improvements in our performance, ensure we are making the best use of renewable energy, and help save money for bill payers” – making Hassabis appear a bit keen, apparently going to the press before shaking hands.

The UK generates around 330 terawatt-hours (TWh) of electricity a year. National Grid owns and operates the UK’s transmission network infrastructure, which transports electricity around the country. It serves the crucial role of ensuring there is enough power to meet demand at all times, whilst keeping the grid frequency within around +/-1% of 50Hz (49.5Hz to 50.5Hz).

National Grid’s role in balancing the system has become more difficult over the years as intermittent sources of renewable electricity have become a larger part of Britain’s energy mix, a share that should reach 30% of all the UK’s electricity generation by 2020. Consequently, a cost-effective system that could deliver on a promise of cutting energy consumption by 10% and improving grid management efficiency is an enticing prospect.

DeepMind claims its algorithms could more accurately predict demand patterns; anticipating peaks in both supply and demand would allow National Grid to maximize the use of renewables. Machine learning would play the central role in DeepMind’s strategy if the talks progress to a partnership, with the techniques used to analyze the data that National Grid’s infrastructure generates.
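Neither company has described what such a forecasting model would look like, but the general shape of the approach is familiar: train a model on historical consumption data and use it to predict demand a short interval ahead. The sketch below is a minimal, purely illustrative version, assuming synthetic half-hourly demand figures and a small scikit-learn neural network rather than anything DeepMind has published.

```python
# Minimal sketch of short-term demand forecasting from lagged half-hourly
# readings. Purely illustrative: the data is synthetic and the model choice
# (a small multi-layer perceptron) is an assumption, not anything DeepMind
# or National Grid has published.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic half-hourly national demand in GW: a daily cycle plus noise.
periods = 48 * 365
t = np.arange(periods)
demand = 35 + 8 * np.sin(2 * np.pi * t / 48) + rng.normal(0, 1.5, periods)

# Features: the previous 48 half-hours of demand; target: the next half-hour.
window = 48
X = np.stack([demand[i:i + window] for i in range(periods - window)])
y = demand[window:]

# Hold out the most recent data for testing (no shuffling for time series).
X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=False)

model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0)
model.fit(X_train, y_train)

print("mean absolute error (GW):",
      round(float(np.mean(np.abs(model.predict(X_test) - y_test))), 2))
```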

Hassabis pointed to DeepMind’s achievement in cutting the power consumed by data center cooling. DeepMind made this announcement last July, asserting that better management of the data center’s cooling system had reduced the facility’s energy consumption by 15%. DeepMind estimated that this would translate into a saving of hundreds of millions of dollars over several years.

The DeepMind data center cooling management system relies on collecting historical data from thousands of sensors within the data center – readings of temperatures, power draw, pump speeds, set points, and so on. The data is then used to train an ensemble of neural networks, which allows DeepMind to manage the cooling system more efficiently.

These predictions are used to simulate the recommended actions and ensure that the system operates efficiently without breaching its operating constraints – focusing on the ratio of total building energy usage to IT energy usage (the data center’s PUE), while also predicting the future temperature and pressure of the data center cooling system over the course of the next hour.
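DeepMind has not published the models themselves, but the description above maps onto a fairly standard supervised-learning pipeline. The sketch below is an illustrative stand-in, assuming synthetic sensor data, made-up feature dimensions, and a small scikit-learn ensemble; it shows the shape of the idea – predict PUE from sensor snapshots, then score candidate control actions against that prediction – rather than DeepMind’s actual system.

```python
# Illustrative sketch of the pipeline described above: an ensemble of small
# neural networks trained on historical sensor readings to predict PUE
# (total building energy / IT energy). The sensor data, feature count, and
# ensemble size are assumptions; DeepMind has not published its models.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n_samples, n_sensors = 5000, 20

# Hypothetical sensor snapshots: temperatures, power, pump speeds, set points.
sensors = rng.normal(size=(n_samples, n_sensors))

# Synthetic PUE target loosely driven by the sensor readings.
weights = rng.normal(size=n_sensors)
pue = 1.1 + 0.05 * np.tanh(sensors @ weights) + rng.normal(0, 0.01, n_samples)

# Train an ensemble of small networks on bootstrap resamples of the history.
ensemble = []
for seed in range(5):
    idx = rng.integers(0, n_samples, n_samples)  # bootstrap sample
    net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=1000,
                       random_state=seed)
    net.fit(sensors[idx], pue[idx])
    ensemble.append(net)

def predict_pue(snapshot):
    """Average the ensemble's PUE predictions for one sensor snapshot."""
    snapshot = np.atleast_2d(snapshot)
    return float(np.mean([net.predict(snapshot) for net in ensemble]))

# A candidate control action (e.g. a new set point) can be scored by
# predicting the PUE it would produce before touching the live plant.
print("predicted PUE for first snapshot:", round(predict_pue(sensors[0]), 3))
```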

However, there is no published evidence of Google rolling out this system across its wider data center network, and when asked to comment, DeepMind did not reply. As such, it looks more like a pilot project than a strategic initiative within Google – but a 15% saving certainly isn’t to be sniffed at.

Hassabis believes the same principles applied to improving the efficiency of a data center can deliver energy efficiency savings across the whole of the UK’s national grid – without any further investment in infrastructure. National Grid already collects real-time data across the transmission network, and so DeepMind doesn’t have to worry about rolling out IoT-enabled infrastructure – as it might have to in greenfield opportunities.

But by the sounds of it, DeepMind still has some work to do to convince National Grid that the data center coolant model can be extrapolated across a much larger network – as while both systems are running critical infrastructure, losing a Netflix stream is quite a bit less damaging than living through a blackout.

To date, DeepMind has yet to justify its price tag – in terms of directly attributable revenues. The Google subsidiary seemed to spend most of 2016 publishing research papers on machine learning and neural networks – when it wasn’t entangling itself in data privacy debates, as it appears to have done via its deal with the UK’s NHS.

However, Google will be happy to run DeepMind purely on an R&D basis, confident that at some point its output will be complementary to Google’s core businesses – and apparently doesn’t mind spending millions of R&D dollars if it gets to lead innovation in machine learning and eventually boost Google’s revenues by billions.
