AI’s carbon footprint in the firing line, relatively small fry though

A report from the University of Massachusetts has found that the training process for a single large AI model can emit more than 626,000 lbs of carbon dioxide, or around the same amount as the carbon cost of building and running five average US cars over their expected lifetimes.

For some context, the report says that a round-trip flight between New York and San Francisco accounts for 1,984 lbs of CO2 per passenger, one year of the average human life accounts for 11,023 lbs, and one year of the average American life accounts for 36,156 lbs. Each of the aforementioned cars accounts for 126,000 lbs over its lifetime, which is where the 5x claim comes from (626,000 divided by 126,000 is roughly five).

The results came as a shock to Carlos Gómez-Rodríguez, a computer scientist at the University of A Coruña in Spain who was not involved in the research. Speaking to MIT Technology Review, he said: “While probably many of us have thought of this in an abstract, vague level, the figures really show the magnitude of the problem. Neither I nor other researchers I’ve discussed them with thought the environmental impact was that substantial.”

Of course, any application running in the cloud is going to incur a hefty carbon footprint, thanks to all those spinning disks, churning CPUs, and jam-packed RAM sticks firing away in a data center, turning electricity into numbers and a lot of heat. However, the thrust of the environmental argument for the cloud is that this centralized usage has a lower total carbon footprint than running all these applications in on-premises deployments.

The gist of that argument is that the cloud version of an application allocates resources as needed, meaning that there are no idle resources being wasted. In theory, if there’s a spike in demand for compute power or storage in one of these cloud instances, the data center can simply task more units to handle it.

Outside the cloud data center, the same spike in demand would require the IT department to buy and install new systems, and then either decommission them once the surge is over or find a new use for the capex investment. For the customer, buying all that equipment would typically work out more expensive than simply running up a larger bill on a public cloud deployment. All that extra equipment also has to be manufactured, which incurs a hefty carbon footprint before the devices are even moved through the retail chain and delivered to the customer.

So then, the counter to the initial allegation that AI has a very poor environmental record is that cloud-based training should be more efficient than the alternative of doing the same work on distributed, on-premises hardware, once you account for all the elements in the supply chain and the sales and integration process.

But the results are still surprising, especially when you do some extrapolation from the cloud-compute costs that the researchers have provided. The study examined Natural Language Processing (NLP) models, which are some of the largest in the AI sector, due to the complexities of language.

The study looked at Transformer, ELMo, BERT, and GPT-2, and trained each model on a single GPU to estimate its power draw in a cloud environment. The next step was scaling this by the training hours reported in the original research papers for each model, to calculate the total power consumption of the training process, which was then converted into CO2 emissions based on the energy mix in the USA.

The difference between the two versions of Transformer is the most striking. With 65mn parameters, the model used around 27 kWh of electricity, which accounted for 26 lbs of CO2. At 213mn parameters, and with the neural architecture search function enabled, this leaps up to 656,347 kWh (656 MWh) and 626,155 lbs of CO2 emitted.
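Incidentally, those two sets of figures imply a conversion factor of roughly 0.954 lbs of CO2 per kWh, which is presumably the average carbon intensity of the US grid that the researchers assumed. A minimal back-of-the-envelope check, with the emission factor inferred purely from the numbers above:

```python
# Back-of-the-envelope check: the reported figures imply a US-grid emission
# factor of roughly 0.954 lbs of CO2 per kWh (inferred, not quoted by the study).
LBS_CO2_PER_KWH = 626_155 / 656_347   # ~0.954

def co2_lbs(kwh: float) -> float:
    """Convert electricity use in kWh into lbs of CO2 at the implied US mix."""
    return kwh * LBS_CO2_PER_KWH

print(co2_lbs(27))        # ~26 lbs (Transformer, 65mn parameters)
print(co2_lbs(656_347))   # ~626,155 lbs (213mn parameters, with neural architecture search)
```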

The smaller version of Transformer was estimated to cost between $41 and $140 to train in a cloud environment, and, if you will forgive the napkin mathematics, rounding that to $100 gives an approximate measure of the carbon dioxide emitted per $100 spent on cloud model training. This works out to around 26 lbs per $100, or 0.26 lbs per $1.

For the larger version, the average of the estimated cost range is $2,072,347, and when you divide the 626,155 lbs of CO2 by this, you end up with a figure of 0.30 lbs per $1. Averaging the two gives a figure of 0.28 lbs per $1 spent on model training.
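For those who want to see the working, here is the same napkin math laid out as a short snippet, using only the cost and emission figures quoted above:

```python
# Napkin math: lbs of CO2 emitted per dollar spent on cloud model training.
small = 26 / 100               # ~0.26 lbs/$  (smaller Transformer, cost rounded to $100)
large = 626_155 / 2_072_347    # ~0.30 lbs/$  (larger Transformer, average cost estimate)
average = (small + large) / 2  # ~0.28 lbs/$

print(round(small, 2), round(large, 2), round(average, 2))  # 0.26 0.3 0.28
```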

When you think of how much model training has supposedly taken place in the world, we’re suddenly into somewhat scary territory. IDC has recently said that public cloud spending will reach $210bn in 2019, up from $160bn in 2018. Gartner reckons public cloud spending will hit $331bn by 2022.

For the AI side of things, IDC says that worldwide spending on AI systems will reach $77.6bn in 2022, up from $24bn in 2018, but doesn’t break out what percentage of that is directly attributable to cloud or training.

So then, the napkin is needed again. If you estimate that 10% of total spend here goes to paying the cloud providers, that gives us a figure of $2.4bn in 2018 and $7.7bn in 2022. Our CO2-per-dollar figure then puts those amounts at 672mn lbs of CO2 in 2018 and 2.16bn lbs of CO2 in 2022.

Again, this sounds like a huge amount, but once you divide these figures by the study’s 126,000 lbs lifetime footprint for a car, you find that 2018’s CO2 emissions would be equivalent to 5,333 cars, and 2022’s expectation would be 17,111 cars. In the grand scheme of things, this is not very much at all, as the US bought around 17mn new cars in 2018, and the global total was around 78.7mn.
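To lay the whole extrapolation out in one place, and to stress that the 10% cloud allocation is purely an illustrative assumption, the chain of arithmetic runs roughly as follows:

```python
# Extrapolation sketch: an assumed share of AI spend goes to cloud training,
# which is converted into CO2 and then into car-lifetime equivalents.
CO2_PER_DOLLAR = 0.28        # lbs/$, the averaged figure from above
CAR_LIFETIME_LBS = 126_000   # lbs of CO2 per average US car over its lifetime, per the study
CLOUD_SHARE = 0.10           # assumed fraction of AI spend paid to cloud providers

ai_spend = {2018: 24e9, 2022: 77.6e9}  # IDC's worldwide AI systems spend, in dollars

for year, spend in ai_spend.items():
    cloud_spend = spend * CLOUD_SHARE
    co2 = cloud_spend * CO2_PER_DOLLAR
    cars = co2 / CAR_LIFETIME_LBS
    print(year, f"${cloud_spend / 1e9:.1f}bn", f"{co2 / 1e6:.0f}mn lbs", f"{cars:,.0f} cars")

# 2018: ~$2.4bn -> ~672mn lbs -> ~5,333 cars
# 2022: ~$7.8bn -> ~2,173mn lbs -> ~17,244 cars (the text above rounds the spend down
#       to $7.7bn, which gives the quoted 2.16bn lbs and ~17,111 cars)
```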

Even if we were to double our estimated percentage spend attributable to the cloud, we’re still a good few orders of magnitude away from making a dent in the number of cars sold. If the models were also being pitched as some method to save emissions in cars, via improved and smoother driving, then you might be able to use this stick to beat them with, but that’s not really the case.
