Could AI Negatively Impact the Environment?

We have seen the results of new artificial intelligence powered algorithms that can do everything from detecting cancer to helping drive cars. AI is the driving force behind so many new technologies, but at what cost? A new report shows that a common AI training model can emit more than 626,000 pounds of carbon dioxide equivalent. That's about five times the lifetime emissions of the average American car - including the manufacture of the car itself. While some AI researchers have noted the potential environmental impact of their work, the actual figure has shocked the sector.

The paper looked at the field of natural-language processing (NLP). This subfield of AI is focused on teaching machines to work with human language. The NLP development community has made some fascinating breakthroughs in recent years. The work by researchers in this area is responsible for technology such as machine translation and sentence completion. These advances are highly useful across a range of applications, but to get there, the training requires huge data sets scraped from the Internet. Converting these sprawling datasets from raw text into a trained model takes a huge amount of computing power and, as a result, a huge amount of energy.



Four models identified by the researchers as being responsible for the biggest leaps forward in performance were examined: the Transformer, ELMo, BERT, and GPT-2. To figure out just how large each model's CO2 emissions were, the researchers first trained each model on a single GPU for up to a day to measure its power draw. They then used the number of training hours stated in each model's original paper to calculate the total energy that the complete training process would consume. This number was then converted into pounds of carbon dioxide equivalent. The results show that the computational and environmental costs of training grew in proportion to model size.
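The estimation method described above boils down to simple arithmetic: measured power draw times reported training hours gives energy, which an emissions factor converts into CO2-equivalent. Here is a minimal sketch of that calculation; the constants and example figures are illustrative assumptions, not values taken from the paper.

```python
# Illustrative sketch of the estimation method: power draw x training hours
# -> energy in kWh -> pounds of CO2 equivalent via a grid emissions factor.
# The emissions factor below is an assumed average for the US grid.

LBS_CO2_PER_KWH = 0.954  # assumed emissions factor (lbs CO2e per kWh)

def estimate_co2_lbs(avg_power_watts, training_hours,
                     emissions_factor=LBS_CO2_PER_KWH):
    """Estimate pounds of CO2 equivalent from average GPU power draw
    (watts) and total training time (hours)."""
    energy_kwh = (avg_power_watts / 1000.0) * training_hours
    return energy_kwh * emissions_factor

# Hypothetical example: a 250 W average draw over 1,000 GPU-hours
print(round(estimate_co2_lbs(250, 1000), 1))  # prints 238.5
```

Real estimates must also account for multi-GPU training, cooling overhead in the data center, and the carbon intensity of the local grid, all of which the simple sketch above omits.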

The costs ballooned when extra training was added to increase a model's accuracy. For example, a tuning process known as 'neural architecture search', which uses trial and error to optimize a model by tweaking its network design, has huge costs for little overall gain. Without this final step, the most costly model, BERT, had a much more modest carbon footprint of about 1,400 pounds of carbon dioxide. This is roughly equivalent to a round-trip trans-America flight for one person. What's worse is that the researchers say these are conservative numbers, based on training a model only to the most minimal level, and that most big training programs will have a much larger footprint because they develop new models from scratch.

There is no doubt that the impact of these numbers is massive. AI is driving everything from medical research to defense. However, given the dangers of climate change, AI should switch directions and become the driving force behind creating more sustainability in a dangerous time.


Meher Bhatia