Training AI Leaves Higher Carbon Footprint Than Cars

We live in an age where minimising our carbon footprint is crucial to slowing the rise in global temperatures. We have developed more environmentally friendly alternatives, such as investing in the research and development of electric cars as opposed to gasoline cars. However, we should also be worried about the carbon footprint that training an AI leaves.

Training artificial intelligence is an energy-intensive process. New research estimates that the carbon footprint of training a single artificial intelligence can be as high as 284 tonnes of carbon dioxide, approximately five times the lifetime emissions of an average car.

Researcher Emma Strubell, alongside colleagues at the University of Massachusetts Amherst in the United States, has analysed the energy consumption required to train four neural networks, the type of artificial intelligence that forms the foundation of language processing.

Language-processing artificial intelligence systems include the algorithms that power Google Translate. These AIs are trained through a process known as deep learning, a technique that lets computers learn from examples. Strubell says that “In order to learn something as complex as language, the models have to be large.” A common way of explaining how deep learning actually works: an artificially intelligent program is fed billions of written articles so that it learns to understand the meaning of words and how sentences are constructed.
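To make “learning from examples” concrete, here is a toy sketch in Python. Everything in it, the twelve-word corpus, the layer sizes, the training settings, is invented for illustration and is nothing like the billion-article scale described above: a minuscule neural network that learns to predict the next word from the previous one.

```python
# A toy illustration of "learning language from examples": a tiny neural
# network that learns to predict the next word from the previous word.
# Corpus and sizes are made up; real language models are vastly larger.
import numpy as np

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V = len(vocab)

# Training examples: (previous word, next word) pairs.
X = np.array([idx[w] for w in corpus[:-1]])
Y = np.array([idx[w] for w in corpus[1:]])

rng = np.random.default_rng(0)
H = 8                                  # hidden layer size (arbitrary)
W1 = rng.normal(0, 0.1, (V, H))        # input-to-hidden weights
W2 = rng.normal(0, 0.1, (H, V))        # hidden-to-output weights

lr = 0.5
for step in range(500):
    h = np.tanh(W1[X])                 # hidden activations, one row per example
    logits = h @ W2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)  # softmax over the vocabulary
    # Cross-entropy gradient: predicted probabilities minus one-hot targets.
    g = p.copy()
    g[np.arange(len(Y)), Y] -= 1
    g /= len(Y)
    dW2 = h.T @ g
    dh = g @ W2.T * (1 - h**2)         # backpropagate through tanh
    dW1 = np.zeros_like(W1)
    np.add.at(dW1, X, dh)
    W1 -= lr * dW1
    W2 -= lr * dW2

# After training, the model assigns high probability to the words it has
# seen following "the" in the examples.
h = np.tanh(W1[idx["the"]])
p = np.exp(h @ W2); p /= p.sum()
print({w: round(float(p[idx[w]]), 2) for w in vocab})
```

Scaled up by many orders of magnitude, this same train-on-examples loop is what consumes the energy the researchers measured.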

Training an AI can leave a carbon footprint up to five times greater than the lifetime emissions of a car.

The researchers measured the energy consumed in training four different AIs. They then estimated a carbon footprint by combining those measurements with the average carbon emissions of power production in the US.
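As a back-of-the-envelope illustration of that method, the sketch below multiplies a hypothetical training-energy figure by a rough US grid emission factor. Both numbers are assumptions chosen so the result lands near the 284-tonne estimate, not values taken directly from the study.

```python
# Illustrative version of the estimation method described above:
# multiply measured training energy by an average grid emission factor.
# Both figures are assumptions, not numbers from the study.
energy_kwh = 656_000            # hypothetical total energy drawn during training
kg_co2_per_kwh = 0.433          # rough US grid average; actual factors vary

tonnes_co2 = energy_kwh * kg_co2_per_kwh / 1000
print(f"Estimated footprint: {tonnes_co2:.0f} tonnes CO2")  # ~284 tonnes
```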

Neural architecture search (NAS) is a process in which the design of a neural network is automated via trial and error. This method proved extremely energy-intensive and time-consuming: training one of the AIs they tested took 84 hours without NAS but over 270,000 hours with it (roughly 3,200 times as long), requiring about 3,000 times more energy.
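The “trial and error” at the heart of NAS can be sketched as a loop that samples candidate architectures, trains and scores each one, and keeps the best. The search space, the stand-in scoring function, and the trial count below are all invented for illustration; in a real search, each training run takes hours of machine time, which is where the enormous energy cost comes from.

```python
# A minimal sketch of trial-and-error neural architecture search:
# sample a candidate design, train and score it, keep the best.
# All values here are invented for illustration.
import random

random.seed(42)
search_space = {
    "layers": [2, 4, 8],
    "hidden_units": [64, 128, 256],
    "learning_rate": [1e-2, 1e-3, 1e-4],
}

def train_and_evaluate(arch):
    # Stand-in for an expensive full training run; returns a fake
    # validation score. In reality each call costs hours of compute.
    return random.random()

best_arch, best_score = None, float("-inf")
for trial in range(20):          # real searches run far more trials
    arch = {k: random.choice(v) for k, v in search_space.items()}
    score = train_and_evaluate(arch)
    if score > best_score:
        best_arch, best_score = arch, score

print(best_arch, round(best_score, 3))
```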

Because the applications and use of artificial intelligence show no sign of slowing down, that carbon footprint will only grow. With AI being the ‘next big thing’, we also need to focus on developing programs that are more efficient, so that less energy is needed.

Zacharia Sharif