
Source: Gènéthique as of 06-07-2020 

Artificial intelligence systems “consume a lot of energy and can generate large volumes of carbon emissions that contribute to climate change.” For example, one study has shown that the experiments needed to build and train an artificial intelligence language processing system can generate more than 35 tons of CO2 emissions, or “twice the emissions of an average American throughout his or her life.” Faced with these challenges, a team of researchers from Stanford University, Facebook AI Research, and McGill University “has developed an easy-to-use tool that quickly measures the amount of electricity used by a machine learning project and the amount of carbon emissions it represents.”

“There is a strong push to increase machine learning to solve increasingly important problems, using more computing power and more data,” says Dan Jurafsky, Chair of Linguistics and Professor of Computer Science at Stanford. “In this context, we must consider whether the benefits of these intensive computational models are worth the cost of the impact on the environment.” Indeed, “machine learning systems develop their skills by running millions of statistical experiments 24 hours a day, constantly refining their models to perform tasks. These training sessions, which can last for weeks or even months, are becoming more energy-intensive. And as costs have fallen for both computing power and managing massive amounts of data, machine learning is becoming more prevalent in businesses, governments, universities and personal life.”

To get an accurate measure of what this means in terms of carbon emissions, “researchers began by measuring the energy consumption of a particular artificial intelligence model,” a task “more complicated than it seems, because a single machine often trains multiple models at the same time, so each training session must be untangled from the others. Each session also consumes energy for shared functions, such as data storage and cooling, which must be properly apportioned.” Energy consumption must then be translated into carbon emissions, which depend on the energy mix used to produce the electricity, a mix that “varies considerably depending on the place and time of day.” The researchers estimated that “running a session in Estonia, which depends largely on shale oil, will produce 30 times more carbon than the same session in Quebec, which depends mainly on hydropower.” This leads the researchers to recommend “moving learning sessions to a place powered primarily by renewable sources.”
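The conversion described here, measured energy use multiplied by the carbon intensity of the local grid, can be sketched in a few lines of Python. The intensity figures and function name below are illustrative assumptions, not numbers from the study, though they are chosen to reproduce the roughly 30-fold Estonia/Quebec gap the researchers quote:

```python
# Hypothetical grid carbon intensities (grams of CO2 per kWh), for illustration
# only; real values vary considerably by place and time of day.
CARBON_INTENSITY_G_PER_KWH = {
    "quebec": 30,     # grid dominated by hydropower
    "estonia": 900,   # grid dependent largely on shale oil
}

def emissions_kg(energy_kwh: float, region: str) -> float:
    """Translate a session's measured energy use into kg of CO2 for a grid mix."""
    return energy_kwh * CARBON_INTENSITY_G_PER_KWH[region] / 1000.0

# One training session consuming 500 kWh (an example figure):
session_kwh = 500.0
ratio = emissions_kg(session_kwh, "estonia") / emissions_kg(session_kwh, "quebec")
print(ratio)  # -> 30.0
```

The same session thus emits about 30 times more carbon on the Estonian grid than on the Quebec grid, which is why moving the work, when possible, matters more than optimizing the code alone.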

The “researchers have also found that some machine learning algorithms are more energy-intensive than others.” When it is not practical to move the work to reduce emissions, as with in-car navigation, where long distances introduce communication delays (“latency”), it would nevertheless be possible to “define the most efficient program as the default parameter when choosing which one to use.”
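Picking the most efficient program as the default amounts to a simple selection over measured energy costs. In this sketch, the algorithm names and per-run figures are hypothetical, standing in for whatever measurements a tool like the one described would produce:

```python
# Hypothetical per-run energy measurements (kWh) for candidate algorithms;
# the names and figures are illustrative, not from the study.
measured_energy_kwh = {
    "model_a": 120.0,
    "model_b": 45.0,
    "model_c": 80.0,
}

# Make the least energy-hungry candidate the default choice.
default_algorithm = min(measured_energy_kwh, key=measured_energy_kwh.get)
print(default_algorithm)  # -> model_b
```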

“Over time, it is likely that machine learning systems will consume even more energy in production than during the training phase,” says Peter Henderson, a doctoral student in computer science at Stanford University and lead author of the study. “The better we understand our options, the more we can limit the potential impacts on the environment.”
