
Working with AI Comes with a Big Environmental Footprint

June 7, 2019 – In a newly published paper, three computer science researchers at the University of Massachusetts Amherst argue that training neural networks is proving costly for the environment, because the computing resources required represent a significant carbon footprint in energy use. One complaint environmentalists make in the age of cloud computing is the proliferation of the massive server farms that power it, and the carbon footprint those farms create. Based on the new research, it appears it isn't just cloud computing that is environmentally problematic, but AI training as well.

The research found that training a series of AI models emitted almost 284,000 kilograms (626,000 pounds) of carbon dioxide (CO2), roughly five times the average lifetime emissions of an automobile, including the CO2 released during its manufacture.

Why is this the case?

New AI models based on neural network computing require many hours of training and exposure to vast amounts of data. In many cases, the hardware used consists of top-of-the-line supercomputers or multiprocessor server farms. Unless the electricity comes from renewables, both powering this hardware and manufacturing it produce net new carbon emissions. This is particularly true when the hardware used to develop AI models sits in often energy-intensive in-house data centers rather than in the cloud, where providers have more recently become greener by building accompanying wind, solar, and battery storage farms.

For the purpose of the study, the authors looked at model training for natural language processing (NLP), the field that teaches AI to understand human language. Training NLP models involves mega datasets of pages scraped from the Internet, a process that is computationally expensive and an enormous energy consumer. Studying four models that represent state-of-the-art NLP performance, they calculated resource usage and total energy consumed over the model training process, and then converted the energy and resources consumed into CO2 equivalents.
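The general shape of that accounting can be sketched in a few lines: measure the power drawn by the training hardware over the training time, add data-center overhead, then apply a grid emission factor. Everything below is an illustrative assumption for demonstration (the function name, the PUE value, and the emission factor), not the authors' exact method or figures.

```python
# Illustrative sketch of converting training energy into CO2-equivalent
# emissions. The PUE and grid emission factor are assumed values for
# demonstration, not the figures used in the paper.

PUE = 1.58                # assumed data-center power usage effectiveness
CO2_LBS_PER_KWH = 0.954   # assumed average US grid emissions (lbs CO2e/kWh)
LBS_PER_KG = 2.20462

def training_co2_kg(avg_power_watts: float, hours: float,
                    num_devices: int = 1) -> float:
    """Estimate CO2-equivalent emissions (kg) for one training run."""
    energy_kwh = avg_power_watts * num_devices * hours / 1000.0
    total_kwh = energy_kwh * PUE          # add cooling/infrastructure overhead
    return total_kwh * CO2_LBS_PER_KWH / LBS_PER_KG

# Example: 8 GPUs drawing ~250 W each, trained for 120 hours
print(round(training_co2_kg(250, 120, 8), 1))   # ≈ 164 kg of CO2e
```

The key point the sketch makes visible is that emissions scale linearly with device count, power draw, and training time, so longer training on bigger hardware compounds quickly.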

What they found was that the more complex the AI model, the higher the environmental cost in CO2-equivalent emissions. The most costly of the four models produced a carbon footprint of 635 kilograms (1,400 pounds) of CO2, equal to a round-trip flight for one passenger between New York City and Los Angeles. That amount covers only a single modeling effort; imagine how quickly the equivalent emissions rise when you are testing a number of options. In one six-month AI training effort involving 4,789 model variants, the CO2 emitted totaled 35,800 kilograms (78,000 pounds). Multiply that statistic by the rapid growth of AI research, particularly research using large neural networks, and you have a massive and growing carbon footprint issue that needs to be addressed.
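The scaling concern is easy to make concrete. Using only the two figures quoted above, a back-of-the-envelope division gives the average footprint per model variant in that six-month effort:

```python
# Back-of-the-envelope check using the figures quoted in this article
total_co2_kg = 35_800    # six-month training effort, CO2e in kilograms
num_variants = 4_789     # model variants tested in that effort

avg_per_variant_kg = total_co2_kg / num_variants
print(round(avg_per_variant_kg, 2))   # ≈ 7.48 kg of CO2e per variant
```

Each variant's footprint looks modest on its own; it is the thousands of hyperparameter and architecture trials behind a single published model that drive the total so high.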

Would using third-party cloud computing resources mitigate the problem?

In the paper, the authors address this question by looking at the shared computing resources offered by companies such as AWS, Google, and Microsoft, weighing the advantages and disadvantages of doing the research on third-party commercial data farms. They found that the economic benefit is not immediately apparent: in most cases, cloud-based commercial server farms proved roughly twice as expensive as relying on home-built resources. And once built, home-based servers can be reused for subsequent projects for no more than the cost of energy and operational maintenance.

The authors propose for consideration that, given the burgeoning field of AI research, a sound national strategy would be cloud-based data farms built by governments for common use and powered by renewable energy sources. Continuing down the current path, they conclude, is not feasible, because the computational resources required for advancements in AI will in time become prohibitive in environmental, energy, and cost terms.

Len Rosen lives in Oakville, Ontario, Canada. He is a former management consultant who worked with high-tech and telecommunications companies. In retirement, he has returned to a childhood passion to explore advances in science and technology.
