
AI energy consumption poses environmental problems

Training an advanced AI model takes time, money, and high-quality data. It also takes energy – a lot.

Between storing data in large-scale data centers and using that data to train machine learning or deep learning models, AI's power consumption is high. While an AI system can pay off for a business, it also poses an environmental problem.

AI energy consumption during training

Take, for example, some of the most popular language models.

OpenAI trained its GPT-3 model on 45 terabytes of data. To train the final version of MegatronLM, a similar but smaller language model, Nvidia ran 512 V100 GPUs for nine days.

A single V100 GPU can consume between 250 and 300 watts. Assuming 250 watts per GPU, the 512 GPUs together draw 128,000 watts, or 128 kilowatts (kW). Running for nine days (216 hours), training MegatronLM therefore consumed 27,648 kilowatt-hours (kWh).

The average US household uses 10,649 kWh per year, according to the US Energy Information Administration. Training the final version of MegatronLM therefore used nearly as much energy as three households consume in a year.
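For readers who want to verify the arithmetic, here is a minimal Python sketch using only the figures above (note that 250 watts is the low end of the V100's range, so the true total was likely higher):

    # Back-of-the-envelope energy estimate for training MegatronLM
    gpus = 512                  # Nvidia V100 GPUs
    watts_per_gpu = 250         # low end of the V100's 250-300 W range
    days = 9

    total_kw = gpus * watts_per_gpu / 1000    # 128 kW
    energy_kwh = total_kw * days * 24         # 27,648 kWh

    household_kwh = 10_649                    # US EIA annual household average
    print(f"{energy_kwh:,.0f} kWh, or {energy_kwh / household_kwh:.1f} household-years")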

New training techniques reduce the amount of data needed to train machine learning and deep learning models, but many models still need a huge amount of data to complete an initial training phase and additional data to stay up to date.

Data center power consumption

As AI becomes more complex, expect some models to use even more data. This is a problem, because data centers use an incredible amount of energy.

“Data centers are going to be one of the most impacting things on the environment,” said Alan Pelz-Sharpe, founder of the analysis firm Deep Analysis.

AI has many benefits for businesses, but it creates problems for the environment.

The Weather Company, an IBM business, processes approximately 400 terabytes of data per day so its models can predict the weather days in advance around the world. Facebook generates approximately 4 petabytes (4,000 terabytes) of data per day.

People generated 64.2 zettabytes of data in 2020 – about 58,389,559,853 terabytes – according to estimates from market research firm IDC.
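That terabyte figure implies a binary conversion, treating a terabyte as 2^40 bytes (a tebibyte) while taking IDC's zettabytes as decimal (10^21 bytes); a minimal sketch of the math:

    # Convert 64.2 decimal zettabytes to binary terabytes (tebibytes)
    zettabytes = 64.2
    bytes_total = zettabytes * 10**21    # decimal zettabyte = 10^21 bytes
    terabytes = bytes_total / 2**40      # binary terabyte = 2^40 bytes
    print(f"{terabytes:,.0f}")           # ≈ 58,389,559,853, about 58.4 billion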

Data centers store this data all over the world.

Meanwhile, the largest data centers require more than 100 megawatts of electrical capacity, enough to power some 80,000 American homes, according to energy and climate think tank Energy Innovation.
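The homes comparison is consistent with the EIA household average cited earlier, as a quick sanity check shows:

    # 100 MW of capacity spread across 80,000 homes
    capacity_watts = 100e6
    homes = 80_000
    kw_per_home = capacity_watts / homes / 1000    # 1.25 kW continuous draw
    kwh_per_year = kw_per_home * 24 * 365          # ≈ 10,950 kWh
    print(f"{kwh_per_year:,.0f}")                  # close to the EIA's 10,649 kWh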

With about 600 hyperscale data centers – those exceeding 5,000 servers and 10,000 square feet – around the world, it's unclear exactly how much power is needed to store all of our data, but the number is probably staggering.

From an environmental perspective, the energy consumption of data centers and AI is also a nightmare.

A Google data center in Douglas County, Georgia.

AI, data and the environment

Most energy generation emits CO2, the main greenhouse gas produced by human activity. In the atmosphere, greenhouse gases like CO2 trap heat near the Earth's surface, raising the Earth's temperature and disrupting delicate ecosystems.

“We have an energy consumption crisis,” said Gerry McGovern, author of the book World Wide Waste.

AI is energy intensive, and the higher the demand for AI, the more energy we use, he said.

“It’s not just the electrical energy to train the AI,” he said. “It’s building the supercomputers. It’s collecting and storing the data.”

McGovern pointed to estimates that by 2035 humans will have produced more than 2,000 zettabytes of data.


“The storage energy required for this will be astronomical,” he said.

Currently, the heaviest users of data are not doing much about the carbon footprint or the problem of AI power consumption.

“I am aware of a certain recognition [of AI’s carbon footprint problem] but not a lot of action,” McGovern said. “Data centers, which are the ‘food source’ for AI, have focused on electrical efficiency and have certainly made major improvements over the last 10 years.”

While data centers have become more electrically efficient over the past decade, experts estimate that electricity accounts for only about 10% of a data center’s CO2 emissions, McGovern said. The infrastructure of a data center, including the building and cooling systems, also produces a lot of CO2.

On top of that, data centers use large amounts of water for evaporative cooling. This cooling method reduces electricity consumption but can consume millions of gallons of water per day at a single hyperscale data center. Additionally, the water used can become polluted in the process, McGovern noted.

“There is always this broad assumption that digital is inherently green, and it is far from it,” he said.

Environmental impact of companies

While the average business can't change how the biggest companies store their data, businesses that care about their environmental footprint can focus on creating high-quality data rather than large quantities of it. For example, they can delete data they no longer use; companies tend not to touch 90% of their data again 90 days after it is stored, according to McGovern.

Businesses can also adjust the way they use AI or the type of AI they use.

Organizations can think about the specific use case they want to address and choose an AI or automation technology dedicated to that use case, since different types of AI carry different power consumption costs.

Businesses can get carried away with the idea that they need an advanced deep learning system that can do it all, Pelz-Sharpe said. However, if they want to tackle a targeted use case, like automating an invoicing process, they don’t need an advanced system. These systems are expensive and use a lot of data, which means they have a high carbon footprint.

A dedicated system will have been trained on a much smaller amount of data, while likely handling the specific use case as well as a more general system would.

“Because it is highly specialized, the AI has been trained on the most precise data possible” while keeping the data set small, Pelz-Sharpe said. A deep learning model, on the other hand, has to process huge amounts of data to accomplish anything.

“In all of our decisions, we must take into account the experience of the Earth,” McGovern said.

