Karachi Chronicle

Explained: The environmental impact of generative AI | Massachusetts Institute of Technology News

By Adnan Mahar | January 17, 2025 | 7 Mins Read


MIT News explores the environmental impact of generative AI in a two-part series. This article explains why this technology is so resource-intensive. In the second article, we explore what experts are doing to reduce genAI’s carbon footprint and other impacts.

From increasing worker productivity to advancing scientific research, the excitement surrounding the potential benefits of generative AI is hard to ignore. The explosive growth of this new technology has enabled the rapid deployment of powerful models across many industries, but the environmental impact of this generative AI “gold rush” remains difficult to pin down, let alone mitigate.

The computational power required to train generative AI models, which often have billions of parameters, such as OpenAI’s GPT-4, can demand a staggering amount of electricity, leading to increased carbon dioxide emissions and pressure on the electric grid.

Then, deploying these models in real-world applications so that millions of people can use generative AI in their daily lives, and subsequently fine-tuning the models to improve their performance, draws large amounts of energy long after a model has been developed.

In addition to electricity demands, a great deal of water is needed to cool the hardware used to train, deploy, and fine-tune generative AI models, which can strain municipal water supplies and disrupt local ecosystems. The growing number of generative AI applications has also spurred demand for high-performance computing hardware, adding indirect environmental impacts from its manufacture and transport.

“When we think about the environmental impact of generative AI, it’s not just the electricity consumed when you plug a computer in. There are broader consequences that go out to a system level and persist based on the actions we take,” says Elsa A. Olivetti, professor in the Department of Materials Science and Engineering and leader of the Decarbonization Mission of MIT’s new Climate Project.

In response to an Institute-wide call for papers exploring the transformative potential of generative AI, in both positive and negative directions for society, Olivetti co-authored a 2024 paper with MIT colleagues titled “The Climate and Sustainability Implications of Generative AI.”

Demanding data centers

The electricity demands of data centers are one major factor contributing to the environmental impact of generative AI, since data centers are used to train and run the deep learning models behind popular tools such as ChatGPT and DALL-E.

A data center is a temperature-controlled building that houses computing infrastructure such as servers, data storage drives, and networking equipment. For example, Amazon has more than 100 data centers around the world, and each data center uses approximately 50,000 servers to support its cloud computing services.

While data centers have existed since the 1940s (the first was built at the University of Pennsylvania in 1945 to support ENIAC, the first general-purpose digital computer), the rise of generative AI has dramatically increased the pace of data center construction.

“What makes generative AI different is the power density it requires. Fundamentally, it is just computing, but a generative AI training cluster might consume seven or eight times more energy than a typical computing workload,” says Noman Bashir, lead author of the impact paper, a Computing and Climate Impact Fellow at the MIT Climate and Sustainability Consortium (MCSC) and a postdoc in the Computer Science and Artificial Intelligence Laboratory (CSAIL).

Scientists have estimated that the power requirements of data centers in North America increased from 2,688 megawatts at the end of 2022 to 5,341 megawatts at the end of 2023, driven in part by the demands of generative AI. Globally, data centers consumed 460 terawatt-hours of electricity in 2022. That would have made data centers the 11th-largest electricity consumer in the world, between the nations of Saudi Arabia (371 terawatt-hours) and France (463 terawatt-hours), according to the Organization for Economic Co-operation and Development.

By 2026, data center electricity consumption is expected to approach 1,050 terawatt-hours (which would bump data centers up to fifth place on the global list, between Japan and Russia).

Although not every data center computation involves generative AI, the technology has been a major driver of increasing energy demands.

“The demand for new data centers cannot be met in a sustainable way. The pace at which companies are building new data centers means the bulk of the electricity to power them must come from fossil fuel-based power plants,” Bashir says.

The power required to train and deploy a model such as OpenAI’s GPT-3 is difficult to pin down. In a 2021 research paper, scientists from Google and the University of California, Berkeley, estimated that the training process alone consumed 1,287 megawatt-hours of electricity (enough to power about 120 average U.S. homes for a year) and generated about 552 tons of carbon dioxide.
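As a back-of-envelope check on that homes comparison, the arithmetic can be sketched in a few lines; the per-home consumption is an assumed figure (roughly 10,700 kWh per year for an average U.S. household), not a number stated in the paper:

```python
# Back-of-envelope check: how many average U.S. homes could the
# electricity used to train GPT-3 power for a year?
TRAINING_MWH = 1_287          # estimated GPT-3 training energy (MWh)
HOME_KWH_PER_YEAR = 10_700    # assumed average U.S. household use (kWh/yr)

training_kwh = TRAINING_MWH * 1_000
homes_powered = training_kwh / HOME_KWH_PER_YEAR
print(f"~{homes_powered:.0f} average U.S. homes for one year")  # ~120
```

The result lands close to the paper's "about 120 homes" figure, which suggests the authors assumed a similar per-household baseline.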

All machine learning models need to be trained, but one of the challenges specific to generative AI is that energy usage fluctuates rapidly during different stages of the training process, Bashir explains.

Grid operators must have a way to absorb these fluctuations to protect the grid, and typically employ diesel-based generators for that purpose.

Increasing influence through inference

Once a generative AI model is trained, its energy demands do not disappear.

Every time the model is used, perhaps by an individual asking ChatGPT to summarize an email, the computing hardware that performs those operations consumes energy. Researchers estimate that ChatGPT queries consume approximately five times more power than a simple web search.
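To put the "five times" estimate in concrete terms, here is an illustrative sketch; the 0.3 watt-hour baseline for a conventional web search and the daily query volume are assumed figures chosen for illustration, not numbers from this article:

```python
# Illustrative per-query energy comparison (baseline figures are assumptions).
WEB_SEARCH_WH = 0.3         # assumed energy of one conventional web search (Wh)
CHATGPT_MULTIPLIER = 5      # researchers' estimate: ~5x a simple web search

chatgpt_query_wh = WEB_SEARCH_WH * CHATGPT_MULTIPLIER
queries_per_day = 10_000_000    # hypothetical daily query volume
daily_kwh = chatgpt_query_wh * queries_per_day / 1_000
print(f"~{chatgpt_query_wh} Wh per query, ~{daily_kwh:,.0f} kWh per day")
```

Even with modest per-query numbers, the totals scale quickly with query volume, which is why inference adds up.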

“But everyday users don’t think much about it,” Bashir says. “The ease of use of generative AI interfaces and the lack of information about the environmental impact of our actions means that we as users have little incentive to reduce our use of generative AI.”

In traditional AI, energy usage is split fairly evenly between data processing, model training, and inference (the process of using a trained model to make predictions on new data). But Bashir expects the electricity demands of generative AI inference to eventually dominate, since these models are becoming ubiquitous in so many applications, and the electricity needed for inference will increase as future versions of the models become larger and more complex.

Additionally, the increasing demand for new AI applications means that generative AI models have a particularly short shelf life. Because companies release new models every few weeks, the energy used to train previous versions is wasted, Bashir adds. New models typically have more parameters than previous models, so they often consume more energy to train.

While data center power demands may receive most of the attention in the research literature, the amount of water consumed by these facilities also impacts the environment.

Chilled water is used to cool data centers by absorbing heat from computing equipment. Bashir estimates that for every kilowatt-hour of energy consumed by a data center, it requires two liters of water for cooling.
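Combining Bashir's two-liters-per-kilowatt-hour estimate with the GPT-3 training figure cited earlier gives a rough sense of scale; this is an illustrative calculation joining two numbers from the article, not a measured value:

```python
# Rough cooling-water estimate using the 2 L per kWh figure.
LITERS_PER_KWH = 2                   # cooling water per kWh (Bashir's estimate)
GPT3_TRAINING_KWH = 1_287 * 1_000    # 1,287 MWh expressed in kWh

water_liters = GPT3_TRAINING_KWH * LITERS_PER_KWH
print(f"~{water_liters / 1e6:.2f} million liters of cooling water")  # ~2.57
```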

“Just because this is called ‘cloud computing’ doesn’t mean the hardware lives in the cloud. Data centers are present in our physical world, and because of their water usage they have direct and indirect implications for biodiversity,” he says.

The computing hardware inside data centers brings its own, less direct environmental impacts.

While it is difficult to estimate how much power is needed to manufacture a GPU, the type of powerful processor that can handle intensive generative AI workloads, it would be more than what is needed to produce a simpler CPU, because the fabrication process is more complex. A GPU’s carbon footprint is compounded by the emissions related to transporting materials and products.

Obtaining the raw materials used to manufacture GPUs also has environmental implications, including dirty mining procedures and potentially toxic chemicals used in processing.

Market research firm TechInsights estimates that the three largest manufacturers (NVIDIA, AMD, and Intel) shipped 3.85 million GPUs to data centers in 2023, up from about 2.67 million in 2022. That number is expected to grow by an even greater percentage in 2024.

Although the industry is on an unsustainable path, Bashir says there are ways to encourage the responsible development of generative AI that supports environmental goals.

He and Olivetti, along with their MIT colleagues, argue that this will require a comprehensive consideration of all the environmental and societal costs of generative AI, as well as a detailed assessment of the value of its perceived benefits.

“We need a more contextual way of systematically and comprehensively understanding the implications of new developments in this space. The speed at which improvements have come means we haven’t had a chance to catch up with our abilities to measure and understand the tradeoffs,” Olivetti says.




Adnan Mahar

Adnan is a passionate doctor from Pakistan with a keen interest in exploring the world of politics, sports, and international affairs. As an avid reader and lifelong learner, he is deeply committed to sharing insights, perspectives, and thought-provoking ideas. His journey combines a love for knowledge with an analytical approach to current events, aiming to inspire meaningful conversations and broaden understanding across a wide range of topics.

© 2025 karachichronicle. Designed by karachichronicle.