
The Environmental Impact of Artificial Intelligence

Data Story - AI & Technology - Published 27 February 2025

Artificial intelligence (AI) has exploded in recent years, making its way into everyday life and promising to reshape the job market. Whether it's AI chatbots like ChatGPT answering customer service queries, generative AI designing graphics and music, or AI-driven algorithms predicting financial trends, this technology is here to stay.

But AI's rise comes with consequences. While it promises to revolutionise productivity, job markets, and economic growth, it also demands an enormous amount of energy and resources. As AI scales up, so does its environmental footprint, raising critical questions about sustainability.

AI's rapid growth

AI is growing at a breakneck pace:

  • The global AI market is projected to contribute $15.7 trillion to the global economy by 2030, with a 23% boost in GDP. 
  • Companies are racing to integrate AI, with 49% of technology leaders stating that AI is now fully embedded in their core business strategies. In fact, this shift is already evident, as 37% of employees report working in companies driven by AI-powered data usage and organisation.
  • AI-powered tools are supercharging efficiency - some businesses are reporting 20-30% productivity gains, accelerating workflows, reducing costs, and creating new revenue streams.

What’s fueling this explosion? Three main factors: cheaper computing power, better machine learning models, and companies realising that AI isn’t just an experiment - it’s a necessity.

[Image: robots in front of their laptops]

AI and the job market

Whenever new technology takes off, it disrupts jobs, and AI is no different.

On one hand, AI is automating tasks, cutting down on repetitive work, and streamlining operations. However, many administrative and data-processing roles are at risk. The International Monetary Fund estimates that AI could impact 40% of jobs worldwide, with some roles disappearing entirely.

On the other hand, AI isn’t just eliminating jobs - it’s creating new ones. Entire fields like AI ethics, AI auditing, and machine learning engineering didn’t exist a decade ago. Many companies are hiring AI specialists at record rates, and workers who learn how to use AI effectively are becoming more valuable.

Then there's the "AI agent" revolution. Businesses are increasingly using AI-powered digital workers - virtual assistants that handle emails, schedule meetings, write reports, and even analyse market trends. 

AI in everyday life

Most people don’t realise just how deeply AI is embedded in their daily routines. Whether it’s sorting emails, navigating traffic, or curating personalised content, AI is working behind the scenes to enhance efficiency and convenience.

Some examples?

  • Entertainment: Netflix, Spotify, and YouTube all use AI to recommend what to watch, listen to, or read next.
  • Shopping: AI curates personalised ads and product recommendations, influencing what people buy online.
  • Navigation: Google Maps and Waze use AI to analyse traffic patterns in real-time.
  • Healthcare: AI is improving medical diagnostics, predicting disease outbreaks, and even assisting in robotic surgeries.

Despite its widespread use, public awareness of AI’s presence in everyday life remains mixed. According to a Pew Research Center survey of 11,004 adults in the US, 27% of respondents say they interact with AI several times a day, while another 28% believe they use it about once a day or several times a week. However, 44% of respondents think they do not regularly interact with AI at all.

This perception gap suggests that AI’s role is far more extensive than people realise. In the same survey, only 30% of Americans could correctly identify six common examples of AI in daily life. This means that while many people report using AI sparingly, they likely interact with it far more frequently than they recognise.

AI’s role - from medicine to climate solutions

AI isn't just making life more convenient, it’s reshaping industries.

  • Healthcare: In some cases AI can analyse medical scans faster than human doctors, flagging potential health issues with high accuracy. Some hospitals are already using AI to optimise staffing and reduce patient wait times.
  • Technology: AI is speeding up software development, cybersecurity, and even chip manufacturing.
  • Finance: AI algorithms manage investments, detect fraud, and analyse market risks in real-time.
  • Climate change & sustainability: AI is helping model climate scenarios, optimise renewable energy grids, and even monitor deforestation through satellite imagery.

The potential is staggering. But there's a catch - AI itself has a negative impact on the environment.

Beyond its environmental impact, AI’s rapid expansion has also raised concerns about its misuse. For example, during the 2024 presidential election, Donald Trump circulated deepfake videos of Taylor Swift, potentially misleading fans into believing she had endorsed him. As AI-generated content becomes more advanced, addressing the ethical risks - and the additional energy demands of training and running these models - will be increasingly critical.

[Image: an artificial human face]

The hidden cost of artificial intelligence

For all its benefits, AI comes at a high environmental price.

  • Energy consumption: AI computing is far more power-intensive than traditional computing. AI servers require significantly more electricity than standard computing systems, with AI server racks consuming anywhere from 30 to 100 kilowatts - compared to just 7 kilowatts for traditional servers. This increased demand is due to AI’s reliance on high-powered hardware optimised for complex matrix-based calculations. Importantly, AI’s energy footprint extends beyond training: only 20–40% of AI’s total energy use comes from training models, while 60–70% comes from inference (the continuous process of running AI models in real-time).
  • Water consumption: AI data centers require extensive cooling, leading to high water consumption. Research from the University of California, Riverside and The Washington Post found that generating a 100-word email with ChatGPT-4 consumes 519 milliliters of water - roughly a full bottle. At scale, this puts increasing pressure on local water supplies.
  • Electronic waste: The AI boom is increasing demand for specialised hardware, leading to more discarded processors and server parts.
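
The rack power figures above translate into a large gap in annual electricity use. A quick sketch, assuming continuous 24/7 operation (a simplification - real utilisation varies):

```python
# Rough annual electricity use of server racks, assuming continuous
# (24/7) operation. Rack power figures come from the article above.
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_kwh(rack_power_kw: float) -> float:
    """Annual electricity consumption (kWh) of a rack running 24/7."""
    return rack_power_kw * HOURS_PER_YEAR

traditional = annual_kwh(7)    # traditional server rack
ai_low = annual_kwh(30)        # low end of AI rack range
ai_high = annual_kwh(100)      # high end of AI rack range

print(f"Traditional rack:  {traditional:,.0f} kWh/yr")
print(f"AI rack (30-100 kW): {ai_low:,.0f} - {ai_high:,.0f} kWh/yr")
print(f"Ratio: {ai_low / traditional:.1f}x - {ai_high / traditional:.1f}x")
```

On these assumptions, a single high-end AI rack draws roughly 14 times the electricity of a traditional rack over a year.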

AI’s resource consumption is only set to grow as competition in the sector accelerates. In the US a $500 billion private-sector investment led by OpenAI, SoftBank, and Oracle aims to expand AI infrastructure with the construction of new data centers, intensifying concerns over energy and water use. Meanwhile, China’s DeepSeek launch has disrupted the market, signaling an escalating global AI race with mounting environmental consequences.

Artificial intelligence is reshaping the world faster than we ever imagined. It’s revolutionising businesses, disrupting industries, and changing the way we work and live. But with its rapid rise comes an urgent need to address its environmental impact. The question is no longer “will AI transform the world?” The real question is: “can we make AI’s future a sustainable one?”

The rise of ChatGPT

Among the many AI applications reshaping our world, one stands out in both its impact and scale - ChatGPT. With hundreds of millions of users and a rapidly growing presence in daily workflows, AI-powered chatbots are driving up energy demand like never before.

The environmental cost of these systems is particularly striking, as the power required to generate text, answer queries, and sustain ongoing conversations adds up to an enormous carbon footprint. ChatGPT, one of the most widely used AI models, offers a case study in just how energy-intensive AI has become.

The environmental cost of ChatGPT’s rapid growth

By December 2024, ChatGPT had reached an astounding 300 million users - a number that continues to grow as AI becomes further embedded in daily life.

Yet, while AI adoption skyrockets, the environmental impact of ChatGPT remains largely unknown. Data centers are projected to expand by 28% by 2030, and AI’s energy demands are rapidly rising, with estimates suggesting it could account for 3–4% of global electricity consumption by the end of the decade. Carbon emissions linked to AI are also expected to double between 2022 and 2030, amplifying its environmental footprint.

The previous iteration of ChatGPT, GPT-3, with 175 billion parameters, consumed an estimated 1,287 MWh of electricity during training - equivalent to 502 metric tons of carbon emissions, or the annual footprint of 112 gasoline-powered cars!
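
A quick sanity check of these GPT-3 figures - the grid intensity and per-car footprint below are implied by the article's own numbers, not independent measurements:

```python
# Back-of-envelope check of the GPT-3 training figures quoted above.
training_mwh = 1_287      # electricity used for training GPT-3
emissions_t = 502         # metric tonnes of CO2e
cars_equivalent = 112     # gasoline-powered cars (annual footprint)

# Implied grid intensity in gCO2e per kWh
implied_intensity = emissions_t * 1_000_000 / (training_mwh * 1_000)
# Implied annual footprint per car in tonnes CO2e
per_car_t = emissions_t / cars_equivalent

print(f"Implied grid intensity: {implied_intensity:.0f} gCO2e/kWh")
print(f"Implied annual footprint per car: {per_car_t:.1f} tCO2e")
```

The implied per-car figure (~4.5 tonnes CO₂e per year) is in line with commonly cited averages for passenger vehicles, which supports the comparison.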

ChatGPT in 2025 

By 2025, ChatGPT is more relevant than ever, with users relying on AI for everything from planning vacations to handling tedious work emails.

However, as AI becomes more sophisticated, the need for multi-agent systems (where multiple AI models work together to complete complex tasks) will grow. This shift could significantly increase computing demands and energy consumption, ultimately raising AI’s overall carbon footprint.

So, how is ChatGPT shaping daily life in 2025?

  • Workplace productivity: From drafting emails and reports to summarising meetings, ChatGPT has become a standard tool in offices worldwide. While it boosts efficiency, its widespread use raises concerns about data security, job displacement, and the environmental impact of increased server demand.
  • Customer service & automation: Businesses increasingly rely on ChatGPT for customer interactions, reducing human workload but also generating vast amounts of automated text that contribute to digital clutter and energy consumption.
  • Education & learning: ChatGPT is now a go-to resource for students and professionals alike, assisting with tutoring, language translation, and even research. 
  • Personal assistance: Whether helping users plan trips, compose social media posts, or even generate meal plans, ChatGPT has become an everyday assistant. While this convenience is valuable, the energy costs of processing countless individual queries continue to grow.

As ChatGPT’s role in daily life expands, so too do questions about its environmental impact, ethical use, and long-term implications for human interaction.

Impact of generative AI

Take a simple task like using AI to draft a routine email. While writing in one language may seem inconsequential, ChatGPT relies heavily on GPUs (Graphics Processing Units) for both training and inference, given their high parallel processing capabilities - making even simple AI interactions surprisingly energy-intensive.

As a result, by some estimates a single AI image generation consumes roughly the same amount of energy as fully charging a smartphone, and even lightweight text responses carry a measurable energy cost. This highlights how even casual AI use contributes to excess carbon emissions - especially at scale.

In 2025, AI is deeply embedded in everyday tasks like email writing and search queries, making its environmental impact far greater than many users realise. With billions of interactions occurring daily, the cumulative emissions are both substantial and growing.

The energy intensity of different tasks

A study, conducted by Hugging Face and Carnegie Mellon University and led by Sasha Luccioni, an AI researcher at Hugging Face, evaluated 88 different AI models performing popular tasks such as:

  • Text generation;
  • Asking questions;
  • Image classification;
  • Image captioning;
  • Image generation.

The table below shows the study's results for the energy consumption of different AI tasks.

| AI task              | Energy per 1,000 queries (kWh) | CO₂ emissions per 1,000 queries   |
|----------------------|--------------------------------|-----------------------------------|
| Text classification  | 0.002                          | ~0.3 g CO₂e                       |
| Text generation      | 0.047                          | ~7.5 g CO₂e                       |
| Summarization        | 0.049                          | ~8 g CO₂e                         |
| Image classification | 0.007                          | ~1.1 g CO₂e                       |
| Object detection     | 0.038                          | ~6.1 g CO₂e                       |
| Image generation     | 2.9                            | 1,594 g CO₂e (≈4.1 miles driven)  |

The results from the study highlight the significant energy demands of generative AI compared to task-oriented AI.

Additional research supports these findings, showing that generating 1,000 images can produce as much CO₂ as driving 4.1 miles (6.6 km) in a gasoline-powered car. In contrast, text generation requires far less energy - consuming only 16% of a full smartphone charge, with emissions equivalent to driving just 0.0006 miles.

On average, studies have found that AI image generation is 60 times more energy-intensive than text generation. 
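
The "60 times" figure follows directly from the study's per-task energy numbers in the table above:

```python
# Energy per 1,000 queries (kWh), from the Hugging Face / Carnegie Mellon
# study figures quoted in the table above.
energy_kwh_per_1000 = {
    "text_classification": 0.002,
    "text_generation": 0.047,
    "summarization": 0.049,
    "image_classification": 0.007,
    "object_detection": 0.038,
    "image_generation": 2.9,
}

# Ratio of image generation to text generation energy use
ratio = energy_kwh_per_1000["image_generation"] / energy_kwh_per_1000["text_generation"]
print(f"Image generation uses ~{ratio:.0f}x the energy of text generation")
```

The exact ratio comes out just above 60, matching the "roughly 60 times more energy-intensive" claim.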

Beyond inference, generative AI models also require extensive training, further increasing their overall energy footprint. For context on inference costs, the BLOOMz AI model (with over 100 billion parameters) consumes around 0.0001 kWh per query, and even the least energy-efficient image-generation model in the study used as much energy as 522 smartphone charges for just 1,000 inferences.

Overall, the shift to AI agent networks and expanded generative AI use is expected to drive a significant increase in energy consumption, placing greater demand on data centers and raising concerns about the environmental impact of AI development.

What is the carbon footprint of GPT-4?

As AI adoption accelerates, understanding its environmental impact is crucial. While AI models like ChatGPT provide immense value in automating tasks and improving efficiency, their rapid growth comes with a significant carbon cost. The training and use of these models require vast amounts of computing power, electricity, and cooling, contributing to a rising energy footprint.

But just how big is this impact? To quantify AI’s environmental toll, we need to break down the carbon footprint of one of the most widely used AI models - GPT-4.

Before we dig into the numbers, let's first take a look at the difference between GPT-4 and ChatGPT: 

While ChatGPT and GPT-4 are often mentioned interchangeably, they serve distinct purposes: 

  • GPT-4 is the underlying AI model - a powerful, general-purpose language model capable of performing a wide range of tasks, from text generation and summarisation to translation and data analysis. It can be used in various applications beyond conversation, including research, business intelligence, and predictive modeling. 
  • ChatGPT, on the other hand, is a product built on GPT-4, specifically optimised for conversational interactions. It is fine-tuned to provide real-time responses, making it ideal for chatbots, virtual assistants, and customer support. 

In essence, GPT-4 is the foundation, while ChatGPT is a tailored implementation designed to enhance dialogue-based AI experiences.

Greenly’s methodology

To assess the carbon emissions associated with GPT-4, Greenly has modeled a realistic business use case: automatically replying to 1 million emails per month over the course of a year. This scenario captures the emissions generated across two key stages: training and use, both of which contribute to GPT-4’s overall carbon footprint.

For training, this estimate assumes that 25,000 NVIDIA A100 GPUs were used over a 100-day period at 30% workload. The data centers are modeled as large-scale, well-optimised facilities with a Power Usage Effectiveness (PUE) of 1.1, located in the United States, where the average electricity mix emits 403.6 gCO₂e/kWh (based on IEA data).

By considering both the training and usage phases, this methodology provides a comprehensive view of GPT-4’s carbon impact in a real-world business application.

Emissions linked to training GPT-4

GPT-4 first needs to be trained to become competent and efficient at writing appropriate email responses - a process that requires extensive computing power over many hours. In this scenario, as per our methodology, we assume the training occurs in large-scale data centers with a Power Usage Effectiveness (PUE) of 1.1.

Training GPT-4 in a data center requires a range of equipment, including servers, cooling systems, and networking infrastructure. Rather than accounting for the continuous operation of this equipment over a full year, the environmental impact is calculated by amortising the energy consumption from the 100-day training period over one year. This results in a total of 5,759,430 kgCO₂e (kilograms of carbon dioxide equivalent).
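
A minimal sketch of how an electricity-related training figure in this ballpark can be reproduced from the stated methodology. The per-GPU system power (~0.72 kW, including server overhead) is an assumption chosen here for illustration; the other inputs come from Greenly's methodology above:

```python
# Sketch of the training-phase electricity emissions, using the inputs
# from Greenly's methodology. GPU_SYSTEM_POWER_KW is an assumption
# (per-GPU draw including server overhead), not a published figure.
N_GPUS = 25_000                 # NVIDIA A100 GPUs
TRAINING_DAYS = 100
WORKLOAD = 0.30                 # 30% average utilisation
PUE = 1.1                       # power usage effectiveness
GRID_INTENSITY = 403.6          # gCO2e/kWh, US average (IEA)
GPU_SYSTEM_POWER_KW = 0.72      # assumed per-GPU draw incl. overhead

hours = TRAINING_DAYS * 24
energy_kwh = N_GPUS * GPU_SYSTEM_POWER_KW * hours * WORKLOAD * PUE
emissions_kg = energy_kwh * GRID_INTENSITY / 1_000

print(f"Training electricity: {energy_kwh:,.0f} kWh")
print(f"Training emissions:   {emissions_kg:,.0f} kgCO2e")
```

Under these assumptions the result lands within about 0.1% of the 5,759,430 kgCO₂e figure above, showing how sensitive the total is to the assumed per-GPU power draw.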

In addition to direct energy consumption, data centers also contribute to emissions through refrigerant leaks from cooling systems, which are essential for preventing overheating in high-performance computing environments. Over the course of a year, these refrigerant leaks are estimated to produce an additional 49,587 kgCO₂e.

The manufacturing of GPUs used to train GPT-4 also carries a significant footprint, producing an estimated 1,329,141 kgCO₂e (based on a 4-year lifespan and 30% workload).

The total emissions from training GPT-4 over one year amount to 7,138,158 kgCO₂e.

Emissions linked to GPT-4 use

Once GPT-4 has been fully trained, it can be deployed to perform the task of automatically responding to 1 million emails per month. Over the course of a year, this amounts to 12 million emails, with the following carbon emissions.

Factoring in electricity consumption, cooling requirements, and server manufacturing, the operation of GPT-4 for this task results in a total of 514,800 kgCO₂e over one year - equivalent to 42,900 kgCO₂e per month.

GPT-4's total carbon footprint

Based on these calculations, for GPT-4 to answer 1 million emails per month, the emissions from the use phase alone amount to 42,900 kgCO₂e.  However, when including both the training and use phases, amortised over a single month, the total impact rises to 637,771 kgCO₂e, which equates to approximately 360 round trips between Paris and New York in just one month (over 4,300 a year!).
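
The monthly total can be reproduced by amortising the annual training emissions over 12 months and adding the monthly use-phase figure. Small rounding differences from the article's 637,771 kgCO₂e are expected:

```python
# Amortising GPT-4's annual training emissions over 12 months and
# adding the monthly use-phase emissions, per the figures above.
TRAINING_KG_PER_YEAR = 7_138_158   # total training emissions
USE_KG_PER_MONTH = 42_900          # use-phase emissions per month

monthly_total = TRAINING_KG_PER_YEAR / 12 + USE_KG_PER_MONTH
print(f"Monthly total: {monthly_total:,.0f} kgCO2e")

# Implied footprint of one Paris-New York round trip,
# derived from the ~360 trips/month comparison above
per_round_trip = monthly_total / 360
print(f"Implied per round trip: {per_round_trip:,.0f} kgCO2e")
```

The implied figure of roughly 1,770 kgCO₂e per Paris-New York round trip is consistent with typical per-passenger estimates for long-haul flights.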

It should also be noted that if GPT-4 were to be used to respond to emails in more than one language, it would be necessary to re-train the technology for each language. This means that total emissions scale with the number of languages required, making the carbon footprint significant for companies operating in multiple languages. 

However, the impact can be mitigated by adjusting the training frequency or leveraging the model for multiple tasks, which helps amortise its overall environmental footprint over a broader range of applications.

[Image: a robot looking at a plant]

Key findings

  • Training is often the most carbon-intensive stage for a single task: Greenly’s calculations reveal that training GPT-4 results in significantly higher carbon emissions than its actual use - up to 15 times higher over the course of one year for the specific use case assessed. However, this comparison is based on a defined one-year period for a single task. In reality, GPT-4 is deployed globally across billions of queries every day, meaning that over time, the cumulative emissions from usage can quickly surpass those from training.
  • The choice of data center strongly influences emissions: The location and efficiency of the data center play a major role in determining emissions. For example, a traditional data center produces significantly more emissions than a well-optimised one. Additionally, the country in which the data center is located affects emissions levels due to differences in electricity grid intensity. Greenly’s calculation is based on a data center in the United States, but if the same model were trained in France, emissions would be approximately three times lower due to France’s lower-carbon electricity mix.
  • GPT-4 has significantly higher emissions than comparable AI models: Compared to other AI technologies performing similar tasks, GPT-4’s emissions are substantially higher. This is largely due to the greater number of GPU hours required for pre-training - the cumulative duration for which individual GPUs are used for deep learning. As a result, GPT-4 demands more electricity, more servers, and greater overall resource consumption.
  • GPT-4's energy use is approximately 20 times higher than GPT-3: GPT-4's computational intensity is driven by its massive 1.8 trillion parameters, significantly increasing electricity consumption, GPU hours, and overall environmental impact. Compared to GPT-3, GPT-4 requires around 20 times more energy across these factors.
  • The AI arms race is amplifying energy demand: More broadly, there is an escalating race among AI developers to scale up model parameters, pushing the limits of computational power. As models are trained on ever-larger datasets with increasing complexity, the energy required for training continues to rise, amplifying AI’s overall environmental footprint.

As AI adoption accelerates and its environmental footprint grows, the industry is under increasing pressure to find more sustainable solutions. While GPT-4 and similar models have set new benchmarks for performance, their high energy demands highlight the urgent need for efficiency improvements.

This is where emerging players like DeepSeek are stepping in, claiming to deliver comparable AI capabilities with a fraction of the computing power. Could this signal a shift toward more sustainable AI development? If DeepSeek’s approach proves scalable, it may offer a glimpse into a future where AI innovation and environmental responsibility go hand in hand.

DeepSeek - the AI disruptor with a smaller environmental impact?

On 20 January 2025, China’s DeepSeek sent shockwaves through the AI industry, promising a breakthrough that could dramatically reduce the energy footprint of artificial intelligence. The company claims its R1 model achieves performance comparable to OpenAI’s GPT-4 while using a fraction of the computing power.

Unlike its American rivals, which rely on thousands of high-powered Nvidia chips, DeepSeek claims to have trained its model using just 2,000 Nvidia H800 chips - a fraction of the 16,000+ chips needed for comparable models like Meta’s Llama 3.1. The result? Training DeepSeek R1 reportedly cost just $6 million, compared to an estimated $60 million for Meta’s model.

At first glance, this efficiency breakthrough appears to be an environmental game-changer. AI’s growing energy demands have raised alarms, with projections suggesting that data centers could account for up to 12% of US electricity consumption by 2028, up from 4.4% in 2023 - a similar trend can be expected in other technology-reliant countries around the world.

Globally, AI infrastructure is already responsible for roughly 1% of energy-related greenhouse gas emissions, with the International Energy Agency (IEA) warning that global electricity demand for AI and data centers could double by next year, reaching levels comparable to Japan’s entire annual electricity consumption. 

A new model for efficiency?

A key innovation behind this efficiency is DeepSeek’s Mixture-of-Experts (MoE) architecture. Unlike conventional AI models that activate their full network for every query, MoE assigns tasks to specialised sub-models, only activating the necessary computing power for a given request. This selective computation reduces overall energy consumption while maintaining performance.
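
The routing idea can be illustrated with a toy sketch: a router scores the available experts and only the winner actually runs, so the rest of the network consumes no compute for that query. This is a deliberately simplified illustration of the MoE principle, not DeepSeek's actual implementation:

```python
# Toy Mixture-of-Experts routing: only the top-scoring expert runs per
# query, so most of the "network" stays idle. Expert names and keyword
# scores here are hypothetical, purely for illustration.
from typing import Callable

def make_expert(name: str) -> Callable[[str], str]:
    def expert(query: str) -> str:
        return f"[{name}] handled: {query}"
    return expert

experts = {
    "code": make_expert("code"),
    "math": make_expert("math"),
    "general": make_expert("general"),
}

def route(query: str) -> str:
    """Pick one expert per query using trivial keyword scores."""
    scores = {
        "code": query.count("def") + query.count("bug"),
        "math": query.count("integral") + query.count("sum"),
        "general": 0.1,  # small fallback score
    }
    best = max(scores, key=scores.get)
    # Only `best` executes -- the other experts consume no compute.
    return experts[best](query)

print(route("fix this bug in my code"))   # routed to the 'code' expert
print(route("what is the weather like?")) # falls back to 'general'
```

In a real MoE model the router is a learned layer and experts are neural sub-networks, but the energy argument is the same: per-query compute scales with the active experts, not the full parameter count.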

Additionally, DeepSeek claims its model is more efficient in data storage and does not require the same level of high-performance computing hardware, helping to further reduce resource demand.

The company had to develop these innovations out of necessity - due to US sanctions restricting access to Nvidia’s most advanced AI chips. This limitation forced DeepSeek to design models that maximise efficiency rather than relying on large-scale computing power.

Could DeepSeek cut AI's carbon footprint?

If DeepSeek’s claims hold true, its approach could significantly lower the emissions associated with AI. Some key advantages include:

  • Lower energy demand: Training DeepSeek’s V3 model required only 2.78 million GPU hours, compared to Meta’s Llama 3.1 model, which used 30.8 million GPU hours despite relying on newer, more efficient chips.
  • Fewer AI chips: DeepSeek’s reliance on 2,000 Nvidia H800 chips, compared to the 16,000+ used by its competitors, means reduced demand for chip production, which itself is highly resource-intensive.
  • Local processing capabilities: As an open-weight model, DeepSeek’s AI can be downloaded and run on local devices, reducing the need for cloud computing and data center usage. If widely adopted, this could cut AI’s dependence on energy-hungry server farms.
  • Lower water consumption: Data centers not only consume massive amounts of electricity but also require large quantities of water for cooling. A 2023 study found that training a large AI model like GPT-3 could consume nearly a million liters of water. More critically, global AI demand is projected to account for 4.2–6.6 billion cubic meters of water withdrawals in 2027. If DeepSeek reduces reliance on large-scale data centers, it could help mitigate this impact.
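
The reported compute gap between the models can be made concrete with the figures quoted above:

```python
# Comparing reported training compute for DeepSeek V3 and Meta's
# Llama 3.1, using the figures quoted in this section.
deepseek_gpu_hours = 2_780_000    # DeepSeek V3: 2.78 million GPU hours
llama_gpu_hours = 30_800_000      # Llama 3.1: 30.8 million GPU hours
deepseek_chips = 2_000            # Nvidia H800 chips
rival_chips = 16_000              # chips for comparable models

print(f"GPU-hour ratio:   ~{llama_gpu_hours / deepseek_gpu_hours:.0f}x")
print(f"Chip-count ratio: ~{rival_chips / deepseek_chips:.0f}x")
```

If the claims hold, DeepSeek's training run used roughly one eleventh of Llama 3.1's GPU hours on one eighth as many chips.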

Could efficiency increase demand?

While DeepSeek’s efficiency breakthroughs are promising, they also raise concerns that this increased efficiency will lead to higher overall consumption. Microsoft CEO Satya Nadella acknowledged this risk, posting on X: “Jevons paradox strikes again! As AI gets more efficient and accessible, we will see its use skyrocket, turning it into a commodity we just can’t get enough of.”

If AI models become cheaper and more efficient, adoption could grow exponentially, ultimately increasing electricity demand rather than reducing it. 

The future of AI and sustainability?

DeepSeek’s model heralds a potential shift in AI development, challenging the assumption that expanding AI requires ever-growing infrastructure and energy use. If AI companies adopt similar low-power architectures and local processing capabilities, the sector could become less dependent on energy-intensive data centers.

However, it remains to be seen whether DeepSeek’s efficiency claims will translate into meaningful reductions in global AI emissions.

For AI to become truly sustainable, improvements in model efficiency must be paired with responsible deployment and investments in clean energy infrastructure. Otherwise, AI’s environmental footprint will continue to grow, even as individual models become more efficient.

How can we reduce the impact of AI?

As AI adoption accelerates, so does its environmental impact. However, there are several ways to reduce the energy intensity of generative AI:

  1. Optimised hardware: Transitioning from general-purpose GPUs to TPUs (Tensor Processing Units), such as Google’s, can improve energy efficiency and reduce electricity consumption. TPUs are specifically designed for machine learning workloads, making them more power-efficient.
  2. Smarter cloud computing: The choice of data center significantly impacts emissions. For example, Google’s Iowa data center, powered largely by clean energy, achieved a 5.4x reduction in emissions compared to data centers running on fossil-fuel-heavy grids.
  3. More efficient training mechanisms: Since training remains one of the most energy-intensive stages of AI development, optimising training algorithms can reduce emissions by up to 88x, cutting down on computational requirements and energy use. Techniques like model pruning, knowledge distillation, and quantisation can improve efficiency without sacrificing performance.
  4. Stronger policies and benchmarks: The AI industry currently lacks clear regulatory standards for sustainability. Implementing industry-wide benchmarks could encourage more responsible energy use, such as: requiring AI developers to report the full carbon impact of model training, regulating energy-intensive AI operations, and incentivising companies like Google, OpenAI, and Meta to adopt greater transparency in their emissions reporting.
  5. Renewable energy and smart data center location: Powering data centers with renewable energy can drastically reduce AI’s carbon footprint. Additionally, choosing locations with clean energy grids can make a major difference. For example, running AI models in France - where electricity is primarily nuclear and renewable - produces far fewer emissions than in parts of the United States, which rely more on fossil fuels.
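
Because emissions scale linearly with grid intensity, location alone changes the footprint of an identical workload. A sketch using the US figure from the methodology above and an illustrative French figure (~135 gCO₂e/kWh, an assumption chosen here to be consistent with the article's "roughly three times lower" estimate):

```python
# Same workload, two grids: emissions scale linearly with grid
# intensity. FRANCE_INTENSITY is an illustrative assumption, not an
# official figure; the US value comes from the methodology above.
ENERGY_KWH = 1_000_000            # hypothetical AI workload
US_INTENSITY = 403.6              # gCO2e/kWh (IEA, US average)
FRANCE_INTENSITY = 135.0          # assumed, for illustration

us_kg = ENERGY_KWH * US_INTENSITY / 1_000
fr_kg = ENERGY_KWH * FRANCE_INTENSITY / 1_000
print(f"US:     {us_kg:,.0f} kgCO2e")
print(f"France: {fr_kg:,.0f} kgCO2e ({us_kg / fr_kg:.1f}x lower)")
```

The workload itself is unchanged; only where it runs differs, which is why data center siting is one of the highest-leverage decisions available.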

While AI can improve efficiency, its energy demands must be managed carefully. Proactive strategies like these will be essential to making AI’s future more sustainable.
