New-generation artificial intelligence chatbots like ChatGPT have recently hit the mainstream. Thanks to their increasingly advanced capabilities (ChatGPT can produce remarkably accurate and realistic copy) and their ease of access (anyone can use ChatGPT for free), they’re being used for everything from homework to drafting legal briefs.
Some are even predicting that they’ll eventually change the shape of our job market.
At this point in time, ChatGPT is probably the most widely known and used chatbot on the market. Its technology is based on GPT-3, the third generation of OpenAI’s GPT language model. Because ChatGPT is built on top of GPT-3 rather than being a standalone model, it is difficult to calculate the precise share of GPT-3’s emissions that should be allocated to ChatGPT.
For this reason, we’ll look at GPT-3’s carbon footprint rather than ChatGPT’s; since the two are very similar, this still provides a good estimate.
Before we dig into the numbers, it’s worth recalling the difference between GPT-3 and ChatGPT: GPT-3 is the underlying large language model developed by OpenAI, while ChatGPT is a chatbot built on top of it and fine-tuned specifically for conversational use.
In order to assess the carbon emissions relating to GPT-3, Greenly has focused on a realistic task that a business might use GPT-3 for.
In this case, we’ll look at the emissions generated when GPT-3 is used to “reply automatically to 1 million emails per month, over the course of 1 year”.
There are two stages to the use of GPT-3 in this scenario: training and use. Each stage has its own carbon footprint, and both must be taken into account to calculate the scenario’s emissions accurately.
GPT-3 first needs to be trained in order to become competent and efficient at writing appropriate replies to emails, a process that takes many hours of computation. In this scenario, we assume that the training takes place in a classic data centre, which carries its own carbon cost.
Once GPT-3 has been effectively trained, the model can then be put to work answering 1 million emails per month, and this ongoing use generates further carbon emissions.
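To make this two-stage accounting concrete, here is a minimal sketch of how the footprints add up over the year. The figures and function names are illustrative placeholders, not Greenly’s measured values or methodology.

```python
# Illustrative sketch of the two-stage carbon accounting described above.
# All numeric values are hypothetical placeholders, NOT Greenly's figures.

TRAINING_EMISSIONS_TCO2E = 100.0   # one-off footprint of training the model (assumed)
PER_EMAIL_EMISSIONS_GCO2E = 1.0    # footprint of generating a single reply (assumed)

EMAILS_PER_MONTH = 1_000_000       # scenario from the article
MONTHS = 12                        # the scenario runs for one year

def scenario_footprint_tco2e(training_tco2e: float, per_email_gco2e: float) -> float:
    """Total footprint = one-off training + ongoing use over the year."""
    use_tco2e = per_email_gco2e * EMAILS_PER_MONTH * MONTHS / 1_000_000  # gCO2e -> tCO2e
    return training_tco2e + use_tco2e

print(scenario_footprint_tco2e(TRAINING_EMISSIONS_TCO2E, PER_EMAIL_EMISSIONS_GCO2E))
```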
It should also be noted that if GPT-3 were to be used to respond to emails in more than one language, it would be necessary to re-train the technology for each language.
This means that the total emissions would need to be multiplied by the number of languages required. Where a company operates in a number of different languages, the resulting carbon emissions are significant.
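Continuing the sketch above (still with placeholder figures), the language requirement becomes a simple multiplier on the training stage, which is also what drives the total given how heavily training dominates:

```python
# Re-training is needed once per language, so the training footprint scales
# with the number of languages supported (placeholder figures, as above).
def multilingual_footprint_tco2e(n_languages: int,
                                 training_tco2e: float,
                                 use_tco2e: float) -> float:
    """One full training run per language, plus the same ongoing use."""
    return n_languages * training_tco2e + use_tco2e

# A company replying in 5 languages roughly quintuples the dominant training cost.
print(multilingual_footprint_tco2e(5, training_tco2e=100.0, use_tco2e=12.0))
```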
Training is the most carbon-intensive stage. Greenly’s calculation reveals that training GPT-3 results in significantly higher carbon emissions than its actual use in this scenario - as much as 230 times higher, meaning training accounts for roughly 99.6% (230/231) of the total footprint.
The choice of data centre strongly influences emissions. For example, a classic data centre produces 40% more emissions than an optimised data centre. The country in which the data centre is located also plays a part: Greenly’s calculation is based on a data centre in France, but if it had instead been based on a data centre in the US, the emissions would have been 6 times higher.
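One way to see why location matters so much is that a workload’s emissions scale with the carbon intensity of the electricity grid powering the data centre. The sketch below uses rough, publicly reported grid intensities as assumed inputs (they are not Greenly’s figures) to show how the same job diverges between a French and a US data centre:

```python
# Emissions = electricity consumed x carbon intensity of the local grid.
# Grid intensities are rough approximations used only for illustration.
GRID_INTENSITY_GCO2E_PER_KWH = {
    "France": 55,          # largely nuclear and hydro, low-carbon grid (approx.)
    "United States": 370,  # average mix, much more fossil-heavy (approx.)
}

def workload_emissions_kgco2e(energy_kwh: float, country: str) -> float:
    """Convert a workload's electricity use into kgCO2e for a given grid."""
    return energy_kwh * GRID_INTENSITY_GCO2E_PER_KWH[country] / 1000  # g -> kg

same_workload_kwh = 10_000  # identical job run in both locations (assumed)
fr = workload_emissions_kgco2e(same_workload_kwh, "France")
us = workload_emissions_kgco2e(same_workload_kwh, "United States")
print(f"France: {fr:.0f} kgCO2e | US: {us:.0f} kgCO2e | ratio ~{us / fr:.1f}x")
```

With these rough intensities the ratio comes out at around 6 to 7, consistent with the factor of 6 above.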
GPT-3 produces higher carbon emissions than other comparable AIs. If we compare GPT-3 with other AI technologies carrying out the exact same task, it’s clear that GPT-3 results in significantly higher emissions. This is largely because the number of GPU hours (i.e. the total time spent across every individual GPU used for deep learning) needed to pre-train GPT-3 is between 100 and 30,000 times higher than for the other AIs, which means more electricity, more servers, and so on. This, in turn, is because GPT-3 has an incredibly high number of parameters (i.e. the internal values the model learns during training and uses to make its predictions) - 175 billion to be exact.
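The GPU-hours figure is also what drives most bottom-up estimates of training emissions. A common approach, sketched below with assumed hardware power, data-centre overhead (PUE) and grid intensity rather than Greenly’s actual inputs, multiplies GPU-hours by power draw and overhead to get electricity use, then applies the grid’s carbon intensity:

```python
# Bottom-up estimate of training emissions from GPU-hours (illustrative only;
# the power draw, PUE and grid intensity defaults are assumed values).
def training_emissions_tco2e(gpu_hours: float,
                             gpu_power_kw: float = 0.3,         # ~300 W per GPU (assumed)
                             pue: float = 1.5,                  # data-centre overhead (assumed)
                             grid_gco2e_per_kwh: float = 55.0,  # low-carbon grid (assumed)
                             ) -> float:
    """GPU-hours -> kWh -> tCO2e."""
    energy_kwh = gpu_hours * gpu_power_kw * pue
    return energy_kwh * grid_gco2e_per_kwh / 1_000_000  # gCO2e -> tCO2e

# All else being equal, a model needing 100x the GPU-hours emits ~100x more.
print(training_emissions_tco2e(1_000_000))  # hypothetical large model
print(training_emissions_tco2e(10_000))     # hypothetical smaller model
```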
New AI technologies such as GPT-3 and ChatGPT are exciting advancements that many believe have the potential to reshape how we work and how different industries operate.
However, they’re not flawless and further advances in this field can be expected. Already, Chinese firms such as Huawei and Inspur are working on AIs with more than 200 billion parameters.