AI

From Emergent Earth

What is the climate impact of AI?

  • The environmental cost includes not just energy use, but also:
    • Hardware manufacturing: the environmental impact of extracting and processing the materials required to build the hardware
    • Data center cooling
    • Network infrastructure
    • Server maintenance
    • The unequal distribution of AI's benefits, which could further exacerbate existing inequalities.


While individual AI queries might seem low-impact, the scale of global AI usage makes the technology's total environmental footprint a significant concern.

Measurements vary based on multiple factors and remain subject to ongoing research.

[Figure: Environmental cost of AI]

Training

Large language models (LLMs), such as ChatGPT (with GPT-4 as the backbone model), are costly to train. Researchers have estimated that training GPT-3 on a corpus of 500 billion words required 1,287 MWh of electricity and 10,000 computer chips, equivalent to the energy needed to power around 121 homes for a year in the USA.

Furthermore, this training produced around 550 tons of carbon dioxide, equivalent to flying 33 times from Australia to the UK.

Since the subsequent version, GPT-4, is estimated to have 570 times more parameters than GPT-3, it almost certainly required even more energy to train.

Big models come with big carbon footprints, driven by the number of parameters in the model, the power usage effectiveness (PUE) of the data center, and even the efficiency of the local grid. The heaviest carbon emitter by far was GPT-3, but even the relatively more efficient BLOOM took 433 MWh of power to train, enough to power the average American home for 41 years.
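The home-equivalence figures above can be sanity-checked with a quick calculation. The household consumption value below is an assumption (roughly 10.6 MWh per year for an average US home); the exact figure varies by year and source:

```python
# Sanity-check the "homes powered for a year" equivalences quoted above.
# Assumption: an average US household uses roughly 10.6 MWh per year.
AVG_US_HOME_MWH_PER_YEAR = 10.6

def homes_powered_for_a_year(training_mwh):
    """Number of average US homes one training run could power for a year."""
    return training_mwh / AVG_US_HOME_MWH_PER_YEAR

print(round(homes_powered_for_a_year(1287)))    # GPT-3: ~121 homes
print(round(433 / AVG_US_HOME_MWH_PER_YEAR))    # BLOOM: one home for ~41 years
```

Both quoted equivalences (121 homes for GPT-3, 41 home-years for BLOOM) fall out of the same assumed household figure, which suggests the source used a similar baseline.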

Link: https://hai.stanford.edu/news/2023-state-ai-14-charts

Usage

The environmental cost is not restricted to training; using these systems also has a cost. As an example, GPT-3 was accessed 590 million times in January 2023, leading to energy consumption equivalent to that of 175,000 people.[4] Moreover, at inference time, each ChatGPT query consumes energy equivalent to running a 5 W LED bulb for 1 hour and 20 minutes, representing 260.42 MWh per day.
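The per-query figure above is simple arithmetic: a 5 W bulb running for 1 hour 20 minutes. Scaling to a daily total additionally requires a queries-per-day value; the number used below is a back-calculated assumption chosen to reproduce the quoted 260.42 MWh, since the source does not state the actual daily query volume:

```python
# Per-query energy: a 5 W LED bulb running for 1 h 20 min (4/3 hours).
bulb_watts = 5
hours = 1 + 20 / 60
wh_per_query = bulb_watts * hours  # ~6.67 Wh per query

# Hypothetical daily query volume (assumption, back-calculated from the
# quoted 260.42 MWh/day figure; not stated in the article).
queries_per_day = 39_063_000
daily_mwh = wh_per_query * queries_per_day / 1e6

print(round(wh_per_query, 2))  # 6.67
print(round(daily_mwh, 2))     # 260.42
```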

Energy consumption per action, for comparison:
  • LLM query: 0.1 to 50 g CO2e
  • Google search: ~0.2–0.7 g CO2e
  • Sending an email: ~4 g CO2e
  • Boiling a kettle (electric): ~70 g CO2e

Energy Consumption Per Query

  • One query to a large language model typically consumes between 0.1 and 50 grams of CO2 equivalent, depending on:
    • Length of the conversation
    • Complexity of the query
    • Server location and energy source
    • Model size and architecture
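One reason the range above is so wide is the carbon intensity of the local grid: the same energy draw produces very different emissions depending on where the server runs. A minimal sketch of the conversion, using illustrative (not authoritative) intensity values:

```python
# Convert a query's energy draw into CO2e via grid carbon intensity.
# The intensity values below are illustrative assumptions; real grids vary widely.
GRID_GCO2_PER_KWH = {
    "coal-heavy grid": 800,
    "world average": 450,
    "low-carbon grid (hydro/nuclear)": 50,
}

def query_co2_grams(energy_wh, grams_co2_per_kwh):
    """CO2e in grams for one query with the given energy draw."""
    return energy_wh / 1000 * grams_co2_per_kwh

for grid, intensity in GRID_GCO2_PER_KWH.items():
    print(grid, round(query_co2_grams(6.67, intensity), 2))
```

Even before accounting for model size or conversation length, the grid factor alone spreads the estimate by more than an order of magnitude.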

Direct Energy Use

  • Server processing power
  • Cooling systems
  • Data center operations


Infrastructure Impact

  • Network transmission
  • Data storage
  • Hardware lifecycle
  • Water usage

Small Language Models

Training

Usage