OpenAI CEO Sam Altman recently took to social media to remind the world that humans themselves are energy-intensive creatures. In a brief but controversial statement, Altman pointed out that it also requires large amounts of energy to raise and educate a human — a post clearly intended as a response to the increasing criticism of the artificial intelligence industry's massive power consumption, according to TechCrunch.

What does it cost to train an AI model?

The figures for the energy consumption of large language models are significant. Training GPT-3, which has 175 billion parameters, is estimated to have required approximately 1,287 MWh of electricity and generated around 502 tons of CO₂ equivalents. This corresponds to the annual electricity consumption of about 130 American households.
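A quick back-of-the-envelope check, assuming an average US household uses roughly 10.5 MWh of electricity per year (an assumption not stated in the article), lands in the same range:

```python
# Sanity check of the "about 130 households" comparison.
# Assumes ~10.5 MWh of electricity per US household per year (assumption).
gpt3_training_mwh = 1_287
household_mwh_per_year = 10.5
print(round(gpt3_training_mwh / household_mwh_per_year))  # ~123 households, close to the cited "about 130"
```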

For GPT-4, which is far more complex, the estimate is dramatically higher: over 50,000 MWh, with a carbon footprint potentially ten to one hundred times larger than its predecessor, according to available research data.

50,000 MWh: estimated electricity consumption for GPT-4 training
502 tons: CO₂ equivalents from GPT-3 training

In addition to the training itself, ongoing operational consumption is substantial. Each query to ChatGPT uses an estimated 0.34 Wh. If the service handles one billion requests daily, that adds up to around 124 GWh per year, plus water consumption for cooling.
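The annual figure follows directly from the per-query estimate; a minimal reproduction of the arithmetic:

```python
# Reproducing the operational estimate: 0.34 Wh per query,
# one billion queries per day, expressed as annual GWh.
wh_per_query = 0.34
queries_per_day = 1_000_000_000
annual_gwh = wh_per_query * queries_per_day * 365 / 1e9  # Wh -> GWh
print(round(annual_gwh, 1))  # ~124.1 GWh per year
```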

Altman: Training a human also requires a lot of energy

What does it cost to "train" a human?

Altman's point is not entirely without merit, although it is difficult to put precise figures on it. According to the International Energy Agency (IEA), a person born in the 1950s causes around 350 tons of CO₂ emissions over their lifetime. For children born in the 2020s, the estimate drops to 34 tons of CO₂, assuming the world reaches net-zero emissions by 2050.

The world's average energy consumption in 2018 was 79 gigajoules per person per year; in the US, it was closer to 284 gigajoules. Over a lifetime, this adds up to very large numbers, as the rough conversion below illustrates.
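As a rough conversion (assuming an 80-year lifespan, a figure not given in the article), the lifetime totals come out in the thousands of megawatt-hours:

```python
# Converting annual per-capita energy use (all energy, not just electricity)
# to lifetime totals, assuming an 80-year lifespan (assumption for illustration).
GJ_PER_MWH = 3.6
LIFESPAN_YEARS = 80

for label, gj_per_year in [("World average (2018)", 79), ("US (2018)", 284)]:
    lifetime_mwh = gj_per_year / GJ_PER_MWH * LIFESPAN_YEARS
    print(f"{label}: ~{lifetime_mwh:,.0f} MWh over {LIFESPAN_YEARS} years")

# World average (2018): ~1,756 MWh; US (2018): ~6,311 MWh.
# For scale, the GPT-3 training estimate above is ~1,287 MWh, though the two
# quantities measure very different things (see the caveat below).
```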

It also requires a lot of energy to train a human.

Direct comparisons between human life-cycle energy and AI training are methodologically challenging, as the data sources measure very different things. Human energy use includes food, transport, housing, healthcare, and consumer goods over several decades, while AI training is a discrete, industrial process.

A defensive but not meaningless comparison

Critics would argue that Altman's post is a rhetorical move to deflect attention from an industry with a rapidly growing carbon footprint. This is not unreasonable: the energy consumption of data centers globally is increasing sharply in step with the AI boom, and the sector faces mounting pressure from both authorities and investors.

The energy consumption of the AI sector is growing faster than any efficiency gains can compensate for

At the same time, Altman's post touches on a legitimate point in the broader debate: energy consumption alone is not necessarily an argument against a technology, but the question of utility versus cost is relevant to ask, for human activity as well as for artificial intelligence.

Researchers are working to make the training of AI models more efficient. Frameworks like Zeus, developed at the University of Michigan, are said to be able to reduce energy requirements by up to 75 percent without new hardware, and model compression can provide savings of up to 44 percent.
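For teams that want to measure rather than guess, Zeus exposes a GPU energy-measurement interface. The following is a minimal sketch based on Zeus's documented ZeusMonitor API; the training loop is a placeholder, and actual savings will depend on the workload:

```python
# Minimal sketch of measuring GPU energy with Zeus (https://ml.energy/zeus).
# Assumes Zeus's documented ZeusMonitor interface; the training step is a placeholder.
from zeus.monitor import ZeusMonitor

monitor = ZeusMonitor(gpu_indices=[0])
monitor.begin_window("training")
# ... run the training loop here ...
measurement = monitor.end_window("training")
print(f"Energy: {measurement.total_energy:.0f} J over {measurement.time:.1f} s")
```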

The debate continues

Altman's statement is symptomatic of an industry that is increasingly forced to defend its resource use to a skeptical public. The fact that OpenAI's top executive chooses to draw parallels to human energy use rather than presenting concrete climate plans is unlikely to dampen the criticism — but it highlights that the energy debate surrounding AI is far more complex than simple headlines often suggest.