Electricity for AI Exceeds Transportation

The rapid explosion in artificial intelligence (AI) computing chips has produced projections that electricity for AI will exceed that for transportation; some have claimed five times more. Is this really true? Electricity for AI is driven by the huge compute needed to train AI models. New customised chips from Nvidia, Tesla and others, designed from first principles, are much faster and consume less power per operation. Even so, when Microsoft, OpenAI, Google, Meta, Tesla or others run datacentres with over 100,000 AI computers, each drawing hundreds of watts, the claimed aggregate demand reaches terawatt-hour scale and would be an order of magnitude more than transportation.

Ironically, electricity for AI could exceed that needed for vehicle electrification. A single data centre can use 50 MW, more than a small city, and estimates are that data centre electricity demand will rise from 3% of US electricity demand today to 4–5% by 2030. The IEA says the demand will double by 2026.
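A back-of-envelope sketch makes the scale concrete. The per-chip wattage and PUE below are illustrative assumptions, not figures from any specific vendor:

```python
# Back-of-envelope scale check for a large AI cluster.
# The per-chip draw and PUE are illustrative assumptions.
CHIPS = 100_000               # AI computers in one datacentre
WATTS_PER_CHIP = 700          # roughly an H100-class accelerator
PUE = 1.3                     # power usage effectiveness (cooling etc.)
HOURS_PER_YEAR = 8_760

site_power_mw = CHIPS * WATTS_PER_CHIP * PUE / 1e6
annual_energy_twh = site_power_mw * HOURS_PER_YEAR / 1e6

print(f"Site power: {site_power_mw:.0f} MW")        # ~91 MW
print(f"Annual energy: {annual_energy_twh:.2f} TWh")
```

On these assumptions a single 100,000-chip site is on the order of 100 MW and ~1 TWh per year; terawatt-hour scale arrives only once many such sites are aggregated.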

“My not-that-funny joke is that you need transformers to run transformers. You know, the AI is like… There’s this thing called a transformer in AI… I don’t know, it’s a combination of sort of neural nets… Anyway, they’re running out of transformers to run transformers.

“Then, the next shortage will be electricity. They won’t be able to find enough electricity to run all the chips. I think next year, you’ll see they just can’t find enough electricity to run all the chips.”

Elon Musk, New Atlas [4]
Figure: Data centre electricity demand. From the IEA report Electricity 2024 – Analysis and Forecast to 2026.

The IEA says the rate at which electricity usage will increase by 2026 depends on the pace of deployment, the range of efficiency improvements, and trends in artificial intelligence and cryptocurrency.

  • Demand will rise to somewhere between 650 TWh and 1,050 TWh by 2026.
  • This increase is equivalent to adding the entire power consumption of a country like Sweden at the lowest end of the scale, or Germany at the highest.
  • The Australian National Electricity Market (NEM) uses 210 TWh annually; electrifying Australian transport would need about a further 50 TWh.
  • Hence the claimed 5x demand: the high-end 2026 estimate is five times the entire NEM.
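A quick check of the ratios behind the 5x claim, using the figures above:

```python
# Figures from the IEA bullets above (TWh per year).
iea_high_twh = 1_050     # high-end 2026 data-centre demand
nem_twh = 210            # Australian NEM annual consumption
transport_twh = 50       # extra demand to electrify Australian transport

print(iea_high_twh / nem_twh)        # 5.0: five times the entire NEM
print(iea_high_twh / transport_twh)  # 21.0: ~20x transport electrification
```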

Electricity Demand is Overstated

Prof John Quiggin has a good summary, which notes that a typical modern computer consumes around 30–60 watts when operating, less than a bar fridge or an incandescent light bulb. The rise of large data centres and cloud computing produced another round of alarm: a US EPA report in 2007 predicted a doubling of demand every five years. Again, this number fed into various debates about renewable energy and climate change.

The simplest explanation, epitomised by the Forbes article from 1999, is that coal and gas producers want to claim that there is a continuing demand for their products, one that can’t be met by solar PV and wind. That explanation is certainly relevant today, as gas producers in particular seize on projections of growing demand to justify new plants.

Prof John Quiggin, University of Queensland [9]

AI Demands Extremely Large Compute

AI is compute-intensive. Ark Invest, in their 2024 white paper [5], predict that the cost of training will decrease exponentially: the ~$4.5 million cost to train a model could fall to under $30 by 2030. AI needs two kinds of computers:

  • Training computers, which work on ever-larger data sets.
  • Inference computers, which use the neural nets created by training. These can be very powerful, or limited enough to run even in modern mobile phones.
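Ark's cost-decline claim implies a steep annual rate. A sketch, assuming the fall from $4.5 million to $30 happens over roughly six years (the window is my assumption, not Ark's):

```python
# Implied annual cost decline if training falls from $4.5M to $30.
# The six-year window (roughly 2024 -> 2030) is an assumption.
start_cost, end_cost, years = 4_500_000, 30, 6
annual_factor = (end_cost / start_cost) ** (1 / years)
print(f"Each year costs ~{annual_factor:.1%} of the year before "
      f"(a ~{1 - annual_factor:.0%} annual decline)")
```

That is a decline of well over 80% per year, far steeper than Moore's law.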

Tesla took the strategy of a very low-power inference computer (Hardware 3 or 4) in their vehicles, drawing less than 100 W. To compensate, they need much more compute (exaflops) in their training computers. Musk stated in March 2024 that Tesla is no longer compute-constrained for its AI work on Grok, autonomous vehicles, and the Optimus robot.

Herbert Ong discusses this in his Brighter with Herbert video, “Tesla’s Hidden Compute Power: Bigger Than Expected! w/ Brian Wang”. Brian Wang reports it in detail on his website [2].

Tesla published this graph explaining why they built their own Dojo supercomputer, while at the same time spending hundreds of millions buying Nvidia H100 compute engines. They had about 2 exaflops in January 2023 and planned to have 100 exaflops by October 2024.

In mid-2023, xAI and Tesla announced a rapid build-out of AI compute (SemiAnalysis [3]).

What was Tesla’s status in April 2024?

Figure: Tesla compute projections in 2023. From Tesla.

Brian Wang states that Tesla has not been alone in purchasing A100 or H100 chips: Meta has purchased over 350,000 A100s, while Tesla has purchased over 10,000 H100s. An H100 delivers ~4 PFLOPS. In all probability Tesla already had over 120 exaflops by March 2024, which would explain Musk’s comments.

By December 2024, Tesla will have bought and installed over 800 exaflops:

  • ~20,000 B100s, at 20 PFLOPS each, i.e. 400 exaflops (Tesla is listed as a primary purchaser of the Blackwell chips)
  • ~100,000 H100s, at 4 PFLOPS each, i.e. 400 exaflops
  • An unknown number of in-house Dojo systems, maybe ~200 exaflops
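The 800-exaflop total can be reproduced from the bullets (B100 at the text's ~20 PFLOPS, H100 at ~4 PFLOPS):

```python
# Reproducing the ~800 exaflop total from the bullets above.
# 1 exaflop = 1,000 PFLOPS.
b100_exaflops = 20_000 * 20 / 1_000    # 400 exaflops
h100_exaflops = 100_000 * 4 / 1_000    # 400 exaflops
dojo_exaflops = 200                    # rough in-house estimate from the text

print(b100_exaflops + h100_exaflops, "exaflops from purchased GPUs")
print(b100_exaflops + h100_exaflops + dojo_exaflops, "including Dojo")
```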

AI compute is increasing by roughly 10x annually, which allows AI capability to expand rapidly.

AI Datacenters Need MW of Electricity

  • An Nvidia H100 chip’s power demand is ~700 W.
  • The new Blackwell-based B200 system, with 8 GPUs and two Intel CPUs, draws 14 kW max, with compute of 72 petaFLOPS for training and 144 petaFLOPS for inference.
  • A datacenter equipped with 20,000 such systems, along with petabytes of data storage, would need ~300 MW of power.
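As a cross-check of the ~300 MW figure, a quick sketch assuming the datacenter holds 20,000 of the 14 kW Blackwell systems described above:

```python
# Cross-check of the ~300 MW figure: 20,000 of the 14 kW systems.
systems = 20_000
kw_per_system = 14
site_power_mw = systems * kw_per_system / 1_000
print(site_power_mw, "MW")  # 280.0 MW, roughly the 300 MW cited
```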

A Microsoft data center employee says they cannot put more than 100,000 H100s in one US state without breaking that state’s power supply.

Meta, Microsoft, and xAI are all looking to co-locate with under-utilised power plants, and some have even proposed siting a data center with its own dedicated nuclear power plant.

Because AI training needs fast access to its data and caches, splitting a training run across geographically separate sites is impractical given data-transmission costs.

Brian Wang (Nextbigfuture) has presented these rough projections for Grok (the xAI LLM); the trajectory is similar for OpenAI’s GPT.

Grok/GPT Version | Compute | Data Source | Tokens | Energy (MW) | Year
Grok-2 | 20,000 H100 | Regular data | 2 | 20 | 2024
Grok-3 | 100,000 H100 | All regular data | 40 | 100 | 2024
Grok-4 | 2X | 50% synth & RWV data | 100 | 150 | 2025
Grok-5 | 5X | 75% synth & RWV data | 200 | 300 | 2025
Grok-6 | 10X | 90% synth & RWV data | 400 | 350 | 2025
Grok-7 | 25X | 96% synth & RWV data | 1,000 | 400 | 2026
Grok-8 | 125X | 98% synth & RWV data | 2,000 | 800 | 2027
Grok-9 | 625X | 99% synth & RWV data | 4,000 | 1,600 | 2028
Grok-10 | 3000X | 99.5% synth & RWV data | 8,000 | 3,000 | 2029

Table from Next Big Future [2]
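The table's power draws can be converted into annual energy, assuming (my assumption) round-the-clock operation for a full year:

```python
# Annual energy implied by the table's steady power draws (MW),
# assuming round-the-clock operation for a full year.
HOURS_PER_YEAR = 8_760
annual_twh = {name: mw * HOURS_PER_YEAR / 1e6
              for name, mw in [("Grok-3", 100), ("Grok-7", 400), ("Grok-10", 3_000)]}
for name, twh in annual_twh.items():
    print(f"{name}: {twh:.2f} TWh/year")
```

The Grok-10 row (3,000 MW) works out to ~26 TWh per year, matching the per-datacenter figure used later in the article.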

Electricity for AI Exceeds Transportation – Really?

Transportation

Transport electrification increases electricity demand by about 20%. For example, in Australia the NEM currently supplies about 210 TWh annually, and an additional ~50 TWh would electrify transportation. With an average vehicle age of 10 years, demand would increase by about 5 TWh per year over that decade, which is about the normal annual increase.
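The Australian arithmetic above can be checked directly:

```python
# Australian transport-electrification figures from the text.
nem_twh = 210               # current annual NEM supply (TWh)
transport_twh = 50          # extra demand to electrify transport (TWh)
fleet_turnover_years = 10   # average vehicle age

print(f"Extra demand: {transport_twh / nem_twh:.0%} of the NEM")
print(f"Spread over fleet turnover: ~{transport_twh / fleet_turnover_years:.0f} TWh/year")
```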

Data Centers and AI Demand

  • Currently, data centers use about 5% of US electricity, and many expect that to increase by 4x over the next decade.
  • Some suggest many electricity grids are unsuited to the green revolution (e.g. Altimetry [1]); more interconnected grids give wider resilience to intermittent electricity supply.
  • Current US electricity consumption is about 4,178 terawatt-hours (TWh).
  • AI will add ~26 TWh of annual electricity demand for every large AI datacenter.
  • Ark Invest says [8] that for 100 AI datacenters, demand would be 2,600 TWh, a ~50% growth in total electricity demand and 4 times the expected demand from transport electrification. This is an order of magnitude more than McKinsey's figure. Who is correct?
  • McKinsey has a much lower number: 50 GW, or 385 TWh, for AI datacenters [7].
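Putting the competing estimates from the bullets above side by side:

```python
# The competing 2030 estimates, against current US consumption.
us_total_twh = 4_178
ark_twh = 2_600        # Ark Invest: 100 datacentres x ~26 TWh
mckinsey_twh = 385     # McKinsey

print(f"Ark:      {ark_twh / us_total_twh:.0%} of current US demand")
print(f"McKinsey: {mckinsey_twh / us_total_twh:.0%} of current US demand")
print(f"Ark is ~{ark_twh / mckinsey_twh:.0f}x McKinsey")
```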

On these projections, the demand for electricity for AI would exceed the demand for electrifying transportation.

Growth Projections

Prediction Source | Capacity 2023 | Capacity 2030 | Electricity 2030 (TWh) | Increase (CAGR)
McKinsey | 21 GW | 50 GW | 285 | 2.7%
Ark Invest | 21 GW | 260 GW | 2,300 | 10 times
Morgan Stanley | 18 GW | 45 GW in 2024 | n/a | n/a

The global growth rate is from 2.7% up to 3.4%.

McKinsey's growth projections are a tenth of Ark Invest's. The reporting gives two data points, 2023 and 2030; translated into the graphed data, they have essentially reduced the compound annual growth rate. How will this energy be supplied, given that grid-interconnection delays run up to 4 years? It means the energy will need to be generated behind the meter if aggressive AI targets are to be met.
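The capacity columns of the table imply these compound annual growth rates (a 7-year window, 2023 to 2030, is assumed):

```python
# Implied compound annual growth rates from the table's
# capacity figures (7 years, 2023 -> 2030).
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

print(f"McKinsey:   {cagr(21, 50, 7):.0%}/year")    # ~13%
print(f"Ark Invest: {cagr(21, 260, 7):.0%}/year")   # ~43%
```

The gap between ~13% and ~43% annual capacity growth is what pushes the two 2030 electricity projections an order of magnitude apart.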

Conclusion

It is unlikely that electricity for AI will exceed transportation. Electricity demand will continue to rise with more AI, but the growth of cheap solar and battery storage is adequate to meet it.

More Reading 

  1. The U.S. power grid wasn’t built for the green movement, Altimetry, 2023. https://altimetry.com/articles/this-is-the-next-big-power-grid-investment
  2. Tesla’s Hidden Compute Power, Brian Wang, 2024. https://www.nextbigfuture.com/2024/04/teslas-hidden-compute-power.html#more-194874
  3. Tesla AI Capacity Expansion – H100, Dojo D1, D2, HW 4.0, X.AI, Cloud Service Provider, SemiAnalysis. https://www.semianalysis.com/p/tesla-ai-capacity-expansion-h100
  4. Elon Musk: AI will run out of electricity and transformers in 2025, New Atlas, March 2024. https://newatlas.com/technology/elon-musk-ai/
  5. Big Ideas 2024, Ark Invest. https://ark-invest.com/big-ideas-2024/
  6. Global data center electricity use to double by 2026 – IEA report, Data Center Dynamics. https://www.datacenterdynamics.com/en/news/global-data-center-electricity-use-to-double-by-2026-report/
  7. US electric utilities brace for surge in power demand from data centers, Reuters, 2024. https://www.reuters.com/business/energy/us-electric-utilities-brace-surge-power-demand-data-centers-2024-04-10
  8. Is there enough energy to power innovation? | The Brainstorm EP 54, Ark Invest. https://www.ark-invest.com/podcast/the-brainstorm-ep-54
  9. AI won’t use as much electricity as we are told, John Quiggin, 2024. https://johnquigginblog.substack.com/p/ai-wont-use-as-much-electricity-as