From AI to Absorption: Office Demand, AI Talent Concentrations, and What it Means for Data Centers
Large Language Models Use a Lot of Power to Train
OpenAI’s GPT-3 used more power than other leading models to train.
Energy consumption when training LLMs in 2022 (in MWh)
• Energy consumption in training is high, and frequent retraining is required to maintain data relevance; lifetime energy consumption is even higher than initial training usage.
• Energy savings from AI will be big. Mobile phone operators alone expect AI to reduce power consumption by 10-15%.

[Bar chart: MWh used in training for GPT-3, Gopher, Bloom, and OPT; y-axis scale 0–1,400 MWh]
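The point that lifetime energy use dwarfs the initial training run can be sketched with simple arithmetic. The figures below are illustrative assumptions only (initial energy on the order shown in the chart, a quarterly retraining cadence, and a per-retrain cost of a quarter of full training), not measurements from the report:

```python
# Hypothetical illustration: lifetime training energy when a model is
# periodically retrained to keep its data fresh. All inputs are assumptions.
initial_training_mwh = 1_300   # assumed: order of magnitude shown in the chart
retrains_per_year = 4          # assumed: quarterly retraining
retrain_fraction = 0.25        # assumed: each retrain costs 25% of full training
years = 3                      # assumed deployment lifetime

lifetime_mwh = initial_training_mwh * (
    1 + retrains_per_year * retrain_fraction * years
)
print(lifetime_mwh)  # 5200.0 -> several times the initial training run
```

Under these assumptions, three years of quarterly retraining quadruples the model's total training energy, which is the sense in which lifetime consumption exceeds the initial run.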
Note: San Francisco includes pre-OpenAI lease total
Source: Cushman & Wakefield Research, Cornell University, 2023