As Meta, Amazon, Google and OpenAI race to build more, and more massive, data centers across the US to support their generative AI efforts, AI is expected to be the "biggest driver of electricity consumption" in North America over the next five years, a new energy report finds.

Those data centers, which pack in thousands of computers to handle everything from training AI models to fielding your ChatGPT, Gemini and Sora requests, will gobble up not just megawatts of electricity but also millions of gallons of water and thousands of acres of land.

By 2040, overall data center energy use around the globe -- including both general-purpose (think cloud storage and video streaming) and AI-focused data centers -- will quintuple to become 5% of all electricity use, with AI accounting for a growing share of that total.