“Green” AI’s Dirty Secret: Why Data Centers Fuel a Natural Gas Boom
The promise of a future powered by Artificial Intelligence often paints a picture of sleek, renewable energy-driven data centers. Yet, a stark and urgent reality is emerging: AI's explosive energy demands are overwhelming existing electricity grids, forcing an unexpected and significant resurgence in fossil fuel reliance, particularly natural gas, to maintain stability and meet immediate power needs. This isn't a distant threat; it's unfolding now, in 2025-2026, threatening climate goals and shifting the energy landscape.
The Unprecedented AI Power Surge
AI's appetite for electricity is nothing short of voracious. Goldman Sachs projects that power demand from AI and data centers will hit 92 gigawatts (GW) as early as 2027. In the U.S., data center consumption, driven primarily by AI, is forecast to skyrocket from 183 terawatt-hours (TWh) in 2024, about 4% of national electricity, to a staggering 606 TWh by 2030, nearly 12% of the nation's power demand. Globally, data centers could consume 945 TWh by 2030. This isn't just about more data centers; it's about the sheer intensity of AI workloads, with a single AI-related task consuming up to 1,000 times more electricity than a traditional web search. NVIDIA's upcoming GPU racks, for instance, are projected to draw 1 megawatt (MW) each, a load that once powered an entire data hall.
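A quick back-of-the-envelope check makes the scale of those U.S. figures concrete. The sketch below uses only the TWh and percentage values quoted above; everything else is simple arithmetic:

```python
# Figures cited above: US data-center demand of 183 TWh in 2024 (~4% of
# national electricity) rising to 606 TWh by 2030 (~12%).
tw_2024, tw_2030 = 183.0, 606.0
share_2024, share_2030 = 0.04, 0.12

# Compound annual growth rate implied by the 2024 -> 2030 jump
cagr = (tw_2030 / tw_2024) ** (1 / 6) - 1
print(f"implied data-center demand growth: {cagr:.1%}/yr")

# Total national demand implied by each share figure -- note the grid
# itself must also grow for both percentages to hold
grid_2024 = tw_2024 / share_2024   # total US electricity implied for 2024, TWh
grid_2030 = tw_2030 / share_2030   # total US electricity implied for 2030, TWh
print(f"implied total-grid growth over the period: {grid_2030 / grid_2024 - 1:.1%}")
```

The result, roughly 22% compound annual growth in data-center demand, also implies the overall grid must expand by about 10% over the same window just for the two share figures to be consistent, which is the growth utilities are now scrambling to deliver.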
Gridlock: The Hidden Bottleneck
For decades, utility capacity planning followed predictable growth patterns. That era is over. Today, the single biggest constraint on new AI data center development isn't land or capital; it's access to grid power. Utilities across the U.S. are openly struggling, with over half of industry leaders citing available power as their biggest challenge in bringing data centers online. The problem is a profound mismatch in timelines: a new data center can be built in about 18 months, but the necessary grid infrastructure upgrades, such as new transmission lines and substations, can take anywhere from three to six years, and sometimes up to seven years in critical hubs like Northern Virginia. This creates a multi-year gap between when a facility is ready to run and when the grid can actually power it.