In 2023, a single AI-optimized data center in Northern Virginia consumed more electricity than the entire city of Baltimore. It was not a power plant. It was a warehouse full of servers running machine learning models. And it was one of hundreds like it, with thousands more planned, approved, and under construction across the United States, Europe, and China.
Nobody voted on this. No legislature debated it. No environmental impact statement captured it before it became the defining infrastructure story of the decade. The AI revolution arrived with remarkable speed and remarkable silence about one of its most consequential side effects: it is consuming electricity at a scale that is beginning to reshape national energy grids, strain power infrastructure, and raise questions that the technology industry has been reluctant to answer publicly.
The Numbers Nobody Is Talking About
The International Energy Agency published its landmark report on AI and energy in April 2025. The findings were striking enough that they should have dominated headlines for weeks. They did not.
Global electricity consumption from data centers currently stands at approximately 415 terawatt-hours per year, roughly 1.5 percent of total global electricity consumption. By 2030, the IEA projects this figure will more than double to 945 terawatt-hours: a total slightly larger than the entire annual electricity consumption of Japan, the world's fourth-largest economy, reached within six years and consumed almost entirely by computing infrastructure.
The AI component of this growth is the most significant driver. Electricity consumption from AI-optimized accelerated servers is projected to grow at 30 percent annually in the IEA’s base case scenario. By 2030, AI could account for between 35 and 50 percent of all data center electricity demand, up from roughly 5 to 15 percent today.
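The compounding behind these figures is worth making explicit. The sketch below is a back-of-the-envelope check, assuming simple compound growth between the IEA's 2024 baseline and its 2030 base case; it is illustrative arithmetic, not a reconstruction of the IEA's scenario model.

```python
# Back-of-the-envelope check on the IEA figures quoted above.
# Assumes simple compound growth between the 2024 baseline and the
# 2030 base case: illustrative arithmetic, not the IEA's model.

total_2024 = 415.0   # TWh/year: global data center consumption today
total_2030 = 945.0   # TWh/year: IEA base-case projection for 2030
years = 6

multiple = total_2030 / total_2024
cagr = multiple ** (1 / years) - 1
print(f"Overall demand grows {multiple:.2f}x, about {cagr:.1%} per year")

# Anything compounding at 30% per year grows far faster than the
# overall total, which is how the AI share of demand rises so sharply.
ai_multiple = 1.30 ** years
print(f"A 30%/yr growth rate compounds to {ai_multiple:.1f}x over {years} years")
```

The point is the gap between the two multiples: overall demand grows about 2.3x, while anything compounding at 30 percent per year grows nearly 5x over the same window. That is the mechanism by which AI moves from a minority share of data center demand to a dominant one.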
In the United States specifically, data centers are on course to consume more electricity by 2030 for processing data than for manufacturing all energy-intensive goods combined — including aluminum, steel, cement, and chemicals. A country that built its industrial identity around manufacturing is quietly redirecting its energy infrastructure toward running AI models.
Ireland as a Warning
The consequences of AI-driven electricity demand are not evenly distributed. The countries and regions where data centers concentrate face pressures that national averages obscure.
Ireland has become the clearest early warning. Data centers already consume approximately 21 percent of Ireland’s national electricity supply. The IEA projects this figure could rise to 32 percent by 2026. In Dublin specifically, data centers account for an estimated 79 percent of the city’s electricity consumption.
The practical consequences are significant. Ireland’s grid operator has repeatedly warned that new data center development in the greater Dublin area risks compromising electricity supply for residential and other commercial users. Several planned data centers have been denied grid connections. The Irish government — which spent decades attracting tech investment as a cornerstone of its economic strategy — is now grappling with the possibility that it attracted more computing infrastructure than its energy system can sustain.
Virginia faces a similar concentration problem. Data centers already consume 26 percent of the state’s electricity. In Northern Virginia — the global epicenter of data center development — the share is substantially higher. The regional grid operator has issued warnings about the pace of data center development outstripping the grid’s capacity to supply it.
Where the Power Comes From
The electricity powering the AI revolution does not come from clean sources at the rate that the technology industry’s public commitments suggest.
Natural gas currently supplies approximately 40 percent of the electricity consumed by US data centers. Renewables supply around 24 percent. Nuclear around 20 percent. Coal around 15 percent. The IEA projects that natural gas power will grow by 175 terawatt-hours to meet data center demand through 2030 — in part because the current US administration’s pro-gas energy policies have already produced major announcements of new gas generation facilities specifically to serve data centers.
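Set against the growth projection quoted earlier, the gas figure is easier to interpret. The snippet below is simple subtraction on the numbers already cited in this article, treating both as drawn from the same IEA base case; it is not a scenario-level breakdown from the report itself.

```python
# Relating the gas projection to the overall growth figure quoted
# earlier in this article. Simple subtraction on the cited numbers;
# not a scenario-level breakdown from the IEA report.

total_2024_twh = 415   # current global data center demand
total_2030_twh = 945   # IEA base-case projection for 2030
gas_growth_twh = 175   # projected growth in gas-fired supply for data centers

demand_growth_twh = total_2030_twh - total_2024_twh
share = gas_growth_twh / demand_growth_twh
print(f"Gas covers roughly {share:.0%} of the projected demand growth")
# -> roughly a third, with renewables, nuclear, and coal covering the rest
```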
The gap between the technology industry’s climate commitments and its actual energy consumption is one of the more quietly significant contradictions of the current moment. Microsoft, Google, Amazon, and Meta have all made public commitments to run on 100 percent renewable energy. All four have seen their absolute carbon emissions increase as their data center footprints expand. The commitments are real — but they are made against a baseline that keeps moving.
The combined electricity consumption of Amazon, Microsoft, Google, and Meta more than doubled between 2017 and 2021. The AI boom that accelerated after 2022 has pushed that trajectory steeper. Google’s own environmental report acknowledged that its greenhouse gas emissions increased 48 percent between 2019 and 2023, driven primarily by data center energy use and construction.
The Water Nobody Mentions
Electricity is the most visible resource that AI consumes. It is not the only one.
Data centers require massive cooling infrastructure to prevent servers from overheating. A significant portion of that cooling relies on water — either directly, through evaporative cooling systems, or indirectly, through the water used to generate the electricity that powers them.
US data centers directly consumed approximately 17 billion gallons of water in 2023, according to estimates from a 2024 research study. Hyperscale and colocation facilities — the largest, most AI-intensive data centers — accounted for 84 percent of that consumption. In regions already facing water stress, the expansion of data center infrastructure is creating direct competition for a resource that agriculture, municipalities, and ecosystems also depend on.
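For readers who think in metric units, the conversion is straightforward. The snippet below simply restates the study's estimates quoted above; the only added number is the standard gallons-to-liters factor.

```python
# Restating the water estimates quoted above in metric units.
# The only number added here is the standard gallons-to-liters
# conversion factor; everything else is from the cited study.

GALLONS_TO_LITERS = 3.785

direct_use_gallons = 17e9   # US data centers, direct consumption, 2023
hyperscale_share = 0.84     # hyperscale + colocation portion

liters = direct_use_gallons * GALLONS_TO_LITERS
print(f"Direct consumption: about {liters / 1e9:.0f} billion liters")
print(f"Hyperscale/colocation portion: about "
      f"{direct_use_gallons * hyperscale_share / 1e9:.1f} billion gallons")
```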
The water consumption of AI systems is rarely disclosed by the companies operating them. When researchers have attempted to estimate it, the figures are significant enough to suggest that the full environmental cost of AI infrastructure is substantially larger than electricity consumption alone captures.
The Nuclear Bet
The technology industry’s response to the energy problem it has created is revealing. Rather than slowing the deployment of AI infrastructure or investing primarily in demand reduction and efficiency, major tech companies are betting on nuclear power as the solution.
Microsoft signed a 20-year power purchase agreement with Constellation Energy to restart a reactor at the Three Mile Island nuclear plant in Pennsylvania, the site of the United States' worst civilian nuclear accident, to power its data centers. Google signed a deal to purchase power from small modular reactors being developed by Kairos Power. Amazon has invested in nuclear energy startups. The logic is straightforward: nuclear provides the firm, carbon-free baseload power that intermittent renewable sources cannot reliably deliver at the scale and consistency that AI data centers require.
The nuclear bet reflects a genuine recognition that the electricity problem is structural, not marginal. You cannot power a data center that operates continuously at maximum load with solar panels that generate nothing at night and wind turbines that stop when the air is still. Nuclear, if it works as planned, solves this problem. If it works as planned.
Small modular reactors — the technology that tech companies are primarily investing in — have not been commercially deployed at scale. The first commercial SMR projects are still years from operation. The timeline for nuclear to meaningfully address data center energy demand runs well past 2030, which means that in the medium term, the gap will be filled primarily by natural gas.
The Efficiency Argument and Its Limits
The standard response to concerns about AI energy consumption is that efficiency improvements will moderate the growth. As chips become more powerful per watt, as software becomes more efficient, as cooling technologies improve, the energy required per unit of AI computation will fall — and the net effect will be manageable.
This argument has historical support. The energy intensity of computing has fallen dramatically over the past several decades. The IEA’s own projections incorporate efficiency gains as a moderating factor in its central scenario.
The problem is the rebound effect, what economists call the Jevons paradox. When the cost of computation falls, demand for computation increases, often faster than efficiency gains reduce the energy per unit. The history of computing is a history of efficiency improvements that reduced cost per computation while total energy consumption grew, because the lower cost generated demand that more than offset the efficiency gains.
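A stylized example makes the dynamic concrete. Both rates below are hypothetical, chosen only to show the shape of the effect rather than to measure any real system: each unit of computation gets 20 percent cheaper to run every year, while demand for computation grows 40 percent.

```python
# Stylized illustration of the rebound dynamic described above.
# Both rates are hypothetical, chosen to show the shape of the
# effect; they are not measurements of any real system.

EFFICIENCY_GAIN = 0.20   # energy per computation falls 20% per year (assumption)
DEMAND_GROWTH = 0.40     # cheaper compute lifts demand 40% per year (assumption)

energy_per_unit = 1.0    # arbitrary units
demand = 1.0             # arbitrary units

for year in range(1, 7):
    energy_per_unit *= 1 - EFFICIENCY_GAIN
    demand *= 1 + DEMAND_GROWTH
    total = energy_per_unit * demand
    print(f"Year {year}: energy/unit {energy_per_unit:.2f}, "
          f"demand {demand:.2f}, total energy {total:.2f}")
```

In this toy model, total energy rises about 12 percent a year even though every unit of computation keeps getting cheaper to run; the demand response swamps the per-unit gains.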
There is no strong reason to expect AI to break this pattern. If anything, the economic incentives pushing AI deployment are stronger than those that drove previous computing booms. The efficiency argument is real. It is also, historically, insufficient.
The Question Nobody Is Asking
The AI industry frames its energy consumption as a cost of progress — an unavoidable byproduct of developing technology that will ultimately benefit humanity. The framing is not entirely wrong. AI applications in drug discovery, climate modeling, energy optimization, and scientific research have genuine potential to produce benefits that outweigh their energy costs.
But the energy consumption of AI is not primarily driven by these applications. It is driven by recommendation algorithms, advertising optimization, chatbot interactions, image generation, and the computational arms race between technology companies competing for AI supremacy. The marginal watt powering the AI revolution is more likely to be delivering a personalized social media feed than curing a disease.
The question nobody is asking publicly — in boardrooms, in legislatures, or in the media coverage of the AI boom — is whether the allocation of electricity to AI applications reflects any considered judgment about relative social value, or whether it is simply the output of capital allocation decisions made by a small number of companies operating with minimal public accountability for their infrastructure choices.
By 2030, data centers are on course to consume the electricity equivalent of Japan. Beyond that, the figure will climb higher still. The lights will stay on — probably. The planet will absorb the emissions — partially. The water will flow — for now.
But the decisions being made about how to power artificial intelligence are infrastructure decisions with generational consequences. They are being made without public deliberation, without democratic authorization, and without the kind of honest accounting of costs and benefits that decisions of this magnitude should require.
The AI revolution is real. So is its electricity bill. And so far, neither has been subjected to the scrutiny that both deserve.