
Will AI power consumption surpass Bitcoin mining by 2025?

New analyses show that by the end of 2025, the electricity required to operate cutting-edge AI systems could surpass the considerable amount currently consumed by Bitcoin mining. The consequences of this increase extend well beyond the technology industry.

The rapid growth of generative AI has fueled a boom in data center construction and component manufacturing. As AI applications grow in complexity and adoption, the specialized hardware supporting them, such as Nvidia and AMD accelerators, has proliferated at an unprecedented rate. The result is a sharp rise in electricity consumption: AI is expected to account for nearly half of total data center electricity use next year, up from around 20% today.

This shift has been the subject of a detailed study by Alex de Vries-Gao, a PhD student at the Institute for Environmental Studies at Vrije Universiteit Amsterdam. His research, published in the journal Joule, draws on public device specifications, analyst forecasts, and corporate communications to estimate the production volume and energy demand of AI hardware.

The study estimates the energy demand of AI accelerator modules and complete AI systems manufactured in 2023 and 2024, along with their cumulative energy demand through 2025. The calculation assumes a utilization rate of 65% and a power usage effectiveness (PUE) of 1.2, with error margins reflecting PUE values between 1.1 and 1.3 and utilization rates ranging from 55% to 75%.
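As a rough sketch of this estimation approach, per-unit power draw can be modeled as chip power × utilization × PUE, with the study's stated ranges as error bands. The 10-million-unit fleet below is a hypothetical illustration, not a figure from the study (the 700 W value for an Nvidia H100 appears later in the article):

```python
def effective_power_watts(tdp_w, utilization=0.65, pue=1.2):
    """Average wall power per accelerator: chip draw scaled by
    utilization, then by data-center overhead (PUE)."""
    return tdp_w * utilization * pue

def demand_range_gw(tdp_w, units):
    """Low/central/high fleet demand in gigawatts, using the
    study's stated bands: utilization 55-75%, PUE 1.1-1.3."""
    low = effective_power_watts(tdp_w, 0.55, 1.1) * units
    mid = effective_power_watts(tdp_w, 0.65, 1.2) * units
    high = effective_power_watts(tdp_w, 0.75, 1.3) * units
    return tuple(round(w / 1e9, 2) for w in (low, mid, high))

# Hypothetical fleet of 10 million 700 W accelerators:
print(demand_range_gw(700, 10_000_000))
```

Under the central assumptions, each 700 W accelerator draws about 546 W on average at the wall, so fleet size, not per-chip efficiency, dominates the total.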

Given the reluctance of large tech companies to disclose the power consumption of their AI operations, de Vries-Gao employed a triangulation method, examining the advanced chip supply chain and manufacturing capacity of major players such as TSMC.

Numbers that speak for themselves

Source: Alex de Vries-Gao

Each Nvidia H100 AI accelerator, a staple of modern data centers, draws 700 watts continuously when running complex models. Multiplied across millions of units, total consumption reaches staggering levels.

De Vries-Gao estimates that the hardware produced between 2023 and 2024 alone could require between 5.3 and 9.4 gigawatts of power, enough to exceed the entire electricity consumption of a country like Ireland.

The research also compares the estimated energy demand of AI hardware in 2025 with the energy demand of various countries in 2023 (Ireland, Switzerland, Austria, the Netherlands, the United Kingdom, and France), Finland in 2022, Bitcoin mining (March 2025), and the total energy demand of data centers (excluding cryptocurrency mining, 2024).

However, the greatest growth is yet to come. TSMC's CoWoS packaging technology enables the integration of powerful processors and high-speed memory into single units, forming the core of today's AI systems. De Vries-Gao notes that TSMC more than doubled its CoWoS production capacity between 2023 and 2024, although demand from AI chipmakers such as Nvidia and AMD continued to exceed available supply. TSMC plans to double its CoWoS capacity again in 2025. If current trends continue, de Vries-Gao predicts that total AI system power requirements could reach 23 gigawatts by the end of the year, roughly equivalent to the UK's average national power consumption.

This would give AI a larger energy footprint than global Bitcoin mining. The International Energy Agency warns that this increase alone could double data center power consumption within two years.

Efficiency Swimming Against the Tide

While improvements in energy efficiency and a greater shift toward renewable energy have provided some help, these advances are quickly outpaced by the scale of new hardware and data center deployments.

The industry’s prevailing “bigger is better” philosophy—whereby progressively larger models are pursued to optimize performance—has created a cycle of increasing resource usage. Even as individual data centers become more efficient, overall energy consumption continues its upward trend.

Behind the scenes, manufacturing bottlenecks undercut any efficiency gains. Each new generation of AI chips requires increasingly sophisticated packaging. TSMC's recent CoWoS-L technology, while essential for next-generation processors, has been held back by low production yields.

Meanwhile, companies like Google are reporting “energy capacity constraints” as they scramble to build data centers quickly enough.

Some projects are repurposing fossil fuel infrastructure, with one securing 4.5 gigawatts of natural gas capacity specifically for AI tasks.

The environmental impact of AI depends largely on the location of these energy-intensive systems. In regions where electricity is primarily derived from fossil fuels, the associated carbon emissions can be significantly higher than in areas supplied by renewable energy.

For example, a server farm in West Virginia, a coal-dependent state, produces nearly twice the carbon emissions of one in California, a state with abundant renewable energy.
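The location effect comes down to grid carbon intensity: the same electricity demand multiplied by a different emissions factor. The intensities below are hypothetical placeholders for illustration (chosen to produce a roughly 2× gap), not measured values for either state:

```python
def annual_emissions_tonnes(power_mw, grid_kg_per_kwh):
    """Annual CO2 in tonnes for a facility drawing power_mw
    continuously on a grid with the given carbon intensity."""
    kwh_per_year = power_mw * 1000 * 24 * 365
    return kwh_per_year * grid_kg_per_kwh / 1000

# Hypothetical 100 MW facility on a coal-heavy grid vs a
# renewables-heavy grid (illustrative intensities):
coal_heavy = annual_emissions_tonnes(100, 0.60)
renewables = annual_emissions_tonnes(100, 0.30)
print(coal_heavy / renewables)
```

Because energy use is identical in both cases, the ratio of emissions equals the ratio of grid intensities, which is why siting decisions matter as much as hardware efficiency.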

Yet, tech giants rarely disclose where or how their AI operates, a lack of transparency that threatens to undermine climate goals. This opacity makes it difficult for policymakers, researchers, and the general public to fully assess the environmental consequences of the rise of AI.

AI-generated cover image.

Disclaimer: The content and links provided in this article are for informational purposes only. islaBit does not offer legal, financial or investment recommendations or advice, nor is it a substitute for the due diligence of each interested party. islaBit does not endorse any investment or similar offer promoted here. Some of the postings may be promoted by interested third parties.
