How Does Artificial Intelligence Impact Global Electricity Consumption?

Artificial intelligence (AI) has emerged as one of the most transformative technologies of the modern era, with the potential to create significant gains for the global economy in the long term. However, the global push to develop new AI models and applications has raised serious concerns about AI's carbon footprint due to its intensive electricity consumption. This has prompted many countries and companies involved in developing AI applications to take proactive measures to reduce AI-related energy consumption and to adopt more environmentally friendly ways of generating the electricity that AI data centers need.
A Growing Market:
The use of AI is increasing worldwide year by year, as companies and governments rapidly expand their reliance on AI to manage and improve operational processes across various economic activities. A survey conducted by IBM in November 2023, as part of its “IBM Global AI Adoption Index 2023,” revealed that 42% of global companies had integrated AI into their operations, while another 40% were considering using it in the future.
AI applications hold promising prospects across multiple sectors, including industry, financial services, transportation, healthcare, education, and others. In the financial sector, for example, AI models are currently being used by banks and financial institutions in a variety of applications such as fraud detection, conducting audits, and assessing clients for loans.
The widespread adoption of AI applications is expected to have positive effects on all economies, especially major ones like the United States. AI is projected to boost U.S. GDP growth by between 0.5 and 1.5 percentage points over the next decade. Overall, these prospects will pave the way for strong growth in the global AI market, which is expected to increase from $40 billion in 2022 to $1.3 trillion by 2032, with a compound annual growth rate of 43%, according to Bloomberg Intelligence.
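As a rough sanity check, that projection can be reproduced with simple compound-growth arithmetic. The short Python sketch below uses only the figures quoted above (a $40 billion market in 2022 growing at roughly 43% per year); it is illustrative, not Bloomberg Intelligence's methodology.

# Rough check of the compound-growth figures quoted above (illustrative only).
start_value_bn = 40   # 2022 market size, in billions of dollars
cagr = 0.43           # compound annual growth rate (~43%)
years = 10            # 2022 -> 2032

end_value_bn = start_value_bn * (1 + cagr) ** years
implied_cagr = (1300 / start_value_bn) ** (1 / years) - 1

print(f"Projected 2032 market: ~${end_value_bn / 1000:.1f} trillion")   # ~$1.4 trillion
print(f"Implied CAGR for $40bn -> $1.3tn: {implied_cagr:.1%}")          # ~41.6%

Both results land close to the $1.3 trillion and 43% figures cited, which is the kind of consistency such headline projections are built on.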
Intensive Energy Consumption:
The rapid pace of AI adoption across various economic activities has raised concerns among energy sector stakeholders and electricity generators. While AI offers productivity improvements and supports economic growth, training and running AI models requires significant amounts of energy.
AI consumes energy during two main phases: the training phase and the inference phase. The training phase involves teaching and developing models by processing vast amounts of data and variables. The more complex the model, the more energy is consumed during this phase.
Afterward comes the inference phase, in which the trained model is applied to new inputs to solve problems. In terms of energy use, roughly 20% of the electricity consumed by AI models goes to the training phase, while 80% is used in the inference phase: the more widely an AI model is used, the more electricity it consumes.
Data centers, which store and process vast amounts of information, are also an integral part of AI operations. Estimates suggest that a single data center can consume as much electricity as 50,000 homes. For instance, training a language model like GPT-3 used about 1,300 megawatt-hours of electricity, the equivalent of the annual energy consumption of 130 U.S. homes. ChatGPT handles hundreds of millions of queries daily (inference operations), requiring about 1 gigawatt-hour of electricity per day to answer them.
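To put those figures in perspective, the short Python sketch below converts them into household equivalents and shows how quickly cumulative inference energy overtakes the one-off training cost. The ~10 MWh figure for average annual household consumption is an assumption chosen to match the 130-home comparison above, not a number from the cited sources.

# Illustrative arithmetic based on the figures quoted above.
training_mwh = 1300            # reported GPT-3 training energy, in MWh
inference_mwh_per_day = 1000   # ~1 GWh of inference electricity per day
home_mwh_per_year = 10         # assumed average annual U.S. household use (~10 MWh)

print(f"Training  ~ {training_mwh / home_mwh_per_year:.0f} home-years of electricity")
print(f"Inference ~ {inference_mwh_per_day * 365 / home_mwh_per_year:,.0f} home-years per year")
print(f"Cumulative inference passes training after ~{training_mwh / inference_mwh_per_day:.1f} days")

At this scale, ongoing inference dwarfs the one-time training cost within days, consistent with the 20/80 split described above.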
Another example is a study conducted by researcher Alex de Vries, published in October 2023, which estimated that Google consumed 18.3 terawatt-hours of electricity in 2021. If Google fully integrates AI into its search operations, its electricity consumption could rise to 29.3 terawatt-hours annually, equivalent to the total electricity consumption of Ireland.
Future Prospects:
The World Economic Forum estimates that AI-related electricity consumption is growing rapidly, at between 26% and 36% annually. By 2027, AI's total electricity consumption is projected to reach between 85 and 134 terawatt-hours per year, according to de Vries' study.
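To see how such growth rates translate into a range of outcomes, the Python sketch below compounds a purely hypothetical base-year figure at 26% and 36% per year; the base value is chosen only to keep the arithmetic concrete and is not taken from de Vries' study or the World Economic Forum.

# Illustrative compound-growth band (hypothetical base value, not a cited figure).
base_twh = 50    # hypothetical AI electricity consumption in the base year
years = 4        # e.g. 2023 -> 2027

low = base_twh * 1.26 ** years    # 26% annual growth
high = base_twh * 1.36 ** years   # 36% annual growth
print(f"After {years} years: ~{low:.0f} to ~{high:.0f} TWh")   # ~126 to ~171 TWh

A difference of just ten percentage points in the growth rate widens the estimate by dozens of terawatt-hours within a few years, which is why projections such as the 85-134 terawatt-hour range are expressed as bands.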
Based on these projections, AI is expected to account for 3% to 4% of global electricity demand by the end of this decade, according to Standard & Poor's. Over the same period, global electricity demand from data centers is expected to grow by 160%, according to estimates from Goldman Sachs.
In the U.S., one of the most promising markets for AI applications, data centers are projected to consume up to 8% of the country’s electricity by 2030, up from 3% in 2022. Consequently, U.S. utilities will need to invest $50 billion in new electricity generation capacity to support data centers alone.
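For a sense of scale, the Python sketch below converts those percentage shares into absolute terms. The ~4,000 TWh figure for total annual U.S. electricity consumption is an outside assumption used only for illustration (and is held constant for simplicity); it is not a figure from the article's sources.

# Illustrative conversion of grid shares into absolute demand (assumed U.S. total).
us_total_twh = 4000   # assumed total annual U.S. electricity consumption, held constant
share_2022 = 0.03     # data centers' share of U.S. electricity in 2022
share_2030 = 0.08     # projected share by 2030

dc_2022 = us_total_twh * share_2022   # ~120 TWh
dc_2030 = us_total_twh * share_2030   # ~320 TWh
print(f"Data centers: ~{dc_2022:.0f} TWh in 2022 -> ~{dc_2030:.0f} TWh by 2030")
print(f"Additional annual demand: ~{dc_2030 - dc_2022:.0f} TWh")

On those assumptions, the jump from 3% to 8% corresponds to roughly 200 terawatt-hours of additional annual demand, which helps explain the scale of the generation investment cited above.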
Global Preparations:
The expected increase in electricity demand from AI data centers highlights the urgent need for proactive measures to improve their energy efficiency and use more sustainable energy sources to reduce their carbon emissions.
Governments and companies are currently seeking to manage the electricity use of AI data centers more efficiently, for example by relying on shared facilities rather than each operating its own private infrastructure, and by relocating data centers to regions with abundant, low-cost energy.
At the same time, to reduce the carbon footprint of data centers, major tech companies such as Microsoft, Google, and Amazon have begun relying on renewable energy to power their facilities as part of long-term plans to achieve carbon neutrality.
To encourage companies to reduce AI-related electricity consumption, the U.S. government is reconsidering tax exemptions for data center developers due to the pressure they place on energy infrastructure, and is working on legislation addressing AI's environmental impacts. Similarly, the European Union introduced the AI Act this year, which aims in part to promote more environmentally friendly AI applications.
Ironically, despite the additional burden that AI's expansion places on the global electricity sector, policymakers are counting on AI itself to increase the efficiency of the global energy sector, enhancing electricity systems and accelerating the global energy transition. This could partially offset the increase in electricity consumption associated with AI, as mentioned earlier.
In conclusion, the promising future of AI applications underscores the importance of finding more efficient and sustainable solutions to address its intensive energy consumption. However, the critical question remains: how long will it take for AI to reach a tipping point where the benefits of its use in enhancing energy sector efficiency outweigh the costs of its electricity consumption and carbon footprint?



