There’s a fascinating link in physics between energy and information: Landauer’s principle. It states that erasing a single bit of data dissipates a minimum amount of energy as heat, a consequence of the laws of thermodynamics. Energy and data are intertwined at a fundamental level.
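To see just how tiny that per-bit cost is, the Landauer limit at room temperature can be computed directly. A minimal sketch, assuming a temperature of 300 K:

```python
import math

# Landauer limit: minimum energy to erase one bit is k * T * ln(2)
BOLTZMANN_K = 1.380649e-23  # J/K (exact value in the SI definition)
T_ROOM = 300.0              # K, assumed "room temperature"

energy_per_bit = BOLTZMANN_K * T_ROOM * math.log(2)  # joules
print(f"Landauer limit at {T_ROOM} K: {energy_per_bit:.3e} J per bit")
# ~2.87e-21 J, many orders of magnitude below what real hardware dissipates
```

Real chips dissipate vastly more than this theoretical floor, which is why the aggregate numbers below are measured in terawatt-hours rather than joules.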
In today’s world, the demand for data (think AI, streaming, or IoT) drives energy consumption—data centers now guzzle electricity like small cities.
Forecasting electricity consumption for data centers across various countries by 2028 requires evaluating multiple factors, such as the swift rise of artificial intelligence (AI), cloud computing, digital infrastructure growth, and regional energy policies. While exact predictions differ by country and data availability, global patterns and insights from credible sources offer a window into potential future scenarios.
The International Energy Agency (IEA) projects that global data center electricity use could climb from 240-340 terawatt-hours (TWh) in 2022—roughly 1-1.3% of global electricity demand—to 650-1,050 TWh by 2026, or about 2-4% of the world’s total.
Extending this trend, Goldman Sachs Research anticipates a 160% surge in data center power demand from 2023 to 2030, potentially hitting 1,000-1,300 TWh globally by 2030, equating to 3-4% of worldwide usage. Interpolating, data centers might account for roughly 2.5-3.5% of global electricity by 2028, depending on efficiency gains and AI growth.
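Those percentages can be sanity-checked with simple arithmetic. A sketch, assuming a round-number world electricity demand of ~30,000 TWh for 2028 (an illustrative figure, not one from the cited reports) and a data center range between the IEA's 2026 and Goldman's 2030 estimates:

```python
# Rough share check: data center TWh as a fraction of world demand
world_demand_twh = 30_000      # assumed total global electricity, 2028 (illustrative)
dc_low, dc_high = 750, 1_050   # assumed 2028 data center range (illustrative)

share_low = dc_low / world_demand_twh * 100
share_high = dc_high / world_demand_twh * 100
print(f"Data center share: {share_low:.1f}%-{share_high:.1f}%")
# Roughly consistent with the 2.5-3.5% estimate above
```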
Country-specific outlooks vary based on local conditions like data center density and energy infrastructure:
Japan: For scale, the world’s 11,000+ data centers could collectively consume as much electricity by 2027 as Japan does in a year today (around 1,000 TWh). Within Japan itself, data center expansion is slower than in the US or Europe, but rising AI use could lift the sector’s share to 3-4% of national electricity by 2028, up from a modest baseline today.
India: With its booming digital economy, India’s data center energy needs could soar. Some experts predict global AI-driven data centers might consume 7% of the world’s electricity by 2030, roughly equivalent to India’s current usage (about 1,400 TWh). Within India, data center demand might reach 2-3% of national electricity by 2028, reflecting gradual infrastructure growth.
China: A leader in AI and digital infrastructure, China is poised for a steep rise in data center power use. By 2030, estimates suggest data centers could account for 8% of its total electricity, with 2028 figures potentially hitting 6-7%, up from lower levels today, propelled by investments in AI and renewable-supported grid growth.
European Union: Goldman Sachs Research forecasts a 40-50% increase in Europe’s power demand from 2023 to 2033, with data centers as a key driver. By 2030, Europe’s data center needs could equal the current combined usage of Portugal (50 TWh), Greece (50 TWh), and the Netherlands (~120 TWh), totaling around 220 TWh. After a 10% drop in electricity use from 2008 to 2023 due to efficiency and economic shifts, this marks a reversal driven by data center expansion and electrification in transport and heating. By 2028, data centers might comprise 5-6% of Europe’s electricity demand, with growth concentrated in countries like Germany, the UK, and the Nordics, aided by power availability and tech incentives.
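The 220 TWh figure is simply the sum of the three national totals just quoted, a check worth making explicit:

```python
# Approximate current annual consumption, TWh (figures quoted in the text)
portugal, greece, netherlands = 50, 50, 120
europe_dc_2030_twh = portugal + greece + netherlands
print(europe_dc_2030_twh)  # 220
```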
In Europe, the IEA pegged 2022 data center consumption at 36-51 TWh (assuming Europe hosts 15% of global data centers), though some estimates suggest 50-60 TWh, reflecting rapid hyperscale investments from companies like AWS, Microsoft, and Google.
McKinsey predicts this could nearly triple by 2030, possibly exceeding 200 TWh, while Morgan Stanley forecasts capacity growing sixfold to 38 GW by 2035 (250-300 TWh at full utilization), with 2030 as a midpoint. A single ChatGPT query, using 2.9 watt-hours versus 0.3 for a Google search, underscores AI’s energy intensity.
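Scaling those per-query figures up shows why AI inference matters at fleet level. A sketch, assuming a round one billion queries per day (an illustrative volume, not a reported one):

```python
# Per-query energy figures quoted above
CHATGPT_WH = 2.9  # watt-hours per AI query
GOOGLE_WH = 0.3   # watt-hours per conventional search

queries_per_day = 1_000_000_000  # assumed: one billion queries/day (illustrative)

def daily_gwh(wh_per_query: float, queries: int) -> float:
    """Convert per-query watt-hours to total gigawatt-hours per day."""
    return wh_per_query * queries / 1e9

print(f"AI:     {daily_gwh(CHATGPT_WH, queries_per_day):.1f} GWh/day")
print(f"Search: {daily_gwh(GOOGLE_WH, queries_per_day):.1f} GWh/day")
# At equal volume, the AI workload draws roughly 10x the energy
```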
The EU’s push for sustainability may temper growth, while regions with lax policies or cheap power (e.g., Nordics or parts of Asia) could see faster rises. Local factors like grid capacity and renewable availability—such as Ireland’s looming power challenges—will further shape outcomes.
Overall, data center electricity use by 2028 will reflect regional energy strategies as much as raw demand, because AI is becoming essential to competitiveness, whether for businesses, technological innovation, or individual opportunity.
AI is already transforming how companies operate. It powers everything from automated customer service (think chatbots) to data analysis that uncovers insights humans might miss. Businesses that adopt AI can work faster, smarter, and more efficiently—giving them a clear edge. If a company skips out on AI, it risks being outpaced by competitors who are leveraging these tools to cut costs, improve decision-making, and innovate. Simply put, AI isn’t optional anymore—it’s a must-have to stay in the game.
Thus, AI is critical for competitiveness because it boosts efficiency, fuels innovation, and sharpens decision-making. Whether you’re a business trying to stay relevant, a tech leader aiming to pioneer new solutions, or an individual looking to future-proof your career, ignoring AI isn’t an option. Ethical challenges notwithstanding, it’s reshaping the landscape too fast and too profoundly to overlook.
But AI is so energy-intensive primarily because it demands massive computational power to process vast amounts of data and perform complex calculations. Here’s why:
1. Training Requires Immense Resources
When an AI model, especially a large language model like GPT-3 with 175 billion parameters, is trained, it’s fed huge datasets and undergoes millions, or even billions, of adjustments to its internal parameters (weights). Each adjustment involves intricate calculations performed on specialized hardware such as GPUs (Graphics Processing Units) or TPUs (Tensor Processing Units). These devices are designed for parallel processing, which makes them ideal for AI workloads but also makes them significant energy consumers. Training a model of this scale can take weeks or even months of non-stop operation, equivalent to running thousands of high-end gaming PCs 24/7 during that time.
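The “weeks of non-stop operation” point can be made concrete with a back-of-envelope energy estimate. All inputs here are assumed illustrative values, not measured figures for any specific model:

```python
# Back-of-envelope training energy: GPUs x average power draw x wall-clock time
num_gpus = 1_000     # assumed cluster size (illustrative)
gpu_power_kw = 0.4   # assumed ~400 W average draw per accelerator
training_days = 30   # assumed one month of continuous training

energy_mwh = num_gpus * gpu_power_kw * training_days * 24 / 1_000
print(f"~{energy_mwh:,.0f} MWh")  # hundreds of MWh for these assumptions
```

Even under these modest assumptions, a single training run lands in the hundreds of megawatt-hours, comparable to the annual consumption of dozens of households.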
2. Running AI Models (Inference) Adds to the Load
Even after training, using an AI model—known as inference—still requires substantial energy. When you ask an AI a question or generate an image, the model performs real-time calculations on powerful servers. While this uses less energy than training, the sheer scale of AI usage (think millions of users worldwide) keeps the energy demand high.
3. Data Centers and Cooling
The servers that power AI don’t operate in isolation—they’re housed in massive data centers. These facilities can contain thousands of servers working simultaneously, generating a tremendous amount of heat. To prevent overheating, data centers rely on extensive cooling systems, which further increase energy consumption. In fact, some data centers use as much electricity as small cities.
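Cooling overhead is conventionally expressed as Power Usage Effectiveness (PUE): total facility power divided by the power delivered to IT equipment. A quick sketch, using an assumed 10 MW IT load and a range of typical PUE values for illustration:

```python
def total_facility_mw(it_load_mw: float, pue: float) -> float:
    """Total draw = IT load x PUE; the excess is mostly cooling and power conversion."""
    return it_load_mw * pue

it_load = 10.0  # MW of servers (assumed)
for pue in (1.1, 1.5, 2.0):  # efficient hyperscaler / average / older facility
    total = total_facility_mw(it_load, pue)
    print(f"PUE {pue}: total {total:.0f} MW, overhead {total - it_load:.0f} MW")
```

At a PUE of 2.0, the facility burns as much power on cooling and overhead as on computation itself, which is why efficiency improvements focus heavily on cooling design.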
4. Why So Much Power?
At its core, AI’s energy intensity stems from the need to simulate countless tiny “experiments” to learn patterns and relationships in data. Imagine solving a Rubik’s Cube blindfolded while juggling—it takes a lot of effort. For AI, that effort translates into electrical power consumed by power-hungry hardware.
In summary, AI’s energy intensity comes from the heavy computational demands of training and running models on specialized hardware, compounded by the energy needs of data centers. In short: It’s a power-hungry technology.