Summary
Behind every AI prompt lies a vast network of data centres guzzling growing amounts of electricity and water, raising questions about energy security, local resources and the sustainability of rapid AI expansion.
India has positioned itself as a global hub for artificial intelligence (AI) and digital infrastructure, with massive investments underway. But AI-focused data centres are energy- and water-intensive.
As India accelerates data-centre expansion to support AI growth, policymakers and utilities face a key question: will this strain electricity grids and intensify water stress in emerging data-centre hubs? Mint explores.
How much power are AI data centres using?
Training and deploying AI models takes place inside power-hungry data centres. According to the International Energy Agency (IEA), a typical AI-focused data centre can consume as much electricity as 100,000 households, while the largest facilities under construction could consume 20 times as much.
In 2024, data centres accounted for 1.5% of global electricity consumption, largely concentrated in the US, China and Europe. Global data-centre electricity consumption is expected to more than double to 945 terawatt-hours (TWh) by 2030, equivalent to more than half of India’s electricity consumption in 2023-24 (1,622 TWh).
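The comparison above is a simple ratio of the two figures quoted in the text; a quick back-of-the-envelope check (using only the numbers stated here) confirms the "more than half" framing:

```python
# Figures quoted in the article (TWh)
global_dc_2030_twh = 945     # projected global data-centre consumption, 2030
india_total_fy24_twh = 1622  # India's total electricity consumption, 2023-24

# Projected global data-centre use as a share of India's annual consumption
share = global_dc_2030_twh / india_total_fy24_twh
print(f"{share:.0%}")  # → 58%, i.e. more than half
```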
The IEA estimates data centres will account for about half of electricity demand growth in the US between 2025 and 2030. In Ireland, data centres already consume around a fifth of the metered electricity supply.
What about India?
India’s data-centre capacity has tripled since 2020 and is estimated to reach up to 6.5 gigawatts (GW) by 2030, according to a report by the Council on Energy, Environment and Water (CEEW) published in February. Investments are expected to exceed $100 billion by FY27.
For now, the impact remains limited. In 2025, data centres accounted for just 0.5% of national electricity consumption and used about 150 billion litres of water, roughly a third of the water Pune consumes annually. These figures are projected to double by 2030.
As of January 2026, India had 271 data centres. Mumbai leads the market, hosting about a quarter of them, followed by Chennai, Hyderabad and Bengaluru. To attract investment, several states, including Uttar Pradesh, Rajasthan and Tamil Nadu, are offering incentives to lower energy costs. Coastal states such as Andhra Pradesh are promoting seawater cooling to reduce cooling-related electricity demand.
Why do data centres need energy and water?
Inside a data centre, racks of servers can spend months ingesting training data and performing the computations needed to build an AI model. Inference, where a trained model applies learned patterns to new data to generate real-time responses, accounts for a large share of computing demand and energy use.
Data centres also require a stable, uninterrupted 24/7 power supply to prevent service outages, limiting their ability to rely entirely on intermittent renewable sources such as solar and wind.
Processing vast volumes of data generates significant heat, making cooling essential. Servers and cooling infrastructure together account for about 70% of data-centre electricity consumption. Locating facilities in water-stressed regions can affect local communities; about 43% of data centres globally operate in areas facing high water stress.
Has there been public backlash globally?
Yes. In rural Michigan in the US, residents opposed a $7 billion data-centre project, citing concerns over rising residential electricity prices and risks to water supply. A Bloomberg analysis found electricity costs rose 267% over five years in US areas close to data centres.
In February, residents in Johor, Malaysia, protested against a data-centre project, citing dust pollution and pressure on local water supplies. Similar protests have taken place in recent years in Uruguay and Chile. US President Donald Trump has also asked technology companies to generate their own electricity to ensure citizens do not face higher power costs.
What is the energy footprint of a single AI query?
Measuring the energy required for an AI query is more complex than estimating a car’s mileage. The climate impact depends on the size and type of model, the energy mix powering the data centre, and even the time of day a query is processed, as night-time queries are less likely to be powered by renewables.
According to MIT Technology Review, most leading systems such as ChatGPT, Gemini and Claude are closed models, meaning companies do not disclose detailed energy data. Researchers have instead estimated consumption using open-source models such as Llama.
An average text-based query on Llama (model 3.1 405B) requires about 6,706 joules of energy, roughly equivalent to running a microwave for eight seconds. Generating a five-second AI video, even at lower resolution, can require at least 3.4 million joules, comparable to running a microwave for an hour.
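The microwave comparisons above follow from a straightforward energy-to-time conversion. A minimal sketch, assuming a typical 800-watt microwave (the wattage is an assumption, not stated in the article):

```python
MICROWAVE_WATTS = 800  # assumed power draw; 1 watt = 1 joule per second

def microwave_seconds(joules: float) -> float:
    """How long the assumed 800 W microwave would run on the given energy."""
    return joules / MICROWAVE_WATTS

# Text query on Llama 3.1 405B: ~6,706 J ≈ 8 seconds of microwave time
print(round(microwave_seconds(6_706), 1))        # → 8.4

# Five-second AI video: ~3.4 million J ≈ roughly an hour of microwave time
print(round(microwave_seconds(3_400_000) / 60))  # → 71 (minutes)
```

At a higher assumed wattage (say 1,000 W), the same figures work out to about 6.7 seconds and 57 minutes, so the article's "eight seconds" and "an hour" sit comfortably within the range of household microwaves.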