AI Data Centers: The Three Bills Nobody Pays
TL;DR: A new study found AI data centers warm surrounding land by up to 9.1°C, affecting 340 million people worldwide. But heat is only one of three hidden costs. Each AI query also evaporates water and strains the power grid — classic economic externalities that someone eventually pays for. Here's who, and how much.
Every time you ask an AI a question, three invoices are generated. You pay for none of them.
A study published in March 2026 revealed something striking: AI data centers create "heat islands" that raise temperatures by up to 9.1°C in surrounding areas, stretching as far as 10 kilometers. The researchers found that more than 340 million people worldwide live within these thermal zones.
But heat is just the first unpaid bill. There are two more.
What Is the Data Heat Island Effect?
The data heat island effect occurs when waste heat from data center operations measurably raises land surface temperatures in surrounding communities. Think of it as a building-sized space heater that never switches off.
The March 2026 study, titled *"The data heat island effect: quantifying the impact of AI data centers in a warming world,"* analyzed land surface temperature data from 2004 to 2024 across data center locations worldwide. The findings:
| Metric | Value |
|---|---|
| Average temperature increase | 2°C after operations begin |
| Maximum recorded increase | 9.1°C |
| Affected radius | Up to 10 km (6.2 miles) |
| People affected globally | 340+ million |
The researchers validated their findings in three regions: Bajío, Mexico (+2°C over 20 years), Aragon, Spain (+2°C), and Ceará, Brazil (+2.8°C, projected to exceed 3.5°C).
A caveat: the paper has not yet been peer-reviewed, and some experts have questioned whether the observed warming stems entirely from waste heat or partly from land-use changes — construction, paving, and vegetation removal around new facilities. Both mechanisms likely contribute, and both are consequences of data center siting decisions.
This matters because heat isn't just discomfort. Higher ambient temperatures increase air conditioning costs, worsen air quality, and compound existing urban heat island effects. Communities near data centers bear the thermal burden of AI workloads they may never use.
Bill #1: Heat Exported to Your Neighborhood
The traditional urban heat island effect — where cities run several degrees warmer than surrounding countryside — is well documented. Data centers add a new layer.
Here's the mechanism: servers convert electricity into computation and heat. Nearly all electricity entering a data center ultimately becomes waste heat that must be expelled — both from the servers themselves and from the cooling systems working to keep them running. Modern data centers achieve a Power Usage Effectiveness (PUE) of 1.1-1.2, meaning for every watt delivered to the IT equipment, another 0.1-0.2 watts go to cooling and other overhead. All of it ends up as heat.
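As a rough sketch, the PUE arithmetic above can be turned into a back-of-envelope heat calculation. The facility size and PUE value below are illustrative assumptions, not figures from the study:

```python
# Back-of-envelope sketch of the waste-heat arithmetic.
# The 100 MW facility and PUE of 1.15 are illustrative assumptions.

def waste_heat_mw(it_load_mw: float, pue: float) -> float:
    """Total facility power draw, essentially all of which becomes heat.

    PUE = total facility power / IT power, so total power is
    it_load * pue. Thermodynamically, nearly all of it is expelled
    into the surroundings as waste heat.
    """
    return it_load_mw * pue

# A hypothetical 100 MW hyperscale facility at PUE 1.15:
total = waste_heat_mw(100, 1.15)
print(f"{total:.0f} MW of continuous waste heat")  # 115 MW

# Annualized, that's on the order of a terawatt-hour of heat:
twh_per_year = total * 24 * 365 / 1e6
print(f"≈ {twh_per_year:.2f} TWh of heat per year")
```

The point of the sketch is the continuity: unlike a factory with shifts, this thermal output runs at roughly full intensity around the clock.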
The scale is staggering. The four largest hyperscalers — Amazon, Google, Meta, and Microsoft — spent a combined $413 billion on infrastructure in 2025, according to their earnings reports, and total industry spending is projected to surpass $600 billion in 2026. Each new facility becomes another permanent heat source.
What separates this from a factory or office building is density. A single hyperscale data center can pack 100 megawatts of power consumption — equivalent to roughly 80,000 homes — into a few acres. That concentration of thermal output in a small footprint is what creates the measurable heat island effect documented in the study.
What makes this an externality — an economic term for costs imposed on third parties — is that the people paying the thermal bill aren't the ones using the AI. A family living 3 kilometers from a hyperscale facility in Arizona didn't sign up for higher cooling bills.
Key insight: Data centers externalize heat costs to surrounding communities in the same way factories once externalized pollution to downstream rivers.
Bill #2: Water Evaporated for Cooling
The second invoice is liquid. Data centers are enormous water consumers.
| Facility Size | Daily Water Use | Domestic Equivalent |
|---|---|---|
| Typical data center | 300,000 gallons | ~1,000 households |
| Large hyperscale facility | 5 million gallons | ~50,000 residents |
In total, U.S. data centers consumed approximately 17.5 billion gallons of water in 2023, according to Lawrence Berkeley National Laboratory estimates. By 2028, hyperscale facilities alone are expected to consume 33 billion gallons annually.
How Cooling Works
Most data centers use evaporative cooling: water absorbs heat from server racks and evaporates into the atmosphere. For every kilowatt-hour of energy consumed, roughly two liters of water are needed. Each 100-word AI prompt consumes approximately 519 milliliters of water — about one bottle.
Training GPT-3 evaporated roughly 700,000 liters of freshwater for on-site cooling alone, according to a University of California, Riverside study. Larger models like GPT-4 require substantially more.
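The per-query water figures above can be sanity-checked in a few lines. Only the ~2 liters per kilowatt-hour intensity comes from the text; the per-query energy value below is a back-solved, illustrative assumption:

```python
# Rough sketch of the water arithmetic in this section.
# LITERS_PER_KWH is the intensity quoted in the text; the per-query
# energy figure is an illustrative, back-solved assumption.

LITERS_PER_KWH = 2.0

def water_per_query_ml(energy_kwh: float) -> float:
    """Milliliters of water evaporated per query at the quoted intensity."""
    return energy_kwh * LITERS_PER_KWH * 1000

# At a hypothetical ~0.26 kWh per long prompt, the arithmetic lands
# near the ~519 mL figure cited above:
print(f"{water_per_query_ml(0.26):.0f} mL per query")  # 520 mL per query

# Scaled to a million such queries per day:
daily_liters = water_per_query_ml(0.26) / 1000 * 1_000_000
print(f"{daily_liters:,.0f} L/day for 1M queries")
```

Reversing the arithmetic this way also shows how sensitive the headline number is to the assumed energy per query — halve the energy and the water halves with it.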
Where This Becomes a Crisis
Two-thirds of U.S. data centers built since 2022 are located in high water-stress areas. More than 160 new AI data centers have been constructed in regions with scarce water resources. Arizona, a state already struggling with water allocation, has seen significant data center expansion.
The water externality follows the same pattern as heat: the community absorbs the cost, not the computation consumer. When a data center draws millions of gallons from a municipal water supply, every other user — farmers, households, hospitals — faces increased scarcity and cost.
There's an environmental justice dimension too. Research shows that 82% of data centers in California are situated in communities already suffering from poor air quality. The communities bearing the water and heat burden are disproportionately working-class and minority neighborhoods — populations with the least political leverage to negotiate siting decisions.
Bill #3: Electricity Diverted from the Grid
The third invoice arrives in your mailbox. Literally.
U.S. data center electricity consumption is projected to grow from 176 TWh in 2023 to between 325 and 580 TWh by 2028. That represents a jump from 4.4% to potentially 12% of total U.S. electricity consumption.
| Year | Data Center Energy (TWh) | Share of U.S. Electricity |
|---|---|---|
| 2023 | 176 | 4.4% |
| 2026 (projected) | ~260 | ~6% |
| 2028 (projected) | 325-580 | 6.7-12% |
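A quick sanity check on the shares in this table, using only the table's own numbers (no external data assumed):

```python
# Consistency check on the table's percentages, derived purely from
# its own figures (176 TWh = 4.4% in 2023). No external data is used.

def implied_total_twh(dc_twh: float, share: float) -> float:
    """Total U.S. consumption implied by a given data-center share."""
    return dc_twh / share

# 2023: 176 TWh at 4.4% implies roughly 4,000 TWh of total demand.
print(f"Implied 2023 total: {implied_total_twh(176, 0.044):.0f} TWh")

# 2028: both ends of the projection band imply total demand near
# ~4,850 TWh — i.e. the projections also bake in overall demand growth.
print(f"Low end:  {implied_total_twh(325, 0.067):.0f} TWh")
print(f"High end: {implied_total_twh(580, 0.12):.0f} TWh")
```

The check confirms the rows are internally consistent and makes an implicit assumption visible: the percentage band presumes total U.S. demand itself grows by roughly a fifth.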
Grid Strain Is Already Real
PJM Interconnection, the largest U.S. grid operator serving 65 million people across 13 states, projects it will be six gigawatts short of its reliability requirements by 2027. That's roughly the output of six nuclear reactors.
Electricity demand is growing faster than the grid — much of it built decades ago — was designed to handle. The strain is geographically concentrated because data centers cluster in the same regions.
The Cost to Consumers
U.S. household electricity bills in data-center-heavy regions could rise $10-25 per month due to grid upgrade costs — Baltimore-area ratepayers, for example, saw increases of approximately $17 per month following PJM capacity auction results. Grid infrastructure investments get spread across all ratepayers, whether or not they use AI.
Meanwhile, some data centers are turning to dedicated natural gas plants. A Google-funded facility in Texas will rely partly on a gas plant expected to emit roughly 4.5 million tons of greenhouse gases per year. The emissions are everyone's problem. The computation is Google's product.
The Externality Pattern: Why Markets Ignore These Bills
In economics, an externality is a cost (or benefit) imposed on a third party who didn't choose to incur it. Pollution is the textbook example. Data centers generate three simultaneous externalities:
| Externality | Who Creates It | Who Pays |
|---|---|---|
| Heat | Data center operator | Surrounding community (higher cooling costs) |
| Water | Data center operator | Local water users (scarcity, higher prices) |
| Electricity | Data center operator | All ratepayers (grid upgrades, higher bills) |
Markets ignore externalities because the cost falls on someone else. The price of an AI query doesn't include the heat absorbed by a neighborhood in Phoenix, the water drawn from an aquifer in Oregon, or the grid capacity consumed in Virginia.
This isn't unique to tech. It's the same economic pattern that drove environmental regulation for factories, power plants, and automobiles. In the 1970s, the Clean Air Act forced manufacturers to internalize pollution costs they had externalized for decades. The result wasn't the death of manufacturing — it was cleaner manufacturing.
The Concentration Problem
What makes data center externalities harder to address is geographic concentration. In northern Virginia, the world's largest data center market, facilities cluster along a narrow corridor. PJM's grid strain, local water drawdowns, and thermal effects compound in the same communities. A single resident might experience all three bills simultaneously — higher electricity costs, reduced water availability, and measurably warmer summers.
Meanwhile, the benefits of AI computation are distributed globally. The mismatch between where costs land and where benefits flow is the defining feature of this externality.
What's Being Done (And What Isn't)
Solutions in Progress
- Immersion cooling: Bathing servers in non-conductive fluid eliminates evaporative water loss. Adoption is growing but still limited to newer facilities
- Renewable energy commitments: Renewables supply roughly a quarter of data center electricity, according to DOE estimates, with nuclear providing another fifth. Renewable generation is projected to meet nearly half of demand growth by 2030
- Heat recycling: Some European facilities redirect waste heat to district heating systems, turning an externality into a product. In Stockholm, data center waste heat warms thousands of apartments
- Efficiency gains: Modern chips deliver more computation per watt each generation. But as the Jevons Paradox predicts, efficiency gains tend to increase total demand rather than reduce it
What's Missing
| Gap | Why It Matters |
|---|---|
| No mandatory disclosure | Industry reporting is voluntary and fragmented. Communities can't assess impact |
| No externality pricing | AI query prices reflect compute costs only, not environmental costs |
| Weak siting regulations | Most jurisdictions lack data-center-specific environmental review |
| No community compensation | Affected residents bear costs without benefit-sharing mechanisms |
The gap between technological solutions and regulatory frameworks is where the externality persists. Technology can reduce the size of each bill. Only policy can ensure the right party pays it.
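To make "externality pricing" concrete, here is a purely illustrative sketch of what internalization would mean for a query price. Every dollar figure below is an invented placeholder, not an estimate from any source:

```python
# Illustrative sketch only: what a query price looks like once the
# three externalized costs are folded into it (the Pigouvian view).
# All dollar figures are invented placeholders, not real estimates.

def full_cost(compute: float, heat: float, water: float, grid: float) -> float:
    """Private (compute) cost plus the three externalized costs."""
    return compute + heat + water + grid

private = 0.002  # hypothetical compute-only price per query, $
social = full_cost(compute=0.002, heat=0.0005, water=0.0003, grid=0.0007)

print(f"private: ${private:.4f}  social: ${social:.4f}")
print(f"implied markup: {social / private - 1:.0%}")  # implied markup: 75%
```

The point is not the numbers but the structure: once each of the three line items has a price, the gap between private and social cost becomes visible to the buyer instead of to the neighborhood.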
The Real Cost of "Free" AI
When a service is priced only for computation, the remaining costs don't disappear — they shift. Heat goes into neighborhoods. Water comes out of aquifers. Electricity comes off the grid. Someone always pays.
Understanding externalities doesn't mean opposing AI. The Clean Air Act didn't kill manufacturing; it made manufacturers pay their actual costs, which drove innovation in cleaner processes. The same pattern will likely play out with data centers — once environmental costs are internalized, the incentive to develop immersion cooling, waste heat recycling, and water-free systems becomes economic, not just moral.
The next time you send a prompt, three invoices will be generated. The question isn't whether they'll be paid — they always are. The question is whether the price tag will eventually appear on your screen or continue arriving at someone else's door.
📌 Sources
- The data heat island effect: quantifying the impact of AI data centers in a warming world (arXiv, March 2026)
- AI datacenters create heat islands around them, paper finds (The Register, April 2026)
- AI data centers create heat islands (Fortune, April 2026)
- Data Centers and Water Consumption (EESI)
- US data centers' energy use amid the AI boom (Pew Research Center)
- AI data centers use a lot of electricity (NPR, January 2026)
Related Posts
- Google TurboQuant: Why AI Efficiency Won't Kill Chip Demand — The Jevons Paradox explains why making AI more efficient actually increases total resource consumption
- AI Literacy: What Every Person Actually Needs to Know — A comprehensive guide to understanding AI's role in daily life