AI needs water to think. Water needs governance to be shared. Governance needs data to be fair. Data needs AI to be seen. Somewhere in this loop, communities disappeared.
Beverly and Jeff Morris bought their house in Madison, Georgia in 2016. A quiet town. One stoplight. Six acres of oak trees. Then Meta broke ground on a $750 million data center, one thousand feet from their property.
Within a year, the water pressure had slowed to a trickle. Then nothing came out at all. The dishwasher, the ice maker, the washing machine, the toilet. Everything stopped. They spent $5,000 trying to fix it. A well replacement would cost $25,000. Their realtor told them only one party would ever be interested in buying their land: Meta itself.
That is one facility. In the same county, nine more companies have applied to build data centers. Some are asking for six million gallons a day, more than the county’s entire current daily usage. Newton County is now on track for a water deficit by 2030. Water rates are projected to increase 33% in two years. The county’s population is 120,000.
This is not an isolated incident. The same pattern is unfolding in Texas, Arizona, Louisiana, the United Arab Emirates, and along the Colorado River. Wherever electricity is cheap, data centers follow. They bring tax revenue. They bring jobs. They bring thirst.
And now this pattern is approaching Ontario.
On February 22, 2026, OpenAI CEO Sam Altman was asked about AI’s water consumption at the India AI Impact Summit. He called the concern “completely untrue, totally insane” and explained that OpenAI no longer uses evaporative cooling in its data centers.
This is a carefully constructed statement. It is narrowly accurate. And it obscures almost everything that matters. Altman is answering a question about his building’s cooling system. The actual question is about a supply chain.
Six weeks earlier, Xylem and Global Water Intelligence published Watering the New Economy, the most comprehensive assessment to date of how AI is reshaping global water use. Their findings tell a fundamentally different story.
Power and chips account for 96% of AI’s growing water demand. When Altman says “we don’t do evaporative cooling anymore,” he is addressing the 4% while declaring the problem solved.
And the power generation share is not abstract or distant. The thermoelectric plants serving a data center are typically on the same regional grid, drawing from the same watersheds. When a data center arrives in Newton County, so does the power demand that draws water from local rivers and aquifers. The true local water footprint is 58% of the supply chain (the 4% from cooling plus the 54% from power generation), not 4%. Only semiconductor fabrication, the remaining 42%, happens elsewhere, at chip fabs in Taiwan, South Korea, and a handful of US facilities. That water is still consumed. It still strains watersheds and communities. It is just not this watershed, not this community. The distance makes it easier to ignore, not less real.
“Water is totally fake. It used to be true. We used to do evaporative cooling in data centres, but now we don’t do that.”
– Sam Altman, India AI Impact Summit, February 22, 2026
AI’s water demand across the full value chain is projected to increase by 129% by 2050, adding 30 trillion litres annually. Nearly 40% of existing data centers are already in water-stressed regions. Even OpenAI’s new data center complex in Abilene, Texas, which uses a closed-loop system, still requires 8 million gallons of water from the city to fill its cooling system.
This is not a factual dispute about gallons per query. It is a study in how the loop stays hidden. When a CEO addresses one node in a supply chain and calls the whole system solved, the framing transforms a systemic resource question into a per-query efficiency metric. One that tells you nothing about the watershed, the power grid, the chip fab upstream, or the community downstream.
The Grand River watershed in Ontario supplies drinking water to nearly one million people. The region around Waterloo and Kitchener is one of the most groundwater-dependent urban areas in Canada, and sits within the Toronto-Waterloo corridor, North America’s third-largest tech region. It is already in crisis: the region’s water system has been operating at 97% capacity, its largest aquifer is at dangerously low levels, and a development freeze has been imposed across much of the region.
And the pressure is no longer hypothetical. Several data center projects are already proposed or underway across the region, at various stages of planning and approval. The same conditions that attracted nine applications to Newton County, Georgia (cheap power, available land, permissive tax structures) are drawing them here.
When the pressure arrives, who evaluates it?
Ontario is proposing to collapse 36 local conservation authorities (the bodies that manage watersheds, issue water-taking permits, and protect flood plains) into nine regional entities. The Grand River Conservation Authority, one of the most established in the country, would be amalgamated into the Eastern Lake Erie Regional Conservation Authority. These are the same authorities whose specialists know which aquifers are stressed, which wetlands are critical to recharge, and which communities are already at the edge of their water supply.
In this restructuring, there is a silence that echoes the one in Georgia, and the one in Altman’s answer.
No reference to data center water demand, or the power and energy that drives it.
No framework for cumulative industrial extraction.
No mention of Indigenous water rights.
No mechanism for community consent.
No acknowledgment that upstream decisions create downstream consequences.
Six Nations of the Grand River, the most populous First Nation in Canada, sits downstream. Their community has been under a drinking water advisory for years. Seventy percent of homes lack access to clean water. In 2025, Six Nations commenced litigation against Canada over its drinking water supply. They are downstream of every water-taking permit the Grand River Conservation Authority has ever issued, and of every permit the new regional body will issue once the GRCA is absorbed.
Three silences, one pattern. A tech CEO calls the water question “fake.” A governance restructuring removes local watershed expertise. A community downstream is not at either table. Each silence, on its own, looks like an oversight. Together, they look like a system.
The water problem is discussed by hydrologists. The AI problem by computer scientists. The governance problem by public administrators. The rights problem by legal scholars and Indigenous governments. The energy problem by grid operators. The economic problem by developers and municipal finance officers.
They are all discussing the same system. They are almost never in the same room.
The silo is not just an intellectual problem. It is a governance failure with a specific mechanism: decisions get made in sequence rather than in relation. A data center is approved based on economic projections. Its water demand is assessed against current supply, not cumulative demand from nine applications. The conservation authority that would have flagged the risk is being absorbed into a body that has never evaluated this watershed.
And the CEO, when asked about all of this, talks about the cooling system in his building.
In February 2026, a multi-disciplinary team led by Pale Blue AI and CIFAR-Earth4D gathered at the University of Toronto for a workshop called Pale Blue Futures: AI for Water Resilience. Hydrologists, AI researchers, mechanism design theorists, economists, and policy scholars. In the same room, working on the same model.
They asked a practical and timely question: If a data center were proposed for the Grand River watershed, what would happen?
To answer it, they built a multi-agent simulation where every water user is modelled with their own actions, observations, rewards, and their own information constraints. Not a single-variable optimization. A system.
Data center developers. Actions: build, extract water, select location, lobby. Observations: water price, audit probability, groundwater level. Rewards: maximize profit minus water and compliance costs. They can lie about actual consumption.

Regulators. Actions: set water prices, set withdrawal limits. Observations: usage across sectors. Rewards: revenue and sustained supply, with catastrophic loss if the system collapses.

Six Nations. Actions: consent decisions (as rights-holders), legal action, community monitoring. Observations: downstream flow and water quality, with limited upstream visibility. Rewards: water security, treaty rights, seven-generations wellbeing.

Farmers. Actions: irrigation demand, conservation practices, land sale decisions. Observations: soil moisture, weather, water allocations. Drought and rising water prices can trigger farm sales to developers, creating feedback on recharge zones.
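As a minimal sketch of that agent structure (class names, parameters, and every number below are illustrative, not the workshop's actual model), the four roles might look like:

```python
from dataclasses import dataclass
import random


@dataclass
class DataCenter:
    """Developer agent: the regulator only sees what it reports."""
    actual_use: float = 5.0   # million litres/day, illustrative
    honesty: float = 0.6      # probability of truthful reporting

    def report_use(self, rng: random.Random) -> float:
        # Information asymmetry: sometimes under-report by half.
        return self.actual_use if rng.random() < self.honesty else self.actual_use * 0.5


@dataclass
class Regulator:
    price: float = 1.0
    withdrawal_cap: float = 20.0

    def set_price(self, reported_total: float) -> float:
        # Raise price as reported use approaches the cap.
        self.price *= 1.0 + max(0.0, reported_total / self.withdrawal_cap - 0.5)
        return self.price


@dataclass
class SixNations:
    """Rights-holder agent: observes only downstream flow."""
    downstream_flow: float = 100.0

    def observe(self, upstream_extraction: float) -> float:
        self.downstream_flow = max(0.0, 100.0 - upstream_extraction)
        return self.downstream_flow


@dataclass
class Farmer:
    irrigation_demand: float = 3.0

    def respond_to_price(self, price: float) -> float:
        # Conservation: demand falls as water price rises, with a floor.
        self.irrigation_demand = max(0.5, self.irrigation_demand / price)
        return self.irrigation_demand
```

The point of the structure is the mismatch between observations: the regulator prices what is reported, not what is extracted, while the downstream rights-holder sees the true extraction only as a reduced flow.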
The model treats Six Nations not as a stakeholder to be consulted but as a rights-holder with independent standing. It explicitly models what the selective framing obscures:
Information asymmetry. Data centers rarely disclose actual water use. Six Nations observes downstream effects but has limited upstream visibility. What happens when one agent can lie?
Tipping points. At what number of data centers does groundwater cross a threshold from which it cannot recover? This is the question that per-query framing makes structurally impossible to ask.
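The tipping-point question can be made concrete with a toy groundwater balance (every parameter here is illustrative; the real model is far richer):

```python
from typing import Optional


def years_to_collapse(n_centers: int,
                      level: float = 100.0,
                      recharge: float = 12.0,
                      draw_per_center: float = 3.0,
                      threshold: float = 40.0,
                      horizon: int = 30) -> Optional[int]:
    """Simulate a simple annual groundwater balance.

    Returns the year the aquifer first drops below the recovery
    threshold, or None if it stays above it for the whole horizon.
    """
    for year in range(1, horizon + 1):
        level += recharge - n_centers * draw_per_center
        if level < threshold:
            return year
    return None
```

With these toy numbers, four facilities hold the aquifer in steady state indefinitely, while a fifth pushes it past the threshold within a generation. The interesting output is not the exact year but the existence of a sharp boundary, which facility-by-facility permit review is structurally unable to see.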
The full loop. Water policy shapes land use, which shapes recharge, which shapes water availability, which shapes the next round of policy. When a drought coincides with peak data center demand, who gets rationed first? These are the simulation scenarios designed for the workshop's hackathon.
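That feedback chain can be sketched as a single annual update (illustrative coefficients, hypothetical structure, not the workshop's code):

```python
def step(state: dict, demand_centers: float, irrigation: float,
         drought_factor: float = 1.0) -> dict:
    """One annual tick of the water -> policy -> land -> water loop.

    state holds "groundwater", "recharge", and "price"; all
    coefficients below are illustrative.
    """
    extraction = demand_centers + irrigation
    # Availability responds to extraction and drought-reduced recharge.
    state["groundwater"] += state["recharge"] * drought_factor - extraction
    # Policy responds to availability: price rises as the aquifer falls.
    state["price"] = max(1.0, 100.0 / max(state["groundwater"], 1.0))
    # Land use responds to policy: sustained high prices push farms
    # toward sale, and sold farmland reduces next year's recharge.
    if state["price"] > 2.0:
        state["recharge"] *= 0.95
    return state
```

Run it for a decade of mild drought with heavy data center demand and the loop closes on itself: the aquifer falls, prices rise, farmland converts, recharge shrinks, and the aquifer falls faster.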
The same model serves different decision-makers asking different questions about the same watershed. A data center developer asks: “Can I build here under drought conditions?” The Six Nations Water Office asks: “What happens to our downstream flow?” A regional planner asks: “What permit conditions protect all rights-holders?”
The Grand River question is a specific instance of a much larger one. Who controls the infrastructure that shapes how knowledge is produced, governed, and distributed?
Today, that control rests almost entirely with a handful of hyperscale cloud providers. The same companies that consume 500,000 gallons of water a day also determine who has access to AI compute, what data those systems are trained on, and whose knowledge is represented. The technology that could help communities manage their own resources is built, owned, and governed by the entities extracting those resources.
But there is another potential infrastructure.
Modest local inference nodes in public libraries, universities, and Indigenous cultural archives, running open-source models for community needs. Connected to a shared backbone for heavier workloads. Not replacing hyperscale. Complementing it with public interest infrastructure.
Community-controlled compute gives data sovereignty a physical address. Data stays under community control at the hardware level. Each node implements appropriate governance: OCAP® for First Nations, CARE principles for Indigenous contexts, and locally determined frameworks for municipalities, marginalized populations, and vulnerable communities whose data is routinely extracted without consent or benefit.
A democratic engagement layer is what makes nodes community-governed, not merely community-located: binding community decisions on data collection, AI model deployment, research priorities, and benefit distribution, using participatory platforms already deployed with 150+ organizations.
This is the Community Knowledge Nodes proposal: a federated network that transforms public institutions into sites of community-governed AI infrastructure. The pieces already exist: Canada’s national research network, the Digital Research Alliance, mature Indigenous governance frameworks, proven participatory democracy platforms. Without the democratic engagement layer, community nodes risk becoming extractive satellites. With it, communities gain the deliberative capacity to govern their own resources.
AI needs water to run. It needs energy that reshapes grids and watersheds. It needs land that belongs to someone. It needs governance to be accountable. Right now, each of these needs is managed in a separate room, and the people with the most power to keep them closed have the most incentive to do so.
The current default: privatization of rewards, socialization of losses. The alternative: shared resources, shared governance, shared benefit.
The question is who is in the room when the decisions are made, and whether they can see the whole loop.
Beverly Morris can see the data center from her front yard. She cannot see how much water it uses. Six Nations can see the downstream water quality. They cannot see the upstream extraction data. Sam Altman can see his building’s cooling system. He calls the rest “fake.”
The loop is already running.
The question is whether the people inside it can see it, shape it, and share in what it produces. Or whether they will learn what happened when the water stops.