Utah’s New AI Data Center Pitch: Save Water, Burn Gas, Skip the Grid


AI data centers have become the new neighborhood villain: they guzzle electricity, they drink up water, and they show up right when both are getting harder to come by.

Now a proposed campus in Utah says it can ease the squeeze on both. The trick: tap “historic” groundwater rights for water, and generate a big chunk of its own power on-site with natural-gas generators backed by battery storage—so it doesn’t have to wait years for a grid hookup or jack up local electric bills.

The developers are dangling two headline numbers: 75% less water use than the agricultural operation that used to run on the land, and on-site generation that could scale to a jaw-dropping 12 gigawatts.

That’s the sales pitch. The obvious comeback: are we really “greening” anything if we save water by… burning a mountain of gas?

How they plan to cut water use: closed-loop “direct-to-chip” cooling

The water story hinges on a closed-loop cooling system called direct-to-chip. Instead of cooling an entire room of hot air (and often relying on evaporative cooling, which can be brutally water-hungry), water circulates in a sealed circuit and pulls heat straight off the chips.

The project claims that switch saves water on the order of millions of liters a year compared with common evaporative setups. And the basic logic checks out: modern AI hardware runs hot. Those accelerators—specialized chips used to train and run models—pack a lot of power into a small area. Cooling them at the source is efficient.

But there’s a catch. These systems can require more electricity than conventional air cooling to pump and manage the liquid loop. So you can end up “saving” water while quietly spending more energy to do it. The problem doesn’t vanish; it moves.
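To get a feel for the scale of the water claim, here is an illustrative back-of-envelope comparison. Every number in it is an assumption chosen for illustration—the facility size, the water-usage-effectiveness (WUE) figures—not data from the Utah project; the evaporative figure is a commonly cited industry ballpark.

```python
# Illustrative back-of-envelope: annual cooling water for a hypothetical
# 100 MW IT load, comparing evaporative cooling with a sealed closed loop.
# ALL figures below are assumptions for illustration, not project data.

IT_LOAD_MW = 100          # hypothetical facility IT load (assumption)
HOURS_PER_YEAR = 8760
WUE_EVAPORATIVE = 1.8     # liters evaporated per kWh of IT load
                          # (commonly cited industry ballpark)
WUE_CLOSED_LOOP = 0.1     # sealed loop: only occasional makeup water

annual_kwh = IT_LOAD_MW * 1000 * HOURS_PER_YEAR

evaporative_liters = annual_kwh * WUE_EVAPORATIVE
closed_loop_liters = annual_kwh * WUE_CLOSED_LOOP

print(f"Evaporative: {evaporative_liters / 1e9:.2f} billion liters/year")
print(f"Closed loop: {closed_loop_liters / 1e9:.2f} billion liters/year")
print(f"Saved:       {(evaporative_liters - closed_loop_liters) / 1e9:.2f} "
      f"billion liters/year")
```

Under these assumptions the gap is on the order of a billion liters a year for a single 100 MW site—which is why “millions of liters saved” is plausible even for much smaller deployments, and also why the extra pumping electricity matters at this scale.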

The developers also lean hard on a comparison that plays well in public meetings: the site includes roughly 4,000 acres that were previously agricultural, and they say the data center operation would use 75% less water than that prior farming use. Messaging-wise, it flips the script from “tech is stealing our water” to “this land used to drink more than we will.”

One big asterisk: the water access depends on historic groundwater rights tied to the property. That’s not a plug-and-play model other projects can copy. It’s a legal and geographic advantage—one plenty of communities don’t have, and one that can turn into a political brawl fast when drought and growth collide.

Power on-site: up to 12 GW of gas generation, plus batteries

The second pillar is electricity—specifically, not relying on the local grid.

The plan calls for a large fleet of natural-gas generators paired with battery storage, with capacity that could eventually reach 12 gigawatts. That’s not “a big warehouse with servers.” That’s utility-scale muscle.

The motivation is straightforward: grid interconnections can take years, and local networks in fast-growing regions are already strained. Meanwhile, electricity demand is getting yanked in multiple directions at once—EVs, household growth, and the data-center boom all pulling on the same rope. So the campus is basically saying: we’ll bring our own power, keep our own schedule, and (conveniently) avoid pushing costs onto local ratepayers.

Environmentally, though, this is where the pitch gets spicy. Saving water with sophisticated cooling while leaning on gas generation invites the immediate criticism: emissions.
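A rough sense of what 12 GW of gas generation could mean in emissions terms helps explain why critics pounce. Again, every input here is an assumption for illustration—the capacity factor and the emissions intensity are generic figures for efficient gas plants, not numbers from the project.

```python
# Illustrative back-of-envelope: annual CO2 from gas-fired generation at
# a full 12 GW build-out. ALL inputs are assumptions for illustration.

CAPACITY_GW = 12
CAPACITY_FACTOR = 0.8        # assumed high utilization for steady AI load
EMISSIONS_KG_PER_KWH = 0.4   # rough figure for efficient gas plants
HOURS_PER_YEAR = 8760

annual_kwh = CAPACITY_GW * 1e6 * HOURS_PER_YEAR * CAPACITY_FACTOR
annual_co2_tonnes = annual_kwh * EMISSIONS_KG_PER_KWH / 1000

print(f"Annual generation: {annual_kwh / 1e9:.0f} TWh")
print(f"Annual CO2:        {annual_co2_tonnes / 1e6:.0f} million tonnes")
```

Under these assumptions the campus would generate tens of terawatt-hours a year and emit CO2 in the tens of millions of tonnes—utility-scale output with utility-scale emissions, which is exactly the tension the water-savings pitch has to answer for.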

That’s awkward timing for an industry that loves big 2030 promises. Google, Amazon, and Meta have all talked up “water positive” goals—putting more water back into watersheds than they consume—and net-zero carbon targets. Microsoft has similar water goals and says it wants to be carbon negative by 2030 (removing more carbon than it emits). The more AI sites that pop up, the harder those pledges get to square with reality.

And the accountability gap is real. In a May survey by S&P Global, only 21% of more than 15,000 companies said they actually quantify how their AI initiatives affect sustainability goals. Translation: AI is sprinting; the environmental bookkeeping is jogging.

A clever workaround—or a convenient exception?

This Utah proposal is being sold as a practical answer to two bottlenecks: water and power. From an industrial standpoint, it’s coherent—lock down the inputs you can’t operate without, and don’t wait for overloaded infrastructure to catch up.

From a public-interest standpoint, it’s also a bundle of exceptions: private water rights plus an off-grid energy strategy that, at full build-out, looks less like a “data center” and more like a power plant that happens to run servers.

The core contradiction is the whole story. The campus says it can save millions of liters of water a year with closed-loop cooling, while also building gas-fired capacity that could reach 12 GW. If you’re a local official, a resident, or a regulator, you’re forced into a tradeoff nobody wants to own: do you prioritize local water savings, or do you prioritize cutting emissions—even if that means using more water or waiting longer for cleaner grid power?
