Date published: 16 Dec 2025

Author: Francesca Cain-Watson

As AI reshapes how we work and learn, the companies building it face unprecedented obstacles in meeting the demand for more compute power. Data centers are popping up everywhere from suburban New Jersey to rural Texas; these massive server farms power everything from ChatGPT queries to global weather forecasts.


But these “AI Factories” require ever-increasing amounts of electricity and water to run and cool them. And there’s another problem: Communities are increasingly saying “no thanks” to new data center projects, and that pushback is causing real delays. Over $64 billion worth of data center projects have already been delayed or cancelled due to local opposition. At the current pace, zoning boards may slow AI progress more than any technical challenge.


The future neighbors of AI data centers aren't anti-tech; they're reacting to the IRL downsides of living near traditional data centers. These facilities can use as much power as an entire town and require huge amounts of water for cooling, straining local ecosystems. As news spreads about data center projects that contaminate local water supplies, increase air pollution, and spike utility rates, the pushback is not surprising. And these buildouts don't provide many jobs for local residents, despite optimistic projections from investors. It's not just individual opposition: over 200 environmental organizations have called for a moratorium on new data center buildouts in the US.


Flipping the Script

The tech sector is at a crossroads. Right now, a lot of companies follow a “build it first, justify it later” playbook. They announce a project, talk up the jobs and tax revenue, and only then start addressing environmental concerns when communities get upset. This reactive approach leads to delays, cancellations, and mistrust. A better approach is “prove and partner”: Show up early with real sustainability plans, treat community members as actual partners, and make environmental responsibility part of the core value proposition. 


Thermal Management: The Cause of (and Solution to) All of AI's Problems

Apologies to Homer Simpson for the misquote, but heat is one of the biggest obstacles to building AI data centers. The GPUs essential for AI workloads run so hot that conventional air conditioning becomes both energy-intensive and inadequate, and each new generation of AI hardware generates significantly more heat than the last. Traditional data center facilities spend up to half of their total energy consumption just on cooling, and many rely on water-intensive systems that lose huge amounts of water through evaporation. New approaches to cooling are urgently needed.


Technology that Wins Back Trust

Advanced liquid cooling technologies represent a paradigm shift that could resolve both the performance and sustainability challenges facing modern data centers. Instead of blasting cold air around server rooms or evaporating water into the atmosphere, these systems use sealed, closed loops to carry heat away and even allow it to be reused for other purposes. Some solutions can cut cooling energy use by more than 80% while dramatically reducing water consumption, which is exactly the kind of story communities want to hear before they agree to host new facilities.
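To put those efficiency claims in rough perspective, here is a back-of-the-envelope sketch using Power Usage Effectiveness (PUE), the standard ratio of total facility power to IT power. The load figures below are illustrative assumptions, not measurements from any particular facility or vendor:

```python
# Power Usage Effectiveness (PUE) = total facility power / IT power.
# A PUE near 2.0 means roughly half the facility's energy goes to
# overhead (mostly cooling); a PUE near 1.0 means almost all of it
# reaches the servers doing useful work.

def pue(it_kw: float, cooling_kw: float, other_kw: float = 0.0) -> float:
    """Ratio of total facility power to IT power."""
    return (it_kw + cooling_kw + other_kw) / it_kw

# Illustrative loads (assumptions for the sake of the arithmetic):
legacy = pue(it_kw=1000, cooling_kw=900, other_kw=100)   # air-cooled facility
liquid = pue(it_kw=1000, cooling_kw=150, other_kw=100)   # >80% less cooling energy

print(f"air-cooled PUE:    {legacy:.2f}")   # 2.00
print(f"liquid-cooled PUE: {liquid:.2f}")   # 1.25
```

In this sketch, cutting cooling energy by more than 80% takes the hypothetical facility from spending about half its power on overhead to spending it almost entirely on compute.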


It’s not hard to connect the dots between AI infrastructure and everyday life. Almost everything online—from checking a bank balance to searching for a recipe—runs through data centers. The problem is that many companies still treat sustainability as a “nice to have” instead of leading with it. And with more efficient cooling, data centers don’t have to be massive, resource-hungry buildings. Smaller, local facilities can deliver compute closer to where people live and work, without overwhelming the surrounding environment.


Containerized, liquid-cooled data centers are a good example of what this future could look like. They can be dropped into urban or suburban environments with a smaller footprint, reducing latency while also lowering environmental and community impact. And since they circulate fluid silently instead of relying on noisy fans, they make a much quieter neighbor than your typical fraternity house. It’s a timely shift from the “bigger is always better” mindset that has dominated data center development for years.


AI for All, No Cap

The companies that succeed in the next wave of AI infrastructure buildouts will be the ones that treat sustainability as a strategic advantage, not just a box to check. They’ll show up with credible environmental plans, invite communities into the process, provide job training and career paths, and prove that powerful computing and responsible resource use can go hand in hand.


If done right, the AI revolution won’t just benefit shareholders—it will create real, shared value for the communities hosting this critical infrastructure.