Data Center Cooling Explained: Legacy to High-Density Systems

In the digital age, data is the new currency, and data centers are the vaults. But inside these vaults, a silent battle is constantly being waged: the battle against heat.

For IT facility managers and energy strategists, cooling is no longer just a facility management checkbox; it is a critical pillar of operational strategy. It dictates uptime, defines energy efficiency, and ultimately limits (or unlocks) a facility’s ability to scale. With the explosion of AI, machine learning, and high-performance computing (HPC), rack densities are skyrocketing. The cooling solutions that worked a decade ago are struggling to keep up.

At NexCap Energy, we understand that power and cooling are two sides of the same coin. Efficient cooling reduces the energy load, while reliable power storage ensures that cooling systems never fail. In this guide, we explore the evolution of data center cooling, from legacy CRAC units to futuristic high-density solutions.

The Cooling Challenge: Why It Matters More Than Ever

Before diving into the hardware, we must understand the stakes. Poor cooling decisions result in “hot spots,” leading to server throttling or, worse, catastrophic thermal shutdown. Furthermore, cooling often accounts for 40% or more of a data center’s total energy consumption.

Optimizing this system isn’t just about temperature; it’s about Power Usage Effectiveness (PUE). The more efficient your cooling, the more power you have available for actual computing.
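PUE is simply the ratio of total facility power to the power delivered to IT equipment; a perfect score is 1.0, and every watt of cooling overhead pushes it higher. A minimal sketch of the arithmetic (the load figures are illustrative, not from any specific facility):

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power."""
    return total_facility_kw / it_load_kw

# Illustrative example: a 1,000 kW IT load plus 400 kW of cooling
# and 100 kW of other overhead (lighting, losses, etc.):
print(pue(1000 + 400 + 100, 1000))  # 1.5
```

Cutting that 400 kW cooling overhead in half would drop the PUE to 1.3, freeing that capacity for compute.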

We will examine the four dominant technologies shaping the landscape today:

  1. CRAC (Computer Room Air Conditioners)
  2. CRAH (Computer Room Air Handlers)
  3. Thermal Walls
  4. RDHx (Rear-Door Heat Exchangers)

CRAC Units: The Legacy Workhorse

The Computer Room Air Conditioner (CRAC) is the grandfather of data center cooling. If you walk into a server room built in the early 2000s, this is likely what you will see.

How It Works

CRAC units operate similarly to the air conditioner in your home, but on a larger scale. They utilize a Direct Expansion (DX) cycle. The unit contains a compressor and refrigerant. It sucks in hot air from the server room, passes it over cold refrigerant-filled coils to remove the heat, and blows the cooled air back into the room (often under a raised floor). The heat is then rejected to the outside via an external condenser.

The Pros and Cons

Advantages

  • Simplicity: Self-contained units; easy to install without complex plumbing.
  • Low Entry Cost: No need for a central chiller plant or water towers.
  • Redundancy: Easy to add “one more unit” as you grow incrementally.

Limitations

  • Capacity Limits: Generally caps at ~100 kW per unit. Not built for hyperscale.
  • Inefficiency: The DX cycle consumes significant electricity compared to water-cooled systems.
  • Maintenance: Requires refrigerant handling and compressor maintenance.

CRAC is still a viable solution for Telecom Backup Power Solutions sites or smaller server rooms where installing a massive water chiller plant isn’t financially feasible.

CRAH Units: The Enterprise Standard

As data centers grew, the limitations of CRAC units became apparent. Enter the Computer Room Air Handler (CRAH). While they look similar to CRACs on the outside, the inside is fundamentally different.

How It Works

CRAH units do not have compressors inside them. Instead, they act as the lungs of the data center, connected to a massive central heart: the Chiller Plant.

  1. A central chiller produces cold water.
  2. Pumps circulate this water to the CRAH units on the data floor.
  3. Fans in the CRAH unit draw hot air over the water-filled coils.
  4. The water absorbs the heat and cycles back to the chiller.
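Because water carries heat so much more effectively than air, the flow rates in step 2 are modest. A back-of-the-envelope sketch of the water-side sizing (the 6 K supply-to-return temperature rise is an assumed design value):

```python
WATER_CP = 4.186  # kJ/(kg*K), specific heat of water; 1 L of water ~ 1 kg

def chilled_water_flow_ls(heat_kw: float, delta_t_k: float) -> float:
    """Chilled-water flow (litres/second) needed to absorb a heat load
    at a given supply-to-return temperature rise: Q = m * cp * dT."""
    return heat_kw / (WATER_CP * delta_t_k)

# A CRAH at its ~250 kW capacity with an assumed 6 K water-side
# delta-T needs roughly 10 L/s of chilled water:
print(round(chilled_water_flow_ls(250, 6), 1))
```

Ten litres per second through a pipe, versus tens of cubic metres of air per second through fans, is the efficiency argument for CRAH in one number.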

The Pros and Cons

Advantages

  • Efficiency: Water conducts heat roughly 24x better than air. Using central chillers is far more efficient than hundreds of small compressors.
  • Capacity: Can handle up to ~250 kW per unit.
  • Reliability: Fewer moving parts inside the unit (mostly just fans and valves).

Limitations

  • High CapEx: Requires expensive piping, pumps, and a central chiller plant.
  • Water Risk: Introducing water piping onto the data floor adds a risk of leaks.
  • Raised Floor Reliance: Often relies on pressurized raised floors, which can be difficult to balance.

CRAH is the standard for a reason. For facilities utilizing our Industrial Peak Shaving Solutions, switching to efficient CRAH systems can lower baseload power, making peak shaving even more effective.

Thermal Wall Units: The High-Density Air Solution

As we move toward 2026, the traditional raised-floor air distribution (used by CRAC and CRAH) is falling out of favor. It takes up vertical space and is hard to manage. The solution? The Thermal Wall.

How It Works

Imagine a wall of fans. Thermal wall units are large, modular arrays of coils and fans built directly into the perimeter of the data hall.

Instead of pushing air under the floor, they flood the room horizontally with cold air. The hot air from the servers is captured in a dedicated “hot aisle” corridor behind the racks and sucked back into the wall.

Why It’s Winning

  • Space Saving: Because they are integrated into the wall, they free up the white space (floor space) for more server racks.
  • Massive Airflow: A thermal wall can move massive amounts of air at low velocity, capable of cooling 500 kW+ zones.
  • Efficiency: By separating the maintenance corridor (hot side) from the data hall (cold side), they achieve near-perfect air isolation.
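The "massive airflow at low velocity" claim can be sanity-checked with the same sensible-heat arithmetic. A sketch, assuming a 15 K air-side temperature rise and a gentle 2.5 m/s face velocity (both illustrative design values):

```python
AIR_DENSITY = 1.2  # kg/m^3
AIR_CP = 1.006     # kJ/(kg*K)

def wall_face_area_m2(heat_kw: float, delta_t_k: float,
                      face_velocity_ms: float) -> float:
    """Fan-wall face area needed to move enough air for a heat load,
    given the allowed air temperature rise and a low face velocity."""
    airflow_m3s = heat_kw / (AIR_DENSITY * AIR_CP * delta_t_k)
    return airflow_m3s / face_velocity_ms

# A 500 kW zone, 15 K rise, 2.5 m/s face velocity -> ~11 m^2 of wall:
print(round(wall_face_area_m2(500, 15, 2.5), 1))
```

Spreading the flow over a large wall area is what keeps the velocity, and therefore the fan power and noise, low.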

This is the architecture of choice for the giants (Google, Meta, AWS). It pairs perfectly with large-scale energy storage, such as our Microgrid Energy Management systems, to ensure continuous operation of the massive fans required.

RDHx (Rear-Door Heat Exchangers): The AI Era Solution

We are now entering the era of AI. A standard server rack used to consume 5 kW. An AI training rack can consume 50 kW to 100 kW. Blowing cold air at that rack is like trying to cool a campfire by blowing on it; it simply isn’t enough.

RDHx brings the cooling to the source.

How It Works

An RDHx is essentially a radiator installed on the back door of the server rack.

  1. The servers pull in room-temperature air.
  2. The servers heat the air up.
  3. As the hot air exits the back of the rack, it immediately passes through the RDHx coil (filled with chilled water or refrigerant).
  4. The air is neutralized (cooled back down to room temp) before it even enters the room.
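Step 4 can be sketched with a simple heat-exchanger effectiveness model, T_out = T_in − ε · (T_in − T_water). The 80% coil effectiveness and the temperatures below are illustrative assumptions, not specs for any particular door:

```python
def rdhx_exit_temp_c(rack_exhaust_c: float, water_supply_c: float,
                     effectiveness: float = 0.8) -> float:
    """Air temperature leaving a rear-door coil, using a simple
    effectiveness model: T_out = T_in - eff * (T_in - T_water)."""
    return rack_exhaust_c - effectiveness * (rack_exhaust_c - water_supply_c)

# 45 C rack exhaust against 20 C supply water at 80% effectiveness:
print(rdhx_exit_temp_c(45.0, 20.0))  # 25.0 -> air re-enters the room near ambient
```

Note that the model works even with relatively warm (20 °C) water, which is exactly why RDHx pairs so well with free cooling.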

The Pros and Cons

Advantages

  • Targeted Cooling: Treats the heat exactly where it is created.
  • High Water Temps: Can use warmer water (e.g., 20°C/68°F), drastically improving chiller efficiency (free cooling).
  • Room Neutrality: The data center room doesn’t get hot, because the heat never escapes the rack.

Limitations

  • Complexity: Every rack needs plumbing connections.
  • Cost: High upfront investment per rack.
  • Weight: Heavy doors can make rack access slightly more cumbersome.

RDHx is the bridge to the future. It allows legacy data centers to host high-density AI clusters without rebuilding the entire facility.

The Critical Variable: Containment

Regardless of whether you use CRAC, CRAH, or Thermal Walls, the biggest enemy is mixing. If your cold air mixes with hot exhaust air before it reaches the server, you are wasting money. This is where Containment comes in.

Hot Aisle Containment (HAC)

  • Concept: Enclose the back of the servers. Trap the hot air and duct it straight back to the AC unit.
  • Benefit: The rest of the room becomes a giant “cold aisle.” It is generally more comfortable for humans to work in.
  • Best for: New builds and Thermal Wall setups.

Cold Aisle Containment (CAC)

  • Concept: Enclose the front of the servers. Trap the cold air so the servers must drink it.
  • Benefit: Easier to retrofit into existing raised-floor environments.
  • Drawback: The rest of the room becomes a “hot aisle,” which can be uncomfortable for technicians.

Conclusion

As we look toward 2026, the data center is no longer just a building; it is a complex energy ecosystem. The choice between CRAC, CRAH, Thermal Wall, and RDHx depends on your density, your budget, and your growth goals.

But remember: Cooling requires power.

A power outage doesn’t just mean servers go down; it means cooling pumps stop, fans stop, and temperatures spike to critical levels in seconds. This is why robust energy storage is the partner to every cooling solution.

Whether you need EV Fleet Charging Solutions for your logistics or massive Grid Scale Battery Storage to back up your chiller plant, the infrastructure must be holistic.

Optimizing your cooling is step one. Securing the power that runs it is step two.

Interested in securing your data center’s power infrastructure? Contact NexCap Energy today.
