Gordon Johnson, Subzero Engineering’s Senior CFD Manager, examines how air cooling remains critical for data center memory, storage, and containment systems

By Amber Jackson
As Published on DataCentreMagazine.com
Why liquid cooling is becoming essential for AI-driven data centers—while air cooling still plays a critical role
As AI drives immense compute demand, it is expected to be the main cause of worldwide data center power consumption doubling between 2022 and 2026, putting pressure on operators to tackle sustainability head-on and curb rising emissions.
One of the main topics of conversation is thermal management, as GPUs continue to push rack power densities upward. Cooling servers has quickly become one of the most pressing challenges in hyperscale data center environments, particularly as traditional air cooling systems can no longer keep up with the demands of AI workloads.
To dig into this further, Subzero Engineering’s Senior CFD Manager, Gordon Johnson, shares his analysis that liquid cooling has quickly become the new norm for data centers of the future—but that air cooling is still required.
“Direct Liquid Cooling (DLC), and specifically Direct-to-Chip (DTC), is now essential for controlling heat,” he says. “However, about 25% of the heat produced by IT equipment still needs to be expelled through the air, especially from secondary parts such as memory subsystems, storage, and power delivery circuits.
“It is impossible to overlook this residual heat, and that’s where traditional airflow strategies are still needed, albeit in a supporting role.”
Confronting hyperscale cooling challenges
Gordon explains that hyperscale operators are seeing a sharp rise in OPEX from both power and cooling, which has become one of their most significant challenges.
“In recent years, power and cooling have become strategic levers and margin killers in hyperscale operations,” he says. “If you’re operating at scale, your P&L is directly tied to your power and cooling intelligence.
“Those who get it right will widen their advantage. Those who don’t could find AI infrastructure becoming financially unsustainable.”
He argues that efficiency is no longer just best practice but a necessity, because, despite the rise of renewable energy resources, AI is effectively slowing down decarbonization.
“Data centers’ energy usage is driven by the fact that advancements in AI model performance frequently result in larger models and more inference, raising energy costs and contributing to sustainability challenges,” he explains.
“AI needs to get more efficient, not just more powerful.”
Energy consumption remains a key concern as AI continues to boom. Once deployed, these models require an enormous amount of inference infrastructure to process countless queries every day.
“Modern AI GPUs are now drawing upwards of 500 watts per chip,” Gordon says. “Hyperscale data centers that once operated in the 10–30 kW/rack range are now pushing 80–120 kW/rack to support AI training and inference.
“With air cooling limited to about 30–40 kW/rack, the air just cannot carry the created heat quickly enough, even with optimal containment and supply airflow.”
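As a rough illustration of that ceiling, the airflow a rack needs follows from the heat-transfer relation Q = ṁ·cp·ΔT. The short sketch below is an illustrative calculation, not from Subzero Engineering; the 12°C intake-to-exhaust temperature rise and standard air properties are assumptions chosen only to show the scale of the problem.

```python
# Rough estimate of the airflow an air-cooled rack needs: Q = m_dot * cp * dT
# Assumptions (illustrative, not from the article): air density 1.2 kg/m^3,
# specific heat 1005 J/(kg*K), and a 12 C rise from cold-aisle intake to exhaust.

AIR_DENSITY = 1.2         # kg/m^3
AIR_SPECIFIC_HEAT = 1005  # J/(kg*K)
DELTA_T = 12.0            # K, assumed intake-to-exhaust temperature rise

def required_airflow_cfm(rack_load_kw: float) -> float:
    """Volumetric airflow (CFM) needed to carry rack_load_kw of heat in air."""
    watts = rack_load_kw * 1000
    m3_per_s = watts / (AIR_DENSITY * AIR_SPECIFIC_HEAT * DELTA_T)  # from Q = m_dot*cp*dT
    return m3_per_s * 2118.88  # convert m^3/s to cubic feet per minute

for kw in (30, 40, 80, 120):
    print(f"{kw:>4} kW/rack -> ~{required_airflow_cfm(kw):,.0f} CFM")
```

Under these assumptions, a 30–40 kW rack already needs roughly 4,000–6,000 CFM, near the practical limit of fans and floor-tile delivery, while 80–120 kW implies well over 10,000 CFM per rack, which is why airflow alone cannot keep pace.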
Embracing air cooling alongside direct liquid cooling
DLC, and specifically DTC, gives hyperscale operators higher compute density, increased energy efficiency, and more reliable thermal control at the component level.
Gordon says: “It is a practical means of maintaining the safe thermal working range of contemporary CPUs and GPUs. DLC also permits higher incoming air temperatures, reducing reliance on traditional HVAC systems and chillers.”
However, he argues that advanced DTC systems do not eliminate the need for air cooling.
“Air cooling is necessary even with the most sophisticated DTC systems,” he says. “The cooling of non-critical components, cabinet pressurization, and residual heat evacuation still require airflow.”
Additionally, hot and cold aisle containment systems have proven effective at separating hot exhaust air from cold intake air.
“Efficiency increases can result in a cooling energy decrease of 10–30%,” Gordon says. “Containment is essential for optimizing the performance of air-cooled systems in legacy settings.
“Raised flooring, hot/cold aisles, and containment systems are becoming progressively more crucial in environments that are transitional or hybrid (liquid + air cooled).
“These airflow techniques aid in the separation of AI-specific and older infrastructure in mixed-use data centers. However, in modern AI racks, air cooling is the supporting act rather than the main attraction.”
Gordon explains that even slight improvements to airflow containment can ultimately deliver large-scale energy savings in high-density settings.
“By stabilizing temperature zones and lowering fluctuation, this improves cooling system responsiveness while lowering chiller load and encouraging energy-reuse initiatives,” he explains.
“Hot/cold aisle containment is no longer just a best practice—it is becoming a critical optimization layer in tomorrow’s high-performance, high-efficiency data centers.
“For operators managing hundreds of megawatts of IT load, hot/cold aisle containment is still one of the most cost-effective, space-efficient tools available.”
Where does the industry go from here?
As the data center industry continues to transition toward liquid cooling adoption, Gordon is eager for operators to understand that air cooling will remain relevant.
He explains that air management in the cooling stack is shifting from a primary to a supporting, yet essential, role, and that the industry is refining and re-integrating traditional tactics alongside cutting-edge liquid systems rather than discarding them.
“Hyperscalers are under constant scrutiny to meet net-zero targets. In addition to complying with energy efficiency regulations, the hybrid solution offers data center operators a way to transition from conventional air-cooled facilities to liquid-readiness without requiring complete overhauls,” he says.
“With high-density AI workloads, air cooling just cannot keep up. It’s a physical limitation. Hybrid methods that combine regulated airflow with DLC are now the engineering benchmark for scalable, effective, and future-ready data centers.”