While direct-to-chip cooling dominates today’s data centers, immersion cooling may ultimately define how the industry scales performance, efficiency, and sustainability tomorrow.

By Subzero Engineering
The Safe Choice Today vs. the Scalable Choice Tomorrow
For many large-scale deployments, direct-to-chip (DTC) single-phase cooling has emerged as the market’s preferred direct liquid cooling (DLC) technique.
It is easy to see why. DTC is dependable, well established, and relatively simple to integrate into existing data center infrastructure. For risk-sensitive facilities wary of operational disruption or retrofit headaches, DTC is the logical choice, and that is why it has become the dominant DLC standard.
But the “logical choice today” is not the same as the “best choice for the future.” Technically, the superior solution is immersion cooling. By submerging entire servers in dielectric fluid, immersion removes more heat than air cooling or DTC can, which lets it support far denser racks. For now, immersion is mostly confined to niches such as crypto mining, experimental high-performance computing, and some edge deployments. It has yet to gain broad traction in the wider data center market, mainly because of high initial costs, the need for specialized infrastructure, and the retraining it demands.
But given the steady trajectory of compute demand, energy economics, and environmental pressure, is immersion cooling simply waiting for its moment to shine?
Why DTC Leads Today
Today, DTC dominates the market because it is easy to adopt. DTC solutions can be deployed in standard racks with minimal retrofitting, data center teams do not need extensive retraining, and maintenance procedures stay largely familiar. Above all, DTC is a straightforward option for organizations that are unable or unwilling to re-architect their environments for immersion cooling, even as power densities climb.
Why Immersion Lags
Immersion cooling is still in the early adoption and growth phase rather than fully mature. However, its efficiency gains suggest it will move from a specialized solution to a vital component of the thermal management arsenal of hyperscalers, HPC operators, and edge deployments.
So why isn’t immersion adopted more often if it is a technically superior solution?
- High CAPEX: Immersion costs significantly more than a DTC retrofit because it requires specialized tanks, dielectric fluids, pumps, and monitoring systems.
- Specialized Infrastructure: Replacing racks with tank-based designs forces a redesign of cabling, power distribution, and facility layout.
- Learning Curve: Hardware compatibility, fluid handling, and maintenance all call for retraining and new skill sets.
High initial expenditure, specialized infrastructure requirements, and an ecosystem that has not yet converged on a single model remain the main obstacles to immersion today. So, is it too disruptive to consider seriously?
The Case for Immersion
History has shown us that often the “niche” of today becomes the necessity of tomorrow.
Immersion cooling solves problems that DTC can only mitigate.
- Unmatched thermal performance: By completely submerging servers in dielectric fluid, immersion absorbs heat from every component, not just CPUs and GPUs.
- Extreme density potential: Racks can be packed much more densely without needing to depend on airflow. This enables higher compute density per square foot.
- Energy efficiency: Immersion cooling can significantly cut the power consumed by cooling itself. Some operators have reported power usage effectiveness (PUE) levels of 1.02 to 1.05 (see the sketch after this list).
- Sustainability: Immersion does not use water as its primary cooling medium, a growing advantage over evaporative cooling technologies in water-scarce regions. Water may still be used indirectly, via heat exchangers that carry heat away from the dielectric fluid and via chillers or cooling towers that reject heat from the building.
- Hardware longevity: Immersion can extend the usable life of servers by eliminating hot spots and reducing thermal cycling.
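To make the PUE figures above concrete: PUE is total facility power divided by IT power, so everything above 1.0 is cooling and other overhead. The minimal sketch below compares an assumed air-cooled baseline against an immersion figure drawn from the range cited above; the IT load and baseline PUE are illustrative assumptions, not measured data.

```python
# Illustrative PUE comparison. All inputs are assumptions for this sketch,
# not measured data. PUE = total facility power / IT power, so the power
# spent on cooling and overhead is (PUE - 1) x IT load.

IT_LOAD_KW = 1_000        # hypothetical 1 MW IT load
HOURS_PER_YEAR = 8_760

PUE_AIR = 1.5             # assumed typical air-cooled facility
PUE_IMMERSION = 1.03      # midpoint of the 1.02-1.05 range cited above

def overhead_kwh(pue: float, it_kw: float = IT_LOAD_KW) -> float:
    """Annual non-IT (cooling + overhead) energy implied by a PUE figure."""
    return (pue - 1.0) * it_kw * HOURS_PER_YEAR

savings = overhead_kwh(PUE_AIR) - overhead_kwh(PUE_IMMERSION)
print(f"Air-cooled overhead: {overhead_kwh(PUE_AIR):,.0f} kWh/yr")
print(f"Immersion overhead:  {overhead_kwh(PUE_IMMERSION):,.0f} kWh/yr")
print(f"Energy saved:        {savings:,.0f} kWh/yr")  # ~4.1 million kWh/yr
```

At a hypothetical 1 MW IT load, the gap works out to roughly 4.1 million kWh per year under these assumptions, which is the savings pool that immersion’s economics draw on.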
Additionally, immersion cooling removes airflow from the data center design equation. Operators can rethink facility layouts entirely, without air-handling infrastructure, raised floors, or large HVAC systems. Even so, experts in both air and immersion cooling still expect the two approaches to coexist across the industry, matched to specific cooling needs.
What Will Force the Shift?
Operators cannot afford ineffective thermal management as power costs grow and the trajectory of compute power continues to rise. With processors approaching 1,000W+ TDP levels, rack power densities are already climbing past what air and DTC cooling can sustainably handle. Energy prices remain volatile, and environmental pressures are driving ever-higher demands for efficiency and sustainability.
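A back-of-envelope calculation shows why. The rack configuration below is hypothetical, chosen only to illustrate the arithmetic; the 1,000W TDP figure is the one cited above.

```python
# Back-of-envelope rack heat load. The configuration below is hypothetical,
# chosen only to show the arithmetic behind the density problem.

ACCELERATOR_TDP_W = 1_000   # per-device TDP, near the level cited above
DEVICES_PER_SERVER = 8      # assumed accelerators per server
SERVERS_PER_RACK = 8        # assumed servers per rack
OVERHEAD_FRACTION = 0.25    # assumed CPUs, memory, NICs, fans, etc.

accelerator_kw = ACCELERATOR_TDP_W * DEVICES_PER_SERVER * SERVERS_PER_RACK / 1_000
rack_kw = accelerator_kw * (1 + OVERHEAD_FRACTION)

print(f"Accelerator load: {accelerator_kw:.0f} kW per rack")  # 64 kW
print(f"Total rack load:  {rack_kw:.0f} kW per rack")         # 80 kW
# Roughly 80 kW per rack, far beyond the 10-20 kW that conventional
# air-cooled rack designs have typically targeted.
```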
Can immersion’s higher upfront investment be justified when weighed against long-term energy savings, environmental benefits, and the ability to extend hardware lifecycles?
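One way to frame that question is a simple payback calculation. Every input in the sketch below is a placeholder assumption rather than a quote from any vendor or operator; the energy-savings figure reuses the PUE sketch earlier, and real deployments would substitute their own numbers.

```python
# Simple-payback sketch for the question above. Every figure is a
# placeholder assumption, not a quote from any vendor or operator.

CAPEX_PREMIUM_USD = 1_500_000     # assumed extra upfront cost of immersion
ENERGY_SAVED_KWH_YR = 4_100_000   # reusing the PUE sketch's result above
POWER_PRICE_USD_PER_KWH = 0.10    # assumed blended electricity price
HW_LIFE_VALUE_USD_YR = 200_000    # assumed value of extended hardware life

annual_savings = (ENERGY_SAVED_KWH_YR * POWER_PRICE_USD_PER_KWH
                  + HW_LIFE_VALUE_USD_YR)
payback_years = CAPEX_PREMIUM_USD / annual_savings

print(f"Annual savings: ${annual_savings:,.0f}")    # $610,000
print(f"Simple payback: {payback_years:.1f} years")  # ~2.5 years
# The point is the structure of the calculation, not the specific numbers:
# the wider the PUE gap and the higher the power price, the faster
# immersion's upfront premium pays back.
```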
Tipping Point
In the near term, DTC will remain the workhorse, as it’s good enough to buy operators time without demanding a full architectural reset. However, immersion’s tipping point won’t arrive because the technology suddenly becomes more appealing. It will arrive when nothing else works. DTC won’t scale forever.
The long-term destination, though, is likely immersion. Workload trends, power densities, and sustainability requirements all point to a future in which immersion is not only viable but the only workable option.
At that point, immersion will go from trailing to prevailing, and the changes will reach beyond thermal efficiency. Tank-based designs, fluid-centric maintenance, and new hardware form factors will reshape data center architecture itself, turning cooling from a limitation into a facilitator of sustainability, performance, and efficiency.
This is about enabling the next generation of computational infrastructure, not simply about cooling.
Conclusion
The question is not if immersion will catch up, but when high-density workloads will force the shift.
When it does, will operators be prepared? Those who still view immersion as a fringe experiment risk scrambling to catch up when DTC reaches its limit, while early adopters will be positioned to lead because they will already have built the training, processes, and expertise.
For data centers, the future of cooling won’t belong to the cautious. It will belong to those brave enough to wager on immersion before the tipping point arrives.