HPE Announces Industry’s First 100% Fanless Direct Liquid Cooling Systems Architecture
Hewlett Packard Enterprise (NYSE: HPE) today announced the industry’s first 100% fanless direct liquid cooling systems architecture to enhance the energy and cost efficiency of large-scale AI deployments. The company introduced the innovation at its AI Day, held at one of its state-of-the-art AI systems manufacturing facilities. During the event, the company showcased its expertise and leadership in AI across enterprises, sovereign governments, service providers and model builders.
Industry’s first 100% fanless direct liquid cooling system
While next-generation accelerators have become more efficient, power consumption continues to climb as AI adoption intensifies, outpacing traditional cooling techniques.
Organizations running large AI workloads will need to do so more efficiently. The most effective way to cool next-generation AI systems is through direct liquid cooling, of which HPE is a pioneer. This critical cooling technology has enabled HPE to deliver seven of the top 10 supercomputers on the Green500 list, which ranks the world’s most energy-efficient supercomputers.
Based on this expertise, HPE’s 100% fanless direct liquid cooling architecture, introduced today, brings the cost and energy efficiency benefits sovereign AI deployments are already enjoying to a broader set of organizations building large-scale generative AI.
Antonio Neri, president and CEO of HPE, said:
“As organizations embrace the possibilities created by generative AI, they also must advance sustainability goals, combat escalating power requirements, and lower operational costs.
“The architecture we unveiled today uses only liquid cooling, delivering greater energy and cost-efficiency advantages than the alternative solutions on the market. In fact, this direct liquid cooling architecture has the potential to yield a 90% reduction in cooling power consumption as compared to traditional air-cooled systems. HPE’s expertise deploying the world’s largest liquid-cooled IT environments and our market leadership spanning several decades put us in an excellent position to continue to capture AI demand.”
The system architecture is built on four pillars:
- An 8-element cooling design that includes liquid cooling for the GPU, CPU, full server blade, local storage, network fabric, rack/cabinet, pod/cluster and coolant distribution unit (CDU)
- High-density and high-performance system design, complete with rigorous testing, monitoring software, and on-site services to support successful deployment of these sophisticated compute and cooling systems
- Integrated network fabric design for massive scale, incorporating lower-cost and lower-power connections
- Open system design to offer flexibility of choice in accelerators
The 100% fanless direct liquid cooling architecture delivers unique benefits, including a 37% reduction in the cooling power required per server blade compared with hybrid direct liquid cooling alone. This reduces utility costs, carbon emissions and data center fan noise. In addition, because systems using this architecture can support greater server cabinet density, they consume less floor space.
HPE’s leadership and unique market opportunity
At AI Day, Antonio Neri, president and CEO; Fidelma Russo, EVP & GM, Hybrid Cloud and HPE CTO; and Neil MacDonald, EVP & GM, Server, discussed how the HPE portfolio comprises the critical building blocks of networking, storage and hybrid cloud to deliver on the promise of AI.