Virtualization Technology News and Information
Inspur Information Announces AI Server Support for Liquid-Cooled NVIDIA PCIe Tensor Core GPUs

Inspur Information announced at ISC 2022 that it will be among the first to support the recently announced liquid-cooled NVIDIA A100 and H100 PCIe Tensor Core GPUs. As digital transformation and environmental sustainability become increasingly intertwined, Inspur is pioneering greener and more powerful liquid-cooled computing systems for enterprises and industries.

Inspur AI servers, including NF5468M6 and NF5468A5, support up to eight liquid-cooled NVIDIA PCIe GPUs. Inspur was the first server vendor to offer eight liquid-cooled 500W HGX A100 GPUs in its NF5488LA5 and NF5688LA5 servers, and now Inspur is also among the first to support liquid-cooled PCIe GPUs in its product portfolio.

Inspur AI liquid-cooled servers deliver strong AI and general-purpose computing capabilities and can be flexibly configured to user needs. They provide powerful computing for AI and HPC applications including image recognition, speech recognition, natural language processing, scientific research and engineering simulation. Cold plates cover high-power components such as the GPUs and CPUs, and the servers adopt warm-water cooling technologies, allowing the power usage effectiveness (PUE) of Inspur servers to be as low as 1.1. This results in a substantial reduction in operating costs for cooling equipment.
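To put the PUE figure in perspective: PUE is defined as total facility power divided by IT equipment power, so a PUE of 1.1 means only 10% of the IT load is spent on cooling and other overhead. A minimal back-of-the-envelope sketch follows; the 1,000 kW IT load and the 1.5 air-cooled baseline are illustrative assumptions, not figures from Inspur.

```python
# Sketch: what a given PUE implies for facility overhead.
# PUE = total facility power / IT equipment power.
# The IT load and the 1.5 air-cooled baseline are assumptions for illustration.

def facility_power_kw(it_load_kw: float, pue: float) -> float:
    """Total facility power implied by an IT load and a PUE value."""
    return it_load_kw * pue

def overhead_kw(it_load_kw: float, pue: float) -> float:
    """Power consumed by cooling and other non-IT overhead."""
    return facility_power_kw(it_load_kw, pue) - it_load_kw

it_load = 1000.0  # kW of IT equipment (illustrative)

liquid = overhead_kw(it_load, 1.1)  # liquid-cooled, PUE 1.1
air = overhead_kw(it_load, 1.5)     # assumed air-cooled baseline

print(f"Liquid-cooled overhead: {liquid:.0f} kW")
print(f"Air-cooled overhead:    {air:.0f} kW")
```

Under these assumptions, dropping from a PUE of 1.5 to 1.1 cuts non-IT power from roughly 500 kW to 100 kW for the same workload, which is where the cooling-cost savings come from.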

"The rapid growth of data center computing power and power consumption highlights the importance of liquid cooling," said Liu Jun, Vice President of Inspur Information and General Manager of AI and HPC. "Our liquid-cooled AI servers offer powerful and green solutions that make it easier to build next-generation intelligent data centers and hyperscale with tremendous density and performance, while reducing energy costs and being more environmentally sustainable."

"The growing demand for mainstream systems that can effectively run AI applications such as training and inference requires powerful GPUs," said Paresh Kharya, Senior Director of Product Management for Accelerated Computing at NVIDIA. "Inspur's systems powered by liquid-cooled NVIDIA A100 and H100 PCIe GPUs will enable customers to achieve higher performance on these workloads, while improving energy efficiency in the data center."

Inspur Information is a leading AI server provider with a rich portfolio of AI computing products. It works closely with its AI customers to help them achieve significant performance improvements in AI applications including voice, semantics, image, video, and search processing.

Published Wednesday, June 01, 2022 7:05 AM by David Marshall