CLOUD
University of Cambridge's Cloud Runs On Mellanox 40Gb/s InfiniBand
40Gb/s InfiniBand-Accelerated Solution Centre Provides Academic and Commercial Researchers with Readily Accessible Performance for Multiple Scientific Applications
Mellanox Technologies' 40Gb/s InfiniBand connectivity products, including ConnectX-2 adapter cards, BridgeX gateway systems, 40Gb/s InfiniBand switches and cables, provide the high-performance server and storage networking for the new Dell | Cambridge High-Performance Computing Solution Centre. On the University campus, the HPC cluster provides on-demand access for more than 400 internal users spread across 70 research groups, ranging from traditional sciences such as chemistry, physics and biology to rapidly growing areas of HPC-based research such as biomedicine, clinical medicine and the social sciences.
The University of Cambridge is a world-leading teaching and research institution, consistently ranked among the top three universities worldwide. The University also forms the central hub of Europe’s largest technology centre, with more than 1,200 technology companies located in science parks surrounding the city, a region that is also home to Europe’s largest biotechnology centre.
“The Solution Centre provides accessible research computing services and technology to academic and private-sector organizations that would otherwise not have the money or expertise,” said Dr. Paul Calleja, Director of the HPC Service, University of Cambridge. “Incorporating Mellanox 40Gb/s InfiniBand, with its GPU efficiency and offloading capabilities, as the clustering backbone of the Solution Centre was essential to our cloud provision requirements, and it provides our research community with leading application performance to enhance and accelerate the wide range of their research.”
“InfiniBand connectivity enables HPC applications to achieve significant performance improvements, allowing researchers to get results faster,” said Marc Sultzbaugh, vice president of worldwide sales at Mellanox Technologies. “By integrating Mellanox 40Gb/s InfiniBand interconnects, the University of Cambridge can deliver on-demand high-performance cluster access and industry-leading application performance to its large pool of academic and commercial researchers.”
Mellanox’s end-to-end InfiniBand connectivity, consisting of the ConnectX-2 line of I/O adapter products, gateways, cables and the comprehensive IS5000 family of fixed and modular switches, delivers industry-leading bandwidth, efficiency and economics for the best return on investment in performance interconnects. Mellanox provides its worldwide customers with the broadest, most advanced and highest-performing end-to-end networking solutions for the world’s most compute-demanding applications.