NETWORKS
Penguin Announces Availability of Mellanox's BridgeX Multi-protocol Gateway Systems to Simplify and Consolidate HPC Cluster Network Fabrics
Penguin's support for BridgeX now enables high-performance computing (HPC) cluster customers to run all typical network protocols over one common fabric, simplifying deployment, maximizing productivity and reducing overall cost. Penguin Computing, a Mellanox Certified Channel Partner, today announced the availability of Mellanox's BridgeX line of multi-protocol gateway systems in Penguin Computing's Application-Ready High-Performance Computing cluster solutions.
"Penguin Computing has a long standing relationship with Mellanox and the BridgeX gateway technology announced today represents yet another leading-edge innovation available through Penguin Computing," said Tom Coull, senior vice president of product management and development at Penguin Computing. "Our mission is to bring the latest supercomputing technologies to our high-performance technical computing customers and Mellanox's BridgeX is now a key ingredient of our solutions portfolio."
InfiniBand fabrics are increasingly used by Fortune 500 companies to accelerate innovation. Mellanox BridgeX brings multi-protocol, multi-application capabilities to InfiniBand environments by adding native 10 Gigabit Ethernet and Fibre Channel ports to the 40Gb/s InfiniBand fabric. BridgeX enables seamless, high-speed connectivity between cluster nodes and existing Ethernet-based enterprise infrastructures, as well as Fibre Channel-based storage area networks (SANs) and storage appliances.
"Mellanox BridgeX is an innovative technology that delivers high-performance and low-latency fabric consolidation technology over a 40Gb/s InfiniBand network to deliver true I/O unification for high-performance clustering and data center applications," said Wayne Augsburger, vice president of business development at Mellanox Technologies. "We are happy to see our channel partner Penguin, a leading cluster vendor, utilizing Mellanox end-to-end solutions to deliver fabric consolidation technology over InfiniBand."
"Penguin has delivered numerous Linux and x86 based cluster solutions to both the public sector and commercial customers. Many of our clusters include high-performance InfiniBand fabrics, however our customers are also investing in Ethernet infrastructure as well as Fibre Channel connected storage. Mellanox BridgeX connects these heterogeneous domains into one cohesive environment," said Jussi Kukkonen, director of hardware products, Penguin Computing. "In addition, BridgeX technology provides very real density and cost advantages by replacing multiple host adapters with a single 40Gb/s InfiniBand link, making multi-application, high-performance connectivity both compact and economical."