Platform Computing Announces Commercial Support for Apache Hadoop Distributed File System (HDFS)
- Written by: Webmaster
- Category: SCIENCE
Platform Computing announced it has signed the Apache Corporate Contributor License Agreement, allowing the company to contribute to the Apache Software Foundation's development of the open-source Hadoop Distributed File System (HDFS). Platform Computing and its developers will contribute to the ongoing development of HDFS, with a focus on commercial support for the recently announced Platform MapReduce, an enterprise-class, distributed runtime engine for MapReduce applications. For more details on Platform MapReduce's key features and capabilities, please see today's related announcement.
"Platform is committed to the growth of a robust enterprise-ready Hadoop ecosystem and excited to be an active member of the open source development community," said Rohit Valia, Director, HPC & Analytics Solutions, Platform Computing. "The Hadoop community is leading the charge in big data computing, and Platform intends to support these efforts, drawing from our 18-year history of superior support for workload management in high performance computing environments, including complex clusters for government and some of the largest financial services, retail, life sciences and manufacturing companies worldwide."
The data explosion has shifted the computing paradigm, requiring applications to move closer to the data in order to effectively support analytics and other line-of-business functions. Platform Computing recognizes that global enterprises need the support of the IT development community to tackle this challenge, which led to the company's agreement with the Apache Software Foundation to ensure that the open-source development of HDFS retains full compatibility with its commercial products, such as Platform MapReduce. This will enable the company to provide the best-in-class solutions IT needs to effectively manage and capitalize on the data explosion. When integrated with Apache HDFS, Platform MapReduce manages the movement of applications closer to the data, delivering enterprise-class manageability and scale, high resource utilization and availability, ease of operation, and support for multiple applications.
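For readers unfamiliar with the programming model referenced above, the sketch below is the canonical word-count job written against the standard Apache Hadoop MapReduce API, reading its input from and writing its output to HDFS paths. It illustrates the general "move the application to the data" pattern (the framework schedules map tasks near the HDFS blocks they process); it is not Platform MapReduce-specific code, since that runtime's interfaces are not described in this announcement, and the class and path names are the standard illustrative ones.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

      // Mapper: emits (word, 1) for every token in each input line read from HDFS.
      public static class TokenizerMapper
           extends Mapper<Object, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
          StringTokenizer itr = new StringTokenizer(value.toString());
          while (itr.hasMoreTokens()) {
            word.set(itr.nextToken());
            context.write(word, one);
          }
        }
      }

      // Reducer: sums the per-word counts produced by the map phase.
      public static class IntSumReducer
           extends Reducer<Text, IntWritable, Text, IntWritable> {

        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
          int sum = 0;
          for (IntWritable val : values) {
            sum += val.get();
          }
          result.set(sum);
          context.write(key, result);
        }
      }

      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class); // pre-aggregate locally before the shuffle
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // Input and output are HDFS locations; map tasks run close to the blocks they read.
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }

A job of this form is typically packaged as a JAR and submitted to the cluster's MapReduce runtime, which handles scheduling, data locality, and fault tolerance; those runtime responsibilities are the layer Platform MapReduce addresses in this announcement.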
"Platform MapReduce, coupled with HDFS integration, is a powerful new tool allowing the enterprise to not only manage the data explosion but also maximize ROI," said Dave Henry, Senior Vice President, Enterprise Solutions, Pentaho Corporation. "Pentaho enables business to effectively leverage Apache Hadoop with solutions for end-to-end business intelligence and data integration, and we're thrilled about the synergies made possible through Platform Computing's addition to the community."