GOVERNMENT
IBM Deep Computing On Demand Open for Business
ARMONK, N.Y. -- IBM today ushered in the era of deep computing on demand with the opening of its first facility for delivering supercomputing power to customers over the Internet, freeing them from the fixed costs and management responsibility of owning a supercomputer. Additional facilities are planned nationally and internationally.
Delivering both Linux and UNIX supercomputing power, IBM’s Deep Computing on demand offers scalable, secure systems that are accessible from anywhere in the world.
One of the first to use Deep Computing on demand is GX Technology Corporation, which produces high-resolution subsurface images from large volumes of seismic data. These subsurface images provide vital information to petroleum exploration companies in their search for new oil and natural gas deposits. The service will enable the Houston-based company to expand the scope and the number of projects it can effectively handle around the world.
“With dry hole costs in the range of $5 million to $60 million each, companies exploring for oil and gas reserves use our high-resolution subsurface images to significantly reduce their drilling risk. Shortening project cycle times allows our clients to gain a significant additional benefit from our services,” said Mick Lambert, president and CEO, GX Technology. “IBM’s Deep Computing on demand gives us the power to dramatically reduce project cycle times and increase our project capacity, while reducing infrastructure and operating costs.”
With Deep Computing on demand, customers can:
• avoid large IT capital outlays and long-term depreciation or lease schedules. This is especially important for companies with short-term projects
• bring to bear, on a temporary basis, massive amounts of compute power that would otherwise be unaffordable and that can provide strategic insight
• lower overall operating costs
• improve price/performance for compute-intensive applications and the processing of massive amounts of data
• off-load system maintenance to IBM
• use the latest technologies to maximize performance
• meet the urgent computational needs of new business opportunities that would otherwise be cost-prohibitive
The new Deep Computing on demand facility is located in a specially secured section of IBM’s legendary Poughkeepsie plant, where the mainframe computer was invented more than 30 years ago. Initially, the system consists of a cluster of IBM eServer xSeries Intel-based Linux systems and pSeries UNIX servers with related disk storage. GXT has selected an IBM Linux cluster. Designed for scalability to meet increased demand, the Deep Computing on demand facility is also expected to incorporate a variety of blade and AMD technologies over time. The system is accessible to customers worldwide via a secure VPN connection over the Internet.
The service is expected to find favor with a broad spectrum of companies that have peaks and valleys in their need for supercomputing power. These include Hollywood studios that use supercomputing power to create animated movies, as well as life sciences companies conducting genomic and drug discovery research. Financial services organizations, government agencies and national research laboratories are also likely customers.
“The on demand computing model allows customers to avoid technological risk as well as the financial risk associated with ownership,” said David Turek, vice president, IBM Deep Computing. “Deep Computing on demand fulfills those two goals, freeing customers to focus on growing their business.”