San Diego Supercomputer Center Brings On-Demand Supercomputing to the Masses

Makes new, easy-to-use Star-P-enabled parallel cluster available to users across the United States: The San Diego Supercomputer Center (SDSC) is teaming with Interactive Supercomputing, Inc. (ISC) to let potentially thousands of new users take advantage of SDSC's high performance computing resources by making the technology easier to use and available on demand. SDSC has made one of its powerful parallel clusters, running ISC's Star-P interactive supercomputing platform, available to scientists, engineers, students and other researchers across the country. As one of six National Science Foundation (NSF)-funded supercomputing centers, the facility provides high performance computing systems, tools and support at no cost to researchers.

While the notoriously complex programming requirements of parallel systems are beyond the capabilities of the average researcher, Star-P enables them to work with the cluster transparently using familiar desktop tools. Users can code models and algorithms on their desktop computers in popular applications such as MATLAB, Python and R, yet run them instantly and interactively over the Internet on SDSC's cluster. Star-P eliminates the need to re-program applications to run on parallel systems, a task that typically takes months for large, complex problems. Programming that took months can now be done in days, and simulation runs that took days can be done in minutes.

"We're very excited about offering this new capability," said Nancy Wilkins-Diehr of SDSC's scientific computing division. "We see a lot of MATLAB, Python, and R usage across the scientific and engineering community. Making our high performance computing resources available to hosts of new researchers, while allowing them to work in their familiar development environments, should yield huge productivity gains."

The 128-processor Star-P-enabled Sun cluster is already being tapped by dozens of researchers and students to accelerate scientific discovery.
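The core idea above, that the same desktop model can run serially or across many workers without an MPI rewrite, can be illustrated with a small Python sketch. This is only an analogy built on the standard library's `concurrent.futures`, not Star-P's actual interface; the function names and the Monte Carlo example are hypothetical.

```python
# Hypothetical analogy for the "no MPI rewrite" idea: the same model
# function runs unchanged whether it executes serially on a desktop
# or is farmed out to a pool of parallel workers.
import random
from concurrent.futures import ProcessPoolExecutor


def count_hits(task):
    """Count random points landing inside the unit quarter-circle.

    `task` is a (seed, n) pair; seeding makes each chunk repeatable.
    """
    seed, n = task
    rng = random.Random(seed)
    return sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)


def estimate_pi(total_points, workers=1):
    """Monte Carlo estimate of pi, split evenly across `workers` chunks."""
    chunk = total_points // workers
    tasks = [(seed, chunk) for seed in range(workers)]
    if workers == 1:
        hits = [count_hits(tasks[0])]            # desktop-style serial run
    else:
        with ProcessPoolExecutor(max_workers=workers) as pool:
            hits = list(pool.map(count_hits, tasks))  # same code, run in parallel
    return 4.0 * sum(hits) / (chunk * len(tasks))


if __name__ == "__main__":
    # Identical model code; only the worker count changes.
    print(estimate_pi(200_000, workers=1))
    print(estimate_pi(200_000, workers=4))
```

The point of the sketch is that `count_hits` never mentions processes, message passing, or data distribution; the parallelism lives entirely in the runtime that maps the tasks, which is the kind of separation the release credits Star-P with providing on a much larger scale.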
For example, researchers at the University of Maryland are using the system for 4-D MRI medical image processing in a quest to improve radiation therapies. Students at Purdue are applying stochastic computing models to optimize how companies handle their product inventories. And postdoctoral researchers at the University of California San Diego are running eye imaging applications on Star-P in the fight against glaucoma.

The cluster is also being used as a teaching tool. An NSF-sponsored summer program pairs students with the Center's computational scientists to integrate high performance computing into manufacturing and industrial processes. Star-P is a pivotal tool in making the parallel system easily accessible to students.

Wilkins-Diehr said the biggest breakthrough the new computing resource provides is in fostering and jumpstarting research that otherwise wouldn't get done. "In the past, we saw a bit of research stagnation as PCs became more powerful. Users were doing just enough work on their desktops to get by, because they didn't have the time or expertise to invest in parallelizing their legacy code in MPI," she said. "They can now envision new experiments and do all sorts of analyses that they wouldn't have considered previously. If you can suddenly do something 100 times faster, it naturally inspires you into new areas of research that desktop inertia once ruled out."

To hear more from Wilkins-Diehr and watch a video on how the San Diego Supercomputer Center is using Star-P, visit its Web site.