APPLICATIONS
Supercomputing Enters a Renaissance, Declares SGI
MOUNTAIN VIEW, Calif. -- SGI announced a differentiated strategy to meet the evolved needs of scientific and technical high-performance computing (HPC) users. This class of users is confronting escalating complexity from enormous data sources such as satellite scans, seismic monitoring, long-range global climate data and full-body medical scans. At the same time, users face a competitive necessity for improved productivity and increased collaboration. These evolved requirements are changing the way supercomputer systems are designed -- a supercomputing renaissance in which the HPC market is coalescing around a more efficient, balanced system design. The result is a shift in the way HPC is defined, from solely high-performance to include high-productivity computing. SGI will usher in this new era of supercomputing with a series of new product announcements over the coming weeks and months.

"The purpose of computing is insight, not numbers. SGI systems deliver to technical, scientific and creative users the benefits of deeper insights," said Bob Bishop, CEO of SGI. "This can only come from bringing the highest-performance computation, enormous data handling and interactive visualization into a unified system. And SGI has plans to extend this technology further to new platforms."

A Supercomputing Industry Renaissance

The revolutionary ideas of the historical Renaissance came about when the most brilliant minds of the time reached out to advance the relationships between disparate bodies of knowledge -- science, art, war, religion, economy and humanism. Today's renaissance thinkers are poised to bring about new breakthroughs in medicine, energy, entertainment and engineering. But with the massive explosion of data today, they need new tools to turn their explorations into meaningful insight.

"We are very impressed with the strategy SGI is putting in place," said Debra Goldfarb, group vice president, worldwide systems and servers, IDC. "The HPC industry is entering a new chapter in which economic conditions and fierce competition are forcing businesses to change the way they utilize their computing resources. Efficiently managing and using data in this highly competitive world is central to how companies process information and turn it into superior decisions. By providing a balanced approach to system design that combines innovative technology and a highly optimized development environment, SGI is clearly on the right track."

"While processors and disks have grown in capacity at Moore's Law rates, the size of data has grown twice as fast. The bottleneck is in the memory subsystem: the technology that interconnects these components is where latency problems occur on most systems," said Steve Miller, chief engineer, SGI. "SGI is focused on delivering a key enabling technology for productive HPC -- the NUMAflex architecture. This interconnect technology binds together all the components of a system with ultrafast results." Miller is also a principal investigator on the Defense Advanced Research Projects Agency (DARPA) high-performance computing systems program.
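Miller's memory-subsystem point can be made concrete with a small, generic benchmark. The sketch below is not SGI code and assumes nothing about NUMAflex; it is a STREAM-style "triad" loop, a common way to estimate sustained memory bandwidth, and the array size and timing method are illustrative choices only.

/*
 * Illustrative STREAM-style "triad" kernel (generic C, not SGI code).
 * Each iteration performs 2 floating-point operations but moves 24 bytes,
 * so the measured rate reflects sustained memory bandwidth rather than
 * the processor's peak floating-point speed.
 */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 20000000L   /* three double arrays, roughly 480 MB in total */

int main(void)
{
    double *a = malloc(N * sizeof *a);
    double *b = malloc(N * sizeof *b);
    double *c = malloc(N * sizeof *c);
    if (!a || !b || !c) {
        fprintf(stderr, "allocation failed\n");
        return 1;
    }

    for (long i = 0; i < N; i++) { b[i] = 1.0; c[i] = 2.0; }

    clock_t start = clock();
    for (long i = 0; i < N; i++)
        a[i] = b[i] + 3.0 * c[i];                    /* the triad kernel */
    double secs = (double)(clock() - start) / CLOCKS_PER_SEC;

    double gb = 3.0 * N * sizeof(double) / 1e9;      /* bytes read plus bytes written */
    printf("moved %.2f GB in %.3f s (~%.2f GB/s sustained, check=%g)\n",
           gb, secs, gb / secs, a[N - 1]);

    free(a); free(b); free(c);
    return 0;
}

On loops like this, most systems sustain only a fraction of their theoretical peak rate, which is the kind of imbalance that interconnect technologies such as NUMAflex aim to reduce.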
According to Bishop, "SGI is involved in more large-scale data management and visualization projects than ever, transforming complex data into high-fidelity images in a way that takes one's breath away. No one else in the IT industry has dedicated themselves entirely to breaking through the bonds of linear, two-dimensional thinking. No one else has broken out into interactive, immersive, multidimensional space and time. Only SGI."

SGI's expanded HPC strategy addresses the four driving factors that are shaping this supercomputing renaissance:

-- Emphasis on high-productivity computing with a balanced system design. DARPA has identified "productivity" as essential for a new generation of scalable, high-performance computing systems. In June 2002, SGI was selected by DARPA to participate in a concept study for a new vision of high-performance computing that emphasizes productivity rather than unattainable theoretical performance.

-- Demand for density, with ever higher performance in the same or smaller footprint. Scientists, engineers, researchers and analysts in HPC environments across a variety of markets, from life sciences to national defense, are under pressure to acquire more computing power in less space -- from computer labs to submarines to cockpits to eyepieces.

-- Instant access to terascale data. "Humankind will generate more original information over the next three years than was created in the previous 300,000 years combined," according to a new study from the School of Information Management and Systems at the University of California, Berkeley. In 1999 the world created about 1.5 exabytes of unique information -- 1.5 billion gigabytes, or the equivalent of 250 megabytes of new information for every man, woman and child in the world (a back-of-the-envelope check of this figure follows this list). According to the study, that number is expected to double every year for the foreseeable future, even without counting the multiple copies that most information generates. HPC organizations face a fundamental challenge in data management: fast access to enormous volumes of information.

-- The trend toward global collaborative networks for access to supercomputing resources. The development of grid computing, which gives scientists and engineers ready access to supercomputing resources, has been gaining acceptance in recent years. In addition, more organizations are giving users across a globally extended network the ability to visualize and interact with centrally processed data for the purpose of collaboration.
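The per-capita figure from the Berkeley study can be verified with simple arithmetic. The snippet below assumes a round 1999 world population of about six billion, a figure not stated in the release.

/* Back-of-the-envelope check of the "250 megabytes per person" figure.
 * The ~6 billion world population for 1999 is an assumed round number. */
#include <stdio.h>

int main(void)
{
    double total_bytes = 1.5e18;   /* 1.5 exabytes of unique information */
    double people      = 6.0e9;    /* approximate 1999 world population  */
    double per_person  = total_bytes / people;        /* bytes per person */
    printf("%.0f MB per person\n", per_person / 1e6); /* prints 250       */
    return 0;
}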