Internationally Renowned Institute to Employ Newly Announced SGI Supercomputer to Accelerate Cancer Research

SGI today announced that the Institute of Cancer Research (ICR) has selected SGI Altix UV, based on Intel Xeon processors (codenamed Nehalem-EX), to support its future life-saving research. The ICR joins the growing list of globally significant high performance computing (HPC) facilities embracing Altix UV as the future of open, high performance, big-memory supercomputing. Altix UV will provide the ICR with a massively scalable shared memory system to process its growing data requirements, including hundreds of terabytes of data for biological networks, MRI imaging, mass-spectrometry, phenotyping, genetics and deep-sequencing information across thousands of CPUs.

“The Altix UV supercomputer will allow extremely large, diverse data sets to be processed quickly, enabling our researchers to correlate medical and biological data on an unprecedented scale,” said Dr. Rune Linding, cellular and molecular logic team leader at the ICR. “Eventually, this will lead to network-based cancer models that will be used to streamline the process of drug development.”

SGI Altix UV supports up to 16 terabytes of global shared memory in a single system image. It remains highly efficient at scale for applications ranging from in-memory databases to a diverse set of data and compute-intensive HPC applications. As a result, Altix UV is the only hardware solution equipped to meet the vast data processing requirements of the ICR.

“Altix UV will allow HPC customers like the ICR to think differently and solve problems that cannot be solved on other HPC platforms,” said Rod Evans, vice president of sales for Northern Europe at SGI. “We are delighted to be working with the ICR to provide this unique technology required to process the huge amounts of cancer-related data generated in medical research.”

“Systems biology demands massive integration of extremely large data sets. Large shared memory should enable us to handle such data at a much higher speed and with a greater focus on the biological questions at hand,” said Professor Peter Rigby, chief executive of the ICR. “Altix UV should significantly help our work in this new, exciting area of cancer research.”

"With its upcoming SGI Altix UV deployment, the ICR will be at the forefront of HPC for biological research as it pursues treatments for cancer,” said Richard Dracott, general manager of high performance computing at Intel. "Altix UV will meaningfully transform HPC by drawing on large memory capacity, high core count and scalability of our forthcoming next generation Intel Xeon processor-based server platform, for the expandable server segments (codenamed Nehalem-EX)."

A European consortium has brought the power of grid computing to bear on problems ranging from the genetic origins of heart disease to the management of fish stocks and the reconstruction of ancient musical instruments.

A ‘grid’ is a network of high-powered computing and storage resources available to researchers wishing to carry out advanced number-crunching activities. Resources belong to individual universities, national and international laboratories and other research centres but are shared between them by mutual agreement.

In Europe the data is carried over the GÉANT research network, but the organisation that makes grid computing possible is EGEE-III, the third phase of an EU-funded project to create an infrastructure supporting European researchers who use grid computing resources.

“We support the users, we operate the infrastructure and we also develop the middleware that we use to bring all those resources together in a secure manner,” says Steven Newhouse, the project’s technical director. “We take computing and storage resources owned by individual institutions and provide a middleware layer of software that allows these resources to be shared securely over the international research networks.”

Coordinated at CERN near Geneva, EGEE links about 14,000 users at 350 sites in 55 countries, both within and outside Europe. Every day, an average of 350,000 computing jobs pass through the network.

Asking what if?

Although grid computing began in the high-energy physics community – and EGEE will be on hand to process the long-awaited data from the Large Hadron Collider – many other disciplines are now using EGEE to access the world’s most powerful computing facilities.

“We're seeing increasing use from the computational chemistry area, from materials science, the life sciences, environmental sciences and so forth,” says Newhouse.

What many of the applications have in common is the simulation of experiments that would take years or decades to do in the laboratory. A common theme is to study how complex molecules interact with each other, with many applications in the search for new vaccines and other drugs. 

“And this gives scientists a great way of asking 'what if?' questions and to narrow down the chemicals that they need to explore in the lab from hundreds of thousands to just a handful.”

Research published in Nature Genetics last March used EGEE to identify combinations of genes that predispose people to coronary artery disease. Scientists from the EU’s Cardiogenics project were able to find four out of more than 8.1 million possible combinations of genetic markers that were strongly associated with the disease.

Earthquakes and fish stocks

A group in Taiwan is using EGEE to model the effects of earthquakes on urban areas in the hope of learning how to keep damage to a minimum. “It's combining physical sciences and social sciences to do something really practical,” says Catherine Gater, the dissemination manager for EGEE-III.

Another project, AquaMaps, is using the grid to model the worldwide distribution of fish species. “Because climate change is affecting the patterns of where you might find marine species, fish stock management is quite an issue,” she says. “With everything changing so rapidly, the AquaMaps project is mapping where you can find particular species of fish at any one time.”

EGEE is also helping doctors to treat rare diseases through a project to create a worldwide image library. “It gives them almost instant access to medical images spread around the world but in a secure manner,” explains Gater. “That's the key to the grid, that you can share this data with other trusted sources and any patient information is not going to get out beyond the circle that it should.”

The benefits of EGEE have spread beyond the hard sciences and medicine into the humanities. The multidisciplinary ASTRA team in Italy used the grid to construct a digital model of an epigonion, a harp-like instrument used in ancient Greece. The virtual instrument was played in a concert in Naples last December.

New way to do science

From patterns of crime to fine-tuning of radiotherapy treatments, EGEE has brought grid computing to researchers and other professionals who would not have considered using it only a few years ago.

“One of the original motivations for this grid activity was that all this computing power could change the way that scientists do their research,” notes Newhouse. “What we're now seeing, a decade on, are the fruits of all this work.”

Next May, EGEE will come to an end and a new body, the European Grid Initiative (EGI), will take its place. Newhouse is its interim project director. “We want to move away from the short-term project model that has happened within EGEE to a model which is both more sustainable financially but also more sustainable and longer term for the users that increasingly depend upon this infrastructure."

Newhouse likens the grid to other scientific instruments that have changed the way we look at the world. “It’s like the invention of the microscope or the telescope. The grid is actually changing the way scientists think about doing their research and the questions they can pose.”

EGEE-III is an e-Infrastructures project funded under the ‘Research Infrastructures’ part of the Capacities programme of the EU’s Seventh Framework Programme for research.

Source: ICT Results site (http://cordis.europa.eu/ictresults)

Reserved Bandwidth on ESnet Makes Possible Multi-Gigabit Streaming Between Argonne, SC Conference in Portland

As both an astrophysicist and director of the San Diego Supercomputer Center (SDSC), Mike Norman understands two common perspectives on archiving massive scientific datasets. During a live demonstration at the SC09 conference, in which data from a simulation of cosmic structures in the early universe was streamed, Norman said that some center directors view their data archives as "black holes," where a wealth of data accumulates and needs to be protected.

But as a leading expert in the field of astrophysics, he sees data as intellectual property that belongs to the researcher and his or her home institution — not the center where the data was computed. Some people, Norman says, claim that it's impossible to move those terabytes of data between computing centers and where the researcher sits. But in a live demo in which data was streamed over a reserved 10-gigabit-per-second connection provided by the Department of Energy's ESnet (Energy Sciences Network), Norman and his graduate assistant Rick Wagner showed it can be done.

While the scientific results of the project are important, the success in building reliable high-bandwidth connections linking key research facilities and institutions addresses a problem facing many science communities.

“A lot of researchers stand to benefit from this successful demonstration,” said Eli Dart, an ESnet engineer who helped the team achieve the necessary network performance.  “While the science itself is very important in its own right, the ability to link multiple institutions in this way really paves the way for other scientists to use these tools more easily in the future.”

"This couldn't have been done without ESnet," Wagner said. Two aspects of the network came into play. First, ESnet operates the circuit-oriented Science Data Network, which provides dedicated bandwidth for moving large datasets. However, with numerous projects filling the network much of the time for other demos and competitions at SC09, Norman and Wagner took advantage of OSCARS, ESnet's On-Demand Secure Circuit and Advance Reservation System.

"We gave them the bandwidth they needed, when they needed it," said ESnet engineer Evangelos Chaniotakis. The San Diego team was given two two-hour bandwidth reservations on both Tuesday, Nov. 17, and Thursday, Nov. 19. Chaniotakis set up the reservations, then the network automatically reconfigured itself once the window closed.

At the SDSC booth, the live stream drew a standing-room-only crowd as the data was first shown as a 4,096³ cube containing 64 billion particles and cells. But Norman pointed out that the milky white cube was far too complex to absorb, then added that it was only one of numerous time-steps. In all, the data required for rendering came to about 150 terabytes.

In real time, the data was rendered on the Eureka Linux cluster at the Argonne Leadership Computing Facility and reduced to one-sixty-fourth of the original size for a 1,024³ representation, making it manageable enough to be explored interactively. The milky mesh was shown to contain galaxies and clusters linked by sheets and filaments of cosmic gases.
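The factor of sixty-four follows directly from the grid dimensions quoted above; a minimal sketch of the arithmetic (the resolutions are from the article, the rest is illustrative):

```python
# The 4,096^3 grid is rendered down to a 1,024^3 representation, so the
# linear resolution drops by 4x and the number of cells by 4^3 = 64x.

full_res = 4096
reduced_res = 1024

volume_reduction = (full_res // reduced_res) ** 3
print(f"volume reduction factor: {volume_reduction}x")   # 64x
```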

The project, Norman explained, is aimed at determining whether the signal of faint ripples in the universe known as baryon acoustic oscillations, or BAO, can actually be observed in the absorption of light by the intergalactic gas. It can, according to research led by Norman, who said they were the first to determine this. Such a finding is critical to the success of a dark energy survey known as BOSS, the Baryon Oscillation Spectroscopic Survey. The results of his proof-of-concept project, Norman said, "ensure that BOSS is not a waste of time."

Creating a simulation of this size, even using the petaflop Cray XT5 Kraken system at the University of Tennessee, can take three months to complete, as it is run in batches as time is allocated, Norman said. The data could then be moved in three nights to Argonne for rendering. The images were then streamed to the SDSC OptiPortal for display. Norman said the next step is to close the loop between the client side and the server side to allow interactive use. But the hard work — connecting the resources with adequate bandwidth — has been done, as evidenced by the demo, he noted.
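Moving the full dataset on that schedule implies a sustained rate comparable to the reserved circuit itself; a rough sketch, where the 150 TB total comes from the article and the twelve-hour transfer window per night is purely an assumption:

```python
# Sustained throughput needed to move ~150 TB "in three nights".
# The 150 TB figure is from the article; the 12-hour window per night
# is an illustrative assumption, not a detail from the demo.

dataset_tb = 150
nights = 3
hours_per_night = 12

seconds = nights * hours_per_night * 3600
gbps = dataset_tb * 1e12 * 8 / seconds / 1e9

print(f"~{gbps:.1f} Gbps sustained")   # ~9.3 Gbps
```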

But it wasn't just an issue of bandwidth, according to ESnet's Dart. "We did a lot of testing and tuning," said Dart.

ESnet is managed by Lawrence Berkeley National Laboratory (LBNL). Other contributors to the demo were Joe Insley of Argonne National Laboratory (ANL), who generated the images from the data, and Eric Olson, also of Argonne, who was responsible for the composition and imaging software. Network engineers Linda Winkler and Loren Wilson of ANL and Thomas Hutton of SDSC worked to set up and tune the network and servers before moving the demonstration to SC09. The project was a collaboration between ANL, CalIT2, ESnet at the Lawrence Berkeley National Laboratory, the National Institute for Computational Science, Oak Ridge National Laboratory and SDSC.

For more information about computing sciences at the Lawrence Berkeley National Laboratory, please visit: www.lbl.gov/cs

 

  • The agreement enables prediction of the behavior of ocean circulation and sea level over the period 2000-2050
  • It also supports predictions of optimal conditions for shellfish harvesting and of water quality
  • Within the scope of the agreement, the Interreg RAIA project, a collaboration between Galicia and northern Portugal, will install new oceanographic buoys along the Galician and Portuguese coasts and implement new predictive models for the ports of La Coruña, Vigo, Viana do Castelo and Leixoes
  • Among the beneficiaries are users in the seafood and fishing industries, for whom the new predictions can be tailored more closely to their needs


The president of Puertos del Estado, Fernando González Laxe, the Regional Minister of Environment, Territory and Infrastructures, Agustín Hernández, and the president of the Supercomputing Center of Galicia Foundation (CESGA), Ricardo Chapel, have signed a pioneering agreement that places Galicia at the forefront of oceanographic studies. Through this collaboration, the institutions will promote activities to enhance their scientific, technical and training capabilities.

The agreement will address the implementation and evaluation of an operational oceanographic model for the whole of Europe's Atlantic coast. In addition, it will allow the implementation of a regional ocean model for the Spanish Atlantic coast that makes it possible to predict the behavior of ocean circulation and sea level over the period 2000-2050 under various climate change scenarios.

Within the framework of this agreement and the Interreg RAIA project, a collaboration between Galicia and northern Portugal, new oceanographic buoys will be installed along the Galician and Portuguese coasts and new forecasting models will be implemented for the ports of La Coruña, Vigo, Viana do Castelo and Leixoes. Among the beneficiaries of the agreement are users such as the seafood and fishing industries, for whom the new predictions will be better suited to their needs.

MeteoGalicia is already working on improving the coastal observing network and on preparing specific predictions for the ports of La Coruña, Vigo, Viana do Castelo and Leixoes. Other operational results include predictions of optimal conditions for shellfish harvesting and of water quality.

Contents of the agreement
Through this agreement, the three institutions, which manage, maintain and disseminate ocean-meteorological data, participate in projects modeling ocean-meteorological parameters in the marine environment, and share common issues, needs and goals, will work together through joint development, the exchange of experience, shared computing capacity (the Finisterrae supercomputer) and shared data (buoys, tide gauges, models, etc.).

Under this agreement, an oceanographic forecasting system for the whole of Europe's Atlantic coast will be implemented and evaluated on CESGA's Finisterrae supercomputer. Running for three years and funded by the European Union, this project will develop an information service for all users in the maritime industry, providing basic oceanographic data and information about the European seas.

The agreement signed today in Santiago lays the groundwork for developing these lines of work across the three institutions. The Regional Ministry of Environment, Territory and Infrastructures, through MeteoGalicia, produces weather and sea forecasts for Galicia, running its prediction models at CESGA facilities and using boundary conditions provided by Puertos del Estado. It also operates a network of meteorological and oceanographic observation stations whose measurements complement the results obtained from the models.

For its part, Puertos del Estado operates a series of networks for measuring ocean-meteorological parameters (buoys, current meters, radar and tide gauges), as well as forecasting systems based on numerical modeling.

Advantages

The data collected offer a wealth of benefits for industry, the environment and our knowledge of the oceans. Examples already under way relate to the forecasting of severe weather and its consequences: the use of computers and simulators in meteorology makes predictions of weather and atmospheric behavior more reliable, provides more timely warning of the arrival of cyclones, hurricanes or cold waves, and improves preventive action by the security forces.

The fisheries sector also uses simulation models, which help make better use of coastal resources. If the currents in a coastal area are known, along with how they are influenced by tides and by interactions with the climate, the movement of plankton, algae or fish stocks can be determined, and their exploitation can be measured and managed rationally. Marine and ecological dynamics are likewise a fundamental field of application for simulation, because the behavior of a discharge, the dispersal of a pollutant, and the effects of locating an industry in a coastal area or of port activities can be known in advance.

CESGA
The Supercomputing Center of Galicia (CESGA), which reports to the Regional Ministry of Economy and Industry of Galicia and to the Spanish National Research Council (CSIC), is an organization with a 17-year history. Its purpose is to provide and promote high-performance computing and communications services to the Galician and CSIC research communities, as well as to companies and institutions that request them.


The Finisterrae supercomputer, managed by CESGA and recently included in the Ministry of Science and Innovation's Map of Singular Scientific and Technological Facilities (ICTS), is an integrated system of 144 shared-memory nodes with a total of 20,000 GB of main memory, 390,000 GB of disk and 2,580 Itanium processors. The acquisition of Finisterrae was supported by the European Regional Development Fund (ERDF).
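Purely as an illustration of the scale of those figures (assuming, for the sake of the sketch, that memory and processors are spread evenly across the 144 nodes, which may not be the case on the real machine), the per-node averages work out as follows:

```python
# Per-node averages implied by the Finisterrae totals quoted above,
# assuming an even spread over the 144 shared-memory nodes (illustrative only;
# the actual configuration may be heterogeneous).

nodes = 144
total_memory_gb = 20_000
total_processors = 2_580

print(f"~{total_memory_gb / nodes:.0f} GB of main memory per node")   # ~139 GB
print(f"~{total_processors / nodes:.0f} Itanium processors per node") # ~18
```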

Leading UML modelling tool vendor supports seminal workshop on interoperability best practice.

 

Sparx Systems (www.sparxsystems.com) will sponsor the CEN/TC 287 Workshop on Interoperability Best Practice, to be held on 14th September 2010 in St. George's Bay, Malta. The workshop will be held in conjunction with the 27th plenary meeting of CEN/TC 287. Enterprise Architect, Sparx Systems' flagship modeling tool, is extensively deployed by the global geospatial community for the development of industry reference models.

CEN/TC 287 is concerned with standardization in the field of digital geographic information for Europe, a task that is undertaken through close co-operation with ISO/TC 211 and the Open Geospatial Consortium Inc (OGC).

Commenting on the workshop, Ken Harkin, Business Development Manager for Sparx Systems said, "The identification of interoperability reference material which can be captured in a central repository and made available to future EU geospatial projects as best practices, is essential to ongoing results improvement. As a technology provider to the global standards community, Sparx Systems supports this initiative and is pleased to sponsor the workshop."

He added, "CEN/TC 287, through maintenance of key project deliverables and components will build on existing geospatial standards and in doing so will play a pivotal role in the concerted adoption of best practices and in shaping the evolution of the industry".

Martin Ford, CEN/TC 287 Secretary noted, "The significance of this workshop which is open to the public, is highlighted by an impressive line up of industry thought leaders. These speakers will share their ideas, experience and insight on strategies to use or build upon standards and specifications, in the domain of geographic information. CEN/TC 287 is very appreciative of Sparx Systems support in making this important discussion possible."

Further information about the CEN/TC 287 Interoperability Workshop is available from: www.gistandards.eu
