Premier John Brumby has announced funding for two high-performance computers at the Australian Synchrotron at Monash University.

Mr Brumby said the Victorian Government had invested almost $4 billion in strengthening technology and innovation in Victoria, building modern, cutting-edge infrastructure that is attracting the best researchers and innovators to the state.

The $8.3 million High Performance Computing (HPC) facility will take Victoria’s use of computational imaging to the next level.

“I am pleased to announce further funding for the newest high-tech research infrastructure in Victoria - $800,000 towards the new MASSIVE High Performance Computing facility,” he said.

“The Multi-modal Australian Sciences Imaging and Visualisation Environment - or MASSIVE - is the first facility of its kind in Australia, bringing together two high-performance computers at the Australian Synchrotron.

“MASSIVE will be a centre of excellence for computational imaging and visualisation and offer researchers from a range of fields — including biomedicine, astronomy, engineering, geoscience and climate studies — unparalleled capacities to construct and view visualisations of the objects of their investigations.

“The facility will enable scientists to create, view and analyse high-resolution scientific images and 3D models previously too large to visualise.”

Mr Brumby said the Victorian Government was committed to world-class innovative infrastructure that drives Victoria’s economy.

“We are committed to investing in Victoria’s supercomputers, collaboration tools and advanced gigabit networks to enhance the accessibility of costly scientific instruments — such as synchrotrons, gene sequencers, telescopes and sensor networks,” Mr Brumby said.

“By investing in these facilities we are boosting the State’s capacity to turn new ideas and technologies into valued products, services and solutions.”

MASSIVE is a partnership between some of the country's leading technology providers and research institutions — including the Australian Synchrotron, CSIRO, Victorian Partnership for Advanced Computing and Monash University.

Nautilus, the powerful computer for visualizing and analyzing large datasets at the Remote Data Analysis and Visualization Center (RDAV), goes into full production on Sept. 20. Managed by UT Knoxville and funded by a grant from the National Science Foundation (NSF), RDAV and Nautilus provide scientists with cutting-edge capabilities for analyzing and visualizing massive quantities of data. As Nautilus goes into service, RDAV will serve researchers in a wide range of fields.

“The National Institute for Mathematical and Biological Synthesis (NIMBioS) and RDAV have already initiated new collaborations to enhance the use of visualization for large datasets arising from field observations and from mathematical model results,” said Louis Gross, professor of ecology and evolutionary biology and of mathematics at UT Knoxville and director of NIMBioS. “Our objective is to increase the ability of biologists to interpret and analyze complex, multivariate data to address fundamental and applied questions in the life sciences.”

“For a scientist, visualization is more than just generating pretty pictures,” said astrophysicist Bronson Messer of Oak Ridge National Laboratory (ORNL) and UT Knoxville. “As our simulations grow larger and larger, visualization and the associated data analysis are absolutely essential in producing scientific insight from computation. Nautilus, and the way it is being integrated into the computational ecosystem at the National Institute for Computational Sciences, looks like a very promising avenue for us to increase the amount of scientific insight we obtain from simulations on Kraken.”

In addition to addressing scientific problems in the life sciences and astrophysics, Nautilus will be used to process data spanning many other research areas. These include visualizing data results from computer simulations with many complex variables, such as weather or climate modeling; analyzing large amounts of data coming from experimental facilities like ORNL’s Spallation Neutron Source; and aggregating and interpreting input from a large number of sensors distributed over a wide geographic region. The computer will also have the capability to study large bodies of text and aggregations of documents.

With 1,024 cores and four terabytes of memory, Nautilus can process and analyze large volumes of data in ways unlike those possible on previous systems. Manufactured by SGI as part of its UltraViolet product line, Nautilus features a shared-memory architecture with a single system image, so every core can address the entire memory. This configuration gives researchers great flexibility to apply the machine's computing power to larger amounts of data in ways that are impossible on many other high-performance computing systems. RDAV has installed a 1.1-petabyte filesystem on Nautilus in anticipation of these vast amounts of data.
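As a rough illustration of what that shared-memory model means in practice, the sketch below (a hypothetical example, not RDAV's actual software; it assumes Python with NumPy, neither of which the article specifies) has several worker processes compute statistics over slices of one large in-memory array, reading it in place rather than partitioning and shipping the data across nodes as a distributed-memory cluster would.

    # Illustrative sketch only (hypothetical, not RDAV's software): several worker
    # processes analyze slices of one large in-memory array in place, the way a
    # shared-memory system lets every core address the same data without copying it.
    import numpy as np
    from multiprocessing import Pool
    from multiprocessing.shared_memory import SharedMemory

    SHAPE, DTYPE = (10_000_000,), np.float64  # stand-in for a far larger dataset

    def slice_stats(task):
        """Attach to the shared block and summarize one slice without copying it."""
        shm_name, start, stop = task
        shm = SharedMemory(name=shm_name)
        data = np.ndarray(SHAPE, dtype=DTYPE, buffer=shm.buf)
        chunk = data[start:stop]
        stats = (float(chunk.min()), float(chunk.max()), float(chunk.mean()))
        shm.close()
        return stats

    if __name__ == "__main__":
        nbytes = int(np.prod(SHAPE)) * np.dtype(DTYPE).itemsize
        shm = SharedMemory(create=True, size=nbytes)
        data = np.ndarray(SHAPE, dtype=DTYPE, buffer=shm.buf)
        data[:] = np.random.default_rng(0).normal(size=SHAPE)  # pretend simulation output

        edges = np.linspace(0, SHAPE[0], 9, dtype=int)  # carve the array into 8 slices
        tasks = [(shm.name, int(a), int(b)) for a, b in zip(edges[:-1], edges[1:])]
        with Pool(processes=8) as pool:
            print(pool.map(slice_stats, tasks))  # one (min, max, mean) tuple per worker

        shm.close()
        shm.unlink()

On a distributed-memory cluster the array would instead have to be split across nodes and intermediate results exchanged over the network; avoiding that bookkeeping is the flexibility the single-system-image design described above provides.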

Founded in the second half of 2009, the RDAV center is now fully staffed and ready to support scientists with their data analysis and visualization challenges. In addition to configuring and fine-tuning the Nautilus hardware, the team has been developing and refining new and existing software to address the latest scientific problems and to engage the scientific community.

“I’m very excited about standing up this machine for the NSF TeraGrid, as it’s going to provide much-needed capability for understanding complex datasets from Kraken and other sources,” said RDAV Director Sean Ahern. “And as datasets continue to grow, the shared memory nature of the SGI is a fertile ground for new analysis and visualization research for very large datasets.”

RDAV is a partnership between UT Knoxville, Lawrence Berkeley National Laboratory, the University of Wisconsin, the National Center for Supercomputing Applications at the University of Illinois and ORNL. RDAV is funded by NSF through the American Recovery and Reinvestment Act of 2009 and is part of the TeraGrid eXtreme Digital Resources for Science and Engineering.

For more information, visit http://rdav.nics.tennessee.edu/.

The following is the text of a message Oak Ridge National Laboratory Director Thom Mason sent to ORNL staff Tuesday regarding the Department of Energy's renewal of UT-Battelle's contract to manage the laboratory.

Director's Message, Tuesday, March 23, 2010

This morning we received news from Energy Secretary Steven Chu that the Department of Energy has awarded a five-year extension of UT-Battelle's contract to manage Oak Ridge National Laboratory.  The Secretary, who was joined by Governor Phil Bredesen and Representatives Zach Wamp and Lincoln Davis, announced the news before several hundred staff gathered in the ORNL Conference Center.

First of all, I am grateful to Secretary Chu and the Department of Energy for their confidence in UT-Battelle's leadership during one of the most exciting periods in the Laboratory's history.  I am also grateful to the University of Tennessee and to Battelle, who together make up one of the most successful partnerships in the national laboratory system.  Both UT and Battelle contributed in unique ways to the accomplishments that marked our first ten years at ORNL.

Most of all, I am grateful to the extraordinary staff at ORNL who in the end were responsible for delivering DOE's science mission. Since more than one-half of the staff currently at ORNL were not here ten years ago, it may be hard for many of you to imagine how far we have come since that first day in April 2000.  The following are just a handful of examples.

-What we know today as the "Quad" was a massive parking lot, with a chain link fence surrounding an infrastructure that was old, unattractive and expensive to maintain.  Working with the support of DOE and the state of Tennessee, UT-Battelle undertook a $350 million plan to transform ORNL from one of the oldest labs into the most modern in the DOE system.

-Ten years ago the SNS had just broken ground, with many wondering if we were capable of delivering a $1.4 billion project.  Our computational program was housed in cramped and outdated facilities, without a machine listed among the top 100 computers in the world.  We not only delivered the SNS on time, on scope and on budget, we also developed a program of high-performance computing that today boasts two of the world's top three machines located in a state-of-the-art facility.

-In April 2000 climate and bioenergy were small parts of the lab agenda.  By leveraging our partnership with UT and the state of Tennessee, ORNL is today among the national leaders in these two emerging fields of research.

-In April 2000, our rate of accidents was high, and our rate of growth was low.  Through the great work of our staff, we now have a safety record among DOE's best, a research portfolio that has tripled our annual budget to more than $1.65 billion, and a staff that has grown by nearly a thousand.

As proud as I am about our success over the past ten years, I am even more excited about the potential that lies ahead for UT-Battelle and Oak Ridge National Laboratory.  Our partnership with the University of Tennessee and the state continues to mature, as evidenced by the new graduate program in energy sciences established by the state legislature in January.  The next decade will witness more collaborative research with UT faculty and greater numbers of UT graduate students taking advantage of the facilities at ORNL. Likewise, we are part of a Battelle family of labs that offers an increasing range of opportunities for collaboration and growth with an emphasis on moving our science and technology into the marketplace.

In some respects, we have set a new bar of performance.  The challenge to UT-Battelle, and to each of us who work at ORNL, is to meet this higher standard in the delivery of our scientific mission, the operation of our Laboratory, and our leadership in the local community.

Again, I want to thank the Department of Energy for providing us the chance to join in solving some of the most important scientific challenges of our time.  Most of all, I want to thank the staff of ORNL, who have made our Laboratory one of the world's great centers for scientific discovery.

Thom

Led by a number of scientific breakthroughs and operational milestones at Oak Ridge National Laboratory, UT-Battelle has again earned high performance ratings from the Department of Energy.

The annual DOE "report card" graded UT-Battelle's management performance with "A-" scores in all eight evaluation categories.  The report covers UT-Battelle's performance from October 2008 through September 2009.  Last year's scores included seven "A-" grades and one "B+."

The 2009 assessment was based on three key measures related to ORNL's scientific research programs and five criteria that rate efficiency of the lab's operations.  In a letter to ORNL Director and UT-Battelle CEO Thom Mason, the DOE's Oak Ridge Operations Manager Gerald Boyd said, "You and your staff are to be congratulated for achieving a high level of performance in the management and operation of ORNL."

Lab officials said UT-Battelle takes pride in combining the highest levels of science with good management.  "Every day for nearly ten years, our goal has been to deliver science that will make a lasting difference in people's lives," Mason said. "These high marks are evidence of DOE's confidence that our staff are achieving that goal."

The management fee awarded under UT-Battelle's contract is based on its scores in the DOE performance rating.  UT-Battelle's total fee for operating ORNL in fiscal year 2009 is $10,058,000, or 94 percent of the maximum fee.

A number of ORNL achievements in 2009 contributed to the laboratory's high performance scores.  Included among the laboratory's science highlights in 2009:

- ORNL researchers won eight prestigious R&D 100 awards, given to discoveries with high potential for commercial application.
- Scientists developed stainless steels with an upper-temperature corrosion limit up to 400 degrees Fahrenheit higher than that of conventional stainless steels.
- Researchers in the BioEnergy Science Center, created in 2007 to accelerate basic research toward the development of cellulosic ethanol as a cost-effective alternative fuel, produced 80 science publications and 16 invention disclosures.
- Materials scientists completed successful tests of a new generation of high-temperature superconducting cable that can transmit more power in less space.
- Researchers in ORNL's nuclear energy program fabricated a coated particle fuel that set a world record for advanced high-temperature gas-cooled reactor fuel.

Operational high points at the lab over the past year include:

- Delivery on time and budget of the Department of Energy's Leadership Class Facility for high-performance computing, featuring Jaguar, the world's most powerful computer, capable of 1,700 trillion calculations per second.
- The Spallation Neutron Source, already the world's most powerful facility for pulsed neutron scattering science, in September became the first pulsed spallation neutron source to break the one-megawatt power barrier.
- Managing the U.S. role in the ITER project and working with the project's international members.
- Breaking ground in May on a $95 million Chemical and Materials Sciences facility, the Department of Energy's first Science Laboratories Infrastructure construction project supported by funds from the American Recovery and Reinvestment Act.
- Energy efficiency improvements expected to reduce energy consumption by 50 percent, water usage by 23 percent, and fossil fuel use by more than 80 percent.

Oak Ridge National Laboratory is managed by UT-Battelle for the Department of Energy Office of Science.

The National Science Foundation (NSF) has awarded the University of Tennessee, Knoxville, $10 million to develop a computer system that will interpret the massive amounts of data created by the current generation of high-performance computers in the agency's national computer grid.
 
Sean Ahern, a computer scientist with UT Knoxville's College of Engineering and Oak Ridge National Laboratory, will create and manage the Center for Remote Data Analysis and Visualization, which will store and examine data generated by computer simulations like those used for weather and climate, large experimental facilities like the Spallation Neutron Source (SNS), and widely distributed arrays of sensors.
 
"Next-generation computing is now this-generation computing," Ahern said. "What's lacking are the tools capable of turning supercomputer data into scientific understanding. This project should provide those critical capabilities."
 
Ahern and colleagues at UT's National Institute for Computational Sciences will develop Nautilus, a shared-memory computer system that will have the capability to store vast amounts of data, all of which can be accessed by each of its 1,024 processor cores. Nautilus will be one of the largest shared-memory computers in the world, Ahern said. It will be located alongside UT's other supercomputer, Kraken, which is the world's most powerful academic supercomputer.
 
Nautilus will be used for three major tasks: visualizing data results from computer simulations with many complex variables, such as weather or climate modeling; analyzing large amounts of data coming from experimental facilities like the SNS; and aggregating and interpreting input from a large number of sensors distributed over a wide geographic region. The computer also will have the capability to study large bodies of text and aggregations of documents.
 
"Large supercomputers like Kraken working on climate simulation will run for a week and dump 100 terabytes of data into thousands of files. You can't immediately tell what's in there," Ahern said. "This computer will help scientists turn that data into knowledge."
 
Nautilus will be part of the TeraGrid XD, the next phase of the NSF's high-performance network that provides American researchers and educators with the ability to work with extremely large amounts of data.
 
Like Kraken, Nautilus will be part of UT's Joint Institute for Computational Sciences on the ORNL campus.
 
The new machine, manufactured by high-performance computing specialist SGI, will employ the company's new shared-memory processing architecture. It will have four terabytes of shared memory and 16 graphics processing units. The system will be complemented with a one-petabyte file system.
 
Through Ahern and co-principal investigator Jian Huang, UT Knoxville is the lead institution on the project. ORNL will provide statistical analysis support, Lawrence Berkeley National Laboratory will provide remote visualization expertise, the National Center for Supercomputing Applications at the University of Illinois will deploy portal and dashboard systems, and the University of Wisconsin will provide automation and workflow services. Huang is on the faculty of UT Knoxville's Department of Electrical Engineering and Computer Science.
 
Nautilus will be joined by another NSF facility at the University of Texas that will use another data-access technique for analysis. The NSF funded both projects under the American Recovery and Reinvestment Act of 2009.
 
"For many types of research, visualization provides the only means of extracting the information to understand complex scientific data," said Barry Schneider, NSF program manager for the project. "The two awards, one to the Texas Advanced Computing Center at the University of Texas at Austin and the other to NICS at the University of Tennessee, will be deploying new and complementary computational platforms to address these challenges."
