Computer scientists at Sandia National Laboratories in Livermore, Calif., have for the first time successfully demonstrated the ability to run more than a million Linux kernels as virtual machines.

The achievement will allow cyber security researchers to more effectively observe behavior found in malicious botnets, or networks of infected machines that can operate on the scale of a million nodes. Botnets, said Sandia’s Ron Minnich, are often difficult to analyze since they are geographically spread all over the world.

Sandia scientists used virtual machine (VM) technology and the power of the lab's Thunderbird supercomputing cluster for the demonstration.

[Photo caption: Sandia National Laboratories computer scientists Ron Minnich (foreground) and Don Rudish (background), who ran more than a million Linux kernels as virtual machines on the Thunderbird supercomputing cluster. Photo by Randy Wong]

Running a high volume of VMs on one supercomputer — at a similar scale as a botnet — would allow cyber researchers to watch how botnets work and explore ways to stop them in their tracks. “We can get control at a level we never had before,” said Minnich.

Previously, Minnich said, researchers had only been able to run up to 20,000 kernels concurrently (a “kernel” is the central component of most computer operating systems). The more kernels that can be run at once, he said, the more effective cyber security professionals can be in combating the global botnet problem. “Eventually, we would like to be able to emulate the computer network of a small nation, or even one as large as the United States, in order to ‘virtualize’ and monitor a cyber attack,” he said.

A related use for millions to tens of millions of operating systems, Sandia’s researchers suggest, is to construct high-fidelity models of parts of the Internet.

“The sheer size of the Internet makes it very difficult to understand in even a limited way,” said Minnich. “Many phenomena occurring on the Internet are poorly understood, because we lack the ability to model it adequately. By running actual operating system instances to represent nodes on the Internet, we will be able not just to simulate the functioning of the Internet at the network level, but to emulate Internet functionality.”

A virtual machine, originally defined by researchers Gerald J. Popek and Robert P. Goldberg as “an efficient, isolated duplicate of a real machine,” is essentially a set of software programs running on one computer that, collectively, acts like a separate, complete unit. “You fire it up and it looks like a full computer,” said Sandia’s Don Rudish. Within the virtual machine, one can then start up an operating system kernel, so “at some point you have this little world inside the virtual machine that looks just like a full machine, running a full operating system, browsers and other software, but it’s all contained within the real machine.”
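As a rough illustration of what "firing it up" can look like in practice, the sketch below boots a single Linux kernel inside a virtual machine using QEMU/KVM from Python. The article does not describe Sandia's actual tooling; the use of QEMU, the kernel and initrd file names, and the memory size are assumptions made purely for illustration.

```python
# Minimal sketch: boot one Linux kernel inside a virtual machine with QEMU/KVM.
# Assumptions (not from the article): QEMU is installed, and vmlinuz/initrd.img
# are a kernel and initramfs pair available on the host.
import subprocess

def boot_vm(kernel="vmlinuz", initrd="initrd.img", mem_mb=64):
    """Start one VM that boots the given kernel; returns the QEMU process handle."""
    cmd = [
        "qemu-system-x86_64",
        "-enable-kvm",                 # use hardware virtualization if available
        "-m", str(mem_mb),             # small memory footprint per guest
        "-kernel", kernel,             # boot this kernel image directly
        "-initrd", initrd,
        "-append", "console=ttyS0",    # send the guest console to the serial port
        "-nographic",                  # no GUI; the guest console appears on stdout
    ]
    return subprocess.Popen(cmd)

if __name__ == "__main__":
    boot_vm().wait()
```

Booting the guest directly from a kernel image rather than a full disk image keeps each guest small, which is what makes packing hundreds of them onto one physical node plausible.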

The Sandia research, two years in the making, was funded by the Department of Energy’s Office of Science, the National Nuclear Security Administration’s (NNSA) Advanced Simulation and Computing (ASC) program and by internal Sandia funding.

To complete the project, Sandia utilized its Albuquerque-based 4,480-node Dell high-performance computer cluster, known as Thunderbird. To arrive at the one million Linux kernel figure, Sandia’s researchers ran 250 VMs, each booting its own kernel, on every one of Thunderbird’s 4,480 physical machines, for a total of 1,120,000 kernels. Dell and IBM both made key technical contributions to the experiments, as did a team at Sandia’s Albuquerque site that maintains Thunderbird and prepared it for the project.
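The arithmetic behind the headline number is straightforward, and a per-node launcher in the same spirit is only a loop. The sketch below is hypothetical: the figures come from the article, but the launcher and the boot_vm() callable (like the one sketched earlier) are illustrative assumptions, not Sandia's launch scripts.

```python
# Scale of the demonstration (figures from the article) plus a hypothetical
# per-node fan-out loop; the launcher is illustrative, not Sandia's tooling.
VMS_PER_NODE = 250
NODES = 4480
print(f"{VMS_PER_NODE * NODES:,} kernels in total")  # -> 1,120,000

def launch_node_vms(boot_vm, count=VMS_PER_NODE):
    """Start `count` VMs on this node; boot_vm is a callable that starts one VM."""
    return [boot_vm() for _ in range(count)]
```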

The capability to run a high number of operating system instances inside of virtual machines on a high performance computing (HPC) cluster can also be used to model even larger HPC machines with millions to tens of millions of nodes that will be developed in the future, said Minnich. The successful Sandia demonstration, he asserts, means that development of operating systems, configuration and management tools, and even software for scientific computation can begin now before the hardware technology to build such machines is mature.

“Development of this software will take years, and the scientific community cannot afford to wait to begin the process until the hardware is ready,” said Minnich. “Urgent problems such as modeling climate change, developing new medicines, and research into more efficient production of energy demand ever-increasing computational resources. Furthermore, virtualization will play an increasingly important role in the deployment of large-scale systems, enabling multiple operating systems on a single platform and application-specific operating systems.”

Sandia’s researchers plan to take their newfound capability to the next level.

“It has been estimated that we will need 100 million CPUs (central processing units) by 2018 in order to build a computer that will run at the speeds we want,” said Minnich. “This approach we’ve demonstrated is a good way to get us started on finding ways to program a machine with that many CPUs.” Continued research, he said, will help computer scientists to come up with ways to manage and control such vast quantities, “so that when we have a computer with 100 million CPUs we can actually use it.”

The conference program for the 2009 International Supercomputing Conference (ISC’09) is now finalized, and according to ISC’09 General Chair Prof. Hans Meuer, this year’s program is set to be the most illustrious in the 24-year history of the event. The conference will be held June 23-26 in Hamburg.

The four-day high performance computing conference and trade show at the Congress Center Hamburg provides a unique international platform to gain insights, network and do business, all under one roof. With more than 1,500 attendees and over 120 exhibitors from more than 45 countries expected, ISC’09 is shaping up to be the most extensive and dynamic event since the conference began as a meeting in Mannheim in 1986.

ISC, which marks its 24th anniversary in 2009, has a well-established reputation for presenting well-founded, precise and up-to-date information in an environment that encourages informal conversations and sharing of ideas. ISC is also the largest high performance computing exhibition in Europe. All conference proceedings are conducted in English.

In addition to world-renowned keynote speakers, ISC’09 will feature four in-depth sessions examining some of the most exciting and challenging areas in high performance computing today.

“Every year, as we put together the conference program, I say this will be the best ever, but this year the program is truly illustrious,” said Conference Chair Prof. Hans Werner Meuer of the University of Mannheim. “For 2009, we have four days of presentations, an impressive lineup of speakers and our largest exhibition ever. At the same time, we have also updated our pricing structure to ensure that everyone who wants to attend can afford this extremely valuable investment.”

The ISC’09 program also includes four keynote speakers.

An overview of the ISC’09 conference program can be found at http://www.supercomp.de/isc09/Program/Overview.

Online registration starts on March 2

Advance registration for ISC’09 begins Monday, March 2, and in addition to reduced rates, a number of one-day options for both the program and exhibition are being offered for the first time. As an added incentive for U.S. attendees, the U.S. dollar is about 25 percent stronger against the euro than last year. For complete details on registration categories and rates, go to: http://www.supercomp.de/isc09/Registration-Travel/Attendance-Fees

Temperature differences, slow water could delay ocean entry

Temperature differences and slow-moving water at the confluence of the Clearwater and Snake rivers in Idaho might delay the migration of threatened juvenile salmon and allow them to grow larger before reaching the Pacific Ocean.

[Photo caption: PNNL researchers place yellow acoustic receivers into the Columbia River. The receivers are part of the Juvenile Salmon Acoustic Telemetry System, which is helping track the movement of tagged fall Chinook salmon on the Clearwater River in Idaho.]

A team of Northwest researchers is examining the unusual life cycle of the Clearwater’s fall Chinook salmon to find out why some of them spend extra time in the cool Clearwater before braving the warm Snake. The Clearwater averages about 53 degrees Fahrenheit in the summer, while the Snake averages about 71. The confluence lies within the Lower Granite Reservoir, one of several stretches of slow water backed up behind lower Snake and Columbia river dams that could dampen the cues fish use to begin their journey downstream.

The delayed migration could also mean Clearwater salmon are more robust and survive better when they finish their ocean-bound trek, said Billy Connor, a fish biologist with the U.S. Fish & Wildlife Service.

“It may seem counterintuitive, but the stalled migration of some salmon could actually help them survive better,” Connor said. “Juvenile salmon may gamble on being able to dodge predators in reservoirs so they can feast on the reservoirs’ rich food, which allows them to grow fast. By the time they swim toward the ocean the next spring, they’re bigger and more likely to survive predator attacks and dam passage.”

Scientists from the U.S. Geological Survey, the U.S. Fish & Wildlife Service, the Department of Energy’s Pacific Northwest National Laboratory and the University of Washington are wrapping up field studies this fall to determine if water temperature or speed encourage salmon to overwinter in the confluence and in other reservoirs downstream. The Bonneville Power Administration is funding the research to help understand how Snake and Columbia River dams may affect fish.

USGS and USFWS are tracking fish movement by implanting juveniles with radio tags, which are more effective in shallow water. PNNL is complementing that effort with acoustic tags, which work better in deeper water. PNNL is also contributing its hydrology expertise to measure the Clearwater and Snake rivers’ physical conditions. UW is providing the statistical analysis of the tagging data.

“Fall Chinook salmon on the Clearwater River have a fascinating early life history that may contribute to their successful return as adults,” said PNNL fish biologist Brian Bellgraph. “If we can support the viability of such migration patterns in this salmon subpopulation, we will be one step closer to recovering the larger fall Chinook salmon population in the Snake River Basin.”

Scientists used to think all juvenile fall Chinook salmon in the Clearwater River migrated to the ocean during the summer and fall after hatching in the spring. But researchers from USGS, USFWS and the Nez Perce Tribe began learning in the early 1990s that some stick around until the next spring. Similar delays have also been found in a handful of other rivers, but they remain the exception rather than the rule. The Clearwater is unique because a high proportion of its fall Chinook salmon – as much as 80 percent in some years – don’t enter the ocean before they’re a year old.

To better understand how fish react to the river’s physical conditions, scientists are implanting juvenile salmon with two types of small transmitters that emit different signals. The transmitters – commonly called tags – are pencil eraser-sized devices that are surgically implanted into young fish 3.5 to 6 inches in length. Specially designed receivers record the tags’ signals, which researchers use to track fish as they swim. The gathered data help scientists measure how long migration is delayed at the confluence.
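As a concrete, if simplified, illustration of how detection records can be turned into a delay estimate, the sketch below computes each tag's residence time at confluence receivers from its first and last detection. The file layout, column names, and receiver naming are assumptions; the study's actual analysis code is not described in the article.

```python
# Illustrative sketch: estimate per-fish residence time at the confluence from
# tag-detection records. The CSV layout and column names are assumptions.
import pandas as pd

detections = pd.read_csv(
    "detections.csv",               # hypothetical export: one row per detection
    parse_dates=["detected_at"],    # assumed columns: tag_id, receiver_id, detected_at
)

# Keep only detections from receivers deployed at the confluence (assumed naming).
confluence = detections[detections["receiver_id"].str.startswith("CONF")]

residence = (
    confluence.groupby("tag_id")["detected_at"]
    .agg(first_seen="min", last_seen="max")
)
residence["days_at_confluence"] = (
    residence["last_seen"] - residence["first_seen"]
).dt.days

print(residence.sort_values("days_at_confluence", ascending=False).head())
```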

Radio tags emit radio waves, which travel well through shallow water and air. Acoustic tags emit higher-frequency sounds, or “pings,” that move more easily through deeper water. The acoustic tags being used are part of the Juvenile Salmon Acoustic Telemetry System, which PNNL and NOAA Fisheries developed for the U.S. Army Corps of Engineers.

Together, fish tagged with both acoustic and radio transmitters help create a more comprehensive picture of how the river affects fish travel. The location data can also indicate how well fish fare. If a tag’s signal stops moving for an extended period, the fish in which it was implanted might have died. Researchers examine the circumstances of each case to determine the fish’s fate.
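One way to approximate that fate check in code is a simple rule: if a tag keeps being detected at the same receiver for longer than some threshold, mark it for closer review. The sketch below is a minimal, hypothetical version of that rule; the 14-day threshold and the record layout are assumptions, not values from the study.

```python
# Illustrative sketch: flag tags whose detections have not moved between
# receivers for an extended period, as candidates for a possible mortality.
from datetime import timedelta

def flag_possible_mortalities(detections, threshold=timedelta(days=14)):
    """detections: iterable of (tag_id, receiver_id, detected_at), sorted by time."""
    last_move = {}   # tag_id -> (receiver_id, time the fish last changed receivers)
    flagged = set()
    for tag_id, receiver_id, detected_at in detections:
        prev = last_move.get(tag_id)
        if prev is None or prev[0] != receiver_id:
            last_move[tag_id] = (receiver_id, detected_at)   # the fish moved
        elif detected_at - prev[1] > threshold:
            flagged.add(tag_id)                              # stationary too long
    return flagged
```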

This study is a unique example of how both tag technologies can jointly determine the survival and migration patterns of the relatively small juvenile fall Chinook salmon. The size of transmitters has decreased considerably in recent years; further size reductions would allow researchers to study even smaller fall Chinook salmon. This could provide further insight into this mysterious migration pattern.

Beyond the fish themselves, researchers will also examine water temperature and flow to determine how the river’s physical conditions correlate with fish movement. Salmon use water velocity and temperature as cues to guide them toward the ocean. But the Lower Granite Dam’s reservoir, which extends about 39 miles upriver from the dam to Lewiston, makes the water at the Clearwater River’s mouth move slowly. Researchers suspect the slow water may encourage some juvenile fall Chinook salmon to delay their journey and spend the winter in the confluence.

To test this hypothesis, PNNL scientists take periodic velocity measurements in the confluence from their research boat. Submerged sensors have recorded water temperatures every few minutes from about June through January each year since 2007. Both sets of information will be combined to create a computational model of the fish’s river habitat.
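A minimal sketch of how the two data sets might be aligned for such a model is shown below, resampling both records onto a common hourly timeline; the file names, column names, and hourly resolution are assumptions rather than details from the study.

```python
# Illustrative sketch: align boat-based velocity measurements with the
# continuous temperature record on a common hourly timeline for modeling.
import pandas as pd

temps = pd.read_csv("temperature.csv", parse_dates=["time"]).set_index("time")
velocity = pd.read_csv("velocity.csv", parse_dates=["time"]).set_index("time")

habitat = pd.concat(
    [
        temps["temp_c"].resample("1h").mean(),           # near-continuous record
        velocity["velocity_mps"].resample("1h").mean(),  # sparse boat measurements
    ],
    axis=1,
).interpolate(limit_direction="forward")                 # fill gaps between surveys

print(habitat.describe())
```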

This study’s results could be used to modify river water flow to improve fish survival. The Clearwater’s Dworshak Dam already helps manage water temperature by strategically releasing cool water toward the Snake. The waters form thermal layers – with the Snake’s warm water on top and the Clearwater’s cooler water below – that fish move through to regulate their body temperatures.

The Nez Perce Tribe began studying fall Chinook salmon in the lower Clearwater River in 1987. USGS and USFWS joined the effort in 1991, when the Snake River Basin’s fall Chinook salmon were first listed under the Endangered Species Act. PNNL and UW joined the study in 2007. The Bonneville Power Administration is paying for the study.

More information about the Juvenile Salmon Acoustic Telemetry System can be found at the JSATS webpage.

 

Looking Ahead to ‘A Clear Chance to Shape Our Future’

 

Jane Lubchenco, Ph.D., under secretary of commerce for oceans and atmosphere and NOAA administrator, will lead a U.S. delegation to Geneva, Switzerland, August 31-September 4 for the World Climate Conference-3, in an effort to establish a Global Framework for Climate Services. This framework is intended to help meet accelerating demands for useful information on the impacts of climate change.

“Climate change is real. It is happening now, in our backyards and around the globe,” said Lubchenco. “Decision-makers across all sectors require reliable, relevant information about the current and projected impacts of climate change, whether they are farmers, manufacturers, or city officials planning snow removal budgets or options for water resources, transportation or new housing developments. In a rapidly changing world these decisions cannot be made based on weather of the past; decision-makers need to know what to expect in the next twenty to fifty years to plan effectively.”

 Next week’s conference will bring together those who collect and develop climate information with those who need it, setting in motion an unprecedented opportunity to design a system that will inform decision-making in a way that is similar to how weather services work today, but on a longer time-scale. Climate services would include the broad range of what users require to address their needs, including data collection, technical assessment, analysis and prediction, and the ability to interpret and use the information.        

The U.S. delegation will include representatives from the White House Office of Science and Technology Policy, the White House Council on Environmental Quality, the U.S. Department of Commerce, the U.S. Department of State, the U.S. Department of the Interior, the U.S. Department of Health and Human Services, U.S. National Science Foundation, USAID, the Environmental Protection Agency, NOAA and NASA.  U.S. officials will be actively engaged at the conference, learning from the international community and sharing American knowledge and innovations.

The first two World Climate Conferences, in 1979 and 1990, greatly enhanced capabilities to observe and assess climate change, ultimately leading to the establishment of the World Climate Research Program and the Global Climate Observing System, as well as the Nobel Peace Prize-winning Intergovernmental Panel on Climate Change and the United Nations Framework Convention on Climate Change.

“Rarely are we presented with such a clear chance to shape our future, to start taking actions essential to making our planet healthier, safer and more environmentally and economically viable. We look forward to working with the international community to make this a very successful conference,” said Lubchenco.
