Image caption: Princeton University researchers "weighed" Antarctica's ice sheet using gravitational satellite data and found that from 2003 to 2014, the ice sheet lost 92 billion tons of ice per year. Credit: Christopher Harig, Department of Geosciences

During the past decade, Antarctica's massive ice sheet lost twice the amount of ice in its western portion compared with what it accumulated in the east, according to Princeton University researchers who came to one overall conclusion -- the southern continent's ice cap is melting ever faster.

The researchers "weighed" Antarctica's ice sheet using gravitational satellite data and found that from 2003 to 2014, the ice sheet lost 92 billion tons of ice per year, they report in the journal Earth and Planetary Science Letters. If stacked on the island of Manhattan, that amount of ice would be more than a mile high -- more than five times the height of the Empire State Building.
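
The comparison can be checked with back-of-the-envelope arithmetic. The sketch below assumes a Manhattan land area of roughly 59 square kilometers and a glacial-ice density of about 917 kg per cubic meter; neither figure comes from the study, and "tons" are taken to be metric tons.

```python
# Rough check of the "mile-high over Manhattan" comparison.
# Manhattan area and ice density are assumptions, not figures from the study.
ICE_LOSS_TONS_PER_YEAR = 92e9        # 92 billion metric tons per year
ICE_DENSITY_KG_PER_M3 = 917.0        # typical glacial ice (assumption)
MANHATTAN_AREA_M2 = 59.1e6           # ~59.1 km^2 (assumption)

mass_kg = ICE_LOSS_TONS_PER_YEAR * 1000.0      # metric tons -> kg
volume_m3 = mass_kg / ICE_DENSITY_KG_PER_M3    # mass -> volume
height_m = volume_m3 / MANHATTAN_AREA_M2       # spread evenly over Manhattan
print(f"{height_m:.0f} m, or {height_m / 1609.34:.2f} miles")   # ~1,700 m, just over a mile
```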

The vast majority of that loss was from West Antarctica, which is the smaller of the continent's two main regions and abuts the Antarctic Peninsula that winds up toward South America. Since 2008, annual ice loss from West Antarctica's unstable glaciers has doubled, from an average of 121 billion tons of ice to twice that amount by 2014, the researchers found. The ice sheet on East Antarctica, the continent's much larger and overall more stable region, thickened during that same time, but accumulated only half the amount of ice lost from the west, the researchers reported.

"We have a solution that is very solid, very detailed and unambiguous," said co-author Frederik Simons, a Princeton associate professor of geosciences. "A decade of gravity analysis alone cannot force you to take a position on this ice loss being due to anthropogenic global warming. All we have done is take the balance of the ice on Antarctica and found that it is melting -- there is no doubt. But with the rapidly accelerating rates at which the ice is melting, and in the light of all the other, well-publicized lines of evidence, most scientists would be hard pressed to find mechanisms that do not include human-made climate change."

Compared with studies based on other types of data, the Princeton analysis shows that ice is melting from West Antarctica at a far greater rate than was previously known and that the western ice sheet is much more unstable than other regions of the continent, said first author Christopher Harig, a Princeton postdoctoral research associate in geosciences. Overall, ice-loss rates from all of Antarctica increased by 6 billion tons per year each year during the 11-year period the researchers examined. The melting rate from West Antarctica, however, grew by 18 billion tons per year every year, Harig and Simons found. Such accelerations in ice loss are measured in tons per year, per year -- that is, tons per year squared.
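
To illustrate what an acceleration in tons per year squared means in practice, here is a minimal sketch, using synthetic numbers rather than the GRACE data, that fits a quadratic trend to a monthly mass series; the linear coefficient gives the loss rate and twice the quadratic coefficient gives the acceleration.

```python
import numpy as np

# Synthetic monthly mass-anomaly series in gigatons (Gt) -- illustrative, not GRACE data.
# Constructed with a -92 Gt/yr rate and a -6 Gt/yr^2 acceleration, plus noise.
rng = np.random.default_rng(0)
t = np.arange(0, 11, 1 / 12)                                # years since 2003
mass = -92 * t - 0.5 * 6 * t**2 + rng.normal(0, 20, t.size)

# Fit m(t) = a*t^2 + b*t + c: rate = b (Gt/yr), acceleration = 2a (Gt/yr^2).
a, b, c = np.polyfit(t, mass, deg=2)
print(f"rate ~ {b:.1f} Gt/yr, acceleration ~ {2 * a:.1f} Gt/yr^2")
```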

Of most concern, Harig said, is that this massive and accelerating loss occurred along West Antarctica's Amundsen Sea coast, particularly at the Pine Island and Thwaites glaciers, where heavy losses had already been recorded. An iceberg more than 2,000 square miles in size broke off from the Thwaites Glacier in 2002.

In Antarctica, it's the ocean currents rather than air temperatures that melt the ice, and melted land ice contributes to higher sea levels in a way that melting icebergs don't, Harig said. As the ocean warms, floating ice shelves melt and can no longer hold back the land ice.

"The fact that West Antarctic ice-melt is still accelerating is a big deal because it's increasing its contribution to sea-level rise," Harig said. "It really has potential to be a runaway problem. It has come to the point that if we continue losing mass in those areas, the loss can generate a self-reinforcing feedback whereby we will be losing more and more ice, ultimately raising sea levels by tens of feet."

The Princeton study differs from existing approaches to measuring Antarctic ice loss in that it derives from the only satellite data that measure the mass of ice rather than its volume, which is more typical, Simons explained. He and Harig included monthly data from GRACE, or the Gravity Recovery and Climate Experiment, a dual-satellite joint mission between NASA and the German Aerospace Center. GRACE measures gravity changes to determine the time-variable behavior of various components in the Earth's mass system such as ocean currents, earthquake-induced changes and melting ice. Launched in 2002, the GRACE satellites are expected to be retired by 2016 with the first of two anticipated replacement missions scheduled for 2017.

While the volume of an ice sheet -- or how much space it takes up -- is also crucial information, it can change without affecting the amount of ice that is present, Simons explained. Snow and ice, for instance, compact under their own weight, so to the lasers bounced off the ice's surface to determine volume, there can appear to be a reduction in ice even when none has been lost, Simons said. Mass, or weight, on the other hand, changes when ice is actually redistributed and lost.

Simons equated the difference between measuring ice volume and mass to a person weighing himself by only looking in the mirror instead of standing on a scale. 

"You shouldn't only look at the ice volume -- you should also weigh it to find the mass changes," Simons said. "But there isn't going to be a whole lot of research of this type coming up because the GRACE satellites are on their last legs. This could be the last statement of this kind on these kinds of data for a long time. There may be a significant data gap during which the only monitoring available will not be by 'weighing' but by 'looking' via laser or radar altimetry, photogrammetry or field studies."

Harig and Simons developed a unique data-analysis method that allowed them to separate the GRACE data by specific Antarctic regions. Because the ice sheet behaves differently in different areas, a continent-wide view provides a general sense of how all of the ice mass, taken together, has changed, but excludes finer-scale geographic detail and temporal fluctuations. They recently published a paper about their computational methods in EOS, Transactions of the American Geophysical Union, and used a similar method for a 2012 paper in the Proceedings of the National Academy of Sciences that revealed sharper-than-ever details about Greenland's accelerating loss of its massive ice sheet.
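
The authors' actual localization method is not reproduced here, but a much simpler stand-in conveys the idea of splitting a continent-wide signal into regions: collapse each monthly gridded mass-anomaly map to a single number per region using a mask and cell areas, then fit a trend to the resulting regional time series. All grids, masks and values below are toy inputs invented for the example.

```python
import numpy as np

# Toy stand-in for region-by-region analysis (not the authors' method).
def regional_series(maps, mask, cell_area_km2):
    """maps: (n_months, nlat, nlon) mass anomaly per km^2; mask: (nlat, nlon) bool."""
    return (maps[:, mask] * cell_area_km2[mask]).sum(axis=1)   # one value per month

rng = np.random.default_rng(1)
maps = rng.normal(0, 1e-4, (132, 10, 10))        # 132 months on a tiny toy grid
cell_area = np.full((10, 10), 1.0e4)             # km^2 per cell (toy value)
west_mask = np.zeros((10, 10), dtype=bool)
west_mask[:, :4] = True                          # pretend the left columns are "the west"

series = regional_series(maps, west_mask, cell_area)
t = np.arange(132) / 12.0
rate = np.polyfit(t, series, 1)[0]
print(f"regional trend: {rate:.2f} (toy units per year)")
```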

Robert Kopp, a Rutgers University associate professor of earth and planetary sciences and associate director of the Rutgers Energy Institute, said the analysis method Harig and Simons developed allowed them to capture a view of regional Antarctic ice loss "more accurately than previous approaches." Beyond the recent paper, Harig and Simons' method could be important for testing models of Antarctic ice-sheet stability developed by other researchers, he said.

"The notable feature of this research is the power of their method to resolve regions geographically in gravity data," Kopp said. "I expect that [their] technique will be an important part of monitoring future changes in the ice sheet and testing such models."

Louisiana State University students were notified of heavy rains and tornado warnings throughout the day Monday by official advisories, and a University group has received a large grant to improve the software used to predict and weather such storms.

The University’s Center for Computation and Technology and The STE||AR Group recently received a $3.2 million grant from the National Science Foundation for STORM, a project to update the Advanced Circulation (ADCIRC) coastal modeling system, which is used to model the potential effects of hurricanes and other weather events in U.S. coastal areas.

The STORM project began in October 2014, and work on the program will last for the next four years.

The University teamed up with the University of Texas at Austin, the University of Notre Dame, the University of North Carolina and Louisiana Sea Grant for the project.

“In the event of a hurricane, national organizations issue advisories every six hours. ADCIRC takes that information, these storm paths, feeds them into the model and calculates the storm surge predictions in different regions of the coast with fairly high resolution,” said computer science adjunct assistant professor and CCT senior scientist Hartmut Kaiser.

ADCIRC is used by organizations like the Coast Guard and the Department of Homeland Security to make decisions on managing emergency situations. Updating ADCIRC will give these organizations faster, more precise information on the potential effects of winds, tides, waves and currents on large bodies of water and coastal regions, Kaiser said.

Kaiser said roughly $1 million of the NSF grant will go to The STE||AR Group, which contributes to the STORM project by providing its HPX computational library, a C++ runtime system that allows an application’s work to be spread across larger computer systems.
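
HPX itself is a C++ library, and the snippet below is not HPX or ADCIRC code. It is only a loose Python analogy for the underlying idea of task-based parallelism: break the work into independent tasks, here one invented storm-track scenario each, and let a runtime schedule them across the available processors.

```python
from concurrent.futures import ProcessPoolExecutor

# Loose analogy only -- not HPX and not ADCIRC.
def surge_for_track(track_id: int) -> tuple[int, float]:
    # Placeholder computation; a real model would solve the circulation equations.
    return track_id, 0.1 * track_id

if __name__ == "__main__":
    # Each storm-track scenario is an independent task the runtime can schedule.
    with ProcessPoolExecutor() as pool:
        results = dict(pool.map(surge_for_track, range(8)))
    print(results)
```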

STE||AR Group scientific program coordinator Adrian Serio said the other three universities are focused on the ADCIRC model itself and whether there might be a more effective way to model environmental phenomena.

The University will focus on determining how to run the program on larger computer systems so results can be obtained faster.

“At LSU … [we want] to investigate if we can use modern computing resources, which are available here at LSU, and at other institutions, in a better way,” Kaiser said. “Either [we] do more and get more results or [we] do the same thing but run [it] faster.”

Mathematics junior Daniel Bourgeois joined The STE||AR Group at the beginning of the semester and works on writing algorithms for the HPX computational system that will eventually be used for the STORM project.

He said the main reason he decided to work with The STE||AR Group was to learn to collaborate with other people to create something useful through computer programming.

“The [STORM] project is definitely very useful because not only is it going to help emergency managers respond to storms, but it’ll help engineers to run ADCIRC and make better design decisions [such as with building levees],” Bourgeois said.

Temperature differences, slow water could delay ocean entry

Temperature differences and slow-moving water at the confluence of the Clearwater and Snake rivers in Idaho might delay the migration of threatened juvenile salmon and allow them to grow larger before reaching the Pacific Ocean.

Image caption: PNNL researchers place yellow acoustic receivers into the Columbia River. The receivers are part of the Juvenile Salmon Acoustic Telemetry System, which is helping track the movement of tagged fall Chinook salmon on the Clearwater River in Idaho.

A team of Northwest researchers is examining the unusual life cycle of the Clearwater’s fall Chinook salmon to find out why some of them spend extra time in the cool Clearwater before braving the warm Snake. The Clearwater averages about 53 degrees Fahrenheit in the summer, while the Snake averages about 71. The confluence is part of the Lower Granite Reservoir, one of several stretches of slow water backed up behind lower Snake and Columbia river dams that can weaken the cues fish use to swim downstream.

The delayed migration could also mean Clearwater salmon are more robust and survive better when they finish their ocean-bound trek, said Billy Connor, a fish biologist with the U.S. Fish & Wildlife Service.

“It may seem counterintuitive, but the stalled migration of some salmon could actually help them survive better,” Connor said. “Juvenile salmon may gamble on being able to dodge predators in reservoirs so they can feast on the reservoirs’ rich food, which allows them to grow fast. By the time they swim toward the ocean the next spring, they’re bigger and more likely to survive predator attacks and dam passage.”

Scientists from the U.S. Geological Survey, the U.S. Fish & Wildlife Service, the Department of Energy’s Pacific Northwest National Laboratory and the University of Washington are wrapping up field studies this fall to determine if water temperature or speed encourage salmon to overwinter in the confluence and in other reservoirs downstream. The Bonneville Power Administration is funding the research to help understand how Snake and Columbia River dams may affect fish.

USGS and USFWS are tracking fish movement by implanting juveniles with radio tags, which are more effective in shallow water. PNNL is complementing that effort with acoustic tags, which work better in deeper water. PNNL is also contributing its hydrology expertise to measure the Clearwater and Snake rivers’ physical conditions. UW is providing the statistical analysis of the tagging data.

“Fall Chinook salmon on the Clearwater River have a fascinating early life history that may contribute to their successful return as adults,” said PNNL fish biologist Brian Bellgraph. “If we can support the viability of such migration patterns in this salmon subpopulation, we will be one step closer to recovering the larger fall Chinook salmon population in the Snake River Basin.”

Scientists used to think all juvenile fall Chinook salmon in the Clearwater River migrated to the ocean during the summer and fall after hatching in the spring. But researchers from USGS, USFWS and the Nez Perce Tribe began learning in the early 1990s that some stick around until the next spring. Similar delays have also been found in a select number of other rivers, but this is still the exception rather than the rule. The Clearwater is unique because a high proportion – as much as 80 percent in some years – of its fall Chinook salmon don’t enter the ocean before they’re a year old.

To better understand how fish react to the river’s physical conditions, scientists are implanting juvenile salmon with two types of small transmitters that emit different signals. The transmitters – commonly called tags – are pencil-eraser-sized devices that are surgically implanted into young fish 3.5 to 6 inches in length. Specially designed receivers record the tags’ signals, which researchers use to track fish as they swim. The gathered data helps scientists measure how migration is delayed through the confluence.

Radio tags emit radio waves, which travel well through shallow water and air. Acoustic tags emit higher-frequency sounds, or “pings,” that move more easily through deeper water. The acoustic tags being used are part of the Juvenile Salmon Acoustic Telemetry System, which PNNL and NOAA Fisheries developed for the U.S. Army Corps of Engineers.

Together, fish tagged with both acoustic and radio transmitters help create a more comprehensive picture of how the river affects fish travel. The location data can also indicate how well fish fare. If a tag’s signal stops moving for an extended period, the fish in which it was implanted might have died. Researchers examine the circumstances of each case to determine the fish’s fate.
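
As a hedged illustration of how detection records might be turned into elapsed travel times and fate flags, the sketch below works through a few invented detections; the site names, dates and the "no movement" threshold are all made up for the example and are not taken from the study.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Invented example data: (tag_id, receiver_site, detection_time).
detections = [
    ("tag01", "confluence", datetime(2011, 7, 1)),
    ("tag01", "lower_granite_dam", datetime(2012, 4, 15)),
    ("tag02", "confluence", datetime(2011, 7, 3)),
]

by_tag = defaultdict(list)
for tag, site, when in detections:
    by_tag[tag].append((when, site))

STATIONARY = timedelta(days=60)       # invented threshold for "signal stopped moving"
study_end = datetime(2012, 6, 1)

for tag, hits in sorted(by_tag.items()):
    hits.sort()
    (t_first, s_first), (t_last, s_last) = hits[0], hits[-1]
    elapsed = (t_last - t_first).days
    # A tag seen at only one site, with no detections for a long stretch, is flagged
    # for closer review -- the fish may have died or may be overwintering.
    flagged = s_first == s_last and (study_end - t_last) > STATIONARY
    print(tag, f"elapsed={elapsed}d", "review" if flagged else "moving")
```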

This study is a unique example of how both tag technologies can jointly determine the survival and migration patterns of the relatively small juvenile fall Chinook salmon. The size of transmitters has decreased considerably in recent years; further size reductions would allow researchers to study even smaller fall Chinook salmon. This could provide further insight into this mysterious migration pattern.

Beyond the fish themselves, researchers will also examine water temperature and flow to determine what correlation the river’s physical conditions may have with the fish movement. Salmon use water velocity and temperature as cues to guide them toward the ocean. But the Lower Granite Dam’s reservoir, which extends about 39 miles upriver from the dam to Lewiston, makes the water in the Clearwater River’s mouth move slowly. Researchers suspect the slow water may encourage some fall juvenile Chinook salmon to delay their journey and spend the winter in the confluence.

To test this hypothesis, PNNL scientists take periodic velocity measurements in the confluence from their research boat. Submerged sensors have recorded water temperatures every few minutes between about June and January since 2007. Both sets of information will be combined to create a computational model of the fish’s river habitat.

This study’s results could be used to modify river water flow to improve fish survival. The Clearwater’s Dworshak Dam already helps manage water temperature by strategically releasing cool water toward the Snake. The waters form thermal layers – with the Snake’s warm water on top and the Clearwater’s cooler water below – that fish move through to regulate their body temperatures.

The Nez Perce Tribe began studying fall Chinook salmon in the lower Clearwater River in 1987. USGS and USFWS joined the effort in 1991, when the Snake River Basin’s fall Chinook salmon were first listed under the Endangered Species Act. PNNL and UW joined the study in 2007. The Bonneville Power Administration is paying for the study.

More information about the Juvenile Salmon Acoustic Telemetry System can be found at the JSATS webpage.

 

An international research team is bringing a new weapon to bear against invasive earthworms.

The ongoing research project at The Ohio State University, the University of Alberta and Simon Fraser University uses statistical analysis to forecast one worm species' spread, in hopes of finding ways to curtail it.

Most recently, they've focused on the boreal forest of northern Alberta. No native worms live in the forest whatsoever; the region had been worm-free since the last ice age 11,000 years ago, until invasive European species began working their way across the United States and Canada. The worms have only recently invaded Alberta.

In the journal Computational Statistics and Data Analysis, the researchers project that by 2056, one of those invasive species--Dendrobaena octaedra, sometimes called the octagonal-tail worm--will expand its territory from 3 percent of the Alberta boreal forest to 39 percent. Right now, isolated pockets of D. octaedra dot the region, and are spreading deeper into the forest by around 52 feet each year. 

"The idea is to get a more realistic picture of earthworm spread, which involves a lot of uncertainty," said Oksana Chkrebtii, assistant professor of statistics at Ohio State. "The invasion is happening below ground, so we can't easily observe the worms directly. We have to infer population dynamics based on the information we can get, and that's a good problem for statistical analysis." 

The worm is less than an inch long, and its eggs are about the size of sesame seeds. One of the team's findings is that the worms are entering the forest via roadways, which suggests that the tiny eggs are becoming lodged in tire treads and then dropping onto the ground as people drive through the forest.

Before she joined the study, Chkrebtii thought--as many North Americans probably do--that worms are always good for the soil. 

"I thought they were beneficial, especially for agriculture, but maybe the boreal forest is not the best place for them," she said.

D. octaedra eats leaves that fall to the forest floor. The worms burrow beneath the surface where they mix different layers of soil and change the soil pH. Ultimately, these changes alter how organic and inorganic matter decomposes and result in fewer small invertebrates in the soil. Other types of worms have even been found to cause native plants living on the forest floor to die and birds that nest there to lose their habitat.

Since the forest hosts no native worms, D. octaedra and other invasive earthworms are able to spread unchecked. 

Study co-author Erin Cameron and her team at the University of Alberta conducted the first field studies of D. octaedra in Alberta's boreal forest. 

Based on worm counts they made at 78 sites within the forest in 2006, Chkrebtii and co-authors estimated the geographic spread of the worm population. They also modeled the likely number of introductions of worms along roads every year.

By using a technique called approximate Bayesian computation in a new way, the team was able to estimate population spread and new introductions simultaneously, and then use them to get an overall 50-year worm forecast for northern Alberta.
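
The published model is not reproduced here, but a minimal rejection-style approximate Bayesian computation sketch conveys the idea: draw candidate spread and introduction rates from priors, simulate a crude invasion forward, and keep only the draws whose simulated occupancy is close to an observed summary. Every number below, including the observed occupancy and the forward model, is invented for illustration.

```python
import numpy as np

# Minimal rejection-ABC sketch (toy forward model, not the published one).
rng = np.random.default_rng(42)
observed_occupancy = 0.18                 # invented summary statistic

def simulate_occupancy(spread_m_per_yr, intros_per_100km, years=15, road_km=1000):
    """Crude forward model: each introduction becomes a disc growing at the spread rate."""
    n_intros = rng.poisson(intros_per_100km * road_km / 100 * years)
    ages = rng.uniform(0, years, n_intros)
    area_m2 = np.pi * (spread_m_per_yr * ages) ** 2
    return min(area_m2.sum() / 5e8, 1.0)  # fraction of a 500 km^2 toy region

accepted = []
for _ in range(20_000):
    spread = rng.uniform(1, 50)           # prior on spread rate (m/yr)
    intros = rng.uniform(0.1, 5)          # prior on introductions per 100 km per year
    if abs(simulate_occupancy(spread, intros) - observed_occupancy) < 0.02:
        accepted.append((spread, intros))

accepted = np.array(accepted)
if accepted.size:
    print("posterior means (spread m/yr, intros/100 km/yr):", accepted.mean(axis=0))
else:
    print("no draws accepted; widen the tolerance")
```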

The finding: D. octaedra populations currently expand about 16 meters (around 52 feet) per year. At that rate, a single worm and its descendants--they reproduce asexually, so one worm reproduces on its own--could expand to cover the length of an American football field in six to seven years. 

As to introduction rates, the model determined that one new worm is likely deposited in the forest every year for every 100 kilometers (62 miles) of road that runs through it. The section of the forest used in the study contains around 22,000 kilometers (a little less than 14,000 miles) of road, so the model suggests that more than 200 worms are likely to be dropped into the forest per year. 
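
Both figures follow from plain arithmetic, as the short check below shows; the only assumption added here is the length of a football field (about 91.4 meters without end zones, roughly 110 meters with them).

```python
# Arithmetic check of the introduction and spread figures quoted above.
road_km = 22_000
worms_per_year = road_km / 100            # one introduction per 100 km of road per year
print(worms_per_year)                     # 220.0 -> "more than 200 worms" per year

spread_m_per_year = 16
field_m = 91.4                            # 100-yard playing field (assumption)
print(field_m / spread_m_per_year)        # ~5.7 years; closer to 7 if end zones count
```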

In total, the population for this worm species is expected to expand to cover 39 percent of the boreal forest floor in northeastern Alberta by 2056.

The boreal forest covers most of the northern reaches of North America, Europe and Asia, and accounts for one-third of all forested land on Earth. Its plant and animal diversity, as well as its role as a host to migrating birds, make it an important target for conservation efforts. Researchers also consider it an important carbon sink, meaning it stores carbon that would otherwise end up in the atmosphere and contribute to climate change.

Americans and Canadians alike travel into the forest for hunting and fishing, so in some cases the worms are able to hitch a ride for many miles before being deposited there.

The changes wrought by the worms are slow but profound--and the spread is difficult to model, Chkrebtii said.

"When you think of invasive species, you might think that one worm gets in and suddenly you have a forest full of worms, but it's much more complicated than that. And here the worms are getting re-introduced all the time," she explained.

Cameron added that the information gained in this study could be used to inform both tourists and locals.

"There is no way to stop the spread of earthworms once they are introduced, but having a more realistic estimate of future spread--for example, if we follow business as usual in terms of road construction--may increase awareness of the magnitude of the issue, and ideally encourage people to avoid introducing worms," she said.

"This model will allow us to make more accurate estimates of earthworm effects at large spatial scales," Cameron continued. "For example, we are working on modeling the effects of earthworms on carbon storage in the soil, so we'll be using the maps of earthworm spread from the more accurate statistical model to estimate overall changes in carbon storage due to earthworms for this region in the future."

Chkrebtii wants to expand the statistical model to examine how worms that are already in the forest can be transported within it--for example, between campsites--and how they spread to other areas outside the forest.

Meanwhile, Cameron has created a website that suggests ways for people to avoid unwittingly introducing non-native earthworms, and a smartphone app called "Worm Tracker" that lets elementary and middle school students track worm populations in their local area.

Though the app is geared toward students in Alberta, Cameron said that the information it is gathering, as well as the rest of the study, will be useful to researchers elsewhere.

"People in the US should care about this especially, because earthworm invasions have been ongoing there for a longer period of time, and larger impacts have been observed than in western Canada," she added.

She pointed to the Great Lakes Worm Watch, a project out of the University of Minnesota that partners with students in the U.S. to track invasive worms.

Fall Plugfest Follows the Most Successful-to-Date Plugfest Held Last Spring Where More Than 200 Cables and Devices Were Tested

  • IBTA 16th Compliance and Interoperability Plugfest
  • OpenFabrics Alliance 8th Interoperability Event
  • Supercomputing 2009

The InfiniBand Trade Association (IBTA) today announced the 16th Compliance and Interoperability Plugfest. The Plugfest will take place October 12-16, 2009 in the University of New Hampshire’s Interoperability Lab. The event provides an opportunity for InfiniBand device and cable vendors to test their products for compliance with the InfiniBand architecture specification, as well as interoperability with other InfiniBand products.

This event will include testing of both double data rate (DDR) 20Gb/s devices and quad data rate (QDR) 40Gb/s devices. There is a new test procedure for the recently released 120Gb/s 12x Small Form-Factor Pluggable (CXP) Interface Specification for cables, along with a new memory map test procedure for the EEPROMs included with QSFP and CXP active cables. The updated Wave Dispersion Penalty (WDP) testing will also be included.
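
The headline speeds follow from the per-lane signaling rate multiplied by the link width. The quick check below uses the commonly cited raw rates of 5 Gb/s per DDR lane and 10 Gb/s per QDR lane; those per-lane values are background knowledge rather than figures from this announcement, and the usable data rate is lower once 8b/10b encoding is accounted for.

```python
# Raw per-lane InfiniBand signaling rates (background assumption, not from the release).
lane_gbps = {"SDR": 2.5, "DDR": 5.0, "QDR": 10.0}

print(4 * lane_gbps["DDR"])     # 20.0  -> 4x DDR devices at 20 Gb/s
print(4 * lane_gbps["QDR"])     # 40.0  -> 4x QDR devices at 40 Gb/s
print(12 * lane_gbps["QDR"])    # 120.0 -> 12x CXP cables at 120 Gb/s
```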

The October Plugfest will include interoperability test procedures using Mellanox, QLogic and Voltaire products. The test procedures ensure that InfiniBand products are both compliant and interoperable, which in turn ensures the trouble-free deployment of InfiniBand clusters. More information on test procedures is available for IBTA members at: http://members.infinibandta.org/apps/org/workgroup/ciwg/documents.php?folder_id=298#folder_298

Plugfest registration is free for IBTA members; non-members need to pay a special fee. More information is available on the IBTA website at: www.infinibandta.org.

The Plugfest program has been a significant contributor to the growth of InfiniBand in both the enterprise data center and the high-performance computing markets. According to the June 2009 TOP500 list, InfiniBand is now the leading server interconnect in the Top100 with 59 clusters.

The Integrators’ List has grown from 115 products in October 2008, to 297 products as of the last Plugfest event in April 2009. End users and OEMs frequently reference this list prior to the deployment of InfiniBand-related systems, including both small clusters and large-scale clusters of 1,000 nodes or more. Many OEMs use this list as a gateway in the procurement process.

Fall Plugfest follows the highly successful Spring ‘09 Plugfest

The Spring ‘09 Plugfest was the most successful in IBTA history with more than 20 cable and device vendors in attendance. During the event, over 200 cables and 14 devices were tested. The number of devices qualifying for inclusion on the Integrators’ List has steadily increased; the list now includes more than 297 products.

Vendors recently adding products to the IBTA Integrators’ List include: Amphenol, Avago Technologies, Cinch Connectors, Emcore, LSI, Luxtera, Mellanox, Molex, Obsidian Research, Panduit, Quellan Inc (Intersil), Tyco Electronics, Volex, Voltaire and W.L. Gore. Several additional vendors will attend the October 2009 Plugfest, including QLogic, FCI and Hitachi.

Following Plugfest: OpenFabrics Alliance’s Eighth Interoperability Event

Following the IBTA Plugfest, the OpenFabrics Alliance will conduct its 8th Interoperability Event from Oct. 15-23, 2009. The session will focus on industry-wide interoperability using the OpenFabrics Alliance Software Stack and has separate eligibility requirements, costs and registration. For more information, please visit: http://www.iol.unh.edu/services/testing/ofa/events/Invitation_2009-10_OFA.php

IBTA to Celebrate 10-Year Anniversary at Supercomputing 2009

The IBTA will celebrate its 10-year anniversary at Supercomputing 2009 in Portland, Ore. on November 14-20. The IBTA will host InfiniBand demonstrations and an InfiniBand presentation theater. The IBTA invites all attendees to stop by booth number 139 at the show.

 
