A carousel of collection bottles on a sediment trap will isolate material collected in Lake Peters.

To improve the accuracy of complex supercomputer modeling, climate researchers in the Arctic are turning to natural features a little more in tune with longer time scales: glaciers and the lakes they feed.

The contrast between high-speed supercomputing and slow-moving natural systems reflects the landscape of climate research in a region protected by its isolation, yet sensitive to change.

During the spring and summer of 2015, four successive groups of researchers and students from Northern Arizona University performed what they called “tag team” science at Lake Peters in the Arctic National Wildlife Refuge. The multidisciplinary groups were gathering data for a $1 million, three-year project funded by the National Science Foundation’s Arctic System Science Program.

“One thing the climate science community is working hard to understand is the challenging problem of how climate varies on time scales longer than what we can observe,” said Nick McKay, assistant professor at NAU and a member of the fourth research team. “We think climate models under-predict decadal to century-scale changes in climate.”

Sediment samples from Arctic lakes offer insights, and NAU Regents’ Professor Darrell Kaufman has been pursuing them for years. His research for the School of Earth Sciences and Environmental Sustainability has involved the collection and analysis of sediment cores for clues about climate and environment. But local, short-term variables introduced new questions.

“We realized we needed to do more groundwork to understand those complicated systems,” Kaufman said.

McKay said glacier-fed lakes are ideal targets for study because “glaciers are excellent recorders of long-term climate variability. Unlike living organisms, which have strategies to overcome changes in climate, glaciers don’t.”

But the catchment system involving a glacier and lake is also susceptible to changes in the watershed that cannot be explained solely by climate. So the teams set out to generate more data from Lake Peters in northern Alaska and then take the ambitious step of creating a data network with two other well-studied Arctic lakes, one in southern Alaska and the other on an island near Norway.

“We’re trying to combine field observations with models that simulate sediment transfer from the glaciated catchment to the lake,” Kaufman said. “If we can build the model, then we can turn knobs and ask questions about future outcomes.”

According to McKay, “No one has ever tried to do what we’re doing—simulate a whole watershed like this. Hopefully the model is good enough that we can ask some of our questions and get meaningful answers.”
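
As a purely illustrative sketch of that “turn knobs” idea (not the team’s actual model; every parameter and number below is hypothetical), a toy sediment-transfer relation might expose catchment properties as tunable parameters:

```python
# A purely illustrative sketch, NOT the NAU team's model: a toy relation
# in which the sediment delivered to the lake each year depends on water
# supply and on tunable catchment parameters (the "knobs").
def sediment_flux(melt_mm, rain_mm, erodibility=0.4, trap_efficiency=0.8):
    """Toy annual sediment delivery to the lake (arbitrary units)."""
    runoff = melt_mm + rain_mm          # simplified annual water supply
    return erodibility * runoff * trap_efficiency

# "Turning a knob": same weather, but a more easily eroded catchment.
print(sediment_flux(600, 250, erodibility=0.4))  # baseline scenario
print(sediment_flux(600, 250, erodibility=0.6))  # knob turned up
```

The real model must capture far more, from glacier melt dynamics to how sediment settles through the lake, but the principle of adjustable parameters is the same.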

While the work being done this fall consists mostly of analysis and modeling, the labor of the summer involved spending quality—and sometimes not so quality—time at a remote outpost. The G. William Holmes Research Station, established in 1958, was in good condition and a day hike from the glacier, Kaufman said.

“Fieldwork in general is a roller coaster,” Kaufman said. “When conditions are good, there is no more beautiful place on earth. But when the mosquitoes are swarming or a storm is coming up, sometimes you want to be someplace else.”

After a winter of data analysis, teams will return to the site next summer for two more rounds of fieldwork. Kaufman said three students have joined the project this fall, including master’s degree students from geology and geography, and a Ph.D. student from the School of Earth Sciences and Environmental Sustainability. The school and the Department of Geography, Planning and Recreation provided the four lead investigators: Kaufman, McKay, Erik Schiefer and David Fortin.

Summer 2015 fieldwork participants included: Team 1–Kaufman, Fortin, Cody Routson, and Mindy Bell; Team 2–Fortin (continuing), Schiefer and Doug Steen; Team 3–Anna Liljedahl, Mike Loso, Maryann Ramos and Anne Gaedeke; Team 4–McKay, Fortin, Ethan Yackulic and Rebecca Ellerbroek.


With the rise of CO2 in Earth’s atmosphere, understanding the climate of tropical forests—the Amazon in particular—has become a critical research area. A recent NASA study showed that these regions are the biggest terrestrial carbon dioxide sinks on our planet, absorbing 1.4 billion metric tons of CO2 out of a total global terrestrial absorption of 2.5 billion. Climate scientists typically rely on general circulation models (GCMs) to simulate the tropical climate and learn more about its processes. But these models exhibit biases over tropical continents, showing peak evaporation and photosynthesis rates in the wrong season, as well as rain too early in the day.

A team led by Pierre Gentine, professor of earth and environmental engineering, and Adam Sobel, professor of applied physics and applied mathematics and of earth and environmental sciences, has developed a new approach to correct these climate model inaccuracies: a high-resolution atmospheric model that more precisely resolves clouds and convection (precipitation) and parameterizes the feedback between convection and atmospheric circulation. The study is published in the August 31 online Early Edition of Proceedings of the National Academy of Sciences (PNAS).

“Our new simulation strategy paves the way for better understanding of the water and carbon cycles in the Amazon,” says Gentine, whose research focuses on the feedback between land and atmosphere. “Our approach should help us learn more about the role of deforestation and climate change on the forest.”

Usama Anber, Sobel’s PhD student at Columbia’s Lamont-Doherty Earth Observatory and the paper’s first author, simulated the Amazon climate and demonstrated the key role that the morning fog layer plays in evaporation and surface radiation. This fog layer is induced by heavy nighttime precipitation, a feature missed by current climate models, which underestimate the effect of clouds and precipitation. The researchers found that the fog layer is an essential regulator of the Amazon climate: during the wet season, it effectively shortens the daytime because it reflects sunlight during the early morning. During the dry season, with no fog layer to reflect sunlight, the smaller cloud cover allows plants to receive much higher radiation, increasing evaporation and photosynthesis rates, another process missed by the GCMs.
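
A rough illustration of the fog effect (all numbers invented, not the study’s data) shows how a reflective morning fog layer trims the daily-mean solar radiation reaching the forest:

```python
# Toy illustration (invented numbers): daily-mean surface solar radiation
# with and without an early-morning fog layer that reflects sunlight.
hourly_sun = [0] * 6 + [100, 300, 500, 650, 750, 800,
                        800, 750, 650, 500, 300, 100] + [0] * 6  # W/m^2

def daily_mean_radiation(fog_hours=(), fog_reflectivity=0.7):
    """Daily-mean surface solar radiation (W/m^2) given fog-covered hours."""
    absorbed = [s * (1 - fog_reflectivity) if hour in fog_hours else s
                for hour, s in enumerate(hourly_sun)]
    return sum(absorbed) / 24

print(daily_mean_radiation())                     # dry season: no morning fog
print(daily_mean_radiation(fog_hours=(6, 7, 8)))  # wet season: fog until 9 a.m.
```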

“Our study demonstrates that using coupled land-atmosphere models with resolved convection and parameterized large-scale dynamics produces very accurate results,” Anber observes. “It is critical to our understanding of tropical climates.”

The team, which also included Shuguang Wang, associate research scientist in the Department of Applied Physics and Applied Mathematics, plans next to examine CO2 cycles to see if they can develop better predictions of climate changes.

“If we can improve our estimations of evaporation over land, then we can also improve water resources management, and weather and climatic forecasts,” Gentine adds. “Working on the hydrologic and carbon cycles is exciting because it will help determine the fate of our planet.”

Their study is supported by the Department of Energy, the National Science Foundation (NSF), and the Office of Naval Research.

Antonio Busalacchi will co-chair the Decadal Survey for Earth Science and Applications from Space. (Photo courtesy Antonio Busalacchi.)

University Corporation for Atmospheric Research (UCAR) trustee Antonio Busalacchi has accepted appointment as co-chair of the National Research Council’s Decadal Survey for Earth Science and Applications from Space. Busalacchi is a professor of atmospheric and oceanic sciences at the University of Maryland.

The survey committee will develop priorities for NASA, the National Oceanic and Atmospheric Administration, and the U.S. Geological Survey to bolster observations of Earth’s atmosphere, oceans, and land surfaces with a new generation of U.S. satellites as well as supporting activities during the 2018-27 decade. It will also assess the progress that has been made in addressing the major scientific and application challenges outlined in the Earth Science Decadal Survey completed in 2007, which was co-chaired by then-UCAR President Richard Anthes.

Satellites and other Earth observing instruments are critical for safeguarding society from natural disasters. They are needed for issuing accurate weather forecasts, monitoring the effects of a changing climate, and assessing the impacts of solar storms.

“Tony is a perfect choice to co-chair this landmark survey, thanks to his expertise on Earth observations and his extensive experience on National Research Council committees,” said UCAR Interim President Michael Thompson. “His leadership will ensure that the academic community will have effective input on future satellite missions for observing Earth from space.”

Busalacchi is director of the Earth System Science Interdisciplinary Center at the University of Maryland, as well as chair of the university’s Council on the Environment. An oceanographer and climate expert, his extensive National Research Council service includes terms as chair of the Board on Atmospheric Sciences and Climate and the BASC Climate Research Committee; chair of the Panel on Options to Ensure the Climate Record from the NPOESS and GOES-R Spacecraft; and co-chair of the Committee on National Security Implications of Climate Change on U.S. Naval Forces.

Busalacchi has served on the UCAR Board of Trustees since 2014.

“This study comes at a critical time for Earth System Science across NASA, NOAA, and the USGS, as the efforts of the past few decades have ushered in a golden era of Earth remote sensing, but we have yet to determine how best to sustain this enterprise across basic research, applied research, applications, and operations,” Busalacchi said. “A prime example of this challenge is space-based observations in support of weather monitoring and prediction.”

Toward the end of this century (projected here for the years 2068 to 2098), the probability of storm surges of eight to 11 meters (26 to 36 feet) increases significantly in cities not usually expected to be vulnerable to tropical storms, such as Tampa, Florida, according to recent research in the journal Nature Climate Change.

Researchers at Princeton and MIT have used supercomputer models to show that severe tropical cyclones could hit a number of coastal cities worldwide that are widely seen as unthreatened by such powerful storms.

The researchers call these potentially devastating storms Gray Swans, by analogy with the term Black Swan, which has come to mean a truly unpredicted event with a major impact. Gray Swans are highly unlikely, the researchers said, but unlike Black Swans they can be predicted with a degree of confidence.

"We are considering extreme cases," said Ning Lin, an assistant professor of civil and environmental engineering at Princeton. "These are relevant for policy making and planning, especially for critical infrastructure and nuclear power plants."

In an article published Aug. 31 in Nature Climate Change, Lin and her coauthor Kerry Emanuel, a professor of atmospheric science at the Massachusetts Institute of Technology, examined potential storm hazards for three cities: Tampa, Fla.; Cairns, Australia; and Dubai, United Arab Emirates.

The researchers concluded that powerful storms could generate dangerous storm surge waters in all three cities. They estimated the surge levels that would occur in each city with odds of 1 in 10,000 in an average year under current climate conditions.

Tampa Bay, for example, has experienced very few extremely damaging hurricanes in its history, the researchers said. The city, which lies on the central-west coast of Florida, was hit by major hurricanes in 1848 and in 1921.

The researchers entered Tampa Bay area climate data recorded between 1980 and 2005 into their model and ran 7,000 simulated hurricanes in the area. They concluded that, although unlikely, a Gray Swan storm could bring surges of up to roughly six meters (about 20 feet) to the Tampa Bay area. That level of storm surge would dwarf those of the storms of 1848 and 1921, which reached about 4.6 meters and 3.5 meters respectively.

The researchers said their model also indicates that the probability of such storms will increase as the climate changes.

"With climate change, these probabilities can increase significantly over the 21st century," the researchers said. In Tampa, the current storm surge likelihood of 1 in 10,000 is projected to increase to between 1 in 3,000 and 1 in 1,100 by mid-century and between 1 in 2,500 and 1 in 700 by the end of the century.

The work was supported in part by Princeton's Project X Fund, the Andlinger Center for Energy and the Environment's Innovation Fund, and the National Science Foundation.

On Aug. 28, 2005, the National Hurricane Center issued a public notice warning people in New Orleans about the approaching Hurricane Katrina: "devastating damage expected...power outages will last for weeks...persons...pets...and livestock left exposed to the winds will be killed."

The storm had formed near the Bahamas and crossed south Florida before becoming a Category 2 hurricane over the Gulf of Mexico northwest of Key West. Then, in two days, the hurricane's winds almost doubled to 175 mph, creating Category 5 Hurricane Katrina--the most intense hurricane in the past 36 years.

By the time Hurricane Katrina hit the U.S. Gulf Coast, the storm had lost strength but was still able to cause immense damage. Severe as the destruction was, it could have been worse were it not for the forecasts. But no matter how accurate the track forecast, mysteries about hurricane behavior remain to be solved before forecasting can improve further.

Researchers are particularly interested in improving forecast lead time and track and intensity forecasts, which are essential for planning successful evacuations. With its expertise in space and scientific exploration, NASA helps provide essential services to the American people, such as hurricane weather forecasting. NASA satellites, supercomputer modeling, instruments, aircraft and field missions provide valuable information that helps scientists better understand these storms.

Since Katrina, researchers have made strides in understanding the inner-core processes and environmental factors that affect the path and intensity of a hurricane. When a hurricane strikes now, scientists have a better understanding of where it's going and what's going on inside it than they did in 2005.

"NASA's role in observations, data assimilation and increased modeling abilities will continue to contribute greatly to further advance hurricane research," said Reale.

More Scientific Insights

Since Katrina, scientists have learned a tremendous amount about the environmental conditions and inner-core processes that affect a hurricane's path and intensity.

"It used to be that we always looked for the mechanisms that allow hurricanes to rapidly intensify, but as of late, the question has gotten flipped around," said Scott Braun, research meteorologist at Goddard. "Now we ask what are the factors that prevent a hurricane from intensifying."

Around the time of Katrina, scientists thought the presence of "hot towers"--tall thunderstorm clouds that carry a lot of heat upward--could increase the intensity of a hurricane. Since Katrina, scientists have learned that what matters is not necessarily whether these deep towers of clouds are present, but whether drafts of rising air, or updrafts, occur in great quantity at specific locations inside the cyclone.

To intensify a hurricane, a large amount of rising air needs to be concentrated in a specific region: between the center of the cyclone and the band of its strongest winds. Within this radius, the more rising air, the greater the potential intensification of the storm. The lifting of air can be accomplished by a small number of deep and intense hot towers or by a larger number of weaker updrafts. If a large amount of air circulating within that region rises, then the hurricane will spin up. Think of it like lifting a heavy box--it can be lifted by one or two strong people or by a large number of weaker people.
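
The box analogy can be made concrete with a toy calculation (all values invented): what matters is the total upward transport inside the key radius, whether it comes from a few intense hot towers or many weaker updrafts:

```python
# Toy illustration (invented values): total upward transport inside the
# radius of maximum winds, delivered either by a few strong "hot towers"
# or by many weaker updrafts.
def total_upward_flux(updrafts):
    """Sum of area * vertical velocity over (area, w) pairs, arbitrary units."""
    return sum(area * w for area, w in updrafts)

few_strong = [(4.0, 10.0), (3.0, 12.0)]        # two intense hot towers
many_weak = [(1.0, 1.2) for _ in range(64)]    # many modest updrafts
print(total_upward_flux(few_strong), total_upward_flux(many_weak))  # comparable
```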

Another factor that inhibits intensification is high wind shear, or a large change in winds with height, in the storm's environment. If a hurricane's environment has winds that are changing speed or direction significantly with height, the winds will cause a shearing force that tends to tilt or rip apart the storm. Hurricanes with high wind shear tend to fall apart quickly.
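
Forecasters commonly quantify this as "bulk" or "deep-layer" shear: the magnitude of the vector difference between winds at a lower and an upper pressure level, often 850 and 200 hPa. A minimal sketch with made-up winds:

```python
import math

# Minimal sketch of the standard deep-layer shear diagnostic: the
# magnitude of the vector wind difference between two pressure levels.
def shear_magnitude(u_low, v_low, u_high, v_high):
    """Bulk wind shear (m/s) between a lower and an upper level."""
    return math.hypot(u_high - u_low, v_high - v_low)

# Hypothetical winds: 5 m/s easterly at 850 hPa, strong westerly aloft.
print(shear_magnitude(-5.0, 0.0, 15.0, 2.0))  # ~20 m/s: hostile to storms
```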

During the Hurricane and Severe Storm Sentinel (HS3) field campaign in 2014, NASA scientists observed how wind shear affected the strength of Hurricane Edouard over the Atlantic, using airborne instruments including the TWiLiTE Doppler wind lidar, the Cloud Physics Lidar (CPL), the HIWRAP conically scanning Doppler radar, the HIRAD multi-frequency interferometric radiometer, and the HAMSR microwave sounder. Observations and modeling showed that the storm rapidly intensified as wind shear decreased and the storm went from being strongly tilted to being upright. For more information on the specific instruments, visit the HS3 website.

"At the time of Katrina, we knew that wind shear was a strong negative influence, but now we're learning much more about how it interacts with storms to affect their structure and intensity," said Braun, HS3 Principal Investigator.

HS3 data also showed that Hurricane Nadine in 2012 had dry Saharan air circulating near it--another potential inhibitor of intensification. In general, dry air causes rain to evaporate and cool the surrounding air, which sinks to the surface and creates pools of cold air. The cold air often weakens the storm because it steals energy that would otherwise be available for the storm to grow stronger.

While it's unclear if the dry air acted as an inhibitor in Nadine, scientists are learning about the relationship between wind shear and dry air. If a storm is being tilted by wind shear, then the shear could also create an opening for outside elements--like dry air and Saharan dust--to get inside.

NASA was the first to incorporate dust observations from satellites into hurricane models. While scientists are still investigating the specific role of dust, they know that dust can affect the speed and formation of cyclones. Including dust parameters in the models is one way for scientists to better understand the factors in a hurricane's environment.

More Data Input

In the past decade, NASA and agencies worldwide have significantly increased the number of sensors in space, on aircraft and on the ground to collect relevant hurricane data. NASA operates numerous Earth-observing satellites and draws on many more from worldwide partners. Since Katrina, NASA has also launched three field campaigns covering five hurricane seasons.

Satellites allow scientists to look within hurricanes and at their surrounding environment from a global perspective. Satellite data include sea-surface temperature, precipitation, surface winds and pressure, dust, atmospheric temperature, water vapor and more. Two satellites used to measure precipitation from space are the Global Precipitation Measurement mission and the former Tropical Rainfall Measuring Mission.

Field campaigns, on the other hand, gather more focused data on specific hurricanes by flying manned and unmanned aircraft into the hearts of the storms. During HS3, NASA's unmanned Global Hawk dropped small devices called dropsondes within and around storms while also collecting data on storm cloud tops and Saharan dust. The dropsondes collect information such as temperature, humidity, pressure, and wind speed and direction.

But it is the combination of the availability of these data sets and their better integration into models that has helped scientists understand and forecast hurricane behavior better.

At NASA's Global Modeling and Assimilation Office (GMAO) at the agency's Goddard Space Flight Center in Greenbelt, Maryland, scientists are developing global models and data assimilation systems that ingest satellite, upper-air and surface observations to simulate hurricanes and improve our understanding of hurricane behavior.

"The idea is that we have all of these millions and millions of observations and these observations are essentially ingested into what is called a data assimilation system," said Reale. Researchers incorporate as many observations as they can.

The data assimilation system also allows for quality control. It compares observations from different sensors, gives each observation its proper weight and merges all the information onto a homogeneous grid, a representation of the state of the atmosphere at a specific time.
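
The weighting idea can be shown in its simplest form, a single value (a toy sketch, not GMAO's actual system): the analysis blends a model "background" estimate with an observation in proportion to the inverse of their error variances.

```python
# Toy sketch (not GMAO's system) of the weighting behind data assimilation:
# combine a model "background" with an observation, each weighted by the
# inverse of its error variance.
def analysis(background, bg_var, obs, obs_var):
    w = bg_var / (bg_var + obs_var)      # weight given to the observation
    return background + w * (obs - background)

# Hypothetical sea-surface temperature (deg C): the more precise source
# (here the observation) pulls the analysis further toward itself.
print(analysis(background=28.0, bg_var=1.0, obs=29.0, obs_var=0.25))  # 28.8
```

Operational systems apply this principle to millions of observations at once, while also accounting for how errors correlate in space and time.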

A Model View of a Hurricane

Hurricane models can be likened to video games--although instead of an ancient charmed forest, these models use Earth as the backdrop. Just as video games have developed better graphics, become more realistic and include a lot more information, hurricane models have developed in a similar way.

In the past ten years, scientists have significantly improved the models' resolution, largely enabled by more powerful supercomputers. GMAO uses supercomputers at NASA's Center for Climate Simulation at Goddard.

"By going to a higher resolution, we have this process by which the resolved scale of the storm becomes smaller and smaller and closer and closer to reality," said Oreste Reale, a meteorologist at NASA Goddard. Reale is part of a team that, among other duties, assesses the ability of the GMAO suite of models to produce realistic hurricanes.

With past versions of NASA's GMAO model, scientists could detect the circulation in a cyclone, but the simulated cyclone was too large compared with the real storm, and its intensity was typically underestimated. Today's models have up to ten times the resolution of those used during Hurricane Katrina and allow a more accurate look inside the hurricane. Imagine going from video game figures made of large chunky blocks to detailed human characters that visibly show beads of sweat on their foreheads.
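
A back-of-envelope calculation (illustrative only) hints at why resolution is so costly: halving the grid spacing quadruples the number of horizontal grid columns, before even counting the shorter time steps a finer grid demands.

```python
# Back-of-envelope (illustrative): horizontal grid columns needed to
# cover Earth's surface at a given grid spacing.
EARTH_AREA_KM2 = 5.1e8

for dx_km in (100, 50, 10):
    columns = EARTH_AREA_KM2 / dx_km ** 2
    print(f"{dx_km:>3} km spacing: ~{columns:,.0f} columns")
```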

Increasing the resolution is especially helpful to study and forecast a hurricane's intensity. "For the intensity of a hurricane, so much comes down to the details of the really small processes and specifics in the inner core," said Dan Cecil, atmospheric scientist at NASA's Marshall Space Flight Center in Huntsville, Alabama.

Since Katrina, researchers have used better data assimilation techniques, higher-resolution models, more observations and deeper scientific insights to improve hurricane forecasting and the understanding of physical processes. NASA does not issue public hurricane forecasts, which are provided exclusively by the National Hurricane Center, but it collaborates closely with NOAA to improve our understanding of hurricanes.
