Berkeley Lab researchers link rising CO2 levels from fossil fuels to an upward trend in radiative forcing at two locations

Scientists have observed an increase in carbon dioxide's greenhouse effect at the Earth's surface for the first time. The researchers, led by scientists from the US Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab), measured atmospheric carbon dioxide's increasing capacity to absorb thermal radiation emitted from the Earth's surface over an eleven-year period at two locations in North America. They attributed this upward trend to rising CO2 levels from fossil fuel emissions.

The influence of atmospheric CO2 on the balance between incoming energy from the Sun and outgoing heat from the Earth (also called the planet's energy balance) is well established. But this effect has not been experimentally confirmed outside the laboratory until now. The research is reported Wednesday, Feb. 25, in the advance online publication of the journal Nature.

The results agree with theoretical predictions of the greenhouse effect due to human activity. The research also provides further confirmation that the calculations used in today's climate models are on track when it comes to representing the impact of CO2.

The scientists measured atmospheric carbon dioxide's contribution to radiative forcing at two sites, one in Oklahoma and one on the North Slope of Alaska, from 2000 to the end of 2010. Radiative forcing is a measure of how much the planet's energy balance is perturbed by atmospheric changes. Positive radiative forcing occurs when the Earth absorbs more energy from solar radiation than it emits as thermal radiation back to space. It can be measured at the Earth's surface or high in the atmosphere. In this research, the scientists focused on the surface.

They found that CO2 was responsible for a significant uptick in radiative forcing at both locations, about two-tenths of a Watt per square meter per decade. They linked this trend to the 22 parts-per-million increase in atmospheric CO2 between 2000 and 2010. Much of this CO2 is from the burning of fossil fuels, according to a modeling system that tracks CO2 sources around the world.

"We see, for the first time in the field, the amplification of the greenhouse effect because there's more CO2 in the atmosphere to absorb what the Earth emits in response to incoming solar radiation," says Daniel Feldman, a scientist in Berkeley Lab's Earth Sciences Division and lead author of the Nature paper.

"Numerous studies show rising atmospheric CO2 concentrations, but our study provides the critical link between those concentrations and the addition of energy to the system, or the greenhouse effect," Feldman adds.

He conducted the research with fellow Berkeley Lab scientists Bill Collins and Margaret Torn, as well as Jonathan Gero of the University of Wisconsin-Madison, Timothy Shippert of Pacific Northwest National Laboratory, and Eli Mlawer of Atmospheric and Environmental Research.

The scientists used highly precise spectroscopic instruments operated by the Atmospheric Radiation Measurement (ARM) Climate Research Facility, a DOE Office of Science User Facility. These instruments, located at ARM research sites in Oklahoma and Alaska, measure thermal infrared energy that travels down through the atmosphere to the surface. They can detect the unique spectral signature of infrared energy from CO2.

Other instruments at the two locations detect the unique signatures of phenomena that can also emit infrared energy, such as clouds and water vapor. The combination of these measurements enabled the scientists to isolate the signals attributed solely to CO2.

"We measured radiation in the form of infrared energy. Then we controlled for other factors that would impact our measurements, such as a weather system moving through the area," says Feldman.

The result is two time series from two very different locations, each spanning 2000 through the end of 2010: 3,300 near-daily measurements from Alaska and 8,300 from Oklahoma.

Both series showed the same trend: atmospheric CO2 emitted an increasing amount of infrared energy, to the tune of 0.2 Watts per square meter per decade. This increase is about ten percent of the trend from all sources of infrared energy such as clouds and water vapor.
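As a back-of-the-envelope illustration of how a decadal trend of that size can be pulled out of thousands of noisy, near-daily measurements, the sketch below fits a straight line to an invented forcing series by ordinary least squares. The data, noise levels, and variable names are illustrative assumptions, not the authors' analysis.

```python
# Rough illustration only: fitting a linear trend to a synthetic stand-in for
# near-daily surface forcing measurements. All numbers here are invented.
import numpy as np

rng = np.random.default_rng(0)

days = np.arange(11 * 365)                     # ~11 years of near-daily samples
years = days / 365.25
forcing = (0.02 * years                        # assumed underlying trend (W/m^2 per year)
           + 0.1 * np.sin(2 * np.pi * years)   # seasonal cycle
           + rng.normal(0.0, 0.2, days.size))  # weather/instrument scatter

slope_per_year, _ = np.polyfit(years, forcing, 1)   # ordinary least squares
print(f"estimated trend: {slope_per_year * 10:.2f} W/m^2 per decade")
```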

Based on an analysis of data from the National Oceanic and Atmospheric Administration's CarbonTracker system, the scientists linked this upswing in CO2-attributed radiative forcing to fossil fuel emissions and fires.

The measurements also enabled the scientists to detect, for the first time, the influence of photosynthesis on the balance of energy at the surface. They found that CO2-attributed radiative forcing dipped in the spring as flourishing photosynthetic activity pulled more of the greenhouse gas from the air.
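A seasonal dip like the one attributed to photosynthesis can be made visible by averaging a near-daily record by calendar month. The sketch below does this on an invented series; it is a schematic of the idea only, not the study's method.

```python
# Rough illustration only: averaging a near-daily series by calendar month to
# expose a seasonal (springtime) dip. The series below is invented.
import numpy as np

dates = np.arange("2000-01-01", "2011-01-01", dtype="datetime64[D]")
months = dates.astype("datetime64[M]").astype(int) % 12 + 1      # 1..12
forcing = np.random.default_rng(1).normal(0.1, 0.2, dates.size)
forcing -= 0.05 * np.isin(months, (4, 5, 6))                     # notional growing-season dip

climatology = [forcing[months == m].mean() for m in range(1, 13)]
for m, value in enumerate(climatology, start=1):
    print(f"month {m:2d}: {value:+.3f}")
```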

The scientists used the National Energy Research Scientific Computing Center (NERSC), a DOE Office of Science User Facility located at Berkeley Lab, to conduct some of the research.

The research was supported by the Department of Energy's Office of Science.

The New York City Panel on Climate Change (NPCC) 2015, co-chaired by a NASA researcher, published its latest report, which details significant future increases in temperature, precipitation and sea level in the New York metropolitan area.

The report aims to increase current and future resiliency of the communities, citywide systems and infrastructure in the New York metropolitan region to a range of climate risks. Cynthia Rosenzweig of NASA's Goddard Institute for Space Studies (GISS), New York, co-chairs the New York City panel.

The NPCC was founded in 2008 to study the effects of climate change on New York City's five boroughs and surrounding region. As some of the leading Earth scientists in the metropolitan New York area, GISS researchers have been involved in the panel's work since its beginning. The GISS climate model was used in climate projections, and scientists at GISS led the technical team, which analyzed the scientific data and developed the projections.

"The NPCC is a prototype for how federal government scientists and municipal policymakers can work together," said Rosenzweig, who also is affiliated with the Center for Climate Systems Research at Columbia University's Earth Institute, New York. "This collaboration will help ensure that climate science developed for the New York metropolitan region informs and draws from the best available information, positioning residents and planners to confront expected future changes in the most effective way possible."

Increasing temperature and heavier precipitation events, along with sea level rise, are projected by the report to accelerate in the coming decades, increasing risks for the people, economy and infrastructure of New York City.

Specific report findings about local New York observations and projections include:


   --  Mean annual temperature has increased a total of 3.4 degrees Fahrenheit (F) from 1900 to 2013. Future mean annual temperatures are projected to increase 4.1 to 5.7 degrees F by the 2050s and 5.3 to 8.8 degrees F by the 2080s, relative to the 1980s base period. The frequency of heat waves is projected to increase from 2 per year in the 1980s to roughly 6 per year by the 2080s.
   --  Mean annual precipitation has increased by a total of 8 inches from 1900 to 2013. Future mean annual precipitation is projected to increase 4 to 11 percent by the 2050s and 5 to 13 percent by the 2080s, relative to the 1980s base period.
   --  Sea levels in New York City have risen 1.1 feet since 1900, almost twice the observed global rate of 0.5 to 0.7 inches per decade over a similar time period (a quick rate check follows this list). Projections for sea level rise in New York City are 11 to 21 inches by the 2050s, 18 to 39 inches by the 2080s, and 22 to 50 inches by 2100, with a worst case of up to six feet. Sea level rise projections are relative to the 2000 to 2004 base period.
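As a quick consistency check of the observed sea level rates quoted above (assuming the same 1900 to 2013 window used for the temperature and precipitation records), the arithmetic works out as follows:

```python
# Quick arithmetic check of the rates quoted above; the 1900-2013 window is an assumption.
nyc_rise_inches = 1.1 * 12                      # 1.1 feet since 1900, in inches
decades = (2013 - 1900) / 10
print(f"NYC: ~{nyc_rise_inches / decades:.1f} inches per decade "
      f"vs. a global rate of 0.5 to 0.7 inches per decade")
```

That works out to roughly 1.2 inches per decade, consistent with the report's statement that the local rise is almost twice the global rate.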

"Climate change research isn't just something for the future," said Rosenzweig. "It's affecting how key policy decisions are being made now. NASA is proud to work with New York City and other intergovernmental entities to provide world-class science."

The report also uses NASA Landsat 7 data to map the surface temperature of midtown Manhattan and show the cooling effect of Central Park. The Climate Impacts Group at GISS, led by Rosenzweig, provided technical support for the report.

NASA's Climate Adaptation Science Investigator (CASI) program evaluates the risks that climate change poses to NASA facilities. The GISS Climate Impacts Group is using processes and lessons learned during its work with the NPCC to support the CASI program, and CASI research focused on advancing key NASA products related to climate adaptation could also have future applications benefiting New York City. In addition, the proposed "Climate Change Resilience Indicators and Monitoring System" will use NASA observations and measurements to help the city manage climate risk.

GISS is a laboratory in the Earth Sciences Division of NASA's Goddard Space Flight Center, Greenbelt, Maryland, and is affiliated with the Earth Institute and School of Engineering and Applied Science at Columbia University.

NASA monitors Earth's vital signs from land, air and space with a fleet of satellites and ambitious airborne and ground-based observation campaigns. NASA develops new ways to observe and study Earth's interconnected natural systems with long-term data records and supercomputer analysis tools to better see how our planet is changing. The agency shares this unique knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.

For a copy of the NPCC's 2015 report, visit:
http://onlinelibrary.wiley.com/doi/10.1111/nyas.2015.1336.issue-1/issuetoc

For more information about NASA GISS, visit:
http://www.giss.nasa.gov/

For more information about NASA's Earth science activities, visit:
http://www.nasa.gov/earthrightnow

Inconsistencies may undermine models' reliability for projecting decade-to-decade warming and lead to misinterpretation of data

A new Duke University-led study finds that most climate models likely underestimate the degree of decade-to-decade variability occurring in mean surface temperatures as Earth's atmosphere warms. The models also provide inconsistent explanations of why this variability occurs in the first place.

These discrepancies may undermine the models' reliability for projecting the short-term pace as well as the extent of future warming, the study's authors warn, and they suggest that recent temperature trends should not be over-interpreted.

"The inconsistencies we found among the models are a reality check showing we may not know as much as we thought we did," said lead author Patrick T. Brown, a Ph.D. student in climatology at Duke's Nicholas School of the Environment.

"This doesn't mean greenhouse gases aren't causing Earth's atmosphere to warm up in the long run," Brown emphasized. "It just means the road to a warmer world may be bumpier and less predictable, with more decade-to-decade temperature wiggles than expected. If you're worried about climate change in 2100, don't over-interpret short-term trends. Don't assume that the reduced rate of global warming over the last 10 years foreshadows what the climate will be like in 50 or 100 years."

Brown and his colleagues published their findings this month in the peer-reviewed Journal of Geophysical Research, at http://onlinelibrary.wiley.com/enhanced/doi/10.1002/2014JD022576/.

To conduct their study, they analyzed 34 climate models used by the Intergovernmental Panel on Climate Change (IPCC) in its fifth and most recent assessment report, finalized last November.

The analysis found good consistency among the 34 models in explaining the causes of year-to-year temperature wiggles, Brown noted. The inconsistencies existed only in the models' ability to explain decade-to-decade variability, such as why global mean surface temperatures warmed quickly during the 1980s and 1990s but have remained relatively stable since then.
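One common way to quantify this kind of decade-to-decade variability, offered here only as a schematic and not as the study's method, is to remove the long-term trend from a global mean surface temperature series, average the residuals by decade, and look at the spread of those decadal means:

```python
# Schematic sketch of one simple decade-to-decade variability index,
# applied to an invented global-mean surface temperature (GMST) series.
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1900, 2014)
gmst = 0.008 * (years - years[0]) + rng.normal(0.0, 0.15, years.size)  # invented anomalies

# Remove the long-term (forced) trend, then average the residuals by decade.
residual = gmst - np.polyval(np.polyfit(years, gmst, 1), years)
decades = (years // 10) * 10
decadal_means = np.array([residual[decades == d].mean() for d in np.unique(decades)])

# The spread of the decadal means is one index of decade-to-decade variability.
print(f"decadal-mean standard deviation: {decadal_means.std(ddof=1):.3f} K")
```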

"When you look at the 34 models used in the IPCC report, many give different answers about what is causing this decade-to-decade variability," he said. "Some models point to the Pacific Decadal Oscillation as the cause. Other models point to other causes. It's hard to know which is right and which is wrong."

Hopefully, as the models become more sophisticated, they will coalesce around one answer, Brown said.

New data may make air-quality predictions as much as five times better than present data

Satellites planned for launch during the next several years may have an expanded role: forecasting air quality worldwide.

That's the view of University of Iowa researchers, who found that data gathered from geostationary satellites--satellites orbiting Earth about 22,000 miles above the equator and commonly used for telecommunications and weather imaging--can greatly improve air-quality forecasting.

The UI study, published in the journal Geophysical Research Letters, is timely because space agencies in North America, Europe, and East Asia plan to launch geostationary satellites in the near future to continuously monitor pollutants found in Earth's atmosphere.

Co-lead authors Gregory Carmichael, professor of chemical and biochemical engineering, director of the Iowa Informatics Initiative, and co-director of the UI Center for Global and Regional Environmental Research (CGRER), and Pablo Saide, postdoctoral researcher at CGRER, say the geostationary satellites will provide aerosol optical depth--a measure of the amount of smoke, dust, and other pollutants at different altitudes--every hour and at high resolution, giving a better picture of changing air quality on the ground.

The data will be combined with what's collected currently by low Earth-orbiting satellites to produce more detailed maps that experts can use to make improved air-quality forecasts.

The researchers combined observed data and model predictions to estimate the effect of adding geostationary satellite data to aerosol simulations. More specifically, they combined data from the existing Geostationary Ocean Color Imager instrument on board the Korean Communication, Ocean and Meteorological Satellite, which observes northeast Asia, with data from low Earth-orbiting satellites. The two-week study period covered air pollutants over northeast Asia, including desert dust, smoke from biomass burning, and various human-caused pollutants.
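Conceptually, adding an extra observation stream works like any variance-weighted assimilation update: each new retrieval pulls the model estimate toward the observations in proportion to how uncertain the model is. The sketch below is a deliberately simplified scalar version with invented numbers, not the modeling or assimilation system actually used in the study.

```python
# Simplified scalar sketch of a variance-weighted (Kalman-style) update that
# blends a model aerosol optical depth (AOD) estimate with satellite retrievals.
# All values and variable names are illustrative assumptions.
def assimilate(prior, prior_var, obs, obs_var):
    """Return the analysis value and variance after one scalar update."""
    gain = prior_var / (prior_var + obs_var)
    analysis = prior + gain * (obs - prior)
    analysis_var = (1.0 - gain) * prior_var
    return analysis, analysis_var

aod_model, var_model = 0.30, 0.04      # model first guess (tends to run low)
aod_leo,   var_leo   = 0.45, 0.02      # one low-Earth-orbit overpass per day
aod_geo,   var_geo   = 0.50, 0.02      # hourly geostationary retrieval

# Assimilate the LEO retrieval, then the geostationary one on top of it.
a1, v1 = assimilate(aod_model, var_model, aod_leo, var_leo)
a2, v2 = assimilate(a1, v1, aod_geo, var_geo)
print(f"model-only {aod_model:.2f} -> +LEO {a1:.2f} -> +GEO {a2:.2f}")
```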

While current models capture major pollution features, they tend to underestimate aerosol loads and surface concentrations. Adding geostationary satellite data improved the simulations. This was especially evident over the Korean peninsula, where predictions of surface particulate matter--when including geostationary data--were about five times better than those using only low Earth-orbiting satellite data.

Carmichael says, "This study shows that these new data streams have the potential to really improve our prediction skill, but more work is needed to improve the modeling and retrievals in order to be better prepared to utilize the data that will come from the new geostationary satellites as they come online."

The study supports the planned geostationary missions, adds Carmichael.

Increased supercomputing capacity will improve accuracy of weather forecasts

Today, NOAA announced the next phase in the agency’s efforts to increase supercomputing capacity to provide more timely, accurate, reliable, and detailed forecasts. By October 2015, the capacity of each of NOAA’s two operational supercomputers will jump to 2.5 petaflops, for a total of 5 petaflops – a nearly tenfold increase from the current capacity.

“NOAA is America’s environmental intelligence agency; we provide the information, data, and services communities need to become resilient to significant and severe weather, water, and climate events,” said Kathryn Sullivan, Ph.D., NOAA’s Administrator. “These supercomputing upgrades will significantly improve our ability to translate data into actionable information, which in turn will lead to more timely, accurate, and reliable forecasts.” 

Ahead of this upgrade, each of the two operational supercomputers will first more than triple their current capacity later this month (to at least 0.776 petaflops for a total capacity of 1.552 petaflops). With this larger capacity, NOAA’s National Weather Service in January will begin running an upgraded version of the Global Forecast System (GFS) with greater resolution that extends further out in time – the new GFS will increase resolution from 27km to 13km out to 10 days and 55km to 33km for 11 to 16 days. In addition, the Global Ensemble Forecast System (GEFS) will be upgraded by increasing the number of vertical levels from 42 to 64 and increasing the horizontal resolution from 55km to 27km out to eight days and 70km to 33km from days nine to 16. 
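A rough scaling argument, using assumed scaling rules rather than NOAA's actual benchmarks, shows why resolution upgrades like these demand a large jump in computing capacity: halving the horizontal grid spacing roughly quadruples the number of grid columns and typically shrinks the time step as well, and additional vertical levels multiply the cost further.

```python
# Back-of-the-envelope estimate (not NOAA's figures) of how the quoted model
# upgrades multiply the amount of computation per forecast.
def cost_factor(dx_old, dx_new, levels_old=1, levels_new=1):
    # Horizontal points scale roughly as (dx_old/dx_new)^2; the time step
    # usually shrinks with dx too, adding another factor of dx_old/dx_new.
    return (dx_old / dx_new) ** 3 * (levels_new / levels_old)

print(f"GFS 27 km -> 13 km: ~{cost_factor(27, 13):.0f}x")
print(f"GEFS 55 km -> 27 km, 42 -> 64 levels: ~{cost_factor(55, 27, 42, 64):.0f}x")
```

Under those assumptions the upgrades imply roughly an order-of-magnitude more computation, broadly consistent with the nearly tenfold capacity increase described above.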

Computing capacity upgrades scheduled for this month and later this year are part of ongoing computing and modeling upgrades that began in July 2013. NOAA’s National Weather Service has upgraded existing models – such as the Hurricane Weather Research and Forecasting model, which did exceptionally well this hurricane season, including for Hurricane Arthur, which struck North Carolina. And NOAA’s National Weather Service has operationalized the widely acclaimed High-Resolution Rapid Refresh model, which delivers 15-hour numerical forecasts every hour of the day.

“We continue to make significant, critical investments in our supercomputers and observational platforms,” said Louis Uccellini, Ph.D., director, NOAA’s National Weather Service. “By increasing our overall capacity, we’ll be able to process quadrillions of calculations per second that all feed into our forecasts and predictions. This boost in processing power is essential as we work to improve our numerical prediction models for more accurate and consistent forecasts required to build a Weather Ready Nation.”

The increase in supercomputing capacity comes via a $44.5 million investment using NOAA's operational high performance computing contract with IBM, $25 million of which was provided through the Disaster Relief Appropriations Act of 2013 related to the consequences of Hurricane Sandy. Cray Inc., headquartered in Seattle, plans to serve as a subcontractor for IBM to provide the new systems to NOAA.

“We are excited to provide NOAA’s National Weather Service with advanced supercomputing capabilities for running operational weather forecasts with greater detail and precision,” said Peter Ungaro, president and CEO of Cray. “This investment to increase their supercomputing capacity will allow the National Weather Service to both augment current capabilities and run more advanced models. We are honored these forecasts will be prepared using Cray supercomputers.”

"As a valued provider to NOAA since 2000, IBM is proud to continue helping NOAA achieve its vital mission," said Anne Altman, General Manager, IBM Federal. "These capabilities enable NOAA experts and researchers to make forecasts that help inform and protect citizens. We are pleased to partner in NOAA's ongoing transformation."

 
