A gas and particle rich plume emanates from molten lava beneath Halemaʻumaʻu Crater on the Island of Hawaiʻi. The plume reacts and converts in the atmosphere, forming the acidic volcanic pollution locally known as “vog.” (photo credit: Michael Poland, USGS)

A paper published this month by University of Hawaiʻi and Hawaiian Volcano Observatory researchers in the Bulletin of the American Meteorological Society details the development and utility of a supercomputer model for the dispersion of volcanic smog, or “vog,” which forms when volcanic sulfur dioxide gas interacts with water in the atmosphere and converts to acid sulfate aerosol particles.

Vog poses a serious threat to the health of Hawaiʻi’s people and is also harmful to the state’s ecosystems and agriculture. Even at the low concentrations found far from the volcano, vog can provoke asthma attacks in people with pre-existing respiratory conditions. It also damages vegetation and crops downwind of the volcano.

New tools for predicting vog

Scientists from the UH Mānoa School of Ocean and Earth Science and Technology (SOEST), under the leadership of Professor of Meteorology Steve Businger, and in collaboration with researchers at the Hawaiian Volcano Observatory, developed a supercomputer model for predicting the dispersion of vog. The vog model uses measurements of the amount of sulfur dioxide (SO2) emitted by Kīlauea, along with predictions of the prevailing winds, to forecast the movement of vog around the state.
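
The paper details the model itself; purely as a rough illustration of why both inputs matter, a textbook Gaussian plume formula shows how an SO2 emission rate is carried downwind and diluted by the wind. The source values and dispersion coefficients below are illustrative assumptions, not the operational model's:

```python
import math

def plume_concentration(q, u, x, y, z, h):
    """Steady-state Gaussian plume concentration (g/m^3).

    q: SO2 emission rate (g/s); u: wind speed (m/s); x, y, z: downwind,
    crosswind and vertical distances (m); h: effective vent height (m).
    The power-law spread coefficients are illustrative, not operational.
    """
    sigma_y = 0.08 * x ** 0.9   # crosswind plume spread (m), assumed
    sigma_z = 0.06 * x ** 0.9   # vertical plume spread (m), assumed
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    # The second exponential reflects the plume off the ground surface
    vertical = (math.exp(-(z - h) ** 2 / (2 * sigma_z ** 2))
                + math.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centerline concentration 5 km and 20 km downwind of the vent
near = plume_concentration(q=3000.0, u=5.0, x=5000.0, y=0.0, z=0.0, h=100.0)
far = plume_concentration(q=3000.0, u=5.0, x=20000.0, y=0.0, z=0.0, h=100.0)
```

Doubling the emission rate doubles the predicted concentration everywhere, and a faster wind dilutes the plume while carrying it farther; the operational forecast additionally uses three-dimensional forecast winds and the conversion of SO2 gas to aerosol described above.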

The team of scientists developed an ultraviolet spectrometer array to provide near-real-time volcanic gas emission rate measurements; developed and deployed SO2 and meteorological sensors to record the extent of Kīlauea’s gas plume (for model verification); and developed web-based tools to share observations and model forecasts, providing useful information for safety officials and the public and raising awareness of the potential hazards of volcanic emissions to respiratory health, agriculture and general aviation.

“Comparisons between the model output and vog observations show what users of the vog model forecasts have already guessed—that online model data and maps depicting the future location and dispersion of the vog plume over time are sufficiently accurate to provide very useful guidance, especially to those who suffer allergies or respiratory conditions that make them sensitive to vog,” said Businger.

A statewide concern

Kīlauea volcano, one of the most active volcanoes on Earth, is situated in the populous State of Hawaiʻi. Its current eruption has been ongoing since 1983, and a new summit eruption began in 2008.

The most significant effect of this new eruption has been a dramatic increase in the amount of volcanic gas emitted into Hawaiʻi’s atmosphere. While the effects of the lava eruption are limited to the southeastern sector of the Big Island, the volcanic gas emitted by Kīlauea is in no way constrained; it is free to spread across the entire state.

Protea plants damaged by vog have cost farmers in Ocean View, on average, 40 percent of their household income. (photo credit: Chris Stewart, the Chronicle)

“Higher gas fluxes from Kīlauea appear to be the new norm. For the State of Hawaiʻi to understand the effects of vog and then come up with strategies to efficiently mitigate its effects, accurate forecasts of how vog moves around the state are vital,” said Businger.

The American Recovery and Reinvestment Act award that originally funded the development of the vog model program has long since expired. Funding for a PhD candidate, Andre Pattantyus, to help keep the online vog products available has been provided by SOEST and the Joint Institute for Marine and Atmospheric Research.

Because Pattantyus, the lead vog modeler, is set to graduate this winter, the vog program is at a crossroads. Businger is working with stakeholders, including federal, state, commercial and private interests, to jointly fund an ongoing vog dispersion modeling capability for the residents of Hawaiʻi.

Public support of the vog modeling program is critical for the program to continue providing vog plume predictions in the future.

A new, low-cost experimental device for sea level measurement has been developed by European Commission Joint Research Centre (JRC) scientists. The first four devices have been installed in Spain and Portugal, in collaboration with the European Commission's Directorate-General for Humanitarian Aid and Civil Protection and the UNESCO Intergovernmental Oceanographic Commission (IOC). An additional 16 devices will be installed in the Mediterranean and will contribute directly to improving tsunami monitoring activities.

Tsunami early warning systems are based on observation networks of seismometers and sea level measuring stations, which send real-time data to national and regional tsunami warning centres (TWCs). Based on these observations, TWCs can confirm or cancel a tsunami watch or warning.

The presence of devices for sea level measurement in many different locations along the coast is crucial for effective tsunami monitoring, but so far the high installation and maintenance costs of these devices have prevented their large-scale use.

The JRC has developed the Inexpensive Device for Sea Level Measurement (IDSL), a novel mareograph system to measure the sea level in real time. Compared to existing similar devices, the IDSL has a very low cost while still providing high-quality measurements within a few seconds. Sea level is measured every five seconds and the data are transmitted to JRC servers over GPRS mobile networks, making them immediately available for analysis. A website is available to visualise, analyse and download the data as soon as they are stored.
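
The article does not describe how the 5-second stream is analysed; purely as a sketch, a running-baseline filter of the kind a warning centre might apply could flag tsunami-like departures in the readings. The window length and threshold below are hypothetical values for illustration:

```python
from collections import deque
from statistics import mean

def detect_anomaly(samples, window=60, threshold=0.3):
    """Return indices of readings that deviate from a running baseline.

    samples: sea level in metres, one reading every 5 seconds.
    window: readings in the baseline (60 readings = 5 minutes).
    threshold: departure in metres that triggers a flag (hypothetical).
    """
    baseline = deque(maxlen=window)
    flagged = []
    for i, level in enumerate(samples):
        if len(baseline) == window and abs(level - mean(baseline)) > threshold:
            flagged.append(i)
        baseline.append(level)
    return flagged

# Calm synthetic sea with a sudden 0.5 m rise starting at reading 80
series = [0.0] * 80 + [0.5] * 20
print(detect_anomaly(series))   # flags readings 80 through 99
```

Real processing must additionally handle tides, wind waves and sensor dropouts; the sketch only shows why a dense, fast-reporting network makes real-time detection possible.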

The first four devices were installed in October on the Spanish coast in Cartagena and Cadiz and in Albufeira and Sagres in Portugal. Another 16 devices will be installed in the near future in Greece, Italy, Lebanon, Morocco, Romania, Tunisia and Turkey in order to further test the IDSL reliability, duration and quality. 

This project contributes to the activities of the Tsunami Early Warning and Mitigation System in the North-eastern Atlantic, the Mediterranean and connected seas (NEAMTWS) set up by the UNESCO Intergovernmental Oceanographic Commission (IOC). By the end of the installation period, the tsunami warning centres will have at their disposal devices allowing real-time analysis of tsunami events, contributing to a major improvement in their monitoring capacity.

Current flood models do not account for cities' impact on local rainfall patterns, an oversight that could lead to significantly underestimating the severity and frequency of floods in urban areas, a Purdue study finds.

Dev Niyogi, Indiana's state climatologist, and collaborators at China's Tsinghua University showed that hard, impenetrable city surfaces such as concrete can dramatically influence the way rainfall spreads across a watershed.

Flood models that do not incorporate the ways cities modify rainfall patterns could underestimate the magnitude of future floods by as much as 50 percent, said Niyogi, who is also a professor of agronomy and earth, atmospheric and planetary sciences.

"Over the last two decades, we've seen an increase in major flooding in urban areas," he said. "Many factors are contributing to that change, including extreme weather, climate change and climate variability. But evidence is also emerging that cities themselves are significantly and detectably changing the rainfall patterns around them."

In rural areas, rainfall seeps gradually into the soil, nourishing plants, flowing into streams and lakes and replenishing underground aquifers. But when a region is urbanized - topped with impervious surfaces such as asphalt - the way water moves over land changes significantly. 

Paved roads, sidewalks and roofs can hinder rainwater's ability to filter into the ground, often creating powerful streams by linking water flow paths and sending runoff, along with the pollutants it picks up along the way, surging into drains and waterways. This combination of city-altered rainfall patterns and the urban drainage network can modify the strength, timing and duration of floods in cities and damage local water quality.

Previous research by Long Yang, a postdoctoral research associate at Tsinghua University and first author of the study, suggested that covering 5 percent of a watershed with impervious surfaces is the threshold at which local hydrology - the science of how water moves - begins to change. These effects are magnified with increases in impervious surfaces.

Hydrologic engineers make projections about future floods with a linear model that assumes flood magnitude increases at the same rate as impervious coverage and rainfall. But Niyogi and Yang think this model does not tell the whole story: it treats cities as isolated systems that respond to changes in rainfall but do not affect outside systems such as the atmosphere.
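
As a toy illustration of the difference (the coefficients are hypothetical, not taken from the study), compare a purely linear projection with one in which impervious cover also feeds back on rainfall:

```python
def linear_flood(rain, impervious):
    """Flood index that scales proportionally with rain and impervious cover."""
    return rain * (1.0 + impervious)

def coupled_flood(rain, impervious, feedback=0.5):
    """Same, but impervious cover also intensifies local rainfall.
    The feedback coefficient is hypothetical, for illustration only."""
    amplified_rain = rain * (1.0 + feedback * impervious)
    return amplified_rain * (1.0 + impervious)

rain, impervious = 100.0, 0.3    # mm of rain, 30 percent impervious cover
print(round(linear_flood(rain, impervious), 1))    # 130.0
print(round(coupled_flood(rain, impervious), 1))   # 149.5
```

With 30 percent impervious cover, the feedback version produces a flood index about 15 percent higher than the linear one; ignoring the feedback therefore biases the projection low, which is the direction of error the study describes.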

According to Niyogi, growing evidence shows that cities measurably shift rainfall patterns by sending signals to the atmosphere that can alter how rain accumulates, its intensity, the frequency of downpours and the dynamics of how storms evolve.

By omitting this complex back-and-forth between cities and local rain patterns, the linear model could greatly underestimate flood changes and undermine the effectiveness of flood-control designs, he said.

"Models need to move beyond the traditional approach to consider the feedbacks cities have on local rainfall," he said.

The researchers used a grid-based watershed model to test how urbanized regions surrounded by rural areas, such as grassland, crops and forests, interact with rain and affect how it is spread across a watershed. Hydrology in these "mixed" watersheds can be much more varied than in completely urbanized or undeveloped areas, Yang said. To mimic real rainfall variability, they used a 30-year dataset of daily rainfall typical of an urbanized watershed.

Simulations showed that a region's hydrological response depends on two factors - the amount of impervious coverage and the spatial pattern of rainfall, for example, whether rain falls across the watershed in a random way or is more centralized over impervious areas.

The most notable changes in flood magnitude occurred when the watershed was moderately urbanized, that is, 20-30 percent covered with impervious surfaces. The results indicate that the risk of underestimating flood changes is larger for moderately urbanized watersheds than completely urbanized or undeveloped areas, Yang said.

"Each city is unique," Niyogi said. "There is no single, simple recipe for what urban areas can do to modify rainfall potential and flood risks. Engineering will have to evolve to include multiple factors and feedbacks. Here we've shown that cities create their own rainfall patterns that need to be considered to mitigate flood hazards."

Fuqiang Tian of Tsinghua University also co-authored the paper.

The paper was published in Urban Climate and is available to journal subscribers and campus readers at http://www.sciencedirect.com/science/article/pii/S2212095515000073

The National Natural Science Foundation of China, the Ministry of Science and Technology of China and the U.S. National Science Foundation provided funding for the study. Researchers received support from the Foundation of State Key Laboratory of Hydro-Science and Engineering of Tsinghua University, the NSF's CAREER and Computational and Data-Enabled Science Strong City grants and the U.S. Department of Agriculture's National Institute of Food and Agriculture Drought Trigger project.

A new study by a University of York scientist confirms that historical temperature data records constitute reliable evidence of climate change.

Dr Kevin Cowtan, a computational scientist, carried out a new analysis of the data following suggestions that it was unreliable and that the rate of global warming was exaggerated as a result.

Previous research by Dr Cowtan helped to address a claim that global warming had "paused" and clarified the level of agreement between climate models and observations.

In the new study, he sets out to reproduce the science behind the calibration of old temperature records, known as "homogenization".  Dr Cowtan presents not only the results but also a step-by-step description of the methods he has employed to test the data. He concludes that the science is sound, and suggests that it is straightforward enough for many amateur computational scientists to reproduce.

Dr Cowtan, of the York Structural Biology Laboratory in the University’s Department of Chemistry, says: "It's a fun problem.  And it's simple enough that a citizen scientist with only modest computational skills can take it on.”

The report includes a range of tests for bias in the calibration process which supported the original results. Dr Cowtan goes further, however, and demonstrates how it is possible to write a rudimentary temperature homogenization program from scratch.  He shows that the increase in warming in the calibrated data is attributable to the data itself rather than the software.
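
As an illustration of the underlying idea (a bare-bones sketch, not Dr Cowtan's actual program), a single non-climatic break in a station record can be located and corrected by comparing the station with a nearby neighbour:

```python
def homogenize(station, neighbor):
    """Remove the largest step change in `station` using a neighbour record.

    A jump in the station-minus-neighbour difference series marks a
    non-climatic shift (an instrument change or a station move). The
    segment after the jump is shifted so both sides of the difference
    series agree. A bare-bones sketch, not an operational algorithm.
    """
    diff = [s - n for s, n in zip(station, neighbor)]
    best_k, best_jump = 0, 0.0
    for k in range(1, len(diff)):
        jump = abs(sum(diff[k:]) / len(diff[k:]) - sum(diff[:k]) / k)
        if jump > best_jump:
            best_k, best_jump = k, jump
    if best_k == 0:             # no detectable break
        return list(station)
    shift = sum(diff[best_k:]) / len(diff[best_k:]) - sum(diff[:best_k]) / best_k
    return station[:best_k] + [s - shift for s in station[best_k:]]

# Two stations sharing the same climate; one has a +1.0 C jump at index 4
neighbor = [15.0, 15.1, 15.2, 15.3, 15.4, 15.5, 15.6, 15.7]
station = [15.0, 15.1, 15.2, 15.3, 16.4, 16.5, 16.6, 16.7]
print(homogenize(station, neighbor))
```

In the example, the correction recovers the neighbour's smooth trend from a record contaminated by a 1.0 C instrument jump: the adjustment comes from the data, not from anything the software assumes about warming.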

"Reproducibility is key to science.  I've shown what can be done, but I could only take it so far.  I'm hoping that others will not only reproduce what I have done, but build on it," he adds

On the left, La Nina cools off the ocean surface (greens and blues) in the winter of 1988. On the right, El Nino warms it up (oranges and reds) in the winter of 1997.

El Nino and global warming work together to bring more extreme weather

In the future, the Pacific Ocean's temperature cycles could disrupt more than just December fishing. A study published in Nature Communications suggests that the weather patterns known as El Nino and La Nina could lead to at least a doubling of extreme droughts and floods in California later this century.

The study shows more frequent extreme events are likely to occur. Other research shows the Golden State's average precipitation increasing gradually, but not enough to account for the occurrence of extreme events. A better understanding of what gives rise to El Nino and La Nina cycles — together known as El Nino-Southern Oscillation — might help California predict and prepare for more frequent droughts and floods in the coming century.

"Wet and dry years in California are linked to El Nino and La Nina. That relationship is getting stronger," said atmospheric scientist Jin-Ho Yoon of the Department of Energy's http://www.pnnl.gov Pacific Northwest National Laboratory. "Our study shows that ENSO will be exhibiting increasing control over California weather."

Rain's range

California is experiencing one of the most severe droughts in its history, but it's not clear if a warmer world will make droughts worse, more frequent or perhaps even improve the situation. After all, warmer air can hold more water, and some research suggests global warming could increase California's average rain and snowfall.

However, research also suggests future rain will come down more as light drizzles and heavy deluges and less as moderate rainfall. Yoon and colleagues from PNNL and Utah State University in Logan, Utah, wondered if droughts might follow a similar pattern.

To find out, the researchers looked at what happens to California in global climate models. They simulated two periods of time: 1920 to 2005 using historical measurements; and 2006 to 2080 using conditions in which very few efforts are made to reduce greenhouse gas emissions. They chose this future scenario to examine the most extreme case.

To gauge how robust the supercomputing simulations were, they used two tactics to test reproducibility: in one, they used a compilation of 38 different models; in the other, they re-ran a single model 30 times. The more similar the results, the more confident the researchers were in the finding.
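
The single-model tactic exploits the fact that runs started from slightly different initial conditions share the same forced signal but different random weather, so averaging across runs isolates the signal. A minimal sketch (all numbers illustrative, not from the study):

```python
import random

def ensemble_mean_trend(n_runs, n_years, trend, noise, seed=0):
    """Average the fitted per-year trend over an initial-condition ensemble.

    Every run shares the same forced trend but gets its own random
    "weather" (Gaussian noise). All values are illustrative units.
    """
    rng = random.Random(seed)
    fitted = []
    for _ in range(n_runs):
        series = [trend * yr + rng.gauss(0.0, noise) for yr in range(n_years)]
        xm = (n_years - 1) / 2                   # mean of the year index
        ym = sum(series) / n_years
        slope = (sum((x - xm) * (y - ym) for x, y in enumerate(series))
                 / sum((x - xm) ** 2 for x in range(n_years)))
        fitted.append(slope)                     # least-squares slope of one run
    return sum(fitted) / n_runs

# A 30-member ensemble with a forced trend of 0.02 units per year
print(ensemble_mean_trend(n_runs=30, n_years=75, trend=0.02, noise=0.5))
```

With 30 members, the ensemble-mean trend lands close to the imposed 0.02 units per year even though individual runs scatter around it; agreement between this tactic and the 38-model compilation is what gives the researchers confidence in the result.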

Weather pendulum

The models showed that in the future, assuming emissions continue to increase, California seasons will exhibit more excessively wet and excessively dry events. These results suggest that the frequency of droughts could double and floods could triple between the early 20th century and late 21st century.

"By 2100, we see more — and more extreme — events. Flooding and droughts will be more severe than they are currently," said Yoon.

But why? Yoon suspected the El Nino-Southern Oscillation. Every two to seven years, El Nino comes in and warms up the tropical Pacific Ocean a few degrees, increasing winter rain and snowpack in California. On a similar schedule, La Nina cools things off. Both disrupt regular weather in many regions around the globe.

To explore El Nino's connection to California precipitation, Yoon and colleagues ran a climate model with and without El Nino. In both simulations, they ramped up the concentration of carbon dioxide by 1 percent every year for 150 years. In just one of the runs, they removed El Nino's cyclical contribution by programming the sea surface temperatures to reflect only steady warming.
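
A 1 percent per year compounding increase is a steep ramp: it doubles the CO2 concentration in roughly 70 years and more than quadruples it over the 150-year run, as a small calculation confirms:

```python
co2 = 1.0                   # CO2 concentration relative to the starting year
for year in range(150):
    co2 *= 1.01             # 1 percent increase per year

print(round(co2, 2))        # about 4.45 times the starting concentration
```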

Without El Nino and La Nina, the frequency of extreme precipitation in California stayed constant for the simulation's century and a half. With ENSO, simulated California experienced wide swings in rainfall by the end of the period.

The results suggest that even though researchers expect rain and snowfall to increase as the climate warms, the manner in which the water hits California could be highly variable.

The El Nino-Southern Oscillation is still a bit of a mystery, said Yoon. Scientists can tell an El Nino or La Nina year, named for the Spanish terms for boy and girl, is coming only from sea surface temperatures and other weather hints. Studies that investigate what controls the unruly children could help scientists predict unruly weather in the future.
