Existing NWS forecast graphics (top) emphasize the probability of rain and other weather events. The redesigned graphics (bottom) contain more information about the timing and potential severity of forecast weather systems. These examples are from the June 26 forecast for Bridgeport, Ohio. To view local forecasts for other areas, visit weather.gov.

New online graphics being rolled out this summer by the National Weather Service (NWS) are based on research by a team of risk communication experts at the National Center for Atmospheric Research (NCAR) who focused on how to better convey forecast information visually.

Beginning July 7 the NWS will use the redesigned icons for all weather.gov local forecasts. These point-and-click forecasts influence weather-related decisions by people across the country, drawing 2 to 4 million unique web views daily.

The new icons will feature split images and color-coded boxes to better communicate the existence, timing, and potential severity of upcoming weather threats. For example, instead of portraying a night as entirely rainy, a split image could show a 30 percent chance of rain in the early part of the evening, followed by partial clearing after midnight. Colored rectangles drawn around the images will also be used to call attention to weather threats, with yellow denoting a watch, orange denoting an advisory, and red denoting a warning.
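As a rough sketch (the data structures and function below are hypothetical, not the NWS's actual code; only the color scheme and split-period example come from the description above), the icon logic amounts to a simple mapping from alert type to border color, plus one entry per sub-period of the forecast:

```python
# Illustrative sketch of the color-coded alert borders described above.
# The mapping (yellow = watch, orange = advisory, red = warning) follows
# the article; everything else here is a hypothetical placeholder.

ALERT_COLORS = {
    "watch": "yellow",
    "advisory": "orange",
    "warning": "red",
}

def border_color(alert_type):
    """Return the border color for a hazard alert, or None if no alert."""
    return ALERT_COLORS.get(alert_type)

# A split icon for one forecast period: two sub-periods, each with its
# own conditions and probability of precipitation.
evening_forecast = [
    {"period": "early evening", "condition": "rain", "pop": 30},
    {"period": "after midnight", "condition": "partly clear", "pop": 10},
]

print(border_color("warning"))  # red
```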

The changes are based on several years of work by a multidisciplinary team at NCAR that worked closely with the NWS, an agency within the National Oceanic and Atmospheric Administration. The team developed prototype graphics and surveyed tens of thousands of weather.gov local forecast users, finding that specific improvements to the graphics could greatly improve public understanding of the forecasts.

“We want to help people better understand when there’s a major weather threat and when it’s likely to occur,” said NCAR’s Julie Demuth, a researcher who specializes in communicating weather risks to the public. “The main goal is to better convey information that's critical for protecting lives and protecting property.”

Eli Jacks, acting chief of the NWS’s Forecast Services Division, said the changes are an important step toward helping people make better use of NWS forecasts.

“Just putting out the forecast is no longer enough,” he said. “This helps users interpret the forecast more easily and use it to make informed decisions.”

NWS director Louis Uccellini said the new project demonstrates the importance of research in communicating forecasts to the public.

“Research provides the essential backbone to any effective operational product used by the National Weather Service to protect lives and property,” Uccellini said. “Not only must we have advancements in atmospheric science to improve forecasts and services, we must also infuse social science to ensure we are communicating forecast information in a way that is clear and understandable so people can take appropriate action.”

The work was primarily funded by the NWS, with additional support from NOAA and the National Science Foundation, which sponsors NCAR.


Demuth and her colleagues began looking into the issue several years ago when the NWS wanted to determine if the icons could communicate weather information more clearly. An icon depicting rain, for example, might prompt people to cancel outdoor plans even if there was just a 10 percent chance of showers for a few hours.

The research team, working with the NWS, focused on better transmitting two aspects of a forecast that are crucial for helping people understand their risk: the existence of a hazardous weather threat and the timing of that threat. They developed experimental graphics and text with the goal of conveying sometimes complex information in a way that is easy to interpret.

“This is supposed to be a forecast at a glance, so we adapted the approach using NCAR’s work as a basis,” Jacks said. “Ideally a user will look at it and get a pretty good idea of what is expected.”

Demuth and her colleagues conducted two rounds of surveys with more than 13,000 users of NWS forecasts, asking them to evaluate candidate designs to see which best communicated a severe thunderstorm warning and a flood watch.

The surveys showed that users were significantly better at identifying the timing and nature of the weather threat when the information was presented with new graphics and reinforced with wording that stated the start and stop times of the threat. For example, more than 97 percent of survey respondents correctly identified the start and end time of the flood watch with the revamped graphics and text, compared to less than 4 percent in a control group using the existing NWS presentation.

The research team summarized its findings in a 2013 paper in Weather and Forecasting, an American Meteorological Society journal.

Building on this work, the NWS further refined the icons and began to test them last year, asking for input from thousands of users. The agency is now previewing the new graphical approach on its website as it prepares to launch the new version across the nation next month.

“This work illustrates the importance of evaluating how people interpret information about risks, particularly when it’s a matter of public safety,” Demuth said. “Making improvements in how we communicate hazardous weather information—even seemingly small improvements—can translate into very large benefits for society.”

“It’s gratifying to be part of a collaboration where NCAR research is being used to help the National Weather Service alert people across the U.S. about potentially dangerous weather," said Thomas Bogdan, president of the University Corporation for Atmospheric Research, which manages NCAR. "This is a great example of how investments in science lead to substantial benefits for society.”

Artist’s impression of a Young Earth (credit W. Henning)

To sort out the biological intricacies of Earth-like planets, astronomers have developed supercomputer models that examine how ultraviolet radiation from a planet's host star may affect that world, according to new research published June 10 in the Astrophysical Journal.

“Depending on the intensity, ultraviolet radiation can be both useful and harmful to the origin of life,” says Lisa Kaltenegger, Cornell associate professor of astronomy and the director of Cornell’s new Carl Sagan Institute: Pale Blue Dot and Beyond. “We are trying to ascertain how much radiation other young Earths would get and what that could mean for the possibility for life.”

The study, “UV Surface Environment of Earth-like Planets Orbiting FGKM Stars Through Geological Evolution,” was prepared by lead author Sarah Rugheimer, Cornell research associate at the Carl Sagan Institute; Antígona Segura of the Universidad Nacional Autónoma de México; Dimitar Sasselov of the Harvard-Smithsonian Center for Astrophysics; and Kaltenegger.

“We’re going to see all kinds of planets in all kinds of stages in their own evolution, but we wanted to take four kinds of epochs from Earth history, as samples of what we might see,” said Rugheimer. “With the next generation of missions, we expect to observe a wide diversity of extrasolar planets.”

Reaching deep into Earth’s history, Rugheimer and co-authors modeled the first epoch as a pre-biotic world with a carbon dioxide-dominated atmosphere, similar to early Earth 3.9 billion years ago. The second epoch – about 2 billion years ago – saw the first whiff of oxygen, an active biosphere and the onset of photosynthesis. Oxygen started to rise, driven by the first cyanobacteria, to 1 percent of its current concentration.

“It’s not just the amount of ultraviolet radiation, but also the specific types of ultraviolet radiation which will impact biology,” Rugheimer said. “We consider which wavelengths are most damaging to DNA and other biomolecules in addition to just looking at the total amount of radiation.”

Multicellular life started about 800 million years ago, and the group modeled a third epoch in which oxygen rose to about 10 percent of today’s levels. The fourth epoch corresponds to modern Earth, with a carbon dioxide mixing ratio of about 355 parts per million and current oxygen levels.

The researchers noted that for all epochs after the rise of oxygen the hottest and coolest stars have less biologically effective radiation. For the hottest stars, this is due to increased ozone shielding from higher UV environments, and for the coolest stars this is due to less absolute UV flux.

Rugheimer, who conducted the research while she was a doctoral student at Harvard University, explained that astrobiology draws researchers across disciplines and noted: “This work provides a link from the astrophysical conditions we expect to find on other planets to the origin-of-life experiments happening here on Earth.”

The Simons Foundation provided funding for this research.

Spanish scientists have developed a new piece of software to predict the source of faecal pollution in seas, reservoirs and rivers. The system, called Ichnaea, uses machine learning and the analysis of various biological indicators to make highly reliable predictions about this type of pollution, which poses a serious health risk. The team is now looking for funding to move the whole application to the cloud.

Faecal pollution is increasingly common in rivers and water reserves. Concentrated urban populations increase the demand for water and also generate a high volume of waste water from both humans and animals. For this reason, a multidisciplinary team of Spanish biologists and IT experts has developed a new system that helps to identify the source of the faecal pollution present in the water.

"Identifying the species to which the traces belong would help in resolving conflicts about who is responsible for the faecal pollution of a river: a farm, an abattoir, a sewage treatment plant or a human population nucleus, for example," as Anicet R. Blanch tells SINC. Blanch is a microbiologist at the University of Barcelona and co-director of the project together with Lluis Belanche, from the Polytechnic University of Catalonia.

Microbial, chemical or eukaryotic indicators

According to Blanch, the new piece of software, which they have called Ichnaea (ancient Greek for 'tracker'), "is based on the development of prediction models from the analysis of a series of microbial, chemical or eukaryotic indicators. This information allows the source of the sample to be determined, even in complex cases in which the faecal pollution is very diluted or deteriorated."

To produce these predictions, the system must first be given the analysis results for several parameters from other water samples with a single known source of faecal pollution. "Using this data, the software determines the relevant indicators which, when analysed in water samples with an unknown source of faecal pollution, allow its source to be determined," he adds.

"Up until now, each research group proposed the indicators which they believed to be the most important, but Ichnaea eliminates that subjectivity by selecting, from among the many candidate parameters, those most essential for a reliable prediction," the researcher explains.
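As a rough sketch of this kind of source prediction (the indicator names, measurements and the nearest-centroid rule below are invented for illustration; the article does not describe Ichnaea's actual algorithms), a system can learn an average indicator profile for each known source and then assign an unknown sample to the closest profile:

```python
import math

# Hypothetical training data: indicator measurements (e.g. log concentrations
# of host-associated bacteriophages and bifidobacteria) for water samples
# whose faecal-pollution source is already known.
training = [
    ({"phage_human": 4.1, "bifido_human": 3.8, "phage_cattle": 0.2}, "human"),
    ({"phage_human": 3.9, "bifido_human": 4.0, "phage_cattle": 0.1}, "human"),
    ({"phage_human": 0.3, "bifido_human": 0.4, "phage_cattle": 4.2}, "cattle"),
    ({"phage_human": 0.5, "bifido_human": 0.2, "phage_cattle": 3.9}, "cattle"),
]

def centroids(samples):
    """Average indicator profile per source class."""
    sums, counts = {}, {}
    for features, label in samples:
        counts[label] = counts.get(label, 0) + 1
        acc = sums.setdefault(label, dict.fromkeys(features, 0.0))
        for key, value in features.items():
            acc[key] += value
    return {label: {k: v / counts[label] for k, v in acc.items()}
            for label, acc in sums.items()}

def classify(sample, cents):
    """Assign an unknown sample to the nearest class centroid."""
    def dist(a, b):
        return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))
    return min(cents, key=lambda label: dist(sample, cents[label]))

cents = centroids(training)
unknown = {"phage_human": 3.7, "bifido_human": 3.5, "phage_cattle": 0.6}
print(classify(unknown, cents))  # human
```

A real system would also weigh which indicators carry the most information, which is the "eliminating subjectivity" step described above.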

Reliability of the prediction

Some of these relevant indicators are microbial parameters, such as bacteriophages (viruses which infect bacteria) linked to a single host species. Indicators related to bifidobacteria (a group of bacteria that live in the intestine) or to the mitochondrial DNA of the source species also frequently prove relevant, the expert indicates.

The prediction's degree of reliability depends on the number and quality of the samples used to train the machine learning models, on the parameters that have been analysed and their relevance (given that the presence of certain bacteria varies with geographical location), and on the freshness and degree of dilution of the samples to be analysed.

The scientists tested the effectiveness of the system at three sites with different degrees of faecal pollution of human and animal (cattle, pig or poultry) origin. The predictions ascertained the source in areas with high and medium dilution; where the dilution was low, in irrigation channels of the Ebro Delta, a high probability range was established.

Analysis from the cloud

This software is currently still in its prototype stage, as its components have been developed separately. The researchers are now looking for funding which will allow them to refine and integrate the different modules into a single platform for calculation. They plan to move the whole application to the cloud.

"The idea is that anyone can access the system from any computer, even a tablet, because the calculation is done from a remote machine," explains Blanch.

According to the team's plans, the user will be able to adapt the prediction to their geographical location with a customised setting. This will allow them to provide the software with their own samples of known origin, to complete the learning stage which will subsequently allow the predictions to be made.

In the event that a user does not have their own samples for this training, they could use the system's database, which will include results left with open access by other scientists. In this way, they will be able to choose the analysis of the closest or most similar geographical areas to that which is to be studied.

Blanch considers that knowing the source of the pollution is also important from a health-risk perspective, "given that human pathogens present in water are significantly more contagious than those of animal origin," concludes the scientist.

How light of different colours is absorbed by carbon dioxide (CO2) can now be accurately predicted using new calculations developed by a UCL-led team of scientists. This will help climate scientists studying Earth's greenhouse gas emissions to better interpret data collected from satellites and ground stations measuring CO2.

By improving the understanding of how much radiation CO2 absorbs, uncertainties in modeling climate change will be reduced and more accurate predictions can be made about how much Earth is likely to warm over the next few decades.

Previous methods were only accurate to about 5% at best across all wavelengths, whereas the new calculations give an accuracy of 0.3%. This improvement will enable satellite missions to achieve their goals, which demand an accuracy of 0.3-0.5%, say the team of scientists.

The study, published today in Physical Review Letters by researchers from UCL, the Russian Academy of Sciences (Russia), National Institute of Standards and Technology (USA) and Nicolaus Copernicus University (Poland), shows how the fundamental laws of quantum mechanics can be used to predict precisely how light of different colours is absorbed by CO2. This will help climate scientists work out how CO2 evolves in the atmosphere and pinpoint where it is being produced.

Supervising author, Professor Jonathan Tennyson, UCL Physics & Astronomy, said: "Billions of dollars are currently being spent on satellites that monitor what seems to be the inexorable growth of CO2 in our atmosphere. To interpret their results, however, it is necessary to have a very precise answer to the question 'How much radiation does one molecule of CO2 absorb?' Up until now laboratory measurements have struggled to answer this question accurately enough to allow climate scientists to interpret their results with the detail their observations require."

The team used calculations based on quantum mechanical equations to predict the chances of a CO2 molecule absorbing different colours of light, which have defined energies. These predictions, made using powerful supercomputers, were verified using highly precise measurements taken with an extremely sensitive technique called 'cavity ring-down spectroscopy'. This method reproduces, within a sample length of just 75 cm, the long atmospheric paths over which absorption measurements are taken.
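The role of the absorption strength can be illustrated with the Beer-Lambert law: the fraction of light transmitted at a given wavelength falls exponentially with the product of the absorption cross-section, the CO2 number density and the path length. The numbers below are round illustrative values, not results from the paper:

```python
import math

def transmittance(cross_section_cm2, number_density_cm3, path_cm):
    """Beer-Lambert law: I/I0 = exp(-sigma * n * L)."""
    return math.exp(-cross_section_cm2 * number_density_cm3 * path_cm)

# Illustrative values only (not from the study):
sigma = 1e-22       # absorption cross-section at one wavelength, cm^2 per molecule
n_co2 = 1e16        # CO2 number density, molecules per cm^3
lab_cell = 75.0     # sample cell length, cm
atmosphere = 8e5    # a rough vertical path through the atmosphere, cm (~8 km)

print(transmittance(sigma, n_co2, lab_cell))    # close to 1: weak absorption over 75 cm
print(transmittance(sigma, n_co2, atmosphere))  # noticeably less light gets through
```

The contrast between the two path lengths is why the laboratory technique must be so sensitive: over 75 cm a single wavelength's absorption is tiny, so the cavity bounces light between mirrors to build up a long effective path.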

Lead author, Dr Oleg Polyansky, UCL Physics & Astronomy, said: "We have long known the exact quantum mechanical equations obeyed by a molecule like CO2; however these equations are much too complicated to solve explicitly. But the combination of modern [super]computers and novel treatments of the problem mean that we can now use quantum theory to calculate how strongly CO2 absorbs light at each wavelength".

Dr Joseph Hodges, from the National Institute of Standards and Technology in Gaithersburg, USA, who led the team measuring the spectrum of CO2 in the laboratory, said: "These measurements are very challenging, so we could only make precise lab measurements at a few wavelengths. Where we were able to make measurements, the agreement with the calculations is excellent, which enables us to have full confidence in Dr Polyansky's calculations."

The results will allow atmospheric scientists to monitor how CO2 evolves in Earth's atmosphere, where it is produced and where it moves to, all of which is key to understanding the atmosphere, human activity and the future of our planet.

Similar to how hurricane forecasters combine all projected paths of the storm to predict landfall, a new group aims to take the most useful science and perspectives to gauge how the world's oceans should be best managed.

The Ocean Modeling Forum, a collaboration between the School of Aquatic and Fishery Sciences at University of Washington and NOAA Fisheries, is trying something very rare - bringing together multiple science models and people who care about a particular ocean resource or fishery to decide what's most important for its vitality and the communities it serves. 

"We've gotten to this point now where there's an amazing amount of science, but it's fragmented," said Phil Levin, the Ocean Modeling Forum's co-director and senior scientist with NOAA Fisheries. "We want to bring it together and exploit the strengths of all these different models and data streams and try to overcome the weaknesses."

The Ocean Modeling Forum will address all ocean management issues, facilitating conversations among as many stakeholders as possible. Its first project, focused on the recently closed Pacific sardine fishery, will have its fourth and final meeting later this month in Seattle. The group has started its second project this week in Richmond, British Columbia, with a summit focusing on the Pacific herring fishery. 

The goals of the herring summit are to hear from tribes and First Nations peoples, social and natural scientists, the fishing industry, nonprofits, and federal and state wildlife managers about the role herring plays socially and ecologically, and to begin to develop a framework for how traditional ecological knowledge - in addition to scientific data - can be used in fisheries management practices.

"I think there is a lot to be gained in bringing together people who are working on common problems so we can use models in new ways," said André Punt, director of the UW's School of Aquatic and Fishery Sciences and co-director of the Ocean Modeling Forum.

The three-day summit in British Columbia comes at a time when many are questioning plans for the Pacific herring fishery. Though Fisheries and Oceans Canada opened a commercial fishery, some First Nations peoples protested and prevented commercial fishing by taking to the water last year and this spring. The First Nations have argued that the herring population, which holds deep cultural significance, hasn't yet recovered to a sustainable level, and some scientists evaluating the fishery agree.

The first day of the summit is devoted to hearing stories about the significance of herring, and many tribes and First Nations peoples are expected to share. An artist will serve as a pictorial recorder, weaving words, ideas and pictures from the stories into works of art that will remind participants of the discussions and themes throughout the conference.

The Ocean Modeling Forum's approach for the summit - where the people who set fishing quotas, conduct the science, catch the fish and plan for the future are all brought to the discussion table - is the first of its kind worldwide for fisheries. The goal is to come up with a plan to sustainably manage a fishery in a way that's more nimble to change, and sensitive to both ecological and social factors.

"The idea is to increase the breadth of the approach to address the complex questions that we're facing right now," said Tessa Francis, managing director of the Ocean Modeling Forum and lead ecosystem ecologist with the Puget Sound Institute at UW Tacoma. "Given the particularly knotty ocean management issues faced worldwide, our hope is to bring together all the existing models, with their modeling teams, to provide more reliable and clear advice."

Some models combine a hypothesis and data to try to predict how healthy a fishery is ecologically, while others look at how well it's performing from an economic standpoint. Nearly all involve complicated math, and scientists agree that all models are flawed in some way. So, by looking at every option on the table, organizers hope the best parts of each will rise to the surface.
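As a toy illustration of combining models (the model names and estimates below are invented for the example), an ensemble average both pools the estimates and exposes how much the models disagree:

```python
from statistics import mean, stdev

# Hypothetical biomass estimates (thousands of tonnes) for the same fishery
# from three different kinds of model; the values are invented.
estimates = {
    "ecosystem_model": 120.0,
    "stock_assessment": 150.0,
    "economic_model": 135.0,
}

values = list(estimates.values())
combined = mean(values)   # ensemble average across models
spread = stdev(values)    # disagreement among models

print(f"combined estimate: {combined:.1f}")
print(f"model spread: {spread:.1f}")
```

In practice the forum's combinations would be far richer than a simple average, but the spread term captures the point: where models disagree, managers know the advice is less certain.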

The result is a way of managing a fishery that offers more than any single model could on its own.

Over the next year and a half, a smaller working group will take information from the herring summit and construct a framework that agencies can adopt when they are ready to incorporate human dimensions, such as the cultural significance of fishing, into fisheries management. 

The sardine project launched with a similar meeting in March 2014, and its working group will wrap up this summer with a paper summarizing its findings and recommendations.

In the future, organizers say, the Ocean Modeling Forum could be used to address other fish and animal species, or issues such as how to manage resources affected by ocean acidification. 

"I think the sky is the limit in terms of the sorts of issues we can address and the scope with which we can address them," Levin said.
