CIAT builds high-resolution dataset of bias-corrected CMIP5 projections for climate change impact assessments

The 7-terabyte dataset, the largest of its kind, helps envision climate-change scenarios at scales as small as 1 kilometer; a new review validates and describes the dataset

What the global climate emergency has in store may vary from one back yard to the next, particularly in the tropics where microclimates, geography, and land-use practices shift dramatically over small areas. This has major implications for adaptation strategies at local levels and requires trustworthy, high-resolution data on plausible future climate scenarios.

A dataset created by the International Center for Tropical Agriculture (CIAT) and colleagues is filling this niche. Primarily intended to help policymakers devise adaptation strategies for smallholder farmers around the world, the open-access dataset has been used in 350 research papers. Users in at least 186 countries have downloaded almost 400,000 files from the dataset since it went online in 2013.

A description, review, and validation of the dataset, including how it was built, was published today in an educational journal.

CAPTION: A small bean farm in Colombia's Darién region. Future climate scenarios can be modeled at the community scale thanks to a dataset created by the CGIAR research program on Climate Change, Agriculture and Food Security (CCAFS) and the International Center for Tropical Agriculture (CIAT). CREDIT: Neil Palmer / International Center for Tropical Agriculture

"Climate models are complex representations of the earth system, but they aren't perfect," said Julian Ramirez-Villegas, the principal investigator of the project and a scientist with CIAT and the CGIAR Research Platform on Climate Change, Agriculture and Food Security (CCAFS). "These errors can have an impact on our agricultural models. Because these models help us make decisions, this can have dire consequences."

While the data has primarily served agricultural research, it has also been used to map the potential global spread of Zika (a mosquito-borne disease), to plan investment strategies for international development, and to predict the ongoing decline of outdoor skating days in Canada due to warmer winters.

"The use and applicability of this data have been really extensive and topically quite broad," said Ramirez-Villegas. "Of course, a large portion of the studies has been done on crops that are key to global food security and incomes such as rice, coffee, cocoa, maize, and others."

Pinpointing climate impacts

Climate-change projections are typically available only at coarse scales, with grid cells ranging from 70 to 400 km across. But modeling the impact of climate change on many agricultural plant varieties requires data at much finer scales. The researchers used techniques to increase the spatial resolution (a process known as downscaling) and to correct systematic errors (a process known as bias correction) to create high-resolution future climate data for 436 scenarios.
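The idea behind one common downscaling-and-bias-correction approach, the so-called delta method, can be sketched in a few lines. This is a minimal illustration with invented numbers, not the actual CCAFS production pipeline: the coarse model's projected change (the "delta") is added to a high-resolution observed baseline, removing the model's systematic offset while preserving its climate change signal.

```python
import numpy as np

def bias_correct_delta(gcm_hist, gcm_future, obs_hires):
    """Delta-method downscaling: add the coarse model's projected change
    (future minus historical climatology) onto a high-resolution observed
    climatology. gcm_hist and gcm_future are coarse-grid monthly means
    already interpolated to the fine grid; obs_hires is the fine-grid
    observed baseline."""
    delta = gcm_future - gcm_hist   # projected change signal
    return obs_hires + delta        # high-resolution, bias-corrected field

# Toy monthly-mean temperatures (degrees C) on a tiny 2x2 "fine" grid
obs = np.array([[22.0, 24.5], [21.0, 23.0]])   # observed baseline
hist = np.array([[23.0, 23.0], [23.0, 23.0]])  # coarse model, historical
fut = np.array([[25.0, 25.0], [25.5, 25.5]])   # coarse model, future

print(bias_correct_delta(hist, fut, obs))
```

For precipitation, a multiplicative rather than additive delta is typically used instead, so that corrected rainfall can never go negative.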

"This is a critical resource for modeling more realistically the future of crops and ecosystems," said Carlos Navarro, the lead author of the study who is affiliated with CIAT and CCAFS.

For a given emissions pathway and future period, each scenario includes monthly information for average and extreme temperatures, rainfall, and 19 other related variables. The data are publicly available in the World Data Center for Climate and the CCAFS-Climate data portal.

"Through these scenarios, we can understand, for instance, how agricultural productivity might evolve if the world continues on the current greenhouse emissions trajectory," said Navarro. "They also provide the data to model what types of adaptations would best counter any negative climate change effects."

Global and regional climate models analyze climate conditions at coarser scales and simplify natural processes, producing results that may deviate from real-world conditions.

The dataset is CGIAR's biggest Findable, Accessible, Interoperable, Reusable (FAIR) database. It also underscores CGIAR's role in big data for development through its Platform for Big Data in Agriculture; the dataset is currently included in the platform's Global Agriculture Research Data Innovation and Acceleration Network (GARDIAN).

The high-resolution scale of this data is useful for scientists, policymakers, NGOs and investors, as it can help them understand local climate change impacts and therefore make better bets on adaptation measures, with plans that can specifically target watersheds, regions, municipalities or countries.

In addition to the studies noted by Ramirez-Villegas above, other studies that have used the datasets include:

-Mapping global environmental suitability for Zika virus. The results showed that more than 2.17 billion people in the tropics and sub-tropics live in Zika-prone areas.

-A multi-year CCAFS study following more than 15,000 farmers across India who are testing new seed varieties to enhance smallholder resilience to climate change.

-The above study also noted how Concern Worldwide, an NGO that does long-term development work, has used the data to identify adaptation options and investment strategies in Chad and South Sudan.

-The datasets were used in numerous climate change-impact studies on crops in Africa, including cocoa in Ghana and Cote d'Ivoire, chickpea in East Africa, irrigated sugarcane in South Africa, and groundnuts in West Africa.

-In a show of the dataset's broad research potential, a study in Canada showed how days of outdoor ice-skating are in decline there due to warming.

The core of massive dying galaxies already formed 1.5 billion years after the Big Bang

The most distant dying galaxy discovered so far, more massive than our Milky Way with more than a trillion stars, has revealed that the 'cores' of these systems had formed already 1.5 billion years after the Big Bang, about 1 billion years earlier than previous measurements indicated. The discovery will add to our knowledge of the formation of the Universe more generally, and may force a revision of the supercomputer models astronomers use, one of their most fundamental tools. The result, obtained in close collaboration with Masayuki Tanaka and his colleagues at the National Observatory of Japan, is now published in two works in the Astrophysical Journal Letters and the Astrophysical Journal.

What is a "dead" galaxy?

Galaxies are broadly categorized as dead or alive: dead galaxies are no longer forming stars, while alive galaxies are still bright with star formation activity. A 'quenching' galaxy is a galaxy in the process of dying -- meaning its star formation is significantly suppressed. Quenching galaxies are not as bright as fully alive galaxies, but they are not as dark as dead galaxies. Researchers use this spectrum of brightness as the first line of identification when observing galaxies in the Universe.

The farthest dying galaxy discovered so far reveals remarkable maturity

A team of researchers from the Cosmic Dawn Center at the Niels Bohr Institute and the National Observatory of Japan recently discovered a massive galaxy that was already dying 1.5 billion years after the Big Bang, the most distant of its kind. "Moreover, we found that its core seems already fully formed at that time", says Masayuki Tanaka, the author of the letter. "This result pairs up with the fact that, when these dying gigantic systems were still alive and forming stars, they might have not been that extreme compared with the average population of galaxies", adds Francesco Valentino, assistant professor at the Cosmic Dawn Center at the Niels Bohr Institute and author of an article on the past history of dead galaxies that appeared in the Astrophysical Journal.

Why do galaxies die? - One of the biggest and still unanswered questions in astrophysics

"The suppressed star formation tells us that a galaxy is dying, sadly, but that is exactly the kind of galaxy we want to study in detail to understand why it dies", continues Valentino. One of the biggest questions that astrophysics still has not answered is how a galaxy goes from being star-forming to being dead. For instance, the Milky Way is still alive and slowly forming new stars, but not too far away (in astronomical terms), the central galaxy of the Virgo cluster - M87 - is dead and completely different. Why is that? "It might have to do with the presence of a gigantic and active black hole at the center of galaxies like M87", Valentino says.

Earth-based telescopes find extremes - but astronomers look for normality

One of the problems in observing galaxies in this much detail is that the telescopes currently available on Earth are generally able to find only the most extreme systems. However, the key to describing the history of the Universe is held by the vastly more numerous population of normal objects. "Since we are trying hard to discover this normality, the current observational limitations are an obstacle that has to be overcome."

The James Webb Space Telescope (JWST) represents hope for better data in the near future

The new James Webb Space Telescope, scheduled for launch in 2021, will be able to provide astronomers with data at a level of detail that should make it possible to map exactly this "normality". The methods developed in close collaboration between the Japanese team and the team at the Niels Bohr Institute have already proven successful, given the recent result. "This is significant, because it will enable us to look for the most promising galaxies from the start, when JWST gives us access to much higher quality data", Francesco Valentino explains.

Combining observations with a fundamental tool: the supercomputer models of the Universe

What has been found observationally is not too far from what the most recent models predict. "Until very recently, we did not have many observations to compare with the models. However, the situation is evolving rapidly, and with JWST we will have valuable, larger samples of 'normal' galaxies in a few years. The more galaxies we can study, the better we are able to understand the properties or situations leading to a certain state - whether the galaxy is alive, quenching or dead. It is basically a question of writing the history of the Universe correctly, and in greater and greater detail. At the same time, we are tuning the computer models to take our observations into account, which will be a huge improvement, not just for our branch of work, but for astronomy in general", Francesco Valentino explains.

Concordia researcher hopes to use big data to make pipelines safer in Canada

Historical records of oil and gas leaks can lead to big improvements for future projects, Fuzhan Nasiri says

Oil and gas pipelines have become a polarizing issue in Canada, but supporters and detractors alike can agree that the safer they are, the better.

Unfortunately, integrity and health are ongoing and serious problems for North America's pipeline infrastructure. According to the US Department of Transportation (DOT), there have been more than 10,000 pipeline failures in that country alone since 2002. Complicating safety measures are the cost and intensity of labor required to monitor the health of the thousands of kilometers of pipelines that criss-cross Canada and the United States.

In a recent paper in the Journal of Pipeline Systems Engineering and Practice, researchers at Concordia and the Hong Kong Polytechnic University look at the methodologies currently used by industry and academics to predict pipeline failure -- and their limitations.

"In many of the existing codes and practices, the focus is on the consequences of what happens when something goes wrong," says Fuzhan Nasiri, associate professor in the Department of Building, Civil and Environmental Engineering at the Gina Cody School of Engineering and Computer Science.

"Whenever there is a failure, investigators look at the pipeline's design criteria. But they often ignore the operational aspects and how pipelines can be maintained to minimize risks."

Nasiri, who runs the Sustainable Energy and Infrastructure Systems Engineering Lab, co-authored the paper with his Ph.D. student Kimiya Zakikhani and Hong Kong Polytechnic professor Tarek Zayed.

Safeguarding against corrosion

The researchers identified five failure types:

-Mechanical: the result of design, material or construction defects.

-Operational: due to errors and malfunctions.

-Natural hazard: such as earthquakes, erosion, frost or lightning.

-Third-party: damage inflicted either accidentally or intentionally by a person or group.

-Corrosion: the deterioration of the pipeline metal due to environmental effects on pipe materials and the acidity of oil and gas impurities.

The last of these is the most common and the most straightforward to mitigate.

Nasiri and his colleagues found that the existing academic literature and industry practices around pipeline failures have yet to take full advantage of available maintenance data. They believe the massive amounts of pipeline failure data available via the DOT's Pipeline and Hazardous Materials Safety Administration can be used in the assessment process as a complement to manual in-line inspections.

These predictive models, based on decades' worth of data covering everything from pipeline diameter to metal thickness, pressure, average temperature change, location and timing of failure, could provide failure patterns. These could be used to streamline the overall safety assessment process and reduce costs significantly.

"We can identify trends and patterns based on what has happened in the past," Nasiri says. "And you could assume that these patterns could be followed in the future, but need certain adjustments for climate and operational conditions. It would be a chance-based model: given variables such as location and operational parameters as well as expected climatic characteristics, we could predict the overall chance of corrosion over a set time."
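A "chance-based" model of the kind Nasiri describes could take the form of a simple logistic regression over historical failure records. The sketch below is illustrative only: the features (age, wall thickness, soil pH) and the coefficients are invented for the example, not taken from the PHMSA data, and in practice the coefficients would be fitted to the historical records.

```python
import math

def corrosion_probability(age_years, wall_mm, soil_ph, coefs):
    """Toy logistic model: chance of a corrosion failure over a set
    period, given pipeline age, wall thickness and soil acidity.
    `coefs` = (intercept, b_age, b_wall, b_ph) would in practice be
    fitted to historical failure data."""
    b0, b_age, b_wall, b_ph = coefs
    z = b0 + b_age * age_years + b_wall * wall_mm + b_ph * soil_ph
    return 1.0 / (1.0 + math.exp(-z))  # logistic link maps z to (0, 1)

# Illustrative coefficients: risk rises with age, falls with wall
# thickness, and rises as the soil becomes more acidic (lower pH).
coefs = (-2.0, 0.05, -0.3, -0.2)

old_thin = corrosion_probability(40, 6.0, 5.5, coefs)   # aged, thin-walled
new_thick = corrosion_probability(5, 12.0, 7.0, coefs)  # new, thick-walled
print(f"old thin pipe:  {old_thin:.3f}")
print(f"new thick pipe: {new_thick:.3f}")
```

A fitted model of this shape ranks segments by failure chance, which is the kind of output that could prioritize which stretches of pipeline get manual in-line inspection first.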

He adds that these models would ideally be consistent and industry-wide, and therefore transferable in the event of a pipeline ownership change -- and that research like this could influence industry practices.

"Failure prediction models developed based on reliability theory should be realistic. Using historical data (with adjustments) gets you closer to what happens in reality," he says.

"They can close the gap of expectations, so both planners and operators can have a better idea of what they could see over the lifespan of their structure."