Phillip R. Owens, Purdue University associate professor of agronomy, demonstrates how a farmer's field can be mapped to show predicted crop yield with his technology. His functional soil mapping technology is available for licensing and development through the Purdue Office of Technology Commercialization.

A Purdue University agronomist has developed soil-mapping technology that provides visual information about soil functionality and productivity, which could increase profitability for farmers and growers as they cultivate their crops.

Phillip R. Owens, associate professor of agronomy in Purdue's College of Agriculture, said the U.S. Department of Agriculture provides soil survey maps of the contiguous 48 states. These maps only classify and name soil types as broad units based on their appearance, while Owens' functional maps provide a far more detailed picture of how the soil actually performs.

"These functional maps show properties like organic carbon content, clay content, the location of water tables, the native nutrient potential, catatonic exchange and more," he said. "They also show categorized information like the highest- and lowest-yielding areas, how much water the soil would store after a rainfall event, and how fast a farmer could expect runoff. This information could impact how farmers choose to manage their land and their crops in order to decrease costs and increase profits."

Jenette Ashtekar, a doctoral graduate from Purdue's College of Agriculture, worked with Owens to create the functional soil maps by using algorithms that capture the important relationships between the landscape, water and soil development.


Phillip R. Owens, from left, Purdue University associate professor of agronomy, and doctoral graduate Jenette Ashtekar compare a soil sample in an auger to a functional soil map on a digital tablet. The functional soil mapping technology is available for licensing and development through the Purdue Office of Technology Commercialization. (Purdue Research Foundation photo)

"The computer algorithms that we developed exploit the relationship between soil properties and the landscape. We then use the same algorithms to determine precise locations in the fields to sample so we can create the best possible map with a limited number of samples," she said. "After sampling, we use Geographical Information Systems, or GIS, technology to develop beautiful, color-image maps that farmers can use in the field to inform their management decisions."

The Purdue Office of Technology Commercialization has filed a patent application on the technology, which was developed with funding from the USDA and Purdue. Owens is looking for venture capitalists or agriculture firms to license and scale up the technology for the marketplace.

"The maps currently are in GIS format, but they can be delivered multiple ways," he said. "Ultimately farmers will want the information on their tractors so they can make decisions about where to plant, what seeding rate to use, where and how much to fertilize or more as they drive across the field."

This visualization shows soil moisture measurements taken by NASA's Aquarius instrument. Here, soil moisture in the top 2 inches of the land is visible.  Credit: NASA Goddard's Science Visualization Studio/T. Schindler

Scientists working with data from NASA's Aquarius instrument have released worldwide maps of soil moisture, showing how the wetness of the land fluctuates with the seasons and weather phenomena.

Soil moisture, the water contained within soil particles, is an important player in Earth's water cycle. It is essential for plant life and influences weather and climate. Satellite readings of soil moisture will help scientists better understand the climate system and have potential for a wide range of applications, from advancing climate models, weather forecasts, drought monitoring and flood prediction to informing water management decisions and aiding in predictions of agricultural productivity.

Launched June 10, 2011, aboard the Argentinian spacecraft Aquarius/Satélite de Aplicaciones Científicas (SAC)-D, Aquarius was built to study the salt content of ocean surface waters. The new soil wetness measurements were not in the mission's primary science objectives, but a NASA-funded team led by U.S. Department of Agriculture (USDA) researchers has developed a method to retrieve soil moisture data from the instrument's microwave radiometer.

The Aquarius measurements are considerably coarser in spatial resolution than the measurements from the upcoming NASA Soil Moisture Active Passive (SMAP) mission, which was specifically designed to provide the highest quality soil moisture measurements available, including a spatial resolution roughly 10 times finer than that offered by Aquarius.

Soils naturally radiate microwaves and the Aquarius sensor can detect the microwave signal from the top 2 inches (5 centimeters) of the land, a signal that subtly varies with changes in the wetness of the soil. Aquarius takes eight days to complete each worldwide survey of soil moisture, although with gaps in mountainous or vegetated terrain where the microwave signal becomes difficult to interpret.
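
The retrieval algorithm itself is not described in the article, but the underlying idea, that wetter soil emits microwaves less efficiently and therefore looks "colder" to an L-band radiometer, can be illustrated with a toy conversion like the one below. The dry and wet end-member values and the linear interpolation are made-up placeholders, not the USDA team's actual method.

```python
# Toy illustration of the idea behind passive microwave soil moisture retrieval:
# wetter soil has a higher dielectric constant, emits less efficiently, and so
# appears "colder" to an L-band radiometer. The linear calibration below is a
# made-up placeholder, not the actual Aquarius/USDA retrieval algorithm.

def soil_moisture_from_tb(tb_kelvin, surface_temp_kelvin,
                          e_dry=0.95, e_wet=0.60, sm_dry=0.02, sm_wet=0.45):
    """Map brightness temperature to volumetric soil moisture (m^3/m^3)."""
    emissivity = tb_kelvin / surface_temp_kelvin           # e = TB / T_surface
    # Linearly interpolate between assumed dry and saturated end members.
    frac_wet = (e_dry - emissivity) / (e_dry - e_wet)
    frac_wet = min(max(frac_wet, 0.0), 1.0)
    return sm_dry + frac_wet * (sm_wet - sm_dry)

# Example: a 255 K brightness temperature over 295 K ground reads as moderately moist.
print(round(soil_moisture_from_tb(255.0, 295.0), 3))
```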

Tom Jackson, principal investigator for the Aquarius soil moisture measurements and a hydrologist with USDA, said that his agency uses soil wetness information to improve crop forecasts. These forecasts not only help farmers and markets adjust their prices according to worldwide production, but also allow relief agencies to plan for food emergency responses.

"There's a lot of precipitation data in our country, but outside the U.S. and Europe it gets pretty sparse," said Jackson. "By using soil moisture readings, we can better monitor the condition of soils."

An animated version of Aquarius' soil moisture measurements reveals a dynamic pattern of worldwide shifts between dry and moist soils. A look at the Eastern coast of Australia during late January and early February 2012 shows the impact of Cyclone Jasmine, a tropical storm that brought heavy rainfall to northern Queensland. In Africa, a wide band of wet soils that matches the rain belt travels north of the equator in the Southern Hemisphere's winter and then moves south as summer progresses. The land-soaking effects of the monsoon are evident in the Indian subcontinent from June to October. The U.S. Midwest dramatically shifts from being bone-dry during the 2012 drought to waterlogged during the floods of April and May 2013.

But Aquarius has some limitations in its soil moisture estimates, said USDA's Rajat Bindlish, a co-investigator on the soil moisture maps. The instrument's 62-mile-wide (100-km) footprint prevents it from covering small islands, coastlines and any piece of land narrower than 62 miles, such as Baja California or Italy. Other factors interfere with soil moisture retrievals, too: The dense tree canopy of the Amazon rainforest distorts the microwave signal while snow and ice block it. These areas where the microwave signals are difficult to interpret are shown in dark gray in the Aquarius soil moisture animation.

Aquarius soil moisture data are coarser than those collected by the European Space Agency's Soil Moisture and Ocean Salinity (SMOS) mission. Still, having multiple missions in orbit that simultaneously measure the wetness of the land in the same band of the microwave spectrum ensures a more continuous record, said Peggy O'Neill, of NASA's Goddard Space Flight Center in Greenbelt, Maryland, a co-investigator of the soil moisture project and deputy project scientist of NASA's upcoming soil moisture mission, SMAP.

"One of the things that having Aquarius and SMOS in orbit before SMAP has allowed us to do is to do inter-comparison studies between the sensors," O'Neill said. "By having the three instruments up there at the same time, we will be able to create a long-time series of soil moisture that starts with SMOS and continues with Aquarius and then SMAP. We won't have to worry that the earlier data were taken by SMOS and the later data by SMAP – we'll know they're telling us the same thing. If we hadn't been able to do inter-comparison studies and there had been no overlap between the three instruments, we wouldn't know how to merge the measurements together."

SMAP, set to launch in November 2014, will advance soil moisture studies with its greater spatial and temporal resolution. The new mission will combine microwave radiometer readings, which are accurate but coarse, with the measurements taken by its onboard radar, which are less precise but have higher spatial resolution than the radiometer data. This approach will provide a footprint of 5.6 miles (9 kilometers), and it will produce worldwide soil moisture maps every three days. Taken together, these features of SMAP will provide about 500 times the number of soil moisture measurements per day compared to SMOS or Aquarius.
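
A rough back-of-envelope check, under deliberately simple assumptions that ignore swath widths and actual orbital sampling (which the published figure reflects), already puts the gain in the same order of magnitude:

```python
# Back-of-envelope only (assumed round numbers; swath geometry and sampling
# patterns, which the quoted ~500x figure accounts for, are ignored here).
aquarius_footprint_km, smap_footprint_km = 100, 9
aquarius_revisit_days, smap_revisit_days = 8, 3

area_factor = (aquarius_footprint_km / smap_footprint_km) ** 2   # ~123x more footprints per area
revisit_factor = aquarius_revisit_days / smap_revisit_days       # ~2.7x more frequent coverage
print(round(area_factor * revisit_factor))                       # ~329 -- same order as ~500
```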

"Weather and climate models now need information in the 10-kilometer scale," O'Neill said. "SMAP will also allow professionals who now do drought monitoring and flood forecasting at the county level to be able to do it at sub-county levels."

The Aquarius mission and radar were developed at NASA's Jet Propulsion Laboratory in Pasadena, California, and the radiometer was developed at NASA's Goddard Space Flight Center. The Aquarius soil moisture product and sea surface salinity measurements are produced by NASA's Goddard Space Flight Center, which manages the mission. Soil moisture data from NASA's Aquarius microwave radiometer are now available at the National Snow and Ice Data Center. 

 

Shows capability to forecast water clarity during naval missions

Dr. Jason Jolliff is an oceanographer with the U.S. Naval Research Laboratory (NRL). "The emphasis here," he says, "is on developing models of the ocean environment to help the naval warfighter." His most recent paper, published in Ocean Modelling (March 2014), shows NRL can also forecast where oil will go following a major spill.

"If you're going to do forecasting," he says, "you have to get the ocean circulation correct. It's fundamental to all else." Jolliff plugged the distribution of surface oil following the 2010 Deepwater Horizon oil spill—when it was still well offshore—into a powerful NRL forecasting tool. He accurately predicted what would happen to the oil; in particular, the processes that made inevitable its landfall on Louisiana shorelines fully four days later.

Jolliff's second key point is, "When we look at oceanographic problems, we have to understand the scales of time and space we're dealing with." In 2010, there was concern about oil washing up on Florida beaches. But oil does evaporate and degrade, so knowing how far the oil will travel over a given period helps predict which beaches are most at risk.

Jolliff is part of a team developing a tool called Bio-Optical Forecasting (BioCast). "We're developing this framework where you can combine satellite images, that give you an estimate of what is in the ocean, with ocean circulation models." BioCast calculates "how those materials will ultimately be transported and dispersed."

Jolliff is interested in how "tracers"—like plankton or nitrate distribution—change water clarity. "That will help the Navy predict ocean optical properties." Divers need good visibility, as do airborne platforms that use electro-optics to "look" for mines in shallow waters. "But the general knowledge that we gain can be applied to a very wide range of forecasting problems, including contaminant distribution and oil spill response."

Jolliff validated NRL's capability for future operations by applying integrated forecasting to the Deepwater Horizon oil spill. He describes two sets of experiments: one at a local scale, focusing on currents near Louisiana; the second at a mesoscale, looking at the entire Gulf of Mexico.

Local scale: predicting where shoreline oil would be worst

Using satellite images from May 11, 2010—which show oil slicks still well away from shore—Jolliff forecast how oil would move over the next 96 hours. He predicted oil would make a substantial landfall on the Louisiana coast, west of the Mississippi River Delta, on May 14. "The forecast was qualitatively accurate," he says. "That's precisely what happened."

Jolliff uses the NRL tool, Coupled Ocean-Atmosphere Mesoscale Prediction System (COAMPS). "[COAMPS] was designed to provide a direct forecast of oceanographic and atmospheric variables, things like surface temperature, surface humidity, sea surface temperature, current speeds, and atmospheric visibility."

With forecasts of how water is moving from COAMPS, Jolliff predicts how a tracer will be transported through that system using BioCast. With a tracer like surface chlorophyll, which indicates phytoplankton abundance, he might forecast water clarity for a naval operations environment. By using oil from a well-documented spill as a tracer, Jolliff validated how the models work together.
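
BioCast's internals are not detailed in the article, but the basic technique described here, advecting a passive surface tracer through a forecast current field, can be sketched with a simple first-order upwind step. The grid, time step and synthetic currents below are illustrative assumptions, not COAMPS or BioCast output.

```python
# Minimal sketch of passive-tracer transport (not the BioCast code): advect a
# surface tracer field, e.g. a chlorophyll or oil proxy, through a forecast
# current field with a first-order upwind finite-difference step.
import numpy as np

N, DX, DT = 100, 1000.0, 200.0           # 100 x 100 cells, 1 km spacing, 200 s time step

tracer = np.zeros((N, N))
tracer[40:60, 10:20] = 1.0               # hypothetical offshore slick / bloom patch

# Synthetic current field standing in for a COAMPS forecast (divergence-free here).
x = np.arange(N)
u = np.full((N, N), 0.5)                                # eastward component, m/s
v = 0.2 * np.sin(2 * np.pi * x / N) * np.ones((N, 1))   # weak meandering northward component

def upwind_step(c, u, v, dx, dt):
    """One explicit upwind advection step for tracer concentration c (periodic edges)."""
    dcdx_minus = (c - np.roll(c, 1, axis=1)) / dx
    dcdx_plus = (np.roll(c, -1, axis=1) - c) / dx
    dcdy_minus = (c - np.roll(c, 1, axis=0)) / dx
    dcdy_plus = (np.roll(c, -1, axis=0) - c) / dx
    adv_x = np.where(u > 0, u * dcdx_minus, u * dcdx_plus)
    adv_y = np.where(v > 0, v * dcdy_minus, v * dcdy_plus)
    return c - dt * (adv_x + adv_y)

for _ in range(500):                     # 500 steps of 200 s, roughly 28 hours of drift
    tracer = upwind_step(tracer, u, v, DX, DT)

print("Total tracer after transport:", round(tracer.sum(), 2))   # started at 200.0
```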

"One of the things we can do at NRL is we can start to explore areas that weren't necessarily thought of when [COAMPS] was designed," says Jolliff.

Following Deepwater Horizon's blowout, "the main surface aggregation of oil was offshore." For the oil to reach shorelines, surface water from the deep ocean would have to carry it in and exchange with coastal waters.

"What we find, generally," says Jolliff, "is that the water near shore tends to stay near shore, and the water off shore tends to stay off shore. So in order to forecast if this oil is going to impact some coastal area, you have to know where this water mass exchange is going to take place." With COAMPS, Jolliff was able to explain the water mass exchange that led to the severe oiling in Louisiana.

Where the Mississippi River Delta landmass protrudes out into the Gulf of Mexico, coastal currents can mix with offshore surface waters. "Oil that is initially offshore, it'll come near shore and bump up against the Mississippi River Delta. And then it gets whipped around and swept into this coastal current."

Additionally, there's an unusually deep feature on the ocean floor to the southwest of the Delta, called the Mississippi Canyon. The irregular ocean depths, or bathymetry, between the Delta and the Canyon; prevailing winds; and fresh water that drains from the Delta all contribute to water mass exchange.

To show that oil would be concentrated where currents converge and dispersed where they diverge, "We had [a] specific buoyancy restoring term in the model that would bring oil back to the surface." He was able to account for the tendency of oil, pulled under by currents, to rise back to the surface, which is critical to forecasting material transport. A three-dimensional, Eulerian framework made these zones of divergence and convergence easier to identify than the Lagrangian approaches used in 2010. (A future model that sought to include subsurface oil in the initial inputs would need a more complex accounting of how hydrocarbons interact with seawater and chemical dispersants.)
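
The exact form of that buoyancy restoring term is not given in the article; as a schematic of the idea, that subducted oil relaxes back toward the surface layer on some rise time scale while the currents keep carrying it, a sketch might look like the following. The layer structure and the six-hour time scale are assumptions for illustration only.

```python
# Schematic only (the study's actual buoyancy formulation is not given here):
# after each advection step, relax subsurface oil concentration back toward the
# surface layer on an assumed buoyant-rise time scale.
import numpy as np

def buoyancy_restore(oil_3d, dt, rise_timescale=6 * 3600.0):
    """oil_3d has shape (n_layers, ny, nx); layer 0 is the surface layer."""
    restored = oil_3d.copy()
    frac = min(dt / rise_timescale, 1.0)        # fraction rising one layer per step
    for k in range(1, oil_3d.shape[0]):
        rising = frac * restored[k]
        restored[k] -= rising
        restored[k - 1] += rising               # mass moves upward, none is lost
    return restored

oil = np.zeros((5, 4, 4))
oil[3, 1, 1] = 1.0                              # parcel of oil pulled down by converging currents
for _ in range(200):                            # 200 steps of 500 s, roughly a day
    oil = buoyancy_restore(oil, dt=500.0)
print(round(oil[0].sum(), 2), "of 1.0 now back at the surface")
```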

Jolliff doesn't compare his predictions with those federal agencies made at the time, because those agencies were continuously updating their simulations. "The take home message is that COAMPS was able to forecast the atmospheric and oceanographic conditions that made that funnel [of water mass exchange] operate. And so what was happening in the model then happened in reality."

Mesoscale: why the oil spill didn't hurt Florida beaches

Next, Jolliff sought to explain what happened to oil over a longer time period and wider region. Unlike the first set of experiments, "We're doing what's called a 'hindcast'; and that's every 12 hours, the ocean model is assimilating information from satellites to correct deviations from those observations."

Except for two gaps, land hems in the Gulf of Mexico. As a result, water tends to flow in through the gap between the Yucatan and Cuba, and then into a forceful clockwise Loop Current that turns toward the southeast.

Most of the time, the Gulf's Loop Current is able to squeeze out through the Florida Straits. "But it's unstable," says Jolliff, "so with some frequency, this big loop detaches and forms this sort of closed, clockwise circulation of water—what oceanographers call a mesoscale eddy." And with the hindcasts, Jolliff observed exactly that.

"You see that the Loop Current itself pinched off to form an eddy." This fortuitous event prevented the current from carrying aggregated surface oil to Key West and eastern Florida beaches, like Miami. "This emphasizes the point that the only way you'd know that is to forecast the ocean circulation."

In addition to understanding regional currents, time scale is also very important. For the 96-hour forecast, Jolliff could assume oil was an inert tracer. But over time, oil "weathers"—evaporating and biodegrading. Jolliff accounted for weathering in the mesoscale experiments by assuming a simple decay rate. He acknowledges this oversimplifies the true chemical properties of oil and could be improved in a future model; but in 2010, most forecasters didn't account for weathering at all.
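
The decay rate used in the study is not quoted here, but a first-order weathering term of this kind amounts to simple exponential decay; with an assumed, purely illustrative half-life of a week, for example:

```python
import math

# Illustrative only: first-order "weathering" decay of surface oil mass.
# The half-life here is an assumption, not the value used in the study.
def weathered_fraction(days, half_life_days=7.0):
    return math.exp(-math.log(2) * days / half_life_days)

print(round(weathered_fraction(10), 2))   # after 10 days, ~0.37 of the oil remains
```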

Even if the current had carried oil to Florida, the damage likely wouldn't have been severe. "What we see is that it takes more than 10 days for that surface oil to finally get entrained into the Loop Current. Enough time has elapsed that these materials are now significantly degraded."

NRL on the front line: improving ocean models supports readiness

While Jolliff tested NRL's operational capability against the Deepwater Horizon oil spill, he could make similar forecasts for a spill anywhere in the world. "What's very powerful about COAMPS," he says, "is we can take models that are much larger in scale, and we can zoom in to these local areas."

Going forward, he'll focus on integrating more cross-interacting variables directly into COAMPS to give naval operators a more accurate and complete forecast. "In any complicated system," he says, "you're going to have various feedbacks. The ocean and the atmosphere are constantly exchanging energy." By bringing the tracer directly into COAMPS, instead of using BioCast to perform transport calculations separately from COAMPS, he could see if materials in the ocean impact air-sea interactions.

He and NRL colleague Dr. Travis Smith are researching the biological, chemical, and physical properties of key tracers. "As you go out to longer and longer time scales, you have to bring more and more of [the] inherent reactivity of these materials into your calculations: things like phytoplankton growth, settling of particles." Incorporating these properties, similar to what he did with the decay constant for oil, improves forecasts of ocean optics.

"The tools we've developed here at NRL are state of the art, without question," he says.

Climate feedbacks from decomposition by soil microbes are one of the biggest uncertainties facing climate modelers. A new study from the International Institute for Applied Systems Analysis (IIASA) and the University of Vienna shows that these feedbacks may be less dire than previously thought.

The dynamics among soil microbes allow them to work more efficiently and flexibly as they break down organic matter – spewing less carbon dioxide into the atmosphere than previously thought, according to a new study published in the journal Ecology Letters.

"Previous climate models had simply looked at soil microbes as a black box," says Christina Kaiser, lead author of the study who conducted the work as a post-doctoral researcher at IIASA. Kaiser, now an assistant professor at the University of Vienna, developed an innovative model that helps bring these microbial processes to light.

Microbes and the climate

"Soil microbes are responsible for one of the largest carbon dioxide emissions on the planet, about six times higher than from fossil fuel burning," says IIASA researcher Oskar Franklin, one of the study co-authors. These microbes release greenhouse gases such as carbon dioxide and methane into the atmosphere as they decompose organic matter. At the same time, the Earth's trees and other plants remove about the same amount of carbon dioxide from the atmosphere through photosynthesis.

As long as these two fluxes remain balanced, everything is fine.

But as the climate warms, soil conditions change and decomposition may change with them. Previous models of soil decomposition suggested that nutrient imbalances such as nitrogen deficiency would lead to increased carbon emissions. "This is such a big flux that even small changes could have a large effect," says Kaiser. "The potential feedback effects are considerably high and difficult to predict."

Diversity does the trick

How exactly microorganisms in the soil and litter react to changing conditions, however, remains unclear. One reason is that soil microbes live in diverse, complex communities, where they interact with each other and rely on one another for breaking down organic matter.

"One microbe species by itself might not be able to break down a complex substrate like a dead leaf," says Kaiser. "How this system reacts to changes in the environment doesn't depend just on the individual microbes, but rather on the changes to the numbers and interactions of microbe species within the soil community."

To understand these community processes, Kaiser and colleagues developed a supercomputer model that can simulate complex soil dynamics. The model simulates the interactions among 10,000 individual microbes within a 1 mm by 1 mm square. It shows how nutrients, which influence microbial metabolism, affect these interactions and thereby change the soil community and the decomposition process.
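
The IIASA model itself is far richer than anything that fits here, but the flavor of such an individual-based simulation, many microbes of different functional types on a small grid interacting through a shared pool of broken-down carbon and respiring CO2 as they grow, can be conveyed with a toy sketch. The two functional types, their carbon-use efficiencies and all rate constants below are invented for illustration.

```python
# Toy individual-based sketch in the spirit of the model described (NOT the
# IIASA/University of Vienna model): microbes of two assumed functional types
# sit on a grid, share broken-down carbon, grow, and respire CO2.
import numpy as np

rng = np.random.default_rng(1)
GRID = 100                                   # 100 x 100 cells, ~10,000 sites in a 1 mm square

substrate = np.full((GRID, GRID), 5.0)       # complex organic matter per cell
dissolved = np.zeros((GRID, GRID))           # shared pool of broken-down carbon
microbe_type = rng.integers(0, 2, size=(GRID, GRID))   # 0 = "decomposer", 1 = "opportunist"
biomass = np.full((GRID, GRID), 0.1)
co2_emitted = 0.0

for step in range(50):
    # Decomposers break complex substrate into the shared dissolved pool,
    # which is then smoothed over neighbouring cells to mimic local diffusion.
    breakdown = np.where(microbe_type == 0, 0.05 * substrate, 0.0)
    substrate -= breakdown
    dissolved += breakdown
    dissolved = (dissolved
                 + np.roll(dissolved, 1, 0) + np.roll(dissolved, -1, 0)
                 + np.roll(dissolved, 1, 1) + np.roll(dissolved, -1, 1)) / 5.0

    # Both types take up dissolved carbon; a fraction becomes biomass (growth),
    # the rest is respired as CO2. The carbon-use efficiencies are assumptions.
    uptake = np.minimum(dissolved, 0.2 * biomass)
    dissolved -= uptake
    efficiency = np.where(microbe_type == 0, 0.4, 0.6)
    biomass += efficiency * uptake
    co2_emitted += ((1.0 - efficiency) * uptake).sum()

print("Cumulative CO2 respired after 50 steps:", round(co2_emitted, 2))
```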

Previous models had viewed soil decomposition as a single process, and assumed that nutrient imbalances would lead to less efficient decomposition and hence greater greenhouse gas emissions. But the new study shows that, in fact, microbial communities reorganize themselves and continue operating efficiently – emitting far less carbon dioxide than previously predicted.

"Our analyses highlight how the systems thinking for which IIASA is renowned advances insights into key ecosystem services," says study co-author and IIASA ecologist Ulf Dieckmann.

"This model is a huge step forward in our understanding of microbial decomposition, and provides us with a much clearer picture of the soil system," says University of Vienna ecologist Andreas Richter, another study co-author. 

The recent EU-sponsored FiPS project (Developing Hardware and Design Methodologies for Heterogeneous Low Power Field Programmable Servers) is designing a new class of heterogeneous supercomputers based on energy-efficient alternative processing resources like the ones found, for example, in today’s low-power mobile devices.

FiPS aims to significantly increase energy-efficiency through a smart combination of alternative processors such as ARM smartphone processors, 3D graphics chips, re-configurable FPGA hardware, or even many-core architectures.

The project was kicked off in the second half of 2013 by a consortium that spans all of Europe, from Greece to Ireland and from France to Poland. Coordinated by the OFFIS – Institute for Information Technology (Germany), the project brings together hardware and software specialists from both high-performance and embedded computing communities. Christmann Informationstechnik + Medien GmbH, CoSynth GmbH & Co. KG and Bielefeld University (Germany) are in charge of the development of a new kind of modular server architecture able to seamlessly host various computing resources, which will be deployed at the Poznań Supercomputing and Networking Center (Poland). OFFIS and CEA LIST (France) are developing the related methodology helping the system designer to determine the ideal processor types for a given application in order to build the right heterogeneous computing machine. The resulting system will be put into practice on highly demanding real-world applications by SOFiSTiK Hellas S.A. (Greece), Cenaero ASBL (Belgium) and the National University of Ireland (Ireland) with the goal of achieving the highest performance at the lowest power consumption.

The 3-year project is being sponsored with a total of about 3 million Euros through the 7th Framework Programme of the European Union.

FiPS aims to address the rapidly rising energy demand of today's supercomputing centers. Many of the most advanced research programs depend on data centers hosting thousands of computers, the only way to quickly solve sophisticated problems (such as daily weather forecasts, advanced engineering, medical tomography analysis or even a simple search engine query on Google or Yahoo!). These masses of computers consume a great deal of electrical energy, and the demand for supercomputing is rising fast. Already 15% of our total electrical energy is required to power the computers at home, at work and in data centers, and that share is growing rapidly.

A possible solution is to use heterogeneous computing architectures to optimize the performance/watt ratio by including smartphone processors, 3D graphic chips, reconfigurable chips (FPGAs) or many-core architectures. This is done by applying the well-known algorithm-architecture matching concept of embedded systems to the high-performance computing domain, as these alternative architectures may be better suited to particular parts of applications with respect to both performance and energy consumption compared to the traditional complex processors used in current supercomputers. The bulk of the computing time of advanced high-performance simulations is spent in a small number of very CPU-intensive sub-routines, many of which are simple enough to run on these alternative computation devices, reducing the required number of traditional high-energy, high-performance processors.

The FiPS project aims to build a new supercomputer combining traditional high-performance processors (for complex control-oriented and sequential computations) with more efficient alternative processors used for simpler or regular computations (where relevant). As the total number of processors increases, these new supercomputers will be faster while at the same time significantly reducing overall energy consumption.

The challenge of building supercomputers from a heterogeneous network of processors (rather than a regular grid of identical processors) lies in coordinating the various types of components: different computing devices have to be programmed with different programming models, and there is no easy way to know in advance which processor type will run a given computation job fastest or with the lowest energy consumption. Moreover, all the processors working on different parts of the same problem have to communicate their intermediate results and synchronize.

FiPS is setting up a programming methodology that requires only a single programming language for developing the supercomputing program; the methodology then determines the best-suited processor type and automatically translates the program into the programming language of that processor architecture. The user then receives a prediction of the program's speed and energy consumption. With this data, the implementation can be tweaked to increase performance and/or energy efficiency.
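
The FiPS tool flow itself is not detailed in this article, but the design-space-exploration step it implies, estimating run time and energy for each candidate processor type from a simple cost model and then choosing the best fit, can be sketched as follows. The device names and the throughput and power figures are invented for illustration, not measurements from the project.

```python
# Toy sketch of cost-model-based processor selection (not the FiPS toolchain):
# given rough per-device throughput and power figures for one kernel, estimate
# run time and energy, then pick the device with the lowest energy that still
# meets a deadline. All numbers below are invented for illustration.

KERNEL_OPS = 2.0e12                     # hypothetical workload: 2 Tflop of dense math

# (sustained GFLOP/s on this kernel, average power in watts) -- assumed figures
devices = {
    "Xeon core":     (40.0, 60.0),
    "ARM cluster":   (25.0, 8.0),
    "GPU":           (900.0, 180.0),
    "FPGA pipeline": (120.0, 25.0),
}

def estimate(ops, gflops, watts):
    seconds = ops / (gflops * 1e9)
    return seconds, seconds * watts     # (run time in seconds, energy in joules)

deadline_s = 60.0
candidates = []
for name, (gflops, watts) in devices.items():
    t, joules = estimate(KERNEL_OPS, gflops, watts)
    if t <= deadline_s:
        candidates.append((joules, t, name))
    print(f"{name:14s}  {t:8.1f} s  {joules / 1000:8.1f} kJ")

best = min(candidates)                  # lowest energy among devices meeting the deadline
print("Best within deadline:", best[2])
```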

FiPS will not only make an ecological impact by reducing energy demand (and thus carbon dioxide emission), but also an economic impact by reducing the major energy costs. Supercomputing will become cheaper and thus affordable for many other applications and users.

