CAPTION Supercomputer simulations have allowed scientists, led by Dr Imran Rahman of the University of Bristol, UK, to work out how this 555-million-year-old organism with no known modern relatives fed. Their research reveals that some of the first large, complex organisms on Earth formed ecosystems that were much more complex than previously thought. CREDIT M. Laflamme

Supercomputer simulations have allowed scientists to work out how a puzzling 555-million-year-old organism with no known modern relatives fed, revealing that some of the first large, complex organisms on Earth formed ecosystems that were much more complex than previously thought.

The international team of researchers from Canada, the UK and the USA, including Dr Imran Rahman from the University of Bristol, UK, studied fossils of an extinct organism called Tribrachidium, which lived in the oceans some 555 million years ago. Using a supercomputer modeling approach called computational fluid dynamics, they were able to show that Tribrachidium fed by collecting particles suspended in water. This strategy, known as suspension feeding, had not previously been documented in organisms from this period of time.

Tribrachidium lived during a period of time called the Ediacaran, which ranged from 635 million to 541 million years ago. This period was characterised by a variety of large, complex organisms, most of which are difficult to link to any modern species. It was previously thought that these organisms formed simple ecosystems characterised by only a few feeding modes, but the new study suggests they were capable of more types of feeding than previously appreciated.

CAPTION Such simulations have allowed scientists, led by Dr Imran Rahman of the University of Bristol, UK, to work out how this 555-million-year-old organism with no known modern relatives fed. Their research reveals that some of the first large, complex organisms on Earth formed ecosystems that were much more complex than previously thought. CREDIT I.A. Rahman

Dr Simon Darroch, an Assistant Professor at Vanderbilt University, said: "For many years, scientists have assumed that Earth's oldest complex organisms, which lived over half a billion years ago, fed in only one or two different ways. Our study has shown this to be untrue: Tribrachidium, and perhaps other species, were capable of suspension feeding. This demonstrates that, contrary to our expectations, some of the first ecosystems were actually quite complex."

Co-author Dr Marc Laflamme, an Assistant Professor at the University of Toronto Mississauga, added: "Tribrachidium doesn't look like any modern species, and so it has been really hard to work out what it was like when it was alive. The application of cutting-edge techniques, such as CT scanning and computational fluid dynamics, allowed us to determine, for the first time, how this long-extinct organism fed."

Computational fluid dynamics is a method for simulating fluid flows that is commonly used in engineering, for example in aircraft design, but this is one of the first applications of the technique in palaeontology (building on previous research carried out at Bristol).
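At its core, computational fluid dynamics means solving equations of fluid transport numerically on a grid. As a loose illustration only (not the team's actual 3D simulations of CT-scanned Tribrachidium models), the sketch below steps a one-dimensional advection-diffusion equation for the concentration of suspended particles carried past a point; all values are invented for the example.

```python
import numpy as np

# Toy 1D advection-diffusion of suspended-particle concentration.
# Illustrative values only; a real CFD study solves the full 3D flow
# around a digital model of the organism.
nx, length = 200, 1.0
dx = length / nx
u, D = 0.05, 1e-3                          # flow speed (m/s) and diffusivity (m^2/s)
dt = 0.4 * min(dx / u, dx**2 / (2 * D))    # time step within explicit stability limits

c = np.zeros(nx)                           # particle concentration along the domain
c[0] = 1.0                                 # particle-laden water enters upstream

for _ in range(5000):
    adv = -u * (c[1:-1] - c[:-2]) / dx                    # upwind advection
    dif = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2      # central-difference diffusion
    c[1:-1] += dt * (adv + dif)
    c[0], c[-1] = 1.0, c[-2]               # fixed inflow, zero-gradient outflow

print("concentration every 40th cell:", np.round(c[::40], 3))
```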

Dr Rahman, a Research Fellow in Bristol's School of Earth Sciences, said: "The computer simulations we ran allowed us to test competing theories for feeding in Tribrachidium. This approach has great potential for improving our understanding of many extinct organisms."

Co-author Dr Rachel Racicot, a postdoctoral researcher at the Natural History Museum of Los Angeles County, added: "Methods for digitally analysing fossils in 3D have become increasingly widespread and accessible over the last 20 years. We can now use these data to address any number of questions about the biology and ecology of ancient and modern organisms."

Paris emissions reduction pledges reduce risks of severe warming, study shows

More than 190 countries are meeting in Paris next week to create a durable framework for addressing climate change and to implement a process to reduce greenhouse gases over time. A key part of this agreement would be the pledges made by individual countries to reduce their emissions.

A study published in Science today shows that if implemented and followed by measures of equal or greater ambition, the Paris pledges have the potential to reduce the probability of the highest levels of warming, and increase the probability of limiting global warming to 2 degrees Celsius.

In the lead up to the Paris meetings, countries have announced the contributions that they are willing to make to combat global climate change, based on their own national circumstances. These Intended Nationally Determined Contributions, or INDCs, take many different forms and extend through 2025 or 2030.

Examples of these commitments include the United States' vow to reduce emissions in 2025 by 26-28 percent below 2005 levels and China's pledge to peak emissions by 2030 and increase its share of non-fossil fuels in primary energy consumption to around 20 percent. In the study, the scientists tallied up these INDCs and simulated the range of temperature outcomes the resulting emissions would bring in 2100 under different assumptions about possible emissions reductions beyond 2030.

"We wanted to know how the commitments would play out from a risk management perspective," said economist Allen Fawcett of the U.S. Environmental Protection Agency, the lead author of the study. "We analyzed not only what the commitments would achieve over the next ten to fifteen years, but also how they might lay a foundation for the future."

Although many researchers have focused on the importance of the 2 degree limit, Fawcett and colleagues assessed uncertainty in the climate system from an overall risk management perspective. They analyzed the full range of temperatures the INDCs might attain and determined the odds of achieving each of those temperatures. To determine those odds, they modeled the future climate hundreds of times to find the range of temperatures these various conditions produce.

"It's not just about 2 degrees," said Gokul Iyer, the study's lead scientist at the Joint Global Change Research Institute, a collaboration between the Department of Energy's Pacific Northwest National Laboratory and the University of Maryland. "It is also important to understand what the INDCs imply for the worst levels of climate change."

In the study, the scientists compared the Paris commitments to a world in which countries don't act at all or start reducing greenhouse gas emissions only in 2030.

The team found that if countries do nothing to reduce emissions, the earth has almost no chance of staying under the 2-degree limit, and it is likely that the temperature increase would exceed 4 degrees. They went on to show that the INDCs and the future abatement enabled by Paris introduce a chance of meeting the 2-degree target, and greatly reduce the chance that warming exceeds 4 degrees. The extent to which the odds are improved depends on how much emissions limits are tightened in future pledges after 2030.

"Long-term temperature outcomes critically hinge on emissions reduction efforts beyond 2030," said Iyer. "If countries implement their INDCs through 2030 and ramp up efforts beyond 2030, we'll have a much better chance of avoiding extreme warming and keeping temperature change below 2 degrees Celsius. It's important to know that the INDCs are a stepping stone to what we can do in the future."

To perform the analysis, the team incorporated the INDCs, along with assumptions about future emissions reductions, into a global, technologically detailed model of the world called the Global Change Assessment Model, or GCAM, which includes energy, economy, agriculture and other systems. The GCAM model produced numbers for global greenhouse gas emissions, which the team then fed into a climate model called the Model for the Assessment of Greenhouse-gas Induced Climate Change, or MAGICC. Running the simulations for each scenario 600 times resulted in a range of temperatures for the year 2100, which the team converted into probabilities.
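The final step, turning hundreds of simulated temperature outcomes into probabilities, can be illustrated with a toy Monte Carlo calculation. The scenario forcings, the sampled climate-sensitivity spread and the simple scaling below are all placeholders invented for this sketch; in the study, the actual numbers come from GCAM emissions pathways run through MAGICC.

```python
import numpy as np

rng = np.random.default_rng(0)
n_runs = 600   # the study ran each scenario 600 times

# Hypothetical year-2100 radiative forcings (W/m^2) for three illustrative
# scenarios; in the study these follow from GCAM emissions pathways.
scenarios = {"no action": 8.5, "INDCs only": 6.0, "INDCs + ramp-up": 3.7}

for name, forcing in scenarios.items():
    # Sample climate sensitivity (deg C per CO2 doubling) from a skewed spread,
    # a crude stand-in for the climate uncertainty MAGICC represents.
    ecs = rng.lognormal(mean=np.log(3.0), sigma=0.3, size=n_runs)
    warming = ecs * forcing / 3.7          # rough equilibrium scaling, for illustration
    print(f"{name:>15}: P(stay below 2 C) = {np.mean(warming < 2.0):.2f}, "
          f"P(exceed 4 C) = {np.mean(warming > 4.0):.2f}")
```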

Iyer said the next step is to examine the kinds of policies and institutional frameworks that could pave the way for a robust process enabling emissions reduction efforts to increase progressively over time.

This work was supported by the Global Technology Strategy Program, the William and Flora Hewlett Foundation, the Department of State, and the Environmental Protection Agency.

Waves to Weather research center focuses on deficiencies and challenges of weather prediction with the aim of improving forecast models

Mid-latitude weather can be forecast with a lead time of approximately seven to ten days. This saves the lives of numerous people every year and has an economic value of several billion euros. Nevertheless, weather forecasts are sometimes rather poor, as at Christmas 1999, when storm "Lothar" was not forecast well by any of the leading weather centers. Usually the reasons for such forecast busts are deficient initial data or forecast methods. However, since the atmosphere is a chaotic system, there is also a fundamental limit to predictability beyond which a forecast cannot be extended by any practical means. It is important to recognize this limit and, at the same time, produce the best possible forecast within it. This challenge is taken up by the new Transregional Collaborative Research Center "Waves to Weather" (TRR 165), which has recently received funding approval from the German Research Foundation (DFG).

"This collaborative project is unique in Germany. It will help us to distinguish situations in which the fundamental limits of predictability have been reached from those situations in which improved methods still result in improved forecasts," explained Professor Volkmar Wirth from the Institute of Atmospheric Physics at the Johannes Gutenberg University Mainz (JGU). In addition to the JGU, the consortium, which is being funded by the DFG for an initial four years, includes the Ludwig Maximilian University Munich as the coordinating university, the Karlsruhe Institute of Technology, Heidelberg University, Technical University Munich, and the German National Aeronoautics and Space Research Center (DLR).

Among other things, the scientists will analyze so-called ensemble forecasts in order to investigate the physical processes and interactions that occur, for instance, during the formation of storms or heat waves. Ensembles are generated by running several dozen forecasts from slightly different initial conditions; a measure of uncertainty is then obtained from the extent to which the individual forecasts deviate from each other. "Ensemble prediction allows us to determine the quality of weather forecasts and provides us with a tool to study predictability," added Wirth.
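The idea behind ensemble prediction can be sketched with a toy chaotic system. The example below uses the Lorenz-63 equations purely as a stand-in for a weather model (it is not any model used in the project): a small ensemble is started from slightly perturbed initial conditions, and the spread between members is tracked as it grows.

```python
import numpy as np

def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system, a classic chaotic toy model."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

rng = np.random.default_rng(1)
n_members, n_steps = 50, 3000
base = np.array([1.0, 1.0, 1.05])

# Each ensemble member starts from a slightly perturbed initial condition.
members = base + 1e-3 * rng.standard_normal((n_members, 3))

for step in range(1, n_steps + 1):
    members = np.array([lorenz_step(m) for m in members])
    if step % 500 == 0:
        # The spread of one variable across members is a simple uncertainty measure.
        print(f"step {step:4d}: ensemble spread = {members[:, 0].std():.3f}")
```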

The scientists involved in the project are experts in atmospheric dynamics, cloud physics, statistics, numerical modeling, and visualization. The partners based in Mainz will primarily contribute their expertise in dynamical meteorology and cloud physics, but they will also draw in colleagues from applied mathematics and computer science. "We expect to obtain new insight that allows us to better understand the deficiencies of today's forecast tools. This is how we aim to develop new models and methods in order to extend the forecast range, eventually out to the fundamental limit of predictability," said Professor Volkmar Wirth.

Figure 1: In spin ice crystals, the magnetic spins of electrons (black and blue arrows) make it difficult for the system to reach its lowest energy state. Quantum interactions can cause spins to flip and lead to ring-like, aligned arrangements of the spins (blue arrows).  © 2015 Shigeki Onoda, RIKEN Center for Emergent Matter Science

Supercomputations reveal how quantum interactions can break a deadlock in magnetic spin ice oxides

Physicists have long been seeking to uncover a grand unified theory in which the three non-gravitational forces—the electromagnetic force and the strong and weak nuclear forces—merge into a single force at high temperatures. Such a theory is seen as a critical stepping stone to realizing a ‘theory of everything’ that combines all four forces. Some grand unified theories that have been proposed predict the existence of magnetic monopoles—the magnetic equivalent of electric charges—in certain crystals.

In the crystals of some oxygen-containing compounds, electron spins are arranged at the corners of the tetrahedra that make up the crystal structure (Fig. 1). In the absence of an external magnetic field, these materials should be non-magnetic, with the same number of spins pointing in each direction. In a crystal, there are countless configurations for achieving this. At low temperatures, spins do not have enough energy to rearrange themselves, and thus the system is ‘frozen’ into one of these arrangements. These materials are referred to as spin ices, by analogy with water ice, in which the hydrogen atoms are similarly locked into a disordered arrangement.
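A toy count makes those "countless configurations" concrete. For a single tetrahedron carrying four spins that each point either into or out of it, the rule described above (equal numbers pointing each way) is satisfied by 6 of the 16 possible arrangements; this local degeneracy multiplies across the many tetrahedra of a real crystal. The short sketch below simply enumerates it.

```python
from itertools import product

# Enumerate spin configurations on one tetrahedron: each of the four spins
# points either "in" (+1) or "out" (-1) of the tetrahedron.
configs = list(product((+1, -1), repeat=4))

# The ice rule ("the same number of spins pointing in each direction")
# means exactly two in and two out, i.e. the spins sum to zero.
ice_states = [c for c in configs if sum(c) == 0]

print(f"{len(ice_states)} of {len(configs)} configurations satisfy the ice rule")
# -> 6 of 16; repeated over many corner-sharing tetrahedra, this degeneracy is
#    why a classical spin ice freezes into one of an enormous number of states.
```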

Now, Shigeki Onoda and Yasuyuki Kato from the RIKEN Center for Emergent Matter Science have performed calculations to investigate how, at low temperatures, quantum effects in these materials can cause them to ‘melt’. Their findings highlight the importance of quantum physics in helping systems to reach favorable energy states.

While, according to classical physics, the spins in a spin ice should be frozen at low temperatures, quantum effects allow the spins to interact with each other. “The quantum nature of spins allows for exchanging nearest-neighbor spins,” explains Onoda. Such a system is thus termed a quantum spin ice.

By performing supercomputer calculations, the researchers studied how at low temperatures these quantum interactions cause the quantum spin ice to melt through flipping of spins. Through these quantum effects, the spins can align (blue spins in Fig. 1) and can collectively change their orientation. These collective changes of spin patterns can travel through the crystal in a similar way to particles of light, and are thus dubbed ‘photons’.

The next task is to experimentally verify these theoretical predictions, explains Onoda. “We are working on directly relevant models to experimental systems in order to propose experimental observations of quantum spin ice photons and provide theoretical predictions.” This would bring scientists a step closer to uncovering how photons and magnetic monopoles emerge spontaneously in materials.

New research uses numerical modeling to determine potential fracking hazards before drilling into the ground

Using supercomputer analysis prior to drilling could limit seismic events caused by hydraulic fracturing, according to new research published in the Canadian Geotechnical Journal.

Hydraulic fracturing, also known as "fracking", breaks the subsurface rock mass into pieces by injecting high-pressure fluid. While this gives the fluid or gas more paths to reach production wells, it also leads to several environmental problems, one of which is unwanted shaking of the ground and structures caused by the movement of large faults. Evaluating the seismic effects of fracking before drilling is particularly important because many Enhanced Geothermal Systems (EGS) and hydrocarbon extraction operations occur in tight rock masses in close vicinity to fault zones. The new study addresses the issue of unwanted seismic events by proposing a numerical modeling process for evaluating the effects of hydraulic fracturing prior to drilling.

"By using this process, the hydraulic fracturing industry will be able to infer how potential fluid injection operations can change the movement of the fault systems," says the lead author of the study Jeoung Seok Yoon, Helmholtz-Centre Potsdam - GFZ German Research Centre for Geosciences. "These findings are particularly relevant to the shale gas industry as extraction of shale gas has the undesired potential effect of induced seismic events activating a nearby fault".

According to the research, to avoid unwanted seismic events and improve the safety of hydraulic fracturing operations, a series of numerical simulations should be run to design a carefully implemented hydraulic stimulation scheme. The modeling technique presented in the paper is a hydromechanically coupled dynamic model, which provides unique solutions to the issue of defining risk when drilling near fault zones.
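A drastically simplified version of the question such models answer — will injected fluid pressure push a nearby fault past frictional failure? — can be sketched with a Mohr-Coulomb check on effective stress. All numbers below are invented for illustration; the paper's hydromechanically coupled dynamic model resolves the full stress, flow and rupture behaviour instead.

```python
import numpy as np

# Toy Mohr-Coulomb check of fault reactivation as injection raises pore pressure.
mu = 0.6              # fault friction coefficient (illustrative)
sigma_n = 60.0        # total normal stress resolved on the fault (MPa, illustrative)
tau = 25.0            # shear stress resolved on the fault (MPa, illustrative)

for p in np.arange(0.0, 35.0, 5.0):        # injection-driven pore pressure (MPa)
    sigma_eff = sigma_n - p                # effective normal stress on the fault
    strength = mu * sigma_eff              # Coulomb frictional strength
    status = "SLIP - potential induced event" if tau >= strength else "stable"
    print(f"pore pressure {p:4.1f} MPa -> strength {strength:4.1f} MPa, {status}")
```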

"In our opinion, regulators should require industry to undergo this kind of numerical modeling testing prior to drilling to better estimate the potential hazard beforehand," continues Jeoung Seok Yoon. "It is essential that mechanisms of fluid injection-induced seismicity and fault reactivations are better understood in order to avoid the larger events induced by fluid injection".
