
Some of the most important crops risk substantial damage from rising temperatures. To better assess how climate change caused by human greenhouse gas emissions will likely impact wheat, maize and soybean, an international team of scientists now ran an unprecedentedly comprehensive set of supercomputer simulations of US crop yields. The simulations were shown to reproduce the observed strong reduction in past crop yields induced by high temperatures, thereby confirming that they capture one main mechanism for future projections. Importantly, the scientists find that increased irrigation can help to reduce the negative effects of global warming on crops – but this is possible only in regions where sufficient water is available. Ultimately, limiting global warming will be needed to keep crop losses in check.

“We know from observations that high temperatures can harm crops, but now we have a much better understanding of the processes,” says Bernhard Schauberger from the Potsdam Institute for Climate Impact Research, lead author of the study. “The [super]computer simulations that we do are based on robust knowledge from physics, chemistry, biology; on a lot of data and elaborate algorithms. But they of course cannot represent the entire complexity of the crop system, hence we call them models. In our study they have passed a critical test.” The scientists compare the model results to data from actual observations. This way, they can find out whether they have included the critical factors in their calculations, from temperature to CO2, from irrigation to fertilization.

Without efficient emission reductions, yield losses of 20 percent for wheat are possible by 2100

For every single day above 30°C, maize and soybean plants can lose about 5 percent of their harvest. The simulations have shown that the models capture how rather small heat increases beyond this threshold can result in abrupt and substantial yield losses. Such temperatures will be more frequent under unabated climate change and can severely harm agricultural productivity. Harvest losses from elevated temperatures of 20 percent for wheat, 40 percent for soybean and almost 50 percent for maize, relative to non-elevated temperatures, can be expected at the end of our century without efficient emission reductions. These losses do not even consider extremely high temperatures above 36°C, which are expected to lower yields further.
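
As a back-of-the-envelope illustration of that rule of thumb (a toy sketch of ours, not the study's crop model), one can assume that each additional day above 30°C removes roughly five percent of the remaining yield:

    # Toy illustration of the reported heat-damage rule of thumb, NOT the study's model.
    # Assumption (ours): each day above 30 degrees C removes ~5% of the remaining yield,
    # so losses compound multiplicatively with every additional hot day.

    LOSS_PER_HOT_DAY = 0.05  # ~5% yield loss per day above 30 degrees C (maize/soybean)

    def relative_yield(days_above_30c, loss_per_day=LOSS_PER_HOT_DAY):
        """Fraction of the heat-free yield that survives a given number of hot days."""
        return (1.0 - loss_per_day) ** days_above_30c

    for hot_days in (0, 5, 10, 14):
        print(f"{hot_days:2d} hot days -> {relative_yield(hot_days):.0%} of normal yield")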

The effects go far beyond the US, one of the largest crop exporters: world market crop prices might increase, which is an issue for food security in poor countries.

Irrigation could be a means for adaptation - yet only in regions where there's sufficient water

When water supply from the soil to the plant decreases, the small openings in the leaves gradually close to prevent water loss. They thereby preclude the diffusion of CO2 into the cells, which is an essential building material for the plants. Additionally, crops respond to water stress by increasing root growth at the expense of above-ground biomass and, eventually, yields.

Burning fossil fuels elevates the amount of CO2 in the air. This usually increases the water use efficiency of plants since they lose less water for each unit of CO2 taken up from the air. However, this cannot be confirmed as a safeguard of yields under high temperatures, the scientists argue. The additional CO2 fertilization in the simulations does not alleviate the drop in yields associated with high temperatures above about 30°C.

The comparison of different supercomputer simulations of climate change impacts is at the heart of the ISIMIP project (Inter-Sectoral Impact Model Intercomparison Project) comprising about 100 modeling groups worldwide. The simulations are generated in cooperation with AgMIP, the international Agricultural Model Intercomparison and Improvement Project.

Soybean harvest in Iowa. Climate change could impact agricultural productivity. Photo: Thinkstock


Professor Daniel Loss from the University of Basel's Department of Physics and the Swiss Nanoscience Institute has been awarded the King Faisal International Prize for Science 2017. The King Faisal Foundation awarded Loss the renowned science prize for his discovery of a concept for the development of a quantum computer based on the intrinsic angular momentum of electrons. Loss has further refined his theory over recent years and established a completely new field of research.

"Daniel Loss has made a considerable contribution to our theoretical understanding of spin dynamics and spin coherence in quantum dots, creating new areas of opportunity with practical applications for spin-based quantum computing," the King Faisal Foundation said. Loss developed the idea to use the intrinsic angular movement (or spin) of electrons in quantum dots as the smallest memory unit, or quantum bit (qubit).

"His work has helped initiate a number of important experimental programs, and opened the door to high-performing quantum computers with extraordinary speed and storage capacity," the Foundation explained in its announcement. Loss will share the USD 200,000 prize with the physicist Professor Laurens Molenkamp from the Department of Physics at the University of Würzburg (Germany).

A theoretical basis for experimental work across the globe

Loss published the theory for the Loss-DiVincenzo quantum computer in 1998 together with David DiVincenzo. In this proposal, artificial atoms without nuclei are created in a semiconductor. The electrons are confined by means of an electrically charged gate. Within these quantum dots, the spin of the electron is partially protected from interaction with its environment, allowing the creation of interference-resistant qubits. These quantum dots and their spin qubits can also be coupled with one another by electrical means.

Numerous experimental research teams worldwide rely on Loss's theoretical contributions. These, as well as many further developments over the years, are an integral part of the effort to create a quantum computer based on spin qubits in semiconductors. Such quantum computers will not only enable superfast calculations and simulations but also make it possible to address fundamental questions of quantum physics.

Highly cited publications and renowned prizes

Daniel Loss has more than 400 specialist publications to his name and is one of the world's most frequently cited physicists. He is Professor of Theoretical Physics at the University of Basel's Department of Physics, Co-Director of the University's Swiss Nanoscience Institute and Director of the Center for Quantum Computing and Quantum Coherence (QC2) in Basel.

Loss was awarded the European Academy of Science's Blaise Pascal Medal for physics in 2014. In 2010, he received Switzerland's most prestigious science award, the Marcel Benoist Prize, and in 2005 he earned Germany's Humboldt Research Award. Since 2014 he has been a member of the German National Academy of Sciences Leopoldina.

This graphic shows approximately 1200 different future scenarios of what the climate will look like, each scenario representing a different mix of technologies and government policies. The Intergovernmental Panel on Climate Change, or IPCC, relies on information like this to see how different combinations of technologies and government regulations will affect global climate until 2100. Graphic: Global Carbon Project

Scientists and policymakers rely on complex supercomputer simulations called Integrated Assessment Models to figure out how to address climate change. But these models need tinkering to make them more accurate.

We know that greenhouse gas emissions from burning fossil fuels and other economic activities are warming the Earth and changing the climate.

The North Pole provides just the latest disturbing evidence of this fact. Right before New Year’s, the International Arctic Buoy Programme recorded temperatures there that were right around freezing, or more than 30 degrees Celsius warmer than usual.

Deciding which large-scale economic and social changes to make so we don’t overheat the planet is not easy, however. Researchers around the globe have built complex supercomputer simulations, called Integrated Assessment Models, or IAMs, that give them a view into possible futures. The models predict what would happen to the climate over the next 80 years based on different mixes of government policies and technologies.

“These models are really important in shaping policy and our current understanding” of how different strategies will affect the climate, said Anders Arvesen, a researcher at the Norwegian University of Science and Technology in the Industrial Ecology Programme.

But Arvesen and colleagues say there are ways to make the models even better, an approach they describe in an article in Nature Climate Change, published online on 4 January.

“Since we only have one Earth to experiment with, we need some kind of test field to study and test the impact of different strategies before we implement them,” said Stefan Pauliuk, a researcher at the University of Freiburg and the study’s first author. “IAMs are the most important test field we have…but how robust are these scenarios, how much can we trust that the technological mix calculated will actually lead to the outcome envisioned by the model?”

The opening ceremony of the Fortieth Session of the IPCC, in Copenhagen, Denmark, 27 October 2014. Approximately 450 participants attended IPCC-40, including government representatives, authors, representatives of UN organizations, members of civil society, and academics. Photo: IPCC


Scenarios can help decide policies

The Intergovernmental Panel on Climate Change, commonly called the IPCC, was established by the United Nations Environment Programme (UNEP) and the World Meteorological Organization (WMO) in 1988. Its mandate is to provide the world with a clear scientific view on the current state of knowledge on climate change and its potential environmental and socio-economic impacts.

Over the decades, the panel has published a series of reports and assessments for policymakers and the public. The most recent report, the 5th Assessment, was finalized in November 2014 and included many different scenarios based on IAMs.

The fact that the IAMs are so influential is why Pauliuk, Arvesen and their colleagues decided to give them a hard look, Arvesen said. “They (IAMs) have been successful and provide useful insights, which is exactly why it is important to improve them.”

The Nature Climate Change perspective describes how the models could be improved using approaches drawn from industrial ecology, where researchers study the flows of materials and energy through society to understand the costs and impacts of societal choices.

Steel, other metals, cement and chemicals

One area where the researchers say industrial ecology might help improve the IAMs is with materials. For example, the production of steel, other metals, cement and chemicals (including plastic) accounts for roughly 50 per cent of all industrial greenhouse gas emissions.


Recycling of aluminium and other metals can make a difference in greenhouse gas emissions, but today’s climate model scenarios don’t always take this into account when they make their predictions. Photo: Thinkstock

But recycling of scrap metals, once done to reduce costs, also reduces greenhouse gas emissions, which should be factored into the IAMs, says Konstantin Stadler, one of the paper’s co-authors and a researcher at NTNU’s Industrial Ecology programme.

“This is one of the paths that is not considered in the IAMs yet, but maybe we need this to have a full understanding of how we can reduce CO2 emissions,” Stadler said. “What policy options do we have, and what kind of consequences would it have if we implement them?”

If the IAMs could be fine-tuned to include this perspective, you could improve the scenarios that the models predict, the researchers said. Otherwise, you might miss out on some significant actions that could make it easier to meet emissions targets, Arvesen said.

“If you do not account for the full spectrum of mitigation options or possible constraints in that part of the system, then you miss out on something that is potentially important,” he said.

No answers — yet


It will be difficult to limit global warming to under 2 degrees C, experts agree. But higher global temperatures will lead to increased droughts, more severe storms and higher sea levels. Photo: Thinkstock

The researchers don’t actually know yet how including more detailed representations of the industrial ecology into IAMs will affect scenario outcomes — but they hope their article will encourage researchers to consider the idea. It is already clear that doing so will enable IAMs to consider a wider spectrum of climate change mitigation options, including recycling and material efficiency, than is the case today.

“What we propose is basically a new line of research where we suggest the two groups work together,” Stadler said. “We identified a problem that needs fixing and suggest how it might be done.”

Reference: Industrial ecology in integrated assessment models. Stefan Pauliuk, Anders Arvesen, Konstantin Stadler and Edgar G. Hertwich. Nature Climate Change. Published online 4 Jan. 2017. http://dx.doi.org/10.1038/nclimate3148

It comes down to privacy -- biomedical research can't proceed without human genomic data sharing, and genomic data sharing can't proceed without some reasonable level of assurance that de-identified data from patients and other research participants will stay de-identified after they're released for research.

Data use agreements that carry penalties for attempted re-identification of participants may be a deterrent, but they're hardly a guarantee of privacy. Genomic data can be partially suppressed as they're released, addressing vulnerabilities and rendering individual records unrecognizable, but suppression quickly spoils a data set's scientific usefulness.

A new study from Vanderbilt University presents an unorthodox approach to re-identification risk, showing how optimal trade-offs between risk and scientific utility can be struck as genomic data are released for research.

The study appears in the American Journal of Human Genetics.

Doctoral candidate Zhiyu Wan, Bradley Malin, Ph.D., and colleagues draw on game theory to simulate the behavior of would-be data privacy adversaries, and show how marrying data use agreements with a more sensitive, scalpel-like data suppression policy can provide greater discretion and control as data are released. Their framework can be used to suppress just enough genomic data to persuade would-be snoops that their best privacy attacks will be unprofitable.
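
The underlying logic can be sketched in a few lines: the sharer withholds just enough detail that a rational attacker's expected profit falls to zero or below. The rewards, costs, penalties, probabilities and suppression levels below are illustrative assumptions, not values from the study.

    # Toy sketch of the game-theoretic idea, not the study's actual model.
    # All rewards, costs, penalties and probabilities below are made-up assumptions.

    def attacker_payoff(p_match, reward, attack_cost, penalty, p_caught):
        """Expected profit of attempting re-identification under a data use agreement."""
        return p_match * reward - attack_cost - p_caught * penalty

    def choose_suppression(utility_by_level, p_match_by_level,
                           reward=1000.0, attack_cost=100.0, penalty=2000.0, p_caught=0.2):
        """Pick the least suppression (highest scientific utility) at which attacking is unprofitable."""
        for level, utility in sorted(utility_by_level.items(), key=lambda kv: -kv[1]):
            if attacker_payoff(p_match_by_level[level], reward, attack_cost, penalty, p_caught) <= 0:
                return level, utility  # a rational attacker stays away at this level
        return None  # no release level deters the attacker

    # More suppression lowers both the scientific utility and the attacker's match probability.
    utility = {"none": 1.0, "light": 0.9, "heavy": 0.5}
    p_match = {"none": 0.6, "light": 0.3, "heavy": 0.05}
    print(choose_suppression(utility, p_match))  # -> ('light', 0.9)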

"Experts in the privacy field are prone to assume the worse case scenario, an attacker with unlimited capability and no aversion to financial losses. But that may not happen in the real world, so you would tend to overestimate the risk and not share anything," Wan said. "We developed an approach that gives a better estimate of the risk."

Malin agrees that failure to come to grips with real-world risk scenarios could stifle genomic data sharing.

"Historically, people have argued that it's too difficult to represent privacy adversaries. But the game theoretic perspective says you really just have to represent all the ways people can interact with each other around the release of data, and if you can do that, then you're going to see the solution. You're doing a simulation of what happens in the real world, and the question just becomes whether you've represented the rules of the game correctly," said Malin, associate professor of Biomedical Informatics, Biostatistics and Computer Science.

To date, no one has faced prosecution for attacking the privacy of de-identified genomic data. Privacy experts nevertheless assume a contest of computerized algorithms as de-identified data are released, with privacy algorithms patrolling the ramparts while nefarious re-identification algorithms try to scale them.

Re-identification attacks have occurred, but according to earlier research by Malin and colleagues, the perpetrators appear to be motivated by curiosity and academic advancement rather than by criminal self-interest. They're sitting at computers just down the hall, so to speak, overpowering your data set's de-identification measures, then publishing an academic paper saying just how they did it. It's all very bloodless and polite.

The new study is something different, more tough-minded, situating data sharing and privacy algorithms in the real world, where people go to jail or are fined for violations. Here the envisaged privacy adversary doesn't wear elbow patches, lacks government backing and is simply out to make a buck through the illicit sale of private information.

De-identified genotype records are linked to de-identified medical, biometric and demographic information. In what the study refers to as "the game," the attacker is assumed already to have some named genotype data in hand, and will attempt to match this identified data to de-identified genotype records as study data are released.

To bring these prospective attackers out of the shadows, the authors present a detailed case study involving release of genotype data from some 8,000 patients. They painstakingly assign illicit economic rewards for the criminal re-identification of research data. Based on costs for generating data, they also assign economic value to the scientific utility of study data.

On the way to estimating risk and the attacker's costs, the authors estimate the likelihood that any named individual genotype record already held by the attacker is included in the de-identified data set slated for release; according to the authors, this key estimate is often neglected in re-identification risk assessments.

The authors measure the utility of a study's genomic data in terms of the frequencies of genetic variants: for a given variant, the greater the difference between its frequency in the study group and its frequency in the general population (based on available reference data), the greater its scientific utility. This approach to utility triumphed recently when Wan and Malin won the 2016 iDASH Healthcare Privacy Protection Challenge. Their winning algorithm proved best at preserving the scientific utility of a genomic data set while thwarting a privacy attack.
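
A minimal sketch of that utility measure might look as follows; the absolute-difference form and the variant names are our own simplifications for illustration.

    # Rough sketch of the utility idea described above: a variant is more useful for
    # research when its frequency in the study cohort differs more from the reference
    # population. The absolute-difference form is an illustrative simplification.

    def variant_utility(study_freq, reference_freq):
        return abs(study_freq - reference_freq)

    def dataset_utility(study_freqs, reference_freqs):
        """Sum per-variant utilities over all variants slated for release."""
        return sum(variant_utility(study_freqs[v], reference_freqs[v]) for v in study_freqs)

    study = {"rs1": 0.32, "rs2": 0.08, "rs3": 0.51}      # hypothetical cohort frequencies
    reference = {"rs1": 0.25, "rs2": 0.07, "rs3": 0.50}  # hypothetical population reference
    print(round(dataset_utility(study, reference), 2))   # 0.07 + 0.01 + 0.01 = 0.09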

For any genomic data set, before any data are released in a game's opening move, the sharer can use the game to compare various data sharing policies in terms of risk and utility. In the case study, the game theoretic policy provides the best payoff to the sharer, vastly outperforming a conventional data suppression policy and edging out a data use agreement policy.

No matter where parameters are set regarding illicit financial rewards or information that's likely to be wielded by an attacker, the authors show that the game theoretic approach generally provides the best payoff to the sharer. They sketch how their approach could serve the release of data from other sources, including the federal government's upcoming Precision Medicine Initiative.

This is a flow visualization of the dominant vortex structure of the full-tuck posture of the downhill skier at a flow speed of 40 m/s. Red: counterclockwise vortices, Blue: clockwise vortices, viewed from the back.

Supercomputer simulation models air flow and air resistance around skiers in high resolution, an improvement on wind tunnels

Minimizing air resistance and friction with snow is key to elite performance in downhill skiing. Experiments in wind tunnels have revealed the total drag experienced by skiers, but have not provided precise data on which parts of the body cause the most air resistance when adopting the full-tuck position.

A new study published in the European Journal of Physics reports the findings of a research team at the University of Tsukuba that established a new computational modeling approach that provides precise 3D data on air flow, vortex formation, and lift around a skier's body. This is expected to be useful for designing better ski equipment and determining the ideal posture to adopt during skiing.

As downhill skiers can exceed speeds of 120 km/h, they are subjected to high levels of air resistance, and must adopt a tuck position to reduce this. However, at elite levels, where podium places can be separated by hundredths of a second, minuscule differences in air resistance can be extremely important, so much effort has been expended on modeling and reducing this.
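
To put those speeds in perspective, aerodynamic drag grows with the square of speed, F = ½ρ·CdA·v². The drag-area values below are rough assumptions for illustration, not measurements from the study.

    # Illustrative only: aerodynamic drag F = 0.5 * rho * CdA * v^2.
    # The drag-area (CdA) values are rough assumptions, not figures from the study.

    RHO_AIR = 1.2  # kg/m^3, air density near sea level

    def drag_force(speed_kmh, drag_area_m2, rho=RHO_AIR):
        v = speed_kmh / 3.6  # km/h -> m/s
        return 0.5 * rho * drag_area_m2 * v ** 2

    for posture, cda in (("upright (assumed CdA = 0.6 m^2)", 0.6),
                         ("full tuck (assumed CdA = 0.25 m^2)", 0.25)):
        print(f"{posture}: {drag_force(120, cda):.0f} N at 120 km/h")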

The University of Tsukuba team has advanced this field of study by establishing a novel approach to computational modeling of air flow alongside wind tunnel experiments. Wind tunnel experiments using a mannequin provided total drag data at different air flow speeds, which were used to validate the computer simulations. In the novel supercomputer simulations, a type of computational fluid dynamics analysis called the lattice Boltzmann method was applied, in which a 3D grid was created to model air flow at and around the surface of the skier's body.
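
For readers curious what a lattice Boltzmann solver looks like, here is a deliberately tiny two-dimensional sketch of the general method (BGK collision, bounce-back walls) for flow past a disc. It is a toy illustration only, nothing like the team's high-resolution 3D simulations, and every parameter in it is an assumption.

    # Minimal 2-D (D2Q9) lattice Boltzmann sketch: BGK collision + bounce-back obstacle.
    # A toy illustration of the general method only; all parameters are assumptions.
    import numpy as np

    nx, ny = 200, 80
    tau = 0.6                    # relaxation time (sets the fluid viscosity)
    u_in = 0.08                  # initial flow speed in lattice units

    # D2Q9 lattice velocities and weights
    c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
    w = np.array([4/9] + [1/9]*4 + [1/36]*4)

    def equilibrium(rho, ux, uy):
        cu = 3*(c[:,0,None,None]*ux + c[:,1,None,None]*uy)
        usq = 1.5*(ux**2 + uy**2)
        return rho * w[:,None,None] * (1 + cu + 0.5*cu**2 - usq)

    # A disc stands in for the "skier"; the domain wraps around periodically.
    X, Y = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
    obstacle = (X - nx//4)**2 + (Y - ny//2)**2 < (ny//8)**2
    fluid = ~obstacle

    rho = np.ones((nx, ny))
    ux, uy = np.full((nx, ny), u_in), np.zeros((nx, ny))
    f = equilibrium(rho, ux, uy)

    for step in range(2000):
        rho = f.sum(axis=0)                                   # macroscopic density
        ux = (f * c[:,0,None,None]).sum(axis=0) / rho         # macroscopic velocity
        uy = (f * c[:,1,None,None]).sum(axis=0) / rho
        feq = equilibrium(rho, ux, uy)
        f[:, fluid] += (feq[:, fluid] - f[:, fluid]) / tau    # BGK collision (fluid nodes)
        for i, j in ((1,3),(2,4),(5,7),(6,8)):                # bounce-back inside the disc
            f[i][obstacle], f[j][obstacle] = f[j][obstacle], f[i][obstacle]
        for k in range(9):                                    # streaming step
            f[k] = np.roll(np.roll(f[k], c[k,0], axis=0), c[k,1], axis=1)

    print("mean flow speed after 2000 steps:", float(np.sqrt(ux**2 + uy**2)[fluid].mean()))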

"The lattice Boltzmann method allowed us to identify regions of low air flow and places where vortices of air flow formed," study coauthor Sungchan Hong says. "Because of the precision of this simulation, in contrast to wind tunnel experiments, we could show that the head, upper arms, upper legs, and thighs are particular sources of drag."

The validity of the results was supported by the high correlation between the empirical results of total drag on the skier mannequin in a wind tunnel and corresponding data in the supercomputer simulations.

"Now we know which parts of the body have the greatest effects of slowing a skier down, we can design equipment to reduce the air resistance associated with this, and also suggest small changes to a skier's posture that could increase speed," lead author Takeshi Asai says.

The team intends to extend this work by applying the new approach to various skiing postures adopted during different parts of a race, and by using different models of turbulence to increase the reliability of their results.

Tyrannosaurus Rex "Tristan", on display at the Museum für Naturkunde - Leibniz Institute for Evolution and Biodiversity Science in Berlin with which PIK is cooperating. Photo: Carola Radke/Museum für Naturkunde

66 million years ago, the sudden extinction of the dinosaurs started the ascent of the mammals, ultimately resulting in humankind’s reign on Earth. Climate scientists have now reconstructed how tiny droplets of sulfuric acid that formed high up in the air after the well-known impact of a large asteroid, blocking the sunlight for several years, had a profound influence on life on Earth. Plants died, and death spread through the food web. Previous theories focused on the shorter-lived dust ejected by the impact. The new supercomputer simulations show that the droplets resulted in long-lasting cooling, a likely contributor to the death of land-living dinosaurs. An additional kill mechanism might have been a vigorous mixing of the oceans, caused by the surface cooling, severely disturbing marine ecosystems.

“The big chill following the impact of the asteroid that formed the Chicxulub crater in Mexico is a turning point in Earth history,” says Julia Brugger from the Potsdam Institute for Climate Impact Research (PIK), lead author of the study to be published today in Geophysical Research Letters. “We can now contribute new insights for understanding the much debated ultimate cause for the demise of the dinosaurs at the end of the Cretaceous period.” To investigate the phenomenon, the scientists for the first time used a specific kind of computer simulation normally applied in different contexts, a climate model coupling atmosphere, ocean and sea ice. They built on research showing that sulfur-bearing gases that evaporated from the violent asteroid impact on our planet’s surface were the main factor for blocking the sunlight and cooling down Earth.

In the tropics, annual mean temperature fell from 27 to 5 degrees Celsius

“It became cold, I mean, really cold,” says Brugger. Global annual mean surface air temperature dropped by at least 26 degrees Celsius. The dinosaurs were used to living in a lush climate. After the asteroid’s impact, the annual average temperature was below freezing point for about 3 years. Evidently, the ice caps expanded. Even in the tropics, annual mean temperatures went from 27 degrees to a mere 5 degrees. “The long-term cooling caused by the sulfate aerosols was much more important for the mass extinction than the dust that stays in the atmosphere for only a relatively short time. It was also more important than local events like the extreme heat close to the impact, wildfires or tsunamis,” says co-author Georg Feulner who leads the research team at PIK. It took the climate about 30 years to recover, the scientists found.

In addition to this, ocean circulation became disturbed. Surface waters cooled down, thereby becoming denser and hence heavier. While these cooler water masses sank into the depths, warmer water from deeper ocean layers rose to the surface, carrying nutrients that likely led to massive blooms of algae, the scientists argue. It is conceivable that these algal blooms produced toxic substances, further affecting life at the coasts. Yet in any case, marine ecosystems were severely shaken up, and this likely contributed to the extinction of species in the oceans, like the ammonites.

“It illustrates how important the climate is for all lifeforms on our planet”

The dinosaurs, until then the masters of the Earth, made space for the rise of the mammals, and eventually humankind. The study of Earth’s past also shows that efforts to study future threats by asteroids have more than just academic interest. “It is fascinating to see how evolution is partly driven by an accident like an asteroid’s impact – mass extinctions show that life on Earth is vulnerable,” says Feulner. “It also illustrates how important the climate is for all lifeforms on our planet. Ironically today, the most immediate threat is not from natural cooling but from human-made global warming.”

Using a combination of infrared spectroscopy and supercomputer simulation, researchers at Ruhr-Universität Bochum (RUB) have gained new insights into the workings of protein switches. With high temporal and spatial resolution, they verified that a magnesium atom contributes significantly to switching the so-called G-proteins on and off.

G-proteins affect, for example, seeing, smelling, tasting and the regulation of blood pressure. They constitute the point of application for many drugs. “Consequently, understanding their workings in detail is not just a matter of academic interest,” says Prof Dr Klaus Gerwert, Head of the Department of Biophysics. Together with his colleagues, namely Bochum-based private lecturer Dr Carsten Kötting and Daniel Mann, he published his findings in the Biophysical Journal. The journal features the subject matter as its cover story in the edition published on January 10, 2017 (http://www.cell.com/biophysj/issue?pii=S0006-3495%2816%29X0003-3).

G-proteins as source of disease

GTP can bind to all G-proteins. If an enzyme cleaves a phosphate group from the bound GTP, the G-protein is switched off. This so-called GTP hydrolysis takes place in the active centre of enzymes within seconds. If the process fails, severe diseases may be triggered, such as cancer, cholera or the rare McCune-Albright syndrome, which is characterised by, for example, abnormal bone metabolism.

Magnesium important for switch mechanism

In order for GTP hydrolysis to take place, a magnesium atom has to be present in the enzyme’s active centre. The research team observed for the first time directly in what way the magnesium affects the geometry and charge distribution in its environment. After being switched off, the atom remains in the enzyme’s binding pocket. To date, researchers had assumed that the magnesium leaves the pocket after the switching off process is completed.

The new findings have been gathered thanks to a method developed at the RUB Department of Biophysics. It makes it possible to monitor enzymatic processes at high temporal and spatial resolution in their natural state. The method in question is a special type of spectroscopy, namely time-resolved Fourier transform infrared spectroscopy. However, the data measured with its aid do not provide any information about the precise location in the enzyme where a process is taking place. The researchers gather this information through quantum-mechanical computer simulations of structural models. “Computer simulations are crucial for decoding the information hidden in the infrared spectra,” explains Carsten Kötting. The supercomputer, so to speak, becomes a microscope.

How proteins accelerate the switching off process

In the current study, the RUB biophysicists also demonstrated in what way the specialised protein environment affects the acceleration of GTP hydrolysis. They analysed the role of a lysine amino acid, which is located in the same spot in many G-proteins. It binds precisely that phosphate group of the GTP molecule from which a phosphate is separated when the G-protein is switched off.

“The function of lysine is to accelerate GTP hydrolysis by transferring negative charges from the third phosphate group to the second phosphate group,” elaborates Daniel Mann. “This is a crucial starting point for the development of drugs for the treatment of cancer and other severe diseases in the long term.”

The research team Carsten Kötting, Daniel Mann and Klaus Gerwert (left to right) sets up the measuring equipment. The detector of the spectrometer has to be cooled with liquid nitrogen. © RUB, Marquard

Copper is an essential element of our society with main uses in the field of electricity and electronics. About 70% of the copper comes from deposits formed several million years ago during events of magma degassing within the Earth's crust just above subduction zones. Despite similar ore forming processes, the size of these deposits can vary by orders of magnitude from one place to another, and the main reason for this has remained unclear. A new study led by researchers from the Universities of Geneva (UNIGE, Switzerland) and Saint-Etienne (France), to be published in Scientific Reports, suggests that the answer may come from the volume of magma emplaced in the crust and proposes an innovative method to better explore these deposits.

Magmas formed above subduction zones contain important amounts of water that are essentially degassed during volcanic eruptions or upon magma cooling and solidification at depth. The water escaping from the crystallizing magma several kilometers below the surface carries most of the copper initially dissolved in the magma. On their way toward the surface the magmatic fluids cool and deposit copper in the fractured rocks, forming giant metal deposits such as those exploited along the Andean Cordillera.

By modeling the process of magma degassing, the researchers could reproduce the chemistry of the fluids that form metal deposits. "Comparing the model results with available data from known copper deposits, we could link the timescales of magma emplacement and degassing in the crust, the volume of magma, and the size of the deposit", explains Luca Caricchi, researcher at the UNIGE. The scientists also propose a new method to estimate the size of the deposits, based on high-precision geochronology, one of the specialties of the Department of Earth Sciences in UNIGE's Science Faculty.

This technique is a new addition to the prospector's toolbox, making it possible to identify the deposits with the best potential early in the long and costly process of mineral exploration. It is anticipated that the computational approach developed in this study can also provide important insights on the role of magma degassing as a potential trigger for volcanic eruptions.

CAPTION This is an active magmatic system.

Physicists at the National Institute of Standards and Technology (NIST) have cooled a mechanical object to a temperature lower than previously thought possible, below the so-called "quantum limit."

The new NIST theory and experiments, described in the Jan. 12, 2017, issue of Nature, showed that a microscopic mechanical drum--a vibrating aluminum membrane--could be cooled to less than one-fifth of a single quantum, or packet of energy, lower than ordinarily predicted by quantum physics. The new technique theoretically could be used to cool objects to absolute zero, the temperature at which matter is devoid of nearly all energy and motion, NIST scientists said.

"The colder you can get the drum, the better it is for any application," said NIST physicist John Teufel, who led the experiment. "Sensors would become more sensitive. You can store information longer. If you were using it in a quantum computer, then you would compute without distortion, and you would actually get the answer you want."

"The results were a complete surprise to experts in the field," Teufel's group leader and co-author José Aumentado said. "It's a very elegant experiment that will certainly have a lot of impact."

The drum, 20 micrometers in diameter and 100 nanometers thick, is embedded in a superconducting circuit designed so that the drum motion influences the microwaves bouncing inside a hollow enclosure known as an electromagnetic cavity. Microwaves are a form of electromagnetic radiation, so they are in effect a form of invisible light, with a longer wavelength and lower frequency than visible light.

The microwave light inside the cavity changes its frequency as needed to match the frequency at which the cavity naturally resonates, or vibrates. This is the cavity's natural "tone," analogous to the musical pitch that a water-filled glass will sound when its rim is rubbed with a finger or its side is struck with a spoon.

NIST scientists previously cooled the quantum drum to its lowest-energy "ground state," or one-third of one quantum. They used a technique called sideband cooling, which involves applying a microwave tone to the circuit at a frequency below the cavity's resonance. This tone drives electrical charge in the circuit to make the drum beat. The drumbeats generate light particles, or photons, which naturally match the higher resonance frequency of the cavity. These photons leak out of the cavity as it fills up. Each departing photon takes with it one mechanical unit of energy--one phonon--from the drum's motion. This is the same idea as laser cooling of individual atoms, first demonstrated at NIST in 1978 and now widely used in applications such as atomic clocks.
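
For context, the "generally accepted limit" that the new technique sidesteps can be stated compactly. In standard cavity-optomechanics theory (a textbook result, not quoted in the NIST release), conventional sideband cooling with ordinary, unsqueezed drive light bottoms out at a residual phonon occupancy of roughly

    % Conventional quantum backaction limit of resolved-sideband cooling
    % (standard cavity-optomechanics result; kappa = cavity linewidth,
    %  omega_m = mechanical resonance frequency, valid for kappa << omega_m)
    \[
      n_{\min} \approx \left( \frac{\kappa}{4\,\omega_m} \right)^{2}
    \]

a floor set by the quantum fluctuations of the drive light itself, which is what the squeezed-light approach described below works around.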

The latest NIST experiment adds a novel twist--the use of "squeezed light" to drive the drum circuit. Squeezing is a quantum mechanical concept in which noise, or unwanted fluctuations, is moved from a useful property of the light to another aspect that doesn't affect the experiment. These quantum fluctuations limit the lowest temperatures that can be reached with conventional cooling techniques. The NIST team used a special circuit to generate microwave photons that were purified or stripped of intensity fluctuations, which reduced inadvertent heating of the drum.

"Noise gives random kicks or heating to the thing you're trying to cool," Teufel said. "We are squeezing the light at a 'magic' level--in a very specific direction and amount--to make perfectly correlated photons with more stable intensity. These photons are both fragile and powerful."

The NIST theory and experiments indicate that squeezed light removes the generally accepted cooling limit, Teufel said. This includes objects that are large or operate at low frequencies, which are the most difficult to cool.

The drum might be used in applications such as hybrid quantum supercomputers combining both quantum and mechanical elements, Teufel said. A hot topic in physics research around the world, quantum supercomputers could theoretically solve certain problems considered intractable today.

CAPTION NIST researchers applied a special form of microwave light to cool a microscopic aluminum drum to an energy level below the generally accepted limit, to just one fifth of a single quantum of energy. The drum, which is 20 micrometers in diameter and 100 nanometers thick, beat 10 million times per second while its range of motion fell to nearly zero. CREDIT Teufel/NIST
