Temperature Record 101: How We Know What We Know about Climate Change

2021 was tied for the sixth warmest year on NASA’s record, which stretches back more than a century. But what is a temperature record? GISTEMP, NASA’s global temperature analysis, takes in millions of observations from instruments on weather stations, ships and ocean buoys, and Antarctic research stations to determine how much warmer or cooler Earth is on average from year to year. Stretching back to 1880...

NASA supercomputing shows 2021 tied for 6th warmest year in continued trend

Earth’s global average surface temperature in 2021 tied with 2018 as the sixth warmest on record, according to independent analyses done by NASA and the National Oceanic and Atmospheric Administration (NOAA). Because the record is global, not every place on Earth experienced the sixth warmest year on record: some places saw record-high temperatures, and record droughts, floods and fires struck around the globe. Credits: NASA’s Scientific Visualization Studio/Kathryn Mersmann

Continuing the planet’s long-term warming trend, global temperatures in 2021 were 1.5 degrees Fahrenheit (0.85 degrees Celsius) above the average for NASA’s baseline period, according to scientists at NASA’s Goddard Institute for Space Studies (GISS) in New York. NASA uses the period from 1951-1980 as a baseline to see how global temperature changes over time.

Collectively, the past eight years are the warmest since modern recordkeeping began in 1880. This annual temperature data makes up the global temperature record – which tells scientists the planet is warming. 

According to NASA’s temperature record, Earth in 2021 was about 1.9 degrees Fahrenheit (or about 1.1 degrees Celsius) warmer than the late 19th-century average, the start of the industrial revolution.

“Science leaves no room for doubt: Climate change is the existential threat of our time,” said NASA Administrator Bill Nelson. “Eight of the top 10 warmest years on our planet occurred in the last decade, an indisputable fact that underscores the need for bold action to safeguard the future of our country – and all of humanity. NASA’s scientific research about how Earth is changing and getting warmer will guide communities throughout the world, helping humanity confront climate change and mitigate its devastating effects.”

This warming trend around the globe is due to human activities that have increased emissions of carbon dioxide and other greenhouse gases into the atmosphere. The planet is already seeing the effects of global warming: Arctic sea ice is declining, sea levels are rising, wildfires are becoming more severe and animal migration patterns are shifting. Understanding how the planet is changing – and how rapidly that change occurs – is crucial for humanity to prepare for and adapt to a warmer world. 

{media id=275,layout=solo}

Weather stations, ships, and ocean buoys around the globe record the temperature at Earth’s surface throughout the year. These ground-based measurements of surface temperature are validated with satellite data from the Atmospheric Infrared Sounder (AIRS) on NASA’s Aqua satellite. Scientists analyze these measurements using supercomputer algorithms to deal with uncertainties in the data and quality control to calculate the global average surface temperature difference for every year. NASA compares that global mean temperature to its baseline period of 1951-1980. That baseline includes climate patterns and unusually hot or cold years due to other factors, ensuring that it encompasses natural variations in Earth’s temperature.
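The anomaly method described above can be sketched in a few lines: each station is compared to its own baseline mean, and only the departures from that mean are averaged, which lets stations with very different absolute climates be combined. The stations and temperatures below are invented for illustration; the real GISTEMP analysis also involves gridding, area weighting, and quality control.

```python
# Toy illustration of the anomaly method: compare each station to its own
# 1951-1980 baseline mean, then average the anomalies. Numbers are invented.

def station_anomaly(readings, baseline_years=(1951, 1980)):
    """Anomaly for each year relative to the station's baseline mean (deg C)."""
    base = [t for year, t in readings.items()
            if baseline_years[0] <= year <= baseline_years[1]]
    baseline_mean = sum(base) / len(base)
    return {year: t - baseline_mean for year, t in readings.items()}

# Two hypothetical stations with very different absolute climates.
station_a = {1951: 10.0, 1980: 10.4, 2021: 11.2}
station_b = {1951: 25.0, 1980: 25.2, 2021: 25.9}

# Averaging the anomalies (not the raw temperatures) makes the two
# stations comparable despite their 15-degree difference in climate.
anomalies = [station_anomaly(s)[2021] for s in (station_a, station_b)]
global_anomaly_2021 = sum(anomalies) / len(anomalies)
print(round(global_anomaly_2021, 2))
```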

Many factors affect the average temperature in any given year, such as the La Niña and El Niño climate patterns in the tropical Pacific. For example, 2021 was a La Niña year, and NASA scientists estimate that it may have cooled global temperatures by about 0.06 degrees Fahrenheit (0.03 degrees Celsius) from what the average would otherwise have been.

A separate, independent analysis by NOAA also concluded that the global surface temperature for 2021 was the sixth-highest since record-keeping began in 1880. NOAA scientists use much of the same raw temperature data in their analysis, but with a different baseline period (1901-2000) and a different methodology.
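The effect of a different baseline period is purely arithmetic: the two agencies’ anomaly figures differ by exactly the difference between their baseline means, so rankings and trends agree even when headline numbers do not. The temperatures below are invented; only the arithmetic matters.

```python
# Why NASA and NOAA can report different anomaly numbers for the same year:
# anomalies are measured relative to different baseline means. All values
# here are hypothetical.

absolute_2021 = 14.8   # hypothetical global mean temperature, deg C
baseline_nasa = 14.0   # hypothetical 1951-1980 mean
baseline_noaa = 13.9   # hypothetical 1901-2000 mean

anom_nasa = absolute_2021 - baseline_nasa
anom_noaa = absolute_2021 - baseline_noaa

# The two anomalies differ by exactly the baseline offset, so a warming
# trend looks identical in both records.
print(round(anom_nasa, 6), round(anom_noaa, 6),
      round(anom_noaa - anom_nasa, 6))
```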

“The complexity of the various analyses doesn’t matter because the signals are so strong,” said Gavin Schmidt, director of GISS, NASA’s leading center for climate modeling and climate change research. “The trends are all the same because the trends are so large.”

NASA’s full dataset of global surface temperatures for 2021, as well as details of how NASA scientists conducted the analysis, are publicly available from GISS.

RIKEN shows how the free-energy principle explains the brain

The RIKEN Center for Brain Science (CBS) in Japan, along with colleagues, has shown that the free-energy principle can explain how neural networks are optimized for efficiency. The study first shows how the free-energy principle underlies any neural network that minimizes energy cost. Then, as a proof of concept, it shows how an energy-minimizing neural network can solve mazes. This finding will be useful for analyzing impaired brain function in thought disorders as well as for generating optimized neural networks for artificial intelligence.

General view of a solved maze: the maze comprises a discrete state space in which white and black cells indicate pathways and walls, respectively, and the blue path is the agent’s trajectory. Starting from the left, the agent must reach the right edge of the maze within a certain number of steps (time). The maze was solved following the free-energy principle.

Biological optimization is a natural process that makes our bodies and behavior as efficient as possible. A behavioral example can be seen in the transition that cats make from running to galloping. Far from being random, the switch occurs precisely at the speed when the amount of energy it takes to gallop becomes less than it takes to run. In the brain, neural networks are optimized to allow efficient control of behavior and transmission of information, while still maintaining the ability to adapt and reconfigure to changing environments.

As with the simple cost/benefit calculation that can predict the speed at which a cat will begin to gallop, researchers at RIKEN CBS are trying to discover the basic mathematical principles that underlie how neural networks self-optimize. The key is the free-energy principle, which is based on a concept called Bayesian inference. In this framework, an agent continually updates its model of the world using new incoming sensory data as well as its own past outputs, or decisions. The researchers compared the free-energy principle with well-established rules that control how the strength of neural connections within a network is altered by changes in sensory input.
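Bayesian inference, the concept underlying the free-energy principle, can be illustrated with a minimal belief update. The two-state world and the likelihood values below are invented for illustration; the paper’s generative model is far richer, but the update rule is the same: multiply the prior by the likelihood of the observation and renormalize.

```python
# Minimal sketch of Bayesian belief updating, the inference scheme on which
# the free-energy principle is built. States and likelihoods are invented.

def bayes_update(prior, likelihood, observation):
    """Posterior over hidden states given one observation."""
    unnorm = {s: prior[s] * likelihood[s][observation] for s in prior}
    z = sum(unnorm.values())
    return {s: p / z for s, p in unnorm.items()}

# Hidden state: is the path ahead open or blocked?
prior = {"open": 0.5, "blocked": 0.5}
# Probability of each sensory cue given the state (assumed values).
likelihood = {"open":    {"bright": 0.8, "dark": 0.2},
              "blocked": {"bright": 0.3, "dark": 0.7}}

# The agent's belief sharpens with each new sensory sample.
belief = bayes_update(prior, likelihood, "bright")
belief = bayes_update(belief, likelihood, "bright")
print(round(belief["open"], 3))
```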

“We were able to demonstrate that standard neural networks, which feature delayed modulation of Hebbian plasticity, perform planning and adaptive behavioral control by taking their previous ‘decisions’ into account,” says first author and Unit Leader Takuya Isomura. “Importantly, they do so the same way that they would when following the free-energy principle.”

Once they had established that neural networks theoretically follow the free-energy principle, the team tested the theory using simulations. The neural networks self-organized by changing the strength of their neural connections and associating past decisions with future outcomes. Viewed as agents governed by the free-energy principle, the networks learned the correct route through a maze by trial and error, in a statistically optimal manner.

These findings point toward a set of universal mathematical rules that describe how neural networks self-optimize. As Isomura explains, “Our findings guarantee that an arbitrary neural network can be cast as an agent that obeys the free-energy principle, providing a universal characterization for the brain.” These rules, along with the researchers’ new reverse engineering technique, can be used to study neural networks for decision-making in people with thought disorders such as schizophrenia and predict the aspects of their neural networks that have been altered.

Another practical use for these universal mathematical rules could be in the field of artificial intelligence, especially those that designers hope will be able to efficiently learn, predict, plan, and make decisions. “Our theory can dramatically reduce the complexity of designing self-learning neuromorphic hardware to perform various types of tasks, which will be important for next-generation artificial intelligence,” says Isomura.

JAIST supercomputer shows novel crystal structure for hydrogen under high pressure

Japanese researchers identify a potential crystal phase for hydrogen solidified at extreme pressures using data science and supercomputer simulations 

Elements in the periodic table can take up multiple forms. Carbon, for example, exists as diamond or graphite depending on the environmental conditions at the time of formation. Crystal structures formed in ultra-high-pressure environments are particularly important, as they provide clues to the formation of planets. However, recreating such environments in a laboratory is difficult, and materials scientists often rely on simulation predictions to identify the existence of such structures.

A new crystal structure (atomic arrangement pattern) called the P21/c-8 type, which is predicted to form under very high pressure, such as deep inside the Earth. Credit: Ryo Maezono from JAIST.

In this regard, hydrogen is especially important for analyzing the distribution of matter in the universe and the behavior of giant gas planets. However, the crystal structures of solid hydrogen formed under high pressure are still under contention owing to the difficulty in conducting experiments involving high-pressure hydrogen. Moreover, the structural pattern is governed by a delicate balance of factors including electric forces on the electrons and fluctuations imposed by quantum mechanics, and for hydrogen, the fluctuations are particularly large, making the predictions of its crystal phases even more difficult.

Recently, in a collaborative study published in Physical Review B, a global team of researchers involving Professor Ryo Maezono and Associate Professor Kenta Hongo from Japan Advanced Institute of Science and Technology tackled this problem using an ingenious combination of supercomputer simulations and data science, revealing various crystal structures for hydrogen at low temperatures near 0 K and high pressures.

“For crystal structures under high pressure, we have been able to generate several candidate patterns using recent data science methods such as genetic algorithms. But whether these candidates are truly the phases that survive under high pressure can only be determined by high-resolution simulations,” explains Prof. Maezono.

Accordingly, the team looked for various possible structures that can be formed with 2 to 70 hydrogen atoms at high pressures of 400 to 600 gigapascals (GPa) using a technique called “particle swarm optimization” and density functional theory (DFT) calculations and estimated their relative stability using first-principles quantum Monte Carlo method and DFT zero-point energy corrections.
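Particle swarm optimization, the search strategy named above, can be sketched on a one-dimensional stand-in “energy” function: a swarm of candidate solutions moves through the search space, each particle pulled toward its own best-known position and the swarm’s best. The function, coefficients, and swarm size below are invented for illustration; the actual study searches high-dimensional atomic coordinates scored by DFT energies.

```python
# Toy particle swarm optimization (PSO) on an invented 1-D "energy" surface.
# Stand-in for the structure search described in the text.
import random

random.seed(0)

def energy(x):
    # Stand-in for a DFT energy evaluation (invented; minimum at x = 2).
    return (x - 2.0) ** 2 + 1.0

n, steps = 20, 200
pos = [random.uniform(-10, 10) for _ in range(n)]
vel = [0.0] * n
best_pos = pos[:]                      # each particle's best position so far
gbest = min(pos, key=energy)           # swarm-wide best position

for _ in range(steps):
    for i in range(n):
        # Standard PSO velocity update: inertia plus randomly weighted
        # pulls toward the personal best and the global best.
        r1, r2 = random.random(), random.random()
        vel[i] = (0.7 * vel[i]
                  + 1.5 * r1 * (best_pos[i] - pos[i])
                  + 1.5 * r2 * (gbest - pos[i]))
        pos[i] += vel[i]
        if energy(pos[i]) < energy(best_pos[i]):
            best_pos[i] = pos[i]
        if energy(pos[i]) < energy(gbest):
            gbest = pos[i]

print(round(gbest, 3))  # the swarm converges near the minimum at x = 2
```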

The search produced 10 possible crystal structures that had not previously been found by experiments, including nine molecular crystals and one mixed structure, Pbam-8, comprising atomic and molecular crystal layers appearing alternately. However, all 10 structures turned out to be dynamically unstable. To obtain a stable structure, the team relaxed Pbam-8 in the direction of its instability, forming a new, dynamically stable structure called P21/c-8. “The new structure is a promising candidate for the solid hydrogen phase realized under high-pressure conditions such as those found deep within the Earth,” says Dr. Hongo.

The new structure was found to be more stable than Cmca-12, a structure previously identified as a valid candidate for the H2-PRE phase, one of the six structural phases identified for solid hydrogen at high pressure (360 to 495 GPa) and stable near 0 K. The team further validated their results by comparing the infrared spectra of the two structures, which revealed a pattern similar to that typically observed for the H2-PRE phase.

Prof. Maezono explains the broader significance of these results: “The hydrogen crystal problem is one of the most challenging and intractable problems in materials science. Depending on the type of approximation used, the predictions can vary greatly, and avoiding approximations is a typical challenge. With our result now verified, we can continue our research on other structure prediction problems, such as those for silicon and magnesium compounds, which have a significant impact on earth and planetary science.”

Brown University physicist discovers a strange metal that could lead to quantum supercomputers, deep insights

Discovery could help scientists to understand “strange metals,” a class of materials that are related to high-temperature superconductors and share fundamental quantum attributes with black holes.

Scientists understand quite well how temperature affects electrical conductance in most everyday metals like copper or silver. But in recent years, researchers have turned their attention to a class of materials that do not seem to follow the traditional electrical rules. Understanding these so-called “strange metals” could provide fundamental insights into the quantum world, and potentially help scientists understand strange phenomena like high-temperature superconductivity.

Using a material called yttrium barium copper oxide arrayed with tiny holes, researchers have discovered “strange metal” behavior in a type of system where the charge carriers are bosons, something that had never been seen before.

Now, a research team co-led by a Brown University physicist has added a discovery to the strange metal mix. The team found strange metal behavior in a material in which electrical charge is carried not by electrons, but by more “wave-like” entities called Cooper pairs.

While electrons belong to a class of particles called fermions, Cooper pairs act as bosons, which follow very different rules from fermions. This is the first time strange metal behavior has been seen in a bosonic system, and researchers are hopeful that the discovery might help find an explanation for how strange metals work — something that has eluded scientists for decades.

“We have these two fundamentally different types of particles whose behaviors converge around a mystery,” said Jim Valles, a professor of physics at Brown and the study’s corresponding researcher. “What this says is that any theory to explain strange metal behavior can’t be specific to either type of particle. It needs to be more fundamental than that.”

Strange metals

Strange metal behavior was first discovered around 30 years ago in a class of materials called cuprates. These copper-oxide materials are most famous for being high-temperature superconductors, meaning they conduct electricity with zero resistance at temperatures far above those of normal superconductors. But even at temperatures above the critical temperature for superconductivity, cuprates act strangely compared to other metals.

As their temperature increases, cuprates’ resistance increases in a strictly linear fashion. In normal metals, the resistance increases only so far, becoming constant at high temperatures in accord with what’s known as Fermi-liquid theory. Resistance arises when electrons flowing in a metal bang into the metal’s vibrating atomic structure, causing them to scatter. Fermi-liquid theory sets a maximum rate at which electron scattering can occur. But strange metals don’t follow the Fermi-liquid rules, and no one is sure how they work. What scientists do know is that the temperature-resistance relationship in strange metals appears to be related to two fundamental constants of nature: Boltzmann’s constant, which represents the energy produced by random thermal motion, and Planck’s constant, which relates to the energy of a photon (a particle of light).
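The role of those two constants is often summarized as the “Planckian” scattering rate, 1/τ = k_B·T/ħ, a timescale built from nothing but Boltzmann’s constant, Planck’s constant, and temperature. A short sketch shows the key property: the rate, and hence the resistivity it produces, is strictly linear in temperature. The framing as a Planckian bound is standard in the strange-metal literature, though the article itself does not write out the formula.

```python
# The temperature-resistance link described in the text, made concrete via
# the "Planckian" scattering rate 1/tau = k_B * T / hbar, which grows
# linearly with temperature.
k_B = 1.380649e-23      # Boltzmann's constant, J/K
hbar = 1.054571817e-34  # reduced Planck's constant, J*s

def planckian_rate(T):
    """Scattering rate (1/s) at absolute temperature T (K)."""
    return k_B * T / hbar

# Doubling the temperature exactly doubles the scattering rate: the
# relationship is strictly linear, as observed in strange metals.
r100 = planckian_rate(100.0)
r200 = planckian_rate(200.0)
print(f"{r100:.3e} 1/s", round(r200 / r100, 6))
```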

“To try to understand what’s happening in these strange metals, people have applied mathematical approaches similar to those used to understand black holes,” Valles said. “So there is some very fundamental physics happening in these materials.”

Of bosons and fermions

In recent years, Valles and his colleagues have been studying electrical activity in which the charge carriers are not electrons. In 1956, Nobel laureate Leon Cooper, now a Brown professor emeritus of physics, discovered that in normal superconductors (not the high-temperature kind discovered later), electrons team up to form Cooper pairs, which can glide through an atomic lattice with no resistance. Despite being formed by two electrons, which are fermions, Cooper pairs can act like bosons.

“Fermion and boson systems usually behave very differently,” Valles said. “Unlike individual fermions, bosons are allowed to share the same quantum state, which means they can move collectively like water molecules in the ripples of a wave.”

In 2019, Valles and his colleagues showed that Cooper pair bosons can produce metallic behavior, meaning they can conduct electricity with some amount of resistance. That in itself was a surprising finding, the researchers say, because elements of quantum theory suggested that the phenomenon shouldn’t be possible. For this latest research, the team wanted to see if bosonic Cooper-pair metals were also strange metals.

The team used a cuprate material called yttrium barium copper oxide patterned with tiny holes that induce the Cooper-pair metallic state. The team cooled the material to just above its superconducting temperature to observe changes in its conductance. They found that, as in fermionic strange metals, the Cooper-pair metal’s conductance varies linearly with temperature.

The researchers say this discovery will give theorists something new to chew on as they try to understand strange metal behavior.

“It’s been a challenge for theoreticians to come up with an explanation for what we see in strange metals,” Valles said. “Our work shows that if you’re going to model charge transport in strange metals, that model must apply to both fermions and bosons — even though these types of particles follow fundamentally different rules.”

Ultimately, a theory of strange metals could have massive implications. Strange metal behavior could hold the key to understanding high-temperature superconductivity, which has vast potential for things like lossless power grids and quantum supercomputers. And because strange metal behavior seems to be related to fundamental constants of the universe, understanding their behavior could shed light on basic truths of how the physical world works.