Japan's climate modeling shows global warming at least doubled the probability of extreme ocean warming

Sea surface temperatures observed in July and August 2022 near Japan in ten monitoring areas. This figure has been updated with the latest observational data obtained after the acceptance of the paper. Area numbers where extreme ocean warming events are identified are shown with area-averaged values and anomalies relative to 1991-2020. Single- or double-star marks indicate that the event frequency has increased more than twofold or more than tenfold due to global warming, significant at the 95% confidence level. COBE-SST2 was retrieved from the NEAR-GOOS RRTDB website on 15 September 2022, so the results for August 2022 are still preliminary and are not included in the published article. CREDIT: NIES

Limit global warming to below 1.5°C to keep record-warm sea surface temperatures from becoming the new normal climate

In the past decade, the marginal seas of Japan frequently experienced extremely high sea surface temperatures (SSTs). A new study led by National Institute for Environmental Studies (NIES) researchers revealed that the increased occurrence frequency of extreme ocean warming events since the 2000s is attributable to global warming due to industrialization.

In August 2020, the southern area of Japan and the northwestern Pacific Ocean experienced unprecedentedly high SSTs, according to the Japan Meteorological Agency (JMA). A study published in January 2021 revealed that the record-high northwestern Pacific SST observed in August 2020 could not have been expected to occur without human-induced climate change. Since then, the JMA has again announced that record-high SSTs were observed near Japan in July and October 2021 and from June to August 2022, but it has remained unclear to what extent climate change altered the likelihood of these regional extreme warming events.

“The impacts of global warming are not uniform; rather, they show regional and seasonal differences,” said co-author Hideo Shiogama, head of the Earth System Risk Assessment Section in the Earth System Division at NIES. “A comprehensive analysis of regional SSTs over a long period can provide a quantitative understanding of how much the ocean conditions near Japan have been, and will be, affected by global warming. This better informs policymakers planning climate change mitigation and adaptation strategies.”

The paper published in Geophysical Research Letters today quantifies the contribution of global warming to discrete monthly extreme ocean warming events in Japan’s marginal seas, defined as events that would occur less than once per 20 years in the preindustrial era. A climate research group at NIES focused on ten monitoring areas operationally used by the JMA, including the Japan Sea, East China Sea, Okinawa Islands, east of Taiwan, and the Pacific coasts of Japan. The scientists confirmed that observed SST changes from 1982 to 2021 were well reproduced by 24 climate models participating in the sixth phase of the Coupled Model Intercomparison Project (CMIP6), except in the region east of Hokkaido. Extreme ocean warming events were then identified in the remaining nine monitoring areas to quantify the contribution of climate change to each.

Extreme ocean warming events observed from January 1982 to August 2022 and the contribution of climate change to each. This figure has been updated with the latest observational data obtained after the acceptance of the paper. (left) South of the Japan Sea (Area 3) and (right) south of the East China Sea (Area 8). The upper panels show the observed sea surface temperature anomalies of COBE-SST2 (relative to 1991-2020) with shading, and the identified extreme ocean warming events with stippling (less than once per 20 years) and hatching (less than once per 100 years). The bottom panels show the "Fraction of Attributable Risk" (FAR = 1 - P0/P1, where P0 and P1 are the occurrence probabilities under the preindustrial and present climate conditions) with shading, and FAR values significantly exceeding 0.5 and 0.9 at the 95% confidence level with hachures and stippling. Note that a FAR of 0.5 or 0.9 corresponds to a doubling or a tenfold increase in the occurrence probability relative to the preindustrial era. Bar plots in the middle panels indicate the annual number of months in which the events are identified. The bars are filled with orange or dark red when FAR significantly exceeds 0.5 or 0.9. COBE-SST2 was retrieved from the NEAR-GOOS RRTDB website on 15 September 2022, so the results for August 2022 are still preliminary and are not included in the published paper. CREDIT: NIES

Extreme ocean warming and climate change

“In the present climate, every extreme ocean warming event is linked to global warming,” said corresponding lead author Michiya Hayashi, a research associate at NIES. The scientists estimated the occurrence frequency of each event under the present and preindustrial climate conditions from January 1982 through July 2022 based on the CMIP6 climate models. “We found that the occurrence probability of almost all the extreme ocean warming events since the 2000s has at least doubled relative to the preindustrial era. It has increased more than tenfold in a sizeable number of cases since the mid-2010s, especially in southern Japan.”
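The "doubled" and "tenfold" thresholds map directly onto the fraction of attributable risk, FAR = 1 - P0/P1, described in the figure caption above. A minimal sketch of that arithmetic (the probabilities below are illustrative, not values from the study):

```python
def fraction_of_attributable_risk(p0: float, p1: float) -> float:
    """FAR = 1 - P0/P1, where P0 and P1 are the occurrence
    probabilities of an event under the preindustrial and present
    climate conditions, respectively."""
    return 1.0 - p0 / p1

# Illustrative probabilities (not from the study):
# a doubling of the occurrence probability gives FAR = 0.5 ...
print(fraction_of_attributable_risk(0.05, 0.10))  # → 0.5
# ... and a tenfold increase gives FAR = 0.9
print(fraction_of_attributable_risk(0.01, 0.10))  # → 0.9
```

This is why the study's significance thresholds of FAR > 0.5 and FAR > 0.9 correspond to "at least doubled" and "more than tenfold."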

For instance, in July 2022, anomalously high SSTs observed in five monitoring areas, including the Japan Sea (Areas 1, 3), the East China Sea (Areas 5, 8), and south of Okinawa near Taiwan (Area 10), were identified as extreme ocean warming events. The updated results based on the preliminary data retrieved from the NEAR-GOOS RRTDB website on 15 September 2022 (not included in the published paper) show that, in August 2022, events were also identified in six monitoring areas south of 35°N: the East China Sea (Areas 5, 8), south and east of Okinawa (Areas 10, 9), southeastern Kanto (Area 7), and the seas off Shikoku and Tokai (Area 6). “We estimate that, for all of these events identified in July and August 2022, the occurrence frequencies at least doubled due to climate change, and increased more than tenfold for those south of 35°N except in the northern East China Sea,” stated Hayashi.

“Climate change impacts on extreme ocean warming events in northern Japan began to emerge relatively late compared to southern Japan,” noted Shiogama. The increase in global aerosol emissions until the 1980s tended to cool the Earth’s surface, an effect that was more substantial in the North Pacific, especially near northern Japan, via large-scale atmospheric circulation changes. In addition, the year-to-year natural variability of SST is large in northern Japan, so the global warming signal was less detectable there than in southern Japan. Because global aerosol emissions have been reduced in recent decades, the cooling effect has become less able to offset human-induced greenhouse gas warming. “Our study indicates,” continued Shiogama, “that the contribution of climate change to SST extremes has already become discernible beyond natural variability even in northern Japan under the present climate condition.”

What about the ocean conditions expected in the future? The researchers further compared the probabilities of exceeding the monthly record-high SSTs around Japan at global warming levels from 0°C to 2°C using the 24 CMIP6 climate model outputs from 1901 to 2100. “Once global warming reaches 2°C, all nine monitoring areas are expected to experience SSTs warmer than the past highest levels at least every two years,” said Tomoo Ogura, a co-author and the head of the Climate Modeling and Analysis Section at Earth System Division, NIES. He added, “Limiting global warming to below 1.5°C is necessary to keep record-warm conditions in Japan's marginal seas from becoming the new normal climate.”

The quantitative analysis of SSTs around Japan implies that climate change has already become the major factor behind most of the record-high SSTs in recent years. “In the future, the dynamics of each extreme warming event need to be examined by taking long-term climate change and year-to-year natural variability into account,” noted Hayashi. “Nevertheless, we expect that our statistical results based on the latest climate models will help to implement adaptation and mitigation measures for climate change.”

MIT astronomers discover a cataclysmic pair of stars with the shortest orbit yet

An artist’s illustration shows a white dwarf (right) circling a larger, sun-like star (left) in an ultra-short orbit, forming a “cataclysmic” binary system. The stars circle each other every 51 minutes, confirming a decades-old prediction.

Nearly half the stars in our galaxy are solitary like the sun. The other half comprises stars that circle other stars, in pairs and multiples, with orbits so tight that some stellar systems could fit between Earth and the moon.

Astronomers at MIT and elsewhere have now discovered a stellar binary, or pair of stars, with an extremely short orbit, appearing to circle each other every 51 minutes. The system seems to be one of a rare class of binaries known as a “cataclysmic variable,” in which a star similar to our sun orbits tightly around a white dwarf — a hot, dense core of a burned-out star. 

A cataclysmic variable occurs when the two stars draw close, over billions of years, causing the white dwarf to start accreting, or eating, material away from its partner star. This process can give off enormous, variable flashes of light that, centuries ago, astronomers assumed to be the result of some unknown cataclysm.

The newly discovered system, which the team has tagged ZTF J1813+4251, is a cataclysmic variable with the shortest orbit detected to date. Unlike other such systems observed in the past, the astronomers caught this cataclysmic variable as the stars eclipsed each other multiple times, allowing the team to precisely measure the properties of each star.

With these measurements, the researchers ran simulations of what the system is likely doing today and how it should evolve over the next hundreds of millions of years. They conclude that the stars are currently in transition and that the sun-like star has been circling and “donating” much of its hydrogen atmosphere to the voracious white dwarf. The sun-like star will eventually be stripped down to a mostly dense, helium-rich core. In another 70 million years, the stars will migrate even closer together, with an ultrashort orbit reaching just 18 minutes, before they begin to expand and drift apart.

Decades ago, researchers at MIT and elsewhere predicted that such cataclysmic variables should transition to ultrashort orbits. This is the first time such a transitioning system has been observed directly.

“This is a rare case where we caught one of these systems in the act of switching from hydrogen to helium accretion,” says Kevin Burdge, a Pappalardo Fellow in MIT’s Department of Physics. “People predicted these objects should transition to ultrashort orbits, and it was debated for a long time whether they could get short enough to emit detectable gravitational waves. This discovery puts that to rest.”​

Sky search

The astronomers discovered the new system within a vast catalog of stars, observed by the Zwicky Transient Facility (ZTF), a survey that uses a camera attached to a telescope at the Palomar Observatory in California to take high-resolution pictures of vast swaths of the sky.

The survey has taken more than 1,000 images of each of the more than 1 billion stars in the sky, recording each star’s changing brightness over days, months, and years.

Burdge combed through the catalog, looking for signals of systems with ultrashort orbits, the dynamics of which can be so extreme that they should give off dramatic bursts of light and emit gravitational waves.

 “Gravitational waves are allowing us to study the universe in a totally new way,” says Burdge, who is searching the sky for new gravitational-wave sources.

For this new study, Burdge looked through the ZTF data for stars that appeared to flash repeatedly, with a period of less than an hour — a frequency that typically signals a system of at least two closely orbiting objects, with one crossing the other and briefly blocking its light.

He used an algorithm to weed through over 1 billion stars, each of which was recorded in more than 1,000 images. The algorithm sifted out about 1 million stars that appeared to flash every hour or so. Among these, Burdge then looked by eye for signals of particular interest. His search zeroed in on ZTF J1813+4251 — a system that resides about 3,000 light years from Earth, in the Hercules constellation.
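The idea behind this kind of period search can be sketched with a toy phase-folding (phase dispersion minimization) example: fold the brightness measurements at many trial periods, and the true period is the one that stacks the eclipse dips on top of each other. Everything below — the cadence, noise level, and search grid — is invented for illustration and is not the actual ZTF pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated light curve: 1000 brightness measurements at random
# times over two days (all numbers invented for illustration).
true_period_hours = 51.0 / 60.0                     # a 51-minute eclipse cycle
t = np.sort(rng.uniform(0.0, 48.0, 1000))           # observation times, hours
flux = np.ones_like(t)
in_eclipse = (t % true_period_hours) / true_period_hours < 0.1
flux[in_eclipse] -= 0.5                             # dip while one star blocks the other
flux += rng.normal(0.0, 0.02, t.size)               # measurement noise

def fold_dispersion(t, flux, period, n_bins=20):
    """Phase-fold the light curve at a trial period and return the
    summed in-bin variance; the true period concentrates the eclipse
    into a few phase bins, minimizing this statistic."""
    phase = (t % period) / period
    bins = np.minimum((phase * n_bins).astype(int), n_bins - 1)
    return sum(flux[bins == b].var() for b in range(n_bins) if np.any(bins == b))

# Scan trial periods between 30 and 70 minutes.
trial_periods = np.linspace(30.0, 70.0, 4000) / 60.0
best = trial_periods[np.argmin([fold_dispersion(t, flux, p) for p in trial_periods])]
print(f"recovered period: {best * 60:.1f} minutes")  # close to 51
```

A survey-scale search over a billion stars requires far faster, specialized period-finding algorithms, but the underlying principle — find the fold that makes the repeating dip line up — is the same.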

“This thing popped up, where I saw an eclipse happening every 51 minutes, and I said, OK, this is definitely a binary,” Burdge recalls.

A dense core

He and his colleagues further focused on the system using the W.M. Keck Observatory in Hawaii and the Gran Telescopio Canarias in Spain. They found that the system was exceptionally “clean,” meaning they could clearly see its light change with each eclipse. With such clarity, they were able to precisely measure each object’s mass and radius, as well as their orbital period.

They found that the first object was likely a white dwarf, at 1/100th the size of the sun and about half its mass. The second object was a sun-like star near the end of its life, at a tenth the size and mass of the sun (about the size of Jupiter). The stars also appeared to orbit each other every 51 minutes.

Yet, something didn’t quite add up.

“This one star looked like the sun, but the sun can’t fit into an orbit shorter than eight hours — what’s up here?” Burdge says.

He soon hit upon an explanation: Nearly 30 years ago, researchers including MIT Professor Emeritus Saul Rappaport had predicted that ultrashort-orbit systems should exist as cataclysmic variables. As the white dwarf orbits the sun-like star and eats away at its light hydrogen, the sun-like star should burn out, leaving a core of helium — an element that is denser than hydrogen, and heavy enough to keep the dead star in a tight, ultrashort orbit.

Burdge realized that ZTF J1813+4251 was likely a cataclysmic variable transitioning from a hydrogen-rich to a helium-rich body. The discovery both confirms the predictions made by Rappaport and others, and stands as the shortest-orbit cataclysmic variable detected to date.

“This is a special system,” Burdge says. “We got doubly lucky to find a system that answers a big open question, and is one of the most beautifully behaved cataclysmic variables known.”

This research was supported, in part, by the European Research Council.

Refining current opacity models is key to unearthing details of exoplanet properties

NASA’s James Webb Space Telescope (JWST) is revealing the universe with spectacular, unprecedented clarity. The observatory’s ultrasharp infrared vision has cut through the cosmic dust to illuminate some of the earliest structures in the universe, along with previously obscured stellar nurseries and spinning galaxies lying hundreds of millions of light-years away.

Astronomers risk misinterpreting planetary signals in James Webb Space Telescope data if the models used to interpret the data don’t improve, an MIT study finds. In this conceptual image, the James Webb telescope captures light from around a newly discovered planet (left). When scientists analyze these data, limitations in opacity models could produce planetary predictions that are off by an order of magnitude (represented by three possible planets on the right). Credit: Image: Jose-Luis Olivares, MIT. James Webb icon courtesy of NASA.

In addition to seeing farther into the universe than ever before, JWST will capture the most comprehensive view of objects in our galaxy — namely, some of the 5,000 planets that have been discovered in the Milky Way. Astronomers are harnessing the telescope’s light-parsing precision to decode the atmospheres surrounding some of these nearby worlds. The properties of their atmospheres could give clues to how a planet formed and whether it harbors signs of life.

But a new MIT study suggests that the tools astronomers typically use to decode light-based signals may not be good enough to accurately interpret the new telescope’s data. Specifically, opacity models — the tools that model how light interacts with matter as a function of the matter’s properties — may need significant retuning to match the precision of JWST data, the researchers say.

If these models are not refined, the researchers predict that properties of planetary atmospheres, such as their temperature, pressure, and elemental composition, could be off by an order of magnitude.

“There is a scientifically significant difference between a compound like water being present at 5 percent versus 25 percent, which current models cannot differentiate,” says study co-leader Julien de Wit, assistant professor in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS).

“Currently, the model we use to decrypt spectral information is not up to par with the precision and quality of data we have from the James Webb telescope,” adds EAPS graduate student Prajwal Niraula. “We need to up our game and tackle together the opacity problem.”

De Wit, Niraula, and their colleagues have published their study today in Nature Astronomy. Co-authors include spectroscopy experts Iouli Gordon, Robert Hargreaves, Clara Sousa-Silva, and Roman Kochanov of the Harvard-Smithsonian Center for Astrophysics.

Leveling up

Opacity is a measure of how easily photons pass through a material. Photons of certain wavelengths can pass straight through a material, be absorbed, or be reflected out depending on whether and how they interact with certain molecules within a material. This interaction also depends on a material’s temperature and pressure.

An opacity model works based on various assumptions of how light interacts with matter. Astronomers use opacity models to derive certain properties of a material, given the spectrum of light that the material emits. In the context of exoplanets, an opacity model can decode the type and abundance of chemicals in a planet’s atmosphere, based on the light from the planet that a telescope captures.
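Opacity in this sense is often quantified with the Beer-Lambert law, which relates the transmitted fraction of light to an opacity, a density, and a path length. A minimal sketch, with purely illustrative numbers that are not drawn from the study's models:

```python
import numpy as np

def transmitted_fraction(kappa, rho, path_length):
    """Beer-Lambert attenuation for a uniform slab:
    I/I0 = exp(-kappa * rho * L), where kappa is the opacity
    (cross-section per unit mass), rho the density, and L the
    path length. Units and values here are illustrative only."""
    return np.exp(-kappa * rho * path_length)

# Doubling the opacity squares the transmitted fraction:
f1 = transmitted_fraction(kappa=0.5, rho=1.0, path_length=1.0)
f2 = transmitted_fraction(kappa=1.0, rho=1.0, path_length=1.0)
print(f1, f2, np.isclose(f1 ** 2, f2))  # exp(-0.5), exp(-1.0), True
```

Real opacity models are far richer — opacities vary strongly with wavelength, temperature, and pressure — but this exponential sensitivity is why small errors in assumed opacities can propagate into large errors in inferred abundances.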

De Wit says that the current state-of-the-art opacity model, which he likens to a classical language translation tool, has done a decent job of decoding spectral data taken by instruments such as those on the Hubble Space Telescope.

Light, perturbed

He and his colleagues make this point in their study, in which they put the most commonly used opacity model to the test. The team looked to see what atmospheric properties the model would derive if it were tweaked to assume certain limitations in our understanding of how light and matter interact. The researchers created eight such “perturbed” models. They then fed each model, including the real version, “synthetic spectra” — patterns of light that were simulated by the group and similar to the precision that the JWST would see.

They found that, based on the same light spectra, each perturbed model produced wide-ranging predictions for the properties of a planet’s atmosphere. Based on their analysis, the team concludes that, if existing opacity models are applied to light spectra taken by the Webb telescope, they will hit an “accuracy wall.” That is, the models won’t be sensitive enough to tell whether a planet has an atmospheric temperature of 300 kelvins or 600 kelvins, or whether a certain gas takes up 5 percent or 25 percent of an atmospheric layer.

“That difference matters for us to constrain planetary formation mechanisms and reliably identify biosignatures,” Niraula says.

The team also found that every perturbed model produced a “good fit” with the data: even though a perturbed model produced a chemical composition that the researchers knew to be incorrect, the light spectrum it generated from that composition was close enough to the original spectrum to count as a fit.

“We found that there are enough parameters to tweak, even with a wrong model, to still get a good fit, meaning you wouldn’t know that your model is wrong and what it’s telling you is wrong,” de Wit explains.
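This degeneracy can be illustrated with a toy retrieval, assuming a single invented absorption band (none of these numbers come from the study): a spectrum is generated with a "true" cross-section, then fitted with a model whose cross-sections are wrong by a factor of five. A compensating, fivefold-wrong abundance still reproduces the data essentially perfectly:

```python
import numpy as np

# Toy absorption spectrum: depth = 1 - exp(-abundance * sigma(w)),
# with an invented Gaussian absorption band sigma(w).
w = np.linspace(0.0, 1.0, 200)                       # wavelength grid
band = np.exp(-((w - 0.5) / 0.1) ** 2)               # absorption band shape

def model_spectrum(abundance, cross_section_scale):
    return 1.0 - np.exp(-abundance * cross_section_scale * band)

# "Observed" data, generated with the true abundance of 0.05.
data = model_spectrum(abundance=0.05, cross_section_scale=1.0)

# Fit with a "perturbed" model whose cross-sections are 5x too large,
# using a brute-force least-squares scan over candidate abundances.
abundances = np.linspace(0.001, 0.1, 2000)
errors = [np.sum((model_spectrum(a, 5.0) - data) ** 2) for a in abundances]
best = abundances[np.argmin(errors)]
print(best)  # ≈ 0.01: a fivefold-too-low abundance fits the data
```

Because abundance and cross-section enter only as a product here, the wrong model fits exactly — which is the trap the study describes: a good fit is no guarantee that the inferred composition is right.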

He and his colleagues raise some ideas for how to improve existing opacity models, including the need for more laboratory measurements and theoretical calculations to refine the models’ assumptions of how light and various molecules interact, as well as collaborations across disciplines, and in particular, between astronomy and spectroscopy.

“To reliably interpret spectra from the diverse exoplanetary atmospheres, we need an extensive campaign for new accurate measurements and calculations of relevant molecular spectroscopic parameters,” says study co-author Iouli Gordon, a physicist at the Harvard-Smithsonian Center for Astrophysics. “These parameters will need to be timely implemented into reference spectroscopic databases and consequently models used by astronomers."

“There is so much that could be done if we knew perfectly how light and matter interact,” Niraula adds. “We know that well enough around the Earth’s conditions, but as soon as we move to different types of atmospheres, things change, and that’s a lot of data, with increasing quality, that we risk misinterpreting.”