Blue Blob near Iceland could slow glacial melting

A region of cooling water in the North Atlantic Ocean near Iceland, nicknamed the "Blue Blob," has likely slowed the melting of the island's glaciers since 2011 and may continue to stymie ice loss until about 2050, according to new research.

The origin and cause of the Blue Blob, which is located south of Iceland and Greenland, are still being investigated. The cold patch was most prominent during the winter of 2014-2015, when the sea surface temperature was about 1.4 degrees Celsius (2.52 degrees Fahrenheit) colder than normal.

The new study uses climate models and field observations to show that the cold water patch chilled the air over Iceland sufficiently to slow ice loss starting in 2011. The model predicts cooler water will persist in the North Atlantic, sparing Iceland's glaciers until about 2050. Ocean and air temperatures are predicted to increase between 2050 and 2100, leading to accelerated melting.

While cooler water in the North Atlantic offers a temporary respite for Iceland's glaciers, the authors estimate that without steps to mitigate climate change, the glaciers could lose a third of their current ice volume by 2100 and be gone by 2300. If the country's 3,400 cubic kilometers (about 816 cubic miles) of ice were to melt, sea level would rise by 9 millimeters (0.35 inches).
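The article's sea-level figure can be sanity-checked with back-of-envelope arithmetic: convert the ice to an equivalent volume of water and spread it over the global ocean surface. The densities and ocean area below are standard round numbers assumed for illustration, not values from the study.

```python
# Back-of-envelope check of the article's ~9 mm sea-level figure.
ICE_VOLUME_KM3 = 3400     # Iceland's total ice volume (from the article)
ICE_DENSITY = 917         # kg/m^3, typical glacier ice
WATER_DENSITY = 1000      # kg/m^3
OCEAN_AREA_M2 = 3.61e14   # ~361 million km^2 of global ocean surface

# Equivalent volume of meltwater (1 km^3 = 1e9 m^3)
water_volume_m3 = ICE_VOLUME_KM3 * 1e9 * ICE_DENSITY / WATER_DENSITY

# Spread that water evenly over the ocean and convert meters to millimeters
rise_mm = water_volume_m3 / OCEAN_AREA_M2 * 1000
print(round(rise_mm, 1))   # roughly 8.6 mm, consistent with the article's ~9 mm
```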

"In the end, the message is still clear," said lead author Brice Noël, a climate modeler who specializes in polar ice sheets and glaciers at Utrecht University. "The Arctic is warming fast. If we wish to see glaciers in Iceland, then we have to curb the warming."

The paper is published in the AGU journal Geophysical Research Letters, which publishes high-impact, short-format reports with immediate implications spanning all Earth and space sciences. Its findings may help scientists to better understand the indirect effects of the ocean on glaciers.

"It's crucial to have an idea of the possible feedbacks in the Arctic because it's a region that is changing so fast," Noël said. "It's important to know what we can expect in a future warmer climate."

The warming Arctic

Nowhere on Earth has warmed as quickly as the Arctic. Recent studies report the area is warming four times faster than the global average. Iceland's glaciers steadily shrank from 1995 to 2010, losing an average of 11 gigatons of ice per year. Starting in 2011, however, the speed of Iceland’s melting slowed, resulting in about half as much ice loss, or about 5 gigatons annually. This trend was not seen in nearby, larger glaciers across Greenland and Svalbard.

Noël and his colleagues investigated the cause of this slowdown by estimating the glaciers' mass balance — how much they grew or melted annually from 1958 to 2019. They used a high-resolution regional climate model that works at the small scale of Iceland's glaciers to estimate how much snow the glaciers received in winter and how much ice was lost from meltwater runoff in summer. The researchers found that cooler waters near the Blue Blob are linked to observations of lower air temperatures over Iceland's glaciers and coincide with the slowdown of glacial melting since 2011.
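The mass-balance bookkeeping described above reduces to a simple subtraction each year: winter accumulation minus summer runoff. The sketch below uses hypothetical gigaton figures (chosen only to echo the article's ~11 Gt/yr and ~5 Gt/yr loss rates), not the model's actual output.

```python
# Illustrative sketch of annual glacier mass balance (hypothetical numbers).
# Mass balance = winter snow accumulation - summer meltwater runoff, in gigatons.
snowfall = {2009: 9.0, 2010: 8.5, 2011: 9.5}    # winter accumulation (Gt)
runoff = {2009: 20.0, 2010: 19.5, 2011: 14.5}   # summer melt runoff (Gt)

mass_balance = {yr: snowfall[yr] - runoff[yr] for yr in snowfall}
# Negative values mean net ice loss; the smaller 2011 deficit
# mirrors the slowdown the article describes.
print(mass_balance)
```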

Several researchers have proposed that the Blue Blob is part of the normal sea surface temperature variability in the Arctic. Notably, especially cold winters in 2014 and 2015 led to record cooling, which caused upwelling of cold, deep water, even as ocean temperatures around the region warmed due to climate change.

Before the Blue Blob, a long-term cooling trend in the same region, called the Atlantic Warming Hole, reduced sea surface temperatures by about 0.4 to 0.8 degrees Celsius (0.72 to 1.44 degrees Fahrenheit) during the last century and may continue to cool the region in the future. A possible explanation for the Warming Hole is that climate change has slowed the Atlantic Meridional Overturning Circulation, an ocean current that brings warm water up from the tropics to the Arctic, thus reducing the amount of heat delivered to the region.

The end of Iceland's glaciers?

Noël projected the future climate of Iceland by combining the same regional climate model with a global climate model to predict how North Atlantic ocean temperatures would affect the glaciers' fate until 2100, under a scenario of rapid warming. The models predicted that the North Atlantic near Iceland will stay cool, slowing — and perhaps even temporarily stopping — ice loss from the glaciers by the mid-2050s.

The authors verified that the models accurately reconstructed the mass of the glaciers using almost 1,200 measurements of snow depth collected between 1991 and 2019 by colleagues at the University of Iceland and satellite measurements of the elevation and extent of glaciers taken from 2002 to 2019 by co-authors at the Delft University of Technology.

"I think their analysis is very thorough," said Fiamma Straneo, a physical oceanographer at the Scripps Institution of Oceanography who was not involved in the study. "They have a really state-of-the-art regional atmospheric model for looking at the variability of glaciers." Straneo thinks this approach could be used to understand changes in other land-terminating glaciers, such as those in the Himalayas and Patagonia. "There is very active research in land-terminating glaciers because they are one of the largest contributors to sea level rise right now."

UCI scientists discover how galaxies can exist without dark matter

In supercomputer simulations, collisions cause smaller star groupings to lose material

In a new study, an international team led by astrophysicists from the University of California, Irvine and Pomona College reports how, when tiny galaxies collide with bigger ones, the bigger galaxies can strip the smaller galaxies of their dark matter. Dark matter is matter that we can't see directly, but which astrophysicists think must exist because, without its gravitational effects, they couldn't explain things like the motions of a galaxy's stars.

Image: Dark matter distribution in a simulated galaxy group, with brighter areas showing higher concentrations of dark matter. Circles show close-up images of the stellar light associated with two galaxies lacking dark matter. If these galaxies had dark matter, they would appear as bright regions in the main image. Credit: Moreno et al.

It’s a mechanism that has the potential to explain how galaxies might be able to exist without dark matter – something once thought impossible.

It started in 2018 when astrophysicists Shany Danieli and Pieter van Dokkum of Princeton University and Yale University observed two galaxies that seemed to exist without most of their dark matter.

“We were expecting large fractions of dark matter,” said Danieli, who’s a co-author on the latest study. “It was quite surprising, and a lot of luck, honestly.”

The lucky find, which van Dokkum and Danieli reported in papers in 2018 and 2020, threw the galaxies-need-dark-matter paradigm into turmoil, potentially upending what astrophysicists had come to see as a standard model for how galaxies work.

“It’s been established for the last 40 years that galaxies have dark matter,” said Jorge Moreno, an astronomy professor at Pomona College, who’s the lead author of the new paper. “In particular, low-mass galaxies tend to have significantly higher dark matter fractions, making Danieli’s finding quite surprising. For many of us, this meant that our current understanding of how dark matter helps galaxies grow needed an urgent revision.”    

The team ran computer models that simulated the evolution of a chunk of the universe – one about 60 million light-years across – starting soon after the Big Bang and running to the present.

The team found seven galaxies devoid of dark matter. After several collisions with neighboring galaxies 1,000 times more massive, they had been stripped of most of their material, leaving behind nothing but stars and some residual dark matter.

“It was pure serendipity,” said Moreno. “The moment I made the first images, I shared them immediately with Danieli, and invited her to collaborate.”

Robert Feldmann, a professor at the University of Zurich who designed the new simulation, said that “this theoretical work shows that dark matter-deficient galaxies should be very common, especially in the vicinity of massive galaxies.”

UCI’s James Bullock, an astrophysicist who’s a world-renowned expert on low-mass galaxies, described how he and the team didn’t build their model just so they could create galaxies without dark matter – something he said makes the model stronger because it wasn’t designed in any way to create the collisions that they eventually found. “We don’t presuppose the interactions,” said Bullock.

Confirming that galaxies lacking dark matter can be explained in a universe where there's lots of dark matter came as a relief for researchers like Bullock, whose career and everything he's discovered hinge on dark matter being the thing that makes galaxies behave the way they do.

“The observation that there are dark matter-free galaxies has been a little bit worrying to me,” said Bullock. “We have a successful model, developed over decades of hard work, where most of the matter in the cosmos is dark. There is always the possibility that nature has been fooling us.”

But, Moreno said, “you don’t have to get rid of the standard dark matter paradigm.”

Now that astrophysicists know how a galaxy might lose its dark matter, Moreno and his collaborators hope the findings inspire observers to search the night sky for real-world massive galaxies that might be in the process of stripping dark matter away from smaller ones.

“It still doesn’t mean this model is right,” Bullock said. “A real test will be to see if these things exist with the frequency and general characteristics that match our predictions.”

As part of this new work, Moreno, who has indigenous roots, received permission from Cherokee leaders to name the seven dark matter-free galaxies found in their simulations in honor of the seven Cherokee clans: Bird, Blue, Deer, Long Hair, Paint, Wild Potato and Wolf.

“I feel a personal connection to these galaxies,” said Moreno, who added that, just as the more massive galaxies robbed the smaller galaxies of their dark matter, “many people of indigenous ancestry were stripped of our culture. But our core remains, and we are still thriving.”

University of Hong Kong physicists make a great stride in the quest for quantum materials through better measurement of quantum entanglement

A research team from the Department of Physics at the University of Hong Kong (HKU) has developed a new algorithm to measure entanglement entropy, advancing the exploration of more comprehensive laws in quantum mechanics and moving a step closer to practical applications of quantum materials.

Image: Mr. Jiarui ZHAO, a PhD student in the Department of Physics at HKU, came up with the new algorithm for computing quantum entanglement during a subway ride.

Quantum materials play a vital role in propelling human advancement, and the search for novel quantum materials with exceptional properties has become a pressing task for the science and technology community.

2D moiré materials such as twisted bilayer graphene play a far-reaching role in research on novel quantum states such as superconductivity, in which electric current flows without resistance. They also figure in the development of quantum computers that could vastly outperform the best supercomputers in existence.

But materials only reach a "quantum state," in which thermal effects can no longer hinder the quantum fluctuations that trigger transitions between different quantum states, or quantum phases, at extremely low temperatures (near absolute zero, -273.15°C) or under exceptionally high pressure. Experiments testing when and how atoms and subatomic particles of different substances "communicate and interact with each other freely through entanglement" in a quantum state are therefore prohibitively costly and difficult to execute.

The study is further complicated by the failure of the classical LGW (Landau-Ginzburg-Wilson) framework to describe certain quantum phase transitions, dubbed deconfined quantum critical points (DQCP). The question then arises whether realistic lattice models hosting a DQCP can be found to resolve the inconsistencies between DQCP and conventional quantum critical points (QCP). Dedicated exploration of the topic has produced copious numerical and theoretical works with conflicting results, and a solution remains elusive.

Mr. Jiarui ZHAO, Dr. Zheng YAN, and Dr. Zi Yang MENG from the Department of Physics, HKU, took a momentous step towards resolving the issue through the study of quantum entanglement, which marks the fundamental difference between quantum and classical physics.

The research team developed a new, more efficient quantum Monte Carlo algorithm for measuring the Rényi entanglement entropy of a system. With this new tool, they measured the Rényi entanglement entropy at the DQCP and found that the scaling behavior of the entropy, i.e. how the entropy changes with system size, is in sharp contrast with that of conventional LGW-type phase transitions.

"Our findings helped confirm a revolutionized understanding of phase transition theory by denying the possibility of a single theory describing DQCP. The questions raised by our work will contribute to further breakthroughs in the search for a comprehensive understanding of this uncharted territory," said Dr. Zheng Yan.

"The finding has changed our understanding of the traditional phase transition theory and raises many intriguing questions about deconfined quantum criticality. This new tool developed by us will hopefully help the process of unlocking the enigma of quantum phase transitions that have perplexed the scientific community for two decades," said Mr. Zhao Jiarui, the first author of the journal paper and a Ph.D. student who came up with the final fixes of the algorithm.

"This discovery will lead to a more general characterization of the critical behavior of novel quantum materials, and is a move closer towards the actualization of application of quantum materials, which play a vital role in propelling human advancement," Dr. Meng Zi Yang remarked.

The models
To test the efficiency and power of the algorithm, and to demonstrate the distinct difference between the entanglement entropy at a normal QCP and at a DQCP, the research team chose two representative models: the J1-J2 model, hosting a normal O(3) QCP, and the J-Q3 model, hosting a DQCP.

Nonequilibrium increment algorithm
Building on previous methods, the research team created a highly parallelized increment algorithm. The main idea is to divide the whole simulation task into many smaller tasks and use many CPUs to execute those smaller tasks in parallel, greatly decreasing the simulation time. This improved method helped the team simulate the two models mentioned above with high efficiency and better data quality.
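The divide-and-recombine idea can be sketched with a toy stand-in: below, a definite integral of 4/(1+x²) over [0, 1] (which equals π) plays the role of the expensive entanglement-entropy estimate, split into independent increments that run in parallel. This is only an illustration of the parallelization scheme, not the team's quantum Monte Carlo code, which farms the pieces out to many CPUs.

```python
# Illustrative sketch only: split one big computation into many independent
# increments, run them in parallel, then recombine the partial results.
from concurrent.futures import ThreadPoolExecutor
import math

N_CHUNKS = 8          # number of smaller tasks
STEPS = 10_000        # midpoint-rule steps per task

def partial_integral(chunk):
    """One increment: integrate 4/(1+x^2) over the chunk-th slice of [0, 1]."""
    h = 1.0 / (N_CHUNKS * STEPS)
    start = chunk / N_CHUNKS
    return sum(4.0 / (1.0 + (start + (i + 0.5) * h) ** 2) * h
               for i in range(STEPS))

with ThreadPoolExecutor() as pool:     # execute the increments concurrently
    pieces = pool.map(partial_integral, range(N_CHUNKS))

pi_estimate = sum(pieces)              # recombine the increments
print(pi_estimate)                     # close to math.pi
```

The same structure applies whether the workers are threads, processes, or cluster nodes; only the amount of hardware changes.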

Findings
With the nonequilibrium increment method, the research team successfully obtained the second Rényi entanglement entropy SA(2) at the QCP and DQCP of the two models for different system sizes. After deducting the leading term (the area-law contribution from the entanglement boundary), the sign of the sub-leading term clearly distinguishes the QCP (negative, in the J1-J2 model) from the DQCP (positive, in the J-Q3 model). This finding rules out descriptions of DQCP based on a unitary assumption and raises several intriguing questions about the theory of deconfined quantum criticality. The discovery is likely to lead to a more general characterization of the critical behavior of novel quantum materials.
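The size dependence described above can be summarized schematically. The coefficients below are illustrative symbols, not values from the paper:

```latex
S_A^{(2)}(L) = a\,L + b\,\ln L + c
```

Here the leading term \(a\,L\) is the area-law contribution from an entanglement boundary of length \(L\), and the sign of the sub-leading logarithmic coefficient \(b\) is what distinguishes the two cases: negative at the conventional QCP (J1-J2 model), positive at the DQCP (J-Q3 model).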

UCL Prof Fisher develops cancer supercomputer models to identify new drug combos to treat Covid-19

By adapting supercomputer models originally developed to understand the biology of cancer cells, UCL researchers have identified new drug combinations with the potential to treat severe cases of Covid-19 infection at different stages of the disease.

The findings could help lower the number of Covid-19 related deaths and reduce the strain on healthcare systems.

The study tested the potential impact of interfering with different aspects of SARS-CoV-2 infection and the body’s responses to the virus. Results have identified existing therapeutics that might be suitable for treating Covid-19 patients.

Although vaccines and treatments for Covid-19 now exist, additional effective and affordable treatments are still urgently required. Cases of SARS-CoV-2 infection are still highly likely to occur, particularly when new variants arise.

Tackling virus replication and immune response

Therapeutic development for Covid-19 is complicated by the need to consider different stages of the disease. Early symptoms are typically triggered by viral replication, while later and more severe disease is caused by the over-reaction of the body’s immune defenses.

Different stages of the disease are therefore likely to need different treatments, and getting the timing wrong could have grave consequences: boosting immune responses to prevent viral replication could be highly damaging if those responses are already being ramped up.

The interplay between the virus, the cell it infects, and host immune responses involves a highly complex web of interactions. Interfering with these interactions using therapeutics could therefore have effects throughout this web, which might help to clear the virus but could also disrupt important cellular processes and cause harmful side effects.

Model solutions

Similar issues have been faced by cancer researchers. To address this challenge, UCL researcher Professor Jasmin Fisher has developed supercomputer models of cancer cell biology, which simulate the biochemical and metabolic pathways of cells and how they are subverted by cancer-causing mutations that drive uncontrolled growth of cells.

Using these models, the Fisher Lab can explore what might happen if particular pathways or cellular processes are inhibited, individually or in combination, so that the best targets for intervention can be identified and possibly harmful side effects anticipated.

"We realized we could model SARS-CoV-2 infection using a computational framework originally developed in my lab to predict personalized treatment combinations for cancer patients and use it to predict effective repurposed drug combinations for treating Covid-19," says Professor Fisher (UCL Cancer Institute), the study's lead author.

“We collated information available at the time on SARS-CoV-2 infection of airway cells and the immune response to infection, to create a dynamic model of viral infection and Covid-19 disease processes. Our study focused on two key stages – viral replication following initial infection (before severe symptoms emerge) and late-stage immune-driven disease, which is typically more severe,” says Professor Fisher.

The research team identified a range of therapeutic drugs, already licensed or in late development, that target processes thought to be important at these two stages. They then used their computer model to explore what might happen in cells when these processes were inhibited, mimicking the action of therapeutics. The model provided insights into impacts on virus replication and host responses, and the likely net effect on both treatment of disease and safety.
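The screening loop described above can be sketched as a toy: enumerate pairs of targets, simulate the inhibited network, and rank the outcomes. Everything below — the target names, their effect sizes, and the multiplicative model — is invented for illustration and bears no relation to the Fisher Lab's actual model.

```python
# Toy in-silico combination screen (hypothetical targets and effects only).
from itertools import combinations

# Hypothetical target -> (fractional cut to viral replication, to inflammation)
TARGETS = {
    "entry_protease": (0.5, 0.0),    # blocking viral entry halves replication
    "endosome_traffic": (0.4, 0.0),  # blocking endosomal trafficking
    "cytokine_signal": (0.0, 0.6),   # blocking signaling damps inflammation
}

def simulate(inhibited):
    """Apply each inhibition multiplicatively to two toy disease readouts."""
    replication, inflammation = 1.0, 1.0
    for target in inhibited:
        cut_repl, cut_infl = TARGETS[target]
        replication *= (1.0 - cut_repl)
        inflammation *= (1.0 - cut_infl)
    return replication, inflammation

# Screen every pair of targets and pick the strongest antiviral combination
results = {pair: simulate(pair) for pair in combinations(TARGETS, 2)}
best_antiviral = min(results, key=lambda pair: results[pair][0])
print(best_antiviral, results[best_antiviral])
```

A real screen would replace the toy arithmetic with the executable disease model and add safety constraints, but the enumerate-simulate-rank structure is the same.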

Importantly, Professor Fisher and colleagues were able to validate their model by showing that the predicted effects of therapeutics already being used to treat Covid-19 at different stages of disease – such as antivirals and anti-inflammatory drugs – matched those seen in clinical studies.

In silico screening

Using this approach, the team examined 9,870 pairs of compounds acting on 140 potential cellular targets. They were able to identify new combinations of therapeutics that would be predicted to be beneficial at either early or late stages of the disease, as well as the ‘windows’ when they might be safely deployed. For example, the combination of two drugs, Camostat and Apilimod, was predicted to have a particularly big impact on virus replication. This strong antiviral effect was confirmed using live SARS-CoV-2 cell culture assays by Dr. Ann-Kathrin Reuschl and Professor Clare Jolly in the Division of Infection and Immunity at UCL.

The study also identified cellular responses, such as the levels of certain cytokines, that correlated with mild, early-severe, and late-severe disease. These could have an important role as biomarkers to guide the use of therapeutics at the most appropriate time. “We not only need to know what drugs might work against Covid-19 but also when they might work. Our model is both a highly efficient way of prioritizing drugs for evaluation as treatments for Covid-19 and could also help ensure that Covid-19 patients get the right drug at the right time,” says Professor Fisher.

Vanderbilt astronomers discover exceedingly rare star

A team of astronomers has made the discovery of a lifetime that will help answer burning questions about the evolution of stars. The group is led by Evolutionary Studies Initiative member and Stevenson Professor of Physics and Astronomy Keivan Stassun.

Image: The Stassun lab, 2018.

In 2017, Stassun's team generated a new model that greatly improved the way stars are measured.

“Being able to combine all of the different types of measurements into one coherent analysis was certainly key to being able to decipher the various unusual characteristics of this star system,” Stassun said.

The model helps predict the types of planets orbiting distant stars, called exoplanets. It has been used to identify the characteristics of more than 100 stars found by the TESS space telescope, along with thousands of others. But nothing prepared the team for what this new binary star system – two stars orbiting each other – could tell them about our universe.

According to Stassun, “This type of star is so extremely unusual that, frankly, we would not have thought to go looking for it – nobody has seen one before!”

Stassun explained that several key ingredients make this binary star system incredibly rare. Binary star systems are not uncommon in the cosmos, but one uncommon trait of this system is its orientation: viewed from Earth, the stars eclipse each other. This allows researchers to more easily calculate important qualities of the two stars, like their mass and luminosity.

Also, stars can change size and luminosity in a process known as pulsation, and studies of these pulsations allow astronomers to probe the inner workings of stars, akin to Earth scientists using earthquake vibrations to study Earth's internal structure. Two rare types of stellar pulsation exist, each of which provides a different, complementary view of stellar interiors. One of the stars in the binary system Stassun's team found exhibits a hybrid of both.

“Stars exhibiting either of those pulsating behaviors are quite rare; a star exhibiting hybrid pulsating behavior is even more so,” Stassun said.

Next, this unique star has a strong magnetic field, which is decidedly uncommon for a hybrid pulsating star, and which could be a key missing ingredient in current theories for understanding the earliest stages of stellar evolution.

Finally, according to Stassun, “this is the first time that one of these rare magnetic hybrid pulsating stars has been found that is part of a star cluster and that is moreover a part of an eclipsing binary system. It seems quite unlikely that TESS will discover another star that has all of these attributes together.”

Graduate student Dax Feliz also played a major role in this project. He joined the lab as a fellow through the Fisk-Vanderbilt Masters-to-PhD Bridge Program.

According to Feliz, “the discovery of this rare eclipsing binary star system provides a fantastic testbed for understanding how stellar binaries evolve over time. As the TESS mission continues observing large patches of sky, star systems like HD 149834 which are located in star clusters can help us further our understanding of stellar evolution.”

The team received plenty of help from the Frist Center for Autism and Innovation. The center, founded by Stassun in 2018, works to understand and promote neurodiverse talents.

When asked about the center’s contribution, Stassun said, “we have students and interns who have expertise with data visualization, and that process is becoming increasingly important for detecting rare patterns in data, such as extreme – and extremely interesting – ‘outliers’ such as the system we discovered in this study.”