UK builds a new sea ice fragmentation module to help improve climate model predictions

ARCTIC sea ice is an important indicator of climate change, and its rapid decline in recent decades has been a wake-up call to scientists, policy-makers, and the general public.

Image: Arctic sea ice in summer.

Now, an innovative project featuring Dr. Byongjun (Phil) Hwang from the University of Huddersfield’s School of Applied Sciences will determine the role of sea ice fragmentation in the accelerated retreat of the Arctic ice-cap by combining new and emerging observations, new theory, and process modeling.

The research is being funded by the Natural Environment Research Council (NERC) as part of a responsive project award titled ‘Fragmentation and Melt of Arctic Sea-Ice’.

Climate model accuracy 

The latest assessment from the Intergovernmental Panel on Climate Change (IPCC) concluded that it was likely the Arctic would become reliably ice-free by 2050, assuming greenhouse gas emissions continue to increase. However, the climate simulations used by the IPCC often fail to realistically capture large-scale properties of Arctic sea ice, such as its extent, variability, and recent trends, which impairs the accuracy of climate models.

“This is why it is imperative we improve simulations of Arctic sea ice so we can provide a better understanding of the recent observed changes and deliver credible projections of the future,” said Dr. Hwang, who is Director of the University's Centre for Climate Resilient Societies.

“By building a fundamental understanding of sea ice fragmentation we will improve climate model predictions. This will help assess risks and opportunities as well as inform important policy decisions about adaptation and mitigation.”

The three-year project, which is being led by Professor Danny Feltham at the University of Reading, will result in a new sea ice fragmentation module delivered to climate and weather modeling groups including the Met Office, the National Oceanography Centre, the British Antarctic Survey, and the European Centre for Medium-Range Weather Forecasts.

As a geophysicist and remote sensing expert, Korean-born Dr. Hwang has developed a specialism in the dynamics and thermodynamics of snow and sea ice in polar regions.  He has undertaken a large number of expeditions to the Arctic, including some tough mid-winter assignments.

A seasoned Arctic researcher, Dr. Hwang has already made 15 voyages to the region, observing, recording, and analyzing seasonal changes in the ice. The data he has gathered on Arctic sea ice retreat has been an important contribution to the scientific debate about climate change.

Japanese-built network models may help us understand the spread of new variants in a pandemic

New simulation shows how infectivity of new variants affects spread

Researchers from Tokyo Metropolitan University have performed numerical simulations based on network theory which show how the number of infections in a pandemic changes when a new variant emerges. They found a non-linear dependence of infection numbers on how infectious the new variant is compared to the existing one, an effect not seen in previous work. Their model may be applied to understand real pandemics such as COVID-19 and to inform control measures.

Image: Simulation on a network of the numbers of susceptible (S), infected (I), and recovered (R) individuals from a pandemic and its variant (I’, R’) over time. At t=21, a variant was added.

Ever since it began to spread in late 2019, COVID-19 has had a devastating impact on people’s lives. With wave after wave of new variants continuing to wreak havoc around the world, scientists have been looking for ways to understand how the disease spreads. In particular, there is the issue of how new variants appear, spread, and end up displacing the existing strain. Understanding the dynamics of variants in a population is vital to controlling their spread.

A classic framework for modeling pandemic dynamics is the “compartmental” SIR model, which tracks the numbers of susceptible (S), infected (I), and recovered (R) members of a population. The numbers are related by differential equations that can be solved to give many of the salient features of how a disease spreads: the pandemic grows rapidly before slowing down as the number of susceptible individuals decreases and more patients recover. However, the model cannot account for the varied nature of the population: a given infected individual does not have an equal probability of infecting all others, and the number of contacts that people have can vary greatly from one person to another. Any model that tries to capture pandemic dynamics and get to grips with where and how it spreads needs a more sophisticated approach.
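For readers who want the mechanics, the SIR equations can be integrated numerically in a few lines. The sketch below is a minimal illustration, not the researchers' code; the population size and the infection and recovery rates are illustrative assumptions.

```python
# Minimal SIR sketch: dS/dt = -beta*S*I/N, dI/dt = beta*S*I/N - gamma*I, dR/dt = gamma*I
import numpy as np
from scipy.integrate import solve_ivp

N = 10_000               # population size (illustrative assumption)
beta, gamma = 0.3, 0.1   # infection and recovery rates (illustrative assumptions)

def sir(t, y):
    S, I, R = y
    new_infections = beta * S * I / N
    return [-new_infections, new_infections - gamma * I, gamma * I]

# Start with one infected individual and integrate for 200 days.
sol = solve_ivp(sir, (0, 200), [N - 1.0, 1.0, 0.0], dense_output=True)
t = np.linspace(0, 200, 201)
S, I, R = sol.sol(t)
print(f"Peak infections: {I.max():.0f} on day {t[I.argmax()]:.0f}")
```

The solution shows exactly the behavior described above: infections rise rapidly, peak, and then fall as the susceptible pool is depleted.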

That’s why Emeritus Professor Yutaka Okabe and Professor Akira Shudo from Tokyo Metropolitan University have turned to network theory, a mathematical framework that can capture how different members of a population connect to others. Using different types of networks, they were able to create a more realistic model for how an infectious disease might spread. Key features included dynamic absorbing states, states in which the network can get stuck over time, e.g. a state with no infected people. With a few infections and low infectivity, the network would collapse back to the infection-free state. Contrary to conventional models, the number of individuals who experienced infection does not scale linearly with how much more infectious a variant is compared to the existing strain.

The team performed a numerical simulation of the microscopic model on the network; in the middle of a simulation of infectious disease, they added a variant that is more transmissible than the original strain.  Looking at the numbers, the team found that a variant with the same infectivity as the existing strain fails to take off at all. This is a direct result of the non-linear nature of the simulation, as the network collapses back to an absorbing state with no infections. As the infectivity of the new variant is ramped up, the population becomes more likely to become infected with the variant as opposed to the existing strain, increasing the rate for the new strain at the expense of the old one. The non-linear nature of how the infection numbers increase with the variant infectivity is a product of the microscopic nature of the network model, giving a more detailed, nuanced picture than before.
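A toy version of such a network simulation is sketched below, assuming a scale-free contact network and simple per-contact infection probabilities. The network type, the rates, and the seeding choices are illustrative assumptions rather than the authors' actual model, though the variant is introduced partway through the run, as in the figure above (t = 21).

```python
# Toy two-strain SIR on a contact network: each day, infected nodes infect
# susceptible neighbours with a per-contact probability, then may recover.
import random
import networkx as nx

random.seed(0)
G = nx.barabasi_albert_graph(5000, 3)  # scale-free network (illustrative assumption)
p1, p2 = 0.05, 0.10                    # per-contact infectivity: strain 1 vs. variant
p_recover = 0.2                        # daily recovery probability
state = {n: "S" for n in G}
for n in random.sample(list(G), 5):    # seed the original strain
    state[n] = "I1"

for day in range(120):
    if day == 21:                      # introduce the more infectious variant
        for n in random.sample([m for m in G if state[m] == "S"], 5):
            state[n] = "I2"
    updates = {}
    for n in G:
        if state[n] in ("I1", "I2"):
            p = p1 if state[n] == "I1" else p2
            for m in G.neighbors(n):
                if state[m] == "S" and random.random() < p:
                    updates[m] = state[n]          # infected with the same strain
            if random.random() < p_recover:
                updates[n] = "R1" if state[n] == "I1" else "R2"
    state.update(updates)

print({s: sum(v == s for v in state.values()) for s in ("S", "I1", "I2", "R1", "R2")})
```

Sweeping the variant's infectivity upward and recording how many individuals end up in the variant's recovered compartment can illustrate the kind of non-linear dependence the team reports: at low relative infectivity the variant tends to die out into the absorbing state, while above a threshold it rapidly displaces the original strain.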

The team hopes that their model may be utilized to form effective strategies to contain infectious diseases, looking at points of significant connectivity in the network and understanding how their isolation affects overall infections. As the COVID-19 pandemic continues to rage, fundamental studies of how diseases spread are a vital piece in informed decision-making aimed at bringing normal life back to society.

A milestone for MilkyWay@home - Measuring dark matter in a tidal stream of stars

Prof. Heidi Jo Newberg explains how the MilkyWay@home volunteer supercomputer was used to determine the shape and dark matter content of the ultrafaint dwarf galaxy that fell into the Milky Way three billion years ago and was ripped apart to form the Orphan-Chenab Stream (OCS). The measured dark matter mass is ten times less than that of dwarf galaxies observed today.

Ancient dwarf galaxy reconstructed with MilkyWay@home supercomputing

Astrophysicists for the first time have calculated the original mass and size of a dwarf galaxy that was shredded in a collision with the Milky Way billions of years ago. Reconstructing the original dwarf galaxy, whose stars today thread through the Milky Way in a stellar “tidal stream,” will help scientists understand how galaxies like the Milky Way formed, and could aid in the search for dark matter in our galaxy.

“We’ve been running simulations that take this big stream of stars, back it up for a couple of billion years, and see what it looked like before it fell into the Milky Way,” said Heidi Newberg, a professor of physics, astrophysics, and astronomy at Rensselaer Polytechnic Institute. “Now we have a measurement from data, and it’s the first big step toward using the information to find dark matter in the Milky Way.”

Billions of years ago, the dwarf galaxy and others like it near the Milky Way were pulled into the larger galaxy. As each dwarf galaxy coalesced with the Milky Way, its stars were pulled by “tidal forces,” the same kind of differential forces that make tides on Earth. The tidal forces distorted and eventually ripped the dwarf galaxy apart, stretching its stars into a tidal stream flung across the Milky Way. Such tidal mergers are fairly common, and Newberg estimates that “immigrant” stars absorbed into the Milky Way make up most of the stars in the galactic halo, a roughly spherical cloud of stars that surrounds the spiral arms of the central disk.

Critically, the positions and velocities of the tidal stream stars carry information about the Milky Way’s gravitational field.

Reconstructing the dwarf galaxy is a research task that combines data from star surveys, physics, and Newberg’s MilkyWay@Home distributed supercomputer, which harnesses 1.5 petaflops (a measure of computer processing speed) of home computer power donated by volunteers. This large amount of processing power makes it possible to simulate the destruction of a large number of dwarf galaxies with different shapes and sizes and to identify the model that best matches the tidal stream of stars we see today.

“It’s an enormous problem, and we solve it by running tens of thousands of different simulations until we get one that actually matches. And that takes a lot of computer power, which we get with the help of volunteers all over the world who are part of MilkyWay@Home,” Newberg said. “We’re brute-forcing it, but given how complicated the problem is, I think this method has a lot of merit.”
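In spirit, the procedure is a massive search over progenitor parameters: simulate the disruption of a candidate dwarf galaxy, compare the resulting stream to the observed one, and keep the best match. The sketch below shows that pattern on a toy scale; the two-parameter progenitor, the `simulate_stream` stand-in, and the histogram-based mismatch score are all hypothetical simplifications, not MilkyWay@Home's actual pipeline.

```python
# Toy brute-force fit of progenitor parameters to an observed stream.
import itertools
import numpy as np

rng = np.random.default_rng(42)
observed = rng.normal(0.0, 1.0, 500)  # stand-in for observed stream star positions

def simulate_stream(mass, radius):
    """Hypothetical stand-in for an N-body disruption run: returns simulated
    star positions along the stream for a progenitor of given mass and radius."""
    return rng.normal(np.log10(mass) - 7.0, radius, 500)

def mismatch(sim, obs, bins=np.linspace(-5, 5, 41)):
    """Squared difference between histograms of simulated and observed stars."""
    h_sim, _ = np.histogram(sim, bins, density=True)
    h_obs, _ = np.histogram(obs, bins, density=True)
    return float(np.sum((h_sim - h_obs) ** 2))

# Grid over progenitor mass (solar masses) and scale radius (arbitrary units).
masses = np.logspace(6, 8, 20)
radii = np.linspace(0.5, 2.0, 20)
best = min(itertools.product(masses, radii),
           key=lambda p: mismatch(simulate_stream(*p), observed))
print(f"Best-fit mass ~{best[0]:.2e} Msun, radius ~{best[1]:.2f}")
```

The real search runs full disruption simulations instead of a one-line stand-in, which is why tens of thousands of volunteered computers are needed.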

As published today in The Astrophysical Journal, Newberg’s team estimates the total mass of the original galaxy whose stars today form the Orphan-Chenab Stream as 2×10⁷ times the mass of our sun.

However, only a little more than 1% of that mass is estimated to be made up of ordinary matter like stars. The remainder is assumed to be a hypothetical substance called dark matter, which exerts a gravitational force but cannot be seen because it does not absorb or give off light. The existence of dark matter would explain a discrepancy between the gravitational pull of the mass of the matter we can see and the far larger pull needed to account for the formation and movement of galaxies. Dark matter is estimated to make up as much as 85% of the matter in the universe, and tidal streams of stars that fell in with dwarf galaxies could be used to determine where it is located in our galaxy.

“Tidal stream stars are the only stars in our galaxy for which it is possible to know their positions in the past,” Dr. Newberg said. “By looking at the current speeds of stars along a tidal stream, and knowing they all used to be in about the same place and moving at the same speed, we can figure out how much the gravity changes along that stream. And that will tell us where the dark matter is in the Milky Way.”
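The reasoning in that quote can be made concrete with energy conservation: if two stream stars share a common origin and energy is conserved along their orbits, the difference in gravitational potential between two points on the stream follows from the difference in the stars' speeds. A back-of-envelope sketch, with made-up speeds:

```python
# If 0.5*v1**2 + phi1 == 0.5*v2**2 + phi2 for stars from a common origin,
# then phi2 - phi1 = 0.5*(v1**2 - v2**2).
v1, v2 = 250.0, 230.0                # km/s, illustrative speeds at two stream points
delta_phi = 0.5 * (v1**2 - v2**2)    # potential difference in (km/s)^2
print(f"Potential difference along the stream: {delta_phi:.0f} (km/s)^2")
```

Mapping such potential differences along the full stream is what would reveal where the dark matter sits.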

The research also finds that the progenitor of the Orphan-Chenab Stream has less mass than the dwarf galaxies observed in the outskirts of our galaxy today. If this small mass is confirmed, it could change our understanding of how small stellar systems form and then merge to make larger galaxies like our Milky Way.

Dr. Newberg, an expert in the galactic halo, is a pioneer in identifying stellar tidal streams in the Milky Way. One day, she hopes that MilkyWay@home will help her measure more than the properties of one disintegrated dwarf galaxy. Ideally, she would like to simultaneously fit many dwarf galaxies, their orbits, and the properties of the Milky Way galaxy itself. This goal is complicated by the fact that the properties of our galaxy change over the billions of years that it takes for a small galaxy to fall in and be ripped apart to make these tidal streams.

“By painstakingly tracking the path of stars pulled into the Milky Way, Dr. Newberg and her team are building an image that shows us not just a dwarf galaxy long-since destroyed, but also sheds light on the formation of our galaxy and the very nature of matter,” said Curt Breneman, dean of the Rensselaer School of Science.

At Rensselaer, Newberg was joined in the research by Eric J. Mendelsohn, Siddhartha Shelton, and Jeffery M. Thompson. Carl J. Grillmair at the California Institute of Technology and Lawrence M. Widrow at Queen’s University also contributed to the finding. “Estimate of the Mass and Radial Profile of the Orphan-Chenab Stream’s Dwarf Galaxy Progenitor Using MilkyWay@home” was published with support from the National Science Foundation, and with data from the Sloan Digital Sky Survey, the Dark Energy Camera at the Cerro Tololo Inter-American Observatory, and the National Aeronautics and Space Administration/Infrared Processing & Analysis Center Infrared Science Archive.

New computational tool can guide sustainable dam siting to protect ecosystem services

Rapid hydroelectric dam expansion in the Amazon poses a serious threat to Earth’s largest and most biodiverse river basin. There are 158 dams in the Amazon River basin, with another 351 proposed; these projects are typically assessed individually, with little coordinated planning. A new study, published in Science, provides the first computational approach for evaluating basin-level tradeoffs between hydropower and ecosystem services, to guide sustainable dam siting.

Image: The location, size, and design of a hydropower dam determine its effects on the environment and the ecosystem services that people rely on. Here, the recently constructed Belo Monte megadam in the Amazon lowlands of Brazil. Credit: Bruno Batista/VPR

Coauthor Stephen Hamilton, an ecosystem ecologist at Cary Institute of Ecosystem Studies explains, “Continued hydropower development in the Amazon is inevitable. So how can that proceed in a way that optimizes energy output at the lowest environmental cost? The answer comes in selecting projects strategically, taking into account multiple environmental criteria that have thus far been too difficult to account for simultaneously in planning large numbers of potential projects.”

Hamilton was part of an interdisciplinary team of environmental and computational experts who developed ‘Amazon EcoVistas’, a novel framework to analyze proposed dam projects collectively – both for their energy generation, as well as their impacts on the environment. They analyzed five environmental criteria: river flow, river connectivity, sediment transport, fish biodiversity, and greenhouse gas emissions. Their tool uses artificial intelligence and high-performance computing to identify hydroelectric dam portfolios that meet energy production goals with the least environmental harm. 

“Our tool allows us to evaluate hydroelectric projects for their collective impacts to nature and people on the scale of the entire watershed – a rare, yet critical approach since the Amazon River and its tributaries flow through multiple countries with diverse topography,” explains coauthor Rafael Almeida, a former visiting graduate student at Cary who is currently an Assistant Professor at the University of Texas, Rio Grande Valley. The tool can also screen out particularly harmful projects, with Almeida adding, “Fragmentation of river systems, blockage of fish migrations, trapping of sediment, and emission of methane are all worsened by the absence of basin-wide planning.”

Almeida notes that the environmental criteria evaluated have social value too: dams block sediments needed to fertilize crops grown in the floodplain, fishery degradation threatens an important source of food and income, and river fragmentation disrupts the transportation of people and goods.

Running the ‘Amazon EcoVistas’ algorithm on the 158 existing and 351 proposed dams created scenarios based on all possible combinations of these projects. This allows it to determine the ‘Pareto-optimal frontier’: the combinations of hydropower projects that minimize negative environmental effects for any given level of aggregate hydropower output. This process is extremely computationally intensive; among the 509 total projects there are 2⁵⁰⁹ (or ~10¹⁵³) possible combinations, with six dimensions (energy output plus the five environmental criteria) evaluated for each.
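On a toy scale, the Pareto step works like the sketch below: score every portfolio of candidate dams on total energy output and an aggregate environmental cost, then keep only the portfolios no other portfolio beats on both axes. The dam numbers here are fabricated for illustration, and the actual study cannot enumerate 2⁵⁰⁹ portfolios; that is exactly why it relies on artificial intelligence and high-performance computing rather than brute force.

```python
# Toy Pareto frontier over dam portfolios (fabricated data, brute force).
from itertools import chain, combinations

# (energy output, environmental cost) per candidate dam: fabricated numbers.
dams = [(30, 12), (45, 30), (10, 2), (60, 55), (25, 8)]

def portfolios(n):
    """All subsets of n candidate dams, by index."""
    return chain.from_iterable(combinations(range(n), r) for r in range(n + 1))

scored = [(sum(dams[i][0] for i in p), sum(dams[i][1] for i in p), p)
          for p in portfolios(len(dams))]

# Keep portfolios that are not dominated: no other portfolio has at least
# as much energy AND at most as much cost (unless its scores are identical).
frontier = [a for a in scored
            if not any(b[0] >= a[0] and b[1] <= a[1] and b[:2] != a[:2]
                       for b in scored)]
for energy, cost, p in sorted(frontier):
    print(f"dams {p}: energy={energy}, cost={cost}")
```

Each point on the printed frontier answers the planner's question directly: for this much energy, this is the least environmental damage any combination of these dams can achieve.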

Lead author Alexander Flecker, Professor in the Department of Ecology and Evolutionary Biology at Cornell University, says, “All decisions around dam siting involve complex tradeoffs. The Pareto-optimal frontier provides a clear way to evaluate those tradeoffs as we seek to balance energy production and diverse environmental consequences.”

For example, dams in steep Andean valleys of upper Amazon rivers create smaller reservoirs, and thus inundate less land and emit less methane. Dams built higher in the river system are also less disruptive for fish that need to migrate long distances, while dams built lower in the system block fish headed to upstream reaches of the river. However, Andean dams trap mountain sediments needed to nourish downstream ecosystems and maintain floodplains important to people and wildlife. And dams in steep valleys are more likely to store water at higher flows, thereby creating more disruptive alterations to flows downstream.

Image: Locations and energy generation capacities of the 158 existing (red) and 351 proposed (yellow) hydropower dams in the Amazon basin. Credit: Flecker et al. 2022

Flecker continues, “There’s no one-size-fits-all solution to minimize negative environmental impacts of dam construction. But the most damaging impacts can be averted by weighing the various ecological and social costs of different combinations of projects. Our novel computational framework is the first to make this kind of evaluation possible on such a vast basin-wide scale.” 

“Applying our method to existing dams in the Amazon shows how a lack of coordinated planning to date has resulted in projects that are collectively more harmful than would have been the case had alternative, strategically selected portfolios of dams been built,” Almeida explains. “This is true for all five criteria that we evaluated. Planning across borders would benefit all countries in the region – both in terms of meeting energy needs and facilitating better environmental outcomes.”

By identifying opportunities for more sustainable hydropower development, ‘Amazon EcoVistas’ could prove useful to energy planners, decision-makers, and researchers working to implement strategic, whole-basin dam planning. It could also help evaluate priorities for dam removal in regions with aging dams such as North America and Europe.

Hamilton concludes, “Hydroelectric energy planning typically happens on a national basis, even though electricity is exported across borders. Our evaluations demonstrate that coordinated whole-basin planning can reduce environmental impacts while optimizing energy production and maintaining crucial ecosystem services.”