Minnesota researchers demo a new machine learning method that improves environmental predictions

The algorithm is 'taught' the rules of the physical world to help researchers make better predictions

Machine learning algorithms do a lot for us every day: they send unwanted emails to our spam folder, warn us if our car is about to back into something, and recommend what TV show to watch next. Now, these same algorithms are increasingly being used to make environmental predictions for us.

A team of researchers from the University of Minnesota, the University of Pittsburgh, and the U.S. Geological Survey recently published a new study on predicting flow and temperature in river networks in the proceedings of the 2021 Society for Industrial and Applied Mathematics (SIAM) International Conference on Data Mining (SDM21). The study, funded by the National Science Foundation (NSF), presents a machine-learning method that provides more accurate stream and river temperature predictions, even when little data is available. These temperature predictions are used to determine the suitability of aquatic habitats, evaporation rates, greenhouse gas exchange, and the efficiency of thermoelectric energy production.

The research demonstrates a new machine learning method where the algorithm is "taught" the rules of the physical world to make better predictions and steer the algorithm toward physically meaningful relationships between inputs and outputs.

The study presents a model that can make more accurate river and stream temperature predictions, even when little data is available, which is the case in most rivers and streams. The model can also better generalize to different time periods.

"Water temperature in streams is a 'master variable' for many important aquatic systems, including the suitability of aquatic habitats, evaporation rates, greenhouse gas exchange, and efficiency of thermoelectric energy production," said Xiaowei Jia, a lead author of the study and assistant professor in the Department of Computer Science in the University of Pittsburgh's School of Computing and Information. "Accurate prediction of water temperature and streamflow also aids in decision making for resource managers, for example helping them to determine when and how much water to release from reservoirs to downstream rivers."

A common criticism of machine learning is that the predictions aren't rooted in physical meaning. That is, the algorithms are just finding correlations between inputs and outputs, and sometimes those correlations can be "spurious" or give false results. The model often won't be able to handle a situation where the relationship between inputs and outputs changes.

The new method published by Jia, who is also a 2020 Ph.D. graduate of the University of Minnesota Department of Computer Science and Engineering in the College of Science and Engineering, and his colleagues uses "process-guided" or "knowledge-guided" machine learning. This method is applied to a use case of water temperature prediction in the Delaware River Basin (DRB) and is designed to overcome some of the common pitfalls of prediction using machine learning. The method informs the machine learning model with relatively simple processes: correlation through time, the spatial connections between streams, and energy budget equations.
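To give a flavor of the idea, a knowledge-guided objective can be written as a standard prediction error plus a penalty for violating a known physical constraint. The sketch below is hypothetical and much simpler than the authors' model; the function names, the penalty term, and the weight `lambda_phys` are all illustrative assumptions.

```python
import numpy as np

def knowledge_guided_loss(y_pred, y_obs, physics_residual, lambda_phys=0.5):
    """Combine a supervised error with a physics-consistency penalty.

    y_pred, y_obs       : predicted and observed stream temperatures (deg C)
    physics_residual    : per-sample violation of a known constraint,
                          e.g. the imbalance of a simplified energy budget
    lambda_phys         : weight trading off data fit vs. physical consistency
    """
    # Standard data-fit term: mean squared error on the observations.
    supervised = np.mean((y_pred - y_obs) ** 2)
    # Physics term: penalize predictions that violate the known process.
    # Crucially, this term can be evaluated even for times and places
    # with no temperature observations at all.
    physics = np.mean(physics_residual ** 2)
    return supervised + lambda_phys * physics
```

Because the physics term needs no labels, it steers the model toward physically meaningful input-output relationships even where observations are sparse, which is the scenario the study targets.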

Data sparsity and variability in stream temperature dynamics are not unique to the Delaware River Basin, but relative to most of the continental United States the basin is well monitored for water temperature. That combination makes it an ideal place to develop and test new methods for stream temperature prediction.

An interactive visual explainer released by the U.S. Geological Survey highlights these model developments and the importance of water temperature predictions in the DRB. The visualization demonstrates the societal need for water temperature predictions: the basin's reservoirs provide drinking water to more than 15 million people, but also face competing demands to maintain downstream flows and cold-water habitat for important game fish species. Reservoir managers can release cold water when they anticipate water temperature will exceed critical thresholds, and accurate water temperature predictions are key to using limited water resources only when necessary.

The recent study builds on a collaboration between water scientists at the U.S. Geological Survey and University of Minnesota Twin Cities computer scientists in Professor Vipin Kumar's lab in the College of Science and Engineering's Department of Computer Science and Engineering, where researchers have been developing knowledge-guided machine learning techniques.

"These knowledge-guided machine learning techniques are fundamentally more powerful than standard machine learning approaches and traditional mechanistic models used by the scientific community to address environmental problems," Kumar said.

These new generations of machine learning methods, funded by NSF's Harnessing the Data Revolution Program, are being used to address a variety of environmental problems such as improving lake and stream temperature predictions.

In another new NSF-funded study on predicting the water temperature dynamics of unmonitored lakes, published in the American Geophysical Union's Water Resources Research and led by University of Minnesota Department of Computer Science and Engineering Ph.D. candidate Jared Willard, researchers show how knowledge-guided machine learning models can tackle one of the most challenging environmental prediction problems: prediction in unmonitored ecosystems.

Models were transferred from well-observed lakes to lakes with few to no observations, leading to accurate predictions even in lakes where temperature observations don't exist. Researchers say their approach readily scales to thousands of lakes, demonstrating that the method (with meaningful predictor variables and high-quality source models) is a promising approach for many kinds of unmonitored systems and environmental variables in the future.
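The transfer idea can be sketched in miniature: fit a model where observations exist, then reuse it where they do not. The toy linear model below is purely illustrative, far simpler than the trained models in the study, and all data and variable names are invented for the example.

```python
import numpy as np

def fit_linear(X, y):
    # Least-squares fit on a well-observed "source" lake.
    X1 = np.hstack([X, np.ones((len(X), 1))])  # add an intercept column
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return coef

def predict(coef, X):
    X1 = np.hstack([X, np.ones((len(X), 1))])
    return X1 @ coef

# Source lake: plenty of temperature observations to train on.
rng = np.random.default_rng(0)
X_source = rng.uniform(0.0, 30.0, size=(200, 1))   # e.g. air temperature
y_source = 0.8 * X_source[:, 0] + 2.0              # toy water temperature

coef = fit_linear(X_source, y_source)

# Target lake: no temperature observations at all,
# so we reuse the source model's fitted coefficients directly.
X_target = np.array([[10.0], [20.0]])
y_target_pred = predict(coef, X_target)
```

The study's contribution lies in choosing good source lakes and models so that this kind of reuse stays accurate across thousands of target lakes; the sketch only shows the reuse step itself.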

Dark matter: real stuff or gravity misunderstood?

For many years now, astronomers and physicists have disagreed. Is the mysterious dark matter that we observe deep in the Universe real, or is what we see the result of subtle deviations from the laws of gravity as we know them? In 2016, Dutch physicist Erik Verlinde proposed a theory of the second kind: emergent gravity. New research, published in Astronomy & Astrophysics this week, pushes the limits of dark matter observations to the unknown outer regions of galaxies, and in doing so re-evaluates several dark matter models and alternative theories of gravity. Measurements of the gravity of 259,000 isolated galaxies show a very close relationship between the contributions of dark matter and those of ordinary matter, as predicted in Verlinde's theory of emergent gravity and an alternative model called Modified Newtonian Dynamics. However, the results also appear to agree with a supercomputer simulation of the Universe that assumes that dark matter is real stuff.

The new research was carried out by an international team of astronomers, led by Margot Brouwer (RUG and UvA). Further important roles were played by Kyle Oman (RUG and Durham University) and Edwin Valentijn (RUG). In 2016, Brouwer also performed the first test of Verlinde's ideas; this time, Verlinde himself also joined the research team.

Image: in the centre, the elliptical galaxy NGC5982; to the right, the spiral galaxy NGC5985. These two types of galaxies turn out to behave very differently when it comes to the extra gravity – and therefore possibly the dark matter – in their outer regions. Images: Bart Delsaert (www.delsaert.com).

Matter or gravity?

So far, dark matter has never been observed directly – hence the name. What astronomers observe in the night sky are the consequences of matter that is potentially present: bending of starlight, stars that move faster than expected, and even effects on the motion of entire galaxies. Without a doubt, all of these effects are caused by gravity, but the question is: are we truly observing additional gravity, caused by invisible matter, or are the laws of gravity themselves the thing that we haven’t fully understood yet?

To answer this question, the new research uses a similar method to the one used in the original test in 2016. Brouwer and her colleagues make use of an ongoing series of photographic measurements that started ten years ago: the KiloDegree Survey (KiDS), performed using ESO's VLT Survey Telescope in Chile. In these observations, one measures how starlight from far away galaxies is bent by gravity on its way to our telescopes. Whereas in 2016 the measurements of such 'lens effects' only covered an area of about 180 square degrees in the night sky, in the meantime this has been extended to about 1000 square degrees – allowing the researchers to measure the distribution of gravity in around a million different galaxies.

Comparative testing

Brouwer and her colleagues selected over 259,000 isolated galaxies, for which they were able to measure the so-called 'Radial Acceleration Relation' (RAR). This RAR compares the amount of gravity expected based on the visible matter in a galaxy to the amount of gravity that is actually present – in other words: the result shows how much 'extra' gravity there is, in addition to that due to normal matter. Until now, the amount of extra gravity had only been determined in the outer regions of galaxies by observing the motions of stars, and in a region about five times larger by measuring the rotational velocity of cold gas. Using the lensing effects of gravity, the researchers were now able to determine the RAR at gravitational strengths that were one hundred times smaller, allowing them to penetrate much deeper into the regions far outside the individual galaxies.
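For readers who want the shape of the relation, a widely used empirical fit to the RAR (from McGaugh et al. 2016, not from this study's measurements) maps the acceleration expected from visible matter, g_bar, to the observed total acceleration, g_obs. The sketch below evaluates that published fitting function; the variable names are our own.

```python
import numpy as np

G_DAG = 1.2e-10  # characteristic acceleration scale in m/s^2 (empirical fit value)

def rar_expected(g_bar):
    """Empirical RAR fitting function (McGaugh et al. 2016):
    observed acceleration as a function of the baryonic
    (visible-matter) acceleration g_bar, both in m/s^2."""
    return g_bar / (1.0 - np.exp(-np.sqrt(g_bar / G_DAG)))

# High acceleration (inner galaxy): g_obs is close to g_bar,
# i.e. there is little 'extra' gravity.
g_inner = rar_expected(1e-8)

# Low acceleration (far outskirts): g_obs approaches sqrt(g_bar * G_DAG),
# a large excess over what the visible matter alone provides.
g_outer = rar_expected(1e-13)
```

The lensing measurements in this study probe exactly that low-acceleration regime, far outside the visible galaxy, where the excess over the visible-matter prediction is largest.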

This made it possible to measure the extra gravity extremely precisely – but is this gravity the result of invisible dark matter, or do we need to improve our understanding of gravity itself? Author Kyle Oman indicates that the assumption of 'real stuff' at least partially appears to work: "In our research, we compare the measurements to four different theoretical models: two that assume the existence of dark matter and form the base of computer simulations of our universe, and two that modify the laws of gravity – Erik Verlinde's model of emergent gravity and the so-called 'Modified Newtonian Dynamics' or MOND. One of the two dark matter simulations, MICE, makes predictions that match our measurements very nicely. It came as a surprise to us that the other simulation, BAHAMAS, led to very different predictions. That the predictions of the two models differed at all was already surprising, since the models are so similar. But moreover, we would have expected that if a difference showed up, BAHAMAS would perform best. BAHAMAS is a much more detailed model than MICE, coming much closer to our current understanding of how galaxies form in a universe with dark matter. Still, MICE performs better if we compare its predictions to our measurements. In the future, based on our findings, we want to further investigate what causes the differences between the simulations."

Young and old galaxies

Thus it seems that at least one dark matter model does appear to work. However, the alternative models of gravity also predict the measured RAR. A standoff, it seems – so how do we find out which model is correct? Margot Brouwer, who led the research team, continues: “Based on our tests, our original conclusion was that the two alternative gravity models and MICE matched the observations reasonably well. However, the most exciting part was yet to come: because we had access to over 259,000 galaxies, we could divide them into several types – relatively young, blue spiral galaxies versus relatively old, red elliptical galaxies.” Those two types of galaxies come about in very different ways: red elliptical galaxies form when different galaxies interact, for example when two blue spiral galaxies pass by each other closely or even collide. As a result, the expectation within the particle theory of dark matter is that the ratio between regular and dark matter in the different types of galaxies can vary. Models such as Verlinde’s theory and MOND on the other hand do not make use of dark matter particles, and therefore predict a fixed ratio between the expected and measured gravity in the two types of galaxies – that is, independent of their type. Brouwer: “We discovered that the RARs for the two types of galaxies differed significantly. That would be a strong hint towards the existence of dark matter as a particle.”

However, there is a caveat: gas. Many galaxies are probably surrounded by a diffuse cloud of hot gas, which is very difficult to observe. If it were the case that there is hardly any gas around young blue spiral galaxies, but that old red elliptical galaxies live in a large cloud of gas – of roughly the same mass as the stars themselves – then that could explain the difference in the RAR between the two types. To reach a final judgment on the measured difference, one would therefore also need to measure the amounts of diffuse gas – and this is exactly what is not possible using the KiDS telescopes. Other measurements have been done for a small group of around one hundred galaxies, and these measurements indeed found more gas around elliptical galaxies, but it is still unclear how representative those measurements are for the 259,000 galaxies that were studied in the current research.

Dark matter for the win? A plot showing the Radial Acceleration Relation (RAR), with an image of the elliptical galaxy M87 in the background to indicate the distance to the centre of the galaxy. The plot shows how the measurements range from high gravitational acceleration in the centre of the galaxy to low gravitational acceleration in the far outer regions. Image: Chris Mihos (Case Western Reserve University) / ESO.

If it turns out that extra gas cannot explain the difference between the two types of galaxies, then the results of the measurements are easier to understand in terms of dark matter particles than in terms of alternative models of gravity. But even then, the matter is not settled yet. While the measured differences are hard to explain using MOND, Erik Verlinde still sees a way out for his own model. Verlinde: “My current model only applies to static, isolated, spherical galaxies, so it cannot be expected to distinguish the different types of galaxies. I view these results as a challenge and inspiration to develop an asymmetric, dynamical version of my theory, in which galaxies with a different shape and history can have a different amount of ‘apparent dark matter’.”

Therefore, even after the new measurements, the dispute between dark matter and alternative gravity theories is not settled yet. Still, the new results are a major step forward: if the measured difference in gravity between the two types of galaxies is correct, then the ultimate model, whichever one that is, will have to be precise enough to explain this difference. This means in particular that many existing models can be discarded, which considerably thins out the landscape of possible explanations. On top of that, the new research shows that systematic measurements of the hot gas around galaxies are necessary. Edwin Valentijn formulates it as follows: “As observational astronomers, we have reached the point where we can measure the extra gravity around galaxies more precisely than we can measure the amount of visible matter. The counterintuitive conclusion is that we must first measure the presence of ordinary matter in the form of hot gas around galaxies before future telescopes such as Euclid can finally solve the mystery of dark matter.”

Japanese researchers develop the most successful simulation of bubble cascading flow in Guinness beer

Researchers from Osaka University, Japan and Kirin HD explain the physics that underpins cascading flow of nitrogenated stout beer, with applications to water purification and pharmaceutical production

As far back as 1959, brewers at Guinness developed a system that fundamentally altered the texture of their draught beer. Now, researchers from Japan have solved the physics of Guinness's cascading flow, which will have widespread applications to technology in life and environmental sciences.

In a study, researchers from Osaka University have revealed why the nitrogen bubbles of Guinness draught beer flow collectively, much like a fluid.

Image: the bubble texture of Guinness beer in a pint glass, with the creamy texture of tiny bubbles and their fascinating collective motion.

The bubbles of most just-opened carbonated beverages simply move upwards, following Archimedes' principle. Much of the appeal of draught Guinness is that its bubbles instead sink and flow collectively, a phenomenon known as a "bubble cascade." Brewers and researchers believe this collective flow behavior must have something to do with how draught Guinness is dispensed, but at present the physics of the collective flow remains unresolved, something the researchers at Osaka University aimed to address.
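A back-of-the-envelope calculation hints at why tiny nitrogen bubbles can be dragged downward at all: by Stokes' law, a small bubble's buoyant rise speed scales with the square of its radius, so bubbles of a few tens of micrometres rise only millimetres per second, slow enough for a downward liquid current near the glass wall to sweep them along. The numbers below are illustrative textbook values, not figures from the study.

```python
def stokes_rise_velocity(radius_m, mu=1.0e-3, drho=1000.0, g=9.81):
    """Terminal rise speed (m/s) of a small bubble in water via Stokes' law.

    radius_m : bubble radius in metres
    mu       : dynamic viscosity of the liquid (Pa*s, ~water)
    drho     : density difference between liquid and gas (kg/m^3)
    g        : gravitational acceleration (m/s^2)

    Only valid at low Reynolds number, i.e. for very small bubbles;
    the larger-bubble value below is a rough illustration only.
    """
    return (2.0 / 9.0) * drho * g * radius_m**2 / mu

# A tiny nitrogen bubble (~30 um radius): rises at roughly 2 mm/s,
# easily overwhelmed by bulk liquid motion in the glass.
v_tiny = stokes_rise_velocity(30e-6)

# A much larger CO2 bubble (~250 um radius) rises dozens of times faster,
# which is why ordinary carbonated drinks show no cascade.
v_large = stokes_rise_velocity(250e-6)
```

The quadratic dependence on radius is the key point: halving the bubble size cuts the rise speed by a factor of four, making small-bubble nitrogenated beers especially prone to being carried by the surrounding flow.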

"A wide range of lab work and computational simulations has been useful for estimating individual and collective bubble motion, but only after the flow has occurred," says lead and senior author of the study Tomoaki Watamura, Osaka University. "We're interested in predicting cascading flow via mathematical modeling, rather than results from experiments or simulations after the fact."

To do this, the researchers used numerical simulations to approximate the fluid and bubble particles of cascading draught beer. Benchtop experiments tested both a transparent "pseudo-Guinness fluid," a mixture of ultra-small hollow particles in tap water, and actual Guinness beer.

"The simulation results matched experimental data, over a wide range of glass sizes and other conditions," explains Watamura. "We have developed the most successful simulation of cascading flow in Guinness beer to date."

Intriguingly, cascading bubbles may not require a nitrogenated stout beer after all.

"The bubble diameter and bubble volume fraction in carbonated water, poured into a vessel with the approximate dimensions of a common 200-liter drum at an inclination angle, facilitate cascading bubbles," says Hideyuki Wakabayashi, Kirin HD. "Furthermore, the associated fluid motion near an inclined container wall is relevant to maintaining product quality during brewing, suggesting an immediate application of our findings."

In addition to providing insight into optimizing brewing conditions, this research has clear applications to any work that involves fermenters or cell incubation. As such, the Osaka University and Kirin HD researchers' findings may be used to meet diverse needs, such as pharmaceutical production from industrial-scale cell cultures and city water purification.

Figure: (a) Configuration and snapshot of particle concentration distribution. (b) Comparison of the distribution of simulated particles (left) and Guinness beer bubbles (right) at the midpoint. (c) Snapshots of a bubble-concentration wave forming in a pint glass (top left), a cocktail glass (top center), and a 1-oz (30-ml) shot glass (top right), with phase diagrams of scaled velocity fluctuation (bottom). The shaded areas correspond to the typical dimensions of the glasses.