Brown researchers bypass the need for massive data sets by combining machine learning with active learning techniques

When it comes to predicting disasters brought on by extreme events (think earthquakes, pandemics, or “rogue waves” that could destroy coastal structures), computational modeling faces an almost insurmountable challenge: Statistically speaking, these events are so rare that there is just not enough data on them to use predictive models to accurately forecast when they’ll happen next.

But a team of researchers from Brown University and Massachusetts Institute of Technology says it doesn’t have to be that way.

In a new study, the scientists describe how they combined statistical algorithms — which need fewer data to make accurate, efficient predictions — with a powerful machine learning technique developed at Brown and trained it to predict scenarios, probabilities, and sometimes even the timeline of rare events despite the lack of a historical record on them.

Doing so, the research team found that this new framework can provide a way to circumvent the need for massive amounts of data that are traditionally needed for these kinds of computations, instead essentially boiling down the grand challenge of predicting rare events to a matter of quality over quantity.

“You have to realize that these are stochastic events,” said George Karniadakis, a professor of applied mathematics and engineering at Brown and a study author. “An outburst of a pandemic like COVID-19, environmental disaster in the Gulf of Mexico, an earthquake, huge wildfires in California, a 30-meter wave that capsizes a ship — these are rare events and because they are rare, we don't have a lot of historical data. We don't have enough samples from the past to predict them further into the future. The question that we tackle in the paper is: What is the best possible data that we can use to minimize the number of data points we need?”

The researchers found the answer in a sequential sampling technique called active learning. These statistical algorithms not only analyze the data fed into them; more importantly, they learn from that information to flag new, relevant data points that are equally or even more important to the outcome being calculated. At the most basic level, they allow more to be done with less.
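The paper does not publish its sampling code, but the active-learning loop described above can be illustrated with a minimal sketch. Here a small ensemble of cheap surrogate models is fit to bootstrap resamples of a few labeled points, and the next sample is queried wherever the ensemble disagrees most. The `simulator` function is a hypothetical stand-in for an expensive rare-event model, not the study's actual system:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical expensive simulator standing in for a rare-event model.
def simulator(x):
    return np.sin(3 * x) + 0.5 * np.exp(-8 * (x - 0.7) ** 2)

def features(x):
    # Simple polynomial features for a cheap surrogate model.
    return np.stack([np.ones_like(x), x, x**2, x**3], axis=1)

X = rng.uniform(0, 1, 6)           # small initial labeled set
y = simulator(X)
candidates = np.linspace(0, 1, 200)

for step in range(10):
    # Ensemble of surrogates fit to bootstrap resamples of the labeled data.
    preds = []
    for _ in range(20):
        idx = rng.integers(0, len(X), len(X))
        coef, *_ = np.linalg.lstsq(features(X[idx]), y[idx], rcond=None)
        preds.append(features(candidates) @ coef)
    disagreement = np.std(preds, axis=0)
    # Query the simulator where the ensemble disagrees most.
    x_new = candidates[np.argmax(disagreement)]
    X = np.append(X, x_new)
    y = np.append(y, simulator(x_new))

print(len(X))  # labeled points after 10 active-learning queries
```

After ten queries the labeled set has grown from 6 to 16 points, all concentrated where the surrogate is least certain, which is the "quality over quantity" idea in miniature.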

That’s critical to the machine learning model the researchers used in the study. Called DeepONet, the model is a type of artificial neural network, which uses interconnected nodes in successive layers that roughly mimic the connections made by neurons in the human brain. DeepONet is known as a deep neural operator. It’s more advanced and powerful than a typical artificial neural network because it is actually two neural networks in one, processing data in two parallel streams. This allows it to analyze giant sets of data and scenarios at breakneck speed and to output equally massive sets of probabilities once it learns what it’s looking for.
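The "two networks in one" structure can be sketched in a few lines of NumPy. In a deep neural operator of this kind, one network (the branch) encodes an input function sampled at fixed sensor points, another (the trunk) encodes a query location, and the output is the dot product of their embeddings. This is an untrained, illustrative forward pass with made-up layer sizes, not the architecture or weights from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

def mlp(x, weights):
    # Tiny fully connected network with tanh activations.
    for W, b in weights[:-1]:
        x = np.tanh(x @ W + b)
    W, b = weights[-1]
    return x @ W + b

def init(sizes, rng):
    # Random weights for each layer; biases start at zero.
    return [(rng.normal(0, 0.3, (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

m, p = 50, 20                      # sensor count, latent width
branch = init([m, 40, p], rng)     # encodes the input function u
trunk = init([1, 40, p], rng)      # encodes the query location y

u = np.sin(np.linspace(0, np.pi, m))   # input function sampled at m sensors
ys = np.linspace(0, 1, 100)[:, None]   # 100 query locations

# Operator output G(u)(y): dot product of branch and trunk embeddings.
G = mlp(u[None, :], branch) @ mlp(ys, trunk).T
print(G.shape)  # one input function evaluated at 100 query points
```

Because the two streams are evaluated independently and only combined at the end, the same encoded input function can be queried at any number of locations cheaply.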

The bottleneck with this powerful tool, especially as it relates to rare events, is that deep neural operators need tons of data to be trained to make calculations that are effective and accurate.

In the paper, the research team shows that, combined with active learning techniques, the DeepONet model can be trained to identify the parameters or precursors that lead up to the disastrous event being analyzed, even when data points are scarce.

“The thrust is not to take every possible data and put it into the system, but to proactively look for events that will signify the rare events,” Karniadakis said. “We may not have many examples of the real event, but we may have those precursors. Through mathematics, we identify them, which together with real events will help us to train this data-hungry operator.”

In the paper, the researchers apply the approach to pinpointing parameters and different ranges of probabilities for dangerous spikes during a pandemic, finding and predicting rogue waves, and estimating when a ship will crack in half due to stress. For example, with rogue waves — ones that are greater than twice the size of surrounding waves — the researchers found they could discover and quantify when rogue waves will form by looking at probable wave conditions that nonlinearly interact over time, leading to waves sometimes three times their original size.

The researchers found their new method outperformed more traditional modeling efforts, and they believe it presents a framework that can efficiently discover and predict all kinds of rare events.

In the paper, the research team outlines how scientists should design future experiments to minimize costs and increase forecasting accuracy. Karniadakis, for example, is already working with environmental scientists to use the novel method to forecast climate events, such as hurricanes.

The study was led by Ethan Pickering and Themistoklis Sapsis from MIT. DeepONet was introduced in 2019 by Karniadakis and other Brown researchers. They are currently seeking a patent for the technology. The study was supported with funding from the Defense Advanced Research Projects Agency, the Air Force Research Laboratory, and the Office of Naval Research.

A depiction of the Novatron nuclear fusion reactor. (Image: Novatron Fusion Group AB)

Sweden increases investment in fusion energy

KTH Royal Institute of Technology, a public university in Stockholm, Sweden, has announced that it will make a joint investment in fusion power with Novatron Fusion Group AB and EIT InnoEnergy. The purpose of the work is to evaluate new technology intended to stabilize fusion plasma, matter that exceeds 100 million degrees Celsius.

Handling fusion plasma poses a key challenge before fusion power can be realized as a stable and sustainable energy solution. Neither steel nor any other material can withstand the plasma’s extreme heat.

In France, the ITER fusion reactor stabilizes plasma through a process involving powerful magnets and advanced controls. The KTH-Novatron-EIT InnoEnergy collaboration, however, will evaluate a solution in which the plasma is kept naturally stable. More fusion scientists and a large number of trained fusion energy engineers will also be required.

“We believe that this international collaboration is an important prerequisite for the university’s success,” says Lisa Ericsson, head of KTH Innovation. Ericsson leads one of the KTH teams involved in the collaboration around the new technology, which KTH alumnus Jan Jäderberg is developing at the newly formed company, Novatron Fusion Group.

Peter Roos, CEO of Novatron Fusion Group, says: “We are proud to work in partnership with KTH Royal Institute of Technology and EIT InnoEnergy on fusion power as the sustainable energy source for the future. It highlights the scientific importance of the Novatron concept, and it will provide us with further access to state-of-the-art technology, leading research in plasma physics, and supercomputer simulation capabilities.”

The collaboration aims to enable fusion on a large commercial scale, developing it into a sustainable energy solution for a future in which electricity demand is expected to keep growing.

“It is important to point out that the technology faces experimental verification,” says Stefan Östlund, vice president for Global Relations at KTH and professor of Electric Power Engineering.

Östlund says that if the technology meets expectations, it could produce fusion energy more simply and cheaply than other solutions currently being tested. KTH would then be one of the players involved in making this possible through expanded fusion research and a significant increase in education in fusion technology.

Ericsson says that investment is important from the perspective of both KTH and Sweden. She says that Commonwealth Fusion Systems, a spin-off company at the Massachusetts Institute of Technology (MIT), has contributed to interest in MIT around fusion energy, which has led to the formation of a number of new fusion-related companies and research projects.

“This collaboration strengthens KTH's role both in fusion research and innovation development,” Ericsson says. “If this works, it will lead to the emergence of a completely new global industry starting at KTH.”

Östlund says he is satisfied with the collaboration between KTH and EIT InnoEnergy, pointing to the innovation work done through KTH Innovation and to education in the form of successful master’s programs.

“It has proven to be very rewarding for both parties. The example of Novatron and fusion technology is another collaboration that can develop into something highly valuable,” Östlund says.

Diego Pavia, CEO of EIT InnoEnergy, says: “Transforming a technical breakthrough into a viable and commercial solution requires indeed a range of skills, resources, and expertise that can only be achieved by bringing different partners together. We are pleased to announce this collaboration.”

Earliest galaxies in the Universe (photo: NASA - James Webb Space Telescope)

Tel Aviv University's Prof. Barkana models the SARAS result on the earliest galaxies in the Universe

First-of-its-kind study sheds light on the epoch of the first stars, 200M years after the Big Bang

An international team of astrophysicists, including Prof. Rennan Barkana from Tel Aviv University's Sackler School of Physics and Astronomy at the Raymond & Beverly Sackler Faculty of Exact Sciences, has for the first time statistically characterized the first galaxies in the Universe, which formed only 200 million years after the Big Bang.

According to the groundbreaking results, the earliest galaxies were relatively small and dim. They were fainter than present-day galaxies, and likely processed only 5% or less of their gas into stars. Moreover, the intensity of the radio waves emitted by the earliest galaxies wasn't much higher than that of modern galaxies.

Researching the "Cosmic Dawn"

This new study, carried out together with the SARAS observation team, was led by the research group of Dr. Anastasia Fialkov from the University of Cambridge, England, a former Ph.D. student of TAU's Prof. Barkana. The results of this innovative study were published in the prestigious journal Nature Astronomy.

“This is a very new field and a first-of-its-kind study”, explains Prof. Barkana. “We are trying to understand the epoch of the first stars in the Universe, known as the 'cosmic dawn', about 200 million years after the Big Bang."

"The James Webb Space Telescope, for example, can’t see these stars. It might only detect a few particularly bright galaxies from a somewhat later period. Our goal is to probe the entire population of the first stars.” 

Searching for the First Stars

According to the standard picture, before stars began to fuse heavier elements inside their cores, our Universe was nothing but a cloud of hydrogen atoms from the Big Bang (other than some helium and a lot of dark matter).

Today, the Universe is also filled with hydrogen, but in the modern Universe, it is mostly ionized due to radiation from stars.

“Hydrogen atoms naturally emit light at a wavelength of 21cm, which falls within the spectrum of radio waves”, explains Prof. Barkana. “Since stellar radiation affects the light emitted by hydrogen atoms, we use hydrogen as a detector in our search for the first stars: if we can detect the effect of stars on hydrogen, we will know when they were born, and in what types of galaxies. I was among the first theorists to develop this concept 20 years ago, and now observers can implement it in actual experiments. Teams of experimentalists all over the world are currently attempting to discover the 21cm signal from hydrogen in the early Universe.”
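The rest-frame wavelength quoted above pins down where these experiments must listen. Light emitted at 21 cm (about 1420.4 MHz) is stretched by cosmic expansion by a factor of 1 + z, so a signal from the cosmic dawn, roughly redshift 15 to 20, arrives in the low-frequency radio band. A quick back-of-the-envelope calculation (the redshift values are illustrative, not taken from the study):

```python
# Rest-frame 21 cm line of neutral hydrogen.
REST_FREQ_MHZ = 1420.405751768

def observed_freq(z):
    """Observed frequency of the 21 cm line emitted at redshift z."""
    return REST_FREQ_MHZ / (1 + z)

# Cosmic dawn (~200 Myr after the Big Bang) corresponds to roughly z = 15-20,
# so the redshifted signal lands below ~90 MHz.
for z in (15, 17, 20):
    print(f"z = {z}: {observed_freq(z):.1f} MHz")
```

This is why instruments like EDGES and SARAS operate in the tens-of-megahertz range, a band crowded with terrestrial FM broadcasts and ionospheric effects, which is part of what makes the measurement so difficult.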

One of these teams is EDGES, which uses a small radio antenna to measure the average intensity, across the entire sky, of radio waves arriving from different periods of the cosmic dawn. In 2018, the EDGES team announced that it had found the 21cm signal from ancient hydrogen.

“There was a problem with their findings, however," says Prof. Barkana. "We could not be sure that the measured signal did indeed come from hydrogen in the early Universe. It could have been a fake signal produced by the electrical conductivity of the ground below the antenna. Therefore, we all waited for an independent measurement that would either confirm or refute these results."

Setting Limits

"Last year, astronomers in India carried out an experiment called SARAS, in which the antenna was made to float on a lake, a uniform surface of water that could not mimic the desired signal. According to the results of the new experiment, there was a 95% probability that EDGES did not detect a real signal from the early Universe."

"SARAS found an upper limit for the genuine signal, implying that the signal from early hydrogen is likely significantly weaker than the one measured by EDGES. We modeled the SARAS result and worked out the implications for the first galaxies, i.e., what their properties were, given the upper limit determined by SARAS. Now we can say for the first time that galaxies of certain types could not have existed at that early time.”

Prof. Barkana concludes: “Modern galaxies, such as our own Milky Way, emit large amounts of radio waves. In our study, we placed an upper limit on the star formation rate in ancient galaxies and on their overall radio emission. And this is only the beginning. Every year the experiments become more reliable and precise, and consequently, we expect to find stronger upper limits, giving us even better constraints on the cosmic dawn. We hope that shortly we will have not only limits but a precise, reliable measurement of the signal itself.”