University of Geneva team develops new model to solve part of the solar problem

Something was no longer adding up with the Sun. In the early 2000s, the measured abundances of the chemical elements at its surface were revised downwards, preventing astrophysicists from reconciling the values predicted by their standard model with the new data. Although called into question, the revised abundances have held up through several independent re-analyses. The solar models will therefore have to evolve, especially since they serve as a reference for the study of stars in general. A team from the University of Geneva (UNIGE) in Switzerland, in collaboration with the University of Liège in Belgium, has developed a new theoretical model that solves part of the problem: by taking into account the rotation of the Sun, which has varied over time, and the magnetic fields it generates, the scientists have shown that the Sun's chemical structure can be explained. The model developed by the scientists includes the history of the Sun's rotation as well as the magnetic instabilities it generates. (c) Sylvia Ekström / UNIGE

“The Sun is the star we can characterize best. It therefore constitutes a fundamental test of our understanding of stellar physics. We have measurements of the abundances of its chemical elements, but also measurements of its internal structure, just as we have for the Earth thanks to seismology,” explains Patrick Eggenberger, lecturer and researcher at the UNIGE Department of Astronomy and first author of the study.

These observations must coincide with the predictions of the theoretical models that attempt to explain the Sun's evolution: how does the Sun burn the hydrogen in its core, how is the energy produced there transported to the outer layers, and how do the chemical elements move under the effect of rotation and magnetic fields?

The standard model of the Sun

“The standard solar model used until now treats our star in a simplified way, on the one hand regarding the transport of chemical elements in its deepest layers, and on the other hand regarding its rotation and internal magnetic fields, which have been completely neglected up to now,” emphasizes Gaël Buldgen, a researcher in the UNIGE Department of Astronomy and co-author of the study.


All of this worked satisfactorily until the early 2000s, when an international scientific team drastically revised the solar abundances on the basis of a finer analysis. The new abundances upended the solar models: from then on, no model was able to reproduce the data obtained by helioseismology, i.e., the study of the vibrations of the Sun, and in particular the abundance of helium in the Sun's envelope.

New model

The new model developed by the UNIGE team includes not only the history of the Sun's rotation, undoubtedly faster in the past, but also the magnetic instabilities it generates. “We must simultaneously take into account the effects of rotation and magnetic fields on the transport of chemical elements in our stellar models. This matters for the Sun as well as for stellar physics in general, with a direct impact on the chemical evolution of the Universe, since the chemical elements essential for life on Earth are manufactured in the heart of stars,” says Patrick Eggenberger.

The new model correctly predicts the concentration of helium in the outer layers of the Sun, and also that of lithium, which had resisted modeling until now. “The helium abundance is correctly reproduced by the new model because the internal rotation of the Sun, imposed by the magnetic fields, generates a turbulent mixing that prevents this element from sinking too quickly towards the center of the star; at the same time, the lithium abundance observed at the solar surface is also reproduced, because this same mixing transports lithium to hot regions where it is destroyed,” explains Patrick Eggenberger.

The problem is not fully resolved

The new model does not, however, solve all the challenges posed by helioseismology: “Thanks to helioseismology, we know to a remarkable precision of 500 km the depth at which convective motions of matter begin: 199,500 km below the surface of the Sun. Yet the theoretical models of the Sun are off by 10,000 km!” explains Sébastien Salmon, a researcher at UNIGE and co-author of the article. Although the problem persists with the new model, it opens a new door to understanding: “With the new model of this work, we shed light on the physical processes that can help us resolve this critical disagreement.”

Similar Stars Update

“We are going to have to revise the masses, radii, and ages obtained so far for solar-type stars,” underlines Gaël Buldgen, detailing the next steps. Indeed, in the vast majority of cases the physics of the Sun is transposed to the study of stars that resemble it. If the models used to analyze the Sun are modified, this update must therefore also be carried out for other stars similar to ours.

Patrick Eggenberger specifies: “This is particularly important if we want to better characterize the host stars of planets, for example within the framework of the PLATO mission.” This observatory of 24 telescopes is due to fly to the Sun-Earth Lagrange point 2 (1.5 million kilometers from Earth, on the side opposite the Sun) in 2026 to discover and characterize small planets and refine the characteristics of their host stars.

Algorithms help to distinguish diseases at the molecular level

Machine learning is playing an ever-increasing role in biomedical research. Scientists at the Technical University of Munich (TUM) have now developed a new method that uses molecular data to extract disease subtypes. In the future, this method could help to support the study of larger patient groups. Head of the LipiTUM research group Dr. Josch Konstantin Pauling (left) and PhD student Nikolai Köhler (right) interpret disease-related changes in lipid metabolism using a newly developed network. Image: LipiTUM

Today, doctors define and diagnose most diseases on the basis of symptoms. However, patients with similar symptoms do not necessarily have illnesses with identical causes or the same molecular changes. In biomedicine, one often speaks of the molecular mechanisms of a disease: changes in the regulation of genes, proteins, or metabolic pathways at the onset of illness. The goal of stratified medicine is to classify patients into subtypes at the molecular level in order to provide more targeted treatments.

New machine learning algorithms can help to extract disease subtypes from large pools of patient data. They are designed to independently recognize patterns and correlations in extensive clinical measurements. The LipiTUM junior research group, headed by Dr. Josch Konstantin Pauling of the Chair for Experimental Bioinformatics, has developed such an algorithm.

Complex analysis via an automated web tool

Their method combines the results of existing algorithms to obtain more precise and robust predictions of clinical subtypes. This unifies the characteristics and advantages of each algorithm and eliminates the time-consuming step of tuning each one individually. “This makes it much easier to apply the analysis in clinical research,” reports Dr. Pauling. “For that reason, we have developed a web-based tool that permits online analysis of molecular clinical data by practitioners without prior knowledge of bioinformatics.”

On the website (https://exbio.wzw.tum.de/mosbi/), researchers can submit their data for automated analysis and use the results to interpret their studies. “Another important aspect for us was the visualization of the results. Previous approaches were not capable of generating intuitive visualizations of the relationships between patient groups, clinical factors, and molecular signatures. This changes with the web-based visualization produced by our MoSBi tool,” says Tim Rose, a scientist at the TUM School of Life Sciences. MoSBi stands for “Molecular Signatures using Biclustering”; biclustering is the technique underlying the algorithm.
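Unlike ordinary clustering, which groups whole patients or whole features, biclustering looks for submatrices (a subset of patients together with a subset of molecular features) that behave coherently. As a minimal illustration of the idea, not the MoSBi algorithm itself, the sketch below implements the classic mean-squared-residue node-deletion heuristic (Cheng and Church) in Python with NumPy; the toy data and all names are invented for the example:

```python
import numpy as np

def msr(sub):
    """Mean squared residue of a submatrix (Cheng & Church): low values
    mean the selected rows and columns vary coherently."""
    row_means = sub.mean(axis=1, keepdims=True)
    col_means = sub.mean(axis=0, keepdims=True)
    residue = sub - row_means - col_means + sub.mean()
    return float((residue ** 2).mean())

def find_bicluster(data, delta=0.5):
    """Greedy node deletion: repeatedly drop the worst row or column
    until the remaining submatrix is coherent (MSR <= delta)."""
    rows = list(range(data.shape[0]))
    cols = list(range(data.shape[1]))
    while msr(data[np.ix_(rows, cols)]) > delta and len(rows) > 2 and len(cols) > 2:
        sub = data[np.ix_(rows, cols)]
        residue = (sub - sub.mean(axis=1, keepdims=True)
                       - sub.mean(axis=0, keepdims=True) + sub.mean()) ** 2
        row_scores, col_scores = residue.mean(axis=1), residue.mean(axis=0)
        if row_scores.max() >= col_scores.max():
            del rows[int(row_scores.argmax())]     # drop the worst patient
        else:
            del cols[int(col_scores.argmax())]     # drop the worst feature
    return rows, cols

# Toy data: 20 "patients" x 15 "lipid features", with one coherent block
# (patients 0-4, features 0-4) planted in noise.
rng = np.random.default_rng(0)
data = rng.normal(0.0, 2.0, size=(20, 15))
data[:5, :5] = 5.0 + rng.normal(0.0, 0.05, size=(5, 5))
rows, cols = find_bicluster(data)
```

MoSBi's contribution, as the article describes it, is to run several such biclustering algorithms and merge their results into consensus signatures; the single-algorithm sketch above only shows what one bicluster is.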

Application for clinically relevant questions

With the tool, researchers can now, for example, represent data from cancer studies and simulations of various scenarios. They have already demonstrated the potential of their method in a large-scale clinical study: in cooperation with researchers from the Max Planck Institute in Dresden, the Technical University of Dresden, and the Kiel University Clinic, they studied changes in lipid metabolism in the livers of patients with non-alcoholic fatty liver disease (NAFLD).

This widespread disease is associated with obesity and diabetes. It progresses from non-alcoholic fatty liver (NAFL), in which lipids are deposited in liver cells, to non-alcoholic steatohepatitis (NASH), in which the liver additionally becomes inflamed, and on to liver cirrhosis and tumor formation. Apart from dietary adjustments, no treatments exist to date. Because the disease is characterized and diagnosed by the accumulation of various lipids in the liver, it is important to understand their molecular composition.

Biomarkers for liver disease

Using the MoSBi method, the researchers were able to demonstrate the molecular heterogeneity of the livers of patients in the NAFL stage. “From a molecular standpoint, the liver cells of many NAFL patients were almost identical to those of NASH patients, while others were still largely similar to those of healthy patients. We could also confirm our predictions using clinical data,” says Dr. Pauling. “We were then able to identify two potential lipid biomarkers for disease progression.” This is important for early recognition of the disease and its progression, and for the development of targeted treatments.

The research group is already working on further applications of the method to gain a better understanding of other diseases. “In the future, algorithms will play an even greater role in biomedical research than they do today. They can make it significantly easier to detect complex mechanisms and to find more targeted treatment approaches,” says Dr. Pauling.

Japanese astronomers discover supermassive black holes inside dying galaxies in the early Universe

An international team of astronomers used a database combining observations from the best telescopes in the world, including the Subaru Telescope, to detect the signal from the active supermassive black holes of dying galaxies in the early Universe. The appearance of these active supermassive black holes correlates with changes in the host galaxy, suggesting that a black hole could have far-reaching effects on the evolution of its host galaxy. The COSMOS survey region surrounded by images of galaxies used in this study. In these galaxies star formation ceased around 10 billion years ago. (3-color false-color composite images combining data from the Subaru Telescope and VISTA) (Credit: NAOJ)

The Milky Way Galaxy where we live contains stars of various ages, including stars that are still forming. But in some other galaxies, known as elliptical galaxies, all of the stars are old and roughly the same age. This indicates that early in their histories elliptical galaxies had a period of prolific star formation that suddenly ended. Why this star formation ceased in some galaxies but not in others is not well understood. One possibility is that a supermassive black hole disrupts the gas in some galaxies, creating an environment unsuitable for star formation.

To test this theory, astronomers look at distant galaxies. Because the speed of light is finite, it takes time for light to travel across the void of space: the light we see from an object 10 billion light-years away had to travel for 10 billion years to reach Earth, so it shows us what the galaxy looked like when that light left it, 10 billion years ago. Looking at distant galaxies is thus like looking back in time. But the great distance also means that distant galaxies appear fainter, making them difficult to study.

To overcome these difficulties, an international team led by Kei Ito of SOKENDAI in Japan used the Cosmic Evolution Survey (COSMOS) to sample galaxies 9.5 to 12.5 billion light-years away. COSMOS combines data taken by world-leading telescopes, including the Atacama Large Millimeter/submillimeter Array (ALMA) and the Subaru Telescope, and includes radio-wave, infrared, visible-light, and x-ray data.

The team first used optical and infrared data to identify two groups of galaxies: those with ongoing star formation and those where star formation has stopped. The signal-to-noise ratio of the x-ray and radio data was too low to identify individual galaxies, so the team stacked the data from many different galaxies to produce higher signal-to-noise images of “average” galaxies. In the averaged images, the team confirmed both x-ray and radio emission from the galaxies without star formation. This is the first time such emission has been detected from distant galaxies more than 10 billion light-years away. Furthermore, the results show that the x-ray and radio emission is too strong to be explained by the galaxies' stars alone, indicating the presence of an active supermassive black hole. This black hole activity signal is weaker for galaxies where star formation is still ongoing.
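The stacking step rests on a simple statistical fact: averaging N independent measurements leaves the signal unchanged while reducing the noise by a factor of sqrt(N), so a source far below the noise in any single image can emerge in the average. A minimal numerical sketch of this, with flux and noise values invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n_galaxies = 10_000   # number of individually undetected galaxies stacked
true_flux = 0.05      # faint per-galaxy flux, in units of the noise level
noise_sigma = 1.0     # noise on a single measurement

# One noisy flux measurement per galaxy: individually the signal is
# buried (S/N = 0.05), far below any detection threshold.
fluxes = true_flux + rng.normal(0.0, noise_sigma, size=n_galaxies)

# Stacking: the mean preserves the signal but averages the noise down
# by sqrt(N) = 100, so S/N rises from 0.05 to about 5, a detection.
stacked_flux = fluxes.mean()
stacked_noise = noise_sigma / np.sqrt(n_galaxies)   # 0.01
```

In the study the stacked quantities are x-ray and radio images rather than single flux values, but the sqrt(N) noise suppression works the same way pixel by pixel.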

These results show that an abrupt end in star formation in the early Universe correlates with increased supermassive black hole activity. More research is needed to determine the details of the relationship.