Moffitt researchers model how self-renewal processes impact skin cell evolution

Subclone size reflects persistence, with larger subclones arising early in life

All normal human tissues acquire mutations over time. Some of these mutations may be driver mutations that promote the development of cancer through increased proliferation and survival, while other mutations may be neutral passenger mutations that have no impact on cancer development. Currently, it is unclear how the normal self-renewal process of the skin, called homeostasis, impacts the development and evolution of gene mutations in cells. In a new study, Moffitt Cancer Center researchers used mathematical and supercomputer modeling to demonstrate the impact of skin homeostasis on driver and passenger mutations.

This figure shows snapshots of simulations at three timepoints, with cross-sections revealing the mosaic nature of mutations in normal epidermis. Each patch of color is a different population of cells carrying the same mutations.

Skin cells undergo a normal life and death cycle of homeostasis. Cells in the lower basal layer proliferate, grow and move into the upper layers of the skin while undergoing cell differentiation and maturation. Eventually, the skin cells migrate into the uppermost layer of the skin where they form a protective barrier, die, and are sloughed off.

Homeostasis is typically maintained in the skin. Its thickness and growth do not significantly change over time, despite the accumulation of mutations. This is different from other tissue types that undergo increased growth and proliferation due to mutations. However, scientists are not sure how cell mutations in the skin evolve and form subclones, or groups of cells derived from a single parent cell, without impacting normal skin homeostasis.

Moffitt researchers developed a supercomputer simulation model to address these uncertainties and improve their understanding of the impact of skin homeostasis on gene mutations and subclone evolution. Supercomputer modeling can address complex biological relationships among cells that cannot be studied in typical laboratory settings. The researchers built their model based on the normal structure of the skin, including a constant cell number based on self-renewal, a constant tissue height, and a constant number of immature stem cells. They incorporated patient mutation data into their model using GATTACA, a tool for introducing and tracking mutations, to assess how mutations and UV exposure impact skin homeostasis and clonal populations. They also investigated the impact of two genes that are commonly mutated in nonmelanoma skin cancer, NOTCH1 and TP53.
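To illustrate the kind of homeostatic, agent-based logic described above, here is a minimal sketch, not the Moffitt team's actual model: a fixed-size basal layer in which every cell death is balanced by a neighboring division, so total cell number stays constant while neutral mutations drift into subclones of varying size. All identifiers, grid dimensions, and rates below are hypothetical.

```python
# Minimal illustrative sketch (NOT the Moffitt model): a homeostatic basal layer where each
# step one cell dies and a random neighbor divides into the vacancy, so cell number is
# constant while neutral mutations drift into subclones. Mutation IDs are arbitrary labels.
import random
from collections import Counter

SIZE = 30          # 30 x 30 basal layer
STEPS = 200_000    # death/replacement events
MUT_RATE = 0.01    # chance a dividing cell gains a new neutral mutation

grid = [[frozenset() for _ in range(SIZE)] for _ in range(SIZE)]
next_mut_id = 0

def neighbors(x, y):
    # von Neumann neighborhood with wrap-around edges
    return [((x + dx) % SIZE, (y + dy) % SIZE)
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]

for _ in range(STEPS):
    x, y = random.randrange(SIZE), random.randrange(SIZE)   # cell that dies
    nx, ny = random.choice(neighbors(x, y))                 # neighbor that divides
    genotype = grid[nx][ny]
    if random.random() < MUT_RATE:                          # daughter gains a mutation
        genotype = genotype | {next_mut_id}
        next_mut_id += 1
    grid[x][y] = genotype

# Subclone sizes: how many cells carry each surviving mutation
sizes = Counter(m for row in grid for cell in row for m in cell)
print("surviving mutations:", len(sizes), "largest subclone:", max(sizes.values(), default=0))
```

Running a toy model like this long enough shows the qualitative pattern the study describes: most new mutations are quickly lost to random replacement, while the few subclones that grow large are typically the ones that arose early and simply persisted.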

“This study prompted the creation of several new tools, such as GATTACA, which allows you to induce and track base pair resolution mutations in any agent-based modeling framework with temporal, spatial, and genomic positional information,” said the study’s lead author Ryan Schenck, Ph.D., a mathematical oncology programmer in Moffitt’s Department of Integrated Mathematical Oncology. “Along with my lab colleague Dr. Chandler Gatenbee, we also developed EvoFreq to help visualize evolutionary dynamics, now being used in many of our publications.”

The researchers demonstrated that both passenger and driver mutations exist in subclones within the skin at similar sizes and frequencies. Most mutations that occur in immature stem cells are lost or persist only in small subclones because of random stem cell death and replacement, while larger subclones likely reflect persistence and older age. Large NOTCH1 and TP53 subclones are rarely observed because they would disrupt the homeostasis of the skin; however, those large subclones that do exist likely arose at an early age.

The researchers used their model to determine when subclones with NOTCH1 and TP53 mutations have a selective fitness advantage over neighboring cells without mutations. Using the model, they showed that subclones with NOTCH1 mutations may prevent neighboring cells from dividing into their positions, while subclones with TP53 mutations may be resistant to cell death from UV exposure. The researchers hope that their model can be used to study other processes impacted by homeostasis that cannot be studied with typical laboratory approaches.

“This work broadens our current understanding of selection and fitness acting in a homeostatic, normal tissue, where subclone size more reflects persistence rather than selective sweeps, with larger subclones being predominately older subclones,” said Alexander Anderson, Ph.D., chair of Moffitt’s Department of Integrated Mathematical Oncology. “This model strives to provide a means to explore mechanisms of increased fitness in normal, homeostatic tissue and provides a simple framework for future researchers to model their hypothesized mechanisms within squamous tissue.”

This study was supported by grants received from the National Cancer Institute (U54CA193489, U54CA217376, P01CA196569, U01CA23238), the Wellcome Trust (108861/Z/15/Z, 206314/Z/17/Z), the Wellcome Centre for Human Genetics (203141/Z/16/Z) and Moffitt’s Center of Excellence for Evolutionary Therapy.

ASU builds ML to supercompute melting points for any compound

If you apply enough heat to a material, at some point most things melt, just like ice cream on a hot summer day.

Engineers rely on this knowledge daily. The exact melting temperature is a critical parameter for building any high-performance material. The construction and safety of bridges, gas turbines and jet engines, and heat shields on aircraft all depend on knowing the performance limits of materials. Materials are also often synthesized or processed in the molten or liquid state, so knowing the melting point is critical to making new materials.

Shift to the field of Earth and planetary science, and melting points reveal clues about Earth’s past and the characteristics of planets in our solar system and of distant exoplanets.

But measuring the melting temperature of a compound or material is an arduous task. That’s why melting temperatures have been measured for less than 10% of the estimated 200,000-plus inorganic compounds.

Melting temperatures are often measured after carefully calibrating crystal structures or by plotting the thermodynamic free energy curves as a material melts, undergoing a phase change from a solid to a liquid. This is analogous to solid ice melting to form liquid water. But when high-temperature materials exceed 2,000 or 3,000 degrees, finding an experimental chamber in which to make the measurements can be a challenge. And sometimes rocks contain complex mixtures of minerals not much larger than a grain of sand, so getting enough sample of a single mineral can also present a challenge. Materials synthesized under extreme conditions of high pressure and temperature are also often available in only very small amounts.
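For readers who want the relation behind the free-energy approach (a standard thermodynamic identity, not something specific to this study): the melting point is where the solid and liquid free-energy curves cross, so G_solid(T_m) = G_liquid(T_m), which is equivalent to T_m = ΔH_fus / ΔS_fus, the ratio of the enthalpy of fusion to the entropy of fusion.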

Now, Arizona State University researchers Qi-Jun Hong, Alexandra Navrotsky, and Sergey Ushakov, together with Axel van de Walle at Brown University, have harnessed the power of artificial intelligence (AI), or machine learning (ML), to demonstrate an easier way to predict melting temperatures for potentially any compound or chemical formula.

“We employ machine learning methods to fill this gap by building a rapid and accurate mapping from chemical formula to melting temperature,” said Hong, assistant professor in the School for Engineering of Matter, Transport, and Energy, within the Ira A. Fulton Schools of Engineering.

“The model we have developed will facilitate large-scale data analysis involving melting temperature in a wide range of areas. These include the discovery of new high-temperature materials, the design of novel extractive metallurgy processes, the modeling of mineral formation, the evolution of Earth over geological time, and the prediction of exoplanet structure.”

Hong’s approach allows melting temperatures to be computed in milliseconds for any compound or chemical formula input. To do so, the research team built a model from an architecture of neural networks and trained their machine learning program on a custom-curated database encompassing 9,375 materials, 982 of which have melting temperatures higher than a scorching 3,100 degrees Fahrenheit (about 2,000 kelvins). Materials at this temperature glow white-hot.
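To make the idea concrete, here is a minimal, hypothetical sketch of a formula-to-melting-point regressor: compositions are encoded as vectors of atomic fractions and a small feed-forward neural network is fit to them. This is not the ASU team's architecture, featurization, or dataset; the handful of approximate literature melting points below are for illustration only.

```python
# Minimal sketch (NOT the ASU model or dataset): featurize a composition as atomic fractions
# and fit a small neural network to approximate literature melting points (illustration only).
import numpy as np
from sklearn.neural_network import MLPRegressor

# toy training set: composition (element -> atomic fraction) and approximate melting point in K
data = [
    ({"Na": 0.5, "Cl": 0.5}, 1074),   # NaCl
    ({"Mg": 0.5, "O": 0.5},  3100),   # MgO (approx.)
    ({"Al": 0.4, "O": 0.6},  2345),   # Al2O3
    ({"W": 1.0},             3695),   # tungsten
    ({"Cu": 1.0},            1358),   # copper
    ({"Si": 1.0},            1687),   # silicon
]

elements = sorted({el for comp, _ in data for el in comp})

def featurize(comp):
    # fixed-length vector of atomic fractions over the element vocabulary
    return [comp.get(el, 0.0) for el in elements]

X = np.array([featurize(comp) for comp, _ in data])
y = np.array([tm for _, tm in data])

model = MLPRegressor(hidden_layer_sizes=(32, 32), solver="lbfgs",
                     max_iter=5000, random_state=0)
model.fit(X, y)

# query a hypothetical unseen composition (a mixed Mg-Al oxide)
print(model.predict([featurize({"Mg": 0.2, "Al": 0.3, "O": 0.5})]))
```

A production model would of course use thousands of compounds and richer elemental descriptors, but the workflow, formula in, temperature out in milliseconds, is the same.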

Hong used this methodology to explore two lines of research: 1) predicting the melting temperatures of nearly 5,000 minerals and 2) finding new materials that have extremely high melting temperatures, above 3,000 kelvins (about 5,000 degrees Fahrenheit).

For the minerals project, Hong’s team was able to predict melting temperatures and correlate these with the known major geological epochs of Earth’s history. These AI-derived melting temperatures were applied to minerals formed since Earth took shape about 4.5 billion years ago. The oldest minerals originate directly from stars or from interstellar and solar nebula condensates that predate Earth’s formation. These are the most refractory, with melting temperatures around 2,600 degrees Fahrenheit.

For the most part, there was a gradual decrease in the calculated melting temperatures of minerals identified on Earth in more recent times, with two major exceptions.

“The gradual overall decrease in the melting temperature of minerals formed during Earth history is interrupted with two anomalies, which are distinctly pronounced in average and median melting temperatures using 250 or 500 million year binning,” said Navrotsky, an ASU professor with joint faculty appointments in the School of Molecular Sciences and the School for Engineering of Matter, Transport and Energy, and director of MOTU, the Navrotsky Eyring Center for Materials of the Universe.

The first anomaly in Earth’s early history came from a dramatic temperature spike caused by a scary and dynamic time of major meteor strikes, including the possible formation of the Moon.

“The spike at 3.75 billion years ago correlates to the proposed timing of late-heavy bombardment, hypothesized exclusively from dating of lunar samples and currently debated,” said Navrotsky.

The team also noticed a large temperature dip in the melting temperatures of minerals around 1.75 billion years ago.

“The dip at 1.75 billion years ago is related to the first known occurrences of a large number of hydrous (water-containing) minerals and correlates with the Huronian glaciation, the longest ice age, thought to be the first time Earth was completely covered in ice.”

With their machine learning program trained to successfully replicate mineral melting across Earth’s early history, the team next turned its attention to finding new materials with extremely high melting temperatures. Dozens of new materials were identified and computationally predicted to have melting temperatures above 5,000 degrees Fahrenheit (3,000 kelvins), more than half the temperature of the Sun’s surface.

The team made their model simple and reliable enough that any user can obtain the melting temperature within seconds for any compound based only on its chemical formula.

“To use the model, a user needs to visit the webpage and input the chemical compositions of the material of interest,” said Hong. “The model will respond with a predicted melting temperature in seconds, as well as the actual melting temperatures of the nearest neighbors (i.e., the most similar materials) in the database. Thus, this model serves as not only a predictive model but a handbook of melting temperature as well.”
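The “nearest neighbor” behavior Hong describes can be pictured with a simple composition-distance lookup, sketched below. The toy database, element vocabulary, and Euclidean distance here are illustrative assumptions, not the tool's actual features or similarity metric.

```python
# Illustrative sketch of a "nearest neighbors" lookup: rank a toy database of composition
# vectors by distance to the query and return the closest materials with their approximate
# melting points. The real tool's featurization and similarity measure may differ.
import numpy as np

ELEMENTS = ["O", "Mg", "Al", "Fe", "Cu", "W"]           # toy element vocabulary
DATABASE = [                                             # name, atomic fractions, approx. Tm (K)
    ("MgO",   {"Mg": 0.5, "O": 0.5}, 3100),
    ("Al2O3", {"Al": 0.4, "O": 0.6}, 2345),
    ("Fe",    {"Fe": 1.0},           1811),
    ("Cu",    {"Cu": 1.0},           1358),
    ("W",     {"W": 1.0},            3695),
]

def vec(comp):
    return np.array([comp.get(el, 0.0) for el in ELEMENTS])

def nearest(query_comp, k=2):
    """Return the k database materials closest (Euclidean distance) to the query composition."""
    q = vec(query_comp)
    ranked = sorted(DATABASE, key=lambda row: np.linalg.norm(vec(row[1]) - q))
    return [(name, tm) for name, comp, tm in ranked[:k]]

print(nearest({"Mg": 0.2, "Al": 0.3, "O": 0.5}))         # hypothetical Mg-Al oxide query
```

Returning the most similar known materials alongside a prediction is what lets the tool double as a “handbook”: the user sees both the estimate and the measured values it is anchored to.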

The model is now publicly available on the ASU webpage: https://faculty.engineering.asu.edu/hong/melting-temperature-predictor/.

The research is supported by the U.S. National Science Foundation under Collaborative Research Awards DMR-2015852 and 2209026 (ASU) and DMR-1835939 and 2209027 (Brown University). The research team was led by Alexandra Navrotsky (principal investigator), professor in SMS and SEMTE (ASU), with Qi-Jun Hong, assistant professor in SEMTE (ASU), Sergey Ushakov, research professor in SMS (ASU), and Axel van de Walle, professor at Brown University.

University of Zurich uses AI to improve treatment in women with heart attacks

Heart attacks are one of the leading causes of death worldwide, and women who suffer a heart attack have a higher mortality rate than men. This has been a matter of concern to cardiologists for decades and has led to controversy in the medical field about the causes and effects of possible gaps in treatment. The problem starts with the symptoms: unlike men, who usually experience chest pain with radiation to the left arm, a heart attack in women often manifests as abdominal pain radiating to the back or as nausea and vomiting. These symptoms are unfortunately often misinterpreted by patients and healthcare personnel – with disastrous consequences.

Risk profile and the clinical picture are different in women

An international research team led by Thomas F. Lüscher, professor at the Center for Molecular Cardiology at the University of Zurich (UZH), has now investigated the role of biological sex in heart attacks in more detail. “Indeed, there are notable differences in the disease phenotype observed in females and males. Our study shows that women and men differ significantly in their risk factor profile at hospital admission,” says Lüscher. When age differences at admission and existing risk factors such as hypertension and diabetes are disregarded, female heart-attack patients have higher mortality than male patients. “However, when these differences are taken into account statistically, women and men have similar mortality,” the cardiologist adds.

Current risk models favor the under-treatment of female patients

In their study, published in the prestigious journal The Lancet, researchers from Switzerland and the United Kingdom analyzed data from 420,781 patients across Europe who had suffered the most common type of heart attack. “The study shows that established risk models which guide current patient management are less accurate in females and favor the undertreatment of female patients,” says first author Florian A. Wenzl of the Center for Molecular Cardiology at UZH. “Using a machine learning algorithm and the largest datasets in Europe, we were able to develop a novel artificial-intelligence-based risk score which accounts for sex-related differences in the baseline risk profile and improves the prediction of mortality in both sexes,” Wenzl says.
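As a rough illustration of the general approach (not the published algorithm, its features, or its data), the sketch below fits a gradient-boosted classifier to synthetic patient records that include sex and baseline risk factors, then returns an individualized mortality-risk estimate. Every variable name and value is a placeholder.

```python
# Generic sketch (NOT the Lancet study's model or data): train a mortality classifier on
# synthetic admission records that include sex and baseline risk factors, then score a patient.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([
    rng.integers(0, 2, n),          # sex (0 = male, 1 = female)
    rng.normal(65, 12, n),          # age at admission (years)
    rng.integers(0, 2, n),          # hypertension
    rng.integers(0, 2, n),          # diabetes
    rng.normal(120, 25, n),         # systolic blood pressure
])
# synthetic outcome: risk driven mainly by age and comorbidities, not by sex itself
logit = -6 + 0.05 * X[:, 1] + 0.6 * X[:, 2] + 0.5 * X[:, 3]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# predicted mortality risk for one hypothetical female patient
patient = [[1, 70, 1, 0, 110]]
print(model.predict_proba(patient)[0, 1])
```

Because sex enters the model alongside the other admission variables, the fitted score can reflect sex-related differences in the baseline risk profile rather than applying one average risk curve to everyone, which is the gap the UZH team set out to close.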

AI-based risk profiling improves individualized care

Many researchers and biotech companies agree that artificial intelligence and Big Data analytics are the next steps on the road to personalized patient care. “Our study heralds the era of artificial intelligence in the treatment of heart attacks,” says Wenzl. Modern computer algorithms can learn from large data sets to make accurate predictions about the prognosis of individual patients – the key to individualized treatments.

Thomas F. Lüscher and his team see huge potential in the application of artificial intelligence for the management of heart disease both in male and female patients. “I hope the implementation of this novel score in treatment algorithms will refine current treatment strategies, reduce sex inequalities, and eventually improve the survival of patients with heart attacks – both male and female,” says Lüscher.

Japan takes a step closer to probabilistic supercomputing with spin-transfer torque in superparamagnetic tunnel junctions

Tohoku University scientists in Japan have developed a mathematical description of what happens within tiny magnets as they fluctuate between states when an electric current and magnetic field are applied. Their findings could act as the foundation for engineering more advanced supercomputers that can quantify uncertainty while interpreting complex data.

Classical computers have gotten us this far, but there are some problems that they cannot address efficiently. Scientists have been working to address this by engineering computers that can utilize the laws of quantum physics to recognize patterns in complex problems. But these so-called quantum supercomputers are still in the early stages of development and are highly sensitive to their surroundings, requiring extremely low temperatures to function.

Now, scientists are looking at something different: a concept called probabilistic supercomputing. This type of supercomputer, which could function at room temperature, would be able to infer potential answers from complex input. A simple example of this type of problem would be to infer information about a person by looking at their purchasing behavior. Instead of the computer providing a single, discrete result, it picks out patterns and delivers a good guess of what the result might be.

(left) Bird’s-eye view of a superparamagnetic tunnel junction device. (right) Top-view scanning electron microscope image of the actual device.

There could be several ways to build such a supercomputer, but some scientists are investigating the use of magnetic tunnel junctions. These are made from two layers of magnetic metal separated by an ultrathin insulator. When these nanomagnetic devices are thermally activated under an electric current and magnetic field, electrons tunnel through the insulating layer. Depending on their spin, they can cause changes, or fluctuations, within the magnets. These fluctuations, called p-bits, which are the alternative to the on/off or 0/1 bits we have all heard about in classical computers, could form the basis of probabilistic supercomputing. But to engineer probabilistic computers, scientists need to be able to describe the physics that happens within magnetic tunnel junctions.
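For readers who want to see the p-bit idea in code, the sketch below uses a common textbook formulation from the probabilistic-computing literature, in which a p-bit's average state follows a tanh of its input while each individual readout is random. It is not the Tohoku group's device model, and the "input" here is a dimensionless stand-in for the bias a real junction would receive from current and field.

```python
# Common textbook picture of a p-bit (not the Tohoku device equations): the bit's average
# state follows tanh(input), but every readout is a random +1/-1 sample, which is exactly the
# behavior a fluctuating superparamagnetic tunnel junction provides in hardware.
import math
import random

def p_bit(drive):
    """Return +1 or -1 with a probability biased by the (dimensionless) drive."""
    prob_up = 0.5 * (1 + math.tanh(drive))
    return 1 if random.random() < prob_up else -1

# weak drive: output fluctuates roughly 50/50; strong drive: output is nearly pinned
for drive in (0.0, 0.5, 2.0):
    samples = [p_bit(drive) for _ in range(10_000)]
    print(f"drive {drive:+.1f}: mean state {sum(samples) / len(samples):+.3f}")
```

Designing hardware p-bits means knowing quantitatively how current and field tilt those fluctuation probabilities, which is the "switching exponent" question the Tohoku team addressed.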

This is precisely what Shun Kanai, a professor at Tohoku University’s Research Institute of Electrical Communication, and his colleagues have achieved.

"We have experimentally clarified the 'switching exponent' that governs fluctuation under the perturbations caused by the magnetic field and spin-transfer torque in magnetic tunnel junctions," says Kanai. "This gives us the mathematical foundation to implement magnetic tunnel junctions into the p-bit to sophisticatedly design probabilistic computers. Our work has also shown that these devices can be used to investigate unexplored physics related to thermally activated phenomena."

Arkansas researchers show how climate change is increasing frequency of fish mass die-offs

A study of die-offs finds that air and water temperatures are reliable predictors of fish mass mortality events and projects significant increases in the frequency of such events.

As the planet’s climate has gotten warmer, so has the prevalence of fish die-offs or mass mortality events. These die-offs can have severe impacts on the function of ecosystems, imperil existing fish populations and reduce the global food supply. And the frequency of these events appears to be accelerating, with potentially dire consequences for the world if global carbon emissions are not substantially reduced over the 21st century.

Those are the findings of a recent paper co-authored by two members of the University of Arkansas Department of Biological Sciences: doctoral student Simon Tye and associate professor Adam Siepielski, along with several of their colleagues.

The paper, “Climate warming amplifies the frequency of fish mass mortality events across north temperate lakes,” compiled 526 documented cases of fish die-offs that occurred across Minnesota and Wisconsin lakes between 2003 and 2013. The researchers determined there were three main drivers of these events: infectious diseases, summerkills, and winterkills.

The researchers then narrowed their focus to summerkills — fish mortalities associated with warm temperatures. They found a strong relationship between local air and water temperatures and the occurrence of these events, meaning they increased in frequency as temperature increased. Moreover, their models that used either air or water temperature provided similar results, which is important because air temperature data is more widely available than water temperature data across the world.
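As a rough illustration of this kind of analysis (not the authors' actual statistical model), the sketch below fits a Poisson regression of synthetic summerkill counts against temperature and then uses it to project how event frequency scales under warmer conditions. All numbers are placeholders.

```python
# Illustrative sketch (NOT the paper's model): relate counts of summerkill events to summer
# temperature with a Poisson regression, then project event frequency under warmer conditions.
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(1)
summer_temp_c = rng.normal(24, 2, 300)            # synthetic summer air temperatures (deg C)
expected = np.exp(-8 + 0.35 * summer_temp_c)      # warmer summers -> more expected events
events = rng.poisson(expected)                    # synthetic observed counts of die-offs

model = PoissonRegressor(alpha=0).fit(summer_temp_c.reshape(-1, 1), events)

# project how the expected number of events scales if summers warm by 3 degrees C
current = model.predict([[24.0]])[0]
warmer = model.predict([[27.0]])[0]
print(f"relative increase in expected events: {warmer / current:.1f}x")
```

Because the count model is exponential in temperature, even a few degrees of warming multiplies the expected number of events, which is why the projected increases later in the article are so large.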

Finally, with a historical baseline established, the team used air and water temperature-based models to predict frequencies of future summerkills.

The results were sobering. Based on local water temperature projections, the models predicted an approximately six-fold increase in the frequency of fish mortality events by 2100, while local air temperature projections predicted a 34-fold increase. Importantly, these predictions were based on temperature projections from the most severe climate change scenario, which was the only scenario with the necessary data for these analyses.

As Tye explained, “If there are eight summerkills per year now, the models suggest we could have about 41 per year based on water temperature estimates or about 182 per year based on air temperature estimates.”

“We think predictions from the water temperature model are more realistic, whereas predictions from the air temperature model indicate we need to better understand how and why regional air and water temperature estimates differ over time to predict how many mortality events may occur.”

Nevertheless, their models reveal strong associations between rising temperatures and frequencies of ecological catastrophes.

Though the study used data related to temperate northern lakes, Tye said the study is pertinent to Arkansas. “One of the findings of the paper is that similar deviations in temperature affect all types of fish, such that a regional heatwave could lead to mortalities of both cold- and warm-water fish,” he said.

“Specifically, climate change is more than gradually increasing temperatures because it also increases temperature variation, such as we experienced much of this summer,” he explained. “In turn, our findings suggest these rapid changes in temperature affect a wide range of fish regardless of their thermal tolerance.”

Siepielski added, “This work is important because it demonstrates the feasibility of using readily obtainable data to anticipate fish die-offs.

“As with many examples of how climate warming is negatively affecting wild animal populations, this work reveals that temperature extremes can be particularly detrimental.”

“The large scale of the project, using thousands of lakes and over a million air and temperature data points, is particularly impressive,” Siepielski added. “Lakes outside the study area, including those in Arkansas and surrounding areas, are not likely to be immune to these events increasing in frequency.”

Siepielski encouraged citizens of Arkansas to help document these events when they find them.