JAIST professor Maezono discovers a new crystal structure as a candidate for superconductivity

Superconductivity refers to the complete loss of electrical resistance in a material and typically persists only at extremely low temperatures (below about -200°C). The temperature below which superconductivity appears, known as the transition temperature, has been a limiting factor in the application of superconductors, and finding materials that display superconductivity at higher temperatures has become an important research objective worldwide.

Image: Newly discovered crystal structure of a superconductor predicted to realize a higher transition temperature. CREDIT: Ryo Maezono, JAIST.

A breakthrough has been made with hydrogen-rich compounds (called hydrides) containing rare earth or alkaline earth metals, which display near-room-temperature superconductivity at high pressures (100-200 GPa). These metal hydrides have cage-like structures of hydrogen atoms stacked on top of each other, enabling them to withstand the high pressures required for the superconductivity phenomenon. Many of the crystal structures and compositions for these hydrides have been predicted by combining various binary metal hydrides.

Now, in a study published in The Journal of Physical Chemistry on January 26, 2022, a group of researchers led by Professor Ryo Maezono from the Japan Advanced Institute of Science and Technology (JAIST) has used a supercomputer to make similar predictions for viable high-temperature ternary metal hydride superconductors containing magnesium (Mg), an alkaline earth metal, and scandium (Sc), a rare earth element. “MgH2 and ScH2 are known to be stable phases at ambient pressure. Therefore, MgH2 and ScH2 can be used to chemically synthesize the ternary Mg−Sc−H compounds”, explains Prof. Maezono.

For their search, the researchers initially started with certain Mg-Sc-H compounds (MgSc3Hx, MgSc2Hx, MgScHx, Mg2ScHx, and Mg3ScHx, where x = 2−12, 14, 16, and 18). Starting with random initial structures, they used the supercomputer to determine possible combinations and crystal structures that would result in a valid superconductor in a pressure range of 100-200 GPa.
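The candidate grid described above can be sketched in a few lines. The labels and layout below are illustrative only; the actual study ran sophisticated random structure searches on a supercomputer rather than a simple enumeration like this one.

```python
from itertools import product

# Hypothetical sketch of the candidate grid described above: five Mg:Sc
# ratios paired with the listed hydrogen counts. This only enumerates
# compositions; it performs no structure prediction.
mg_sc_ratios = [(1, 3), (1, 2), (1, 1), (2, 1), (3, 1)]
h_counts = list(range(2, 13)) + [14, 16, 18]  # x = 2-12, 14, 16, 18

candidates = [f"Mg{mg}Sc{sc}H{x}"
              for (mg, sc), x in product(mg_sc_ratios, h_counts)]
print(len(candidates))  # 5 ratios x 14 hydrogen counts = 70 compositions
```

Even this modest grid yields 70 compositions, each of which admits many possible crystal structures, which is why a supercomputer search is needed.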

To achieve superconductivity, the predicted compound must meet certain conditions: it must be thermodynamically stable, i.e., it cannot degrade into its elementary components, have a high transition temperature, have a valid synthesis route, and possess a structure capable of withstanding high pressures where the phenomenon takes place. In the simulations, four hydrogen-rich structures were found to meet the criteria: R3̅m-MgScH6, C2/m-Mg2ScH10, Immm-MgSc2H9, and Pm3̅m-Mg-(ScH4)3.

Out of these crystal structures, R3̅m-MgScH6 was found to have the highest transition temperature: 23.3 K at 200 GPa and 41 K at 100 GPa. The compound was found to possess a hexagonal crystal structure, in which each Mg and Sc atom is surrounded by 14 H atoms (Figure 1). The transition temperature was, however, much lower than that of the binary hydride counterparts (LaH10 and YH10); this was attributed to a low density of states at the Fermi level resulting from the lower hydrogen content.

Among the metal hydrides, ternary metal hydrides, in which hydrogen is bonded to two other metals, are promising candidates for low-pressure, room-temperature superconductivity. Predicting the appropriate elements and crystal structure for a superconducting ternary hydride has, however, been a challenging and time-consuming process because of the large number of possible metal combinations. With the help of supercomputers, researchers are now able to quickly determine potential superconducting candidates. The discovery of the Mg-Sc-H compounds as valid superconductors is the third such prediction for ternary hydrides made by the research group using computer simulations. “This is the third news with 'Mg/Sc' compounds following the preceding findings with 'La/Y' in December 2021 and 'Y/Mg' in January 2022. New findings are being launched one after another,” says Prof. Maezono.

Despite having low transition temperatures, the predicted Mg-Sc-H compounds remain stable at pressures that are lower than those normally observed for high-temperature superconductors. Simulations like these are enabling researchers to understand the contributions of each element towards the superconductivity phenomenon, accelerating the development of high-temperature superconductors.

UK-built models show a low risk of COVID spread on Underground rail

People traveling on the London Underground and similar rail systems were at low risk of being exposed to the virus that causes COVID-19, according to supercomputer simulations.   

The modeling carried out before the emergence of the Omicron variant found that risks were reduced when ventilation was good and passengers complied with COVID-19 mitigation measures. 

These included wearing a face covering or mask; maintaining a social distance from other travelers; regularly washing or sanitizing hands; and encouraging people who had COVID-19 symptoms to stay at home.  

Scientists and engineers from the University of Leeds, the UK's Defence Science and Technology Laboratory (DSTL), and the University of Manchester developed the model to identify the relative risks of the virus spreading on a mass transit system used for short commuter journeys.        

Professor Cath Noakes, from the University of Leeds and Principal Investigator on the study, said: “All environments where people interact have a risk of virus transmission, and public transport is no exception. Where journeys are short and not overcrowded, and the carriage is well ventilated, then the risks are likely to be quite low.

“Wearing a face covering can significantly reduce risk of the virus spreading, particularly as it can be harder to socially distance in a Tube or subway carriage at certain times of the day.  

“Even though there may be a small chance of transmission by touching a contaminated surface, this can be managed through regular hand hygiene and avoiding touching your eyes, nose, and mouth.     

“The results show that compliance with good mitigation measures is likely to be effective in reducing infection.”   

The computer model simulated the risks of people being exposed to the virus through the main routes of transmission: being within two meters of an infectious person; touching a contaminated surface and then touching their nose, mouth, or eyes; or breathing in viral particles that linger in the air, known as aerosol inhalation.
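As a rough illustration of how exposure from several routes might be combined, here is a toy additive dose model. Every function name, parameter name, and value below is a hypothetical placeholder; none of them come from the published TRACK model.

```python
# A deliberately simplified, illustrative dose model for the three routes
# described above (close-range droplets, surface contact, aerosol
# inhalation). All rates are invented placeholder values.

def total_dose(close_contact_s, touches, breathing_m3,
               near_field_rate=1.0,   # dose units per second within 2 m
               dose_per_touch=0.2,    # dose units per contaminated-surface touch
               aerosol_conc=0.05):    # dose units per m^3 of shared air breathed
    """Sum exposure accumulated over the three transmission routes."""
    droplet = near_field_rate * close_contact_s
    fomite = dose_per_touch * touches
    aerosol = aerosol_conc * breathing_m3
    return droplet + fomite + aerosol

# A short, uncrowded, well-ventilated journey accumulates little dose:
print(total_dose(close_contact_s=0, touches=2, breathing_m3=0.5))
```

The additive structure also shows why the highest modeled doses go to the few passengers standing next to an infected person: the close-contact term dominates the other two routes.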

Presenting their findings in the journal Indoor Air, the scientists said: “...the risk of exposure to the virus was predicted to be low through all routes of transmission. The highest modeled doses (of the virus) were to a small proportion of people in close proximity to an infected person, which is through a combination of aerosol inhalation and direct droplet deposition.”      

Practical steps to reduce viral spread     

The scientists underline the effectiveness of continuing with well-established mitigation measures. They set out the practical steps passengers and transport operators could take to reduce the risk of staff and passengers being exposed to the virus:    

  • COVID-19 spreads more easily when people are close together, within 1-2 meters of one another. At times when COVID-19 infection rates in the community are high, measures to reduce crowding on public transport could reduce infection.
  • The greater the number of infectious passengers, the greater the risks to other travelers. The researchers said public health policies should encourage people who are infectious to stay at home. 
  • High levels of mask-wearing reduce the level of virus people are exposed to.
  • The modeling also predicted that a small minority of travelers could be exposed to a large dose of the virus by touching contaminated surfaces. Hand sanitation facilities located near high-touch points, as people get on and off trains or near escalators, could reduce this risk.  

Writing in the journal, the researchers added: “To date, there is no evidence that public transport is a major driver for the pandemic but as a shared enclosed setting where people may be at close proximity, transmission is possible and understanding the factors that influence the likelihood of transmission is important for introducing and managing effective mitigation strategies.     

“This is particularly important as public transport is a necessity for many people, and it can be an environment where social distancing is difficult to maintain, particularly in dense urban transport systems.”  

On 27 January the Government in England lifted the mandatory requirement to wear a face covering in a range of settings, including on public transport, but guidance suggests that they should still be worn in crowded and enclosed spaces. However, the Mayor of London said masks and face coverings will remain mandatory on all Transport for London services and in stations, including the Underground, unless travelers are exempt.

In Scotland, Wales, and Northern Ireland there is still a legal requirement to wear a face-covering on public transport. 

Dr. Martin Lopez-Garcia, a mathematical modeler from the University of Leeds and co-author of the paper, said: “Measuring transmission on public transport systems is challenging. Our model provides an insight into the different factors that are likely to influence risk and should be used to effectively plan strategies that reduce the transmission of the virus.” 

The computer model was developed independently by researchers as part of the £1.7 million TRACK project, funded by the Department for Transport and the Engineering and Physical Sciences Research Council.  TRACK is a multi-strand investigation into the COVID-19 risks on public transport - buses, trams, and trains - and the ways those risks could be reduced.  

Dr. Simon Parker, a modeler from the Defence Science and Technology Laboratory (DSTL) and co-author, said: “Building a model of virus transmission for this environment requires a detailed understanding of the unique features of public transport spaces and how people use them. The results are complex but fascinating and, we hope, valuable. They reflect the interactions between disease prevalence, passenger behavior and the environment itself.”     

Are there lessons for reducing Omicron risks? 

The research team point to the limitations of their model. It does not take account of vaccination rates among the traveling public, and the model was created before the arrival of the Delta and Omicron variants. The modeling also did not analyze viral spread in very overcrowded carriages, which could occur as large numbers of people return to their offices and workplaces.

However, Professor Noakes believes mitigation measures will continue to reduce risks. 

She said: “The Omicron variant is more transmissible and the risks of exposure in different settings including on public transport are not yet clear. However, the mitigations identified in the study are still likely to be effective at reducing the risk of exposure to the virus.”   

Professor Charlotte Deane, Deputy Executive Chair of the EPSRC, said: “These findings demonstrate the value of developing models to assess how COVID-19 spreads, and are a valuable contribution to the evidence base. They also reinforce the point that measures such as wearing face coverings, good ventilation, and hand sanitation are likely to play an important role in limiting the spread of the virus.”

Brown researchers use tiny magnetic swirls to generate true random numbers

Skyrmions, tiny magnetic anomalies that arise in two-dimensional materials, can be used to generate true random numbers useful in cryptography and probabilistic computing.

Image: Magnetic swirls called skyrmions fluctuate randomly in size, a behavior that can be harnessed to generate true random numbers. Credit: Xiao lab / Brown University

Whether for use in cybersecurity, gaming, or scientific simulation, the world needs true random numbers, and generating them is harder than one might think. Now, a group of Brown University physicists has developed a technique that can potentially generate millions of random digits per second by harnessing the behavior of skyrmions — tiny magnetic anomalies that arise in certain two-dimensional materials.

Their research reveals previously unexplored dynamics of single skyrmions, the researchers say. Discovered around a half-decade ago, skyrmions have sparked interest in physics as a path toward next-generation computing devices that take advantage of the magnetic properties of particles — a field known as spintronics.  

“There has been a lot of research into the global dynamics of skyrmions, using their movements as a basis for performing computations,” said Gang Xiao, chair of the Department of Physics at Brown and senior author of the research. “But in this work, we show that purely random fluctuations in the size of skyrmions can be useful as well. In this case, we show that we can use those fluctuations to generate random numbers, potentially as many as 10 million digits per second.”

Most random numbers produced by computers aren’t random in the strictest sense. Computers use an algorithm to generate random numbers based on an initial starting place, a seed number. But because the algorithm used to generate the number is deterministic, the numbers aren’t truly random. With enough information about the algorithm or its output, it could be possible for someone to find patterns in the numbers that the algorithm produces. While pseudorandom numbers are sufficient in many settings, applications like data security — which uses numbers that can’t be guessed by an outside party — require truly random numbers.
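The determinism of seeded pseudorandom generators is easy to demonstrate, for example with Python's standard library:

```python
import random

# Pseudorandom generators are deterministic: the same seed reproduces
# exactly the same "random" sequence, which is why the output is
# predictable to anyone who knows the algorithm and the seed.
rng_a = random.Random(42)
rng_b = random.Random(42)

seq_a = [rng_a.randint(0, 9) for _ in range(8)]
seq_b = [rng_b.randint(0, 9) for _ in range(8)]
print(seq_a == seq_b)  # True: identical seeds give identical sequences
```

A true random number generator, by contrast, has no seed to recover: its output comes from a physical process rather than an algorithm.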

Methods of producing truly random numbers often draw on the natural world. Random fluctuations in the electrical current flowing through a resistor, for example, can be used to generate random numbers. Other techniques harness the inherent randomness in quantum mechanics — the behavior of particles at the tiniest scale.

This new study adds skyrmions to the list of true random number generators.

Skyrmions arise from the “spin” of electrons in ultrathin materials. Spin can be thought of as the tiny magnetic moment of each electron, which points up, down, or somewhere in between. Some two-dimensional materials, in their lowest energy states, have a property called perpendicular magnetic anisotropy — meaning the spins of electrons all point in a direction perpendicular to the film. When these materials are excited with electricity or a magnetic field, some of the electron spins flip as the energy of the system rises. When that happens, the spins of surrounding electrons are perturbed to some extent, forming a magnetic whirlpool surrounding the flipped electron — a skyrmion.

Skyrmions, which are generally about 1 micrometer (a millionth of a meter) or smaller in diameter, behave a bit like a kind of particle, zipping across the material from side to side. And once they’re formed, they’re very difficult to get rid of. Because they’re so robust, researchers are interested in using their movement to perform computations and to store data.

This new study shows that in addition to the global movement of skyrmions across a material, the local behavior of individual skyrmions can also be useful. For the study, which was led by Brown postdoctoral fellow Kang Wang, the researchers fabricated magnetic thin films using a technique that produced subtle defects in the material’s atomic lattice. When skyrmions form in the material, these defects, which the researchers call pinning centers, hold the skyrmions firmly in place rather than allowing them to move as they normally would.

The researchers found that when a skyrmion is held in place, it fluctuates randomly in size. With one section of the skyrmion held tightly to one pinning center, the rest of the skyrmion jumps back and forth, wrapping around two nearby pinning centers, one closer and one farther away.

“Each skyrmion jumps back and forth between a large diameter and a small diameter,” Wang said. “We can measure that fluctuation, which occurs randomly, and use it to generate random numbers.”

The change in skyrmion size is measured through what’s known as the anomalous Hall effect, a transverse voltage across the material that is sensitive to the perpendicular component of the electron spins. When the skyrmion size changes, the voltage changes by an easily measurable amount. Those random voltage changes can be used to produce a string of random digits.
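One common way to turn such a fluctuating signal into usable random bits is to threshold it and then remove residual bias. The sketch below is illustrative only, using simulated noise and the classic von Neumann extractor; it is not the processing pipeline used by the Brown team.

```python
import random

# Illustrative sketch: threshold a fluctuating signal into raw bits,
# then debias with the von Neumann extractor (keep 01 -> 0, 10 -> 1,
# discard 00 and 11 pairs).

def bits_from_signal(samples, threshold):
    return [1 if v > threshold else 0 for v in samples]

def von_neumann(bits):
    out = []
    for a, b in zip(bits[0::2], bits[1::2]):
        if a != b:          # only unequal pairs carry an unbiased bit
            out.append(a)
    return out

# Placeholder noise standing in for the measured Hall-voltage samples:
rng = random.Random(0)
samples = [rng.gauss(0.0, 1.0) for _ in range(1000)]
raw = bits_from_signal(samples, threshold=0.0)
unbiased = von_neumann(raw)
print(len(raw), len(unbiased))
```

The extractor trades throughput for quality: at least half the raw bits are discarded, but the surviving bits are unbiased even if the threshold is imperfectly centered.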

The researchers estimate that by optimizing the defect spacing in their device, they could produce as many as 10 million random digits per second, providing a new and highly efficient method of generating truly random numbers.

“This gives us a new way of generating true random numbers, which could be useful for many applications,” Xiao said. “This work also gives us a new way of harnessing the power of skyrmions, by looking at their local dynamics as well as their global movements.”

Nebraska scientist in race to create immune system 'digital twin'

Tomas Helikar inspired by his son's double lung transplant at nine weeks

The renewal of a National Institutes of Health grant will enable a University of Nebraska-Lincoln researcher to continue developing a tool that illuminates the complex, multi-scale interplay of the immune system’s many components.

Image: Tomas Helikar, Susan J. Rosowski Associate Professor of Biochemistry at the University of Nebraska-Lincoln, is among a group of scientists across the U.S. pursuing a virtual immune system.

Tomas Helikar, Susan J. Rosowski Associate Professor of biochemistry, will use the five-year, $1.8 million grant from the NIH’s Maximizing Investigators’ Research Award program to advance his work on a virtual immune system aimed at increasing our understanding of immune-related diseases and ramping up the speed and efficiency of drug development. 

Helikar thinks the model could greatly reduce the duration and cost of a drug’s journey from the lab to the marketplace, which often lasts more than 10 years and costs roughly $1.3 billion.

“By being able to map out, model, and simulate the human immune system, the goal is that we can identify more effective drug targets and eliminate the bad hypotheses that may take you down a rabbit hole that doesn’t go anywhere,” Helikar said.

The model would fill a gap in human health science. Though the immune system is arguably one of the most complex “machines” in the human body, there’s no computer representation of it that allows scientists to test hypotheses in a low-stakes environment.

“One way to think about it is that we have models of engines and rockets that we simulate before we put them to use, and we’re able to predict the different parameters that will make them work,” he said. “On the human health side, we don’t really have that equivalent. Our long-term goal is to develop a tool like that.”

It’s an ambitious task: The immune system comprises organs, tissues, antibodies, cells, genes, and more, all constantly influencing one another’s behavior. These interactions play out across different scales: in different parts of the body, at different points in time, and at different levels of organization.

To demonstrate the viability of including each of the scales in a single model, Helikar and a team of Husker collaborators — whose expertise includes software and technology development, immunology, biology, biochemistry, and beyond — used the first installment of the MIRA grant to build computational methods and a tool focused on just one type of immune cell. They selected CD4+ T cells, which are “helpers” in the immune system that stimulate other cells to fight pathogens. 

As detailed in a study recently published in PLOS Computational Biology, the team successfully built a CD4+ T cell-based model incorporating four different mathematical approaches, three spatial scales, and different immune tissues.

The team’s success in launching that model was key, Helikar said, because it established a method for mathematically and computationally connecting the immune system’s different scales. But expanding the model to include more types of cells, molecules, genes, and organs will require linking together an even greater number of mathematical approaches in a computationally cost-effective way. Clearing that hurdle by improving the speed and efficiency of the model’s algorithms is a major goal of the next five years.
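The general idea of coupling a fast, fine-grained scale to a slower, coarse-grained one can be caricatured with two coupled differential equations. The variables, rates, and dynamics below are invented purely for illustration and bear no relation to the published CD4+ T cell model.

```python
# Toy two-scale coupling (hypothetical): a fast intracellular activation
# variable a(t) relaxes toward an antigen stimulus, and its value feeds
# the growth rate of a slower population-level cell count N(t). Both are
# integrated together with simple Euler steps.

def simulate(steps=1000, dt=0.01, antigen=1.0):
    a, N = 0.0, 1.0
    for _ in range(steps):
        da = antigen - a            # fast scale: activation tracks stimulus
        dN = 0.5 * a * N - 0.1 * N  # slow scale: activated cells proliferate
        a += dt * da
        N += dt * dN
    return a, N

a, N = simulate()
print(a, N)
```

The difficulty Helikar describes is that a realistic model couples many such layers with different mathematical formalisms, so naive co-simulation quickly becomes computationally prohibitive.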

Helikar also plans to enable the model to account for the physiology of a person or a particular demographic group. This step may open the door to personalized medicine, where doctors can tailor drug regimens according to a particular patient’s immune function.  

Though the project’s scale is daunting, Helikar is motivated by his experiences as a father. In 2014, his son was born with a rare genetic mutation that required a double lung transplant at nine weeks old – making him the second-youngest human ever to undergo that procedure.

The transplant has thus far gifted Helikar’s son with seven years of life — two years beyond the average life expectancy for lung transplant recipients. But one cost of the procedure is a compromised immune system: When a patient undergoes a transplant, their immune system can view the new organ as an invader and attack it. 

To blunt that response, transplant recipients take immunosuppressive medications. But those drugs weaken the immune system universally, making patients more susceptible to infectious diseases and cancer.

The trick, Helikar said, is to fine-tune the immune system so it doesn’t destroy the transplanted organ, but retains its ability to protect recipients from everything else. Seeing first-hand the need to strike this delicate balance fueled his research ambitions.   

“With my son, that made me laser-focused on the immune system,” Helikar said. “I see the importance of understanding fully how the immune system works and how we can actually rewire it and reprogram it to do what we need it to do.”

With support from the University of Nebraska Collaboration Initiative, Helikar is partnering with doctors from the University of Nebraska Medical Center, including a liver transplant expert, to explore how his model can help transplant recipients.

JAIST increases the accuracy of atomic force calculations with the space-warp coordinate transformation

Japanese researchers have developed a method to accurately calculate atomic forces for elements with high atomic numbers using quantum Monte Carlo simulations.

Figure 1: Plot of the force-to-energy variance ratio as a function of atomic number, Z. The figure shows the ratios between the force and energy variances evaluated for several molecules. The ratio scales as Z^2.5 without the SWCT in the QMC calculations, while it is independent of Z with the SWCT in both the VMC and LRDMC calculations, indicating that the computational cost of QMC forces with respect to Z is no worse than that of the energy. Indeed, the accessible system size is not affected by QMC force calculations when the SWCT variance-reduction technique is applied. Photo courtesy: Kousuke Nakano, JAIST.

Atomic forces are primarily responsible for the motion of atoms and their versatile arrangement patterns, which is unique for different types of materials. Atomic simulation methods are a popular choice for the calculation of these forces, the understanding of which can vastly enhance existing knowledge on how to improve a material’s function at an atomic level.

Quantum Monte Carlo (QMC) methods are high-precision, state-of-the-art simulation methods, which are used to obtain many-body wave functions, essential for the calculation of atomic forces. These methods have recently gained importance, owing to their ability to simulate the microscopic behavior of matter with extremely high accuracy, and for overcoming the disadvantages of conventional simulation methods.

The two major QMC methods are the variational Monte Carlo (VMC) method and the fixed-node diffusion Monte Carlo (FNDMC) method. FNDMC generally provides more accurate results than VMC, since the latter is highly dependent on the quality of the trial wave function used to approximate quantities such as the ground-state wave function.

While QMC methods are reliable for calculating the ground-state energies of atoms, they incur very high computational costs for the atomic forces of elements with higher atomic numbers. Moreover, there is no consensus among the research community on an efficient algorithm for QMC force evaluation, which should ideally scale well with both the number of electrons and the atomic number. This leads to inaccuracies in atomic force calculations for elements with high atomic numbers.

To address this shortcoming, a group of researchers led by Assistant Professor Kousuke Nakano from the Japan Advanced Institute of Science and Technology recently proposed a space-warp coordinate transformation (SWCT) method to reduce the computational costs incurred while calculating atomic forces.

For this study, which was published in The Journal of Chemical Physics, the team utilized the TurboRVB software, developed by Dr. Nakano and colleagues, which implements a lattice-regularized version of the FNDMC method, known as the LRDMC method. Using this software, they calculated the all-electron VMC and LRDMC forces, with and without the space-warp coordinate transformation, for the dimers H2, Li2, N2, F2, P2, S2, Cl2, and Br2. Furthermore, they calculated the LRDMC forces using the Reynolds (RE) and variational-drift (VD) approximation methods.

An analysis of these calculations revealed that the potential energy surfaces derived from the LRDMC method gave equilibrium bond lengths and harmonic frequencies very close to previously reported experimental values for the dimers, improving on the corresponding VMC results. Moreover, the LRDMC force calculations using both the RE and VD approximations improved on the VMC forces, although the VD approximation resulted in high computational costs.

“Our findings indicate that the application of the so-called space-warp coordinate transformation (SWCT) is essential to reduce the computational cost of forces in QMC. Specifically, the ratio of computational costs between QMC energy and forces scales as Z^2.5 without the SWCT, where Z is the atomic number. In contrast, the application of the SWCT makes the ratio independent of Z,” says Dr. Nakano while discussing the findings of this study.

This is one of the most important findings of the study, suggesting that the application of SWCT methods yields a constant ratio of the computational cost of force to energy, regardless of the atomic number of the material.
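The practical impact of that scaling is easy to see numerically. In the sketch below the prefactor is a hypothetical placeholder (set to 1); only the Z dependence follows the reported result.

```python
# Illustrative arithmetic for the reported scaling: without the SWCT the
# force-to-energy cost ratio grows like Z**2.5, while with the SWCT it
# stays roughly constant. Prefactors are placeholder values.

def ratio_without_swct(Z):
    return Z ** 2.5

def ratio_with_swct(Z):
    return 1.0  # independent of atomic number

for Z in (1, 9, 35):  # e.g. H, F, Br
    print(Z, ratio_without_swct(Z) / ratio_with_swct(Z))
```

For bromine (Z = 35) the unmitigated ratio exceeds 7,000, which illustrates why heavy elements were previously out of reach for QMC force calculations.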

“The accurate forces obtained by QMC could contribute to in-silico materials design, for designing important materials such as medicines and molecular catalysts,” states Dr. Nakano, highlighting the real-life applications of this study.

Thus, the SWCT method can enhance the application of the quantum Monte Carlo method to calculate atomic forces for substances that cannot be dealt with by conventional approaches – which will provide much-needed relief to researchers who rely on these conventional methods. These findings highlight the accuracy and reliability of the SWCT method and its utility for the development of important technologies in the field of material science.