University of Michigan uncovers electronic, geometric descriptors of chemical activity for metal alloys, oxides using unsupervised machine learning

In a finding that could help pave the way toward cleaner fuels and more sustainable chemical industry, researchers at the University of Michigan have used machine learning to predict how the compositions of metal alloys and metal oxides affect their electronic structures. 

The electronic structure is key to understanding how the material will perform as a mediator, or catalyst, of chemical reactions.

"We're learning to identify the fingerprints of materials and connect them with the material's performance," said Bryan Goldsmith, the Dow Corning Assistant Professor of Chemical Engineering.

A better ability to predict which metal and metal oxide compositions are best for guiding which reactions could improve large-scale chemical processes such as hydrogen production, production of other fuels and fertilizers, and manufacturing of household chemicals such as dish soap.

"The objective of our research is to develop predictive models that will connect the geometry of a catalyst to its performance. Such models are central for the design of new catalysts for critical chemical transformations," said Suljo Linic, the Martin Lewis Perl Collegiate Professor of Chemical Engineering. 

One of the main approaches to predicting how a material will behave as a potential mediator of a chemical reaction is to analyze its electronic structure, specifically the density of states. This describes how many quantum states are available to the material's electrons and the energies of those states.

Usually, the electronic density of states is described with summary statistics: the average energy of the states, a skew value that reveals whether more electronic states sit above or below that average, and so on.
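To make "summary statistics" concrete, here is a minimal Python sketch that computes the kind of moments typically used, the band center, the variance, and the skew, on a made-up, Gaussian-shaped density of states rather than the paper's data:

```python
import numpy as np

# Toy example, not the paper's data: a density of states sampled on an energy grid.
energies = np.linspace(-10.0, 5.0, 500)              # energies in eV
dos = np.exp(-0.5 * ((energies + 2.0) / 1.5) ** 2)   # made-up, d-band-like DOS

# Treat the normalized DOS as a distribution over energy and take its moments.
w = dos / dos.sum()
mean_energy = (w * energies).sum()                   # the "band center"
variance = (w * (energies - mean_energy) ** 2).sum()
skew = (w * (energies - mean_energy) ** 3).sum() / variance ** 1.5

# Positive skew means the tail of states extends above the mean energy.
print(mean_energy, variance, skew)
```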

"That's OK, but those are just simple statistics. You might miss something. With principal component analysis, you just take in everything and find what's important. You're not just throwing away information," Goldsmith said.

Principal component analysis is a classic machine learning method, taught in introductory data science courses. The team used the electronic density of states as input for the model, since the density of states is a good predictor of how a catalyst's surface will adsorb, or bond with, the atoms and molecules that serve as reactants. The model links the density of states to the composition of the material.
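A rough sketch of how such an analysis can be set up with off-the-shelf tools follows. The random matrix stands in for density-of-states curves computed with density functional theory; none of this is the authors' actual code.

```python
import numpy as np
from sklearn.decomposition import PCA

# Illustrative sketch only. Each row is one candidate surface; each column is
# the DOS value at one energy on a shared grid.
rng = np.random.default_rng(0)
dos_matrix = rng.random((200, 500))      # 200 materials x 500 energy points

pca = PCA(n_components=2)
scores = pca.fit_transform(dos_matrix)   # each material reduced to 2 numbers
print(pca.explained_variance_ratio_)     # variance in the DOS each component captures

# Unlike hand-picked summary statistics, the components are learned from the
# full curves, and the scores can be correlated with composition or reactivity.
```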

Unlike conventional machine learning, which is essentially a black box that inputs data and offers predictions in return, the team made an algorithm that they could understand.

"We can see systematically what is changing in the density of states and correlate that with geometric properties of the material," said Jacques Esterhuizen, a doctoral student in chemical engineering and first author on the paper in Chem Catalysis.

This information helps chemical engineers design metal alloys to get the density of states that they want for mediating a chemical reaction. The model accurately reflected correlations already observed between a material's composition and its density of states, and it turned up new potential trends to be explored.

The model simplifies the density of states into two pieces, or principal components. One piece essentially covers how the atoms of the metal fit together. In a layered metal alloy, this includes whether the subsurface metal is pulling the surface atoms apart or squeezing them together, and the number of electrons that the subsurface metal contributes to bonding. The other piece is just the number of electrons that the surface metal atoms can contribute to bonding. From these two principal components, the researchers can reconstruct the density of states in the material.
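In PCA terms, that reconstruction is just the average curve plus a weighted sum of the two learned component curves. Continuing the hypothetical sketch above:

```python
# Rebuild every DOS curve from just its two scores. inverse_transform computes
# mean + scores @ components, i.e., a weighted sum of the two component curves
# added back to the average DOS.
dos_rebuilt = pca.inverse_transform(scores)

# Reconstruction error shows how much of the DOS two components capture.
rel_error = np.linalg.norm(dos_matrix - dos_rebuilt) / np.linalg.norm(dos_matrix)
print(f"relative reconstruction error: {rel_error:.2%}")
```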

This concept also works for the reactivity of metal oxides. In this case, the concern is the ability of oxygen to interact with atoms and molecules, which is related to how stable the surface oxygen is. Stable surface oxygens are less likely to react, whereas unstable surface oxygens are more reactive. The model accurately captured the oxygen stability in metal oxides and perovskites, a class of metal oxides.

Stanford Earth study finds wildfire smoke exposure during pregnancy increases preterm birth risk

A new Stanford University study suggests that exposure to wildfire smoke during pregnancy increases the risk that a baby will be born too early.

The study, published Aug. 14 in Environmental Research, finds there may have been as many as 7,000 extra preterm births in California attributable to wildfire smoke exposure between 2007 and 2012. These births occurred before 37 weeks of pregnancy, when incomplete development heightens the risk of various neurodevelopmental, gastrointestinal, and respiratory complications, and even death.

Wildfire smoke contains high levels of the smallest and deadliest type of particle pollution, known as PM 2.5. These specks of toxic soot, or particulate matter, are so fine they can embed deep in the lungs and pass into the bloodstream, just like the oxygen molecules we need to survive.

[Figure: Average number of smoke days per year. Credit: Heft-Neal et al. 2021, Environmental Research]

The research comes as massive wildfires are again blazing through parched landscapes in the western U.S. – just a year after a historic wildfire season torched more than 4 million acres of California and produced some of the worst daily air pollution ever recorded in the state. During the 2020 fire season, more than half of the state’s population experienced a month of wildfire smoke levels in the range of unhealthy to hazardous.

This year could be worse, said Stanford environmental economist Marshall Burke, a co-author of the new study. And yet much remains unknown about the health impacts of these noxious plumes, which contribute a growing portion of fine particle pollution nationwide and have a different chemical makeup from other ambient sources of PM 2.5, such as agriculture, tailpipe emissions, and industry.

The authors say one possible explanation for the link between wildfire smoke exposure and preterm birth is that the pollution may trigger an inflammatory response, which then sets delivery in motion. The increase in risk is relatively small in the context of all the factors that contribute to the birth of a healthy, full-term baby. “However, against a backdrop where we know so little about why some women deliver too soon, prematurely, and why others do not, finding clues like the one here helps us start piecing the bigger puzzle together,” said co-author Gary Shaw, DrPH, a professor of pediatrics and co-primary investigator of Stanford’s March of Dimes Prematurity Research Center.

Extreme wildfires

The new results show wildfire smoke may have contributed to more than 6 percent of preterm births in California in the worst smoke year of the study period, 2008, when a severe lightning storm, powerful winds, high temperatures, and a parched landscape combined for a deadly and destructive fire season – one that has now been dwarfed by the record-setting infernos of 2020 and ongoing blazes like the Dixie fire in Northern California.

“In the future, we expect to see more frequent and intense exposure to wildfire smoke throughout the West due to a confluence of factors, including climate change, a century of fire suppression, and construction of more homes along the fire-prone fringes of forests, scrublands and grasslands. As a result, the health burden from smoke exposure – including preterm births – is likely to increase,” said lead author Sam Heft-Neal, a research scholar at Stanford’s Center on Food Security and the Environment.

The research provides new evidence for the value of investing in prescribed burns, mechanical thinning, or other efforts to reduce the risk of extreme wildfires. Given that premature births cost the U.S. healthcare system an estimated $25 billion per year, even modest reductions in preterm birth risk could yield “enormous societal benefits,” said Burke, an associate professor of Earth system science at Stanford’s School of Earth, Energy & Environmental Sciences (Stanford Earth). “Our research highlights that reducing wildfire risk and the air pollution that accompanies it is one way of achieving these societal benefits.”

‘No safe level of exposure’

The researchers analyzed satellite data of smoke plumes from the National Oceanic and Atmospheric Administration (NOAA) to identify smoke days for each of 2,610 zip codes. They paired these data with estimates of ground-level PM 2.5 pollution, which were developed using a machine learning algorithm that incorporates data from air quality sensors, satellite observations, and supercomputer models of how chemicals move through Earth’s atmosphere. They pulled additional data from California birth records, excluding twins, triplets, and higher multiples, which commonly arrive early.
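For illustration only, the pairing step might look something like the following pandas sketch; the column names and values are placeholders, not the study's actual inputs.

```python
import pandas as pd

# Minimal sketch of the data assembly described above, with placeholder values.
smoke = pd.DataFrame({
    "zip_code": ["94305", "94305"], "date": ["2008-06-25", "2008-06-26"],
    "smoke_day": [1, 1],            # satellite-derived smoke-plume indicator
})
pm25 = pd.DataFrame({
    "zip_code": ["94305", "94305"], "date": ["2008-06-25", "2008-06-26"],
    "pm25": [42.0, 55.0],           # modeled ground-level PM 2.5 estimates
})

# Pair satellite-derived smoke days with ground-level PM 2.5 estimates.
exposure = smoke.merge(pm25, on=["zip_code", "date"])
print(exposure)

# In the study, exposure would then be aggregated over each gestation window
# and joined to singleton birth records by zip code and date.
```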

After accounting for other factors known to influence preterm birth risks, such as temperature, baseline pollution exposure, and the mother’s age, income, race, or ethnic background, they looked at how patterns of preterm birth within each zip code changed when the number and intensity of smoke days rose above normal for that location.
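A minimal sketch of this kind of within-location, fixed-effects design is below, written as a linear probability model on synthetic data. Every variable name is hypothetical, and the study's actual specification differs.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the linked birth records; one row per birth.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "preterm": rng.integers(0, 2, n),        # 1 if born before 37 weeks
    "smoke_days": rng.poisson(3.0, n),       # smoke-exposure days in pregnancy
    "mean_temp": rng.normal(18.0, 4.0, n),
    "baseline_pm25": rng.normal(10.0, 2.0, n),
    "mother_age": rng.integers(18, 45, n),
    "zip_code": rng.integers(0, 50, n).astype(str),
})

# Zip-code fixed effects (the C(zip_code) dummies) absorb time-invariant
# differences between places, so the smoke_days coefficient is identified
# from deviations above or below each zip code's own normal exposure.
model = smf.ols(
    "preterm ~ smoke_days + mean_temp + baseline_pm25 + mother_age + C(zip_code)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["zip_code"]})

print(model.params["smoke_days"])  # change in preterm probability per smoke day
```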

They found every additional day of smoke exposure during pregnancy raised the risk of preterm birth, regardless of race, ethnicity, or income. A full week of exposure translated to a 3.4 percent greater risk relative to a mother exposed to no wildfire smoke. Exposure to intense smoke during the second trimester, between 14 and 26 weeks of pregnancy, had the strongest impact, especially when smoke contributed more than 5 additional micrograms per cubic meter to daily PM 2.5 concentrations. “If one can avoid smoke exposure by staying indoors or wearing an appropriate mask while outdoors, that would be good health practice for all,” Shaw said.

The findings build on an established link between particle pollution and adverse birth outcomes, including preterm birth, low birth weight, and infant deaths. But the study is among the first to isolate the effect of wildfire smoke on early births and to tease out the importance of exposure timing.

“Our work, together with a number of other recent papers, clearly shows that there’s no safe level of exposure to particulate matter. Any exposure above zero can worsen health impacts,” said Burke, who is also deputy director of the Center on Food Security and the Environment and a senior fellow at Stanford’s Freeman Spogli Institute for International Studies. “While as a society it will be extremely difficult to fully eliminate all pollutants from the air, our research suggests that further reductions in key pollutants below current ‘acceptable’ levels could be massively beneficial for public health.”

UVA builds a quantum platform that accelerates the transition to integrated photonics on a chip smaller than a penny

Quantum supercomputing, projected to become a $65 billion market by 2030, is a hot topic for investors and scientists alike because of its potential to solve incomprehensibly complex problems.

Drug discovery is one example. To understand drug interactions, a pharmaceutical company might want to simulate the interaction of two molecules. The challenge is that each molecule is composed of a few hundred atoms, and scientists must model how these atoms might array themselves when their respective molecules are introduced. The number of possible configurations is astronomical, more than the number of atoms in the entire universe. Only a quantum supercomputer can represent, much less solve, such an expansive, dynamic data problem.
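A back-of-the-envelope illustration of that scale, with made-up numbers:

```python
# Illustration only: if each of 300 atoms could take just 10 distinct positions,
# the configuration count would be 10**300, dwarfing the roughly 10**80 atoms
# estimated to exist in the observable universe.
configurations = 10 ** 300
atoms_in_universe = 10 ** 80
print(configurations > atoms_in_universe)  # True, by 220 orders of magnitude
```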

Mainstream use of quantum supercomputing remains decades away, while research teams in universities and private industry across the globe work on different dimensions of the technology.

A research team led by Xu Yi, assistant professor of electrical and computer engineering at the University of Virginia School of Engineering and Applied Science, has carved a niche in the physics and applications of photonic devices, which detect and shape light for a wide range of uses including communications and computing. His research group has created a scalable quantum computing platform, which drastically reduces the number of devices needed to achieve quantum speed, on a photonic chip the size of a penny.

Olivier Pfister, professor of quantum optics and quantum information at UVA, and Hansuek Lee, assistant professor at the Korea Advanced Institute of Science and Technology (KAIST), contributed to this success.

Nature Communications recently published the team’s experimental results, “A Squeezed Quantum Microcomb on a Chip.” Two of Yi’s group members, Zijiao Yang, a Ph.D. student in physics, and Mandana Jahanbozorgi, a Ph.D. student in electrical and computer engineering, are the paper’s co-first authors. A grant from the National Science Foundation’s Engineering Quantum Integrated Platforms for Quantum Communication program supports this research.

Quantum computing promises an entirely new way of processing information. Your desktop or laptop computer processes information in long strings of bits. A bit can hold only one of two values: zero or one. Quantum computers process information in parallel, which means they don’t have to wait for one sequence of information to be processed before they can compute more. Their unit of information is called a qubit, a hybrid that can be one and zero at the same time. A quantum mode, or qumode, spans the full spectrum of variables between one and zero—the values to the right of the decimal point.
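For readers who want a concrete picture, here is a toy numpy sketch contrasting a bit, a qubit, and a coarsely discretized qumode. It illustrates the state spaces only and has nothing to do with the team's hardware.

```python
import numpy as np

# A classical bit holds exactly one of two values.
bit = 1

# A qubit is a superposition a|0> + b|1> with |a|**2 + |b|**2 = 1.
qubit = np.array([1.0, 1.0]) / np.sqrt(2)  # equal superposition of 0 and 1
print(np.abs(qubit) ** 2)                  # measurement probabilities: [0.5, 0.5]

# A qumode is continuous-variable: its state is a wavefunction over a continuum
# of values rather than two discrete levels (sketched here on a coarse grid).
x = np.linspace(-4, 4, 9)
qumode = np.exp(-x ** 2 / 2)
qumode /= np.linalg.norm(qumode)
print(np.abs(qumode) ** 2)                 # probabilities spread over the grid
```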

Researchers are working on different approaches to efficiently produce the enormous number of qumodes needed to achieve quantum speeds.

Yi’s photonics-based approach is attractive because a field of light is also full spectrum; each light wave in the spectrum has the potential to become a quantum unit. Yi hypothesized that by entangling fields of light, the light would achieve a quantum state.  

You are likely familiar with the optical fibers that deliver information through the internet. Within each optical fiber, lasers of many different colors are used in parallel, a technique called multiplexing. Yi carried the multiplexing concept into the quantum realm.

Micro is key to his team’s success. UVA is a pioneer and a leader in the use of optical multiplexing to create a scalable quantum computing platform. In 2014, Pfister’s group succeeded in generating more than 3,000 quantum modes in a bulk optical system. However, using this many quantum modes requires a large footprint to contain the thousands of mirrors, lenses, and other components that would be needed to run an algorithm and perform other operations.

“The future of the field is integrated quantum optics,” Pfister said. “Only by transferring quantum optics experiments from protected optics labs to field-compatible photonic chips will bona fide quantum technology be able to see the light of day. We are extremely fortunate to have been able to attract to UVA a world expert in quantum photonics such as Xu Yi, and I’m very excited by the perspectives these new results open to us.”

Yi’s group created a quantum source in an optical microresonator, a ring-shaped, millimeter-sized structure that envelops the photons and generates a microcomb, a device that efficiently converts photons from single to multiple wavelengths. Light circulates the ring to build up optical power. This power buildup enhances chances for photons to interact, which produces quantum entanglement between fields of light in the microcomb.

Through multiplexing, Yi’s team verified the generation of 40 qumodes from a single microresonator on a chip, proving that multiplexing of quantum modes can work in integrated photonic platforms. This is just the number they can measure.

“We estimate that when we optimize the system, we can generate thousands of qumodes from a single device,” Yi said.

Yi’s multiplexing technique opens a path toward quantum computing for real-world conditions, where errors are inevitable. This is true even in classical computers. But quantum states are much more fragile than classical states.

The number of qubits needed to compensate for errors could exceed one million, with a proportionate increase in the number of devices. Multiplexing reduces the number of devices needed by two or three orders of magnitude.
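The arithmetic behind that claim is simple. Assuming, per Yi's estimate above, roughly a thousand qumodes per optimized device:

```python
# Rough arithmetic from the figures in the article: if error correction demands
# about a million quantum units and one multiplexed microresonator can supply
# about a thousand qumodes, the device count falls by three orders of magnitude.
units_needed = 1_000_000
qumodes_per_device = 1_000   # the team's optimized-device estimate
print(units_needed // qumodes_per_device)  # ~1,000 devices instead of ~1,000,000
```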

Yi’s photonics-based system offers two additional advantages in the quantum computing quest. Quantum computing platforms that use superconducting electronic circuits require cooling to cryogenic temperatures. Because the photon has no mass, quantum computers with photonic integrated chips can run or sleep at room temperature. Additionally, Lee fabricated the microresonator on a silicon chip using standard lithography techniques. This is important because it implies the resonator or quantum source can be mass-produced.

“We are proud to push the frontiers of engineering in quantum computing and accelerate the transition from bulk optics to integrated photonics,” Yi said. “We will continue to explore ways to integrate devices and circuits in a photonics-based quantum computing platform and optimize its performance.”