JAIST researcher Kenta Hongo estimates the bending energy of the disiloxane molecule with high-precision molecular simulations run on a supercomputer

Understanding the structural properties of molecules found in nature or synthesized in the laboratory has always been the bread and butter of materials scientists. But, with advancements in science and technology, the endeavor has become even more ambitious: discovering new materials with highly desirable properties. To accomplish such a feat systematically, materials scientists rely upon sophisticated simulation techniques that incorporate the rules of quantum mechanics, the same rules which govern the molecules themselves. A global team of scientists considered the bending energy of a silicate molecule, disiloxane. Although the system appears simple and tractable, the bending energy calculation is actually a difficult problem to solve with conventional simulation methods.

CREDIT: Kenta Hongo from JAIST.

The simulation-based approach has been remarkably successful, so much so that an entire field of study called materials informatics has been dedicated to it. But there have also been instances of failure. A notable example comes from disiloxane, a silicon (Si)-containing compound consisting of a Si-O-Si bridge with three hydrogen atoms on each silicon. The structure is simple enough, and yet it has been notoriously difficult to estimate just how much energy is needed to bend the Si-O-Si bridge. Experimental results have been inconsistent, and theoretical calculations have yielded widely different values due to the sensitivity of the calculated properties to parameter choices and the level of theory.

Fortunately, an international research team led by Dr. Kenta Hongo, Associate Professor at Japan Advanced Institute of Science and Technology, has now managed to solve this problem. In their study published in Physical Chemistry Chemical Physics, the team achieved this feat by using a state-of-the-art simulation technique called the “first-principles quantum Monte Carlo method” that finally overcame the difficulties other standard techniques could not surmount.

But does it all just boil down to better simulations? Not quite. “Getting an answer that does not agree with the experimentally known value is, in itself, not surprising. The agreement can improve with more careful, and more expensive, simulations. But with disiloxane, the agreement becomes worse with more careful simulations,” explains Dr. Hongo. “What our method has achieved, rather, is good results without much dependence on the adjustment parameters, so that we don’t need to worry about whether the adjusted values are sufficient.”

The team compared the first-principles quantum Monte Carlo approach with other standard techniques, such as “density functional theory” (DFT) calculations and “coupled-cluster method with single and double substitutions and noniterative triples” (CCSD(T)), along with empirical measurements from previous studies. The three methods differed mainly in their sensitivity towards the “completeness” of basis sets (a set of functions used to define the quantum wavefunctions).

It turned out that for DFT and CCSD(T), the choice of basis set affected the amplitude as well as the positions of zero amplitude for the wavefunctions, while for quantum Monte Carlo, it affected only the zero-amplitude positions. This allows the amplitude to be adjusted so that the wavefunction's shape approaches that of the exact solution. “This self-healing property of the amplitude works well to reduce the basis-set dependence and lower the bias arising from an incomplete basis set in calculating the bending energy barrier,” elaborates Dr. Hongo.
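The basis-set bias described here is what quantum chemists usually try to remove by extrapolating toward the complete-basis-set (CBS) limit. The sketch below shows the standard two-point X⁻³ extrapolation over the cardinal number X of a correlation-consistent basis; it is a generic illustration of that workaround, not the study's quantum Monte Carlo method, and the energies used are hypothetical placeholders, not values from the paper.

```python
def cbs_extrapolate(e_x, e_y, x, y):
    """Two-point complete-basis-set extrapolation.

    Assumes E(X) = E_CBS + A * X**(-3), where X is the cardinal
    number of the basis set (3 = triple-zeta, 4 = quadruple-zeta).
    Solving the two equations for E_CBS eliminates A.
    """
    return (e_x * x**3 - e_y * y**3) / (x**3 - y**3)

# Hypothetical correlation energies (hartree) at triple- and quadruple-zeta level
e_tz, e_qz = -0.512, -0.524
e_cbs = cbs_extrapolate(e_tz, e_qz, 3, 4)
# The extrapolated energy lies below both finite-basis values,
# illustrating how finite basis sets bias the raw numbers.
```

The quote above points out that for DFT and CCSD(T) the final answer still depends on choices like this extrapolation form, whereas the quantum Monte Carlo amplitude is optimized directly and largely sidesteps that dependence.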

While this is an exciting development in itself, Dr. Hongo points out the bigger picture. “Molecular simulations are widely used to design new medicines and catalysts. Getting rid of the fundamental difficulties in using them greatly contributes to the design of such materials. With our powerful supercomputers, the method used in our study could be a standard strategy for overcoming such difficulties,” he says.

This is indeed deserving of being called a quantum leap.

Even the smallest pollution particles change the rainfall regime in the Amazon

Researchers have found that nanoparticles resulting from human activities such as the burning of fossil fuels rapidly grow in the atmosphere and influence cloud formation

Even the finest particles of pollution influence the process of cloud formation and the rainfall regime. A study conducted in Manaus, the capital of Amazonas state in Brazil’s northern region, shows that oxidation causes small aerosols expelled by factories and car exhausts, for example, to grow very rapidly, reaching up to 400 times their original size, and that this affects raindrop formation.

“Understanding cloud and rain formation mechanisms in the Amazon is a major challenge because of the complexity of the non-linear physical and chemical processes that occur in the atmosphere,” said Paulo Artaxo, a professor at the University of São Paulo’s Physics Institute (IF-USP) and penultimate author of an article on the study published in Science Advances.

The co-authors are all researchers affiliated with institutions in the United States, except Artaxo and Luiz Augusto Machado, also a professor at IF-USP. 

The discovery enhances the accuracy of climate change studies based on mathematical models and simulations. “These nanoparticles of pollution [smaller than 10 nanometers] used to be overlooked in atmospheric calculations and models. The focus was on particles larger than 100 nm because these act as cloud condensation nuclei [on which water vapor condenses to form droplets] and change the rainfall regime. This study shows that smaller particles oxidize as they travel through the atmosphere, expanding rapidly until they reach the size necessary to become condensation nuclei,” Machado said.

The data was collected by instruments onboard a special aircraft that flew over the Manaus pollution plume for about 100 kilometers (km) in 2014 and 2015 during the Green Ocean Amazon (GOAmazon) scientific campaign. FAPESP funded the study via its support for the campaign and a Thematic Project, in both cases under the aegis of the FAPESP Research Program on Global Climate Change (RPGCC).

“Little was known about the role played by these nanoparticles in the rainfall regime,” Machado said. “It so happens that the Manaus area is unique in the world in the sense that it’s an open-air laboratory, a mega-city surrounded by forest at a great distance from other cities where we can investigate how a metropolitan area changes an environment similar to that of the pre-industrial era.”

Aerosols are microscopic solid or liquid particles suspended in the atmosphere. They are produced naturally by forests, as primary aerosols, and in the atmosphere from gases emitted naturally by forests that are known as volatile organic compounds (VOCs), as secondary aerosols. They can also be produced by human activities such as the burning of fossil fuels. The latter are the type investigated in this study. 

According to Machado, aerosols of less than 10 nm emitted by vehicle exhausts, factories, and power plants in the Manaus area form a pollution plume that is blown in a southwesterly direction by the prevailing winds. The researchers concluded that the particles grew rapidly during this journey.

“It’s very hard to estimate the effect of particulate matter on rainfall because of the large number of atmospheric variables that influence this interaction,” Machado said. “We therefore compared the pollution plume with nearby areas that lie outside it. We found that the particles rapidly grow in size. By the time they’re 10 km out of Manaus, they’re larger, and at 30 km they can reach a large enough size to become condensation nuclei, affecting the formation of raindrops.”
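A back-of-envelope check makes the implied growth rate concrete. The wind speed and particle sizes below are illustrative assumptions chosen for the sketch, not measurements reported by the GOAmazon campaign; only the 30 km distance comes from the quote above.

```python
# Rough growth-rate estimate for aerosols transported in the Manaus plume
wind_speed = 5.0      # m/s -- assumed plume transport speed (illustrative)
distance = 30e3       # m -- distance at which particles reach CCN size (from the article)
d_start = 10.0        # nm -- initial nanoparticle diameter (illustrative)
d_ccn = 100.0         # nm -- typical cloud-condensation-nucleus size (illustrative)

travel_time_h = distance / wind_speed / 3600          # hours spent in transit
growth_rate = (d_ccn - d_start) / travel_time_h       # nm of diameter gained per hour
```

Under these assumptions the particles would need to gain roughly 54 nm of diameter per hour, which gives a sense of how fast the oxidation-driven growth described in the study must be.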

Variable impact

Cloud formation mechanisms are complex and involve many atmospheric parameters. Small aerosols interfere in raindrop condensation, but they may intensify or reduce rainfall depending on atmospheric conditions and above all on cloud formation at each moment. According to Machado, a large amount of particulate matter in the plume creates a sort of competition for the water vapor present in clouds, and the size of the droplets decreases as a result.

“For the rain to fall, the droplets have to be a certain size,” he said. “What we call terminal droplet velocity has to be above the velocity of the upwelling air, or the cloud will be full of tiny droplets and no rain will fall.”
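Machado's point about terminal velocity can be sketched with Stokes' law, the textbook drag formula for small falling droplets. This is a generic physics illustration, not a calculation from the study, and Stokes' law only holds for small droplets (roughly below ~40 µm radius).

```python
def stokes_terminal_velocity(radius_m, rho_water=1000.0, rho_air=1.2,
                             g=9.81, mu_air=1.8e-5):
    """Stokes-law terminal fall speed of a water droplet in air (m/s).

    v_t = (2/9) * r^2 * g * (rho_water - rho_air) / mu_air
    Valid only in the low-Reynolds-number regime (small droplets).
    """
    return 2.0 / 9.0 * radius_m**2 * g * (rho_water - rho_air) / mu_air

v_small = stokes_terminal_velocity(10e-6)  # 10 µm droplet: ~0.01 m/s, stays aloft
v_large = stokes_terminal_velocity(40e-6)  # 40 µm droplet: ~0.19 m/s, can outrun weak updrafts
```

Because terminal velocity scales with the square of the radius, a cloud full of droplets kept small by competition for water vapor falls far too slowly to beat even a gentle updraft, which is exactly the no-rain scenario Machado describes.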

If a very strong vertical wind is blowing, however, it can drive this large mass of droplets to a higher altitude where they form ice particles and potentially fuel a fierce storm. “We found that as the particles grow and become condensation nuclei, scant rainfall results if they meet a small, warm cloud. The aerosols reduce the precipitation. However, if the cloud builds up to become a mass of cumulonimbus [dense, towering, vertical cloud], for example, the aerosols increase the precipitation,” Machado said. “In other words, even these small particles of pollution influence the rainfall regime.”

According to the researchers, the project will proceed on a broader basis and fresh data will be collected. This year the team will conduct an experiment called Chemistry of the Atmosphere: Field Experiment in Brazil (CAFE-Brazil), with the aid of a German aircraft that can fly as high as 15,000 m. Artaxo explained that similar studies using remote sensing are also being conducted at the 325 m ATTO tower in the heart of the Amazon Rainforest (more at agencia.fapesp.br/29665).

“In the study just published [on January 12], we collected data on a low flight path [at 4,000 m],” he said. “The German aircraft we’ll use for our next collections is one of the most sophisticated flying laboratories in existence, so we’ll be able to conduct an experiment designed to produce an understanding of key physical and chemical issues in the production of aerosols, clouds, and rain that is still a mystery to us.”

UNH researchers find lower emissions vital to slow warming

Winters are warming faster than summers in North America, impacting everything from ecosystems to the economy. Global climate models indicate that this trend will continue in future winters but there is a level of uncertainty around the magnitude of warming. Researchers at the University of New Hampshire focused on the role of carbon dioxide emissions in this equation—looking at the effects of both high and low levels of carbon dioxide emissions on future climate warming scenarios—and found that a reduction in emissions could preserve almost three weeks of snow cover and below-freezing temperatures.

“The local ski hills of New England raised me to love winter and snow,” said Elizabeth Burakowski, research assistant professor in UNH’s Earth Systems Research Center. “But winters are vital to all of us, and taking serious action now to limit, or slow, the warming of winter could mean preserving many of the core benefits of cold weather, including providing more winter protection for woodland animals, preventing the spread of invasive forest pests, and increasing the ability of ski resorts to make snow—protecting the economy by maintaining the area’s multimillion-dollar recreation industry.”

In their study, recently published in the journal Northeastern Naturalist, the researchers analyzed 29 different climate models to determine the effect of reducing emissions of carbon dioxide and other heat-trapping gases into the atmosphere. At the current pace, by mid-century (2040–2069) ski areas in North America will face up to a 50% decline in days where conditions would be favorable to make snow. Limiting emissions could slow that to only a 10 to 30% decline in the number of snowmaking days. Preserving colder days (below freezing) and snow cover is also critical for providing winter habitats and protection for animals like porcupines and martens, a carnivorous member of the weasel family. At the current rate of warming, the researchers found that deep snowpacks could become increasingly short-lived, decreasing from the historical two months of subnivium, or beneath-the-snow, habitat to less than one month. The researchers say that maintaining a cold winter environment is also associated with greater soil carbon storage and helps prevent the spread of invasive and very destructive forest pests such as the southern pine beetle, which was recently detected as far north as New Hampshire and Maine by UNH researchers.

“Emissions scenarios play a critical role in the loss of winter conditions, indicating a potential doubling of the loss of cold days and snow cover under higher emissions,” said Alexandra Contosta, research assistant professor at UNH’s Earth Systems Research Center. “These changes could disrupt and forever change some very significant social and ecological systems that have historically relied on cold, snowy winters for habitat, water resources, forest health, local economies, cultural practices, and human wellbeing.”

Historically, between 1980 and 2005, the Northeast averaged 95 snow-covered days per season. Under the low emissions scenario, that would be reduced to 72 days; under the high emissions scenario, there would be only 56 days. Historically, New Jersey, Rhode Island, and Connecticut could expect to see 20–80 days of snow cover per season, but by the end of the century, under the higher emissions scenario, they are more likely to have snow-free winters.
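The relative losses implied by these figures can be checked directly from the numbers quoted in the article; nothing below comes from the study itself beyond those three day counts.

```python
# Snow-covered days per season in the Northeast, as quoted in the article
historical, low_emissions, high_emissions = 95, 72, 56

loss_low = historical - low_emissions        # days lost under low emissions
loss_high = historical - high_emissions      # days lost under high emissions
days_preserved = low_emissions - high_emissions  # days saved by cutting emissions

pct_low = round(100 * loss_low / historical, 1)    # percent reduction, low scenario
pct_high = round(100 * loss_high / historical, 1)  # percent reduction, high scenario
```

The arithmetic works out to roughly a 24% loss of snow-covered days under low emissions versus a 41% loss under high emissions, with 16 days of snow cover preserved by the lower-emissions path.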

Co-authors include Danielle Grogan, also at UNH; Sarah Nelson, Appalachian Mountain Club; Sarah Garlick, Hubbard Brook Research Foundation; and Nora Casson, University of Winnipeg.