NIST researchers resurrect, improve a technique for detecting transistor defects

A traditional method gets a new lease on life and may provide a new standard for measuring electric current. 

Researchers at the National Institute of Standards and Technology (NIST) have revived and improved a once-reliable technique to identify and count defects in transistors, the building blocks of modern electronic devices such as smartphones and computers. Over the past decade, transistor components have become so small in high-performance computer chips that the popular method, known as charge pumping, could no longer count defects accurately. NIST’s new and improved method is sensitive enough for the most modern, minuscule technology, and can provide an accurate assessment of defects that could otherwise impair the performance of transistors and limit the reliability of the chips in which they reside.

The new, modified charge pumping technique can detect single defects as small as the diameter of a hydrogen atom (one-tenth of a billionth of a meter) and can indicate where they’re located in the transistor. Researchers could also use the new capability to detect and manipulate a property in each electron known as quantum spin. The ability to manipulate individual spins has applications in both basic research and quantum engineering and computing.

Transistors act as electrical switches. In the on position, which represents the “1” of binary digital information, a designated amount of current flows from one side of a semiconductor to the other. In the off position, representing the “0” of binary logic, the current ceases to flow.

Defects in a transistor can interfere with the reliable flow of current and significantly degrade the performance of transistors. These defects could be broken chemical bonds in the transistor material. Or they could be atomic impurities that trap electrons in the material. Scientists have devised several ways to categorize defects and minimize their impact, tailored to the structure of the transistor under study.

In the traditional design known as the metal oxide semiconductor field-effect transistor (MOSFET), a metal electrode called the gate sits atop a thin insulating layer of silicon dioxide. Below the insulating layer lies the interface region that separates the insulating layer and the main body of the semiconductor. In a typical transistor, current travels through a narrow channel, only one billionth of a meter thick, that extends from the source, which lies on one side of the gate, to a “drain” on the other side. The gate controls the amount of current in the channel.

Charge pumping is a two-step process in which the examiner alternately pulses the gate with a positive test voltage, then a negative one. (The transistor does not act as an on/off switch during this testing mode.) In traditional charge pumping, the alternating voltage pulses are applied at a single, set frequency.

In the first step of the test, the positive voltage attracts or pumps electrons, which are negatively charged, to the boundary or interface between the gate’s insulating layer and the body of the transistor. Some of the pumped electrons become trapped in defects at the interface, but there are many electrons left over. In the second step, a negative voltage is applied, to rid the interface of the excess electrons, leaving only the trapped ones behind. The negative voltage also attracts positive charge carriers, known as “holes,” to the region, where they combine with electrons trapped in the defects. This activity generates a current proportional to the number of defects. The greater the output current, the larger the number of defects.
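The proportionality between output current and defect count can be sketched numerically. The snippet below is an illustrative back-of-envelope model (not NIST's analysis code), using the standard single-frequency charge-pumping relation I_cp = q · f · N, where q is the elementary charge, f the pulse frequency, and N the number of interface defects; the example current value is hypothetical.

```python
Q_E = 1.602176634e-19  # elementary charge, coulombs

def defect_count(i_cp_amps, freq_hz):
    """Estimate the number of interface defects from the charge-pumping
    current measured at a single pulse frequency: N = I_cp / (q * f)."""
    return i_cp_amps / (Q_E * freq_hz)

# Hypothetical example: a 1.6 nA charge-pumping current at 1 MHz pulsing
n = defect_count(1.6e-9, 1e6)
print(round(n))  # on the order of 1e4 defects
```

The larger the measured current at a given frequency, the more defects the model implies, which is exactly the "greater output current, larger number of defects" statement above.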

In the recent past, the current was indeed a reliable measure of defects. However, the insulating oxide layer in modern transistors is now so thin — just 10 to 20 hydrogen atoms wide — that an effect from the realm of quantum mechanics comes into play, confounding measurements using the traditional charge-pumping method.

According to quantum theory, electrons and other subatomic particles can never be truly trapped; there’s always some probability they will escape or “tunnel” out of an enclosure or boundary layer. The thinner the material, the higher the probability that electrons will escape, creating a tunneling current. As transistor dimensions shrank, the tunneling current leaking through the insulating oxide layer made it nearly impossible to detect defects with ordinary charge pumping. Scientists all but abandoned the technique.

NIST researchers James Ashton, Mark Anders, and Jason Ryan have now found a way to salvage the technique so that it not only works for ultrathin transistor components but is also more sensitive, enabling scientists to record signals from a single defect. The solution arose when the scientists came to a key realization: The current that results from quantum tunneling remains virtually the same, regardless of the frequency at which charge pumping pulses the positive and negative voltages.

Armed with that knowledge, the team revised the charge pumping technique by alternately applying the method’s positive and negative voltages at two different frequencies rather than the single frequency used in the traditional method. Applying the voltages at two different frequencies gave the researchers two different output currents. By subtracting one output current from the other, the constant signal from the quantum tunneling current dropped out. With the confounding tunneling current eliminated, the researchers were able to detect defects in transistors with ultrasmall features. The researchers reported their development of the frequency-modulated charge pumping technique online in the Feb. 2 Applied Physics Letters.
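The two-frequency subtraction can be demonstrated with a toy model. Assume (as the article states) that the tunneling leak is frequency-independent while the charge-pumping term scales with frequency; the numbers below are illustrative, not measured values.

```python
Q_E = 1.602176634e-19  # elementary charge, coulombs

def measured_current(freq_hz, n_defects, i_tunnel):
    """Toy model of the measured current: a frequency-proportional
    charge-pumping term plus a frequency-independent tunneling leak."""
    return Q_E * freq_hz * n_defects + i_tunnel

def defects_two_freq(i1, f1, i2, f2):
    """Subtracting currents taken at two frequencies cancels the
    constant tunneling term, leaving only the defect signal."""
    return (i1 - i2) / (Q_E * (f1 - f2))

# The tunneling leak (1 nA) dwarfs the defect signal, yet it drops out:
f1, f2 = 2e6, 1e6
i1 = measured_current(f1, n_defects=50, i_tunnel=1e-9)
i2 = measured_current(f2, n_defects=50, i_tunnel=1e-9)
print(round(defects_two_freq(i1, f1, i2, f2)))  # -> 50
```

Because the tunneling current appears identically in both measurements, the difference isolates the frequency-dependent defect term, which is the key insight behind the frequency-modulated technique.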

“We’ve given charge pumping a new lease on life,” said Ashton.

“The modulated-frequency technique is now useful for looking at single interface defects, which gives engineers control of single electron charges in a very sensitive measurement scheme,” he added.

Since only one electron is involved at a time, the output current is an integer multiple of the electron’s charge, a fundamental physical constant determined by NIST and other institutions.

Because the method can detect single electrons, it may serve as a sensitive probe of an electron’s quantum spin. Modulated-frequency charge pumping may provide a valuable guide to scientists who are now exploring how electron spin might store and transfer information in a computer of the future. It may also prove useful in quantum metrology, as a potential new way of determining a quantum standard of electrical current.

Barton’s lab develops new method that applies physics to models in epidemiology

During the SARS-CoV-2 pandemic, multiple new and more transmissible variants of the virus have emerged. Understanding how specific mutations affect SARS-CoV-2 transmission could help us to better understand the biology of the virus and to control outbreaks. 

This, however, is a challenging task, said John Barton, an assistant professor of physics and astronomy at the University of California, Riverside, who is presenting results from his research titled ‘Inferring the Effects of Mutations on SARS-CoV-2 Transmission From Genomic Surveillance Data’ at the American Physical Society’s March Meeting.

“Existing computational methods to study this problem tend to either be difficult to apply to large amounts of data or rely on very restrictive assumptions,” Barton said. “Experiments can also provide excellent information about how different mutations affect the virus, but they can’t be used to directly study SARS-CoV-2 transmission in humans.”

Barton and his colleagues developed a new computational method to solve this problem by applying techniques from statistical physics to mathematical models in epidemiology. Their method allows them to look at genomic surveillance data — SARS-CoV-2 sequences collected from infected individuals — over time and across many regions throughout the world, and to find the effects of different mutations on SARS-CoV-2 transmission that best explain the observed evolutionary history of the virus throughout the pandemic.

“A few novel features of our method are that it can account for the travel of infected individuals between regions, which most other models are unable to do, and that the physics-based methods that we use allow us to write down an exact mathematical expression for the transmission effects of different mutations, rather than relying on numerical simulations to estimate these parameters,” Barton said.
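The core idea of inferring transmission effects from changing variant frequencies can be illustrated with a much simpler, standard population-genetics approximation (this is NOT the paper's estimator, which handles many mutations, many regions, and travel simultaneously): a variant with relative transmission advantage s grows logistically in frequency, so s can be read off the change in the log-odds of its frequency.

```python
import math

def transmission_advantage(x_start, x_end, generations):
    """Estimate a variant's per-generation transmission advantage s from
    two frequency observations, assuming logistic growth:
    logit(x_end) - logit(x_start) = s * generations."""
    logit = lambda x: math.log(x / (1.0 - x))
    return (logit(x_end) - logit(x_start)) / generations

# Hypothetical numbers: a variant rising from 1% to 50% of sampled
# sequences over 20 generations implies s of roughly 0.23 per generation.
s = transmission_advantage(0.01, 0.50, 20)
print(round(s, 2))  # -> 0.23
```

Barton's method goes well beyond this sketch, but the principle is the same: frequency trajectories in surveillance data encode the transmission effects of the underlying mutations.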

After validating their method on simulations, Barton and his colleagues applied it to more than 1.6 million SARS-CoV-2 sequences from the GISAID database, which were collected from 87 geographical regions. 

“Much research has focused on mutations in the Spike protein of SARS-CoV-2, and our analysis supports this emphasis on Spike as a main driver of SARS-CoV-2 transmission,” Barton said. “About half of the most impactful mutations that we find are in Spike, including three of the top four mutations. However, we also find multiple mutations outside of Spike that appear to strongly increase the transmission of the virus. Some of these may make good targets for future experiments to understand how different mutations affect SARS-CoV-2 function.”

Barton explained that their method is also sensitive enough to reveal benefits to SARS-CoV-2 transmission for mutations that were previously assumed to be neutral. His team is also able to detect increased transmission for major new variants such as Alpha and Delta very rapidly, within a week of their appearance in regional data. The data set the team considered when writing the paper did not include sequences from the Omicron variant because data collection ended in August 2021.

“However, even without observing any Omicron sequences in the data, we would already estimate that Omicron would transmit more readily than Alpha just based on the mutations that it shares with other SARS-CoV-2 variants,” Barton said. “While we have focused specifically on SARS-CoV-2 in our analysis, our method is very general and could be applied to study the transmission of other pathogens, such as influenza.”

This research was led by graduate students Brian Lee and Elizabeth Finney in Barton’s lab, joined by collaborators Muhammad Sohail, Syed Ahmed, and Ahmed Quadeer at the Hong Kong University of Science and Technology; and Matthew McKay at the University of Melbourne, Australia. 

UCF lands new project to study effect of rain on hypersonic travel

The work will help researchers learn what conditions make for safe hypersonic travel

University of Central Florida researchers are part of a new $1 million project funded by the Air Force Office of Scientific Research to better understand and predict how and why raindrops are affected when they cross a hypersonic shock wave. Mechanical and Aerospace Engineering Professor Subith Vasu’s research team will perform experiments using a shock tube to study the effects of hypersonic shock waves on droplets. Pictured here are team members, from left to right, Nicolas Berube, Subith Vasu, Artem Arakelyan, Jacklyn Higgs, Daniel Dyson, Sydney Briggs and Farhan Arafin.

Hypersonic speeds are those at Mach 5 and higher, or five times greater than the speed of sound. The U.S. is currently working on developing hypersonic systems for defense and travel. 

The new project is important because, at hypersonic speeds, colliding with something as light as a single raindrop can cause significant damage. The work will tell researchers whether a raindrop maintains its single-droplet form or breaks up into tens of much smaller droplets.

“If you have a rain droplet with a tenth of an inch diameter and you hit it at Mach 8, it can create a load as heavy as the weight of an elephant,” says Michael Kinzel, project co-investigator and an assistant professor in UCF’s Department of Mechanical and Aerospace Engineering. “So, you can’t put an elephant on the wing of an aircraft, and it’ll support it, right? It’s a huge load. And these would be hitting all over parts of the vehicle.”
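Kinzel's "elephant" figure can be sanity-checked with the classical water-hammer relation for liquid impact, p ≈ ρ_water · c_water · v. The numbers below are rough illustrative assumptions, not the project's actual model.

```python
import math

# Back-of-envelope check of the "elephant on the wing" figure.
RHO_WATER = 1000.0    # density of water, kg/m^3
C_WATER = 1480.0      # speed of sound in water, m/s
MACH_8 = 8 * 340.0    # ~2720 m/s, using sea-level speed of sound

d = 0.1 * 0.0254                         # 0.1-inch droplet diameter, meters
area = math.pi * (d / 2) ** 2            # frontal area of the droplet
pressure = RHO_WATER * C_WATER * MACH_8  # water-hammer impact pressure, ~4 GPa
force_newtons = pressure * area

# Roughly 20 kN, comparable to the weight of a ~2-tonne elephant.
print(f"{force_newtons / 1000:.0f} kN")
```

A few gigapascals acting on even a millimeter-scale spot indeed produces a load in the tonnes, consistent with the quote above.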

Knowing the impact of different size raindrops on hypersonic aircraft and rockets will help predict when to fly, as light rainstorms may not affect travel as much as heavy storms.

The researchers want to narrow down what conditions make for safe hypersonic travel through the rain. The knowledge could prevent damage and improve the accuracy of hypersonic rockets launched through rain and clouds, Kinzel says.

“This work will help lead to structural integrity when designing hypersonic vehicles,” Kinzel says. “And it develops a framework to understand how to design in that context as well as understand limitations of hypersonic flight with respect to some weather conditions.”

Kinzel will work to model the effects of raindrops on hypersonic travel by using supercomputer simulations.

Subith Vasu, a professor in UCF’s Department of Mechanical and Aerospace Engineering and co-investigator, will perform experiments using a shock tube to study the effects of hypersonic shock waves on droplets, such as behavior and timescale of breakup.

Unique test facilities combined with state-of-the-art optical and laser diagnostic systems will be used to understand the interaction of droplet and shock wave, Vasu says.

“The work is important for deterrence and the national security of the United States, and we are proud to be involved in this prestigious effort,” Vasu says.  “The knowledge gained from hypersonics research could have other applications as well, including space exploration.”

Boston University is leading the project and will be working closely with Kinzel and Vasu to understand droplet behavior when impacted at hypersonic speeds. UCF will be receiving about $560,000 for the three-year project. UCF will be collaborating with engineers and scientists from the Air Force Research Laboratory and Lockheed Martin, both closely involved in the development of a variety of hypersonic vehicles.

The project further highlights UCF’s expertise in the area of hypersonic propulsion.

Kinzel received his doctorate in aerospace engineering from Pennsylvania State University and joined UCF in 2018. In addition to being a member of UCF’s Department of Mechanical and Aerospace Engineering, a part of UCF’s College of Engineering and Computer Science, he also works with UCF’s Center for Advanced Turbomachinery and Energy Research.

Vasu received his doctorate in mechanical engineering from Stanford University and joined UCF’s Department of Mechanical and Aerospace Engineering in 2012. He is a member of UCF’s Center for Advanced Turbomachinery and Energy Research, is an associate fellow of the American Institute of Aeronautics and Astronautics, and is a member of the International Energy Agency’s Task Team on Energy. Vasu is a recipient of DARPA’s Director’s Fellowship, DARPA Young Faculty Award, the Young Investigator grant from the Defense Threat Reduction Agency, American Chemical Society’s Doctoral New Investigator, American Society of Mechanical Engineers Dilip Ballal Early Career award, and the Society of Automotive Engineers SAE Ralph R. Teetor Educational Award. He has received many of the highest honors at UCF including the UCF Luminary and Reach for the Stars awards.

JAIST professor Hongo estimates the bending energy of the disiloxane molecule with ultra-fine molecular simulations run on a supercomputer

Understanding the structural properties of molecules found in nature or synthesized in the laboratory has always been the bread and butter of materials scientists. But, with advancements in science and technology, the endeavor has become even more ambitious: discovering new materials with highly desirable properties. To accomplish such a feat systematically, materials scientists rely upon sophisticated simulation techniques that incorporate the rules of quantum mechanics, the same rules which govern the molecules themselves. A global team of scientists considered the bending energy of the siloxane molecule disiloxane; although the system appears simple and tractable, calculating its bending energy is a difficult problem for conventional simulation methods.

The simulation-based approach has been remarkably successful, so much so that an entire field of study called materials informatics has been dedicated to it. But there have also been instances of failure. A notable example comes from disiloxane, a silicon (Si)-containing compound consisting of a Si-O-Si bridge with three hydrogen atoms at each end. The structure is simple enough, and yet it has been notoriously difficult to estimate just how much energy is needed to bend the Si-O-Si bridge. Experimental results have been inconsistent, and theoretical calculations have yielded widely different values because the calculated properties are highly sensitive to the choice of parameters and level of theory.

Fortunately, an international research team led by Dr. Kenta Hongo, Associate Professor at Japan Advanced Institute of Science and Technology, has now managed to solve this problem. In their study published in Physical Chemistry Chemical Physics, the team achieved this feat by using a state-of-the-art simulation technique called the “first-principles quantum Monte Carlo method” that finally overcame the difficulties other standard techniques could not surmount.

But does it all just boil down to better simulations? Not quite. “Getting an answer that does not agree with the experimentally known value is, in itself, not surprising. The agreement can improve with more care, and more expensive, simulations. But with disiloxane, the agreement becomes worse with more careful simulations,” explains Dr. Hongo. “What our method has achieved, rather, is good results without much dependence on the adjustment parameters, so that we don’t need to worry about whether the adjusted values are sufficient.”

The team compared the first-principles quantum Monte Carlo approach with other standard techniques, such as “density functional theory” (DFT) calculations and “coupled-cluster method with single and double substitutions and noniterative triples” (CCSD(T)), along with empirical measurements from previous studies. The three methods differed mainly in their sensitivity towards the “completeness” of basis sets (a set of functions used to define the quantum wavefunctions).

It turned out that for DFT and CCSD(T), the choice of basis set affected the amplitude as well as positions of zero amplitude for the wavefunctions, while for quantum Monte Carlo, it only affected the zero amplitude positions. This allowed one to tweak the amplitude such that the wavefunction shape approached that of an exact solution. “This self-healing property of the amplitude works well to reduce the basis-set dependence and lower the bias arising from an incomplete basis set in calculating the bending energy barrier,” elaborates Dr. Hongo.
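To make the quantum Monte Carlo idea concrete, here is a deliberately tiny toy: variational Monte Carlo for a one-dimensional harmonic oscillator (ħ = m = ω = 1). This is emphatically not the team's first-principles code, which treats many interacting electrons; it only illustrates the core mechanism of Metropolis-sampling |ψ|² and averaging a local energy. For the trial wavefunction ψ = exp(−a x²), the local energy is E_L(x) = a + x²(1/2 − 2a²), which is constant at the exact value 0.5 when a = 0.5.

```python
import math
import random

def vmc_energy(a, n_steps=200_000, step=1.0, seed=1):
    """Variational Monte Carlo energy for psi = exp(-a x^2):
    Metropolis-sample |psi|^2, average the local energy E_L(x)."""
    rng = random.Random(seed)
    x, total = 0.0, 0.0
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        # Accept with probability |psi(x_new)|^2 / |psi(x)|^2
        if rng.random() < math.exp(-2 * a * (x_new**2 - x**2)):
            x = x_new
        total += a + x * x * (0.5 - 2 * a * a)
    return total / n_steps

print(vmc_energy(0.5))            # exact trial state: 0.5 with zero variance
print(round(vmc_energy(0.4), 2))  # suboptimal parameter: energy above 0.5
```

The zero-variance behavior at the exact wavefunction hints at why QMC energies can be robust to imperfections elsewhere, loosely analogous to the "self-healing" amplitude property Dr. Hongo describes.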

While this is an exciting development in itself, Prof. Hongo points out the bigger picture. “Molecular simulations are widely used to design new medicines and catalysts. Getting rid of the fundamental difficulties in using them greatly contributes to the design of such materials. With our powerful supercomputers, the method used in our study could be a standard strategy for overcoming such difficulties,” he says.

This is indeed deserving of being called a quantum leap.

Even the smallest pollution particles change the rainfall regime in the Amazon

Researchers have found that nanoparticles resulting from human activities such as the burning of fossil fuels rapidly grow in the atmosphere and influence cloud formation

Even the finest particles of pollution influence the process of cloud formation and the rainfall regime. A study conducted in Manaus, the capital of Amazonas state in Brazil’s northern region, shows that oxidation leads small aerosols expelled by factories and car exhausts, for example, to grow very rapidly, reaching up to 400 times their original size, and that this affects raindrop formation.

“Understanding cloud and rain formation mechanisms in the Amazon is a major challenge because of the complexity of the non-linear physical and chemical processes that occur in the atmosphere,”  said Paulo Artaxo, a professor at the University of São Paulo’s Physics Institute (IF-USP) and penultimate author of an article on the study published in Science Advances.

The co-authors are all researchers affiliated with institutions in the United States, except Artaxo and Luiz Augusto Machado, also a professor at IF-USP. 

The discovery enhances the accuracy of climate change studies based on mathematical models and simulations. “These nanoparticles of pollution [smaller than 10 nanometers] used to be overlooked in atmospheric calculations and models. The focus was on particles larger than 100 nm because these act as cloud condensation nuclei [on which water vapor condenses to form droplets] and change the rainfall regime. This study shows that smaller particles oxidize as they travel through the atmosphere, expanding rapidly until they reach the size necessary to become condensation nuclei,” Machado said.
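Why must a particle grow to roughly 100 nm before it can seed a droplet? A rough illustration comes from the Kelvin equation, which gives the water-vapor supersaturation needed for condensation onto a pure-water droplet of diameter d: S = exp(A/d), with A = 4σM_w/(RTρ_w). The constants below are standard values; real cloud condensation nuclei also involve solute (Köhler) effects, ignored here for simplicity.

```python
import math

SIGMA = 0.072    # surface tension of water, N/m
M_W = 0.018      # molar mass of water, kg/mol
R = 8.314        # gas constant, J/(mol K)
RHO_W = 1000.0   # density of water, kg/m^3
T = 298.0        # temperature, K

def supersaturation_pct(d_meters):
    """Supersaturation (percent above 100% relative humidity) needed to
    condense water onto a pure droplet of the given diameter."""
    a = 4 * SIGMA * M_W / (R * T * RHO_W)
    return (math.exp(a / d_meters) - 1) * 100

print(f"{supersaturation_pct(10e-9):.0f}%")   # 10 nm particle: ~23%
print(f"{supersaturation_pct(100e-9):.1f}%")  # 100 nm particle: ~2%
```

Atmospheric supersaturations are typically well under a few percent, so 10 nm particles cannot activate, while particles near 100 nm can, which is why the rapid growth the team observed matters for rainfall.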

The data was collected by instruments onboard a special aircraft that flew over the Manaus pollution plume for about 100 kilometers (km) in 2014 and 2015 during the Green Ocean Amazon (GOAmazon) scientific campaign. FAPESP funded the study via its support for the campaign and a Thematic Project, in both cases under the aegis of the FAPESP Research Program on Global Climate Change (RPGCC).

“Little was known about the role played by these nanoparticles in the rainfall regime,” Machado said. “It so happens that the Manaus area is unique in the world in the sense that it’s an open-air laboratory, a mega-city surrounded by forest at a great distance from other cities where we can investigate how a metropolitan area changes an environment similar to that of the pre-industrial era.”

Aerosols are microscopic solid or liquid particles suspended in the atmosphere. They are produced naturally by forests, as primary aerosols, and in the atmosphere from gases emitted naturally by forests that are known as volatile organic compounds (VOCs), as secondary aerosols. They can also be produced by human activities such as the burning of fossil fuels. The latter are the type investigated in this study. 

According to Machado, aerosols of less than 10 nm emitted by vehicle exhausts, factories, and power plants in the Manaus area form a pollution plume that is blown in a southwesterly direction by the prevailing winds. The researchers concluded that the particles grew rapidly during this journey.

“It’s very hard to estimate the effect of particulate matter on rainfall because of the large number of atmospheric variables that influence this interaction,” Machado said. “We therefore compared the pollution plume with nearby areas that lie outside it. We found that the particles rapidly grow in size. By the time they’re 10 km out of Manaus, they’re larger, and at 30 km they can reach a large enough size to become condensation nuclei, affecting the formation of raindrops.”

Variable impact

Cloud formation mechanisms are complex and involve many atmospheric parameters. Small aerosols interfere with raindrop condensation, but they may intensify or reduce rainfall depending on atmospheric conditions and above all on cloud formation at each moment. According to Machado, a large amount of particulate matter in the plume creates a sort of competition for the water vapor present in clouds, and the size of the droplets decreases as a result.

“For the rain to fall, the droplets have to be a certain size,” he said. “What we call terminal droplet velocity has to be above the velocity of the upwelling air, or the cloud will be full of tiny droplets and no rain will fall.”
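The droplet-size threshold Machado describes can be illustrated with a Stokes-law estimate of terminal fall speed, v_t = ρ_w g d² / (18 μ_air). This is a rough sketch valid only for small droplets at low Reynolds number; large raindrops fall outside the Stokes regime.

```python
RHO_W = 1000.0   # water density, kg/m^3
G = 9.81         # gravitational acceleration, m/s^2
MU_AIR = 1.8e-5  # dynamic viscosity of air, Pa*s

def terminal_velocity(d_meters):
    """Stokes-law terminal fall speed of a water droplet in air."""
    return RHO_W * G * d_meters**2 / (18 * MU_AIR)

# A 10-micrometer cloud droplet falls at about 3 mm/s, far slower than
# typical updrafts (meters per second), so it stays aloft; a
# 100-micrometer droplet falls roughly 100 times faster.
print(f"{terminal_velocity(10e-6) * 1000:.1f} mm/s")
print(f"{terminal_velocity(100e-6):.2f} m/s")
```

Because fall speed scales with the square of diameter, shrinking droplets (as in the polluted plume's competition for vapor) sharply suppresses their ability to out-fall the updraft and reach the ground as rain.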

If a very strong vertical wind is blowing, however, it can drive this large mass of droplets to a higher altitude where they form ice particles and potentially fuel a fierce storm. “We found that as the particles grow and become condensation nuclei, scant rainfall results if they meet a small, warm cloud. The aerosols reduce the precipitation. However, if the cloud builds up to become a mass of cumulonimbus [dense, towering, vertical cloud], for example, the aerosols increase the precipitation,” Machado said. “In other words, even these small particles of pollution influence the rainfall regime.”

According to the researchers, the project will proceed on a broader basis and fresh data will be collected. This year the team will conduct an experiment called Chemistry of the Atmosphere: Field Experiment in Brazil (CAFE-Brazil), with the aid of a German aircraft that can fly as high as 15,000 m. Artaxo explained that similar studies using remote sensing are also being conducted at the 325 m ATTO tower in the heart of the Amazon Rainforest (more at agencia.fapesp.br/29665).

“In the study just published [on January 12], we collected data on a low flight path [at 4,000 m],” he said. “The German aircraft we’ll use for our next collections is one of the most sophisticated flying laboratories in existence, so we’ll be able to conduct an experiment designed to produce an understanding of key physical and chemical issues in the production of aerosols, clouds, and rain that is still a mystery to us.”