Hien Van Nguyen, University of Houston associate professor of electrical and computer engineering, is developing next-gen artificial intelligence to improve medical diagnostics.

UH engineering prof Nguyen wins NCI grant to create next-gen AI to improve diagnostics 

Despite remarkable progress in artificial intelligence (AI), several studies show that AI systems do not improve radiologists' diagnostic performance. Diagnostic errors contribute to 40,000 to 80,000 deaths annually in U.S. hospitals. This gap creates a pressing need to build next-generation computer-aided diagnosis algorithms that interact with radiologists more effectively, so that the benefits of AI in medical diagnosis can be fully realized.

That’s just what Hien Van Nguyen, associate professor of electrical and computer engineering at the University of Houston, is doing with a new $933,812 grant from the National Cancer Institute. He will focus on lung cancer diagnostics.

“Current AI systems focus on improving stand-alone performance while neglecting team interaction with radiologists,” said Van Nguyen. “This project aims to develop a computational framework for AI to collaborate with human radiologists on medical diagnosis tasks.”

That framework uses a unique combination of eye-gaze tracking, intention reverse engineering, and reinforcement learning to decide when and how an AI system should interact with radiologists. 

To maximize time efficiency and minimize distraction from clinical work, Van Nguyen is designing a user-friendly, minimally interfering interface for radiologist-AI interaction.

The project evaluates these approaches in two clinically important applications: lung nodule detection and pulmonary embolism detection. Lung cancer is the second most common cancer, and pulmonary embolism is the third most common cause of cardiovascular death.

“Studying how AI can help radiologists reduce these diseases' diagnostic errors will have significant clinical impacts,” said Van Nguyen. “This project will significantly advance the knowledge of the field by addressing important, but largely under-explored questions.”  

The questions include when and how AI systems should interact with radiologists and how to model radiologists' visual scanning process. 

“Our approaches are creative and original because they represent a substantive departure from the existing algorithms. Instead of continuously providing AI predictions, our system uses a gaze-assisted reinforcement learning agent to determine the optimal time and type of information to present to radiologists,” said Van Nguyen.  
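The decision problem described in the quote can be sketched with a toy tabular reinforcement learning agent. Everything below is illustrative, not the project's actual algorithm: the state encoding, reward values, and action names are hypothetical stand-ins for "when and what to show the radiologist" based on gaze.

```python
import random

# Hypothetical sketch of a gaze-assisted RL agent: the state is a coarse
# summary of the radiologist's eye gaze, and the agent chooses whether to
# stay silent or surface an AI finding at this moment.

ACTIONS = ["stay_silent", "show_finding"]

def gaze_state(dwell_time_s, region_has_ai_finding):
    """Discretize eye-tracking features into a small state space."""
    dwell = "long" if dwell_time_s > 2.0 else "short"
    return (dwell, region_has_ai_finding)

class GazePolicy:
    """Simple tabular value learning over (gaze state, action) pairs."""
    def __init__(self, epsilon=0.1, alpha=0.5):
        self.q = {}          # (state, action) -> estimated value
        self.epsilon = epsilon
        self.alpha = alpha

    def act(self, state):
        if random.random() < self.epsilon:
            return random.choice(ACTIONS)   # explore
        return max(ACTIONS, key=lambda a: self.q.get((state, a), 0.0))

    def update(self, state, action, reward):
        key = (state, action)
        old = self.q.get(key, 0.0)
        self.q[key] = old + self.alpha * (reward - old)

# Toy interaction loop: reward interrupting only when the radiologist
# lingers on a region where the AI has a finding (a made-up stand-in
# for "the optimal time and type of information to present").
random.seed(0)
policy = GazePolicy()
for _ in range(500):
    dwell = random.uniform(0.0, 4.0)
    has_finding = random.random() < 0.3
    s = gaze_state(dwell, has_finding)
    a = policy.act(s)
    helpful = s == ("long", True)
    reward = 1.0 if (a == "show_finding") == helpful else -1.0
    policy.update(s, a, reward)

policy.epsilon = 0.0  # act greedily after training
print(policy.act(("long", True)))    # learned to show the finding
print(policy.act(("short", False)))  # learned to stay silent
```

The point of the sketch is the departure the quote describes: instead of always displaying predictions, the agent learns a policy over gaze-derived states for when displaying is worth the interruption.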

“Our project will advance the strategies for designing user interfaces for doctor-AI interaction by combining gaze-sensing and novel AI methodologies.”  

Scheme of the machine learning model based on the feedforward artificial neural network

Russian scientists develop a neural network algorithm that predicts Arrhenius crossover temperature with 90 percent accuracy

A joint paper by the Department of Computational Physics and Modeling of Physical Processes at Kazan Federal University and the Udmurt Federal Research Center of the Russian Academy of Sciences has been published in Materials.

The algorithm can help speed up the production of many materials, including metal alloys, and simplify quality control during such production. An algorithm based on a neural network created at Kazan Federal University (KFU) makes it possible to accurately calculate the Arrhenius temperature from several physical parameters of the material.

Figure: (a) Root mean square error ξ of the estimate of the Arrhenius crossover temperature TA, calculated for various combinations of the quantities Tm, Tg, Tg/Tm and m used as inputs to the machine learning model. Inset: TA(pred) and TA(emp) are the predicted and empirical Arrhenius crossover temperatures, respectively. (b) Correspondence between the empirical TA and the TA predicted by the machine learning model on the validation data set.

Among the parameters the team used for modeling were the melting temperature, the glass transition temperature, and the fragility index. These quantities describe phase transitions and structural changes in liquids during cooling.

Co-author Associate Professor Bulat Galimzyanov comments, “Many solid materials, such as glasses, metals and plastics, start out as melts: viscous liquids that solidify at a certain temperature, turning into a solid state. The temperature at which this change in the state of aggregation begins is called the Arrhenius temperature. When approaching it, the atoms of the substance begin to move in groups, and more slowly than before. This signals that the liquid is preparing to solidify.”

The algorithm was tested on metallic, silicate, borate, and organic glasses. “We found that, for the created neural network, the melting and glass transition temperatures of the material are significant and sufficient characteristics for estimating the Arrhenius temperature,” Galimzyanov says. “From these two values, the algorithm determined the Arrhenius temperature for all analyzed liquids with an accuracy of more than 90 percent.”
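The kind of model described, a small feedforward network mapping two temperatures to the Arrhenius crossover temperature, can be sketched as follows. The training data here are entirely synthetic (an invented linear rule plus noise), purely to illustrate the architecture and training loop, not the published model or its data.

```python
import numpy as np

# One-hidden-layer feedforward network mapping (Tm, Tg) -> TA,
# trained by plain full-batch gradient descent on mean squared error.

rng = np.random.default_rng(42)

# Synthetic "materials": melting and glass transition temperatures in K.
Tm = rng.uniform(500.0, 1800.0, size=(200, 1))
Tg = 0.6 * Tm + rng.normal(0.0, 30.0, size=(200, 1))
X = np.hstack([Tm, Tg]) / 1000.0        # scale inputs to order one
y = (0.55 * Tm + 0.45 * Tg) / 1000.0    # invented target rule for TA

# Network parameters: 2 inputs -> 8 tanh units -> 1 output.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)            # hidden activations
    pred = h @ W2 + b2                  # network output
    err = pred - y                      # dL/dpred for MSE loss
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)      # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

pred = np.tanh(X @ W1 + b1) @ W2 + b2
rel_err = np.abs(pred - y) / y
print(f"mean relative error: {rel_err.mean():.3f}")
```

With only two inputs and a near-monotonic target, such a small network fits easily, which is consistent with the finding that Tm and Tg alone suffice.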

The scientists also worked out an equation linking the Arrhenius temperature with the melting temperature and the glass transition temperature.

“Glass transition and melting temperatures are easily measured in lab conditions. Furthermore, they can be found in the literature. Thus, determining the Arrhenius temperature has now become easier. We can analyze the properties of liquids faster and estimate the characteristics of resulting solid materials more precisely,” concludes Galimzyanov.

The team further plans to adapt the created algorithm to more complex materials, such as polymers.

Global map of eddies (Graphic: Nathan Beech)

EERIE project uses supercomputers for improved Earth system simulations

The ocean has a large effect on our planet’s climate. Mesoscale (medium-sized) eddies, which are essentially the weather of the ocean, could be far more important in this regard than previously believed. Accordingly, a new project led by the Alfred Wegener Institute has just been launched to assess this aspect. By doing so, “European Eddy Rich Earth System Models” (EERIE) could significantly improve today’s Earth system models and therefore projections of the climate’s future development.

Eddies come in a range of sizes, with diameters from only a few meters to several kilometers. Their influence on the climate depends on their size. Although these eddies have been known for some time, we still have limited quantitative information on their role, bearing in mind the impacts of a warming climate (Beech et al. 2022). A new EU-financed project aims to change that: with the aid of “European Eddy Rich Earth System Models” (EERIE), eddies will be represented more realistically in climate models – i.e., governed by the laws of physics rather than empirical parameterizations.

Figure: Simulated and observed eddy kinetic energy patterns in the global ocean.

EERIE’s goal is to help produce a new generation of Earth system models (ESMs). To do so, it will focus on improving the simulation of mesoscale eddies, which, depending on the region, can be anywhere from five to 40 kilometers wide. The supercomputer modeling improvements will include, for example, open channels of water (“leads”) in sea ice, through which the ocean influences the atmosphere via powerful heat fluxes. “The technological hurdles to achieving these high-resolution simulations are immense,” says Prof Thomas Jung, responsible for coordinating the project at the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI). “In order to allow quantitative statements, EERIE will have to achieve a simulation rate of up to five simulated years per day on the latest pre-exascale supercomputers available in Europe. Here, efficiency is a key factor – also to keep the simulations’ energy consumption and CO2 footprint to a minimum.” In order to implement, save and analyze these complex high-resolution simulations, the researchers will have to work hand in hand with software engineers to develop radically new software technologies.

In the course of the project, the researchers also plan to develop new simulation protocols, contributing to national and international climate change assessments in the process. In this way, EERIE is to yield valid and directly applicable climate information and to make valuable contributions in preparation for the IPCC’s next Assessment Report.

The project budget is over 10 million euros. 17 partner institutions are involved, including seven universities. The kick-off event for EERIE took place on 23 and 24 February 2023. The project, which officially began on 1 January 2023, will continue for four years.

Global cloudiness map, based on data collected by the Aqua research satellite over more than a decade (2002-2015). Clouds are not distributed uniformly but rather concentrated in hot spots. Photo: NASA

Weizmann scientists solve a 50-year-old puzzle on why Earth’s hemispheres look equally bright when viewed from space

When looking at the Earth from space, its hemispheres – northern and southern – appear equally bright. This is particularly unexpected because the Southern Hemisphere is mostly covered with dark oceans, whereas the Northern Hemisphere has a vast land area that is much brighter than these oceans. For years, the brightness symmetry between hemispheres remained a mystery. In a new study, researchers at the Weizmann Institute of Science, Israel, and their collaborators reveal a strong correlation between storm intensity, cloudiness, and the solar energy reflection rate in each hemisphere. They offer a solution to the mystery, alongside an assessment of how climate change might alter the reflection rate in the future.

(l-r) Prof. Yohai Kaspi and Or Hadas

As early as the 1970s, when scientists analyzed data from the first meteorological satellites, they were surprised to find that the two hemispheres reflect the same amount of solar radiation. The reflectivity of solar radiation is known in scientific lingo as “albedo.” To better comprehend what albedo is, think about driving at night: it is easy to spot the intermittent white lane markings, which reflect light from the car’s headlights well, but difficult to discern the dark asphalt. The same is true when observing Earth from space: the fraction of incoming solar energy that each region reflects is determined by various factors. One of them is the ratio of dark ocean to bright land, which differ in reflectivity just like asphalt and lane markings. The land area of the Northern Hemisphere is about twice as large as that of the Southern, and indeed, when measuring near the Earth’s surface under clear skies, there is more than a 10 percent difference in albedo between the hemispheres. Still, both hemispheres appear equally bright from space.
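The clear-sky contrast between the hemispheres can be illustrated with a toy area-weighted surface albedo calculation. The land fractions and albedo values below are rough, textbook-style numbers chosen only to show why the surfaces alone would not be equally bright; they are not from the study.

```python
# Toy area-weighted mean surface albedo of a hemisphere, mixing a dark
# ocean with brighter land. Illustrative values: ocean albedo ~0.06,
# average land albedo ~0.30, ~39% land in the north vs ~19% in the south.

def surface_albedo(land_fraction, land_albedo=0.30, ocean_albedo=0.06):
    """Area-weighted mean clear-sky surface albedo of a hemisphere."""
    return land_fraction * land_albedo + (1 - land_fraction) * ocean_albedo

north = surface_albedo(land_fraction=0.39)
south = surface_albedo(land_fraction=0.19)
print(f"north: {north:.3f}, south: {south:.3f}")
print(f"relative difference: {(north - south) / south:.1%}")
```

With these rough inputs the northern surface is substantially more reflective, consistent with the "more than 10 percent" clear-sky difference in the text; the study's point is that cloud albedo from southern storms offsets exactly this surface asymmetry.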

In this study, the team of researchers, led by Prof. Yohai Kaspi and Or Hadas of Weizmann’s Earth and Planetary Sciences Department, focused on another factor influencing albedo, one located at high altitudes and reflecting solar radiation – clouds. The team analyzed data from the world’s most advanced databases, including cloud data collected via NASA satellites (CERES), as well as data from ERA5, a global weather database containing information collected from a variety of sources in the air and on the ground, dating back to 1950. The ERA5 data were used to supplement the cloud data and to cross-reference 50 years of these data with information on the intensity of cyclones and anticyclones.

Next, the scientists classified the storms of the last 50 years into three categories according to intensity. They discovered a direct link between storm intensity and the number of clouds forming around a storm. While the Northern Hemisphere, and land areas in general, are characterized by weaker storms, moderate and strong storms prevail above the oceans of the Southern Hemisphere. Data analysis showed that the link between storm intensity and cloudiness accounts for the difference in cloudiness between the hemispheres. “Cloud albedo arising from strong storms above the Southern Hemisphere was found to be a high-precision offsetting agent to the large land area in the Northern Hemisphere, and thus symmetry is preserved,” says Hadas, adding: “This suggests that storms are the linking factor between the brightness of Earth’s surface and that of clouds, solving the symmetry mystery.”

Could climate change make one of the hemispheres darker?

Earth has been undergoing rapid change in recent years, owing to climate change. To examine whether and how this could affect hemispheric albedo symmetry, the scientists used CMIP6, a set of models run by climate modeling centers around the world to simulate climate change. One of these models’ major shortcomings is their limited ability to predict the degree of cloudiness. Nevertheless, the relation found in this study between storm intensity and cloudiness enables scientists to assess future cloud amounts, based on storm predictions.

Models predict global warming will result in a decreased frequency of all storms above the Northern Hemisphere and of weak and moderate storms above the Southern Hemisphere. However, the strongest storms of the Southern Hemisphere will intensify. The cause of these predicted differences is “Arctic amplification,” a phenomenon in which the North Pole warms twice as fast as Earth’s mean warming rate. One might speculate that this difference should break hemispheric albedo symmetry. However, the research shows that a further increase in storm intensity might not change the degree of cloudiness in the Southern Hemisphere because cloud amounts reach saturation in very strong storms. Thus, symmetry might be preserved.

“It is not yet possible to determine with certainty whether the symmetry will break in the face of global warming,” says Kaspi. “However, the new research solves a basic scientific question and deepens our understanding of Earth’s radiation balance and the factors that affect it. As global warming continues, geoengineered solutions will become vital for human life to carry on alongside it. I hope that a better understanding of basic climate phenomena, such as the hemispheric albedo symmetry, will help in developing these solutions.”

Controlled generation of single-photon emitters in silicon (red) by broad-beam implantation of ions (blue) through a lithographically defined mask (left) and by a scanned focused ion beam (right). Symbolically shown: the emission of two single photons at locations defined for this purpose by the process. In the background: An electron beam creates holes in the lithographic mask made of acrylate.  Source: M. Hollenbach, B. Schröder/HZDR

Germany builds single-photon emitters in silicon at the nanoscale

In the near future, quantum computers are expected to revolutionize the way we compute, with new approaches to database searches, AI systems, simulations, and more. But to achieve such novel quantum technology applications, photonic integrated circuits that can effectively control photonic quantum states – the so-called qubits – are needed. Physicists from the Helmholtz-Zentrum Dresden-Rossendorf (HZDR), TU Dresden, and Leibniz-Institut für Kristallzüchtung (IKZ) have made a breakthrough in this effort: for the first time, they demonstrated the controlled creation of single-photon emitters in silicon at the nanoscale.

Photonic integrated circuits, or PICs for short, utilize particles of light, better known as photons, as opposed to the electrons that run in electronic integrated circuits. The main difference between the two is that a photonic integrated circuit processes information signals imposed on optical wavelengths, typically in the near-infrared spectrum. “Actually, these PICs with many integrated photonic components can generate, route, process, and detect light on a single chip,” says Dr. Georgy Astakhov, Head of Quantum Technologies at HZDR's Institute of Ion Beam Physics and Materials Research, and adds: “This modality is poised to play a key role in upcoming technologies, such as quantum computing. And PICs will lead the way.”

Until recently, quantum photonics experiments were notorious for their massive use of “bulk optics” distributed across the optical table and occupying the entire lab. Now, photonic chips are radically changing this landscape. Miniaturization, stability, and suitability for mass production might turn them into the workhorse of modern-day quantum photonics.

From random to control mode

Monolithic integration of single-photon sources in a controllable way would give a resource-efficient route to implement millions of photonic qubits in PICs. To run quantum computation protocols, these photons must be indistinguishable. With this, industrial-scale photonic quantum processor production would become feasible.

However, currently established fabrication methods are incompatible with today's semiconductor technology, which stands in the way of this promising concept.

In a first attempt, reported about two years ago, the researchers were already able to generate single photons on a silicon wafer, but only in a random, non-scalable way. Since then, they have come far. “Now, we show how focused ion beams from liquid metal alloy ion sources are used to place single-photon emitters at desired positions on the wafer while obtaining a high creation yield and high spectral quality,” says physicist Dr. Nico Klingner.

Furthermore, the scientists at HZDR subjected the same single-photon emitters to a rigorous material testing program: After several cooling-down and warming-up cycles, they did not observe any degradation of their optical properties. These findings meet the preconditions required for mass production later on.

To translate this achievement into a widespread technology, and allow for wafer-scale engineering of individual photon emitters on the atomic scale compatible with established foundry manufacturing, the team implemented broad-beam implantation in a commercial implanter through a lithographically defined mask. “This work really allowed us to take advantage of the state-of-the-art silicon processing cleanroom and electron beam lithography machines at the Nano Fabrication facility Rossendorf”, explains Dr. Ciarán Fowley, Cleanroom group leader and Head of Nanofabrication and Analysis.

Using both methods, the team can create dozens of telecom single-photon emitters at predefined locations with a spatial accuracy of about 50 nm. They emit in the strategically important telecommunication O-band and exhibit stable operation over days under continuous-wave excitation.

The scientists are convinced that the realization of controllable fabrication of single-photon emitters in silicon makes them a highly promising candidate for photonic quantum technologies, with a fabrication pathway compatible with very large-scale integration. These single-photon emitters are now technologically ready for production in semiconductor fabs and incorporation into the existing telecommunication infrastructure.