How a new quantum approach can yield faster algorithms for searching complex networks

Scientists conduct numerical simulations on complex networks to gain insight into a powerful quantum mechanics-inspired algorithm

Our world has no dearth of complex networks--from cellular networks in biology to intricate web networks in technology. These networks form the basis of applications in virtually all fields of science, and analyzing and manipulating them requires specific "search" algorithms. Conventional search algorithms, however, are slow and, for large networks, demand long computational times. Recently, search algorithms based on the principles of quantum mechanics have been found to vastly outperform classical approaches. One such example is the "quantum walk" algorithm, which can be used to find a specific point, or "vertex," on a given N-site graph. Instead of simply stepping through neighboring vertices, the quantum walk approach evolves probability amplitudes according to quantum mechanics, which drastically reduces the number of steps required to reach the target. To achieve this, an operation called an "oracle call" must be performed repeatedly to adjust the probability amplitudes in the quantum representation of the system. A central open problem is the relationship between the optimal number of oracle calls and the structure of the network: this relationship is well understood for standard lattices and shapes, but it remains unclear for complex networks.
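The speed-up the article alludes to can be seen in the simplest quantum search setting. The study itself concerns quantum walks on fractal lattices, but the standard Grover search (on an unstructured set) illustrates the same key idea: repeated oracle calls concentrate probability on the target after roughly the square root of N steps, versus roughly N classical checks. A minimal NumPy sketch, purely illustrative and not the algorithm analyzed in the paper:

```python
import numpy as np

def grover_search(n_items, marked, n_iterations):
    """Simulate Grover's amplitude amplification on a state vector.

    Illustrative only: shows how repeated "oracle calls" boost the
    probability of the marked item far faster than classical stepping.
    """
    state = np.full(n_items, 1.0 / np.sqrt(n_items))  # uniform superposition
    for _ in range(n_iterations):
        state[marked] *= -1.0               # oracle call: flip the target's sign
        state = 2.0 * state.mean() - state  # diffusion: invert about the mean
    return state[marked] ** 2               # probability of finding the target

N = 1024
# ~(pi/4) * sqrt(N) oracle calls suffice, versus ~N classical checks.
k = int(round(np.pi / 4 * np.sqrt(N)))     # 25 iterations for N = 1024
p = grover_search(N, marked=0, n_iterations=k)
print(f"{k} oracle calls, success probability = {p:.4f}")
```

After just 25 oracle calls the target is found with near-certainty, whereas a classical scan of 1024 items needs on average over 500 checks.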

In a new study published in Physical Review A, a team of scientists at Tokyo University of Science, led by Prof Tetsuro Nikuni, dug deeper into the intricacies of these networks to develop more efficient quantum algorithms. Prof Nikuni explains, "Many real-world systems, such as the World Wide Web and social/biological networks, exhibit complex structures. To fully explore the potential of these network systems, developing an efficient search algorithm is crucial."

To begin with, the scientists looked into the "fractal properties" (geometrical properties of figures that seem to infinitely replicate their overall shape) of networks. The researchers focused on some basic fractal lattices (structures with a fractal network), such as "Sierpinski gasket," "Sierpinski tetrahedron," and "Sierpinski carpet," to try to find out the relationship between the number of vertices (nodes of the network) and the optimal computational time in a quantum walk search. To this end, they performed numerical simulations with over a million vertices and checked whether the results were in line with previous studies, which proposed a mathematical law or a "scaling law" to explain this relationship.
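The kind of scaling law being tested here relates the optimal number of oracle calls Q to the number of vertices N through a power law. As a sketch of how such an exponent can be extracted from simulation data, the following fits a log-log least-squares line; the data is synthetic (generated from an assumed Q proportional to N^0.5 with noise), not from the study:

```python
import numpy as np

# Synthetic data: pretend simulations returned the optimal number of
# oracle calls Q for lattices of N vertices, following Q = c * N**alpha.
# alpha = 0.5 is an arbitrary illustrative choice; the study extracts
# the true exponent from quantum-walk simulations on fractal lattices.
rng = np.random.default_rng(0)
N = np.array([3 ** k for k in range(5, 13)], dtype=float)   # lattice sizes
Q = 1.3 * N ** 0.5 * rng.normal(1.0, 0.02, size=N.size)     # noisy "measurements"

# Fit log Q = alpha * log N + log c by ordinary least squares.
alpha, log_c = np.polyfit(np.log(N), np.log(Q), 1)
print(f"fitted scaling exponent alpha = {alpha:.3f}")
```

Checking whether the fitted exponent matches the one predicted from the lattice's spectral dimension is, in essence, how a proposed scaling conjecture is confirmed or refuted.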

The researchers found that the scaling law for some fractal lattices varied according to their spectral dimension, confirming a previous conjecture for those lattices. Surprisingly, they also found that the scaling law for another type of fractal lattice depends on a combination of its intrinsic characteristics, suggesting that the previous conjecture on the optimal number of oracle calls might not be accurate. Prof Nikuni says, "It may indeed be a fact that the quantum spatial search on fractal lattices is surprisingly subject to combinations of the characteristic quantities of the fractal geometry. It remains an open question as to why the scaling law for the number of oracle calls is given by such combinations." Based on this understanding, the team proposed a new scaling hypothesis, which slightly differs from the ones proposed earlier, to gain more insight into the different fractal geometries of networks.

The research team hopes that, with their findings, quantum searches will become easier to analyze experimentally--especially given recent experiments performing quantum walks on physical systems such as optical lattices. The wide applicability of quantum algorithms on fractal lattices highlights the importance of this study. Owing to its exciting findings, this study was selected as an "Editors' Suggestion" in the February 2020 issue of Physical Review A. Optimistic about the results and with future research directions laid out, Prof Nikuni concludes, "We hope that our study further promotes the interdisciplinary study of complex networks, mathematics, and quantum mechanics on fractal geometries."

Dutch supercomputer simulations show fundamental interactions inside the cell

A first clear picture of actin-binding to cell membrane lipids

Actin filaments have several important functions inside cells. For one, they support the cell membrane by binding to it. However, scientists did not know exactly how actin interacts with the membrane lipids. Supercomputer simulations performed at the University of Groningen in the Netherlands, supported by experiments, provide a molecular view of this fundamental process.

Actin filaments are involved in many important processes in eukaryotic cells, from motility to division to the contraction of muscle fibers. Actin can also form a network underneath the lipid bilayer, where it provides additional support for this structure; curved filaments also play a role in cell division, when the membrane needs to constrict. Despite its importance, the molecular processes underlying the binding of actin to cell membranes are still not clear.

Figure: simulation of actin binding to the lipid membrane with negatively charged lipids (orange); blue indicates calcium concentrations. Credit: S-J Marrink, University of Groningen.

'The literature provides conflicting results,' says University of Groningen Professor of Molecular Dynamics Siewert-Jan Marrink. 'All agree that actin, which is negatively charged, can bind to positively charged lipids, but some say that it can even bind to membranes with no positive charge, or even a negative charge.' The latter is relevant for the biological situation, as normal cell membranes carry a negative charge on the inside.

Marrink and his colleagues, therefore, performed molecular dynamics simulations of the interaction between lipids and actin. They used the Martini coarse-grained forcefield, which was developed by Marrink and is now used worldwide.

Molecular glue

By varying the components in the simulation, the scientists discovered that the ions that are present define the binding process. Marrink: 'Actin could only bind to negatively charged lipids in the presence of calcium ions. The two positive charges of calcium act as a kind of molecular glue.' In contrast, in the presence of positively charged lipids, calcium ions inhibited the binding of actin. The results of this simulation were confirmed by experiments performed in Professor Gijsje Koenderink's lab at the Delft University of Technology.
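The "molecular glue" picture can be illustrated with a back-of-the-envelope Coulomb estimate (this is a toy point-charge calculation, not the Martini coarse-grained model used in the study): two negative groups repel on their own, but a doubly charged Ca2+ ion placed between them makes the total electrostatic energy negative, i.e. binding.

```python
from itertools import combinations

def coulomb_energy(charges, positions):
    """Total pairwise Coulomb energy in reduced units (Coulomb constant = 1)."""
    return sum(q1 * q2 / abs(x1 - x2)
               for (q1, x1), (q2, x2) in combinations(zip(charges, positions), 2))

d = 1.0  # arbitrary separation in reduced units
# Negatively charged actin site and lipid head group, no ion present:
e_bare = coulomb_energy([-1, -1], [0.0, 2 * d])            # positive -> repulsive
# Same pair bridged by a Ca2+ ion sitting between them:
e_bridged = coulomb_energy([-1, +2, -1], [0.0, d, 2 * d])  # negative -> attractive
print(e_bare, e_bridged)
```

The bare pair has energy +0.5 (repulsion), while the calcium-bridged arrangement has energy -3.5 (net attraction): the ion's two positive charges more than compensate for the residual repulsion between the negative groups.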

'The concentrations of calcium that were required in the simulations are higher than you would find inside cells,' says Marrink. 'However, calcium ions tend to bind to membrane lipids, so the local concentration could be high enough.'

Synthetic cell

The results provide the first clear picture of actin binding to membrane lipids. 'We want to use this in a national effort to build an artificial cell,' Marrink explains. The Dutch BaSyC (Building a Synthetic Cell) project involves many different steps, and one of them is constructing membranes. 'These need to be reinforced with actin, so we have to understand how to control the interaction between the filaments and the lipid membrane. And we need to guide division of the artificial cell, where actin is needed for constriction.'

Pitt engineer wins $500K NSF Career Award for advancing AI neural network technology

In science fiction stories from "I, Robot" to "Star Trek," an android's "positronic brain" enables it to function as a human, but with tremendously more processing power and speed. In reality, the opposite is true: a human brain - which today is still more proficient than CPUs at cognitive tasks like pattern recognition - needs only 20 watts of power to complete a task, while a supercomputer requires more than 50,000 times that amount of energy.

For that reason, researchers are turning to neuromorphic computers and artificial neural networks that work more like the human brain. However, with current technology, it is both challenging and expensive to replicate the spatio-temporal processes native to the brain, like short-term and long-term memory, in artificial spiking neural networks (SNNs).

Feng Xiong, PhD, assistant professor of electrical and computer engineering at the University of Pittsburgh's Swanson School of Engineering, received a $500,000 CAREER Award from the National Science Foundation (NSF) for his work developing the missing element, a dynamic synapse, that will dramatically improve energy efficiency, bandwidth and cognitive capabilities of SNNs.

"When the human brain sees rain and then feels wetness, or sees fire and feels the heat, the brain's synapses link the two ideas, so in the future, it will associate rain with wetness and fire with warmth. The two ideas are strongly linked in the brain," explains Xiong. "Computers, on the other hand, need to be fed massive datasets to do the same task. Our dynamic synapse would mimic the brain's ability to create neuronal connections as a function of the timing differences between stimulations, significantly improving the energy efficiency required to perform a task."

Current non-volatile memory devices that have been studied for use as artificial synapses in SNNs haven't measured up: they are designed to retain data permanently and aren't suited for the spatio-temporal dynamics and high precision that the human brain is capable of. In the brain, it's not only the information that matters but also its timing--for example, in some situations, the closer two pieces of information arrive in time, the stronger the synaptic connection between them.
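The timing dependence described here is commonly modeled as spike-timing-dependent plasticity (STDP), where the synaptic weight change decays exponentially with the gap between pre- and postsynaptic spikes. A minimal sketch of the standard pair-based rule follows; the parameter values are illustrative textbook defaults, not the device model from the funded project:

```python
import math

def stdp_update(dt_ms, a_plus=0.1, a_minus=0.12, tau_ms=20.0):
    """Pair-based STDP weight change for spike-time difference
    dt = t_post - t_pre (milliseconds).

    Pre fires before post (dt > 0): potentiation, larger when the
    spikes are closer in time. Post fires first (dt < 0): depression.
    Parameter values are illustrative, not from the project.
    """
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau_ms)
    return -a_minus * math.exp(dt_ms / tau_ms)

for dt in (5.0, 20.0, 60.0, -10.0):
    print(f"dt = {dt:+.0f} ms -> dw = {stdp_update(dt):+.4f}")
```

Spike pairs only 5 ms apart strengthen the connection far more than pairs 60 ms apart, mirroring the "closer in time, stronger the connection" behavior the article describes.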

By programming the device to conduct more electricity for a stronger neural connection, it can function more like a synapse in the human brain, giving more weight to items that are more closely linked as it learns.

"The resulted change in the electrical conductance (representing the synaptic weight or the synaptic connection strength) in the dynamic synapse will have both a short-term and a long-term component, mimicking the short-term and long-term memory/learning in the human brain," says Xiong.

Though researchers have demonstrated this kind of technology before in the lab, this project is the first time it will be applied to an SNN. The application could lead to the wide use of AI and revolutionary advances in cognitive computing, self-driving vehicles, and autonomous manufacturing.

In addition to the research component of the project, Xiong will use the opportunity to engage future engineers in his research. He plans to develop an after-school outreach program, host nanotech workshops with the Pennsylvania Junior Academy of Science, and welcome undergraduate engineering majors at Pitt to engage with the research.

The project is titled "Scalable Ionic Gated 2D Synapse (IG-2DS) with Programmable Spatio-Temporal Dynamics for Spiking Neural Networks" and will begin on March 1, 2020.