Overcoming Quantum Error

International research team uses TACC supercomputers to push the frontiers of quantum computing simulations

Lattice for the topological color codes with 3-colored vertices. Physical qubits of the error-correcting code correspond to triangles (stars mark the ‘boundaries’ of the sets of triangles displayed). (a) Boundary of a vertex v. The stabilizer operators Xv, Zv have support on the corresponding qubits. (b) Error pattern in the form of a string net. The three vertices that form its boundary are all the information we have to correct the error. (c) Two error patterns with the same boundary. Because together they form the boundary of the three vertices marked with a circle, they are equivalent.

Have you noticed that no one seems to talk about clock rate anymore? Instead of boosting speed at the microprocessor level, companies are packing more cores on a chip and more blades in a server, adding density to make up for the fact that, in the current paradigm, transistors cannot operate much faster.

This leveling of the speed curve has led many to pursue alternative models of computation, foremost among them quantum computing. Practically speaking, the field is in its infancy, with a wide range of possible implementations. Theoretically, however, the solutions share the same goal: using the quantum nature of matter — its ability to be in more than one state at once, called superposition — to speed up computations a million-fold or more.

Where typical computers represent information as 0s and 1s, quantum computers use 0, 1, and all possible superpositions of the two to perform computations.
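As a back-of-the-envelope illustration (a sketch for this article, not part of the research itself), a qubit can be written as a normalized two-component complex vector of amplitudes; applying a Hadamard gate to a definite 0 produces an equal superposition of 0 and 1:

```python
import numpy as np

# A qubit state is a normalized 2-component complex vector:
# the amplitudes for measuring 0 and for measuring 1.
ket0 = np.array([1.0, 0.0], dtype=complex)

# Hadamard gate: rotates a definite 0 into an equal superposition.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ ket0                 # state after the gate
probs = np.abs(psi) ** 2       # Born rule: measurement probabilities

print(psi)    # [0.7071+0j, 0.7071+0j]
print(probs)  # [0.5, 0.5] -- equal chance of reading out 0 or 1
```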

"Instead of basing your calculation on classical bits, now you can use all kinds of quantum mechanical phenomena, such as superposition and entanglement to perform computations," said Helmut Katzgraber, professor of physics at Texas A&M University. "By exploiting these quantum effects, you can, in principle, massively outperform standard computers."

Problems that are currently out of reach, such as simulating blood flow through the entire body, designing patient-specific drugs, decrypting data, or simulating complex materials and advanced quantum systems, would suddenly become tractable. This tantalizing possibility drives researchers to explore quantum computers.

The behavior of quantum particles is difficult to see or test in the traditional sense, so the primary way scientists probe quantum systems is through computer simulations.

Studying the theoretical stability of a quantum computer via computer simulations is exceptionally difficult. However, an international team of researchers from diverse fields of physics found common ground on a new way of determining the error tolerance of so-called topological quantum computers using high-performance computing.

The approach involves modeling the interactions of a class of exotic but well-known materials known as spin glasses. The behavior of these materials has been shown to correspond to the stability of a topologically protected quantum computer. By simulating these spin-glass-like systems on the Ranger and Lonestar4 supercomputers at the Texas Advanced Computing Center (TACC), the team has been able to explore the error threshold for topological quantum computers, a practically important property of these systems.
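What does such a simulation look like in practice? A minimal sketch (a toy illustration, not the team's production code) is a two-dimensional random-bond Ising model in which a fraction p of the couplings is flipped to antiferromagnetic, playing the role of the physical error rate, evolved with Metropolis Monte Carlo updates; if ferromagnetic order survives at a given p, errors at that rate can still be corrected:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_bonds(L, p):
    """Couplings J = +1 (ferromagnetic) or -1 with probability p (the 'error rate')."""
    Jx = rng.choice([1.0, -1.0], size=(L, L), p=[1 - p, p])  # bonds to the right
    Jy = rng.choice([1.0, -1.0], size=(L, L), p=[1 - p, p])  # bonds downward
    return Jx, Jy

def flip_cost(s, Jx, Jy, i, j, L):
    """Energy change from flipping spin (i, j) on a periodic lattice."""
    return 2 * s[i, j] * (
        Jx[i, j] * s[i, (j + 1) % L] + Jx[i, (j - 1) % L] * s[i, (j - 1) % L]
        + Jy[i, j] * s[(i + 1) % L, j] + Jy[(i - 1) % L, j] * s[(i - 1) % L, j]
    )

def magnetization(L=16, p=0.05, T=1.0, sweeps=500):
    """Metropolis Monte Carlo; returns |m|, the ferromagnetic order parameter."""
    s = np.ones((L, L))                     # start fully ordered
    Jx, Jy = make_bonds(L, p)
    for _ in range(sweeps):
        for _ in range(L * L):              # one sweep = L*L flip attempts
            i, j = rng.integers(L, size=2)
            dE = flip_cost(s, Jx, Jy, i, j, L)
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s[i, j] = -s[i, j]
    return abs(s.mean())

# Sweep the 'error rate' p and watch ferromagnetic order weaken.
for p in (0.0, 0.05, 0.10, 0.15):
    print(p, magnetization(p=p))
```

The actual studies ran far larger systems and averaged over many thousands of independent disorder samples, which is what made supercomputing resources necessary.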

"It's an inherent property of quantum mechanics that if you prepare a state, it is not going to stay put," Katzgraber said. "Building a device that is precise enough to do the computations with little error, and then stays put long enough to do enough computations before losing the information, is very difficult."


In topological quantum error correction, information is stored in a nonlocal property of the system. This can, for example, be visualized as a string that lives on the surface of a torus: it either wraps around the torus (blue) or it can be contracted to a point (red), thus denoting two distinct topological sectors. The closeup shows that loops are formed by connecting different stabilizers Z with a closed line.

Katzgraber's work focuses on understanding how many physical bits can ‘break’ before the system stops working. "We calculate the error threshold, an important figure of merit to tell you how good and how stable your quantum computer could be," he explained. Katzgraber and his colleagues have shown that the topological design can sustain an error rate of 10 percent or even higher, a value considerably larger than for traditional implementations.

So what is a topological quantum computer? Katzgraber likens the topological design to "a donut with a string through it." The string can shift around, but it takes a good deal of energy to break through the donut or to cut the string. A 0 state can then be encoded into the system when the string passes through the donut, and a 1 when it is outside the donut. As long as neither the donut nor the string breaks, the encoded information is protected against external influences such as wiggling of the string.

In the topological model, one uses many physical bits to build a single logical bit. In exchange, the logical bit is protected against outside influences like decoherence. In this way, the system acts much like the error correction on a DVD, which encodes information redundantly at multiple points so that localized damage does not destroy the data.
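The redundancy idea is easiest to see in its simplest classical cousin, a repetition code (an analogy only; the topological code is more subtle): one logical bit is copied into several physical bits and recovered by majority vote, so independent bit flips at a modest rate almost never corrupt the logical value:

```python
import random

def encode(logical_bit, n=9):
    """Store one logical bit redundantly across n physical bits."""
    return [logical_bit] * n

def add_noise(bits, error_rate):
    """Flip each physical bit independently with probability error_rate."""
    return [b ^ (random.random() < error_rate) for b in bits]

def decode(bits):
    """Majority vote recovers the logical bit unless most copies flipped."""
    return int(sum(bits) > len(bits) / 2)

random.seed(1)
trials = 10_000
failures = sum(decode(add_noise(encode(1), 0.10)) != 1 for _ in range(trials))
print(f"logical error rate: {failures / trials:.4f}")  # ~0.001, far below the 10% physical rate
```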

"It's a quantum mechanical version of a 0 and 1 that is protected to external influences," Katzgraber said.

The amount of error a topological quantum computing system can sustain corresponds to how many interactions in the underlying spin glass can be frustrated before the material stops being ferromagnetic. This insight tells the scientists how stable topological quantum computing systems can be.
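For readers who want that correspondence made concrete, a standard formulation from the spin-glass literature (added here for context; the article itself does not spell out the formula) maps the physical error rate p onto a temperature for the disordered magnet along the so-called Nishimori line, and reads off the threshold as the point where ferromagnetic order on that line disappears:

```latex
% Nishimori condition tying the error rate p to the inverse temperature \beta:
e^{-2\beta J} = \frac{p}{1 - p}
% The threshold p_c is where the ferromagnetic (error-correctable) phase
% ends along this line; for the two-dimensional random-bond Ising model
% p_c \approx 0.109, consistent with the roughly 10 percent figure above.
```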

"Topological codes are of great interest because they only require local operations for error correction and still provide a threshold," states Hector Bombin, a research collaborator at the Perimeter Institute in Canada. "In addition, they have demonstrated an extraordinary flexibility. In particular, their topological content allows one to introduce naturally several ways of computing and correcting errors that are specific to these class of codes."

According to Katzgraber, the non-intuitive approach to quantum simulations was driven by the unique nature of the collaboration between his group at Texas A&M University, ETH Zurich in Switzerland, and collaborators at the Perimeter Institute and the Universidad Complutense de Madrid in Spain.

"You have so many disparate fields of physics coming together on this one problem," Katzgraber said. "You have the statistical mechanics of disordered systems; quantum computation; and lattice gauge theories from high-energy physics. All three are completely perpendicular to each other, but the glue that brings them together is high performance computing."


Helmut Katzgraber, professor of physics at Texas A&M University.

On TACC's high-performance computing systems, Katzgraber simulated systems of more than 1,000 particles, and then repeated the calculations tens of thousands of times to statistically validate the virtual experiment. Each set of simulations required hundreds of thousands of processor hours, and altogether, Katzgraber used more than two million processor hours for his intensive calculations. Results of the simulations were published in Physical Review Letters, Physical Review A, and Physical Review E, and are inspiring further study.

Even as trailblazing companies fabricate the first quantum computing systems, many basic questions remain, most notably how to read information in and out without error. The majority of people in the field believe a prototype that significantly outperforms silicon at room temperature is decades away. Nonetheless, the insights drawn from computational simulations like Katzgraber's are playing a key role in the conceptual development of what may be the next leap forward.

"We're going to make sure that one day we'll build computers that do not have the limits of Moore's law," said Katzgraber. "With that, we'll be able to do things that right now we're dreaming about."