Testing the Limits of the Standard Model

Quantum chromodynamicists use Ranger to explore sub-atomic physics.

From the apple to the atom to the quark and beyond, physics has led us deeper and deeper into the heart of matter in a quest to determine what the universe is made of. It’s a real-world question that drives particle physicist Robert Sugar to dig in his heels and break out the big guns — supercomputers like Ranger, at the Texas Advanced Computing Center (TACC) — to help solve the numerical equations at the root of fundamental physics.

With a new generation of National Science Foundation (NSF) supercomputers in production and more on the way, Sugar, a professor of physics at the University of California, Santa Barbara and a member of the MILC (MIMD Lattice Computation) Collaboration, believes that in the next five or ten years we’ll be able to answer many of the questions about the Standard Model of physics that have stymied scientists for decades, and perhaps gather insights into the new physics beyond.

Sugar’s interest in particle physics, and specifically Quantum Chromodynamics (QCD) — the study of the strong force that holds atomic nuclei together — grew out of the rapid rewriting of physics in the 20th century. As new knowledge emerged from quantum mechanics and experiments, classical physics, until then the dominant doctrine, was discovered to have a limited “realm of applicability” that did not include atoms, nuclei, or their constituents.

Since the early 1970s, the Standard Model has been the best explanation for the way the world operates. A collection of theories that describe the strong, weak and electromagnetic forces, the model proposes that the fundamental constituents of strongly interacting matter are quarks and gluons, and that these particles bind together to form protons and neutrons (the constituents of the atomic nucleus), as well as a “particle zoo” consisting of hundreds of different types of mesons and baryons.
The Standard Model convincingly explains the world of atoms, as well as the nature of cosmic rays, radioactive decay, and high-energy collisions, but its inability to reconcile gravity with the other three fundamental forces of the universe has led many researchers to imagine a physics beyond the Standard Model, a Theory of Everything that would link together all known physical phenomena under a single schema.

[Image: A frame from an animation illustrating the typical four-dimensional structure of gluon-field configurations used in describing the vacuum properties of QCD. Image courtesy of Derek Leinweber, CSSM, University of Adelaide.]
Until a unifying theory is discovered, the formulas of the Standard Model provide the best way to predict the behavior and characteristics of matter. Known since the late 1970s, the formulas for quantum chromodynamics are simple to write down, Sugar said. “You could write them down in one or two lines. But the phenomenon is non-linear, and simple closed-form solutions don’t exist. So the only way to obtain many of the most interesting results is through numerical simulations.”

Members of the MILC Collaboration began translating these non-linear equations into computer code, written in C, in the 1980s, believing only supercomputers could solve the intractable problems. “It took a long time to gather the tools, the algorithms, the computational methods, but also for computers to become powerful enough,” Sugar recalled. “There was a long period where people were developing the tools and algorithms, but not producing results that were accurate enough to have an impact on experiments.

“But that’s no longer true,” he continued. “Over the last five years, our calculations have become more and more accurate. We’re beginning to do calculations that really have an impact on experiments.”

On Ranger, the MILC group is exploring the masses of mesons, baryons and quarks, simulating the behavior of the quark-gluon plasma, and examining the relationship between the weak and strong forces in nature.

Computational simulations are particularly important to particle physics because many of the characteristics that theorists predict are impossible to detect directly through experiments. The strong force, for instance, acts in such a way that the quarks that make up protons and neutrons cannot be separated under ordinary laboratory conditions. Their presence and characteristics must be gleaned through computational simulations on powerful HPC systems.
Supercomputers act as a crystal ball for the MILC Collaboration, helping to identify the parameters — the masses and decay rates — of sub-atomic particles without actually splitting a proton. “If we really knew these parameters and we stared into our crystal ball, that would give us insights into what kinds of fundamental theories there might be,” said Doug Toussaint, University of Arizona professor and one of Sugar’s colleagues in the MILC Collaboration.

To deduce the parameters of sub-atomic particles, QCD researchers use lattice gauge methods, in which the quantum fields of quarks are plotted on a four-dimensional space-time grid divided into as many chunks as computational power will allow. “The calculations get better and better as you make the grid spacing smaller and smaller,” Sugar said. “But if you make the lattice spacing smaller, you have to include more grid points, and that takes more computing power, which grows by the 4th power of the grid size! So you can see why we need larger computers to get greater accuracy.”

[Image: One time history contributing to the propagation of a meson in lattice QCD between the two points denoted by the gray dots. The blue path shows the trajectory of a quark–antiquark pair, the green path shows the excitation of a quark–antiquark pair from the vacuum, and the red lines indicate gluon fields. Image by J. W. Negele and SciDAC Review.]
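Sugar’s 4th-power argument can be checked with a few lines of arithmetic. The sketch below is a toy illustration (not MILC code); the box size and spacings are illustrative numbers chosen so that halving the spacing doubles the site count along each of the four space-time dimensions:

```python
# Toy illustration of lattice-QCD cost scaling (not MILC code).
# For a fixed physical box, halving the lattice spacing doubles the
# number of sites along each of the four space-time dimensions,
# so the total site count grows as the 4th power.

def lattice_sites(box_size, spacing, dims=4):
    """Number of sites in a hypercubic space-time grid."""
    per_side = round(box_size / spacing)
    return per_side ** dims

coarse = lattice_sites(box_size=3.84, spacing=0.12)  # 32**4 sites
fine = lattice_sites(box_size=3.84, spacing=0.06)    # 64**4 sites
print(fine // coarse)  # 16: halving the spacing needs 2**4 = 16x the sites
```

And that factor of 16 counts only the grid points; the cost per point of solving the equations also rises as the spacing shrinks, which is why ever-larger machines are needed.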
The MILC Collaboration is one of the largest users of open-science computing in the world and takes advantage of many of the most powerful HPC systems on the TeraGrid (a U.S. cyberinfrastructure that combines leadership-class resources at eleven partner sites to create an integrated computational network). Using more than 19 million computing hours at TACC in 2008, and millions more on computers at the Pittsburgh Supercomputing Center, the National Center for Supercomputing Applications, and the San Diego Supercomputer Center, the MILC collaboration produces lattices with the smallest spacings and quark masses to date.

Processing in parallel across more than 4,000 computer cores on Ranger, the group is simulating a 64^3 x 144 lattice grid representing a 4 Fermi box (a Fermi is a millionth of a nanometer), with a lattice spacing of 0.06 Fermi. By performing calculations at several lattice spacings, MILC researchers can extrapolate physical quantities down to zero lattice spacing, thereby obtaining results for the exact theory.

Sugar believes MILC and other users of the MILC lattices will soon be able to predict the masses and decay properties of strongly interacting particles to an accuracy of a few percent. Ultimately, the goal is to define the “realm of applicability” for the Standard Model and show where the new physics begins.

However, scientists like Sugar don’t work in a vacuum. Their work informs and is informed by experimental physicists working at particle accelerators like those at Fermilab and heavy-ion colliders like the one at Brookhaven National Laboratory. “You can’t do theory or simulations without the guidance of experiments, and the experimentalists would have a hard time knowing what to look for without some theory to guide them,” Sugar said.
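The extrapolation to zero lattice spacing can be sketched as a simple fit. This is a hedged illustration with made-up numbers, not the MILC analysis, and `continuum_extrapolate` is a hypothetical helper: for improved lattice actions the leading discretization error in a measured quantity Q typically scales like the square of the spacing a, so results at several spacings are fit to Q(a) = Q0 + c*a^2 and Q0 is read off as the continuum value.

```python
# Hedged sketch of a continuum extrapolation (illustrative numbers,
# not real MILC data). Fit measurements at several lattice spacings
# to Q(a) = Q0 + c * a**2 and return Q0, the zero-spacing value.

def continuum_extrapolate(spacings, values):
    """Least-squares fit of values = Q0 + c * a^2; returns Q0."""
    xs = [a * a for a in spacings]            # regress against a^2
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    c = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values)) / \
        sum((x - mean_x) ** 2 for x in xs)
    return mean_y - c * mean_x

# Fake measurements of some quantity at three lattice spacings (in Fermi),
# constructed to follow exact a^2 behavior around a continuum value of 1.0:
a_list = [0.12, 0.09, 0.06]
q_list = [1.0 + 0.5 * a * a for a in a_list]
print(continuum_extrapolate(a_list, q_list))  # ~1.0, the continuum value
```

In real analyses the fit includes statistical errors and higher-order terms, but the idea is the same: several finite-spacing results pin down the zero-spacing limit.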
“It’s all interconnected.”

With the NSF and the Department of Energy investing hundreds of millions of dollars in the coming decade on new, more capable supercomputers, computational science is experiencing a renaissance. The order-of-magnitude increase in computing capacity available to the open-science community enables more ambitious projects, more accurate results, faster time to completion, and the ability to put more physics into sub-atomic models.

“We, and all the other groups that depend largely on supercomputer centers, are going to see a big increase in the time available,” Toussaint said. “When that has time to sink in, we’ll all be doing more accurate work, and perhaps approaching problems that we wouldn’t have thought of trying before.”

Whether the phenomena physicists discover beyond the Standard Model support string theory, supersymmetry, a Theory of Everything, or something utterly surprising, high-performance computers like Ranger will be there, churning out numerical predictions to help answer the fundamental questions.

**********************************************************

To learn more about the MILC Collaboration's computational studies, visit their homepage or read their recent paper, "QCD equation of state with 2+1 flavors of improved staggered quarks," in Physical Review D.

Aaron Dubrow
Texas Advanced Computing Center
Science and Technology Writer