Matching Simulation to Reality in the Life of the Universe

[Animation: a simulation of the dynamics of a galaxy.]

TACC’s systems stimulate cosmological discovery: Tom Quinn is simulating the universe inside the circuitry of Lonestar, the massive supercomputer at the Texas Advanced Computing Center (TACC). Using a 33 million-particle sample set, a chunk of theoretical space containing half gas and half dark matter, the University of Washington professor of astronomy and director of the N-Body Shop is creating a computational model of the universe from the Big Bang onward. He will then test his model against observed galaxies. “Will the galaxies be disk-like, will their sizes be like the sizes we observe, will they have the right number of stars in them?” Quinn asks.

But simulating the 13.7-billion-year history of the universe is even more difficult than it sounds, especially when one needs to calculate the effect of everything on everything. N-body problems, which Quinn and his colleagues at the N-Body Shop specialize in, involve non-linear, “butterfly-effect” dynamics: the forces (gravity, electromagnetism, gas dynamics) acting on each of a large number (n) of bodies depend on every other body in the system, so they must all be calculated together. Quinn won’t be the first to simulate the universe, but he hopes to do it better than ever before.

In cosmological modeling, accuracy is everything, and without enough computing power and the right code, you can easily arrive at the wrong conclusions. To explain, Quinn offers a history lesson: “About a decade ago, people thought they understood the morphology of galaxies and clusters of galaxies. Low-resolution simulations showed that as the dark matter comes together, it mixes up thoroughly and you don’t have any dark matter substructure. Then, around 1998, we ran some high-resolution simulations and found that the dark matter doesn’t mix up, that numerical noise was causing the confusion. And that raised a whole lot of new questions and predictions.”

Even with 33 million particles, Quinn isn’t ruling out unexpected results. “The dynamics are just so non-linear that surprises show up,” he remarked.
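The “effect of everything on everything” that makes these runs so demanding can be captured in a few lines of code. The sketch below is a hypothetical, direct-summation illustration in Python, not the N-Body Shop’s parallel production software; the function name and variables are invented for this example. It shows why every particle has to “talk to” every other particle: each acceleration is a sum over all the remaining bodies, so the work grows roughly as the square of the particle count.

```python
# Toy direct-summation gravity: the naive form of the N-body problem described above.
# Illustration only; real cosmology codes typically use tree- or mesh-based
# approximations and run in parallel rather than summing every pair explicitly.
import numpy as np

G = 1.0  # gravitational constant in code units (the value depends on the unit system)

def gravitational_accelerations(positions, masses, softening=1e-3):
    """Return the acceleration on each particle due to all other particles.

    positions : (n, 3) array of particle coordinates
    masses    : (n,) array of particle masses
    softening : small length that keeps close encounters from producing huge forces
    """
    n = len(masses)
    acc = np.zeros((n, 3))
    for i in range(n):
        dr = positions - positions[i]                    # vectors from particle i to every j
        dist2 = np.sum(dr * dr, axis=1) + softening**2
        dist2[i] = np.inf                                # a particle exerts no force on itself
        inv_dist3 = dist2 ** -1.5
        acc[i] = G * np.sum((masses * inv_dist3)[:, None] * dr, axis=0)
    return acc
```

Doubling the particle count roughly quadruples the work of a direct evaluation like this, which is one reason a 33 million-particle run demands parallel machines and fast interconnects.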
The cosmological simulations Quinn runs would be impossible without the incredibly powerful systems that TACC and the TeraGrid provide. “Even our smallest simulations would take about 100,000 node hours running on 64 CPUs,” Quinn said. His problems also require tightly coupled supercomputing networks that allow fast communication among processors. “Because of the nature of our calculations, the gravity part, every particle has to talk to some part of every other particle. Having the InfiniBand low-latency, high-speed network on Lonestar was crucial.”

Quinn’s simulations begin with the presumed first principles of the universe: cosmic background radiation; an expanding volume of cooling helium and hydrogen; dark matter separating from gas and coalescing into massive stars. Prior research had taken us this far into the birth of structure in the universe, but Quinn’s simulations push on, following the same theoretical conditions and objects as they spawn galaxies, stars, black holes, and planets that interact with one another.

If one criterion for accurate modeling is the number of particles in a system (33 million in this case), a second is the number of time-steps, the marker points at which the relationships among the parts are recalculated. As Lonestar clicks off each time-step, it computes the net force of every body on every other body and updates each particle’s position, velocity, and other attributes. “We have to cover about 13 giga-years with time-steps as little as 10,000 years,” Quinn said. “That’s roughly a million time-steps over the age of the universe.” The immense numbers of time-steps and particles lend new accuracy to Quinn’s simulations and allow him to make statistical comparisons of galaxy distributions, shapes, and sizes, comparisons that determine whether a model is useful and whether its theoretical underpinnings are sound.
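Conceptually, each of those time-steps is one pass through an update loop: compute the net forces, advance every particle’s velocity and position, and repeat. The sketch below is a simplified, hypothetical kick-drift-kick (leapfrog) loop in Python with a single fixed step size; the names are invented for this example, and real cosmological codes add gas physics, star formation, and adaptive per-particle time-steps.

```python
# Schematic time-stepping loop (kick-drift-kick leapfrog) for the update cycle
# described above. Illustration only: a production run would use adaptive,
# per-particle steps, a parallel force solver, and far more physics.

def evolve(positions, velocities, masses, dt, n_steps, accel_fn):
    """Advance the system by n_steps steps of size dt (arrays are updated in place).

    accel_fn(positions, masses) must return the acceleration on every particle,
    for example the direct-summation function sketched earlier.
    """
    acc = accel_fn(positions, masses)
    for step in range(n_steps):
        velocities += 0.5 * dt * acc        # half "kick": update velocities with current forces
        positions += dt * velocities        # "drift": move particles with the new velocities
        acc = accel_fn(positions, masses)   # recompute forces at the new positions
        velocities += 0.5 * dt * acc        # second half "kick" completes the step
        # (a real code would also update gas properties, form stars, and
        #  periodically write snapshots for later visualization here)
    return positions, velocities
```

At roughly 10,000 years per step, covering about 13 billion years works out to some 1.3 million passes through a loop like this, consistent with Quinn’s “roughly a million time-steps.”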
Simulations sometimes also lead to unexpected discoveries separate from their statistical goals. Working with graduate student Rok Roskar and Brooks Postdoctoral Fellow Victor Debattista, Quinn saw his galactic simulations reveal unexpected results about the motions of migrating stars. “It was a surprise,” Quinn said. “We usually think of stars forming in one part of a galaxy and pretty much staying there. But our simulation showed that there’s actually significant radial migration. In fact, most of the stars that are beyond the gas edge were born a factor of two further in and migrated out because of non-linear gravitational interactions.”

Discoveries like this often emerge for Quinn during the visualization stage of the computing process, when data from a simulation are reinterpreted as dynamic, information-rich images. “When we run a computation, we pick a camera position from which to watch the simulation,” Quinn explained. “Then the parallel code produces JPEG [image] frames which we assemble into a movie. Particularly when we’re including all the physics of how a galaxy comes together, you have to see the movie to understand exactly how it works.”

In cosmology, as in many fields of scientific study, HPC systems are changing the way researchers approach problems, broadening the scale and sharpening the detail of computational solutions. “The big breakthroughs in theory over the past decade have been based on supercomputing,” Quinn said. “The quality of the science you produce is highly correlated to the capabilities of the computers you have access to.”

In 2007, Quinn and his team used more than 1.1 million SUs (service units, or hours of computing) on Lonestar across 935 jobs for their gravitational N-body simulations. When Ranger, the world’s most powerful supercomputer for open science research, becomes available to researchers in February 2008, Quinn will expand his project to include entire regions of the universe, enabling better modeling and a more nuanced understanding of how we got here and where we’re going.

To learn about new developments at the N-Body Shop, visit: www-hpcc.astro.washington.edu

by Aaron Dubrow
Science and Technology Writer
Texas Advanced Computing Center