Over the past several decades, cosmologists have used countless observations to devise a biography of the universe, known as the Standard Model of cosmology. The model proposes that ripples in dark matter, magnified by time and gravity, caused clumps of matter to congeal into a web of intersecting filaments, eventually spawning stars, galaxies, black holes, and large-scale structure. But like the cosmos, the Standard Model is full of black holes: missing pieces of the puzzle whose attraction astronomers find undeniable.
Michael L. Norman, Distinguished Professor of Physics at the University of California, San Diego, and director of the Laboratory for Computational Astrophysics, has spent decades using advanced mathematical and computational methods to explore the universe. With the Standard Model firmly in place, “we’re done asking the simple questions,” Norman said. “Now we’re asking more detailed questions, and for that you need detailed simulations,” which require the number-crunching ability of high-performance computers.

Norman is one of the largest users of supercomputing time in the world, with an allocation of 16 million computing hours at the Texas Advanced Computing Center (TACC) in 2008, and millions more on TeraGrid systems at the San Diego Supercomputer Center, the Pittsburgh Supercomputing Center, and the National Center for Supercomputing Applications. In the coming months, Norman will use Ranger, the world’s most powerful supercomputer for open-science research, to perform the largest cosmological simulation to date. His simulation will capture billions of years of cosmic evolution with “extreme resolution” and examine the dynamics of the Reionization Era, when space became the clear-skied place we see today.

Gassy Dusk

Using the most powerful telescopes, astronomers can look back nearly 14 billion years to get a snapshot of the early evolution of the universe. But not long after the Big Bang, our view of the cosmos gets cloudy.
A timeline of the universe, created by S. G. Djorgovski & Digital Media Center, Caltech.
“At 380,000 years, the universe has expanded and cooled to the point where the plasma has recombined into neutral atoms,” Norman explained. “That is called the era of recombination.” Neutral hydrogen and helium (the primary elements in the primordial universe) absorbed the scarce light and created a uniform, opaque intergalactic medium through which standard telescopes cannot see. It wasn’t until the first stars and galaxies formed, between 500 million and one billion years after the Big Bang, that there was enough radiated energy to ionize (or remove the electrons from) the neutral gases, a process that made the universe transparent again.

But that’s only part of the story, Norman insists. “At the end of the first era, you have ionized hydrogen and you have singly ionized helium. But a helium atom has two electrons, and that second electron is very tightly bound to the helium, so it requires more energetic photons to strip off. To produce that you need quasar radiation.” Quasars are intensely luminous regions of matter accreting onto the supermassive black hole at the heart of a young galaxy, and the cosmic proliferation of quasars happened to coincide with the Reionization Era. It stood to reason that quasars doubly ionized the helium; Norman’s simulations would show how that occurred.

There was another compelling reason to investigate the role of quasars in the reionization process. “Quasars act as beacons at the edge of the universe. When we look at a quasar’s light, we actually see the absorption due to all the gas on the line of sight,” Norman explained. “If you know how to interpret those spectra, then you actually have a core sample, billions of light years long, of the distribution of dark matter.” However, to convert the observed absorption lines into a dark matter map, scientists need to know the temperature of the gas. “And the temperature of the gas is affected by He II ionization.” In other words, the nature of the evolving intergalactic medium, and the dark matter map that it would reveal, could only be ascertained by highly detailed computational simulation.
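To get a feel for why quasar-hard radiation is needed, the ionization thresholds of the species involved can be converted into minimum photon energies and maximum wavelengths. The short sketch below is illustrative only; it uses standard textbook ionization energies and is not part of Norman’s simulation code.

```python
# Illustrative only: standard ionization energies (eV) of the species involved
# in cosmic reionization; these values are textbook constants, not simulation output.
PLANCK_EV_NM = 1239.84  # h*c in eV*nm, to convert photon energy to wavelength

ionization_energies_ev = {
    "H I   -> H II   (hydrogen ionized)":       13.6,
    "He I  -> He II  (helium singly ionized)":  24.6,
    "He II -> He III (helium fully ionized)":   54.4,
}

for transition, e_ev in ionization_energies_ev.items():
    wavelength_nm = PLANCK_EV_NM / e_ev  # minimum photon energy -> maximum wavelength
    print(f"{transition}: needs photons above {e_ev:5.1f} eV "
          f"(wavelength below {wavelength_nm:5.1f} nm)")
```

Ordinary stars emit plenty of photons above the 13.6 eV hydrogen threshold, but the 54.4 eV needed to strip helium’s second electron falls in the hard ultraviolet that quasars supply in abundance, which is why they are the natural candidates for completing helium reionization.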
This image is part of a time sequence from a reionization simulation with a volume similar to that of the upcoming run at TACC. Using Ranger, the goal is to apply adaptive mesh refinement to show the cosmic evolution with "extreme resolution."
Coding the Cosmos

To simulate the evolution of the universe, Norman employs Enzo, a hydrodynamic cosmological simulation code written by Greg Bryan under Norman’s supervision at NCSA. Enzo couples equations that capture the behavior of dark matter and gas with the gravitational dynamics of the N-body problem. But what makes the code truly special is its use of adaptive mesh refinement (AMR), a grid-based method that senses the quality of the solution everywhere in the domain and refines the mesh locally wherever more resolution is needed.

“If you don’t have the ability to track that growth and collapse of small fluctuations with your mesh refinement, you basically have unresolved blobs,” Norman said. “You don’t have any predictive capabilities in those collapsed objects, which are actually the galaxies and clusters that we want to understand.”

By coupling multiple physical algorithms with AMR, Norman can simulate cosmic structure with “extreme resolution” and produce results an order of magnitude more accurate than those of other codes. For Norman’s helium calculations on Ranger, this means simulating a 2048^3 root grid representing a volume 300 million light years on a side, and evolving the behavior of dark matter, gas, and eventually stars and galaxies over several billion years. “As we do the evolution, the code starts to refine the mesh locally,” Norman explained. “Our goal is to refine down ten levels, which gives us a 2^10 resolution increase locally: a thousand times the resolution.” (A rough sketch of that arithmetic appears below.)

But Norman doesn’t stop there. He wanted to add another crucial element to his simulation: radiative transfer, in which the radiation emitted by stars and quasars heats the intergalactic gas and changes its characteristics, a crucial factor in the Reionization Era. It wasn’t until the advent of Ranger, however, that it became possible to add the radiative transfer equations to Enzo, Norman insisted. “It’s a very computationally intensive piece of physics that really requires Ranger’s computing power to do.”

When the simulation is completed this summer, it will be the largest cosmological calculation ever performed. The single run will generate 200 terabytes of data, which will be analyzed and visualized over the coming year to create a detailed picture of reionization in action. “We’re using supercomputers to roll back the clock of the universe and look at how it developed its remarkable characteristics,” Norman said. Unlike the earliest epochs of the universe, for which computer simulations are nearly all scientists have, the Reionization Era is covered by a solid body of observational data that Norman will consult and use to constrain his simulation. Though the broad outlines of the cosmic biography are known, Norman’s work aims to fill in the gaps and discover precisely how and why the universe evolved the way it did.

“As a researcher, I’m really excited,” Norman exclaimed. “My style is to push the envelope in terms of doing big simulations that go where no man has gone before, both in scale and in complexity, and Ranger is the place where that is happening right now.”
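Putting the quoted numbers together gives a sense of what “extreme resolution” means in practice. The back-of-the-envelope sketch below is not Norman’s code; it simply combines the 2048^3 root grid, the 300-million-light-year box, and ten levels of refinement, assuming a refinement factor of two per level as implied by the 2^10 figure.

```python
# Back-of-the-envelope sketch using the figures quoted in the article;
# assumes a refinement factor of 2 per AMR level, as implied by "2^10".
box_size_ly = 300e6        # simulation volume: 300 million light years on a side
root_grid = 2048           # 2048^3 root grid
max_levels = 10            # "refine down ten levels"
refinement_factor = 2      # each level doubles the local resolution

root_cell_ly = box_size_ly / root_grid
finest_cell_ly = root_cell_ly / refinement_factor**max_levels

print(f"Root-grid cell size:  {root_cell_ly:,.0f} light years")
print(f"Finest cell size:     {finest_cell_ly:,.0f} light years "
      f"({refinement_factor**max_levels}x finer)")
print(f"Effective grid span:  {root_grid * refinement_factor**max_levels:,} cells per side")
```

In other words, the same run can follow structure from hundreds of millions of light years across down to cells a little over a hundred light years wide in the most refined regions.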
**********************************************************

To learn more about simulating the universe, read Norman’s chapter, “Simulating Cosmological Evolution with ENZO,” in “Petascale Computing: Algorithms and Applications.”

Aaron Dubrow
Science and Technology Writer
Texas Advanced Computing Center