Atlas of the Shifting Earth

Leading researchers use Ranger to realistically simulate plate tectonics

Story Highlights:

  • Researchers using TACC’s Ranger supercomputer have modeled the processes responsible for continental drift and plate tectonics in greater detail than any previous simulation.

  • The group’s simulations are the first to explicitly include faults and plastic failure on a global scale, features that are thought to be key to geodynamic behavior.

  • The advent of petascale computing systems, combined with advances in adaptive mesh refinement, allowed the research team to model multi-scale mantle convection down to 2.5 km resolution, a necessary step toward accurate modeling of the Earth.

 

Tectonic plates slide across the surface of the Earth at slower than glacial pace. Bumping against each other, pulling apart, and diving below the surface, the plate motions drive the Earth's underlying dynamism, causing earthquakes, volcanic eruptions, and mountain building.

What causes the movement of these massive plates? What prevents the plates from moving faster?

Mantle convection is the slow creeping motion of the Earth's rocky mantle in response to gravitationally unstable variations in its density. The process is responsible for continental drift and plate tectonics, but it had never been successfully modeled in such detail — until now.

Snapshot of three time steps from a mantle convection simulation showing dynamically-adapted refinement near rising thermal plumes and thermal boundary layers.

An interdisciplinary team of scientists is using the Ranger supercomputer at the Texas Advanced Computing Center (TACC) to create the most detailed simulation of the earth’s mantle to date. The team includes Carsten Burstedde, Omar Ghattas, Georg Stadler, Tiankai Tu, and Lucas Wilcox at the University of Texas at Austin; Laura Alisic, Mike Gurnis, and Eh Tan at the California Institute of Technology; and Shijie Zhong at the University of Colorado at Boulder.

“We’re attempting, for the first time, to simulate the fact that the Earth has rigid blocks, and that they slide by other rigid blocks at a fine scale,” said Mike Gurnis, John E. and Hazel S. Smits Professor of Geophysics at the California Institute of Technology and Director of the Seismological Laboratory.

The plates are 100 kilometer-thick limbs of convection, the outer skin of rising and falling cells of the slowly creeping but solid mantle. Geophysicists have shown that convective motion within the mantle and buoyancy within the lithosphere pull and push the plates across the surface of the globe.

“We’re pretty sure we know where the forces that drive plate tectonics come from. It’s the weight of the slabs in the cold, convective down-welling,” Gurnis said. “What we don’t know is what resists the plate motion.”

This has led to an ongoing debate in geophysics, one that Gurnis and the rest of the team are trying to settle using computational methods.

Where plates slide past one another along thrust faults, they bend and weaken. The researchers believe this "plastic failure" provides resistance and could play an important role in limiting the speed of plate movements. The group's model is the first to explicitly include faults and plastic failure on a global scale, features that are thought to be key to geodynamic behavior.

A descriptive model of the workings of plate tectonics has eluded researchers until now primarily because of the multiscale nature of the problem, said Omar Ghattas, the John A. and Katherine G. Jackson Chair in Computational Geosciences at The University of Texas at Austin. Some plates are more than 10,000 kilometers across, but the zone of permanent deformation near the faults where plates meet is only about a kilometer wide, and it is at this scale that much of the action happens.

“A global simulation that can drill down to the one kilometer scale has never been achieved before,” said Ghattas. “The best global models were previously able to simulate only down to scales of tens of kilometers.”
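A rough back-of-envelope calculation shows why a uniform global mesh at the one-kilometer scale is out of reach. The sketch below is illustrative only, using standard published values for the Earth's mean radius and the depth of the core-mantle boundary:

```python
import math

# Rough element count for a uniform 1 km global mantle mesh.
# Standard values: Earth's mean radius ~6371 km, core-mantle boundary at ~3480 km radius.
R_SURFACE = 6371.0  # km
R_CMB = 3480.0      # km

# Volume of the mantle shell, in cubic kilometers
mantle_volume = (4.0 / 3.0) * math.pi * (R_SURFACE**3 - R_CMB**3)

# One element per cubic kilometer at uniform 1 km resolution
elements_1km = mantle_volume / 1.0**3

print(f"mantle volume ≈ {mantle_volume:.2e} km^3")
print(f"uniform 1 km mesh ≈ {elements_1km:.2e} elements")
```

The result is on the order of a trillion elements, far beyond what any machine can hold at once, which is the motivation for adaptive refinement.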

However, by applying parallel Adaptive Mesh Refinement (AMR), a technique perfected by the group, the team has produced a model of mantle convection and plate movement that captures both the large-scale and fine-grained details of geologic movement.

With the new model in place, “there are incredible opportunities to study the physics of the earth in a way that has never been achieved before,” Gurnis said.

“This research will give us a much better understanding of the force balance on the plates. We’re getting a major upgrade of our understanding of the physics of the Earth.”

— Mike Gurnis, John E. and Hazel S. Smits Professor of Geophysics at the California Institute of Technology and Director of the Seismological Laboratory

Adapting to Massive Problems

With AMR, scientists have found a way to focus the attention and detail of the computational model where it is most needed, since it’s not possible to include all trillion cubic kilometers of the Earth’s mantle in a single 1 km-scale simulation.
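The core idea of AMR can be illustrated with a toy example. This is a minimal one-dimensional sketch, not the team's Rhea code: it recursively splits intervals wherever a synthetic "temperature" field varies sharply, mimicking how AMR concentrates elements near thermal plumes and plate boundaries while leaving smooth regions coarse:

```python
import math

def field(x):
    # Synthetic profile with a sharp front near x = 0.5,
    # a stand-in for a thermal plume or plate boundary
    return math.tanh((x - 0.5) * 200.0)

def refine(a, b, tol, max_depth, depth=0):
    """Return a list of cells (a, b), splitting any cell whose field
    jump exceeds tol until the depth limit is reached."""
    if depth >= max_depth or abs(field(b) - field(a)) < tol:
        return [(a, b)]
    m = 0.5 * (a + b)
    return (refine(a, m, tol, max_depth, depth + 1)
            + refine(m, b, tol, max_depth, depth + 1))

cells = refine(0.0, 1.0, tol=0.1, max_depth=12)
uniform = 2 ** 12  # cells a uniform mesh would need at the finest spacing

print(f"{len(cells)} adaptive cells vs {uniform} uniform cells")
```

A few dozen adaptive cells capture the same sharp feature that would take thousands of uniform cells, the same economy that lets the team's global model reach kilometer scales only where the physics demands it.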

In 2007, the group received a Petascale Applications (PetaApps) grant from the National Science Foundation (NSF) to develop a geodynamics code that could scale up to a sustained petaflop, or a thousand trillion floating point operations per second.

The PetaApps program brings together multidisciplinary teams to find fresh solutions to some of science’s most important and currently intractable problems. “To develop these codes, you need sustained efforts at the interfaces between computer science, applied mathematics, scientific computing, and the domain physics areas,” Ghattas said. “It would not have happened had it not been for NSF’s PetaApps program.”

Tectonic plate motion resulting from a mantle convection simulation with nonlinear rheology. Very fine mesh resolution (2.5 km) used to resolve fine-scale flow near plate boundaries. Adaptive mesh refinement (AMR) results in three orders of magnitude fewer mesh elements than a uniform mesh would require. The computed plate velocities are indicated by the vector plot, while the surface color distribution represents viscosity magnitudes.

The group built their parallel AMR code from a clean sheet of paper, developing next-generation algorithms that are designed to scale up to hundreds of thousands of cores. This is often the only way to scale up complex science codes to petascale systems, Ghattas said.

The group’s paper highlighting AMR simulations across the entirety of Ranger’s 62,464 cores was a finalist for the Gordon Bell Prize, HPC’s highest honor, at the Supercomputing 2008 conference. That work formed the basis of what has become the preeminent AMR-based solid earth geophysics code, Rhea.

“We’re performing simulations where the mechanical properties of the system are in the model in a realistic way,” said Ghattas. “We’ve never had faults and plastic failure in global simulations.”

The models have produced spectacular simulations of the whole planet with plates resolved at scales down to 2.5 km. The model will likely provide evidence that can answer the question of what determines the rate and direction of plate movements.

The research also points the way to more effective uses of AMR techniques, which are beneficial for many scientific problems that are multiscale in nature.

“Problems that are characterized by a wide range of length and time scales abound in computational science in every area of research,” Ghattas said. “This is a distinctive feature of many of the problems that are being tackled today. So, absolutely, this AMR method will have an impact.”

Team members from the Mantle Convection PetaApps project with visualizations from the Rhea mantle convection code on TACC's Stallion tiled display. From L to R: Omar Ghattas (UT-Austin), Lucas Wilcox (UT-Austin), Carsten Burstedde (UT-Austin), Georg Stadler (UT-Austin), Michael Gurnis (Caltech). Team members not pictured: Laura Alisic (Caltech), Eh Tan (Caltech), Tiankai Tu (UT-Austin), and Shijie Zhong (Colorado). Photo by Leigh Brodie.

Inverse Perspectives

The model the team developed has higher resolution and more realistic physics than previous models. To truly derive an accurate earth model, however, it will be necessary to assimilate historical and present-day seismic observations and plate motions into the simulations to estimate uncertain parameters.

Using observational data to estimate uncertain parameters in a simulation, called “inverse modeling,” is a frontier problem throughout the computational sciences. “Now that we have big enough machines and the forward models are becoming more mature, we’re in a position to tackle the inverse problem,” Ghattas said. “But to solve the inverse problem, you have to solve the forward problem many times. If you need a sustained petaflop machine for the forward problem, you’re going to need a sustained exaflop machine for the inverse problem.”

Marching toward this goal, the group is developing the computational methods and software to make the most effective use of the massive HPC systems presently coming online. These tools will then be offered to the NSF TeraGrid and geophysics communities, where they can enable the search for a better understanding of the dynamics of the Earth.

“Our ability to understand the physics is dramatically improving, and this research will give us a much better understanding of the force balance on the plates,” Gurnis said. “We’re getting a major upgrade of our understanding of the physics of the Earth.”