Tomorrow's Forecast: Clear with a Chance of Tremors

CyberShake computations at TACC produce dynamic ground motion map for Southern California

Story Highlights:

  •  The Southern California Earthquake Center (SCEC) used the Ranger supercomputer at TACC to explore the effect of earthquakes on structures in Southern California over long time-scales.

  •  Their computational simulations form the basis for new probabilistic seismic hazard analysis maps used by the U.S. Geological Survey, which impact building codes and zoning.

  •  The new hazard analysis maps predict more shaking in heavily populated regions of the Los Angeles basin.

Imagine if the nightly news featured an earthquake forecast alongside your local weather outlook.

The CyberShake project, based at the Southern California Earthquake Center (SCEC), is advancing geophysics toward that goal. Five years into a giant, multi-institutional effort led by SCEC director Thomas Jordan, CyberShake 3.0 is producing maps that predict how much ground motion can be expected throughout the L.A. basin over the next 50 years.

The CyberShake predictions, called seismic hazard maps, have the potential to preserve thousands of lives and save billions of dollars in the event of a catastrophic earthquake. Emergency response managers count on these predictions to determine which areas will be hardest hit in a quake and where to deploy resources. Building engineers rely on them as well when designing structurally sound buildings.

More than 220 CyberShake physics-based PSHA hazard curves were assimilated into a background UCERF2 (2008) and NGA-based (2008) PSHA map. The new analyses tend to raise hazard estimates in the Los Angeles and Ventura Basins and reduce hazard estimates for mountainous regions in southern California.

“We want buildings to last at least 50 years,” explained Philip Maechling, information technology architect for SCEC, which is associated with the University of Southern California. “We ask, ‘What are the peak ground motions that this building, or this site, will experience over that timeframe?’”

To answer this question, seismologists have developed a technique called probabilistic seismic hazard analysis (PSHA). PSHA has traditionally been based on attenuation models, in which historical records are extrapolated to create maps of unstable zones. In Southern California, however, the wide range of soil and rock types makes it difficult to produce accurate maps this way. So, for nearly a decade, seismologists have been using numerical algorithms and computer simulations, grounded in their knowledge of earthquake physics, to predict future ground motions in far greater detail than traditional methods allow.
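
In its standard textbook form (a general formulation of PSHA, not a detail drawn from this article), the calculation sums contributions from every possible rupture to the annual rate at which a ground-motion measure IM exceeds a level x at a site, then converts that rate to a probability over a 50-year window under a Poisson assumption:

    \lambda(\mathrm{IM} > x) \;=\; \sum_{k=1}^{N} \nu_k \, P(\mathrm{IM} > x \mid \mathrm{rupture}_k),
    \qquad
    P_{50\,\mathrm{yr}}(\mathrm{IM} > x) \;=\; 1 - e^{-50\,\lambda(\mathrm{IM} > x)}

Here \nu_k is the annual rate of rupture k from the rupture forecast, and P(IM > x | rupture_k) comes either from an attenuation relationship or, as in CyberShake, from physics-based simulations of the rupture.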

Computational PSHA combines the results of millions of virtual earthquake simulations into a map that tells a builder what is likely to happen at a given site in the future. To create their latest maps for CyberShake, SCEC teamed with the Texas Advanced Computing Center (TACC), whose massive supercomputer, Ranger, enabled next-generation hazard predictions more comprehensive than anything created before.

Mapping Potential Quakes

PSHA requires two inputs: a velocity or earth structure model, and an earthquake rupture forecast. The first input uses information about the geology of the area being studied to determine how earthquakes would interact with the soil and substructure of the region: how fast seismic waves would travel, and how much shaking they would cause. The second input involves identifying where all the active faults in a region are, and determining the probability of each fault rupturing.
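
In code, those two inputs might be represented roughly as follows. This is a minimal sketch in Python; the class names and fields are illustrative assumptions, not SCEC's actual data formats:

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class VelocityModel:
        """3-D earth-structure grid: how fast seismic waves travel through the region."""
        vp: np.ndarray       # P-wave speed (m/s) on a lat x lon x depth grid
        vs: np.ndarray       # S-wave speed (m/s)
        density: np.ndarray  # rock/soil density (kg/m^3)

    @dataclass
    class Rupture:
        """One entry in an earthquake rupture forecast."""
        fault_name: str
        magnitude: float
        annual_rate: float   # expected ruptures per year for this scenario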

“CyberShake is a very integrative project,” Maechling said. “It combines a lot of other activities at SCEC to produce these probabilistic maps that are an interface between seismologists and the public.”

The difficulty of creating accurate PSHA maps lies in simulating all the likely earthquakes at a given site. Since every earthquake has a number of potential outcomes, depending on the slip direction, the epicenter, and other factors, SCEC had to simulate 415,000 earthquakes to properly characterize southern California. Each site then had to be evaluated against every earthquake that could affect it, and those contributions summed to determine the potential ground motion for each location on the hazard map.
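
A stripped-down version of that bookkeeping for a single site might look like the sketch below, which assumes a Poisson model of earthquake occurrence; the rupture rates and simulated motions are placeholders rather than CyberShake data:

    import numpy as np

    def hazard_curve(ruptures, im_levels, years=50.0):
        """Combine per-rupture simulation results into exceedance probabilities at one site.

        ruptures : list of (annual_rate, simulated_motions) pairs, where
                   simulated_motions holds the peak ground motion at the site
                   for each variation of that rupture (hypocenter, slip, ...).
        im_levels: ground-motion levels at which to evaluate exceedance.
        """
        annual_rate_of_exceedance = np.zeros_like(im_levels, dtype=float)
        for annual_rate, motions in ruptures:
            motions = np.asarray(motions, dtype=float)
            # Fraction of this rupture's variations that exceed each level.
            frac_exceeding = (motions[None, :] > im_levels[:, None]).mean(axis=1)
            annual_rate_of_exceedance += annual_rate * frac_exceeding
        # Poisson assumption: convert annual rates into a probability over `years`.
        return 1.0 - np.exp(-years * annual_rate_of_exceedance)

    # Toy example: two ruptures with a handful of simulated variations each.
    ruptures = [
        (0.01,  [0.12, 0.30, 0.25, 0.18]),   # roughly one event per 100 years
        (0.002, [0.45, 0.60, 0.38]),         # roughly one event per 500 years
    ]
    levels = np.array([0.1, 0.2, 0.3, 0.4, 0.5])   # e.g., spectral acceleration in g
    print(hazard_curve(ruptures, levels))          # 50-year exceedance probabilities

A full hazard map repeats this calculation for a grid of sites across the region.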

“We’ve been working on CyberShake for over five years now,” Maechling reminisced. “The first year, we managed two hazard curves and it took six months to run them. The third year, we did another big run and got about twenty sites. The scientists were pleased, but they really needed a map.”

Employing an average of 4,400 processors simultaneously (and as many as 14,540 processors) over a period of 50 days on Ranger, SCEC researchers created a map consisting of 200 sites, one hundred times more than the group's first efforts in 2005. In total, the project used more than 20 million computing hours in 2008-2009.

With 200 data points, the researchers were able to present a map that exceeded the detail and accuracy of traditional attenuation maps.

“CyberShake essentially enables one to obtain a more customized hazard estimate at a site,” said Ned Field, a U.S. Geological Survey seismologist who leads the development of official earthquake forecast models for California. “During the 1994 Northridge earthquake, it was noted that neighboring areas of Santa Monica experienced very different levels of shaking. This was due to a lens effect where seismic waves were focused on certain areas and de-focused from others. Attenuation-relationship-based hazard studies would never predict these details, but CyberShake gives us hope.”

The new simulations mainly agreed with the attenuation maps, but there were crucial differences as well. CyberShake predicted significantly more ground motion in the most densely populated areas of the L.A. basin, potentially causing greater damage than previously anticipated.

For now, the CyberShake map is being used to improve the results of the attenuation map by combining the results of the two models. However, over time, researchers believe computational simulations will become the dominant predictor of seismic hazard.

The practical impact of these improvements is enormous. “Much work remains with respect to verifying that the results are reliable,” said Field. “Nevertheless, once we gain confidence in the methodology, CyberShake will enable both more reliable loss estimates for insurance purposes and better building codes for avoiding catastrophic collapse.”

Daily Predictions from Innovative Computing

Philip Maechling, information technology architect for the Southern California Earthquake Center

But static PSHA maps are not the end goal, according to Maechling. In the near future, CyberShake will be integrated into a system that dynamically incorporates real-world seismic changes as they happen, producing daily ground motion forecasts.

“In the next five years, just like weather forecasts, we’re going to have ground motion forecasts,” said Maechling, “and the technique should be applicable anywhere in the world.”

Above and beyond the geophysical applications of the research, the methods developed by the CyberShake team will have far-reaching effects throughout high-performance computing.

CyberShake ran tens of thousands of jobs on Ranger, comprising more than 180 million tasks, making it one of the highest-capacity projects to ever run on an open HPC system. The developers of the project created new workflows and automation tools to allow many small jobs to be combined into a larger job, to ease the path through the Ranger job queue, and to guide the output files to SCEC’s storage system.
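
The bundling idea can be sketched in a few lines of Python. The sketch below is purely illustrative; the function names are invented here, and CyberShake's production workflow tools are far more elaborate:

    from concurrent.futures import ProcessPoolExecutor

    def run_one_task(task_id):
        """Stand-in for one small serial computation (e.g., one rupture variation)."""
        return task_id * task_id  # placeholder work

    def bundle(tasks, bundle_size):
        """Group many small tasks into a few large submissions to ease the queue."""
        return [tasks[i:i + bundle_size] for i in range(0, len(tasks), bundle_size)]

    def run_bundle(task_ids):
        """One cluster job: spread its bundled tasks across the cores it was allocated."""
        with ProcessPoolExecutor() as pool:
            return list(pool.map(run_one_task, task_ids))

    if __name__ == "__main__":
        tasks = list(range(10_000))       # stand-in for millions of independent serial tasks
        for job in bundle(tasks, 2_500):  # submit 4 large jobs instead of 10,000 tiny ones
            results = run_bundle(job)     # in production, each bundle would be one batch job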

Though much of scientific computing is performed in parallel, there are many applications that run serially. For those problems, the methodology developed by the CyberShake group is an important innovation.

“There are a significant number of scientists who would like this capability,” said Maechling. “They have existing scientific codes that they’d like to run at large-scale on a cluster, but they don’t know how. The techniques that we’re developing could be used to help them in their research as well.” Other fields that could be impacted by the automated use of HPC include atmospheric research, high-energy physics, and biomedical research.

With its insights into the seismic future of the L.A. basin and improved methods for using HPC, the CyberShake project is a perfect example of how computational science impacts society.

“I see the USGS and emergency management agencies looking at our simulations and saying, ‘We need to take into consideration these new results coming out of SCEC,’” said Maechling. “That’s exciting.”


To learn more about CyberShake and the Southern California Earthquake Center, visit: www.scec.org