Since 2017, EQSIM—one of several projects supported by the US Department of Energy’s Exascale Computing Project (ECP)—has been breaking new ground in efforts to understand how seismic activity affects the structural integrity of buildings and infrastructure.
Accomplishing this, however, has presented several challenges, noted EQSIM principal investigator David McCallen, a senior scientist in Lawrence Berkeley National Laboratory’s (Berkeley Lab) Earth and Environmental Sciences Area and director of the Center for Civil Engineering Earthquake Research at the University of Nevada, Reno.
“The prediction of future earthquake motions that will occur at a specific site is a challenging problem because the processes associated with earthquakes and the response of structures are very complicated,” he said. “When the earthquake fault ruptures, it releases energy in a very complex way, and that energy manifests and propagates as seismic waves through the earth. In addition, the earth is very heterogeneous and the geology is very complicated. So when those waves arrive at the site or piece of infrastructure you are concerned with, they interact with that infrastructure in a very complicated way.”
A Team Effort
Over the last 5 years, using both the Cori and Perlmutter supercomputers at Berkeley Lab and the Summit system at DOE’s Oak Ridge National Laboratory, the EQSIM team has focused primarily on modelling earthquake scenarios in the San Francisco Bay Area. These supercomputing resources helped them create a detailed, regional-scale model that includes all of the necessary geophysics modelling features, such as 3D geology, earth surface topography, material attenuation, nonreflecting boundaries, and fault rupture.
“We’ve gone from simulating this model at 2–2.5 Hz at the start of this project to simulating more than 300 billion grid points at 10 Hz, which is a huge computational lift,” McCallen said.
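That lift is easy to underestimate: resolving higher frequencies means resolving shorter wavelengths, which requires proportionally finer grid spacing in all three dimensions. The back-of-envelope sketch below illustrates the scaling; the domain size, minimum shear-wave speed, and points-per-wavelength figures are illustrative assumptions rather than EQSIM’s actual settings, and SW4’s real meshes use local refinement, so absolute counts will differ.

```python
# Back-of-envelope scaling of grid-point count with the highest resolved
# frequency in a regional wave-propagation model. The domain size, minimum
# shear-wave speed, and points-per-wavelength values below are illustrative
# assumptions, not EQSIM's actual settings.

def grid_points(f_max_hz,
                domain_km=(120.0, 80.0, 30.0),   # assumed regional box (x, y, depth)
                v_min_m_s=500.0,                 # assumed slowest shear-wave speed
                pts_per_wavelength=8.0):         # common finite-difference rule of thumb
    """Estimate the grid points needed to resolve frequencies up to f_max_hz."""
    lambda_min = v_min_m_s / f_max_hz            # shortest wavelength to capture, metres
    h = lambda_min / pts_per_wavelength          # required grid spacing, metres
    nx, ny, nz = (d * 1000.0 / h for d in domain_km)
    return nx * ny * nz

low, high = grid_points(2.5), grid_points(10.0)
print(f"~{low:.1e} points at 2.5 Hz vs ~{high:.1e} points at 10 Hz ({high / low:.0f}x)")
```

Quadrupling the resolved frequency from 2.5 Hz to 10 Hz multiplies the grid-point count by roughly 64, before accounting for the smaller time steps the finer grid also demands.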
Other notable achievements of this ECP project include:
- Making important advances to the SW4 geophysics code, including how it is coupled to local engineering models of the soil and structure system.
- Developing a schema for handling the huge datasets used in these models (a rough sketch of the idea follows this list). “For a single earthquake we are running 272 TB of data, so you have to have a strategy for storing, visualising, and exploiting that data,” McCallen said.
- Developing a visualisation tool that allows very efficient browsing of this data.
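As a rough illustration of what such a schema has to contend with, the sketch below writes dense surface ground-motion output into a chunked, compressed HDF5 file so that individual time histories can be pulled out without reading whole files. The dataset name, shapes, and chunk sizes are invented for illustration and are not the EQSIM team’s actual layout.

```python
# Minimal sketch of a chunked, compressed HDF5 layout for dense surface
# ground-motion output, so terabyte-scale results can be browsed without
# reading entire files. All names, shapes, and chunk sizes are illustrative
# assumptions, not the EQSIM team's schema.
import numpy as np
import h5py

n_x, n_y = 2000, 1500        # assumed grid of surface output points
n_steps, n_comp = 20000, 3   # assumed time steps and motion components

with h5py.File("ground_motion_demo.h5", "w") as f:
    dset = f.create_dataset(
        "surface_velocity",
        shape=(n_x, n_y, n_steps, n_comp),
        dtype="float32",
        chunks=(50, 50, 500, n_comp),   # small spatial tiles x time blocks
        compression="gzip",
    )
    dset.attrs["dt_seconds"] = 0.0025   # illustrative solver time step
    dset.attrs["units"] = "m/s"
    # A real workflow would stream slices in as the solver produces them;
    # here we fill a single chunk-aligned tile with synthetic data.
    dset[0:50, 0:50, 0:500, :] = np.random.rand(50, 50, 500, n_comp).astype("float32")

# Pulling one station's time history later touches only the chunks that hold it:
with h5py.File("ground_motion_demo.h5", "r") as f:
    trace = f["surface_velocity"][10, 10, :, 0]
```

The chunked layout is what makes efficient browsing possible: extracting one station’s record reads only the handful of compressed chunks that contain it, rather than the full multi-terabyte dataset.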
“The development of the computational workflow and how everything fits together is one of our biggest achievements, starting with the initiation of the earthquake fault structure all the way through to the response of the engineered system,” McCallen said. “We are solving one high-level problem but also a whole suite of lower-level challenges to make this work. The ability to envision, implement, and optimise that workflow has been absolutely essential.”
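To make the tail end of that workflow concrete, the sketch below shows, in heavily simplified form, the kind of hand-off it describes: a ground-motion time history (here synthetic) driving a single-degree-of-freedom structural model through Newmark average-acceleration time integration. The oscillator properties, the input record, and the one-way hand-off are illustrative assumptions, not EQSIM’s actual SW4-to-engineering-model coupling, which works with full motions and detailed models of the soil and structure system.

```python
# Heavily simplified stand-in for the last link in the workflow: a ground-
# motion record driving a damped single-degree-of-freedom oscillator via
# Newmark average-acceleration integration. The record and oscillator
# properties are illustrative assumptions, not EQSIM's engineering models.
import numpy as np

def sdof_response(ground_accel, dt, period_s=0.5, damping=0.05):
    """Relative displacement history of a damped SDOF oscillator."""
    omega = 2.0 * np.pi / period_s
    m, k = 1.0, omega**2                 # unit mass; stiffness set by the period
    c = 2.0 * damping * omega * m        # viscous damping coefficient
    gamma, beta = 0.5, 0.25              # Newmark average-acceleration parameters

    n = len(ground_accel)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    p = -m * ground_accel                # effective seismic load
    a[0] = (p[0] - c * v[0] - k * u[0]) / m

    a1 = m / (beta * dt**2) + gamma * c / (beta * dt)
    a2 = m / (beta * dt) + (gamma / beta - 1.0) * c
    a3 = (1.0 / (2.0 * beta) - 1.0) * m + dt * (gamma / (2.0 * beta) - 1.0) * c
    k_hat = k + a1

    for i in range(n - 1):
        p_hat = p[i + 1] + a1 * u[i] + a2 * v[i] + a3 * a[i]
        u[i + 1] = p_hat / k_hat
        v[i + 1] = (gamma / (beta * dt)) * (u[i + 1] - u[i]) \
            + (1.0 - gamma / beta) * v[i] + dt * (1.0 - gamma / (2.0 * beta)) * a[i]
        a[i + 1] = (u[i + 1] - u[i]) / (beta * dt**2) \
            - v[i] / (beta * dt) - (1.0 / (2.0 * beta) - 1.0) * a[i]
    return u

# Synthetic stand-in for a simulated ground-motion time history.
dt = 0.005
t = np.arange(0.0, 20.0, dt)
accel = 0.3 * 9.81 * np.sin(2.0 * np.pi * 1.5 * t) * np.exp(-0.2 * t)
peak_drift = np.abs(sdof_response(accel, dt)).max()
print(f"peak relative displacement: {peak_drift:.3f} m")
```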
None of this could have happened without the contributions of multiple partners across a spectrum of science, engineering, and mathematics, he emphasised. Earthquake engineers, seismologists, computer scientists, and applied mathematicians from Berkeley Lab and Lawrence Livermore National Laboratory formed the multidisciplinary, closely integrated team necessary to address the computational challenges.
Read the full article, written by Kathy Kincade, a senior writer on the Berkeley Lab Computing Sciences Area communications team.