Extending the Spatiotemporal Scales of Physics-based Seismic Hazard Analysis

(OCI-1440085)

Project Report for Performance Period:

1 September 2014 – 30 August 2015

Principal Investigator:

Thomas H. Jordan – University of Southern California – Earth Sciences



Co-PI:

Jacobo Bielak – Carnegie Mellon University – Civil Engineering

Kim Bak Olsen – San Diego State University – Department of Geological Sciences

Yifeng Cui – San Diego Supercomputer Center – GeoComputing Laboratory


1.  What are the major goals of the project?

SCEC researchers are using our Petascale Computing Resource Allocation (PRAC) on NCSA Blue Waters to develop physics-based seismic hazard analysis methods and to apply these methods to better understand earthquake processes and seismic hazards. SCEC’s PRAC project targets three primary objectives:


O1. Validation of high-frequency simulations (up to 8 Hz) against seismic recordings of historical earthquakes, such as the 1994 Northridge earthquake (M 6.7).
O2. Computation of a 2-Hz CyberShake hazard model for the Los Angeles region as input to the development of high-resolution urban seismic hazard maps by the USGS and SCEC.
O3. High-frequency (4 Hz) simulation of an M7.8 earthquake on the San Andreas fault that will revise the 2008 Great California ShakeOut scenario and improve the risk analysis developed in detail for this scenario.
During Year 1 of SCEC’s PRAC project, we made significant progress towards these objectives: we integrated new physics into our simulation codes, ran wave propagation simulations at 4 Hz (advancing Objective O1), and performed a CyberShake hazard model calculation at 1 Hz (advancing Objective O2).
2.  What was accomplished under these goals (you must provide information for at least one of the 4 categories below)?
Major Activities:
A major activity of the SCEC PRAC research teams during Project Year 1 was to integrate advanced physics into our existing high-performance wave propagation software. Physical effects that were negligible at lower frequencies become increasingly important at simulation frequencies above 1 Hz. Accordingly, we integrated more realistic physics into both of our wave propagation codes, AWP-ODC and Hercules.
Hercules wave propagation code development continues under the leadership of J. Bielak and R. Taborda. We completed the implementation of a GPU module and a performance-monitoring module in Hercules, one of the parallel codes in our High-F simulation software framework. Hercules is a multifaceted finite-element solver capable of simulating either elastic or anelastic wave propagation. To make effective use of hybrid parallel architectures such as NCSA’s Blue Waters, Hercules was refactored to utilize NVIDIA GPU accelerators. Specifically, the stiffness-contribution, attenuation-contribution, and displacement-update modules of the code now use the CUDA SDK. These operations comprise the majority of the physics calculations performed by the solver when solving the anelastodynamic equations. A number of techniques were employed to maximize computational efficiency on the GPU. The compute kernels use memory coalescing to increase memory-subsystem throughput while maintaining data-structure compatibility with the existing CPU-based physics calculations. The kernels are also structured to minimize control divergence and thread synchronization, both of which often degrade performance. Host-to-device communication employs pinned memory for maximum transfer speeds, and the implementation supports fully asynchronous operation, with the stiffness- and attenuation-contribution calculations overlapped with the solver’s I/O processing. The GPU implementation computes, at runtime, the optimal kernel launch configuration for the device’s compute capability, maximizing GPU occupancy. The software supports an arbitrary number of GPU devices per compute node and uses a simple load-balancing scheme to assign each host CPU to a GPU device on the node. The new implementation has been successfully executed on the XK7 partition of Blue Waters, where the GPU solver demonstrated an average speedup of 2.3 times over the CPU solver.

We also developed a numerical scheme, based on a fictitious domain method, for simulating ground motion in the presence of realistic surface topography of the Earth’s crust. We showed that by adopting a non-conforming octree-based meshing scheme associated with a virtual representation of the topography, we can obtain accurate representations of ground motion while preserving the advantageous multi-resolution, cubic-shaped finite elements inherent to Hercules. We implemented the new scheme in Hercules and used it to simulate ground motion in the Aburrá Valley in the Colombian Andes, an earthquake-prone region that exhibits moderate-to-strong surficial irregularities. We found that 3D topography can significantly influence ground motion in this application, with amplification factors as high as 5 at some locations and reduction factors up to 2.

Finally, we incorporated acoustic wave propagation and gravity effects into Hercules in order to capture the generation and offshore propagation of tsunamis triggered by suboceanic earthquakes. We addressed the coupled nature of earthquakes and the resultant tsunamis through a case study of the 2011 Tohoku-Oki event, focusing on the generation and offshore propagation of the tsunami waves using publicly available velocity and source models. Initial results are consistent with the observed arrival times, wave heights, and the location where the tsunami first reached the coast of Japan.
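The following is a minimal sketch (not Hercules source code; the kernel and variable names are hypothetical) of two of the GPU techniques described above: runtime launch-parameter selection via the CUDA occupancy API, and pinned host memory with an asynchronous stream so that device work can overlap host-side I/O.

```cuda
// Sketch of runtime launch-parameter selection and asynchronous overlap.
// Kernel and variable names are illustrative, not taken from Hercules.
#include <cuda_runtime.h>
#include <stdio.h>

// Hypothetical displacement-update kernel: u += dt * v, one node per
// thread, with unit-stride (coalesced) memory access.
__global__ void update_displacement(double *u, const double *v,
                                    double dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        u[i] += dt * v[i];
}

int main(void)
{
    const int n = 1 << 22;        // example mesh size
    const double dt = 1.0e-4;     // example time step

    double *h_u, *d_u, *d_v;
    cudaMallocHost(&h_u, n * sizeof(double));  // pinned host buffer
    cudaMalloc(&d_u, n * sizeof(double));
    cudaMalloc(&d_v, n * sizeof(double));
    cudaMemset(d_u, 0, n * sizeof(double));
    cudaMemset(d_v, 0, n * sizeof(double));

    // Ask the runtime for the block size that maximizes occupancy on
    // whatever device (compute capability) this process was assigned.
    int min_grid = 0, block = 0;
    cudaOccupancyMaxPotentialBlockSize(&min_grid, &block,
                                       update_displacement, 0, 0);
    int grid = (n + block - 1) / block;

    cudaStream_t stream;
    cudaStreamCreate(&stream);

    // Asynchronous launch and copy: the host thread is free to perform
    // solver I/O here while the stream drains.
    update_displacement<<<grid, block, 0, stream>>>(d_u, d_v, dt, n);
    cudaMemcpyAsync(h_u, d_u, n * sizeof(double),
                    cudaMemcpyDeviceToHost, stream);
    /* ... overlapped host-side I/O would run here ... */
    cudaStreamSynchronize(stream);

    printf("block=%d grid=%d u[0]=%g\n", block, grid, h_u[0]);
    cudaFree(d_u); cudaFree(d_v); cudaFreeHost(h_u);
    cudaStreamDestroy(stream);
    return 0;
}
```

Computing the launch configuration at runtime, rather than hard-coding it, is what lets a single binary achieve high occupancy on whichever GPU generation a given compute node provides.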
As part of our software engineering efforts, Hercules was moved from a private to a public GitHub repository. With this change the code is now offered openly to users and is on a path to becoming community software with use beyond SCEC activities. The code is currently used by researchers at six universities in the U.S., Mexico, Colombia, and France, as well as by researchers at the U.S. Geological Survey. The code’s GitHub wiki has also been improved, and now includes documentation on how to use both the GPU and CPU versions of the code.
We have also made important advances in the underlying physics of our AWP software. We implemented a non-associated Drucker-Prager nonlinear rheology, following the return-map algorithm, in the scalable AWP-ODC finite difference wave propagation code to model wave propagation resulting from the ShakeOut source description. The implementation was verified against the SCEC/USGS dynamic earthquake rupture code verification exercise (Harris et al., 2009). When using the kinematic ShakeOut source up to 0.5 Hz, PGVs obtained for a linear viscoelastic medium exceed 1 m/s inside a large area along the main waveguide connecting the San Bernardino basin and the Los Angeles basin (LAB), with PGVs above 2 m/s in isolated patches. When plasticity based on realistic cohesion models is taken into account, PGVs remain generally below 1 m/s in the LAB, corresponding to reductions of up to 50%. Plasticity also reduces directivity. Additional fully dynamic simulations with the code confirm these findings.

We have also implemented frequency-dependent attenuation, Q(f), in both the CPU and GPU versions of AWP-ODC. Preliminary results indicate that the efficient coarse-grained approach is accurate for Q as low as 20 over a bandwidth of two decades. Tests for the 2008 Mw 5.4 Chino Hills earthquake indicate that Q(f) models generally fit the strong-motion data better than constant-Q models at frequencies above 1 Hz. A GPU version of our AWP software, called AWP-ODC-GPU, has matured rapidly, and after careful evaluation in 2014 we are now using this highly scalable and efficient code for our high-frequency deterministic, CyberShake reciprocity-based, and nonlinear plasticity simulations.
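The stress-correction step at the heart of this rheology can be sketched as follows: a generic, perfectly plastic Drucker-Prager radial return applied at a single grid point (tension-positive sign convention; variable names are illustrative and not taken from the AWP-ODC source).

```cuda
// Generic perfectly plastic Drucker-Prager radial return for one grid
// point. c = cohesion (Pa), phi = friction angle (rad). Illustrative only.
#include <math.h>
#include <stdio.h>

__host__ __device__ void dp_return(double s[6], double c, double phi)
{
    // s = {sxx, syy, szz, sxy, sxz, syz}
    double sm = (s[0] + s[1] + s[2]) / 3.0;              // mean stress
    double d0 = s[0] - sm, d1 = s[1] - sm, d2 = s[2] - sm;

    // Second invariant of the deviatoric stress tensor.
    double J2 = 0.5 * (d0*d0 + d1*d1 + d2*d2)
              + s[3]*s[3] + s[4]*s[4] + s[5]*s[5];
    double tau = sqrt(J2);

    // Drucker-Prager yield stress; compression (sm < 0) raises the limit.
    double Y = fmax(0.0, c * cos(phi) - sm * sin(phi));

    if (tau > Y && tau > 0.0) {      // stress outside the yield surface
        double r = Y / tau;          // radial scaling back to the surface
        s[0] = sm + r * d0;  s[1] = sm + r * d1;  s[2] = sm + r * d2;
        s[3] *= r;           s[4] *= r;           s[5] *= r;
    }
}

int main(void)
{
    // Toy state: 10 MPa compression, 12 MPa shear, c = 5 MPa, phi = 30 deg.
    double s[6] = { -10e6, -10e6, -10e6, 12e6, 0.0, 0.0 };
    dp_return(s, 5e6, 30.0 * acos(-1.0) / 180.0);
    printf("sxy after return: %.3e Pa\n", s[3]);  // scaled back to yield
    return 0;
}
```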
In early 2015, the SDSC/SCEC high-performance AWP-ODC-GPU code was publicly recognized by NVIDIA Corp. The HPGeoC team at the San Diego Supercomputer Center, led by Yifeng Cui, received the 2015 NVIDIA Global Impact Award for development and use of the AWP-ODC-GPU code, in recognition of the code’s outstanding performance on GPU-enabled supercomputers, including Blue Waters and Titan, and its broad application impact, including its use for SCEC CyberShake calculations. The award included a one-time cash prize; full details are available on the NVIDIA web site at: http://blogs.nvidia.com/blog/2015/03/06/gpus-help-prepare-for-the-big-one/
Specific objectives:

Significant Results:

Key Outcomes or Other Accomplishments:
Two key SCEC PRAC project accomplishments during Project Year 1 were: (1) 4 Hz deterministic verification using three codes, and (2) CyberShake Study 15.4, the first 1 Hz CyberShake hazard model.
4 Hz Deterministic Verification: Working toward our PRAC project goal O1 of 8 Hz validation simulations, during Project Year 1 we performed 4 Hz deterministic verification using three different simulation codes. This work was done on Blue Waters as part of SCEC’s High-Frequency (High-F) ground motion simulation research, in which computational researchers incorporate advanced simulation capabilities into SCEC ground motion simulation codes and then apply the improved codes to the estimation of ground motions for use in constructing seismic hazard models. During PRAC Year 1, we produced simulation results for a series of 0–4 Hz verification and validation exercises using three codes: AWP-ODC, AWP-RWG, and Hercules. The first two codes implement a 4th-order finite difference (FD) method; the third implements a 2nd-order finite element (FE) method.

Our verification exercises have followed an incremental approach to model complexity: we started with a simple half-space model, continued with a smooth 1D crustal model, and are now working with the SCEC 3D velocity model CVM-S4.26. For validation purposes, we chose the 2014 Mw 5.1 La Habra, California, earthquake as a case study, for which we aim to predict the ground motion and evaluate the simulation results against recorded data. The simulation domain covers a surface area of 180 km x 135 km and extends to a depth of 62 km; the model covers the entire greater Los Angeles basin and other structural features in its vicinity. We use a point source to represent the earthquake rupture, with a mechanism derived from strong-motion data and a slip-rate time function obtained from a dynamic rough-fault model with frequency content up to 5 Hz. Initial verification steps include comparisons of results obtained with the different codes for simulations under elastic and anelastic conditions. Initial anelastic simulations used frequency-independent Q models, as implemented in each code; we also explored the significance of a frequency-dependent Q model. Our results for the verification exercises at the various complexity levels allowed us to identify the numerical parameters necessary for the codes to yield synthetics with a good level of agreement in the half-space and 1D crustal model cases. Figure 1 shows simulated 4 Hz ground velocity seismograms at two different sites computed with the three wave propagation codes.
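As one simple illustration of how code-to-code agreement can be quantified in verification exercises like these, the sketch below computes a relative L2 misfit between two synthetic seismograms on a common time grid. This is a generic metric with hypothetical names, not the project’s actual goodness-of-fit software, which uses more elaborate criteria.

```cuda
// Illustrative helper for code-to-code verification: relative L2 misfit
// between two synthetic seismograms sampled on the same time grid.
#include <math.h>
#include <stdio.h>

double relative_l2_misfit(const double *ref, const double *test, int nt)
{
    double num = 0.0, den = 0.0;
    for (int i = 0; i < nt; i++) {
        double d = test[i] - ref[i];
        num += d * d;              // energy of the difference
        den += ref[i] * ref[i];    // energy of the reference trace
    }
    return (den > 0.0) ? sqrt(num / den) : 0.0;
}

int main(void)
{
    enum { NT = 1000 };
    const double pi = 3.14159265358979323846;
    double ref[NT], test[NT];
    for (int i = 0; i < NT; i++) {             // toy damped sinusoid
        double t = 0.01 * i;
        ref[i]  = sin(2.0 * pi * t) * exp(-0.5 * t);
        test[i] = 1.02 * ref[i];               // 2% amplitude difference
    }
    printf("relative L2 misfit = %.4f\n",      // prints ~0.02
           relative_l2_misfit(ref, test, NT));
    return 0;
}
```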
CyberShake Study 15.4: During Year 1 of our PRAC project, we continued development of the CyberShake platform. CyberShake is capable of generating the very large suites of simulations (100M+ synthetic seismograms) needed for physics-based probabilistic seismic hazard analysis (PSHA). Starting in April 2015, our SCEC research team used the NCSA Blue Waters and OLCF Titan supercomputers to perform CyberShake Study 15.4, which was completed within 38 days, by the end of May 2015. This computation produced a Los Angeles region seismic hazard model, shown in Figures 2a-2b, that doubled the maximum seismic frequency represented in the Los Angeles urban seismic hazard model, from 0.5 Hz to 1 Hz. Seismic hazard curves were derived from large ensembles of seismograms at frequencies below this maximum for 336 surface sites distributed across the Los Angeles region. The new probabilistic model uses refined earthquake rupture descriptions through revisions to the conditional hypocenter distributions and the conditional slip distributions. The calculation used CVM-S4.26, the best available southern California 3D velocity model, which was validated and improved using ALCF Mira.

In order to complete our first 1 Hz CyberShake simulations within weeks, rather than months, we divided the computational work between OLCF Titan and NCSA Blue Waters. We defined the distributed calculation using scientific workflows that automatically executed our parallel GPU-intensive calculations on Titan, executed GPU parallel jobs and CPU-based post-processing on Blue Waters, and transferred scientific data products back to SCEC systems for visualization and archiving. Combined use of these distributed computer systems (Blue Waters, Titan, USC) enabled us to complete a CyberShake hazard model within the practical operational limits of our research group. For CyberShake Study 15.4, SCEC’s GPU-based anelastic wave propagation code AWP-ODC-GPU computed the Strain Green Tensors (SGTs), and CPU-based post-processing calculations synthesized over 300 million seismograms. We integrated all three systems (Blue Waters, Titan, and USC) into our workflow management system, using our workflow tools to manage job execution, manage data, and provide fault tolerance. Approximately 200 TB of SGT data was transferred from Titan to Blue Waters automatically as part of the workflow. On Titan, the GPU SGT implementation is 6.3 times more efficient than the CPU implementation, which saved us 2 million node-hours over the course of the study.
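At its core, the reciprocity-based post-processing stage convolves stored strain Green tensors with each rupture variation’s moment-rate functions. The following schematic shows the representation-theorem sum for one ground-motion component; the function name, array names, and fixed sizes are hypothetical, not CyberShake code.

```cuda
// Schematic of reciprocity-based seismogram synthesis: one ground-motion
// component is the sum, over the six independent moment-tensor components,
// of the moment-rate history convolved with the stored SGT for that
// component. Names and sizes are illustrative only.
#include <stdio.h>
#include <string.h>

enum { NT = 2048 };   // samples per time series (illustrative)

void synthesize_seismogram(const double sgt[6][NT],   // SGTs at the site
                           const double mdot[6][NT],  // moment-rate history
                           double seis[NT],           // output component
                           int nt, double dt)
{
    memset(seis, 0, nt * sizeof(double));
    for (int c = 0; c < 6; c++)            // six moment-tensor components
        for (int t = 0; t < nt; t++)       // discrete time convolution
            for (int tau = 0; tau <= t; tau++)
                seis[t] += mdot[c][tau] * sgt[c][t - tau] * dt;
}

int main(void)
{
    static double sgt[6][NT], mdot[6][NT], seis[NT];
    sgt[0][0]  = 1.0;     // toy impulse response
    mdot[0][0] = 1.0e18;  // toy moment-rate sample
    synthesize_seismogram(sgt, mdot, seis, NT, 0.05);
    printf("seis[0] = %g\n", seis[0]);
    return 0;
}
```

Because the SGTs for a site are computed once and then reused across every rupture variation affecting that site, this post-processing step is what makes synthesizing hundreds of millions of seismograms tractable.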
The CyberShake 15.4 model provides new seismic hazard information of interest to a broad set of CyberShake users, including seismologists, utility companies, and the civil engineers responsible for California building codes. The new model, which samples the complete Uniform California Earthquake Rupture Forecast, will be registered into the USGS Urban Seismic Hazard Mapping Project.
3.  What opportunities for training and professional development has the project provided?

Our SCEC PRAC project has provided opportunities for training and professional development to project members and their students, by providing computing professionals and student researchers with access to Blue Waters, and by providing travel funds to participate in Blue Waters meetings. Graduate students (and post-docs) on the project include:


Shima Azizzadeh szzzdhrd@memphis.edu (engineering grad student)

Kenneth Duru kduru@stanford.edu (geoscience grad student)

Md Monsurul Huda mmhuda@memphis.edu (engineering grad student)

Naeem Khoshnevis nkhshnvs@memphis.edu (engineering grad student)

Yu-Pin Lin yupinlin@usc.edu (geoscience post-doc)

Kevin Milner kmilner@usc.edu (geoscience grad student)

Shiying Nie nie@rohan.sdsu.edu (geoscience grad student)

Dmitry Pekurovsky dmitry@sdsc.edu (computer science grad student)

Daniel Roten droten@sdsc.edu (geoscience post-doc)

William Savran wsavran@gmail.com (geoscience grad student)

Zheqiang Shi zshi@projects.sdsu.edu (geoscience post-doc)

Patrick Small patrices@usc.edu (computer science grad student)

Xin Song xinsong@usc.edu (geoscience grad student)

Yongfei Wang yow004@ucsd.edu (geoscience grad student)

Kyle Withers quantumkylew@aol.com (geoscience grad student)

Qian Yao yaoqian19910216@gmail.com (geoscience grad student)



4.  How have the results been disseminated to communities of interest?
SCEC PRAC project PI Thomas H. Jordan discussed the SCEC PRAC project and its results at an Executive Branch meeting held at the White House on July 30, 2015, as part of the National Strategic Computing Initiative announcement.
The results of our research are disseminated through journal and conference publications, presentations, and feedback to ground motion modelers, including the following:


  1. Azzizadeh-Roodpish, S., Khoshnevis, N. and Taborda, R. (2014). Evaluation of the Southern California Velocity Models through Simulation and Validation of Multiple Historical Events, in Proc. SCEC Annu. Meet., Poster CME 080, September 2014, Palm Springs, CA.

  2. Cui, Y., (2014) “Scaled-up Physics-based Simulations for Earthquake System Science”, Monash Undergraduate Research Projects Abroad (MURPA) Seminar, University of Queensland, Oct 16, 2014, Tutorial

  3. Cui, Y., & Daniel Roten, (2014) INCITE, ICTP Symposium on HPC and Data-Intensive Applications in Earth Sciences, Trieste, Nov 13-14, 2014

  4. Cui, Y. (2014) “Advances in Earthquake Simulations at Extreme Scale”, International Co-design Workshop, Guangzhou, China, Nov 6-8, 2014

  5. Cui, Y., Thomas Jordan, Kim Olsen, Philip Maechling, Scott Callaghan et al. (2014), “Extreme Scale Physics-Based Simulations for Earthquake System Science”, US-China-Germany E-Science and Cyberinfrastructure (CHANGES) Workshop, Beijing, Sept 10-12, 2014

  6. Cui, Y., Thomas Jordan, Kim Olsen, Philip Maechling, Scott Callaghan et al., “SCEC Application Performance and Software Development”, OLCF Users Meeting, Oak Ridge, July 22-24, 2014

  7. Duru, K., Dunham, E.M., and S.A. Bydlon (2014), Dynamic earthquake rupture simulations on nonplanar faults embedded in 3D geometrically complex, heterogeneous Earth models, Southern California Earthquake Center Annual Meeting, Palm Springs, CA.

  8. Duru, K., Dunham, E.M., and S.A. Bydlon (2014), Dynamic earthquake rupture simulations on nonplanar faults embedded in 3D geometrically complex, heterogeneous Earth models, American Geophysical Union Fall Meeting, San Francisco, CA.

  9. Duru, K., Dunham, E.M., and S.A. Bydlon (2014), A stable and efficient numerical method for the elastic wave equation and dynamic earthquake rupture simulations in 3D heterogeneous media and complex geometry, International Conference on Spectral and High Order Methods (ICOSAHOM), Salt Lake City, UT.

  10. Gill, D., Small, P., Maechling, P., Jordan, T. H., Shaw, J., Plesch, A., Chen, P., Lee, E.-J., Taborda, R. and Callaghan, S. (2014). UCVM: Open Source Software for Understanding and Delivering 3D Velocity Models, in Proc. SCEC Annu. Meet., September, Palm Springs, CA.

  11. Gill, D., Small, P., Maechling, P. J., Jordan, T. H., Shaw, J. H., Plesch, A., Chen, P., Lee, E.-J., Taborda, R., Olsen, K. B. and Callaghan, S. (2014). UCVM: Open Source Software for Understanding and Delivering 3D Velocity Models, in Abstr. AGU Fall Meet., December, San Francisco, CA.

  12. Gill, D., Small, P., Taborda, R., Lee, E.-J., Olsen, K. B., Maechling, P. J. and Jordan, T. H. (2015). Standardized Access to Seismic Velocity Models Using the Unified Community Velocity Model (UCVM) Software, in Abstr. SSA Annu. Meet., April , Pasadena.

  13. Karaoglu, H. and Bielak, J. (2014). Coupled finite element simulation of earthquakes and tsunami inception—A case study of the 2011 Tohoku-Oki earthquake and tsunami, in Abst. AGU Fall Meet., December 2014, San Francisco, CA

  14. Karaoglu, H. and Bielak, J., (2015). Finite element simulation of earthquake ground motion with coupling tsunamis in large domains, in Abstr. 26th IUGG General Assembly, June-July, Prague, Czech Republic.

  15. Karaoglu, H. Bielak, J. (2015). Coupled Finite Element Simulation of Earthquakes and Tsunami Inception: A Case Study of the 2011 Tohoku-Oki Earthquake and Tsunami, in Abstr. SSA Annu. Meet., April, Pasadena.

  16. Khoshnevis, N. and Taborda, R. (2014). Sensitivity of Ground Motion Simulation Validation to Signal Processing and GOF Criteria, in Proc. SCEC Annu. Meet., September 2014, Palm Springs, CA.

  17. Khoshnevis, N. and Taborda, R. (2015). Evaluation of Attenuation Models Q-Vs Relationships used in Physics-Based Ground-Motion Earthquake Simulation, in Abstr. SSA Annu. Meet., April 2015, Pasadena.

  18. Isbiliroglu, Y., Taborda, R. and Bielak, J. (2014). Multiple Structure-Soil-Structure Interaction and Coupling Effects in Building Clusters, in Proc. SCEC Annu. Meet., September 2014, Palm Springs, CA.

  19. Maechling, P., et al., (2015) Petascale Research in Earthquake System Science, Blue Waters Symposium, May 10-13, 2015, Sunriver Resort, Sunriver Oregon

  20. Restrepo, D. L., Bielak, J. and Taborda, R. (2014). Effects of Topography on Ground Motion in Southern California and the Wasatch Front Regions, in Abstr. AGU Fall Meet., December 2014, San Francisco, CA.

  21. Roten D., Yifeng Cui, (2015) “GPU-powered Simulations of Seismic Waves in Nonlinear Media”, GTC’15, San Jose, April 4-8, 2015

  22. Roten, D., K.B. Olsen, S.M. Day, and Y. Cui. Can we rely on linear elasticity to predict long-period ground motion? SCEC Annual Meeting, Palm Springs, Sept 2014, invited talk, Tuesday 8:00am.

  23. Small, P., Taborda, R., Bielak, J. and Jordan, T. (2014). GPU acceleration of Hercules, in Proc. SCEC Annu. Meet., September 2014, Palm Springs, CA.

  24. Taborda, R., Gill, D., Small, P., Silva, F., Maechling, P. L., Bielak, J. and Jordan, T. H. (2014). Integration of a 3D Low-Frequency Simulation with the SCEC Broadband Platform, in Proc. SCEC Annu. Meet., September 2014, Palm Springs, CA.

  25. Taborda, R. (2014). Validation of Physics-Based (Deterministic) Ground Motion Earthquake Simulations and Evaluation of the Southern California Seismic Velocity Models, Presentation at the 10th US-Japan Joint Meeting USGS and UJNR Panel on Earthquake Research, October 13–15, Sendai, Japan.

  26. Withers, K.B., K.B. Olsen, Z. Shi, and S. Day. High-complexity deterministic Q(f) simulation of the 1994 Northridge Mw6.7 earthquake, SCEC Annual Meeting, Palm Springs, Sept 2014, poster # 66.

  27. Withers, K.B., K.B. Olsen, Z. Shi, and S. Day. High-complexity deterministic Q(f) simulation of the 1994 Northridge Mw6.7 earthquake, Abstract S31C-4433 presented at 2014 Fall Meeting, AGU, San Francisco, Calif., 15-19 Dec.

  28. Withers, K.B., K.B. Olsen, Z. Shi, and S.M. Day. Broadband (0-8 Hz) ground motion variability from ensemble simulations of the 1994 Mw6.7 Northridge earthquake including rough fault descriptions and Q(f), Seism. Res. Lett. 86 2B, 653.


List of countries visited for research on the project
Norway (Y. Cui, 2014)

Italy (Y. Cui, 2014)

China (Y. Cui, 2014)

Australia (Y. Cui, 2014)

Japan (R. Taborda, October 2014)

Czech Republic (J. Bielak, June-July 2015)


SCEC project members have presented and discussed our work in a series of geoscience and computational science meetings and workshops during the project, including the following:


  • XSEDE15: Scientific Advancements Enabled by Enhanced Cyberinfrastructure, July 26-30, 2015, St. Louis, MO

  • Engineering Mechanics Institute (EMI) Conference, June 16-19, 2015, Stanford Engineering Campus, Palo Alto, CA

  • Blue Waters Symposium, May 10-13, 2015, Sunriver Resort, Sunriver, OR

  • Seismological Society of America Spring Meeting, April 21-23, 2015, Pasadena, CA

  • SI2 PI Meeting, Feb 17-19, 2015, Arlington, VA

  • SCEC Workshop on Soil-Structure Interaction of Complex Systems, Jan 29, 2015, University of Southern California

  • AGU Fall Meeting, December 15-19, 2014, San Francisco, CA

  • SCEC Committee for Utilization of Ground Motion Simulations Meeting, December 3, 2014

  • SC14, Nov 17-20, 2014, New Orleans, LA

  • SCEC Annual Meeting, Sept 8-10, 2014, Palm Springs, CA

  • SCEC Earthquake Ground Motion Simulation Meeting, Sept 7, 2014, Palm Springs, CA

  • National Conference on Earthquake Engineering, July 21-25, 2014, Anchorage, AK

Our PRAC team has also worked with communications experts at NCSA, ALCF, OLCF, NVIDIA, USC, and elsewhere to develop scientific summaries of our recent accomplishments and their impact. Several press articles, including NCSA articles, described SCEC’s use of Blue Waters. Links to these online press articles include the following:


- NSF Discoveries Article about SCEC HPC in 2015

http://nsf.gov/discoveries/disc_summ.jsp?cntn_id=136013


- Articles About SCEC Research Presented at the 2015 Blue Waters Symposium

http://www.ncsa.illinois.edu/news/story/2015_blue_waters_symposium_highlights_successes_looks_to_the_future_of_supe

http://insidehpc.com/2015/06/petascale-research-in-earthquake-science-on-blue-waters/

- Articles about SDSC NVIDIA Award

http://blogs.nvidia.com/blog/2015/08/31/gpu-quake-hazard/

http://www.hpcwire.com/off-the-wire/sdsc-researchers-awarded-nvidia-2015-global-impact-award/

https://www.olcf.ornl.gov/2015/04/08/olcf-user-earns-nvidia-award-for-gpu-accelerated-earthquake-simulations/

http://ucsdnews.ucsd.edu/pressrelease/sdsc_researchers_win_nvidias_2015_global_impact_award

http://blogs.nvidia.com/blog/2015/03/16/global-impact-award-winner/

http://blogs.nvidia.com/blog/2015/03/06/gpus-help-prepare-for-the-big-one/


- Washington Post Article on UCERF3

http://www.washingtonpost.com/news/speaking-of-science/wp/2015/03/11/california-has-increasingly-powerful-earthquakes-to-look-forward-to/


- NBC Los Angeles Interview with Thomas H. Jordan

http://www.nbclosangeles.com/video/#%21/on-air/as-seen-on/Reports-Examines-the-Future-of-California-Earthquakes/295839301


- NCSA Article on SCEC Seismic Hazard and SCEC Workflows

http://www.ncsa.illinois.edu/news/story/do_the_wave

http://www.ncsa.illinois.edu/news/story/streamlining_simulation
5. What do you plan to do during the next reporting period to accomplish the goals?

During Project Year 2, we will work towards our three main objectives O1, O2, and O3, as defined in our original proposal (and above). To reach these objectives, we need to double the current maximum simulated frequency for both our validation earthquake simulations and our CyberShake simulations. During the next year, we will increase the maximum simulated frequencies from 4 Hz to 8 Hz for individual validation simulations, and from 1 Hz to 2 Hz for CyberShake ensemble simulations.
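Doubling the simulated frequency is computationally demanding: assuming a uniform mesh with grid spacing proportional to the minimum resolved wavelength and a CFL-limited explicit time step, the cost of these codes scales roughly as the fourth power of the maximum frequency:

```latex
h \propto \frac{1}{f_{\max}}, \qquad \Delta t \propto h
\quad\Longrightarrow\quad
\mathrm{cost} \;\propto\; \frac{1}{h^{3}\,\Delta t} \;\propto\; f_{\max}^{4}
```

so each doubling (4 Hz to 8 Hz, or 1 Hz to 2 Hz) implies roughly a 16-fold increase in computational work for a fixed domain.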


In our scenario ground motion simulations, we expect to move past verification to validation of our simulations of the La Habra earthquake using the CVM-S4.26 model. Using this realistic 3D velocity model, we will use our updated high-performance wave propagation codes to reproduce observations from this earthquake, for which strong-motion data are available at more than 350 stations within the simulation domain. We will also continue development of our GPU versions of AWP-ODC and Hercules.
For our CyberShake research, with the outstanding efficiency of our GPU-based CyberShake code, we are now pushing the maximum simulated frequency of our CyberShake calculations towards 1.5 Hz or 2.0 Hz, which will provide usable PSA1.0 amplitude measurements not available from the current 1 Hz calculations.
Our CyberShake research will also involve developing engineering metrics of interest to an American Society of Civil Engineers (ASCE) working group that is interested in using our simulated ground motion results in future building code recommendations. To support verification, validation, and analysis of CyberShake results, we will add the calculation of engineering parameters, specifically Risk-Targeted Maximum Considered Earthquake Response Spectra (MCER), to our workflow calculations, ensuring these parameters are computed routinely for every CyberShake hazard model.
6.  Products - What has the project produced?

Products produced by the project include open-source software distributions and simulation data sets, including the following:




  1. Unified Community Velocity Model (UCVM) Community Software is available for download at http://scec.usc.edu/scecpedia/UCVM

  2. The AWP-ODC CPU and GPU code is available for download on request from an SVN repository at SDSC.

  3. The Hercules software is hosted in a public GitHub repository (see Section 2) and is openly available for download.

  4. CyberShake Study 15.4 hazard model is available for access and download from a SCEC web site.



7.  Participants & Other Collaborating Organizations - Who has been involved?


  1. Thomas Jordan [PI] tjordan@usc.edu

  2. Shima Azizzadeh szzzdhrd@memphis.edu

  3. Jacobo Bielak jbielak@cmu.edu

  4. Samuel Bydlon sbydlon@stanford.edu

  5. Scott Callaghan scottcal@usc.edu

  6. Po Chen pchen@uwyo.edu

  7. Dong Ju Choi dchoi@sdsc.edu

  8. Yifeng Cui yfcui@sdsc.edu

  9. Steven Day sday@mail.sdsu.edu

  10. Kenneth Duru kduru@stanford.edu

  11. Brittany Erickson berickson@projects.sdsu.edu

  12. David Gill davidgil@usc.edu

  13. Robert Graves rwgraves@usgs.gov

  14. Md Monsurul Huda mmhuda@memphis.edu

  15. Yigit Isbiliroglu yisbilir@andrew.cmu.edu

  16. Gideon Juve gideon@isi.edu

  17. Haydar Karaoglu haydarkaraoglu@gmail.com

  18. Naeem Khoshnevis nkhshnvs@memphis.edu

  19. Andriy Kot andriy@illinois.edu

  20. Yu-Pin Lin yupinlin@usc.edu

  21. Philip Maechling maechlin@usc.edu

  22. Kevin Milner kmilner@usc.edu

  23. Dawei Mu dmu@uwyo.edu

  24. Shiying Nie nie@rohan.sdsu.edu

  25. Kim Olsen kolsen@geology.sdsu.edu

  26. Dmitry Pekurovsky dmitry@sdsc.edu

  27. Efecan Poyraz epoyraz@ucsd.edu

  28. Hari Radhakrishnan hari.radhakrishnan@gmail.com

  29. Dorian Restrepo drestrep@andrew.cmu.edu

  30. Daniel Roten droten@sdsc.edu

  31. Mats Rynge rynge@isi.edu

  32. William Savran wsavran@gmail.com

  33. Zheqiang Shi zshi@projects.sdsu.edu

  34. Liwen Shih shih@uhcl.edu

  35. Fabio Silva fsilva@usc.edu

  36. Patrick Small patrices@usc.edu

  37. Xin Song xinsong@usc.edu

  38. Ricardo Taborda ricardo.taborda@memphis.edu

  39. Karan Vahi vahi@isi.edu

  40. Yongfei Wang yow004@ucsd.edu

  41. Kyle Withers quantumkylew@aol.com

  42. Heming Xu h1xu@sdsc.edu

  43. Qian Yao yaoqian19910216@gmail.com


Other collaborators or contacts who have been involved:

  1. Gregory Beroza - beroza@stanford.edu - Professor Geophysics

  2. Eric Dunham - dunham@stanford.edu - Professor Geophysics

  3. Haydar Karaoglu - haydarkaraoglu@gmail.com - Postdoc at the Institut de Physique du Globe de Paris

  4. John Yu - johnyu@usc.edu - Research Staff Computer Science

  5. Po Chen - pchen@uwyo.edu - Professor Geophysics

  6. Dhabaleswar (DK) Panda, OSU (MVAPICH Compiler Support)

  7. Zizhong Chen, UCR (AWP-MIC Development Collaboration)

  8. Xing Cai, Simula Lab/Norway (GPU-based Application Collaboration)



8.  Impact - What is the impact of the project? How has it contributed?

What is the impact on the development of the principal discipline(s) of the project?
The SCEC PRAC project has used physics-based computational models and observation-based 3D earth structure models to show how high performance computing can improve seismic hazard forecasts. The project translates basic research into practical products for reducing risk and improving community resilience. Specifically, this work helped reduce the total uncertainty in long-term hazard models, which has important practical consequences for the seismic provisions in building codes and especially for critical-facility operators. For example, PG&E faces very high costs (potentially tens of billions of dollars) in seismic retrofit expenses associated with its immovable facilities, including many dams across California and its Diablo Canyon Nuclear Power Plant on the central California coast.

Ground motion prediction equations (GMPEs), in common use by engineering organizations, predict the logarithmic intensity of ground shaking as a deterministic value conditioned on a set of explanatory variables, plus a normally distributed random variable with a standard deviation sigma_T. The latter accounts for the unexplained variability in the ground motion data used to calibrate the GMPE and is typically 0.5-0.7 in natural log units. Reducing this residual or “aleatory” variability is a high priority for seismic hazard analysis, because the probabilities of exceedance at high hazard values go up rapidly with sigma_T, adding costs to the seismic design of critical facilities to account for the prediction uncertainty. However, attempts to decrease sigma_T by incorporating more explanatory variables into the GMPEs have been largely unsuccessful.

An alternative is to employ physics-based earthquake simulations that properly account for source directivity, basin effects, directivity-basin coupling, and other 3D complexities. We have explored the theoretical limits of this approach through an analysis of large ensembles of simulations generated for the Los Angeles region by SCEC’s CyberShake project using the new tool of averaging-based factorization (ABF, Wang & Jordan, BSSA, 2014). The residual variance obtained by applying GMPEs to the CyberShake dataset matches the frequency-dependence of sigma_T obtained for the GMPE calibration dataset. The ABF analysis allows us to partition this variance into uncorrelated components representing source, path, and site effects. We show that simulations can potentially reduce sigma_T by about one-third, which could lower the exceedance probabilities at high hazard levels by orders of magnitude. Realizing this gain in forecasting probability would have a broad impact on risk-reduction strategies, especially for critical facilities such as large dams, nuclear power plants, lifelines, and energy transportation networks.
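To make the dependence on sigma_T explicit (standard GMPE notation, consistent with the description above):

```latex
\ln Y = f(\mathbf{x}) + \epsilon, \qquad \epsilon \sim N(0,\sigma_T^{2}),
\qquad
P(Y > y \mid \mathbf{x}) \;=\; \Phi\!\left(\frac{f(\mathbf{x}) - \ln y}{\sigma_T}\right)
```

At high hazard levels, ln y lies several standard deviations above the median prediction f(x), so the exceedance probability sits in the Gaussian tail, where a one-third reduction in sigma_T can translate into order-of-magnitude reductions in exceedance probability.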
During PRAC Project Year 1, we produced the first 1 Hz physics-based seismic hazard model. CyberShake Study 15.4 represents a major milestone in physics-based PSHA for Southern California. Calculations of this scale were beyond the capabilities of the open-science supercomputers available before Blue Waters; the fact that our 1 Hz calculation ran for over four weeks using both Blue Waters and Titan indicates the scale of the CyberShake calculations. These computational results were made possible by the NSF Track 1 and DOE Leadership-class HPC centers and their willingness to collaborate with SCEC and with each other, helping SCEC to use these large systems effectively. As a result of these collaborations on computational system science research, SCEC’s CyberShake seismic hazard simulations continue to advance the state of the art in earthquake system science. In addition, realistic earthquake simulations can improve new systems now being developed by the USGS for operational earthquake forecasting and earthquake early warning.
What is the impact on other disciplines?

Civil engineers are interested in using ground motion simulation results to augment existing observational data for strong ground motions. Only through a careful process of evaluation and validation, such as the one we are performing on the SCEC PRAC project, will simulated ground motions be accepted for use by engineers. This research has tackled an important and challenging problem whose solution will advance the utilization of simulated ground motions in engineering practice, ultimately leading to discovery and understanding of the effects of earthquakes on the built environment and to an improved ability to estimate ground motion duration in engineering practice.


SCEC’s computational research has also made an important impact on NSF and DOE HPC resource providers. During PRAC Year 1, we worked with NCSA Blue Waters and DOE HPC resource providers to run our CyberShake seismic hazard calculation. SCEC’s CyberShake Study 15.4 was a successful, multi-week simulation that used NSF, DOE, and USC HPC resources on an important scientific and engineering calculation, at a scale not possible on any other open-science computing system.
This is an example of an NSF-DOE HPC collaboration that involved NSF Track 1 and DOE Leadership computing facilities and an NSF-funded science research activity. We believe that, on the way to extreme-scale computing, NSF and DOE HPC collaborations on real, practical problems that benefit from sustained, large-scale computational research will be increasingly important. SCEC’s PRAC work enhances the computational environment for geosystem studies by promoting vertical integration through the multiple layers of HPC computational, data, and middleware cyberinfrastructure.
What is the impact on the development of human resources?

Our SCEC PRAC project has helped motivate and educate a diverse scientific workforce, from undergraduates to early-career researchers, engineers, and software developers, to integrate system science with HPC. Computational geoscience has recently emerged as a research discipline related to, but distinct from, both geoscience and computer science. The SCEC PRAC project is training a workforce in the multi-disciplinary skills needed to contribute to computational geoscience research. Project participants are then qualified to work across fields, expanding their occupational choices.


What is the impact on physical resources that form infrastructure?

SCEC’s PRAC work helps translate basic research into practical products for reducing risk and improving community resilience. Through close scientific collaboration with USGS groups responsible for national seismic hazard estimates, SCEC research helps to advance the computational capabilities of national seismic safety authorities. Through SCEC computational tools, and the results they produce, SCEC’s PRAC research provides scientists and engineers with new information that can be used to evaluate the seismic safety in urban areas and for specific sites of interest such as public power plants and hydroelectric dams.


What is the impact on institutional resources that form infrastructure?

SCEC’s PRAC project has generated outstanding scientific results of high value to earthquake science and engineering, and it thus contributes to the rationale for high levels of federal support of NSF and DOE HPC facilities. The PI was able to make this point at the roll-out meeting of the National Strategic Computing Initiative, held at the White House on July 30, 2015.


Important target users for our PRAC results are civil engineers, including building engineers from the American Society of Civil Engineers (ASCE) who are working to develop building code recommendations. This group is participating in the CyberShake evaluation and development process.
What is the impact on technology transfer?

The high-performance scientific software developed and used on this project, including UCVM, Hercules, and AWP-ODC, establishes a mechanism for converting research software developed by individual researchers into community scientific research software. These well-validated, open-source SCEC community codes, developed and optimized on the PRAC project, have attracted interest from private researchers and commercial companies for use in seismic hazard analysis research.


What is the impact on society beyond science and technology?

SCEC’s PRAC project activities have potential broad impact beyond science and technology through improved public safety. By providing engineers with more accurate and more complete information about earthquake generated strong ground motions, SCEC’s PRAC Project has the potential to make our urban areas safer.



Publications:

[0] Deelman, E., K. Vahi, G. Juve, M. Rynge, S. Callaghan, P. J. Maechling, R. Mayani, W. Chen, R. Ferreira da Silva, M. Livny, and K. Wenger (2014), Pegasus, a workflow management system for science automation, Future Generation Computer Systems, doi:10.1016/j.future.2014.10.008.


[1] Duru, K., and K. Virta (2014), Stable and high order accurate difference methods for the elastic wave equation in discontinuous media, J. Comput. Phys., 279, 37-62, doi:10.1016/j.jcp.2014.08.046.
[2] Duru, K., and G. Kreiss (2014), Boundary waves and stability of the perfectly matched layer for the two space dimensional elastic wave equation in second order form, SIAM J. Numer. Anal., 52(6), 2883-2904, doi:10.1137/13093563X.
[3] Duru, K., G. Kreiss, and K. Mattsson (2014), Accurate and stable boundary treatments for the elastic wave equation in second order form, SIAM J. Sci. Comput., 36(6), A2787–A2818, doi:10.1137/130947210.
[5] Bydlon, S. A., and E. M. Dunham (2015), Rupture dynamics and ground motions from earthquakes in 2-D heterogeneous media, Geophys. Res. Lett., doi:10.1002/2014GL062982.
[6] Duru, K., and E. M. Dunham (2015), Dynamic earthquake rupture simulations on nonplanar faults embedded in 3D geometrically complex, heterogeneous elastic solids, J. Comput. Phys., submitted.
[7] Duru, K., J. E. Kozdon, and G. Kreiss (2015), Boundary conditions and stability of a perfectly matched layer for the elastic wave equation in first order form, J. Comput. Phys., submitted.
[8] Isbiliroglu, Y., Taborda, R., and Bielak, J. (2015). Coupled Soil-Structure Interaction Effects of Building Clusters During Earthquakes, Earthquake Spectra, 31 (1): 463-500, doi: 10.1193/102412EQS315M.
[9] Khoshnevis, N. and Taborda, R. (2015). Sensitivity of Ground Motion Simulation Validation Criteria to Filtering, in Proc. 12th Int. Conf. Applications of Statistics and Probability in Civil Engineering, ICASP12, Vancouver, Canada, July 12–15.
[10] Roten, D., K.B. Olsen, S.M. Day, Y. Cui, and D. Fah (2014). Expected seismic shaking in Los Angeles reduced by San Andreas fault zone plasticity, Geophys. Res. Lett. 2769-2777, DOI: 10.1002/2014GL059411.
[11] Restrepo, D. and Bielak, J. (2014). Virtual topography: A fictitious domain approach for analyzing free-surface irregularities in large-scale earthquake ground motion simulation, Int. J. Numer. Meth. Eng. 100 (7): 504–533.
[12] Restrepo, D., Bielak, J., Gómez, J., Jaramillo J., and Serrano, R. (2015). Effects of realistic topography on the ground motion of the Colombian Andes—A case study of the Aburrá Valley, Antioquia, Geophys. J. Int. (submitted).
[13] Riaño-Escandón, A. C., Reyes-Ortiz, J. C., Yamin-Lacouture, L. E., Bielak, J. and Taborda, R. (2015). 3D large-scale models for simulating earthquake ground motion in seismic regions: A state-of-the-art review, in Proc. Natl. Conf. Earthq. Eng., Bogotá, Colombia, May 27–29.
[14] Taborda, R. and Roten, D. (2015). Physics-Based Ground Motion Simulation, in Encyclopedia of Earthquake Engineering, Beer, M., Patelli, E., Kougioumtzoglou, I. and Au, I. (Eds.), SpringerReference (www.springerreference.com), Springer-Verlag Berlin Heidelberg.
[15] Wang, F., and T. H. Jordan (2014), Comparison of probabilistic seismic-hazard models using averaging-based factorization, Bull. Seismol. Soc. Am., 104(3), 1230-1257, doi:10.1785/0120130263.
[16] Zhou, J. (2014), Scalable Parallel Programming for High Performance Seismic Simulation on Petascale Heterogeneous Supercomputer, Ph.D. thesis, University of California, San Diego (Y. Cui, co-supervisor).

References:

[1] Harris, R. A., Barall, M., Archuleta, R., Dunham, E., Aagaard, B., Ampuero, J. P., Bhat, H., Cruz-Atienza, V., Dalguer, L., Dawson, P., Day, S., Duan, B., Ely, G., Kaneko, Y., Kase, Y., Lapusta, N., Liu, Y., Ma, S., Oglesby, D., Olsen, K., Pitarka, A., Song, S., and Templeton, E. (2009), The SCEC/USGS Dynamic Earthquake Rupture Code Verification Exercise, Seismol. Res. Lett., 80(1), 119-126.


[2] Olsen, K. B., S. M. Day, J. B. Minster, Y. Cui, A. Chourasia, M. Faerman, R. Moore, P. Maechling, and T. Jordan (2006), Strong shaking in Los Angeles expected from southern San Andreas earthquake, Geophys. Res. Lett., 33, L07305, doi:10.1029/2005GL025472.

[3] Strasser, F. O., N. A. Abrahamson, and J. J. Bommer (2009). Sigma: issues, insights, and challenges, Seismol. Res. Lett. 80, 40–56.


[4] Tu, T., Yu, H., Ramírez-Guzmán, L., Bielak, J., Ghattas, O., Ma, K.-L., & O’Hallaron, D.R., 2006. From mesh generation to scientific visualization: an end-to-end approach to parallel supercomputing, in Proceedings of the 2006 ACM/IEEE International Conference for High Performance Computing, Networking, Storage and Analysis, p. 15, IEEE Computer Society, Tampa, Florida.


