Probabilistic seismic hazard maps for California do not perform better relative to historical shaking data when site-specific VS30 is considered
- 1Northwestern University, Earth and Planetary Sciences, Evanston, IL, United States of America (email@example.com)
- 2University of Washington, Seattle, WA, USA
- 3US Geological Survey, Pasadena, CA, USA
- 4University of California, Berkeley, Berkeley, CA, USA
Probabilistic seismic hazard assessments, which forecast levels of earthquake shaking that should be exceeded with only a certain probability over a given period of time, are important for earthquake hazard mitigation. These rely on assumptions about when and where earthquakes will occur, their size, and the resulting shaking as a function of distance as described by ground-motion models (GMMs) that cover broad geologic regions. Seismic hazard maps are used to develop building codes.
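The exceedance probabilities behind such maps are conventionally treated as Poissonian, so a target probability over an exposure time implies an annual rate and return period. A minimal sketch (the function name and the standard "2% in 50 years" building-code example are illustrative, not from this abstract):

```python
import math

def return_period(p_exceed, years):
    """Return period (years) implied by a Poisson exceedance probability.

    P(at least one exceedance in T years) = 1 - exp(-lambda * T),
    so lambda = -ln(1 - P) / T and the return period is 1 / lambda.
    """
    annual_rate = -math.log(1.0 - p_exceed) / years
    return 1.0 / annual_rate

# The "2% in 50 years" ground motion used in building codes corresponds
# to a return period of roughly 2475 years:
rp = return_period(0.02, 50)
```
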
To explore the robustness of the maps’ shaking forecasts, we consider how the maps hindcast past shaking. We have compiled the California Historical Intensity Mapping Project (CHIMP) dataset of the maximum observed seismic intensity of shaking from the largest Californian earthquakes over the past 162 years. Previous comparisons between the maps for a constant VS30 (shear-wave velocity in the top 30 m of soil) of 760 m/s and CHIMP based on several metrics suggested that current maps overpredict shaking.
The differences between the VS30 at the CHIMP sites and the reference value of 760 m/s could amplify or deamplify the ground motions relative to the mapped values. We evaluate whether the VS30 at the CHIMP sites could cause a possible bias in the models. By comparison with the intensity data in CHIMP, we find that using site-specific VS30 does not improve map performance, because the site corrections cause only minor differences from the original 2018 USGS hazard maps at the short periods (high frequencies) relevant to peak ground acceleration and hence MMI. The minimal differences reflect the fact that the nonlinear deamplification due to increased soil damping largely offsets the linear amplification due to low VS30. The net effects will be larger for longer periods relevant to tall buildings, where net amplification occurs.
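The offsetting behavior described above can be sketched with a site-amplification term of the linear-plus-nonlinear form used in modern GMM site models: a linear part that amplifies motion at low VS30, and a nonlinear part that damps strong rock motions on soft soil. The function and all coefficients below are illustrative placeholders, not the calibrated values of any published model:

```python
import math

def site_amplification(vs30, pga_rock, v_ref=760.0, v_c=1500.0,
                       c=-0.6, f1=0.0, f3=0.1, f4=-0.25, f5=-0.006):
    """Illustrative site-amplification factor in natural-log units.

    Combines a linear term (low VS30 amplifies shaking relative to the
    v_ref = 760 m/s reference) with a nonlinear term (soil damping
    reduces amplification as the rock-level PGA grows). Coefficients
    are placeholders chosen only to show the qualitative trade-off.
    """
    # Linear amplification: negative c makes soft sites (VS30 < v_ref)
    # amplify and stiff sites deamplify, capped at v_c.
    f_lin = c * math.log(min(vs30, v_c) / v_ref)
    # Nonlinear deamplification grows with rock-level shaking intensity.
    f2 = f4 * (math.exp(f5 * (min(vs30, v_ref) - 360.0))
               - math.exp(f5 * (v_ref - 360.0)))
    f_nl = f1 + f2 * math.log((pga_rock + f3) / f3)
    return f_lin + f_nl
```

With these placeholder coefficients, a soft site (VS30 = 270 m/s) shows net amplification for weak rock motions, while for strong rock motions the nonlinear damping largely cancels the linear amplification, echoing the near-zero net PGA corrections discussed above.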
Possible reasons for this discrepancy include limitations of the dataset, a bias in the hazard models, an overestimation of the aleatory variability of the ground motion, or seismicity throughout the historical period having been lower than the long-term average, perhaps by chance due to the variability of earthquake recurrence. Resolving this discrepancy, which is also observed in Italy and Japan, could improve the performance of seismic hazard maps and thus earthquake safety for California and, by extension, worldwide. We also explore whether new nonergodic GMMs, with reduced aleatory variability, perform better than presently used ergodic GMMs compared to historical data.
How to cite: Gallahue, M., Salditch, L., Lucas, M., Neely, J., Hough, S., Stein, S., Abrahamson, N., and Williams, T.: Probabilistic seismic hazard maps for California do not perform better relative to historical shaking data when site-specific VS30 is considered, EGU General Assembly 2021, online, 19–30 Apr 2021, EGU21-6633, https://doi.org/10.5194/egusphere-egu21-6633, 2021.
Corresponding displays formerly uploaded have been withdrawn.