Probabilistic Approach of Liquefaction Assessment for Guwahati Based on Shear Wave Velocity Values

Author(s):  
K. S. Vipin ◽  
S. D. Anitha Kumari
2017 ◽  
Vol 17 (5) ◽  
pp. 781-800 ◽  
Author(s):  
Indranil Kongar ◽  
Tiziana Rossetto ◽  
Sonia Giovinazzi

Abstract. Currently, some catastrophe models used by the insurance industry account for liquefaction by applying a simple factor to shaking-induced losses. The factor is based only on local liquefaction susceptibility and this highlights the need for a more sophisticated approach to incorporating the effects of liquefaction in loss models. This study compares 11 unique models, each based on one of three principal simplified liquefaction assessment methods: liquefaction potential index (LPI) calculated from shear-wave velocity, the HAZUS software method and a method created specifically to make use of USGS remote sensing data. Data from the September 2010 Darfield and February 2011 Christchurch earthquakes in New Zealand are used to compare observed liquefaction occurrences to forecasts from these models using binary classification performance measures. The analysis shows that the best-performing model is the LPI calculated using known shear-wave velocity profiles, which correctly forecasts 78 % of sites where liquefaction occurred and 80 % of sites where liquefaction did not occur, when the threshold is set at 7. However, these data may not always be available to insurers. The next best model is also based on LPI but uses shear-wave velocity profiles simulated from the combination of USGS VS30 data and empirical functions that relate VS30 to average shear-wave velocities at shallower depths. This model correctly forecasts 58 % of sites where liquefaction occurred and 84 % of sites where liquefaction did not occur, when the threshold is set at 4. These scores increase to 78 and 86 %, respectively, when forecasts are based on liquefaction probabilities that are empirically related to the same values of LPI. This model is potentially more useful for insurance since the input data are publicly available. HAZUS models, which are commonly used in studies where no local model is available, perform poorly and incorrectly forecast 87 % of sites where liquefaction occurred, even at optimal thresholds. This paper also considers two models (HAZUS and EPOLLS) for estimation of the scale of liquefaction in terms of permanent ground deformation but finds that both models perform poorly, with correlations between observations and forecasts lower than 0.4 in all cases. Therefore these models potentially provide negligible additional value to loss estimation analysis outside of the regions for which they have been developed.
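As a concrete illustration of the LPI-plus-threshold forecasting scheme described above, the short Python sketch below computes an Iwasaki-style liquefaction potential index from a layered factor-of-safety profile and applies the reported threshold of 7. The layer depths and factors of safety are hypothetical placeholders, and the sketch does not reproduce the authors' shear-wave-velocity-based cyclic resistance calculations.

```python
# Minimal sketch of an Iwasaki-style liquefaction potential index (LPI)
# calculation followed by a threshold-based binary forecast, as described in
# the abstract. Layer data are hypothetical, not values from the study.

def lpi_from_profile(layers):
    """layers: list of (top_m, bottom_m, factor_of_safety) tuples describing
    the upper 20 m of the profile. Returns the liquefaction potential index."""
    lpi = 0.0
    for top, bottom, fs in layers:
        bottom = min(bottom, 20.0)          # LPI only integrates to 20 m depth
        if top >= 20.0 or fs >= 1.0:
            continue                        # layers with FS >= 1 do not contribute
        f = 1.0 - fs                        # severity of liquefaction in the layer
        z_mid = 0.5 * (top + bottom)        # mid-depth of the layer
        w = 10.0 - 0.5 * z_mid              # linear depth-weighting function w(z)
        lpi += f * w * (bottom - top)       # piecewise approximation of the integral
    return lpi


def forecast_liquefaction(lpi, threshold=7.0):
    """Binary forecast: liquefaction is predicted when LPI exceeds the threshold
    (the paper reports an optimal threshold of 7 for measured-Vs profiles)."""
    return lpi >= threshold


# Hypothetical three-layer profile: (top, bottom, FS against liquefaction)
profile = [(0.0, 4.0, 1.2), (4.0, 9.0, 0.8), (9.0, 20.0, 0.95)]
lpi = lpi_from_profile(profile)
print(f"LPI = {lpi:.1f}, liquefaction forecast: {forecast_liquefaction(lpi)}")
```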


2016 ◽  
Author(s):  
Indranil Kongar ◽  
Tiziana Rossetto ◽  
Sonia Giovinazzi

Abstract. Currently, catastrophe models used by the insurance industry account for liquefaction simply by applying a factor to shaking-induced losses based on local liquefaction susceptibility, so a more sophisticated approach to incorporating the effects of liquefaction in loss models is needed. This study compares eleven unique models, each based on one of three principal simplified liquefaction assessment methodologies: liquefaction potential index (LPI) calculated from shear-wave velocity; the HAZUS software methodology; and a methodology created specifically to make use of USGS remote sensing data. Data from the September 2010 Darfield and February 2011 Christchurch earthquakes in New Zealand are used to compare observed liquefaction occurrences to predictions from these models using binary classification performance measures. The analysis shows that the best-performing model is the LPI calculated using known shear-wave velocity profiles, although these data may not always be available to insurers. The next best model is also based on LPI, but uses shear-wave velocity profiles simulated from the combination of USGS VS30 data and empirical functions that relate VS30 to average shear-wave velocities at shallower depths. This model is useful for insurance since the input data are publicly available. This paper also considers two models (HAZUS and EPOLLS) for prediction of the scale of liquefaction in terms of permanent ground deformation but finds that both models perform poorly and thus potentially provide negligible additional value to loss estimation analysis outside of the regions for which they have been developed.
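The binary classification comparison mentioned in both versions of the abstract can be sketched as follows: observed liquefaction occurrences at a set of sites are compared with model forecasts, and the fractions of correctly forecast liquefied and non-liquefied sites are reported. The arrays below are toy values, not data from the Canterbury earthquakes.

```python
# Sketch of the binary classification comparison described in the abstracts:
# observed liquefaction occurrence vs. model forecast at a set of sites.
# The observation/forecast arrays below are illustrative only.

def classification_rates(observed, forecast):
    """Return (true-positive rate, true-negative rate) for paired binary lists,
    i.e. the fraction of liquefied sites correctly forecast and the fraction of
    non-liquefied sites correctly forecast."""
    tp = sum(1 for o, f in zip(observed, forecast) if o and f)
    tn = sum(1 for o, f in zip(observed, forecast) if not o and not f)
    positives = sum(observed)
    negatives = len(observed) - positives
    return tp / positives, tn / negatives


# Toy example: 10 sites, True = liquefaction observed/forecast
observed = [True, True, True, True, False, False, False, False, False, True]
forecast = [True, True, False, True, False, False, True, False, False, True]
tpr, tnr = classification_rates(observed, forecast)
print(f"Correctly forecast liquefaction at {tpr:.0%} of liquefied sites, "
      f"no liquefaction at {tnr:.0%} of non-liquefied sites")
```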


1988 ◽  
Vol 4 (4) ◽  
pp. 687-729 ◽  
Author(s):  
H. B. Seed ◽  
M. P. Romo ◽  
J. I. Sun ◽  
A. Jaime ◽  
J. Lysmer

Comparisons are presented between the characteristics of ground motions at five clay-underlain sites in Mexico City where motions were recorded during the earthquake of September 19, 1985, and for which ground response analyses have been made based on the measured soil properties and the motions recorded on hard formations at the National University of Mexico. It is shown that the ground response in areas of Mexico City underlain by clay is extremely sensitive to small changes in the shear wave velocity of the clay, and it is suggested that a probabilistic approach allowing for uncertainties in shear wave velocity measurements and in the characteristics of the motions on the hard formations is desirable to assess these effects. Based on the results of such an approach, it is concluded that simple ground response analyses can provide very useful data for engineering assessments of the effects of local soil conditions on the characteristics of ground motions likely to develop at sites underlain by soft clays, and that these procedures also provide a useful basis for estimating the general nature of the ground motions in the extensive heavy-damage zone of Mexico City in the 1985 earthquake.
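A minimal Monte Carlo sketch of the kind of probabilistic treatment advocated above follows: uncertainty in the clay shear-wave velocity is propagated into the elastic fundamental site period T = 4H/Vs, illustrating how sensitive the response estimate is to small velocity changes. The clay thickness and velocity statistics are hypothetical, not values from the study.

```python
# Minimal Monte Carlo sketch of propagating shear-wave velocity uncertainty
# into the elastic fundamental site period T = 4H / Vs. The clay thickness and
# the Vs mean/standard deviation below are hypothetical, not values from the paper.

import random
import statistics

H = 35.0            # assumed clay thickness (m)
VS_MEAN = 75.0      # assumed mean shear-wave velocity of the clay (m/s)
VS_STD = 7.5        # assumed standard deviation reflecting measurement scatter

random.seed(1)
periods = []
for _ in range(10_000):
    vs = random.gauss(VS_MEAN, VS_STD)      # sample one Vs realisation
    if vs <= 0:
        continue                            # discard non-physical samples
    periods.append(4.0 * H / vs)            # elastic site period for that sample

print(f"Site period: mean = {statistics.mean(periods):.2f} s, "
      f"std = {statistics.stdev(periods):.2f} s")
```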


Solid Earth ◽  
2019 ◽  
Vol 10 (2) ◽  
pp. 379-390 ◽  
Author(s):  
Yaniv Darvasi ◽  
Amotz Agnon

Abstract. Instrumental strong-motion data are not common around the Dead Sea region, so calibrating a new attenuation equation is a considerable challenge. However, the Holy Land has a remarkable historical archive attesting to numerous regional and local earthquakes. Combining the historical record with new seismic measurements will improve the regional equation. On 11 July 1927, a rupture in the crust near the northern Dead Sea generated a moderate ML 6.2 earthquake. Up to 500 people were killed, and extensive destruction was recorded even as far as 150 km from the focus. We consider local near-surface properties, in particular the shear-wave velocity, as an amplification factor. Where the shear-wave velocity is low, the seismic intensity far from the focus is likely to be greater than expected from a standard attenuation curve. In this work, we used the multichannel analysis of surface waves (MASW) method to estimate seismic wave velocity at anomalous sites in Israel in order to calibrate a new attenuation equation for the Dead Sea region. Our new attenuation equation contains a term which quantifies only lithological effects, while factors such as building quality, foundation depth, topography, earthquake directivity, type of fault, etc. remain outside our scope. Nonetheless, about 60 % of the measured anomalous sites fit expectations; therefore, this new ground-motion prediction equation (GMPE) is statistically better than the old ones. From our local point of view, this is the first time that integration of the 1927 historical data and modern shear-wave velocity profile measurements has improved the attenuation equation (sometimes referred to as the attenuation relation) for the Dead Sea region. In the wider context, regions of low-to-moderate seismicity should use macroseismic earthquake data, together with modern measurements, in order to better estimate the peak ground acceleration or the seismic intensities to be caused by future earthquakes. This integration will conceivably lead to better mitigation of damage from future earthquakes and should improve maps of seismic hazard.
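Since the abstract does not give the functional form of the new attenuation equation, the Python sketch below uses a generic, hypothetical intensity prediction equation with an added lithological site term simply to illustrate what such a term does: it raises predicted intensity where the near-surface shear-wave velocity is low and lowers it on stiff rock. All coefficients and the Vs-based rule are assumptions for illustration only, not the paper's GMPE.

```python
# Illustrative (not the paper's) macroseismic intensity attenuation relation with
# a lithological site term, to show the effect of adding such a term to a GMPE.
# All coefficients and the Vs-based amplification rule are hypothetical.

import math

def predicted_intensity(magnitude, distance_km, vs_site, vs_ref=760.0,
                        c1=1.5, c2=1.5, c3=3.0, c4=1.0):
    """Hypothetical intensity prediction:
    I = c1 + c2*M - c3*log10(R) + c4*log10(vs_ref / vs_site)
    The last term raises the predicted intensity where the near-surface
    shear-wave velocity is low (soft lithology) and lowers it on stiff rock."""
    site_term = c4 * math.log10(vs_ref / vs_site)
    return c1 + c2 * magnitude - c3 * math.log10(distance_km) + site_term


# Same earthquake and distance, soft vs. stiff site (Vs values are examples)
print(predicted_intensity(6.2, 150.0, vs_site=250.0))   # soft sediments
print(predicted_intensity(6.2, 150.0, vs_site=1500.0))  # hard rock
```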


2021 ◽  
pp. 875529302110010
Author(s):  
Sameer Ladak ◽  
Sheri Molnar ◽  
Samantha Palmer

Site characterization is a crucial component in assessing seismic hazard, typically involving in situ shear-wave velocity (VS) depth profiling and measurement of site amplification, including site period. Noninvasive methods are ideal for soil sites but become challenging, in terms of field logistics and interpretation, in more complex geologic settings including rock sites. Multiple noninvasive active- and passive-seismic techniques are applied at 25 seismograph stations across Eastern Canada, which are typically assumed to be installed on hard rock. We investigate which site characterization methods are suitable at rock sites and confirm the hard rock assumption by providing VS profiles. Active-source compression-wave refraction and surface wave array techniques consistently provide velocity measurements at rock sites; passive-source array testing is less consistent but is our most suitable method for constraining rock VS. Bayesian inversion of Rayleigh wave dispersion curves provides quantitative uncertainty in the rock VS. We succeed in estimating rock VS at 16 stations, with constrained rock VS estimates at 7 stations that are consistent with previous estimates for Precambrian and Paleozoic rock types. The National Building Code of Canada classifies rock sites solely by the time-averaged shear-wave velocity of the upper 30 m (VS30). We determine a mean VS30 of ∼1600 m/s for 16 Eastern Canada stations; the hard rock assumption is correct (>1500 m/s), but the rock is not as hard as often assumed (∼2000 m/s). Mean variability in VS30 is ∼400 m/s and can lead to softer rock classifications, in particular for Paleozoic rock types with lower average rock VS near the hard/soft rock boundary. Microtremor and earthquake horizontal-to-vertical spectral ratios are obtained and provide site period classifications as an alternative to VS30.
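The VS30 value referred to above is the standard time-averaged shear-wave velocity of the upper 30 m, VS30 = 30 / Σ(h_i / VS_i). A minimal Python sketch of this calculation and of the hard-rock check (>1500 m/s) mentioned in the abstract is given below; the layer model is hypothetical.

```python
# Sketch of the standard time-averaged shear-wave velocity calculation,
# VS30 = 30 / sum(h_i / Vs_i) over the upper 30 m, plus the hard-rock check
# (> 1500 m/s) mentioned in the abstract. The layer model is hypothetical.

def vs30(layers):
    """layers: list of (thickness_m, vs_m_per_s) from the surface downward.
    Uses only the upper 30 m; the deepest layer is truncated as needed."""
    remaining = 30.0
    travel_time = 0.0
    for thickness, vs in layers:
        h = min(thickness, remaining)
        travel_time += h / vs           # vertical shear-wave travel time in layer
        remaining -= h
        if remaining <= 0:
            break
    if remaining > 0:
        raise ValueError("profile shallower than 30 m")
    return 30.0 / travel_time


# Hypothetical rock-site profile: thin weathered layer over stiff rock
profile = [(2.0, 760.0), (40.0, 1800.0)]
value = vs30(profile)
print(f"VS30 = {value:.0f} m/s, hard rock: {value > 1500.0}")
```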

