Parsons, Tom
Preferred name: Parsons, Tom
Scopus Author ID: 7103344997
14 results (showing 1-10)
- Publication (Open Access): Seismic attenuation and stress on the San Andreas Fault at Parkfield: are we critical yet?
  The Parkfield transitional segment of the San Andreas Fault (SAF) is characterized by frequent, quasi-periodic M6 events that break the very same asperity. The last Parkfield mainshock occurred on 28 September 2004, 38 years after the 1966 earthquake, whereas the segment had previously shown a ∼22-year average recurrence time. The much longer interval between the last two earthquakes is thought to result mainly from the reduction of Coulomb stress by the M6.5 Coalinga earthquake of 2 May 1983 and the M6 Nuñez events of 11 June and 22 July 1983. Plausibly, the transitional segment of the SAF at Parkfield is now in the late part of its seismic cycle, and current observations may all reflect a state of stress close to criticality. However, the behavior of the attenuation parameter in the last few years seems substantially different from that which characterized the years before the 2004 mainshock. A few questions arise: (i) Does a detectable preparation phase for the Parkfield mainshocks exist, and is it the same for all events? (ii) How dynamically/kinematically similar are the quasi-periodic occurrences of the Parkfield mainshocks? (iii) Are some dynamic/kinematic characteristics of the next mainshock predictable from the analysis of current data (e.g., should we expect the epicenter of the next failure to be co-located with that of 2004)? (iv) Should we expect the duration of the current interseismic period to be close to the 22-year "undisturbed" average? We address these questions by analyzing the non-geometric attenuation of direct S-waves along the transitional segment of the SAF at Parkfield, in the close vicinity of the fault plane, between January 2001 and November 2023. Of particular interest is the preparatory behavior of the attenuation parameter as the 2004 mainshock approached, on both sides of the SAF. We also show that non-volcanic tremor activity modulates seismic attenuation in the area, and possibly the seismicity along the Parkfield fault segment, including the occurrence of the mainshocks.
- Publication (Open Access): On the Use of High-Resolution and Deep-Learning Seismic Catalogs for Short-Term Earthquake Forecasts: Potential Benefits and Current Limitations
  Enhanced earthquake catalogs provide detailed images of evolving seismic sequences. Currently, these data sets take some time to be released, but they will soon become available in real time. Here, we explore whether and how enhanced seismic catalogs feeding into established short-term earthquake forecasting protocols may result in higher predictive skill. We consider three enhanced catalogs for the 2016–2017 Central Italy sequence, featuring a bulk completeness lower by at least two magnitude units than the real-time catalog and improved hypocentral resolution. We use them to inform a set of physical Coulomb Rate-and-State (CRS) and statistical Epidemic-Type Aftershock Sequence (ETAS) models to forecast the space-time occurrence of M3+ events during the first 6 months of the sequence. We track model performance using standard likelihood-based metrics and compare their skill against the best-performing CRS and ETAS models among those developed with the real-time catalog. We find that while incorporating the triggering contributions from the new small-magnitude detections of the enhanced catalogs benefits both types of forecasts, these models do not significantly outperform their respective near-real-time benchmarks. To explore the reasons behind this result, we perform targeted sensitivity tests that show how (a) the typical spatial discretizations of forecast experiments (≥2 km) hamper the ability of models to capture highly localized secondary triggering patterns and (b) differences in earthquake parameters (i.e., magnitudes and hypocenters) reported in different catalogs can affect forecast evaluation. These findings will contribute toward improving forecast model design and evaluation strategies for next-generation seismic catalogs.
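The ETAS model named in the entry above combines a constant background rate with Omori-law aftershock triggering from every prior event. A minimal sketch of the conditional intensity (all parameter values are illustrative placeholders, not the paper's fitted values):

```python
def etas_rate(t, events, mu=0.2, K=0.02, alpha=1.0, c=0.01, p=1.1, m_c=3.0):
    """Conditional ETAS intensity at time t (days): background rate mu plus
    an Omori-law contribution from each earlier event (t_i, m_i).
    All parameter values here are illustrative placeholders."""
    rate = mu
    for t_i, m_i in events:
        if t_i < t:
            rate += K * 10 ** (alpha * (m_i - m_c)) * (t - t_i + c) ** (-p)
    return rate

# A hypothetical M6 mainshock at t=0 followed by an M4.5 aftershock at t=2 days.
catalog = [(0.0, 6.0), (2.0, 4.5)]
print(etas_rate(2.5, catalog))   # elevated rate shortly after the aftershock
print(etas_rate(30.0, catalog))  # rate has decayed back toward the background mu
```

The Omori exponent p > 1 makes each event's contribution decay toward zero, so the intensity relaxes to the background rate between triggering episodes.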
- Publication (Open Access): Crustal permeability changes inferred from seismic attenuation: Impacts on multi-mainshock sequences (2022-09-08)
  We use amplitude ratios from narrowband-filtered earthquake seismograms to measure variations of seismic attenuation over time, providing unique insights into the dynamic state of stress in the Earth's crust at depth. Our dataset from earthquakes of the 2016–2017 Central Apennines sequence allows us to obtain high-resolution time histories of seismic attenuation (frequency band: 0.5–30 Hz) characterized by strong earthquake dilatation-induced fluctuations at seismogenic depths, caused by the cumulative elastic stress drop after the sequence, as well as damage-induced ones at shallow depths caused by energetic surface waves. Cumulative stress drop causes negative dilatation and reduced permeability and seismic attenuation, whereas strong-motion surface waves produce an increase in crack density, and thus in permeability and seismic attenuation. In the aftermath of the mainshocks of the sequence, we show that the M ≥ 3.5 earthquake occurrence versus time and distance is consistent with fluid diffusion: diffusion signatures are associated with changes in seismic attenuation during the first days of the Amatrice, Visso-Norcia, and Capitignano sub-sequences. We hypothesize that coseismic permeability changes create fluid-diffusion pathways that are at least partly responsible for triggering multi-mainshock seismic sequences. Here we show that anelastic seismic attenuation fluctuates coherently with our hypothesis.

- Publication (Open Access): The Making of the NEAM Tsunami Hazard Model 2018 (NEAMTHM18) (2021-03-05)
  The NEAM Tsunami Hazard Model 2018 (NEAMTHM18) is a probabilistic hazard model for tsunamis generated by earthquakes. It covers the coastlines of the North-eastern Atlantic, the Mediterranean, and connected seas (NEAM). NEAMTHM18 was designed as a three-phase project. The first two phases were dedicated to model development and hazard calculations, following a formalized decision-making process based on a multiple-expert protocol. The third phase was dedicated to documentation and dissemination. The hazard assessment workflow was structured in Steps and Levels. There are four Steps: Step-1) probabilistic earthquake model; Step-2) tsunami generation and modeling in deep water; Step-3) shoaling and inundation; Step-4) hazard aggregation and uncertainty quantification. Each Step includes a different number of Levels. Level-0 always describes the input data; the other Levels describe the intermediate results needed to proceed from one Step to another. Alternative datasets and models were considered in the implementation. The epistemic hazard uncertainty was quantified through an ensemble modeling technique accounting for alternative models' weights and yielding a distribution of hazard curves represented by the mean and various percentiles. Hazard curves were calculated at 2,343 Points of Interest (POIs) distributed at an average spacing of ∼20 km. Precalculated probability maps for five maximum inundation heights (MIH) and hazard intensity maps for five average return periods (ARP) were produced from the hazard curves. In the entire NEAM region, MIHs of several meters are rare but not impossible. Considering a 2% probability of exceedance in 50 years (ARP ≈ 2,475 years), the POIs with MIH > 5 m number fewer than 1% and all lie in the Mediterranean, on the coasts of Libya, Egypt, Cyprus, and Greece. In the North-East Atlantic, POIs with MIH > 3 m are on the coasts of Mauritania and the Gulf of Cadiz. Overall, 30% of the POIs have MIH > 1 m. NEAMTHM18 results and documentation are available through the TSUMAPS-NEAM project website (http://www.tsumaps-neam.eu/), featuring an interactive web mapper. Although NEAMTHM18 cannot substitute for in-depth analyses at local scales, it represents a first step toward local, more detailed hazard and risk assessments and contributes to designing evacuation maps for tsunami early warning.

- Publication (Restricted): Seismic Attenuation Monitoring of a Critically Stressed San Andreas Fault
  We show that seismic attenuation (QS^-1) along the San Andreas fault (SAF) at Parkfield correlates with the occurrence of moderate-to-large earthquakes at local and regional distances. Earthquake-related QS^-1 anomalies are likely caused by changes in permeability from dilatant static stress changes, damage by strong shaking from local sources, and pore unclogging/clogging from mobilization of colloids by dynamic strains. We find that, prior to the 2004 M6 Parkfield earthquake, prefailure conditions for some local events of moderate magnitude correspond to positive anomalies of QS^-1 on the Pacific side, with local and regional earthquakes producing sharp attenuation reversals. After the 2004 Parkfield earthquake, we see higher QS^-1 anomalies along the SAF but low sensitivity to local and regional earthquakes, probably because the mainshock significantly altered the permeability state of the rocks adjacent to the SAF, and with it their sensitivity to earthquake-induced stress perturbations.
  Plain Language Summary: We discovered that along the San Andreas fault, the damping of seismic energy through the Earth's crust is modulated by the state of stress in the crust, especially when the fault is close to rupture. The phenomenon depends on the density of cracks that permeate the rock, their interconnection, and the degree to which they are filled with fluids. We show examples where attenuation, and thus permeability, is modulated by (i) damage from local earthquake shaking, (ii) dilatation imposed by local earthquakes, and (iii) clogging and unclogging of cracks induced by ground motion from distant earthquakes. These examples correlate with observed changes in water-well levels. We note significant changes to the attenuation signal after the 2004 M6.0 Parkfield earthquake, with less sensitivity to local and distant earthquakes.
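The return-period figure quoted in the NEAMTHM18 entry above (2% probability of exceedance in 50 years corresponding to an ARP of about 2,475 years) follows from the standard Poisson assumption. A quick check, with my own variable names:

```python
import math

def average_return_period(p_exceed, window_years):
    """Under a Poisson occurrence model, P(at least one exceedance in T years)
    = 1 - exp(-T / ARP), which inverts to ARP = -T / ln(1 - P)."""
    return -window_years / math.log(1.0 - p_exceed)

print(average_return_period(0.02, 50))  # ~2474.9 years, i.e. ARP ≈ 2,475
```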
- Publication (Open Access): Characteristic Earthquake Magnitude Frequency Distributions on Faults Calculated From Consensus Data in California
  An estimate of the expected earthquake rate at all possible magnitudes is needed for seismic hazard forecasts. Regional earthquake magnitude-frequency distributions obey a negative exponential law (Gutenberg-Richter), but it is unclear whether individual faults do. We add three new methods for calculating long-term California earthquake rupture rates to the existing Uniform California Earthquake Rupture Forecast version 3 efforts to assess how methods and parameters affect magnitude-frequency results for individual faults. All solutions show strongly characteristic magnitude-frequency distributions on the San Andreas and other faults, with higher rates of large earthquakes than would be expected from a Gutenberg-Richter distribution. This is a necessary outcome of fitting high fault slip rates under the overall statewide earthquake rate budget. We find that input data choices can affect the nucleation magnitude-frequency distribution shape for the San Andreas Fault; solutions are closer to a Gutenberg-Richter distribution if the maximum magnitude allowed for earthquakes that occur away from mapped faults (background events) is raised above the consensus threshold of M = 7.6, if the moment rate for background events is reduced, or if the overall maximum magnitude is reduced from M = 8.5. We also find that participation magnitude-frequency distribution shapes can be strongly affected by slip-rate discontinuities along faults that may be artifacts related to segment boundaries.
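The Gutenberg-Richter law discussed in the entry above gives the cumulative rate of events at or above magnitude M as N(≥M) = 10^(a - bM); a "characteristic" fault departs from it by hosting more large earthquakes than this line predicts. A minimal sketch (the a and b values are illustrative, not fitted):

```python
def gr_rate(m, a=4.5, b=1.0):
    """Cumulative annual rate of earthquakes with magnitude >= m under the
    Gutenberg-Richter law: N = 10**(a - b*m). a and b are placeholders."""
    return 10 ** (a - b * m)

# With b = 1, each magnitude unit reduces the cumulative rate tenfold.
for m in (5.0, 6.0, 7.0, 8.0):
    print(m, gr_rate(m))
# A "characteristic" fault would show an M~8 rate well above gr_rate(8.0).
```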
- Publication (Open Access): Comments on 'Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?' by F. Mulargia, P.B. Stark and R.J. Geller (2018-01)
  Parsons et al. (2012) compared the characteristic and Gutenberg-Richter (G-R) distributions for time-dependent M ≥ 7.9 earthquake probability in the Nankai-Tokai subduction zone, Japan, a region for which historical information about several repeating strong earthquakes does exist. The purpose of their paper was to assess the possibility of making reasonable hazard assessments without requiring a characteristic model, and the conclusion was yes. In fact, they found that a simulator that imposes no physical geometric rupture barriers (meaning gaps or steps in the faults) can replicate the spatial proportion of fault-segment ruptures evident within the studied area (within 95% confidence bounds). They concluded that the adoption of a G-R model can attain very similar matches to the historical catalog of the Nankai-Tokai zone, and suggested that very simple earthquake rupture simulations based on empirical data and fundamental earthquake laws could be useful forecast tools in information-poor settings.

- Publication (Restricted): Nucleation speed limit on remote fluid-induced earthquakes
  Earthquakes triggered by other remote seismic events are explained as a response to long-traveling seismic waves that temporarily stress the crust. However, several studies report delays of hours or days after the seismic waves pass, which are difficult to reconcile with the transient stresses imparted by those waves. We show that these delays are proportional to magnitude and that nucleation times are best fit by a fluid-diffusion process if the governing rupture process involves unlocking a magnitude-dependent critical nucleation zone. It is well established that distant earthquakes can strongly affect the pressure and distribution of crustal pore fluids. Earth's crust contains hydraulically isolated, pressurized compartments in which fluids are contained within low-permeability walls.
We know that strong shaking induced by seismic waves from large earthquakes can change the permeability of rocks. Thus, the boundary of a pressurized compartment may see its permeability rise. Previously confined, overpressurized pore fluids may then diffuse away, infiltrate faults, decrease their strength, and induce earthquakes. Magnitude-dependent delays and critical nucleation zone conclusions can also be applied to human-induced earthquakes.
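The timing argument in the entry above can be illustrated with the standard diffusion time scale t ≈ r²/(4D): a larger critical nucleation zone of size r takes longer for pore pressure to traverse, producing magnitude-dependent delays. A toy sketch in which both the hydraulic diffusivity D and the example zone sizes are assumed placeholders, not the paper's fitted values:

```python
def diffusion_delay_hours(r_m, diffusivity=1.0):
    """Characteristic time (hours) for pore pressure to diffuse a distance
    r_m meters: t ~ r**2 / (4*D). D = 1 m^2/s is an assumed placeholder."""
    seconds = r_m ** 2 / (4.0 * diffusivity)
    return seconds / 3600.0

# Doubling the nucleation-zone size quadruples the characteristic delay.
print(diffusion_delay_hours(100.0))  # well under an hour for a small zone
print(diffusion_delay_hours(400.0))  # ~11 hours: delays grow with zone size
```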
- Publication (Restricted): A physics-based earthquake simulator and its application to seismic hazard assessment in the Calabria (Southern Italy) region (2017-03)
  The use of a newly developed earthquake simulator has allowed the production of catalogs spanning 100 kyr and containing more than 100,000 events of magnitude ≥ 4.5. The fault-system model to which we applied the simulator code was obtained from the DISS 3.2.0 database, selecting all the faults recognized in the Calabria region, for a total of 22 fault segments. The application of our simulation algorithm reproduces typical features of the seismicity in time, space, and magnitude, which can be compared with those of real observations. The results of the physics-based simulator algorithm were compared with those obtained by an alternative method using a slip-rate-balanced technique. Finally, as an example of a possible use of synthetic catalogs, an attenuation law was applied to all the events in the synthetic catalog to produce maps showing the probability of exceedance of given PGA values over the territory under investigation.

- Publication (Restricted): Tsunamis: Bayesian Probabilistic Hazard Analysis (Springer, 2017)
  Tsunamis are low-frequency, high-consequence natural threats: rare events that can devastate vast coastal regions both near and far from their generation areas. They may be caused by coseismic seafloor motions, subaerial and submarine mass movements, volcanic activity (such as explosions, pyroclastic flows, and caldera collapses), meteorological phenomena, and meteorite ocean impacts. The probability of tsunami occurrence and/or impact on a given coast may be treated formally by combining calculations based on empirical observations and on models; this probability can be updated in light of new/independent information. This is the general concept of the Bayesian method applied to tsunami probabilistic hazard analysis, which also provides a direct quantification of forecast uncertainties. This entry presents a critical overview of Bayesian procedures, with a primary focus on their appropriate and relevant applicability to tsunami hazard analyses.
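The updating step described in the entry above can be sketched with a conjugate Beta-Binomial update of an annual occurrence probability: the Beta prior is revised by counting years with and without an event. The prior and the observation counts below are invented purely for illustration:

```python
def beta_update(alpha, beta, hits, misses):
    """Conjugate update of a Beta(alpha, beta) prior on an annual occurrence
    probability after observing `hits` years with an event and `misses` without."""
    return alpha + hits, beta + misses

def beta_mean(alpha, beta):
    """Posterior mean of the occurrence probability."""
    return alpha / (alpha + beta)

# Invented example: a vague prior, then 2 event-years in a 100-year record.
a0, b0 = 1.0, 99.0                                  # prior mean 0.01
a1, b1 = beta_update(a0, b0, hits=2, misses=98)
print(beta_mean(a0, b0))  # 0.01
print(beta_mean(a1, b1))  # 0.015: prior revised upward by the observations
```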