Annunziata, D.; Prezioso, E.; Izzo, S.; Canzaniello, M.; Savoia, M.; Amitrano, S.; Qi, P.; Giampaolo, F.; Piccialli, F. (2025). Leveraging High-Performance Computing for Generating Large-Scale Synthetic Datasets of Focal Mechanisms in Seismic Networks. In: 25th IEEE International Symposium on Cluster, Cloud and Internet Computing Workshops (CCGridW 2025), pp. 9-16. [10.1109/CCGridW65158.2025.00012]
Leveraging High-Performance Computing for Generating Large-Scale Synthetic Datasets of Focal Mechanisms in Seismic Networks
Annunziata D.; Prezioso E.; Izzo S.; Canzaniello M.; Savoia M.; Amitrano S.; Qi P.; Giampaolo F.; Piccialli F.
2025
Abstract
Analysis of focal mechanisms offers important insights into earthquake hazard assessment and tectonic processes but is often constrained by sparse station coverage and high noise levels in observed data. To address these challenges, this paper presents a high-performance computing (HPC) methodology for generating extensive synthetic datasets of earthquake focal mechanisms. The proposed framework incorporates a tailored parallelization strategy to enable efficient and scalable simulation of diverse fault geometries and tectonic scenarios. At the node level, earthquake source parameters (hypocenter coordinates and strike, dip, rake combinations) are distributed into balanced batches to optimize load distribution. Within each node, worker processes compute P-wave polarities and radiation amplitudes, using the joblib library for efficient intra-node parallelism and enabling the rapid generation of high-fidelity synthetic datasets. Large-scale simulations, combined with these parallelization techniques, demonstrate how HPC can effectively address data variability and scarcity, bridging the gap between computational methodologies and observational data constraints. The generated datasets provide a robust foundation for developing and improving data-driven seismic analysis methods.
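The batch-and-parallelize scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function and variable names are hypothetical, the station geometry is invented for the example, and the P-wave radiation-pattern formula used is the standard far-field expression from Aki and Richards (their eq. 4.91), which the paper itself may compute differently.

```python
import numpy as np
from joblib import Parallel, delayed

def p_radiation(strike, dip, rake, azimuth, takeoff):
    """Far-field P-wave radiation amplitude (Aki & Richards, eq. 4.91).

    All angles in degrees; azimuth/takeoff describe the ray leaving the
    source toward one station.
    """
    phi = np.radians(azimuth - strike)
    d, l, i = np.radians(dip), np.radians(rake), np.radians(takeoff)
    return (np.cos(l) * np.sin(d) * np.sin(i) ** 2 * np.sin(2 * phi)
            - np.cos(l) * np.cos(d) * np.sin(2 * i) * np.cos(phi)
            + np.sin(l) * np.sin(2 * d) * (np.cos(i) ** 2
                                           - np.sin(i) ** 2 * np.sin(phi) ** 2)
            + np.sin(l) * np.cos(2 * d) * np.sin(2 * i) * np.sin(phi))

def simulate_batch(batch, stations):
    """Compute amplitudes and first-motion polarities for one batch
    of (strike, dip, rake) mechanisms at a fixed station geometry."""
    results = []
    for strike, dip, rake in batch:
        amps = np.array([p_radiation(strike, dip, rake, az, toa)
                         for az, toa in stations])
        results.append((amps, np.sign(amps)))  # polarity = sign of amplitude
    return results

# Hypothetical grid of source mechanisms, split into balanced batches
# (one batch per worker), then dispatched with joblib's process pool.
mechanisms = np.array([(s, d, r)
                       for s in range(0, 360, 45)
                       for d in range(10, 90, 20)
                       for r in (-90, 0, 90)])
batches = np.array_split(mechanisms, 4)
stations = [(30.0, 60.0), (120.0, 45.0), (250.0, 75.0)]  # (azimuth, takeoff)

results = Parallel(n_jobs=4)(
    delayed(simulate_batch)(batch, stations) for batch in batches)
```

In this sketch the node-level step (balanced batching) is `np.array_split`, and the intra-node step is the `Parallel(...)` call, mirroring the two levels of parallelism the abstract describes; hypocenter coordinates are omitted here for brevity, since with a fixed velocity model they enter only through the per-station azimuth and takeoff angle.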


