Barcelona Supercomputing Center, Barcelona, Spain
Now showing 1 - 10 of 40 results
- Publication (Open Access): Digital Twin Components for Geophysical Extreme Phenomena: the example of Volcanic Hazards within the DT-GEO project (2023)
The project Digital Twin for GEOphysical extremes (DT-GEO) aims to use Digital Twin Components (DTCs) to create replicas of physical systems, serving as a virtual laboratory to study natural extreme events. The rationale is the intrinsic risk that potentially catastrophic events pose to anthropic activities, infrastructures, and cultural heritage. In the framework of the project, this paper describes how the DTC workflow architecture is designed, focusing on flexibility, scalability, and maintainability, and how it is being further developed. To demonstrate how ICT efforts can expand horizons in Geosciences, an application to volcanic hazard is presented, taking as a case study the 2019 volcanic eruption of Raikoke (Kuril Islands).
- Publication (Open Access): The EU Center of Excellence for Exascale in Solid Earth (ChEESE): Implementation, results, and roadmap for the second phase (2023)
The EU Center of Excellence for Exascale in Solid Earth (ChEESE) develops exascale transition capabilities in the domain of Solid Earth, an area of geophysics rich in computational challenges embracing different approaches to exascale (capability, capacity, and urgent computing). The first implementation phase of the project (ChEESE-1P; 2018–2022) addressed scientific and technical computational challenges in seismology, tsunami science, volcanology, and magnetohydrodynamics, in order to understand the phenomena, anticipate the impact of natural disasters, and contribute to risk management. The project initiated the optimisation of 10 community flagship codes for the upcoming exascale systems and implemented 12 Pilot Demonstrators that combine the flagship codes with dedicated workflows in order to address the underlying capability and capacity computational challenges. Pilot Demonstrators reaching more mature Technology Readiness Levels (TRLs) were further enabled in operational service environments on critical aspects of geohazards such as long-term and short-term probabilistic hazard assessment, urgent computing, and early warning and probabilistic forecasting. Partnership and service co-design with members of the project Industry and User Board (IUB) leveraged the uptake of results across multiple research institutions, academia, industry, and public governance bodies (e.g. civil protection agencies).
This article summarises the implementation strategy and the results from ChEESE-1P, outlining also the underpinning concepts and the roadmap for the ongoing second project implementation phase (ChEESE-2P; 2023–2026).
- Publication (Open Access): Reconstructing tephra fall deposits via ensemble-based data assimilation techniques
In recent years, there has been a growing interest in ensemble approaches for modelling the atmospheric transport of volcanic aerosol, ash, and lapilli (tephra). The development of such techniques enables the exploration of novel methods for incorporating real observations into tephra dispersal models. However, traditional data assimilation algorithms, including ensemble Kalman filter (EnKF) methods, can yield suboptimal state estimates for positive-definite variables such as those related to volcanic aerosols and tephra deposits. This study proposes two new ensemble-based data assimilation techniques for semi-positive-definite variables with highly skewed uncertainty distributions, including aerosol concentrations and tephra deposit mass loading: the Gaussian with non-negative constraints (GNC) and gamma inverse-gamma (GIG) methods. The proposed methods are applied to reconstruct the tephra fallout deposit resulting from the 2015 Calbuco eruption using an ensemble of 256 runs performed with the FALL3D dispersal model. An assessment of the methodologies is conducted considering two independent datasets of deposit thickness measurements: an assimilation dataset and a validation dataset. Different evaluation metrics (e.g. RMSE, MBE, and SMAPE) are computed for the validation dataset, and the results are compared to two references: the ensemble prior mean and the EnKF analysis. Results show that the assimilation leads to a significant improvement over the first-guess results obtained from the simple ensemble forecast.
The evidence from this study suggests that the GNC method was the most skilful approach and represents a promising alternative for assimilation of volcanic fallout data. The spatial distributions of the tephra fallout deposit thickness and volume according to the GNC analysis are in good agreement with estimations based on field measurements and isopach maps reported in previous studies. On the other hand, although it is an interesting approach, the GIG method failed to improve on the EnKF analysis.
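The validation metrics named in this abstract (RMSE, MBE, SMAPE) are standard and can be sketched in a few lines; the function below is a generic illustration assuming 1-D arrays of observed and predicted deposit thicknesses, not code from the study:

```python
import numpy as np

def validation_metrics(observed, predicted):
    """Root-mean-square error, mean bias error, and symmetric mean
    absolute percentage error for a validation dataset."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    error = predicted - observed
    rmse = np.sqrt(np.mean(error ** 2))
    mbe = np.mean(error)  # positive values indicate overprediction
    smape = 100.0 * np.mean(
        np.abs(error) / ((np.abs(observed) + np.abs(predicted)) / 2.0)
    )
    return rmse, mbe, smape
```

Comparing these scores for the analysis against the ensemble prior mean is what quantifies the improvement brought by the assimilation.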
- Publication (Open Access): Enabling Dynamic and Intelligent Workflows for HPC, Data Analytics, and AI Convergence (2022-04-20)
The evolution of High-Performance Computing (HPC) platforms enables the design and execution of progressively larger and more complex workflow applications in these systems. The complexity comes not only from the number of elements that compose the workflows but also from the type of computations they perform. While traditional HPC workflows target simulations and modelling of physical phenomena, current needs also require data analytics (DA) and artificial intelligence (AI) tasks. However, the development of these workflows is hampered by the lack of proper programming models and environments that support the integration of HPC, DA, and AI, as well as the lack of tools to easily deploy and execute the workflows in HPC systems. To progress in this direction, this paper presents use cases where complex workflows are required and investigates the main issues to be addressed for the HPC/DA/AI convergence. Based on this study, the paper identifies the challenges of a new workflow platform to manage complex workflows. Finally, it proposes a development approach for such a workflow platform addressing these challenges in two directions: first, by defining a software stack that provides the functionalities to manage these complex workflows; and second, by proposing the HPC Workflow as a Service (HPCWaaS) paradigm, which leverages the software stack to facilitate the reusability of complex workflows in federated HPC infrastructures. Proposals presented in this work are subject to study and development as part of the EuroHPC eFlows4HPC project.
- Publication (Open Access): Data Assimilation of Volcanic Aerosols using FALL3D+PDAF (2022)
Modelling atmospheric dispersal of volcanic ash and aerosols is becoming increasingly valuable for assessing the potential impacts of explosive volcanic eruptions on infrastructures, air quality, and aviation. Management of volcanic risk and reduction of aviation impacts can strongly benefit from quantitative forecasting of volcanic ash. However, an accurate prediction of volcanic aerosol concentrations using numerical modelling relies on proper estimations of multiple model parameters which are prone to errors. Uncertainties in key parameters such as eruption column height, physical properties of particles, or meteorological fields represent a major source of error affecting the forecast quality. The availability of near-real-time geostationary satellite observations with high spatial and temporal resolutions provides the opportunity to improve forecasts in an operational context by incorporating observations into numerical models. Specifically, ensemble-based filters aim at converting a prior ensemble of system states into an analysis ensemble by assimilating a set of noisy observations. Previous studies dealing with volcanic ash transport have demonstrated that a significant improvement of forecast skill can be achieved by this approach. In this work, we present a new implementation of an ensemble-based Data Assimilation (DA) method coupling the FALL3D dispersal model and the Parallel Data Assimilation Framework (PDAF). The FALL3D+PDAF system runs in parallel, supports online-coupled DA, and can be efficiently integrated into operational workflows by exploiting high-performance computing (HPC) resources. Two numerical experiments are considered: (i) a twin experiment using an incomplete dataset of synthetic observations of volcanic ash and (ii) an experiment based on the 2019 Raikoke eruption using real observations of SO2 mass loading.
An ensemble-based Kalman filtering technique based on the Local Ensemble Transform Kalman Filter (LETKF) is used to assimilate satellite-retrieved data of column mass loading. We show that this procedure may lead to nonphysical solutions and, consequently, conclude that LETKF is not the best approach for the assimilation of volcanic aerosols. However, we find that a truncated state constructed from the LETKF solution approaches the real solution after a few assimilation cycles, yielding a dramatic improvement of forecast quality when compared to simulations without assimilation.
- Publication (Open Access): VIGIL: A Python tool for automatized probabilistic VolcanIc Gas dIspersion modeLling (2022)
Probabilistic volcanic hazard assessment is a standard methodology based on running a deterministic hazard quantification tool multiple times to explore the full range of uncertainty in the input parameters and boundary conditions, in order to probabilistically quantify the variability of outputs accounting for such uncertainties. Nowadays, different volcanic hazards are quantified by means of this approach. Among these, volcanic gas emission is particularly relevant given the threat posed to human health if concentrations and exposure times exceed certain thresholds. There are different types of gas emissions, but two main scenarios can be recognized: hot buoyant gas emissions from fumaroles and the ground, and dense gas emissions feeding density currents that can occur, e.g., in limnic eruptions. Simulation tools are available to model the evolution of critical gas concentrations over an area of interest. Moreover, in order to perform probabilistic hazard assessments of volcanic gases, simulations should account for the natural variability associated with aspects such as seasonal and daily wind conditions, localized or diffuse source locations, and gas fluxes. Here we present VIGIL (automatized probabilistic VolcanIc Gas dIspersion modeLling), a new Python tool designed to manage the entire simulation workflow involved in single and probabilistic applications of gas dispersion modelling. VIGIL is able to manage the whole process, from the meteorological data processing needed to run gas dispersion in both the dilute and dense gas flow scenarios, to the post-processing of model outputs. Two application examples are presented to show some of the modelling capabilities offered by VIGIL.
- Publication (Open Access): Long-term hazard assessment of explosive eruptions at Jan Mayen (Norway) and implications for air traffic in the North Atlantic (2022)
Volcanic eruptions are among the most threatening natural events due to their potential impacts on life, assets, and the environment. In particular, atmospheric dispersal of volcanic tephra and aerosols during explosive eruptions poses a serious threat to life and has significant consequences for infrastructures and global aviation safety. The volcanic island of Jan Mayen, located in the North Atlantic under trans-continental air traffic routes, is considered the northernmost active volcanic area in the world, with at least five eruptive periods recorded during the last 200 years. However, quantitative hazard assessments of the possible consequences for air traffic of a future ash-forming eruption at Jan Mayen are nonexistent. This study presents the first comprehensive long-term volcanic hazard assessment for the volcanic island of Jan Mayen in terms of ash dispersal and concentration at different flight levels. In order to characterize and model that potential impact, a probabilistic approach based on merging a large number of numerical simulations is adopted, varying the volcano's eruption source parameters (ESPs) and the meteorological scenario. Each ESP value is randomly sampled following a continuous probability density function (PDF) based on the Jan Mayen geological record. Over 20 years of meteorological data are considered in order to explore the natural variability associated with weather conditions, and are used to run thousands of simulations of the ash dispersal model FALL3D on a 2 km resolution grid. The simulated scenarios are combined to produce probability maps of airborne ash concentration, arrival time, and persistence of unfavorable conditions at flight levels 50 and 250 (FL050 and FL250).
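The sampling step described here, drawing each ESP from a continuous PDF to build an ensemble of scenarios, can be sketched as follows; the parameter names, distributions, and ranges are illustrative assumptions, not the values derived from the Jan Mayen geological record:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def sample_esps(n_scenarios):
    """Draw n_scenarios eruption source parameter (ESP) sets from
    assumed continuous PDFs for use in ensemble dispersal runs."""
    return {
        # column height in km: uniform over an assumed plausible range
        "column_height_km": rng.uniform(5.0, 20.0, n_scenarios),
        # erupted mass in kg: log-uniform, spanning orders of magnitude
        "erupted_mass_kg": 10.0 ** rng.uniform(10.0, 12.0, n_scenarios),
    }
```

Each sampled parameter set, paired with a randomly drawn meteorological window, would then parameterize one dispersal-model run.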
The resulting maps can serve as an aid to decision-makers and aviation stakeholders during the development of civil protection strategies, in assessing and preventing the potential impact of a future ash-rich eruption at Jan Mayen.
- Publication (Open Access): Validating gas dispersion modelling at La Solfatara (Campi Flegrei, South Italy) (2022)
Probabilistic hazard assessments of volcanic gases need to account for the natural variability associated with aspects such as weather conditions, source location, emission rate, and gas species. In order to quantitatively carry out these assessments, computational tools for gas dispersal need to be validated to demonstrate the reliability of the model results. Here we provide an exemplificative gas dispersal model validation at La Solfatara (a maar crater within the Campi Flegrei caldera), which hosts one of the largest and most hazardous fumarolic sites in the world, by using a workflow designed to automate the simulation strategy for probabilistic gas hazard assessments. This represents the first fundamental step towards gas hazard quantification in the area.
- Publication (Open Access): On the feasibility and usefulness of high-performance computing in probabilistic volcanic hazard assessment: An application to tephra hazard from Campi Flegrei (2022)
For active volcanoes, knowledge about probabilities of eruption and impacted areas is valuable information for decision-makers developing short- and long-term emergency plans, for which probabilistic volcanic hazard assessment (PVHA) is needed. High-resolution or spatially extended PVHA requires extreme-scale high-performance computing systems. Within the framework of ChEESE (Center of Excellence for Exascale in Solid Earth; www.cheese-coe.eu), an effort was made to generate exascale-suitable codes and workflows to collect and process within a few hours the large amount of data that a quality PVHA requires. To this end, we created an optimized HPC-based workflow, named PVHA_HPC-WF, to develop PVHA for a volcano. This tool uses the Bayesian event tree methodology to calculate eruption probabilities, vent-opening location(s), and eruptive source parameters (ESPs) based on volcano history, monitoring system data, and meteorological conditions. Then, the tool interacts with the chosen hazard model, performing a simulation for each ESP set or volcanic scenario (VS). Finally, the resulting information is processed by proof-of-concept high-performance data analytics (HPDA) scripts, producing hazard maps that describe the probability over time of exceeding critical thresholds at each location in the investigated geographical domain. Although PVHA_HPC-WF can be adapted to other hazards, we focus here on tephra (i.e., lapilli and ash) transport and deposition. As an application, we performed PVHA for Campi Flegrei (CF), Italy, an active volcano located in one of the most densely inhabited areas in Europe and under busy air traffic routes. CF is currently in unrest and is classified at attention level by the Italian Civil Protection. We consider an approximately 2,000 × 2,000 × 40 km computational domain with 2 km horizontal grid resolution and 40 vertical levels, centered on CF.
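The hazard-map step described above, estimating at each grid point the probability of exceeding a critical threshold across many simulated scenarios, reduces to a frequency count over the scenario ensemble. The sketch below assumes a stacked array of simulated values and is a generic illustration, not the project's HPDA scripts:

```python
import numpy as np

def exceedance_probability(simulated, threshold):
    """Hazard map: fraction of volcanic scenarios in which the simulated
    value exceeds a critical threshold at each grid cell.
    simulated: array of shape (n_scenarios, ny, nx)."""
    simulated = np.asarray(simulated, dtype=float)
    # boolean exceedance per scenario, averaged over the scenario axis
    return (simulated > threshold).mean(axis=0)
```

In a weighted PVHA, the plain mean over scenarios would be replaced by an average weighted by each scenario's probability.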
To explore the natural variability and uncertainty of the eruptive conditions, we consider a large number of VSs, allowing us to include those of low probability but high impact, and simulations of tephra dispersal are performed for each of them using the FALL3D model. Results show the potential of HPC to execute, in a timely manner, a vast range of simulations of complex numerical models on large high-resolution computational domains and to analyze great volumes of data to obtain quality hazard maps.
- Publication (Open Access): Stochastic modelling of explosive eruptive events at Galeras Volcano, Colombia (2021-01-18)
A statistical analysis of explosive eruptive events can give important clues on the behaviour of a volcano in both the time and size domains, producing crucial information for hazard assessment. In this paper, we analyse in these domains an up-to-date catalogue of eruptive events at Galeras volcano, collating data from the Colombian Geological Survey and from the Smithsonian Institution. The dataset appears to be complete, stationary, and consisting of independent events since 1820, for events of magnitude ≥2.6. In the time domain, inter-event times are fitted by various renewal models to describe the observed repose times. On the basis of the Akaike Information Criterion, the preferred model is the lognormal, with a characteristic time scale of ∼1.6 years. However, a tendency for the events to cluster in time into "eruptive cycles" is observed. Therefore, we perform a cluster analysis to objectively identify clusters of events: we find three plausible partitions into 6, 8, and 11 clusters of events with magnitude ≥2.6, the 6-cluster partition being the preferred one. The inter-event times between cluster onsets (inter-cluster) and between events belonging to the same cluster (intra-cluster) are also modelled by renewal models. For inter-cluster data, the preferred model is the Brownian Passage Time, describing a periodical occurrence (mean return time ∼36 years) perturbed by a Gaussian noise. For the intra-cluster explosions, the preferred model is the lognormal, with a characteristic time scale of ∼0.9 years. In the size domain, we analyse only single events, due to the low number of clusters. Considering two independent parts of the catalogue, we cannot reject the null hypothesis of the erupted mass being described by a power law, implying no characteristic eruption size.
Finally, looking for time- and size-predictability, we find a significant inverse linear relationship between the logarithm of the erupted mass during a cycle and the time to the subsequent one. These results suggest that, presently, Galeras is still in the eruption cycle started in 2007; a new eruptive cycle may be expected in a few decades, unless the present cluster resumes activity with magnitude ≥2.6.