Tsunami risk management in the Exascale Era: Global advances and the European standpoint

Crossref (2023)

Abstract
Regional and local tsunami sources are a cliché of scientific disaggregation. From the physical perspective, despite emerging studies on cascading hazard and risk, hazard characterization often treats the tsunami as an individual event without addressing the effects of the primary hazard (typically a high-magnitude earthquake) that triggered it. Moreover, tsunami effects are partitioned into single processes, such as hydraulic effects or induced effects like debris transport, an approach often adopted when treating complex phenomena. From a technical perspective, describing cascading hazards and translating them into a composite loading pattern for the natural and built environments is challenging, and the difficulty increases considerably when fluid-soil interactions are considered. From a modeling perspective, physical and numerical simulations are employed to complement scarce databases of extreme tsunami events. However, the level of modeling sophistication needed to reproduce such complex phenomena is high, and uncertainties associated with natural phenomena and their modeling range from the genesis of the tsunami to structural and community response. The number and influence of these uncertainties pose an extraordinary concern when developing mitigation measures. From a risk management perspective, cascading natural and anthropogenic hazards constitute a challenge for combining safety requirements with financial, social, and ecological concerns. Risk management can benefit from strengthening the ties between natural hazards and engineering practitioners, linking science and industry, and promoting dialogue between risk analysts and policy-makers.

Ultimately, risk management requires heterogeneous data and information from real and synthetic origins. Yet the quality of the data used for risk management often depends on the computational resources (in terms of performance, energy, and storage capacity) needed to simulate complex multi-scale and multi-physics phenomena and to analyze large data sets. For example, the quality of numerical solutions often depends on the amount of data used to calibrate the models, and model runtimes must be aligned with time constraints (e.g., faster-than-real-time tsunami simulations for early warning systems). The North American platform Hazus is capable of producing risk maps. In European risk assessment, by contrast, there is a lack of integration and interaction between the results of the GEM and SERA projects and the TSUMAPS-NEAM project, intended to develop seismic and tsunami hazard studies, respectively. Computational modeling aids the advancement of scientific knowledge by aggregating the numerous factors involved and translating them into tsunami risk management policies.
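To make the faster-than-real-time constraint mentioned above concrete, the minimal sketch below checks whether a simulation's wall-clock runtime leaves a usable margin for early warning. The 2-hour propagation scenario, 6-minute runtime, and 10x margin are illustrative assumptions, not figures from this work.

```python
# Minimal sketch (hypothetical numbers): checking whether a tsunami simulation
# meets a faster-than-real-time constraint for early warning.

def speedup_factor(simulated_physical_time_s: float, wall_clock_runtime_s: float) -> float:
    """Ratio of physical time covered by the simulation to the time spent computing it.
    A value greater than 1 means the model runs faster than real time."""
    return simulated_physical_time_s / wall_clock_runtime_s

# Hypothetical example: 2 hours of tsunami propagation simulated in 6 minutes on an HPC system.
physical_time = 2 * 3600.0   # seconds of tsunami propagation being simulated
runtime = 6 * 60.0           # seconds of wall-clock compute time

factor = speedup_factor(physical_time, runtime)
print(f"Speed-up over real time: {factor:.1f}x")  # 20.0x

# An early-warning workflow typically needs a comfortable margin so that source
# estimation, ensemble runs, and dissemination still fit before the first waves
# arrive; the 10x threshold used here is purely illustrative.
assert factor >= 10.0, "Simulation too slow for the assumed early-warning budget"
```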
A global trend in geosciences and engineering is to develop sophisticated numerical schemes and to build computational facilities that can solve them, thereby aiming to reduce uncertainty levels and preparing the scientific (r)evolution for the so-called Exascale Era. The present work aims to gather multidisciplinary perspectives in a discussion about: 1) challenges to overcome in tsunami risk management, such as the sophistication of earthquake and tsunami numerical schemes; 2) uncertainty awareness and future needs to develop unanimous and systematic measures to reduce uncertainties associated with geophysical and engineering processes; 3) pros and cons of using HPC resources towards safety and operational performance levels; and 4) applicability to critical infrastructures.
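As a toy illustration of point 2, the sketch below propagates assumed source-parameter uncertainty to a coastal amplitude proxy using plain Monte Carlo sampling. The scaling relation `coastal_amplitude`, the magnitude range, and the slip distribution are placeholders standing in for a full earthquake-tsunami model chain, not results from this work.

```python
# Minimal sketch: Monte Carlo propagation of source-parameter uncertainty to a
# simple hazard metric. All numbers and the scaling relation are placeholders.

import random
import statistics

def coastal_amplitude(magnitude: float, slip_m: float) -> float:
    """Toy proxy for near-shore tsunami amplitude (metres); stands in for a full
    numerical earthquake-tsunami simulation."""
    return 0.02 * slip_m * 10 ** (0.5 * (magnitude - 8.0))

random.seed(42)
samples = []
for _ in range(10_000):
    magnitude = random.uniform(8.0, 9.0)   # assumed epistemic range of the source magnitude
    slip_m = random.gauss(10.0, 2.5)       # assumed fault slip distribution (metres)
    samples.append(coastal_amplitude(magnitude, max(slip_m, 0.0)))

mean_amp = statistics.mean(samples)
p90 = statistics.quantiles(samples, n=10)[-1]  # 90th percentile of the sampled amplitudes
print(f"Mean amplitude: {mean_amp:.2f} m, 90th percentile: {p90:.2f} m")
```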