Internal Versus Forced Variability Metrics for General Circulation Models Using Information Theory

Journal of Geophysical Research: Oceans (2024)

Abstract
Ocean model simulations exhibit variability arising from both intrinsic chaos and external forcing (air‐sea fluxes, river input, etc.). Estimating their respective contributions to total variability is important for attribution. Variance‐based estimates of variability can be unreliable in the presence of non‐Gaussian higher statistical moments. We demonstrate the use of non‐parametric information theory metrics, Shannon entropy and mutual information, for measuring internal and forced variability in ocean models. These metrics are applied to spatially and temporally averaged data and delineate the ratio of intrinsic to total variability over a wider range of circumstances than previous approaches based on variance ratios. The metrics are applied to (a) a synthetic ensemble of random vectors, (b) the ocean component of a global climate (GFDL‐ESM2M) large ensemble, and (c) an ensemble of a realistic coastal ocean model. The information theory metric qualitatively agrees with the variance‐based metric and can additionally identify regions of nonlinear correlations. In application (b), the climate ensemble, the information theory metric detects higher intrinsic temperature variability in the Arctic region than the variance metric does, illustrating that the former remains robust for a skewed probability distribution (Arctic sea surface temperature) resulting from sharply nonlinear behavior (the freezing point). In application (c), the coastal ensemble, variability is dominated by external forcing. Using different selectively forced ensembles, we quantify the sensitivity of the coastal model to each type of external forcing: variations in river runoff and changes in the wind product add no information (i.e., variability) during summer. Information theory thus enables ranking how much each forcing type contributes across multiple variables.
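As a minimal illustration of the kind of non-parametric metrics the abstract describes, the sketch below estimates Shannon entropy and mutual information from histograms of a synthetic ensemble (a shared "forced" signal plus member-specific "intrinsic" noise). This is not the paper's implementation; the binning choice, noise model, and the interpretation of inter-member mutual information as shared (forced) variability are illustrative assumptions.

```python
import numpy as np

def shannon_entropy(samples, bins=16):
    """Plug-in (histogram) estimate of Shannon entropy H(X), in bits."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=16):
    """Plug-in estimate of MI(X;Y) = H(X) + H(Y) - H(X,Y), in bits."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    hx = -np.sum(px[px > 0] * np.log2(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log2(py[py > 0]))
    hxy = -np.sum(pxy[pxy > 0] * np.log2(pxy[pxy > 0]))
    return hx + hy - hxy

# Synthetic ensemble: common forced signal + member-specific intrinsic noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0 * np.pi, 500)
forced = np.sin(t)                                           # external forcing
ensemble = forced + 0.3 * rng.standard_normal((10, t.size))  # 10 members

# MI between two members reflects their shared (forced) variability;
# a member's entropy not shared with others hints at intrinsic variability.
h = shannon_entropy(ensemble[0])
mi = mutual_information(ensemble[0], ensemble[1])
print(f"H(member 0) = {h:.2f} bits, MI(member 0; member 1) = {mi:.2f} bits")
```

With matched bin edges, the plug-in mutual information is bounded above by either marginal entropy, which gives a natural normalized ratio of shared to total variability.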