Standardization of Eddy Covariance Measurements: Role of Setup, Calculation and Filtering in Parallel and Long-term Datasets

crossref(2023)

Abstract
The eddy covariance (EC) technique is a widely accepted approach for monitoring greenhouse gas (GHG) and energy fluxes between ecosystems and the atmosphere. Its two dedicated instruments, the sonic anemometer and the gas analyzer, are available on the market in various designs and with various features. These many options, in addition to the diverse data-processing methods routinely used, are potential sources of uncertainty that can impede site-to-site comparisons. The performance and specifications of a single sensor do not necessarily reflect the uncertainty of the final measurements. As no analogous measurements exist that could be used for validation, research infrastructures (e.g., ICOS, NEON, or AmeriFlux) have standardized the technique in its different steps, including sensor selection, instrumental setup, and data processing. However, no perfect sensor exists that can handle all possible environmental conditions without concern.

This synthesis study is divided into two parts. First, the effect of standardization is analysed using data from 15 sites covering different climates and ecosystems where two EC systems ran in parallel, one of them standardized. The data were then processed both by the individual station teams and centrally, in order to evaluate differences due to setup and processing. The second part is a reprocessing of long-term data from 9 sites, with the objectives of understanding the effect of setup changes on a long time series and of verifying whether standardized processing can help harmonize historical datasets gathered with old instruments with newer datasets. The results show that the differences between the two systems and between processing schemes are site dependent, and that both setup and processing play a role.

The effect of standardization of the EC setup has been quantified, on average, at between 10 and 16 % for the carbon flux (FC), 11 and 19 % for the latent heat flux (LE), and 5 and 7 % for the sensible heat flux (H). Differences due to processing methods are in general smaller for the standardized setup (9 % in FC, 14 % in LE, and 10 % in H) than for the non-standardized setup (17 % in FC, 16 % in LE, and 12 % in H). Reprocessing the long-term data with the ICOS standard processing scheme helped reduce the effect of an instrumental setup shift from non-standard to ICOS, most prominently in the LE and H fluxes.

Because of the intricacy of the EC technique and its numerous steps (setup, calculation, and filtering), it is difficult to identify a single component that explains the variations and differences across all sites. Although standardization does not guarantee the accuracy of the absolute numbers, it does help to reduce differences when modest changes (in time and among sites) must be detected. Proper storage and organization of raw data and metadata is key for accurate data interpretation and future reanalysis.
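The pairwise percentage differences reported above can be computed as a mean absolute relative difference between the two parallel flux series, evaluated only over half-hours where both systems report a valid flux. The sketch below is a minimal illustration of that kind of comparison; the function name, the pairing convention (non-standardized relative to standardized), and the synthetic series are assumptions for the example, not the study's actual procedure or data.

```python
import numpy as np

def mean_relative_difference(flux_std, flux_nonstd):
    """Mean absolute relative difference (%) between two parallel
    half-hourly flux series, e.g. a standardized (reference) and a
    non-standardized EC system. Records where either system is
    missing (NaN) are excluded pairwise."""
    a = np.asarray(flux_std, dtype=float)
    b = np.asarray(flux_nonstd, dtype=float)
    # keep only records where both systems report a valid, nonzero flux
    ok = np.isfinite(a) & np.isfinite(b) & (np.abs(a) > 0)
    return 100.0 * np.mean(np.abs(a[ok] - b[ok]) / np.abs(a[ok]))

# illustrative synthetic series (NOT data from the study):
# a CO2 flux with a ~12 % systematic offset between the two systems
rng = np.random.default_rng(0)
fc_std = rng.normal(-5.0, 2.0, 1000)                   # umol m-2 s-1
fc_nonstd = fc_std * 1.12 + rng.normal(0.0, 0.1, 1000)
print(mean_relative_difference(fc_std, fc_nonstd))
```

Taking the reference flux in the denominator makes the metric asymmetric; which system serves as the reference (here, the standardized one) should be stated alongside any reported percentage.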