Standardization of Eddy Covariance Measurements: Role of Setup, Calculation and Filtering in Parallel and Long-term datasets
- 1National Research Council (CNR), Research Institute on Terrestrial Ecosystems (IRET), Porano, Italy (sundas.shaukat@iret.cnr.it)
- 2University of Tuscia, Viterbo, Italy
- 3Euro-Mediterranean Center on Climate Change, IAFES Division, Viterbo, Italy
- *A full list of authors appears at the end of the abstract
The eddy covariance (EC) technique is a widely accepted approach for monitoring GHG and energy fluxes between ecosystems and the atmosphere. Its two dedicated instruments, the sonic anemometer and the gas analyser, are available on the market in various designs and with different features. These many options, in addition to the diverse data processing methods that are routinely used, are potential sources of uncertainty that can impede site-to-site comparisons. The performance and specifications of a single sensor do not necessarily reflect the uncertainty of the final measurements. As no analogous measurements are available for validation, research infrastructures (e.g., ICOS, NEON or AmeriFlux) have standardized the technique in its different steps, including sensor selection, instrumental setup, and data processing. However, no sensor exists that can handle all possible environmental conditions without potential concerns.
This synthesis study is divided into two parts. First, the effect of standardization is analysed using data from 15 sites covering different climates and ecosystems, where two EC systems run in parallel, one of them standardized. The data are then processed both by the individual station teams and centrally, to evaluate differences due to setups and processing. The second part is the reprocessing of long-term data from 9 sites, with the objectives of understanding the effect of setup changes on a long time series and of verifying whether a standardized processing can help harmonize historical datasets gathered with older instruments with newer datasets. Results show that differences between the two systems and between processing schemes are site dependent, and that both setup and processing play a role.
The effect of standardization of the EC setup has been quantified on average between 10 and 16 % in the carbon flux (FC), 11 and 19 % in the latent heat flux (LE) and 5 and 7 % in the sensible heat flux (H). Differences due to processing methods are in general smaller for the standardized setup (9 % in FC, 14 % in LE and 10 % in H) than for the non-standardized setup (17 % in FC, 16 % in LE and 12 % in H). Reprocessing the long-term data with the ICOS standard processing scheme helped reduce the effect of the instrumental setup shift from non-standard to ICOS, most prominently in the LE and H fluxes.
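As an illustration of how relative differences between parallel systems can be quantified, the sketch below pairs half-hourly fluxes from the standardized and non-standardized systems and computes a median absolute relative difference. The column names, the near-zero-flux threshold and the use of the median are assumptions made for illustration only; they are not the metric actually used in the study.

```python
import numpy as np
import pandas as pd

def median_relative_difference(flux_std, flux_nonstd):
    """Median absolute relative difference (%) between paired half-hourly fluxes.

    flux_std, flux_nonstd : pandas Series indexed by timestamp, one value per
    half hour, from the standardized and non-standardized EC systems.
    """
    # Keep only half hours where both systems report a valid flux.
    paired = pd.concat({"std": flux_std, "nonstd": flux_nonstd}, axis=1).dropna()
    # Relative difference of the non-standardized system with respect to the
    # standardized one, excluding near-zero reference fluxes (hypothetical threshold).
    ref = paired["std"]
    valid = ref.abs() > 0.1
    rel_diff = (paired.loc[valid, "nonstd"] - ref[valid]).abs() / ref[valid].abs()
    return 100.0 * rel_diff.median()

# Example with synthetic half-hourly carbon flux data (umol m-2 s-1).
rng = np.random.default_rng(0)
t = pd.date_range("2021-06-01", periods=48 * 30, freq="30min")
fc_std = pd.Series(-5.0 + 3.0 * rng.standard_normal(len(t)), index=t)
fc_nonstd = fc_std * (1.0 + 0.12 * rng.standard_normal(len(t)))  # ~12 % scatter

print(f"Median relative difference: {median_relative_difference(fc_std, fc_nonstd):.1f} %")
```

The same comparison can be repeated per site and per flux (FC, LE, H), which is how site-dependent differences such as those reported above would become visible.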
Because of the complexity of the EC technique and its numerous steps (setup, calculation, and filtering), it is difficult to identify a single component that explains the variations and differences across all sites. Although standardization does not guarantee the accuracy of the absolute numbers, it does help to reduce differences when small changes (in time and among sites) must be detected. Proper storage and organization of raw data and metadata are key for accurate data interpretation and future reanalysis.
Antje Maria Moffat (4), Luca Belelli Marchesini (5), Bernard Heinesch (6), Daniel Berveiller (7), Christian Brummer (8), Aurore Brut (9), Christophe Chipeaux (10), Alexander Graf (11), Radek Czerný (12), Thomas Grünwald (13), Lukas Hörtnagl (14), Anne Klosterhalfen (15), Sébastien Lafont (10), Leonardo Montagnani (16), Tanguy Manise (6), Virginie Moreaux (17), Matthias Peichl (15), Frederik Schrader (8), Tiphaine Tallec (9), Ivan Mammarella (18), Pasi Kolari (18)
How to cite: Shaukat, S., Sabbatini, S., Nicolini, G., and Papale, D. and the ICOS-PIs: Standardization of Eddy Covariance Measurements: Role of Setup, Calculation and Filtering in Parallel and Long-term datasets, EGU General Assembly 2023, Vienna, Austria, 24–28 Apr 2023, EGU23-16593, https://doi.org/10.5194/egusphere-egu23-16593, 2023.