Over 600 flux stations are currently operational within continental and national flux networks under the umbrella of the global FLUXNET network, and dozens more operate in smaller dedicated networks and standalone projects. In total, more than 2100 flux tower locations have provided data over the years, covering large portions of the globe.

In the last 10 years, individuals, research groups, and regional and national networks have developed a wide range of advanced software, code and routines to analyze this wealth of flux and ancillary data. From raw-data processing through multi-method data cleaning, gap filling and flux partitioning to flux-footprint budgeting and bottom-up modeling, these tools offer unprecedented analytical power to the ecosystem flux community.

This session brings together scientists and developers who have built such tools using the latest flux methodology, ecosystem measurements, new instrumentation, and modern computing, in order to provide ecosystem researchers, flux scientists and ecosystem modelers with an orientation map of the available tools.

If you have developed a tool or code that helps in understanding ecosystem processes or facilitates data processing, this is the right session to submit to.

Convener: Tarek S. El-Madany | Co-conveners: Frank Griessbaum, Torsten Sachs
Attendance: Fri, 08 May, 14:00–15:45 (CEST)


Chat time: Friday, 8 May 2020, 14:00–15:45

Chairperson: Tarek, Frank, Torsten
D557 |
Junbin Zhao, Holger Lange, and Helge Meissner

Soil respiration is an important ecosystem process that releases carbon dioxide into the atmosphere. While soil respiration can be measured continuously at high temporal resolution, gaps in the data are inevitable, leading to uncertainties in carbon budget estimates. Robust gap-filling methods are therefore needed. Process-based non-linear least squares (NLS) regression, which exploits the established relationship between soil respiration and temperature, is the most widely used gap-filling method. In addition to NLS, we implemented three other methods based on: 1) artificial neural networks (ANN), driven by temperature and moisture measurements; 2) singular spectrum analysis (SSA), relying only on the time series itself; and 3) the expectation-maximization (EM) approach, which references parallel flux measurements in the spatial vicinity. Six soil respiration datasets (2017-2019) from two boreal forests were used for benchmarking. Artificial gaps were randomly introduced into the datasets and then filled using the four methods. The time-series-based methods, SSA and EM, were more accurate than NLS and ANN for small gaps (<1 day). For larger gaps (15 days), NLS, SSA and EM performed similarly, whereas ANN showed large errors in gaps that coincided with precipitation events. Compared to the observations, data gap-filled by SSA showed a similar degree of variance, and data filled by EM had similar first-order autocorrelation coefficients; in contrast, data filled by NLS and ANN exhibited lower variance and higher autocorrelation than the observations. For estimates of the annual soil respiration budget, NLS, SSA and EM produced satisfactory results with budget errors < 6%, while ANN exhibited larger errors of up to 16%. Our study highlights the potential of the two time-series-based methods for gap-filling carbon flux data, especially when other environmental variables are unavailable.
The R code to perform the gap-filling with the four methods in this study is incorporated into the R package “FluxGapsR” freely available at https://github.com/junbinzhao/FluxGapsR/.
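The process-based NLS idea can be sketched as follows. This is a minimal illustration, not the FluxGapsR implementation (which is in R): it fits a Q10-style exponential temperature response to the available data and predicts respiration for the gaps; the function names and the choice of a 10 °C reference temperature are placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

def q10_model(t_soil, r_ref, q10):
    """Exponential temperature response: R = R_ref * Q10^((T - 10) / 10)."""
    return r_ref * q10 ** ((t_soil - 10.0) / 10.0)

def fill_gaps_nls(resp, t_soil):
    """Fill NaN gaps in a soil-respiration series by regressing the
    available observations against soil temperature (nonlinear least
    squares) and predicting respiration for the gap positions."""
    resp = np.asarray(resp, dtype=float)
    t_soil = np.asarray(t_soil, dtype=float)
    ok = ~np.isnan(resp) & ~np.isnan(t_soil)
    params, _ = curve_fit(q10_model, t_soil[ok], resp[ok], p0=(1.0, 2.0))
    filled = resp.copy()
    gaps = np.isnan(resp) & ~np.isnan(t_soil)
    filled[gaps] = q10_model(t_soil[gaps], *params)
    return filled, params
```

The same fit-then-predict pattern underlies all process-based gap-filling; the time-series-based methods (SSA, EM) differ in that they need no temperature driver at all.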

How to cite: Zhao, J., Lange, H., and Meissner, H.: Gap-filling continuously-measured soil respiration data with time-series-based methods, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-2174, https://doi.org/10.5194/egusphere-egu2020-2174, 2020.

D558 |
Ladislav Šigut, Pavel Sedlák, Milan Fischer, Georg Jocher, Thomas Wutzler, Marian Pavelka, and Matthias Mauder

The eddy covariance method provides important insights into CO2, water and energy exchange-related processes at the ecosystem scale. Data are collected quasi-continuously at sampling frequencies of 10 Hz or higher, often over multiple years, producing large datasets. Standard data processing methods are established but undergo continuous refinement that should be reflected in the available software. A suite of software packages is available for computing half-hourly products from high-frequency raw data; however, packages consolidating the further post-processing computations are not yet as common. These post-processing steps can include quality control, footprint modelling, computation of storage fluxes, gap-filling, flux partitioning and data aggregation. They may also be implemented in different programming languages and require various input data formats. Users therefore often evaluate only certain aspects of a dataset, which limits the amount of information extracted and may cause them to overlook features that affect data quality or interpretation. Here we present the free R package openeddy, which provides utilities for input data handling, extended quality control checks, data aggregation and visualization, and which includes a workflow that attempts to integrate all post-processing steps by incorporating other free software packages, such as REddyProc. The framework is designed for the standard set of eddy covariance fluxes, i.e. momentum, latent and sensible heat, and CO2. Special attention was paid to visualizing results at different stages of processing and at different time resolutions and aggregation steps. This makes it possible to quickly check that computations were performed as expected and helps to spot issues in the dataset itself.
Finally, the proposed folder structure with defined post-processing steps makes it possible to organize data at different stages of processing for improved ease of use. The produced workflow files document the whole processing chain and its possible adaptations for a given site. We believe that such a tool can be particularly useful for eddy-covariance novices, for groups that cannot or do not contribute their data to regional networks for further processing, and for users who want to evaluate their data independently. This and similar efforts can also save human resources and speed up the development of new methods.
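The quality-control and aggregation steps of such a post-processing chain can be sketched in a few lines. This is not the openeddy API (openeddy is an R package); it is a pandas sketch assuming the common 0 (best) / 1 (acceptable) / 2 (reject) flag scheme for half-hourly fluxes, with hypothetical column names.

```python
import numpy as np
import pandas as pd

def daily_means(df, flux_cols, max_flag=1, min_coverage=0.5):
    """Mask half-hourly flux values whose QC flag exceeds max_flag,
    aggregate to daily means, and drop days with insufficient coverage.

    Expects a DatetimeIndex and, for each flux column 'x', a companion
    flag column 'qc_x' using the 0/1/2 quality-flag convention.
    """
    out = {}
    for col in flux_cols:
        vals = df[col].where(df["qc_" + col] <= max_flag)  # reject flagged records
        daily = vals.resample("D").mean()
        coverage = vals.notna().resample("D").mean()       # fraction of valid half-hours
        out[col] = daily.where(coverage >= min_coverage)
    return pd.DataFrame(out)
```

Making the coverage requirement explicit is what prevents poorly sampled days from silently biasing annual sums, one of the issues a consolidated workflow is meant to catch.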

This work was supported by the Ministry of Education, Youth and Sports of the Czech Republic within the CzeCOS program, grant number LM2015061, and within Mobility CzechGlobe 2, grant number CZ.02.2.69/0.0/0.0/18_053/0016924.

How to cite: Šigut, L., Sedlák, P., Fischer, M., Jocher, G., Wutzler, T., Pavelka, M., and Mauder, M.: Unlocking the potential of eddy covariance data with the R software package openeddy, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-2382, https://doi.org/10.5194/egusphere-egu2020-2382, 2020.

D559 |
Thomas Wutzler, Mirco Migliavacca, and Kendalynn Morris

Soil CO2 efflux data from automated chambers provide an important constraint on ecosystem and soil respiration. Usually, half-hourly time series of several replicated chambers have to be aggregated to plot level while gaps in the time series have to be accommodated. Gaps cause jumps and other problems when replicated measurements are aggregated in each half-hour; therefore, lookup tables and machine learning approaches are commonly used to fill gaps beforehand.

Here, we present an alternative fully Bayesian approach for the combined gap-filling and aggregation based on Integrated Nested Laplace Approximation (INLA). This method integrates all information from every measurement across replicates and across time and therefore efficiently estimates the correlation structure among all observations. It provides the full marginal posterior distribution of the aggregated time series at the plot level across the time span of the time series. We compare several aggregation approaches using four years of data from 16 automatic chambers at the eddy-covariance site in Majadas de Tietar in Spain (ES-LM1, ES-LMa).

This approach is applicable for other replicated time series as well. We further explore its usage for analysing time-varying effects across treatments and habitats and its usage for gap-filling net ecosystem exchange (NEE) data based on the full correlation structure in a data-cube of time and environmental conditions.
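The jump problem motivating this work is easy to demonstrate: when chambers have different baseline efflux levels, a naive per-half-hour mean shifts abruptly whenever one chamber drops out. A crude numpy stand-in for the fix is to remove each chamber's mean offset before averaging; the full INLA model goes much further, additionally modeling temporal correlation and returning full marginal posteriors.

```python
import numpy as np

def aggregate_with_offsets(chambers):
    """Aggregate replicated chamber series (rows = chambers, columns =
    half-hours, NaN = gap) to a plot-level mean after removing each
    chamber's mean offset, so that gaps no longer cause jumps."""
    chambers = np.asarray(chambers, dtype=float)
    grand = np.nanmean(chambers)
    offsets = np.nanmean(chambers, axis=1, keepdims=True) - grand
    return np.nanmean(chambers - offsets, axis=0)
```

With two chambers offset by a constant and a gap in one of them, the naive mean jumps at the gap while the offset-corrected aggregate stays flat.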

How to cite: Wutzler, T., Migliavacca, M., and Morris, K.: Effective aggregation of gappy replicated time series using INLA, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-3645, https://doi.org/10.5194/egusphere-egu2020-3645, 2020.

D560 |
Gerardo Fratini, Israel Begashaw, Andreas Burkart, John Gamon, Kaiyu Guan, David Johnson, Tommaso Julitta, Gilberto Pastorello, Karolina Sakowska, Mong-Kuen Sun, Lynn Woodford, and George Burba

Data from thousands of past and present eddy covariance flux stations are available across the globe, and several hundred stations are actively operating as individual process-level studies, small flux networks dedicated to specific research goals, and larger national and continental networks with broad ecological and environmental foci.

Many flux stations have weather and soil data to help clean, analyze and interpret the fluxes, but most do not have optical proximal sensors, do not allow straightforward coupling with remote sensing data (from drones, aircraft, satellites, etc.), and cannot easily be used for validation of remotely sensed products, ecosystem modeling, or upscaling from field to regional levels. The flux source areas themselves (e.g., flux footprints) are typically not defined in the flux datasets, and the time stamps of the fluxes come in a large number of outdated, non-trackable formats. Finally, traditional approaches to flux data quality control, analysis and interpretation require the participation of a micrometeorological expert (or an entire network) with custom code or exceptional skill in software such as MATLAB or VBA tools in Excel. These key issues effectively prevent the larger environmental research and remote sensing communities from fully utilizing eddy covariance flux data.

In 2016-2020, a set of new tools to collect, process, analyze, time- and space-allocate, and share time-synchronized flux data from multiple flux stations was developed and deployed globally. These tools can address most or all of the key issues listed above. The fully automated FluxSuite system combines hardware, software and web services and does not require an expert to run it. It can be incorporated into a new flux station or added to an existing station using a weatherized, remotely accessible microcomputer, SmartFlux 3, which utilizes EddyPro software to calculate fully processed fluxes in near-real time, alongside biomet data and flux footprints. All data are merged into a single quality-controlled file timed using the Precision Time Protocol (PTP). Remote sensing researchers and modelers without physical stations of their own can form “virtual networks” of actual stations by collaborating with tower PIs from different physical networks and flux databases.

The latest development in this overall approach is the flux data analysis software Tovi, designed to seamlessly ingest data from flux stations and to allow non-micrometeorologists to quality control, analyze and interpret flux data. Using an interactive and intuitive GUI, it enables rapid execution of QC/QA and data analysis steps that were time-consuming and complicated in the past, as well as steps that were previously virtually impossible, including the advanced footprint calculations and flux apportioning needed by the remote sensing community, NEE flux partitioning, and automated generation of workflow-specific reference lists.

This presentation will show how combinations of these new tools are used by major networks, and describe how this approach can be utilized to match remote sensing and tower data for ground-truthing, improve scientific interactions, and promote better utilization of eddy covariance flux data by the wider environmental research community.

How to cite: Fratini, G., Begashaw, I., Burkart, A., Gamon, J., Guan, K., Johnson, D., Julitta, T., Pastorello, G., Sakowska, K., Sun, M.-K., Woodford, L., and Burba, G.: Eddy Covariance Flux Data: Sitting on a Golden Egg, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-5652, https://doi.org/10.5194/egusphere-egu2020-5652, 2020.

D561 |
Felix Nieberding, Gerardo Fratini, Christian Wille, Yaoming Ma, and Torsten Sachs

When using open-path infrared gas analyzers (IRGAs), lens contamination, calibration inaccuracies and ageing internal chemicals may lead to a slow drift in gas concentration measurements. During flux calculations, these errors are usually not accounted for, under the assumption that a slow drift in mean gas concentrations (i.e. over several weeks to months) does not affect the estimation of turbulent fluctuations and, hence, of covariances. This is, however, not the case. Serrano-Ortiz et al. (2008) estimated that an underestimation of the CO2 concentration propagates into an overestimation of carbon uptake via the WPL correction. In addition, for instruments such as the widely used LI-7500, where a nonlinear calibration curve relates raw absorptance measurements to gas concentration, Fratini et al. (2014) showed that errors in mean concentrations leak into errors in fluxes through amplified or dampened estimated fluctuations. Both effects can be eliminated, possibly completely, if the drift in gas concentration is corrected before raw data processing. To do so, the offset between measured and reference (i.e. "true") gas concentrations has to be quantified and converted into the corresponding zero-absorptance biases. We performed this drift correction on a 15-year dataset from the Tibetan Plateau, where an extreme drift in concentration measurements occurred. Due to the remote location, user calibrations were performed irregularly, and no independent gas concentration measurements are available. We therefore used the CO2 concentration measurements from the Mauna Loa atmospheric observatory (NOAA ESRL Global Monitoring Division, 2018, updated annually) as the reference concentration to derive an offset for every half-hourly measurement. The offset in H2O mixing ratios could be determined from auxiliary low-frequency measurements of relative humidity, temperature and air pressure.
We then converted mixing ratios to raw absorptances using the instrument-specific calibration curve and applied an absorptance offset to every 10 Hz measurement, thus eliminating the long-term drift in both the mean values and the fluctuations of the gas concentrations. The corrected raw data time series were then used to calculate fluxes and apply the subsequent corrections, including despiking, axis rotation, detrending, and corrections for spectral attenuation and air density fluctuations. Compared to the uncorrected fluxes, the corrected fluxes yielded considerably lower carbon uptake during daytime and in summer, consistent with the instrument's theory of operation and with the results of the numerical simulations and field data analysis of Fratini et al. (2014).
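The first step of this correction, quantifying a slowly varying concentration offset against a reference series, can be sketched as follows. This is an illustrative numpy sketch with hypothetical names, using a simple running mean of the measured-minus-reference difference; the instrument-specific step of converting the offset into a zero-absorptance bias via the calibration curve (Fratini et al., 2014) is omitted.

```python
import numpy as np

def concentration_offset(measured, reference, window=48 * 30):
    """Estimate a slowly varying concentration offset (instrument drift)
    as a centered running mean of (measured - reference) over roughly
    30 days of half-hourly data, ignoring NaN gaps."""
    diff = np.asarray(measured, dtype=float) - np.asarray(reference, dtype=float)
    offset = np.full(diff.shape, np.nan)
    half = window // 2
    for i in range(len(diff)):
        win = diff[max(0, i - half):i + half + 1]
        if np.isfinite(win).any():
            offset[i] = np.nanmean(win)
    return offset
```

Subtracting this offset from the measured series (or, more rigorously, correcting the raw absorptances) removes the multi-week drift while leaving the turbulent fluctuations untouched.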


Fratini, G., McDermitt, D.K. & Papale, D. (2014) Eddy-covariance flux errors due to biases in gas concentration measurements: origins, quantification and correction. Biogeosciences, 11, 1037–1051.

NOAA ESRL Global Monitoring Division (2018, updated annually) Atmospheric Carbon Dioxide Dry Air Mole Fractions from quasi-continuous measurements at Mauna Loa, Hawaii, Barrow, Alaska, American Samoa and South Pole. Compiled by K.W. Thoning, D.R. Kitzis, and A. Crotwell., 2019th edn. NOAA ESRL GMD CCGG Group, Boulder, Colorado, USA.

Serrano-Ortiz, P., Kowalski, A.S., Domingo, F., Ruiz, B. & Alados-Arboledas, L. (2008) Consequences of Uncertainties in CO2 Density for Estimating Net Ecosystem CO2 Exchange by Open-path Eddy Covariance. Boundary-Layer Meteorology, 126, 209–218.

How to cite: Nieberding, F., Fratini, G., Wille, C., Ma, Y., and Sachs, T.: Addressing eddy-covariance flux errors due to uncertainties in open path IRGA operation under harsh environmental conditions on the Tibetan Plateau, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-5660, https://doi.org/10.5194/egusphere-egu2020-5660, 2020.

D562 |
| solicited
Bart Schilperoort, Karl Lapo, Anita Freundorfer, and Bas des Tombe

Distributed Temperature Sensing (DTS) using fiber-optic cables is a promising technique capable of filling critical gaps between point observations and remote sensing. While DTS directly measures only the fiber temperature, it has been used to make spatially distributed observations of air temperature, wet-bulb temperature, wind speed, and more, on scales of centimeters to kilometers and at temporal resolutions as fine as a second. Of particular interest for the flux community, the spatially distributed nature of DTS allows us to place point observations within a spatial context, highlighting missing physics and linking processes across scales.

However, DTS is not without its drawbacks. It is not a push-button operation: each DTS array is unique, requiring an exceptional investment of time in deployment and in turning DTS observations into physically meaningful results. The characteristics of DTS observations change with the DTS device used, but also with, e.g., the type of fiber, the layout of the fiber-optic array, and the properties of the reference sections used in calibration. These issues create two main challenges in processing DTS data: 1) the need for a robust calibration, and 2) the management of data that can exceed a terabyte, especially for large or long-term installations. To address these challenges and simplify the use of this powerful technique, we present two tools, which can be used standalone or in conjunction with each other.

The first is ‘python-dts-calibration’, a Python package aimed at performing thorough calibration of DTS data, since the calibration performed by DTS devices themselves is often lacking in quality. It performs a more robust calibration than the device default and provides confidence intervals for the calibrated temperature; these intervals vary along the fiber and over time and differ for every setup. The second tool, ‘pyfocs’, is a Python package for managing larger, long-term installations. It automates the workflow, including checking data integrity, calibration, and physically mapping the data. pyfocs incorporates ‘python-dts-calibration’ at its core, allowing it to robustly calibrate any DTS configuration. Finally, the package provides the option of calculating other parameters, such as wind speed.

Both tools are open-source and hosted on GitHub[1][2], allowing for everyone to check the code and suggest changes. By sharing our tools, we hope to make the use of fiber optic DTS in geosciences easier and open the door of this new technology to non-specialists.


[1] https://github.com/dtscalibration/python-dts-calibration

[2] https://github.com/klapo/pyfocs
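The physics behind such a calibration can be sketched independently of the packages. Single-ended DTS calibration rests on the standard equation T(x) = γ / (ln(Ps/Pas) + C + Δα·x), whose three parameters can be fitted from reference sections of known temperature. The numpy sketch below shows this fit (linearized, so it becomes an ordinary least-squares problem); it illustrates the underlying equation only and is not the python-dts-calibration API, which additionally propagates uncertainty to confidence intervals.

```python
import numpy as np

def calibrate_single_ended(x_ref, t_ref_kelvin, log_ratio_ref):
    """Fit gamma, C and delta_alpha in T(x) = gamma / (ln(Ps/Pas) + C
    + delta_alpha*x) from reference sections of known temperature.
    Linearized form: ln(Ps/Pas) = gamma/T - C - delta_alpha*x."""
    x_ref = np.asarray(x_ref, dtype=float)
    A = np.column_stack([1.0 / np.asarray(t_ref_kelvin, dtype=float),
                         -np.ones(len(x_ref)),
                         -x_ref])
    gamma, c, dalpha = np.linalg.lstsq(A, np.asarray(log_ratio_ref), rcond=None)[0]
    return gamma, c, dalpha

def dts_temperature(x, log_ratio, gamma, c, dalpha):
    """Convert the Stokes/anti-Stokes log-ratio along the fiber to temperature (K)."""
    return gamma / (np.asarray(log_ratio) + c + dalpha * np.asarray(x))
```

Two reference baths at different temperatures, each spanning several fiber positions, are enough to make the three-parameter system well determined.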

How to cite: Schilperoort, B., Lapo, K., Freundorfer, A., and des Tombe, B.: Untangling fiber optic Distributed Temperature Sensing: Getting the right temperature and getting there smoothly, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-7821, https://doi.org/10.5194/egusphere-egu2020-7821, 2020.

D563 |
Matthias Mauder and the Energy-balance closure correction evaluation team

The apparent lack of surface energy balance closure is one of the most crucial challenges in the measurement of biosphere-atmosphere exchange. In principle, this issue can have a variety of causes, including instrumental errors and errors introduced in the data processing chain. In addition, secondary circulations have been identified as one of the main reasons for non-closure of the surface energy balance, since the associated energy transport cannot be captured by common eddy-covariance tower flux measurements. When present, neglecting this process results in an underestimation of the turbulent fluxes. Secondary circulations can, however, be represented by means of large-eddy simulations, which have been employed to develop a novel semi-empirical model to correct for the missing large-scale flux (De Roo et al. 2018, DOI 10.1371/journal.pone.0209022). In this study, we compare the results of this process-based method with two previously published bulk-correction methods (Mauder et al. 2013, DOI 10.1016/j.agrformet.2012.09.006; Charuchittipan et al. 2014, DOI 10.1007/s10546-014-9922-6). The three correction methods are applied at multiple sites in different biomes around the world. Independent energy flux data from these sites are used to assess which method yields the most reliable results, and we discuss the limitations of these correction methods with respect to meteorological conditions and site characteristics, such as measurement height, landscape-scale heterogeneity and terrain complexity.
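Of the bulk corrections compared, the Mauder et al. (2013) approach distributes the energy-balance residual onto the turbulent fluxes while preserving the Bowen ratio H/LE, which amounts to scaling both fluxes by the inverse energy balance ratio. A minimal sketch (the Charuchittipan et al. (2014) variant instead partitions the residual according to the buoyancy flux, assigning a larger share to H):

```python
def close_energy_balance(h, le, rn, g):
    """Distribute the energy-balance residual Rn - G - H - LE onto the
    turbulent fluxes while preserving the Bowen ratio H/LE, i.e. scale
    both fluxes by (Rn - G) / (H + LE).  Returns (h_corr, le_corr)."""
    total = h + le
    if total == 0:
        raise ValueError("H + LE must be nonzero")
    factor = (rn - g) / total
    return h * factor, le * factor
```

After correction the turbulent fluxes sum to the available energy Rn - G, and the Bowen ratio of the input fluxes is unchanged.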

How to cite: Mauder, M. and the Energy-balance closure correction evaluation team: Evaluation of energy balance closure correction methods for multiple eddy-covariance sites in different biomes, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-9319, https://doi.org/10.5194/egusphere-egu2020-9319, 2020.

D564 |
Gianluca Filippa, Edoardo Cremonese, Marta Galvagno, and Mirco Migliavacca

Flux towers are increasingly equipped with digital cameras (phenocams) that are widely used to track canopy greenness. Phenocam-derived vegetation indices can capture land surface phenology as well as the seasonality of gross primary production (GPP) estimated from eddy covariance (EC) measurements. In addition, phenocams can be used to track the seasonal development of different species or individuals within the same image scene and to evaluate spatial variability within the footprint of EC measurements. Phenocams have also recently been used to quantify disturbances such as late frosts, fires and storms in forested ecosystems, and the impact of climate extremes on ecosystem functioning. With the recent rapid spread of phenocams, the need for up-to-date, efficient, open-source software is also increasing tremendously. The phenopix R package was developed for this purpose. In this contribution, we provide an overview of the software's capabilities, with a special focus on how EC measurements can benefit from phenocam data streams.

The steps of a basic processing workflow will be illustrated: drawing a region of interest (ROI) on an image; extracting red, green and blue digital numbers from a seasonal series of images; plotting greenness index trajectories; fitting a curve to the seasonal trajectories; extracting relevant phenological thresholds (phenophases); and characterizing phenophase uncertainties. Particular attention will be given to recent software developments, including the calculation of camera-derived NDVI and other infrared-based indices, and the handling of shifts in the camera's field of view.
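The core of such a workflow is compact. As an illustration (in Python; phenopix itself is R and offers several fitting and thresholding options), the green chromatic coordinate GCC = G/(R+G+B) summarizes ROI-averaged digital numbers, and one common phenophase definition is the day the GCC trajectory crosses a fixed fraction of its seasonal amplitude:

```python
import numpy as np

def green_chromatic_coordinate(r, g, b):
    """Green chromatic coordinate from ROI-averaged digital numbers."""
    r, g, b = (np.asarray(a, dtype=float) for a in (r, g, b))
    return g / (r + g + b)

def amplitude_threshold_phenophase(gcc, frac=0.5):
    """Index of the first time step at which the GCC series rises above
    its baseline plus `frac` of the seasonal amplitude -- one common way
    to define, e.g., the start of season."""
    gcc = np.asarray(gcc, dtype=float)
    threshold = gcc.min() + frac * (gcc.max() - gcc.min())
    return int(np.nonzero(gcc >= threshold)[0][0])
```

In practice the threshold is applied to a fitted curve (e.g. a double logistic) rather than to the raw trajectory, which is also how uncertainty in the phenophase date can be characterized.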

How to cite: Filippa, G., Cremonese, E., Galvagno, M., and Migliavacca, M.: The various facets of digital repeat photography fully exploited with the phenopix R package, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-17955, https://doi.org/10.5194/egusphere-egu2020-17955, 2020.

D565 |
Jacob Nelson

Here we present an overview of methods for partitioning evapotranspiration (ET) from eddy covariance data, focusing on methods that use only the core energy and carbon fluxes and meteorological data and do not require supplemental measurements or campaigns. A comparison of three such methods for estimating transpiration (T) showed high correlations between them (R2 of daily T between 0.80 and 0.87) and higher correlations with daily stand T estimates from sap flow data (R2 between 0.58 and 0.66) than the tower ET itself (R2 = 0.49). However, the three methods differ significantly in magnitude, with T/ET values ranging from 45% to 77%. Despite these differences in magnitude, the methods show plausible patterns with respect to LAI, seasonal cycles, WUE, and VPD; moreover, they represent an improvement over using ET as a proxy for T, even when filtering for days after rain. Finally, we outline practical aspects of the methods, such as how to apply them and where code is available.
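The common idea behind this family of methods is to relate T to GPP through water-use efficiency. One widely used formulation is the underlying WUE (uWUE) approach of Zhou et al. (2016), where uWUE = GPP·√VPD/ET and T/ET is estimated as the ratio of the apparent to the potential uWUE. The numpy sketch below shows only that core idea; the published methods use windowed and quantile regressions rather than the crude mean and upper quantile taken here, so this is an illustration, not any specific published implementation.

```python
import numpy as np

def t_over_et_uwue(gpp, vpd, et, potential_quantile=0.95):
    """Estimate T/ET as apparent uWUE / potential uWUE, where
    uWUE = GPP * sqrt(VPD) / ET and the potential value is taken as an
    upper quantile assumed to represent conditions where ET ~ T."""
    gpp, vpd, et = (np.asarray(a, dtype=float) for a in (gpp, vpd, et))
    ok = (et > 0) & (vpd > 0) & (gpp >= 0)   # physically valid half-hours
    uwue = gpp[ok] * np.sqrt(vpd[ok]) / et[ok]
    return np.mean(uwue) / np.quantile(uwue, potential_quantile)
```

When evaporation is negligible the apparent and potential uWUE coincide and the ratio approaches one; evaporation pulls the apparent uWUE, and hence the estimated T/ET, below one.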

How to cite: Nelson, J.: Water Flux Partitioning for Eddy Covariance Data, EGU General Assembly 2020, Online, 4–8 May 2020, EGU2020-18965, https://doi.org/10.5194/egusphere-egu2020-18965, 2020.

D566 |
Using Real-Time Eddy Covariance Data for Timely Validation of Operational Remote Sensing Products: I. Setting Workflow for QC and GPP Partitioning
Roel Van Hoolst, Radek Czerný, Jorge Torres Leon, Gerardo Fratini, Marian Pavelka, Ladislav Šigut, Else Swinnen, Carolien Tote, and George Burba