EGU24-5766, updated on 08 Mar 2024
https://doi.org/10.5194/egusphere-egu24-5766
EGU General Assembly 2024
© Author(s) 2024. This work is distributed under
the Creative Commons Attribution 4.0 License.

Investigating Deep Learning Techniques to Estimate Fractional Vegetation Cover in Australian Semi-arid Ecosystems Combining Drone-based RGB Imagery, Multispectral Imagery, and LiDAR Data

Laura Sotomayor1, Teja Kattenborn2, Florent Poux3, Darren Turner1, and Arko Lucieer1
  • 1University of Tasmania
  • 2University of Freiburg
  • 3University of Liège

Semi-arid terrestrial ecosystems exhibit sparse vegetation, characterised by herbaceous, non-woody plants (e.g., forbs and grasses) and woody plants (e.g., trees and shrubs). These ecosystems face challenges from global climate change, including shifting rainfall patterns, temperature variations, and elevated atmospheric carbon dioxide (CO2) levels. Effective monitoring is essential for informed decision-making and sustainable natural resource management in the context of rapid environmental change.

Fractional Vegetation Cover (FVC) is a key biophysical parameter for monitoring ecosystems, serving as an indicator of their balance and resilience. Assessing FVC is important for evaluating vegetation biomass and carbon stocks, both pivotal components of ecosystem health. Precise mapping of FVC across scales involves three key cover types: photosynthetic vegetation (PV), the ground covered by green leaves facilitating photosynthesis; non-photosynthetic vegetation (NPV), encompassing branches, woody stems, standing litter, and dry leaves with reduced or no chlorophyll content; and bare earth (BE), the uncovered ground surface without vegetation. FVC quantifies the relative contribution of each cover type to the total ground surface, aiding the characterisation of vegetation composition.
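As a concrete illustration of this definition, the minimal sketch below computes per-class cover fractions from a hypothetical per-pixel cover classification. The class labels, no-data convention, and the toy scene are illustrative assumptions, not part of the study:

```python
import numpy as np

# Hypothetical class labels for a per-pixel cover classification
# (illustrative, not from the abstract): 0 = photosynthetic vegetation
# (PV), 1 = non-photosynthetic vegetation (NPV), 2 = bare earth (BE).
CLASS_NAMES = {0: "PV", 1: "NPV", 2: "BE"}

def fractional_cover(class_map: np.ndarray) -> dict:
    """Fraction of the ground surface occupied by each cover type."""
    valid = class_map >= 0  # assumption: negative values flag no-data pixels
    total = valid.sum()
    return {name: float(((class_map == label) & valid).sum() / total)
            for label, name in CLASS_NAMES.items()}

# Example: a tiny 4x4 classified scene with one no-data pixel.
scene = np.array([[0, 0, 1, 2],
                  [0, 1, 1, 2],
                  [0, 0, 2, 2],
                  [1, 2, 2, -1]])
print(fractional_cover(scene))
# {'PV': 0.333..., 'NPV': 0.266..., 'BE': 0.4}  (fractions sum to 1)
```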

Efficient and accurate remote sensing techniques are essential complements to conventional field-based FVC measurements. Drone remote sensing technologies capture fine-scale spatial variability in vegetation, enabling the derivation of ecological (e.g., FVC), biophysical (e.g., aboveground biomass), and biochemical (e.g., leaf chlorophyll content) variables. Local calibration and validation of drone products support upscaling to the coarser spatial scales of satellite observations, improving the understanding of vegetation dynamics at the national scale for subsequent change detection analyses.
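To make the upscaling step concrete, the following sketch block-averages a fine-resolution FVC map onto a coarser, satellite-like grid. The specific resolutions (5 cm drone pixels, 10 m satellite pixels) and the even-divisibility assumption are illustrative only:

```python
import numpy as np

def upscale_fvc(fvc_fine: np.ndarray, factor: int) -> np.ndarray:
    """Block-average a fine-resolution FVC map (values in [0, 1]) onto a
    coarser grid; e.g. factor=200 maps 5 cm drone pixels onto a 10 m
    satellite-like grid. Assumes array dimensions are multiples of factor."""
    h, w = fvc_fine.shape
    blocks = fvc_fine.reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))  # mean cover within each coarse cell

# Example: 400x400 synthetic drone-scale PV-fraction map -> 2x2 coarse grid.
rng = np.random.default_rng(0)
fine = rng.random((400, 400))
coarse = upscale_fvc(fine, 200)
print(coarse.shape)  # (2, 2); each cell is the mean cover of a 200x200 block
```

Because FVC is itself a fraction, simple block averaging preserves its meaning at the coarser scale, which is what makes drone products usable for calibrating and validating satellite retrievals.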

This research project applies deep learning methods in remote sensing to enhance understanding of ecosystem composition, structure, and function, with a specific focus on diverse terrestrial ecosystems in Australia. Leveraging drone technologies and advanced deep learning algorithms, the project develops automated workflows for systematic ecosystem health assessments, thereby contributing to the validation of satellite observations. The research framework emphasises the potential of deep learning methods for generating FVC products from drone-based RGB and multispectral imagery. The conclusion highlights the benefits of integrating LiDAR data with deep learning approaches for analysing denser vegetation structure, offering a holistic view of ecosystem health and dynamics. This approach provides valuable insights for environmental monitoring and management.
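The abstract does not specify a network architecture, so the sketch below shows only the general shape of such a workflow: a minimal fully convolutional model (standing in for any semantic-segmentation network, e.g., a U-Net) that assigns each drone pixel to PV, NPV, or BE. The band count, layer sizes, and tile size are assumptions for illustration:

```python
import torch
import torch.nn as nn

class CoverSegNet(nn.Module):
    """Minimal encoder-decoder for per-pixel cover classification
    (PV / NPV / BE) from stacked drone bands. Illustrative only."""
    def __init__(self, in_bands: int = 5, n_classes: int = 3):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_bands, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(64, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, n_classes, 1),  # per-pixel class logits
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

# Example: one 256x256 tile with 5 hypothetical bands (RGB + red edge + NIR).
model = CoverSegNet()
tile = torch.rand(1, 5, 256, 256)
logits = model(tile)              # (1, 3, 256, 256)
cover_map = logits.argmax(dim=1)  # per-pixel PV / NPV / BE labels
print(cover_map.shape)            # torch.Size([1, 256, 256])
```

A per-pixel cover map of this kind could then feed the fractional-cover computation sketched earlier, yielding FVC estimates at drone scale for subsequent upscaling to satellite grids.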

How to cite: Sotomayor, L., Kattenborn, T., Poux, F., Turner, D., and Lucieer, A.: Investigating Deep Learning Techniques to Estimate Fractional Vegetation Cover in Australian Semi-arid Ecosystems Combining Drone-based RGB Imagery, Multispectral Imagery, and LiDAR Data, EGU General Assembly 2024, Vienna, Austria, 14–19 Apr 2024, EGU24-5766, https://doi.org/10.5194/egusphere-egu24-5766, 2024.