EGU23-10901, updated on 26 Feb 2023
https://doi.org/10.5194/egusphere-egu23-10901
EGU General Assembly 2023
© Author(s) 2023. This work is distributed under
the Creative Commons Attribution 4.0 License.

From spectra to functional plant traits: Transferable multi-trait models from heterogeneous and sparse data

Eya Cherif (1,2), Hannes Feilhauer (1,2), Katja Berger (5,6), Michael Ewald (7), Tobias B. Hank (8), Kyle R. Kovach (9), Philip A. Townsend (9), Zhihui Wang (9), and Teja Kattenborn (1,3)
  • (1) Remote Sensing Centre for Earth System Research (RSC4Earth), Leipzig University, Germany
  • (2) Center for Scalable Data Analytics and Artificial Intelligence (ScaDS.AI), University of Leipzig - TU Dresden, Germany
  • (3) German Centre for Integrative Biodiversity Research (iDiv), Halle-Jena-Leipzig, Germany
  • (5) Image Processing Laboratory (IPL), Universitat de València, València, Spain
  • (6) Mantle Labs GmbH, Vienna, Austria
  • (7) Institute of Geography and Geoecology, Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany
  • (8) Department of Geography, Faculty of Geosciences, Ludwig-Maximilians-Universität München (LMU), Munich, Germany
  • (9) Department of Forest and Wildlife Ecology, University of Wisconsin-Madison, Madison, WI, USA

Our understanding of the Earth's functional biodiversity and its imprint on ecosystem functioning, structure and resilience is still incomplete. Large-scale information on vegetation properties (‘plant traits’) is critical to assess functional diversity and its role in the Earth system. These properties change constantly with environmental conditions, which makes extensive in-situ measurements in the field logistically infeasible. The advent of upcoming space-borne hyperspectral missions will facilitate mapping these properties. However, we still lack efficient and accurate methods to translate hyperspectral reflectance into large-scale information on plant traits across biomes, land cover and sensor types. In particular, the absence of globally representative data sets of reflectance and corresponding in-situ measurements remains a bottleneck for developing empirical models that estimate plant traits from hyperspectral reflectance. Recent and ongoing initiatives (e.g. EcoSIS) provide a constantly growing source of hyperspectral data and plant trait observations from different vegetation types and sensors. In this study, we integrated 29 data sets including four different ecosystem types spanning Europe and North America. By combining these heterogeneous data sets, we propose multi-trait models based on Convolutional Neural Networks (CNNs) that simultaneously infer multiple plant traits from canopy spectra. We targeted a broad set of structural and chemical traits (n=20) related to light harvesting, growth, propagation and defense (e.g. leaf mass per area, leaf area index, pigments, nitrogen, phosphorus). The performance of our multi-trait CNN models was compared to single-trait CNNs as well as single-trait partial least squares regression (PLSR) models. Across a broad range of vegetation types (crops, forest, tundra, grassland, shrubland) and sensor types, the multi-trait models performed promisingly and outcompeted state-of-the-art PLSR models. We found that the overall prediction performance increased significantly from single-trait to multi-trait CNN models and relative to PLSR models. The key contribution of this study is to highlight the potential of weakly supervised approaches combined with deep learning to overcome the scarcity of in-situ measurements and to take a step towards large-scale maps of the Earth's biophysical properties as hyperspectral Earth observation data become increasingly available.
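The sketch below illustrates the multi-trait idea in a minimal form: a multi-output 1D CNN maps a canopy reflectance spectrum to all traits at once, and a masked loss lets samples with incomplete trait labels still contribute to training, which is one way to realize the weakly supervised setting described above. All names, layer sizes, the number of spectral bands and the masking scheme are illustrative assumptions, not the authors' actual architecture or training procedure.

# Minimal sketch (assumptions only): multi-output 1D CNN for trait retrieval
# from canopy spectra with a masked loss for partially labelled samples.
import torch
import torch.nn as nn

N_BANDS = 500   # assumed number of reflectance bands after spectral resampling
N_TRAITS = 20   # structural and chemical traits targeted in the study

class MultiTraitCNN(nn.Module):
    def __init__(self, n_bands=N_BANDS, n_traits=N_TRAITS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(64, n_traits)  # one regression output per trait

    def forward(self, x):                 # x: (batch, n_bands) reflectance
        z = self.features(x.unsqueeze(1)).squeeze(-1)
        return self.head(z)               # (batch, n_traits) trait estimates

def masked_mse(pred, target, mask):
    # Only measured traits enter the loss; missing labels (mask == 0) are ignored.
    diff = (pred - target) ** 2 * mask
    return diff.sum() / mask.sum().clamp(min=1)

# Example forward/backward pass with random placeholder data
model = MultiTraitCNN()
spectra = torch.rand(8, N_BANDS)                 # batch of canopy spectra
traits = torch.rand(8, N_TRAITS)                 # in-situ trait values where available
mask = (torch.rand(8, N_TRAITS) > 0.5).float()   # 1 = trait measured for this sample
loss = masked_mse(model(spectra), traits, mask)
loss.backward()

Sharing one convolutional feature extractor across all trait outputs is the design choice that lets heterogeneous data sets, each labelled for only a subset of traits, jointly constrain the model.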

How to cite: Cherif, E., Feilhauer, H., Berger, K., Ewald, M., Hank, T. B., Kovach, K. R., Townsend, P. A., Wang, Z., and Kattenborn, T.: From spectra to functional plant traits: Transferable multi-trait models from heterogeneous and sparse data, EGU General Assembly 2023, Vienna, Austria, 24–28 Apr 2023, EGU23-10901, https://doi.org/10.5194/egusphere-egu23-10901, 2023.