EGU26-20434, updated on 14 Mar 2026
https://doi.org/10.5194/egusphere-egu26-20434
EGU General Assembly 2026
© Author(s) 2026. This work is distributed under
the Creative Commons Attribution 4.0 License.
Oral | Friday, 08 May, 15:25–15:35 (CEST), Room N1
3Dtrees.earth - A platform for accessing, analyzing, and visualizing LiDAR data in forest environments
Kilian Gerberding1, Janusch Vajna-Jehle1, Teja Kattenborn1, Christian Scharinger1, Daniel Lusk1, Benjamin Brede2, Maximilian Sperlich1, Thomas Seifert1, Björn Grüning3, Stefano Puliti4, and Julian Frey1
  • 1Faculty of Environment and Natural Resources, University of Freiburg, Freiburg im Breisgau, Germany
  • 2Institute for Earth System Science and Remote Sensing, University of Leipzig, Leipzig, Germany
  • 3Institute of Computer Science, University of Freiburg, Freiburg im Breisgau, Germany
  • 4Norwegian Institute of Bioeconomy Research (NIBIO), Norway

Forests are undergoing rapid structural changes driven by droughts, storms, pests, and long-term climatic stress. Quantifying these dynamics requires detailed three-dimensional information on forest structure, biomass, and diversity. Drone-based, mobile, and terrestrial LiDAR have become essential for acquiring such data, yet their broader use remains constrained by fragmented processing workflows, heterogeneous data formats, limited cross-platform integration, and restricted access to scalable computing resources.

3Dtrees.earth is an open, cloud-based platform designed to overcome these barriers through integrated, scalable, and reproducible extraction of forest information from multi-platform LiDAR data. The platform supports standardized processing of LiDAR point clouds from terrestrial (TLS), uncrewed aerial (ULS), and mobile laser scanning (MLS), applying modular pipelines for data harmonization, instance and semantic segmentation, species prediction, and structural trait extraction. Building on recent advances in 3D deep learning, 3Dtrees.earth integrates state-of-the-art models for single-tree detection, species classification, and tree-level inventory generation.
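As an illustrative sketch only (not the platform's actual tooling), one of the simplest structural products mentioned here, a canopy height model, can be derived by gridding a height-normalized point cloud and taking the per-cell maximum height. The function and the synthetic plot below are hypothetical:

```python
import numpy as np

def canopy_height_model(points, cell_size=1.0):
    """Grid x/y/z points into cells and keep the maximum z per cell.

    points: (N, 3) array of x, y, z with z already normalized to
    height above ground (real pipelines would first classify and
    remove ground points).
    """
    xy = points[:, :2]
    origin = xy.min(axis=0)
    idx = np.floor((xy - origin) / cell_size).astype(int)
    nx, ny = idx.max(axis=0) + 1
    chm = np.zeros((nx, ny))
    # np.maximum.at accumulates the per-cell maximum height in place
    np.maximum.at(chm, (idx[:, 0], idx[:, 1]), points[:, 2])
    return chm

# Synthetic 10 m x 10 m plot: low vegetation plus one 18 m "tree" at (5, 5)
rng = np.random.default_rng(0)
ground_xy = rng.uniform(0, 10, size=(500, 2))
pts = np.column_stack([ground_xy, rng.uniform(0, 0.3, 500)])
pts = np.vstack([pts, [[5.0, 5.0, 18.0]]])
chm = canopy_height_model(pts, cell_size=1.0)
print(chm.max())  # 18.0 — height of the tallest cell
```

The same gridded-maximum idea underlies tree height maps; instance segmentation and trait extraction in the actual pipelines operate on far richer per-tree geometry.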

All processing workflows are containerized and deployed via Galaxy Europe, enabling users to analyze large LiDAR datasets without installing local software or maintaining dedicated computing resources. A core design principle is accessibility combined with transparency: users interact through web-based workflows and shared histories that fully document tool versions, parameters, and data provenance, ensuring reproducibility across regions, sensors, and applications. Derived products - including canopy height models, tree-level inventories, biomass estimates, and structural diversity indices - are curated according to FAIR principles with persistent storage, rich metadata, and standardized access interfaces.
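To make the provenance idea concrete, a record documenting tool version, parameters, and inputs for a derived product might look like the following. All field names, tool names, and file names here are hypothetical illustrations, not the platform's actual metadata schema:

```python
import json

# Hypothetical provenance record for a derived product; the schema,
# tool name, and input file are illustrative only.
provenance = {
    "product": "canopy_height_model",
    "tool": {"name": "chm-generator", "version": "1.4.2"},
    "parameters": {"cell_size_m": 1.0, "ground_filter": "cloth_simulation"},
    "inputs": ["uls_plot_042.laz"],
    "license": "CC-BY-4.0",
}
record = json.dumps(provenance, indent=2, sort_keys=True)
print(record)
```

Capturing such a record alongside every output is what allows a workflow to be rerun, audited, or compared across regions and sensors.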

Co-developed with forest managers, researchers, public agencies, and AI developers, 3Dtrees.earth serves multiple communities. Practitioners gain access to operational products such as tree density and height maps, gap and deadwood indicators, and structural diversity metrics that can be directly integrated into management planning. Scientists benefit from harmonized benchmark datasets and reproducible workflows that facilitate method comparison across regions, forest types, and sensor platforms. AI developers are provided with large-scale, well-labeled 3D forest datasets for training and evaluating generalizable forest analytics models.

By lowering technical barriers and standardizing 3D forest analytics, 3Dtrees.earth aims to accelerate the integration of LiDAR-derived structural information into forest research, monitoring, and management at a global scale.

How to cite: Gerberding, K., Vajna-Jehle, J., Kattenborn, T., Scharinger, C., Lusk, D., Brede, B., Sperlich, M., Seifert, T., Grüning, B., Puliti, S., and Frey, J.: 3Dtrees.earth - A platform for accessing, analyzing, and visualizing LiDAR data in forest environments, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-20434, https://doi.org/10.5194/egusphere-egu26-20434, 2026.