EGU26-11017, updated on 16 Mar 2026
https://doi.org/10.5194/egusphere-egu26-11017
EGU General Assembly 2026
© Author(s) 2026. This work is distributed under
the Creative Commons Attribution 4.0 License.
Oral | Friday, 08 May, 09:15–09:25 (CEST)
Room N1
Leveraging Copernicus data, local expertise, and novel technology to create national forestry products
Albin Bjärhall1, Phillipp Fanta-Jende2, Lorenzo Beltrame2, Jules Salzinger2, Jasmin Lampert3, and Benjamin Schumacher1
  • 1Department of Forest Inventory – Remote Sensing Unit, Austrian Research Centre for Forests, Vienna, Austria (ORCID: 0000-0002-5572-9507)
  • 2Assistive & Autonomous Systems, AIT Austrian Institute of Technology GmbH
  • 3Data Science & Artificial Intelligence, AIT Austrian Institute of Technology GmbH (ORCID: 0000-0002-0414-4525)

The Austrian Research Centre for Forests (BFW) uses a range of remote sensing (RS) data—including aerial imagery, airborne laser scanning, and high‑resolution Copernicus Sentinel data—to support the National Forest Inventory and produce national forestry products. These include tree species maps, timber stock maps, and a Sentinel‑2‑based anomaly detection map for identifying forest disturbances across Austria. The success of these products is grounded in three factors: (1) strong collaboration with researchers to ensure that recent scientific advances are translated into operational applications; (2) the integration of high‑resolution aerial data with extensive local expertise, enabling high‑quality training datasets for machine‑learning approaches; and (3) the Austrian National Forest Inventory, which provides a robust ground‑truth basis for validating RS‑derived results.
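The operational anomaly detection algorithm is not detailed in this abstract; purely as an illustration of the underlying principle, the sketch below flags pixels whose vegetation index drops sharply below its historical baseline. The array names, the z-score formulation, and the threshold are assumptions for this example, not BFW's method.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index, computed per pixel."""
    return (nir - red) / (nir + red + 1e-9)

def anomaly_map(ndvi_history, ndvi_current, z_threshold=-2.0):
    """Flag pixels whose current NDVI falls well below the historical
    mean, a simple proxy for a forest disturbance signal.

    ndvi_history: (T, H, W) stack of cloud-free past observations
    ndvi_current: (H, W) most recent observation
    z_threshold:  assumed cut-off; not an operational parameter
    """
    mean = ndvi_history.mean(axis=0)
    std = ndvi_history.std(axis=0) + 1e-9
    z = (ndvi_current - mean) / std
    return z < z_threshold  # boolean disturbance mask
```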

At the same time, the demand for precise, timely, and spatially detailed national forestry products is steadily increasing. This demand is driven by growing monitoring and reporting requirements, as well as by the increasing impacts of land-use and climate change on forest ecosystems. These developments highlight existing limitations of RS-based forestry products, particularly in complex alpine terrain, where terrain shadows and cloud cover can delay or obscure the detection of natural disturbances such as windthrows. Within the SAFIR project, we investigate how AI-based tools can be used to address these challenges and enhance the performance of existing forest disturbance monitoring tools.
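To make the cloud-cover limitation concrete, the following sketch (with hypothetical dates and cloud fractions) computes the longest period without a usable Sentinel-2 observation over an area of interest; during such a gap, a windthrow necessarily goes undetected. The usability threshold and the sample series are illustrative assumptions.

```python
from datetime import date

def longest_observation_gap(acquisitions, max_cloud_fraction=0.3):
    """Longest run of days without a usable Sentinel-2 observation.

    acquisitions: list of (date, cloud_fraction) pairs, sorted by date.
    A scene counts as usable if its cloud fraction over the area of
    interest stays below max_cloud_fraction (assumed threshold).
    """
    usable = [d for d, cf in acquisitions if cf < max_cloud_fraction]
    gaps = [(b - a).days for a, b in zip(usable, usable[1:])]
    return max(gaps, default=0)

# Hypothetical alpine autumn series with persistent cloud cover:
series = [
    (date(2025, 9, 2), 0.1), (date(2025, 9, 7), 0.8),
    (date(2025, 9, 12), 0.9), (date(2025, 9, 17), 0.7),
    (date(2025, 9, 22), 0.2),
]
print(longest_observation_gap(series))  # -> 20 days undetected
```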

In this contribution, the SAFIR project combines BFW's anomaly detection map with ground-truth training data on windthrow events provided by the Österreichische Bundesforste (ÖBF). This fusion allows us to: (1) distinguish windthrow events from other disturbance types within the anomaly detection map; (2) assess the spatial and temporal agreement between modelled disturbances and inventoried windthrow events across Austria; and (3) quantify the detection rate of windthrow events in the existing product. Building on this assessment, identified windthrow sites can be targeted with machine-learning approaches for cloud and terrain-shadow removal, improving both the timeliness and accuracy of windthrow detection and yielding validated inputs for new datasets and AI models specialized in de-clouding and de-shadowing windthrow areas.
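The abstract does not specify how events are matched; as a minimal sketch of steps (2) and (3), the following assumes windthrow inventories and anomaly detections are available as GeoDataFrames and counts an event as detected if any anomaly polygon intersects it within an assumed 30-day window. The column names ('event_id', 'event_date', 'detected_date') and the time window are placeholders, not the project's actual schema.

```python
import geopandas as gpd

def windthrow_detection_rate(windthrows, anomalies, max_lag_days=30):
    """Share of inventoried windthrow events matched by at least one
    spatially overlapping anomaly detection within max_lag_days.

    windthrows: GeoDataFrame with 'event_id', 'event_date', geometry
    anomalies:  GeoDataFrame with 'detected_date', geometry
    (column names and the lag window are assumptions for this sketch)
    """
    # Spatial join: pair each windthrow polygon with intersecting anomalies
    joined = gpd.sjoin(windthrows, anomalies, predicate="intersects")
    # Temporal check: detection must follow the event within the window
    lag = (joined["detected_date"] - joined["event_date"]).dt.days
    matched = joined.loc[lag.between(0, max_lag_days), "event_id"].nunique()
    return matched / len(windthrows)
```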

By integrating established Copernicus-based forest disturbance products with an extensive, independently collected ground-truth dataset on windthrows, SAFIR enables a systematic evaluation of current windthrow detection capabilities and provides a pathway for targeted methodological improvement. Leveraging AI-based techniques to overcome known limitations, the project contributes to more robust national forestry products for monitoring windthrow damage.

How to cite: Bjärhall, A., Fanta-Jende, P., Beltrame, L., Salzinger, J., Lampert, J., and Schumacher, B.: Leveraging Copernicus data, local expertise, and novel technology to create national forestry products, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-11017, https://doi.org/10.5194/egusphere-egu26-11017, 2026.