10th International Conference on Geomorphology
© Author(s) 2022. This work is distributed under
the Creative Commons Attribution 4.0 License.

Total Carbon content assessed by UAS near-infrared imagery as a new fire severity metric

Lea Wittenberg1, Seham Hamzi1, Dar Roberts2, Charles Ichoku3, Nurit Shtober‐Zisu4, and Anna Brook1
  • 1University of Haifa, Geography and Environmental Studies, Israel (leaw@geo.haifa.ac.il)
  • 2Geography Department, University of California Santa Barbara, Santa Barbara, CA 93106, USA (dar@geog.ucsb.edu)
  • 3NASA Goddard Space Flight Center, Greenbelt, MD 20771, USA (charles.m.ichoku@nasa.gov)
  • 4Department of Israel Studies, University of Haifa, Haifa, Israel (nshtober@research.haifa.ac.il)

The ash produced by forest fires is a complex mixture of organic and inorganic particles with various properties. Ash and char are broad indicators for evaluating the impacts of fire on nutrient cycling and ecosystem recovery, and numerous studies have suggested assessing fire severity through changes in ash characteristics. Traditional methods for assessing fire severity are based on in situ observations, i.e. a visual approximation of changes in the forest floor and soil, which is time-consuming and subjective. These measures primarily reflect the degree of consumption of organic layers, the deposition of ash (particularly its depth and colour), and fire-induced changes in the soil. Many recent studies have proposed combining remote sensing with field observations, via machine learning and spectral-index approaches, to obtain practical tools for assessing fire effects on ecosystems. While index thresholding is easy to implement, its effectiveness over large areas is limited by the patchy coverage of forest types and fire regimes.
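As a generic illustration of the index-thresholding approach mentioned above (not the method of this study), burn severity is often classified by applying fixed breakpoints to the differenced Normalized Burn Ratio (dNBR). Note that NBR requires a SWIR band, which a low-cost VIS-NIR UAS sensor lacks; the reflectance values and Key & Benson-style breakpoints below are purely illustrative:

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio: (NIR - SWIR) / (NIR + SWIR)."""
    return (nir - swir) / (nir + swir)

# hypothetical pre- and post-fire reflectance rasters (2x2 pixels)
nir_pre = np.array([[0.50, 0.50], [0.50, 0.50]])
swir_pre = np.array([[0.20, 0.20], [0.20, 0.20]])
nir_post = np.array([[0.45, 0.30], [0.20, 0.10]])
swir_post = np.array([[0.20, 0.25], [0.30, 0.35]])

# dNBR = pre-fire NBR minus post-fire NBR; larger values = more severe burn
dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)

# fixed dNBR breakpoints separating unburned / low / mod-low / mod-high / high
thresholds = [0.10, 0.27, 0.44, 0.66]
severity = np.digitize(dnbr, thresholds)  # integer class per pixel, 0..4
print(severity)
```

The fixed-breakpoint step is exactly where the approach becomes fragile: breakpoints calibrated for one forest type or fire regime rarely transfer to another, which motivates the learning-based alternative described next.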

Machine learning algorithms, on the other hand, allow multivariate classification, but learning becomes complex and time-consuming when processing space-time series. Consequently, there is no full agreement on a quantitative index for determining the severity metric. Given that wildfires play a significant role in controlling forest carbon storage and cycling, this study presents the potential of low-cost multispectral imagery across the visible and near-infrared regions, collected by unmanned aerial systems (UAS), to determine fire severity according to the colour and chemical properties of vegetation ash. Multispectral imagery may reduce the imprecision caused by manual colour matching and produce extensive, accurate spatial-temporal severity maps. The proposed severity map is based on spectral information processed by deep learning algorithms to evaluate chemical changes in fuels; these methods quantify total carbon content and assess the corresponding fire intensity required to form a particular residue. Three learning algorithms (PLS-DA, ANN, and 1-D CNN) were designed for two datasets (RGB images with Munsell colour versus UAS-based multispectral imagery). The multispectral prediction results show excellent performance, indicating that deep-network-based near-infrared remote sensing has the potential to become an alternative and reliable fire-severity monitoring method.
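To make the 1-D CNN idea concrete, the following is a minimal sketch (not the authors' trained model) of a forward pass mapping a per-pixel reflectance spectrum to a scalar total-carbon estimate: a 1-D convolution over the spectral axis, ReLU activation, global average pooling, and a linear head. All weights, band counts, and reflectance values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels):
    """Valid-mode 1-D convolution: x (n_bands,), kernels (n_filters, k)."""
    n_filters, k = kernels.shape
    n_out = x.size - k + 1
    out = np.empty((n_filters, n_out))
    for f in range(n_filters):
        for i in range(n_out):
            out[f, i] = x[i:i + k] @ kernels[f]
    return out

def predict_tc(spectrum, kernels, w, b):
    """Forward pass: conv -> ReLU -> global average pool -> linear head."""
    feats = np.maximum(conv1d(spectrum, kernels), 0.0)  # ReLU
    pooled = feats.mean(axis=1)                         # global average pooling
    return float(pooled @ w + b)                        # scalar TC estimate

# toy 5-band VIS-NIR reflectance spectrum for one ash pixel (hypothetical)
spectrum = np.array([0.08, 0.10, 0.12, 0.35, 0.40])
kernels = rng.standard_normal((4, 3)) * 0.1  # 4 untrained spectral filters
w = rng.standard_normal(4) * 0.1             # linear head weights
b = 0.0
print(predict_tc(spectrum, kernels, w, b))
```

The appeal of the convolutional form is that each filter responds to local band-to-band shape (absorption features) rather than absolute reflectance, which is what makes the learned mapping from spectra to carbon content plausible across illumination conditions.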

How to cite: Wittenberg, L., Hamzi, S., Roberts, D., Ichoku, C., Shtober‐Zisu, N., and Brook, A.: Total Carbon content assessed by UAS near-infrared imagery as a new fire severity metric, 10th International Conference on Geomorphology, Coimbra, Portugal, 12–16 Sep 2022, ICG2022-259, https://doi.org/10.5194/icg2022-259, 2022.