Automated processing of webcam digital imagery for near-real-time identification of drought impacts on vegetation
- 1Mendel University in Brno, Faculty of AgriSciences, Brno, Czechia
- 2Global Change Research Institute CAS, Department of Climate Change Impacts on Agroecosystems, Brno, Czechia
The occurrence of severe drought episodes, and especially changes in their intensity and frequency, is considered one of the major impacts of a changing climate. The consequences manifest on many levels, from ecological to societal, and across all drought types (meteorological, hydrological, agricultural, and socioeconomic). In response, the need to build comprehensive systems for drought early warning, monitoring, prediction, and impact assessment is becoming more pressing. Although many such systems are already in place, relying on complex modeling, remote sensing data, or participative data collection, we often lack the possibility of fast, near-real-time, and straightforward mapping of drought manifestations with good spatiotemporal coverage. The methodology presented in our study aspires to fill this gap: we develop a method for early warning and identification of drought impacts on vegetation based on automated interpretation of imagery from an RGB camera network. Thanks to the cooperation with Windy.com as the main partner, we are able to access the largest global web camera network, with continuous time series of imagery at daily, weekly, and monthly timesteps. In the first step, we prepared a training database for automated image categorization to detect vegetation in each camera image. To this end, we manually classified over two thousand camera images, labeling the presence of vegetation and three vegetation subtypes: forest, grassland, and agricultural vegetation. Using this training dataset and machine learning algorithms, we can automatically categorize any other imagery in the database. After labeling the images, we created time series for cameras that focus on vegetation and operate during daylight hours. To lower the computational load, we restricted the analysis to the vegetated parts of the images.
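The categorization step could be sketched, for instance, with simple color-histogram features and an off-the-shelf classifier. The feature choice, class labels, and classifier below are illustrative assumptions for a minimal sketch, not the study's actual pipeline; synthetic arrays stand in for the manually labeled webcam images.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical labels mirroring the four categories in the training database.
CLASSES = ["no_vegetation", "forest", "grassland", "agricultural"]

def color_histogram_features(rgb_image, bins=8):
    """Flattened, normalized per-channel histogram as a simple image descriptor."""
    feats = []
    for ch in range(3):
        hist, _ = np.histogram(rgb_image[..., ch], bins=bins, range=(0, 255))
        feats.append(hist / hist.sum())  # normalize so image size does not matter
    return np.concatenate(feats)

# Synthetic stand-in for the manually labeled camera imagery.
rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(40, 32, 32, 3))
labels = rng.integers(0, len(CLASSES), size=40)

X = np.stack([color_histogram_features(img) for img in images])
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)

# Categorize a previously unseen image.
new_image = rng.integers(0, 256, size=(32, 32, 3))
predicted = CLASSES[clf.predict(color_histogram_features(new_image)[None, :])[0]]
```

In practice, a convolutional network trained on the two thousand labeled images would likely outperform histogram features, but the train-then-categorize structure is the same.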
Therefore, for each camera, we created a time series and defined masks of green vegetated areas. The mask definition is based on a transformation to the HSV color model, with statistically defined thresholds to detect green color. All image time series are subsequently masked, and the RGB images are normalized. Finally, the RGB greenness index, computed as (2*G)/(R+B), is derived for each masked vegetation area to assess spatiotemporal changes in greenness within the focus area of each camera. The ability to monitor continuous camera imagery at a weekly timestep, combined with the global scope of the webcam network, makes the proposed approach an operative tool for monitoring current vegetation conditions. Thanks to the fully automated processing of real-time images, we are able to cover underrepresented areas and include them in drought and drought impact monitoring networks without additional costs.
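The masking and index computation described above might be sketched as follows. The HSV thresholds, helper names, and the tiny synthetic image are illustrative assumptions; the study derives its green-color thresholds statistically per camera rather than using fixed values.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv

# Illustrative thresholds for "green" pixels (matplotlib hue is in [0, 1],
# so 60-180 degrees maps to the range below); these exact values are assumptions.
HUE_MIN, HUE_MAX = 60 / 360, 180 / 360
SAT_MIN, VAL_MIN = 0.15, 0.15

def green_mask(rgb):
    """Boolean mask of green vegetated pixels via HSV thresholding."""
    hsv = rgb_to_hsv(rgb.astype(float) / 255.0)  # normalize RGB to [0, 1] first
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    return (h >= HUE_MIN) & (h <= HUE_MAX) & (s >= SAT_MIN) & (v >= VAL_MIN)

def greenness_index(rgb, mask):
    """Mean RGB greenness index (2*G)/(R+B) over the masked vegetation area."""
    rgb = rgb.astype(float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    gi = (2.0 * g) / np.clip(r + b, 1e-6, None)  # guard against division by zero
    return float(gi[mask].mean()) if mask.any() else float("nan")

# Tiny synthetic frame: left half green vegetation, right half gray (masked out).
img = np.zeros((4, 8, 3), dtype=np.uint8)
img[:, :4] = (50, 200, 50)   # green pixels: hue 120 deg, GI = 400/100 = 4.0
img[:, 4:] = 120             # gray pixels: saturation 0, excluded by the mask

mask = green_mask(img)
gi = greenness_index(img, mask)  # mean greenness over the vegetated area
```

Applied frame by frame along a camera's image time series, the resulting `gi` values form the greenness trajectory used to assess spatiotemporal change.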
How to cite: Bláhová, M., Kudláčková, L., Fischer, M., Poděbradská, M., and Trnka, M.: Automated processing of webcam digital imagery for near-real-time identification of drought impacts on vegetation, EMS Annual Meeting 2023, Bratislava, Slovakia, 4–8 Sep 2023, EMS2023-265, https://doi.org/10.5194/ems2023-265, 2023.