EGU26-2076, updated on 13 Mar 2026
https://doi.org/10.5194/egusphere-egu26-2076
EGU General Assembly 2026
© Author(s) 2026. This work is distributed under
the Creative Commons Attribution 4.0 License.
Oral | Tuesday, 05 May, 08:35–08:45 (CEST)
 
Room -2.92
Surveillance Camera Sensor Networks: An Emerging Observing System for Near-Surface Precipitation Phase Monitoring
Xing Wang1,2, Kun Zhao1, and Zhengwei Yang1
  • 1Nanjing University, School of Atmospheric Sciences, Nanjing, China (wangxing@nju.edu.cn)
  • 2Nanjing Institute of Technology, School of Computer Engineering, Nanjing, China (jwangxing0719@163.com)

Monitoring the near-surface precipitation phase (N-SPP) at high spatiotemporal resolution is essential for weather forecasting, transportation, agriculture, and hydrology. However, operational capability remains constrained: manual observations are being phased out, in situ surface instruments are sparse and often unrepresentative, and aloft phase estimates from radar and satellite frequently disagree with actual conditions at the ground. Recent studies have also reported an “upper-limit” effect in radar-based N-SPP products, further underscoring the need for low-cost, automated, and scalable alternatives.

Urban surveillance cameras are ubiquitous in modern cities and continuously capture near-surface scenes at high temporal resolution. As precipitation particles traverse the camera field of view, the resulting videos inherently preserve rich visual–temporal signatures of hydrometeors, providing a viable basis for visually discriminating precipitation phases. Moreover, leveraging existing camera infrastructures requires no dedicated sensor deployment, making camera sensor networks a cost-effective solution for high-resolution N-SPP monitoring. Nevertheless, reliably distinguishing common N-SPP types (i.e., rain, snow, and graupel) in unconstrained videos remains challenging due to subtle inter-class differences and strong sensitivities to illumination changes, complex backgrounds, camera settings, and wind-driven trajectories.

Our previous study (Wang et al., 2025) demonstrated the feasibility of using urban surveillance cameras for N-SPP monitoring. Guided by meteorological, optical, and imaging principles, we identified discriminative cues in both daytime and nighttime videos and developed an efficient framework that couples MobileNetV2-based transfer learning for spatial feature extraction with GRU-based temporal modeling. Using a self-curated 94-hour surveillance video dataset and benchmarking against 24 baseline methods, the proposed approach achieved the best overall performance, with accuracies of 0.9677 on this dataset and 0.9301 in real-world field validation against manually quality-controlled two-dimensional video disdrometer (2DVD) measurements. The model also remained stable under variations in camera settings and across day–night conditions, and exhibited satisfactory wind robustness for wind speeds below 5 m/s.

Building on these results, we move toward operational, city-scale deployment by examining key challenges in multi-camera collaborative observing: camera siting and field-of-view geometric constraints; automated camera screening and tiered selection; and quality-control and anomaly-detection procedures that address occlusion, glare, raindrop adhesion, wiper interference, stream frame loss, and camera-parameter drift. These discussions provide practical guidance for constructing a stable and reliable surveillance-camera-based N-SPP monitoring network.
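Two of the quality-control checks mentioned above — stream frame loss and occlusion/glare — can be illustrated with minimal heuristics (the function names and thresholds here are hypothetical, invented purely for illustration, not the abstract's actual procedures):

```python
from statistics import pstdev


def frame_gap_anomalies(timestamps, expected_dt=0.04, tol=1.5):
    """Flag stream frame loss: indices where the gap to the previous
    frame exceeds tol * expected interval (e.g. 25 fps -> 0.04 s)."""
    return [i for i in range(1, len(timestamps))
            if timestamps[i] - timestamps[i - 1] > tol * expected_dt]


def occlusion_suspect(gray_pixels, min_std=5.0):
    """A near-constant frame (very low pixel spread) suggests lens
    occlusion, total glare saturation, or a frozen stream."""
    return pstdev(gray_pixels) < min_std


# A dropped frame between t=0.08 and t=0.20 is flagged at index 3.
print(frame_gap_anomalies([0.0, 0.04, 0.08, 0.20]))  # [3]
```

In a deployed network, such per-camera checks would feed the automated screening and tiered-selection stage, so that only cameras passing basic health tests contribute to the N-SPP product.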


Reference:

Wang, X., Zhao, K., Huang, H., Zhou, A., & Chen, H. (2025). Surveillance camera-based deep learning framework for high-resolution ground hydrometeor phase observation. Atmospheric Measurement Techniques Discussions, 2025, 1-38.

How to cite: Wang, X., Zhao, K., and Yang, Z.: Surveillance Camera Sensor Networks: An Emerging Observing System for Near-Surface Precipitation Phase Monitoring, EGU General Assembly 2026, Vienna, Austria, 3–8 May 2026, EGU26-2076, https://doi.org/10.5194/egusphere-egu26-2076, 2026.