- 1State Key Laboratory of Internet of Things for Smart City and Department of Civil and Environmental Engineering, University of Macau, Macao SAR, People’s Republic of China (yc17415@um.edu.mo)
- 2State Key Laboratory of Internet of Things for Smart City and Department of Ocean Science and Technology, University of Macau, Macao SAR, People’s Republic of China (pingshen@um.edu.mo)
The growing prevalence of urban floods necessitates the development of cost-effective and scalable monitoring solutions. Traditional water-level sensors are often prohibitively expensive for widespread deployment. Moreover, existing image-based methods frequently suffer from limited generalizability, particularly the difficulty of harmonizing the selected reference features across large-scale quantitative measurements. To address this research gap, we present a novel method that uses traffic camera imagery as a lightweight solution for quantitatively monitoring urban flood inundation depths at high spatial and temporal resolution. Specifically, the waterline in flood images is recognized by a neural network and localized in world coordinates calibrated from common road markings, allowing accurate inundation depth measurement based solely on the imagery. This method eliminates the need for costly point cloud data collection or pre-calibrated measurement objects in urban settings. Additionally, it enables the simultaneous collection of waterlogging depths from multiple reference objects within the same image, yielding more robust measurements. This approach paves the way for cost-effective, high-resolution, and reliable quantitative monitoring of urban flood inundation depths, ultimately providing crucial data support for emergency responses and long-term flood mitigation strategies.
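The calibration step described above can be sketched as a planar homography: road markings of standardized dimensions (e.g., lane markings) provide image-to-road-plane point correspondences, from which a homography is estimated and used to localize the detected waterline in world coordinates. The following is a minimal illustrative sketch, not the authors' implementation; the point values, marking dimensions, and the direct-linear-transform (DLT) solver are assumptions for demonstration. Depth would then follow from the waterline's road-plane position together with local road geometry, which is outside this sketch.

```python
import numpy as np

def estimate_homography(img_pts, world_pts):
    """Estimate the 3x3 homography mapping image pixels to road-plane
    coordinates via the direct linear transform (>= 4 correspondences)."""
    A = []
    for (x, y), (X, Y) in zip(img_pts, world_pts):
        # Each correspondence contributes two linear constraints A h = 0.
        A.append([-x, -y, -1, 0, 0, 0, x * X, y * X, X])
        A.append([0, 0, 0, -x, -y, -1, x * Y, y * Y, Y])
    # The homography is the null-space vector of A (last right singular vector).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2,2] == 1

def to_world(H, pt):
    """Map an image pixel to road-plane coordinates (homogeneous divide)."""
    v = H @ np.array([pt[0], pt[1], 1.0])
    return v[:2] / v[2]

# Hypothetical example: the four corners of a 2.0 m x 0.15 m lane marking
# (standard dimensions assumed) detected at these pixel locations.
img_corners = [(100, 200), (400, 210), (410, 380), (95, 370)]
world_corners = [(0.0, 0.0), (2.0, 0.0), (2.0, 0.15), (0.0, 0.15)]
H = estimate_homography(img_corners, world_corners)

# A waterline pixel from the segmentation network can now be localized
# on the road plane in metric coordinates.
waterline_world = to_world(H, (250, 300))
```

With several markings visible in one frame, multiple homographies (or one jointly fitted homography) give redundant waterline localizations, consistent with the abstract's claim of more robust measurements from multiple reference objects.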
How to cite: Qin, J. and Ping, S.: A Scalable and Lightweight Urban Flood Monitoring Solution Utilizing Only Traffic Camera Images, EGU General Assembly 2025, Vienna, Austria, 27 Apr–2 May 2025, EGU25-10710, https://doi.org/10.5194/egusphere-egu25-10710, 2025.