Smart City Gnosys

Smart city article details

Title Flood Depth Estimation From Web Images
ID_Doc 26675
Authors Meng Z.; Peng B.; Huang Q.
Year 2019
Published Proceedings of the 2nd ACM SIGSPATIAL International Workshop on Advances in Resilient and Intelligent Cities, ARIC 2019
DOI http://dx.doi.org/10.1145/3356395.3365542
Abstract Natural hazards have been causing severe damage to our cities, and flooding is among the most disastrous in the U.S. and worldwide. It is therefore critical to develop efficient methods for risk and damage assessment after natural hazards, such as flood depth estimation. Existing works primarily leverage photos and images capturing flood scenes to estimate flood depth using traditional computer vision and machine learning techniques. However, the advancement of deep learning (DL) methods makes it possible to estimate flood depth more accurately. Therefore, based on a state-of-the-art DL technique (Mask R-CNN) and publicly available images from the Internet, this study aims to investigate and improve flood depth estimation. Specifically, human objects are detected and segmented in flooded images to infer the floodwater depth. This study provides a new framework to extract critical information from large amounts of accessible online data so that rescue teams, or even robots, can carry out appropriate plans for disaster relief and rescue missions in urban areas, shedding light on real-time detection of flood depth. © 2019 Copyright is held by the owner/author(s). Publication rights licensed to ACM.
Author Keywords Detection; Flood depth; Mask R-CNN; Resilient
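
The core idea described in the abstract — segmenting a human figure and inferring water depth from how much of the body is submerged — can be illustrated with a minimal geometric sketch. This is not the paper's implementation: it assumes the person's full height in pixels is already known (e.g., from a pose or height prior) and that an instance-segmentation model such as Mask R-CNN has supplied the visible mask height above the waterline; the average-height constant and function names are hypothetical.

```python
# Illustrative sketch only: infer flood depth from a partially submerged person.
# Assumes a segmentation model (e.g., Mask R-CNN) has already produced the
# visible mask height; the 1.7 m average-height prior is an assumption.
AVG_HUMAN_HEIGHT_M = 1.7  # assumed average standing height of an adult


def estimate_flood_depth(full_height_px, visible_height_px,
                         avg_height_m=AVG_HUMAN_HEIGHT_M):
    """Estimate water depth in meters.

    full_height_px    -- the person's expected full height in image pixels
    visible_height_px -- height of the segmented mask above the waterline
    """
    if full_height_px <= 0:
        raise ValueError("full height must be positive")
    if visible_height_px < 0:
        raise ValueError("visible height cannot be negative")
    # Pixels hidden below the waterline.
    submerged_px = max(full_height_px - visible_height_px, 0)
    # Scale factor: meters per pixel, calibrated from the person's height.
    m_per_px = avg_height_m / full_height_px
    return submerged_px * m_per_px
```

For example, a person expected to span 340 px whose visible mask is 240 px tall would give an estimated depth of (340 − 240) × (1.7 / 340) = 0.5 m. In practice the full-height estimate, camera perspective, and posture (standing vs. wading) all add uncertainty that the actual framework would need to handle.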


Similar Articles


Id | Similarity | Authors | Title | Published
53553 | 0.879 | Notarangelo N.M.; Wirion C.; van Winsen F. | Sturm-Flooddepth: A Deep Learning Pipeline For Mapping Urban Flood Depth Using Street-Level And Oblique Aerial Imagery | Geomatica, 77, 2 (2025)
922 | 0.867 | Liu B.; Li Y.; Ma M.; Mao B. | A Comprehensive Review Of Machine Learning Approaches For Flood Depth Estimation | International Journal of Disaster Risk Science, 16, 3 (2025)