Smart City Gnosys

Smart city article details

Title Edge-Guided Hierarchical Network For Building Change Detection In Remote Sensing Images
ID_Doc 21845
Authors Yang M.; Zhou Y.; Feng Y.; Huo S.
Year 2024
Published Applied Sciences (Switzerland), 14, 13
DOI http://dx.doi.org/10.3390/app14135415
Abstract Building change detection monitors building changes by comparing and analyzing multi-temporal images acquired over the same area, and it plays an important role in land resource planning, smart city construction and natural disaster assessment. Unlike change detection in conventional scenes, the buildings in a building change detection task are usually densely distributed and therefore prone to occlusion; at the same time, detection is easily disturbed by shadows cast by lighting and by similarly colored features around the buildings, which makes the edges of the changed regions difficult to distinguish. To address these problems, this paper uses edge information to guide the neural network to learn edge features related to changes and to suppress edge features unrelated to changes, so as to accurately extract building change information. First, an edge-extracted module is designed, which combines deep and shallow features to compensate for the missing feature information at different resolutions and to extract the edge structure of the changed features; second, an edge-guided module is designed to fuse the edge features with features at different levels and to guide the neural network to focus on confusable building edge regions by increasing the edge weights, thereby improving the network's ability to detect changed edges. The proposed building change detection algorithm has been validated on two publicly available datasets (the WHU and LEVIR-CD building change detection datasets). The experimental results show that the proposed model achieves F1 scores of 91.14% and 89.76%, respectively, demonstrating superior performance compared with several recent learning-based change detection methods. © 2024 by the authors.
Author Keywords change detection; deep learning; edge features; feature fusion; optical remote sensing images
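
The edge-guided module described in the abstract, which fuses edge features with backbone features at different levels and raises the weight of edge regions, can be illustrated with a short sketch. The PyTorch code below is a minimal, hypothetical reading of that idea, not the authors' published implementation; the class name EdgeGuidedFusion and all channel parameters are assumptions. It projects edge features from a shallower (higher-resolution) layer onto a backbone feature map, fuses the two by concatenation, and re-weights the fused result with a learned edge attention map.

import torch
import torch.nn as nn
import torch.nn.functional as F

class EdgeGuidedFusion(nn.Module):
    # Hypothetical sketch of edge-guided feature fusion: fuse an edge
    # feature map with a backbone feature map, then boost edge regions.
    def __init__(self, feat_channels: int, edge_channels: int):
        super().__init__()
        # Project edge features to the backbone channel width.
        self.edge_proj = nn.Conv2d(edge_channels, feat_channels, kernel_size=1)
        # Single-channel edge weight map in [0, 1].
        self.edge_gate = nn.Sequential(
            nn.Conv2d(feat_channels, 1, kernel_size=3, padding=1),
            nn.Sigmoid(),
        )
        self.fuse = nn.Sequential(
            nn.Conv2d(2 * feat_channels, feat_channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(feat_channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, feat: torch.Tensor, edge_feat: torch.Tensor) -> torch.Tensor:
        # Match spatial resolution: edge features may come from a shallower layer.
        edge_feat = F.interpolate(
            self.edge_proj(edge_feat), size=feat.shape[-2:],
            mode="bilinear", align_corners=False,
        )
        fused = self.fuse(torch.cat([feat, edge_feat], dim=1))
        # "Increasing the edge weights": residual boost at likely edge pixels.
        weight = self.edge_gate(edge_feat)
        return fused * (1.0 + weight)

For example, EdgeGuidedFusion(64, 32)(torch.randn(1, 64, 32, 32), torch.randn(1, 32, 64, 64)) resamples the 64x64 edge features to 32x32, fuses them with the backbone features, and returns a (1, 64, 32, 32) tensor in which edge regions carry larger activations.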


Similar Articles


Id | Similarity | Authors | Title | Published
21839 | 0.925 | Holail S.; Saleh T.; Xiao X.; Zahran M.; Xia G.-S.; Li D. | Edge-Cvt: Edge-Informed Cnn And Vision Transformer For Building Change Detection In Satellite Imagery | ISPRS Journal of Photogrammetry and Remote Sensing, 227 (2025)
13125 | 0.902 | Yuan Q.; Wang N. | Buildings Change Detection Using High-Resolution Remote Sensing Images With Self-Attention Knowledge Distillation And Multiscale Change-Aware Module | International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives, 46, M-2-2022 (2022)
48330 | 0.883 | Sun C.; Chen H.; Du C.; Jing N. | Semibuildingchange: A Semi-Supervised High-Resolution Remote Sensing Image Building Change Detection Method With A Pseudo Bitemporal Data Generator | IEEE Transactions on Geoscience and Remote Sensing, 61 (2023)
11750 | 0.870 | Lin H.; Hao M.; Luo W.; Yu H.; Zheng N. | Bearnet: A Novel Buildings Edge-Aware Refined Network For Building Extraction From High-Resolution Remote Sensing Images | IEEE Geoscience and Remote Sensing Letters, 20 (2023)
4137 | 0.866 | Yang M.; Zhao L.; Ye L.; Jiang H.; Yang Z. | A Review Of Convolutional Neural Networks Related Methods For Building Extraction From Remote Sensing Images | Journal of Geo-Information Science, 26, 6 (2024)
2916 | 0.852 | Yin J.; Wu F.; Qiu Y.; Li A.; Liu C.; Gong X. | A Multiscale And Multitask Deep Learning Framework For Automatic Building Extraction | Remote Sensing, 14, 19 (2022)
22451 | 0.850 | Yang M.; Zhao L.; Ye L.; Jia W.; Jiang H.; Yang Z. | Egafnet: An Edge Guidance And Scale-Aware Adaptive Fusion Network For Building Extraction From Remote Sensing Images | IEEE Transactions on Geoscience and Remote Sensing, 63 (2025)
13075 | 0.850 | Han Z.; Li X.; Wang X.; Wu Z.; Liu J. | Building Segmentation In Urban And Rural Areas With Mfa-Net: A Multidimensional Feature Adjustment Approach | Sensors, 25, 8 (2025)