To address the insufficient use of edge detail features in road extraction from remote sensing images, and the difficulty of accurately segmenting roads in regions occluded by complex backgrounds, a road extraction model based on edge guidance and multi-scale perception (Edge-guidance and Multi-scale perception U-Net, EMUNet) is proposed. Building on U-Net, the Canny edge detection result of the remote sensing image is added as an additional input, and an attention-based edge-guided fusion module is designed to guide the features of each encoder stage, making full use of edge information to improve the quality of the final road extraction. Second, to handle background occlusion in the images, a multi-scale parallel dilated convolution module is constructed to strengthen the network's multi-scale perception and capture more contextual information, so that regions occluded by the background can be extracted accurately. Experiments on the Massachusetts road dataset show that, compared with U-Net, EMUNet segments narrow roads and occluded regions more accurately, and its recall, F1 score, and intersection over union are all higher than those of the other compared algorithms, yielding more complete and accurate road information extraction.
Road extraction from remote sensing images based on edge guidance and multi-scale perception
To solve the problems of under-utilization of edge detail features in road extraction and the difficulty of accurately segmenting roads in regions occluded by complex backgrounds, this study proposes a remote sensing road extraction model based on edge guidance and multi-scale perception, the Edge-guidance and Multi-scale perception U-Net (EMUNet). Built on U-Net, the model takes the Canny edge detection result of the remote sensing image as an additional input, and an attention-based edge-guided fusion module guides the features of each encoder stage, making full use of edge information to improve the quality of the final road extraction. Second, to address background occlusion in the image, a multi-scale parallel dilated convolution module is constructed to enhance the multi-scale perception ability of the network, capturing more contextual information so that regions obscured by the background are extracted accurately. Experiments on the Massachusetts road dataset show that, compared with U-Net, EMUNet achieves more accurate segmentation of small roads and occluded regions, and its recall, F1 score, and intersection over union are better than those of the other compared algorithms, so it achieves more complete and accurate road information extraction.
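As a minimal illustration of the edge-guidance idea described above, the sketch below shows how a Canny edge map could be computed with OpenCV and fused into an encoder feature map through a simple sigmoid attention gate. The module name, the gating design, and the Canny thresholds are assumptions for illustration; this is not the authors' exact EMUNet formulation.

```python
# Illustrative sketch only: Canny edge input plus attention-gated fusion,
# under assumed module names and hyperparameters.
import cv2
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F

def canny_edge_tensor(image_bgr: np.ndarray) -> torch.Tensor:
    """Compute a Canny edge map and return it as a 1xHxW float tensor in [0, 1]."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)  # thresholds are assumptions
    return torch.from_numpy(edges).float().unsqueeze(0) / 255.0

class EdgeGuidedFusion(nn.Module):
    """Gate encoder features with an attention map derived from the edge branch."""
    def __init__(self, feat_ch: int, edge_ch: int = 1):
        super().__init__()
        self.attn = nn.Sequential(
            nn.Conv2d(feat_ch + edge_ch, feat_ch, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, feat: torch.Tensor, edge: torch.Tensor) -> torch.Tensor:
        # Resize the edge map to the current encoder resolution before fusion.
        edge = F.interpolate(edge, size=feat.shape[-2:], mode="bilinear",
                             align_corners=False)
        gate = self.attn(torch.cat([feat, edge], dim=1))
        return feat * gate + feat  # edge-weighted residual fusion
```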
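Similarly, a multi-scale parallel dilated convolution block of the kind described in the abstract can be sketched as parallel 3x3 convolutions with different dilation rates whose outputs are concatenated and fused by a 1x1 convolution. The channel counts and dilation rates (1, 2, 4, 8) below are assumptions, not values taken from the paper.

```python
# Illustrative sketch of a multi-scale parallel dilated convolution block.
import torch
import torch.nn as nn

class MultiScaleDilatedBlock(nn.Module):
    def __init__(self, in_ch: int, out_ch: int, rates=(1, 2, 4, 8)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_ch, out_ch, kernel_size=3,
                          padding=r, dilation=r, bias=False),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )
            for r in rates
        ])
        # Fuse the parallel multi-scale responses back to out_ch channels.
        self.fuse = nn.Conv2d(out_ch * len(rates), out_ch, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = [branch(x) for branch in self.branches]
        return self.fuse(torch.cat(feats, dim=1))

if __name__ == "__main__":
    x = torch.randn(1, 64, 128, 128)       # encoder feature map
    y = MultiScaleDilatedBlock(64, 64)(x)  # same spatial size, larger receptive field
    print(y.shape)                         # torch.Size([1, 64, 128, 128])
```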