
Research on Urban Road Extraction Algorithm of High-Resolution Remote Sensing Images

In high-resolution remote sensing images, roads are often occluded by buildings and trees and easily confused with background features, so extracted roads tend to be interrupted and fragmented. To address these problems, a connectivity-enhanced road extraction network is proposed: strip convolution is used to mine continuous road features in different directions, and a connectivity attention branch based on graph analysis is introduced to mine continuity information between adjacent road pixels. To apply the algorithm, the original images and road labels are cropped and preprocessed programmatically, connectivity cubes are parsed from the graph structure, and the model and its loss function are built in PyTorch to complete training and prediction. To evaluate accuracy, road extraction experiments are conducted on the Massachusetts high-resolution dataset (USA) and a high-resolution dataset of representative Chinese cities. Compared with the other models, the proposed model improves recall by 7.84% and achieves the highest intersection-over-union (IoU) and F1 scores, realizing effective road extraction.
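The strip convolution mentioned in the abstract slides a 1×k (horizontal) or k×1 (vertical) kernel so that responses accumulate along a single direction, which helps keep elongated road features continuous across small occlusion gaps. The paper's actual network is not reproduced here; the following is a minimal NumPy sketch of the idea, with an averaging kernel and a toy road map chosen purely for illustration:

```python
import numpy as np

def strip_conv(feat, k, axis):
    """Strip convolution sketch: convolve a length-k averaging kernel along
    one axis only (1 x k when axis=1, k x 1 when axis=0), so each response
    aggregates evidence along a single direction."""
    kernel = np.ones(k) / k  # simple averaging strip kernel (illustrative)
    return np.apply_along_axis(
        lambda v: np.convolve(v, kernel, mode="same"), axis, feat)

# Toy 5x5 feature map: a horizontal road with a one-pixel occlusion gap.
road = np.zeros((5, 5))
road[2] = [1, 1, 0, 1, 1]  # gap at column 2

h = strip_conv(road, 3, axis=1)  # horizontal strip response
# The gap pixel now receives support from its left and right neighbours
# (h[2, 2] = (1 + 0 + 1) / 3), so the road row stays connected in the
# response map even though the input had a break.
```

A standard square kernel would spread the same support in all directions, blurring the road into the background; restricting the kernel to one direction is what makes the directional continuity assumption explicit.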
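The abstract reports recall, intersection-over-union (IoU), and F1 as its evaluation metrics. For pixel-wise binary road masks these reduce to counts of true positives, false positives, and false negatives; the sketch below (function name and toy masks are illustrative, not from the paper) shows the standard definitions:

```python
def binary_metrics(pred, gt):
    """Pixel-wise recall, IoU and F1 for flattened binary masks
    (1 = road, 0 = background)."""
    tp = sum(p and g for p, g in zip(pred, gt))          # road hit
    fp = sum(p and not g for p, g in zip(pred, gt))      # false alarm
    fn = sum(g and not p for p, g in zip(pred, gt))      # missed road
    recall = tp / (tp + fn) if tp + fn else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    iou = tp / (tp + fp + fn) if tp + fp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return recall, iou, f1

# Toy flattened masks: 3 true road pixels; prediction hits 2, adds 1 false.
pred = [1, 1, 0, 1, 0]
gt   = [1, 1, 1, 0, 0]
r, iou, f1 = binary_metrics(pred, gt)  # r = 2/3, iou = 1/2, f1 = 2/3
```

IoU penalizes both missed road pixels and false alarms, which is why fragmented extractions (many small gaps) score poorly on it even when recall is moderate.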

remote sensing image; high-resolution remote sensing images; road extraction; deep learning; semantic segmentation

Yang Shaowen, Yang Zhibo


China Railway Eryuan Engineering Group Co., Ltd., Chengdu 610031

Changsha Planning & Design Survey Research Institute, Changsha 410007


2024

Railway Investigation and Surveying
China Railway Engineering Design and Consulting Group Co., Ltd.


Impact factor: 0.542
ISSN: 1672-7479
Year, volume (issue): 2024, 50(3)