
Recognition of Typical Objects in Chemical Industry Parks Using BASS-Net Based on High-resolution Remote Sensing Images

Extraction of typical objects in chemical industry parks has generally relied on traditional remote sensing image processing methods, which struggle to achieve fine-grained object recognition and thus hinder environmental monitoring and management of the parks. This study explores the feasibility of deep learning for high-accuracy recognition of typical objects in chemical industry parks. To address the high spatial heterogeneity of object distribution in such parks, a TensorFlow deep learning framework was set up, a dataset of 18 typical object classes was built from high-resolution remote sensing images, and the convolutional neural network BASS-Net was trained as a recognition model for the parks' objects; the recognition results were compared with those of Random Forest (RF) and Support Vector Machine (SVM). The results show that the BASS-Net model achieved an overall accuracy, recall, and F1 score of 97.17%, 97.76%, and 97.46%, respectively, for typical objects in the parks — more than 20 percentage points higher than RF and more than 50 percentage points higher than SVM, a clear advantage. The BASS-Net convolutional neural network can therefore automatically and accurately recognize typical chemical-industry objects, clearly outperforms traditional machine learning methods, and can support environmental monitoring and management of chemical industry parks.
Recognition of Typical Objects in Chemical Industry Parks Using BASS-Net based on High-resolution Remote Sensing Images
Image processing techniques are usually applied to extract typical objects in Chemical Industry Parks (CIPs). However, their precision is considered insufficient for the monitoring and management of CIPs. The purpose of this study is to explore the feasibility of deep learning methods in the extraction of typical objects in CIPs. This study applied the convolutional neural network BASS-Net to build a typical-object recognition model of CIPs from high-resolution remote sensing images. The results showed that the overall recognition accuracy, recall rate, and F1 score of the BASS-Net model for typical objects in CIPs are 97.17%, 97.76%, and 97.46%, and the accuracy, recall rate, and F1 score for each of the 18 typical classes reach more than 93%, indicating that the trained BASS-Net model is able to classify all the typical classes in CIPs. Comparison with the results of RF and SVM shows that the BASS-Net model is far superior to the other two models. The BASS-Net model can be expected to provide support for environmental monitoring and management in CIPs.
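The abstract evaluates the models with three multi-class metrics: overall accuracy, recall, and F1 score. As a minimal sketch of how such metrics are computed with macro-averaging over classes (the class names and labels below are hypothetical illustrations, not the paper's data):

```python
def classification_metrics(y_true, y_pred):
    """Overall accuracy plus macro-averaged precision, recall, and F1."""
    classes = sorted(set(y_true) | set(y_pred))
    n_samples = len(y_true)
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / n_samples
    precisions, recalls, f1s = [], [], []
    for c in classes:
        # Per-class true positives, false positives, false negatives
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        precisions.append(prec)
        recalls.append(rec)
        f1s.append(f1)
    n = len(classes)
    return accuracy, sum(precisions) / n, sum(recalls) / n, sum(f1s) / n

# Hypothetical labels for three land-cover classes (illustration only)
y_true = ["tank", "tank", "road", "road", "building", "building"]
y_pred = ["tank", "road", "road", "road", "building", "building"]
acc, prec, rec, f1 = classification_metrics(y_true, y_pred)
```

In practice a library routine such as scikit-learn's `precision_recall_fscore_support(..., average="macro")` would typically be used; the plain-Python version above only makes the per-class arithmetic explicit.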

Deep learning; Convolutional Neural Network (CNN); Chemical Industry Parks (CIPs); Machine learning; Typical object recognition model

孙维维、刘杰、张芳芳、马海艺、王昌昆、潘贤章


State Key Laboratory of Soil and Sustainable Agriculture, Institute of Soil Science, Chinese Academy of Sciences, Nanjing 210008, Jiangsu, China

University of Chinese Academy of Sciences, Beijing 100049, China


National Key Research and Development Program of China (2020YFC1807401); Open Fund of the Key Laboratory of Soil Environment and Pollution Remediation, Chinese Academy of Sciences (SEPR2020-10)

2024

Remote Sensing Technology and Application (遥感技术与应用)
Joint Remote Sensing Center, Chinese Academy of Sciences


Indexed in: CSTPCD; Peking University Core Journals (北大核心)
Impact factor: 0.961
ISSN:1004-0323
Year, Volume (Issue): 2024, 39(3)