
Research on Classification Method of Airborne Geophysical Remote Sensing Data Based on Self-Coding Neural Network

During acquisition, airborne geophysical remote sensing data are affected by external factors such as electromagnetic wave radiation, which lowers classification accuracy. A classification method for airborne geophysical remote sensing data based on a self-coding (autoencoder) neural network is therefore proposed. Classification standards for the remote sensing data are set according to the basic characteristics of the airborne geophysical exploration objects. Preprocessing of the data is completed through steps such as radiometric correction, geometric correction, and noise elimination. A self-coding neural network is then built, and the self-coding algorithm is used to extract data features in terms of spectrum, shape, texture, and other aspects; the class of each sample is determined by feature matching. Classification performance tests show that the proposed method achieves average success and error rates of 99.8% and 0.6%, respectively, for global remote sensing data classification, and 99.8% and 0.3%, respectively, for local remote sensing data classification, indicating a clear advantage in classification performance.
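The encode-then-match pipeline the abstract describes can be sketched as a minimal single-hidden-layer autoencoder trained by gradient descent. Everything here is an illustrative assumption, not the authors' actual configuration: the synthetic low-rank "spectral" data, the 16-band input, the 4-unit latent layer, and the learning-rate/iteration choices.

```python
# Minimal autoencoder sketch (illustrative assumptions throughout):
# 16-band "spectral" samples are compressed to 4 latent features that
# could then be used for feature matching, as the abstract describes.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Synthetic stand-in for preprocessed remote-sensing samples:
# low-rank data so a small latent layer can actually reconstruct it.
Z = rng.random((200, 4))            # hidden "true" factors
M = rng.random((4, 16))             # mixing into 16 observed bands
X = Z @ M
X = X / X.max()                     # scale to [0, 1]

n_in, n_hidden = 16, 4              # compress 16 bands to 4 features
W1 = rng.normal(0.0, 0.1, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.1, (n_hidden, n_in)); b2 = np.zeros(n_in)

# Reconstruction error before training, for comparison.
err0 = float(np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - X) ** 2))

lr = 0.5
for _ in range(2000):
    H = sigmoid(X @ W1 + b1)        # encode: latent features
    Y = sigmoid(H @ W2 + b2)        # decode: reconstruction
    # Backpropagate mean-squared reconstruction error.
    dY = (Y - X) * Y * (1 - Y) / len(X)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * (H.T @ dY); b2 -= lr * dY.sum(axis=0)
    W1 -= lr * (X.T @ dH); b1 -= lr * dH.sum(axis=0)

features = sigmoid(X @ W1 + b1)     # 4-dimensional feature vectors
recon_err = float(np.mean((sigmoid(features @ W2 + b2) - X) ** 2))
print(features.shape, err0, recon_err)
```

The latent `features` array is what a downstream feature-matching step would compare against class prototypes; a real system would train on actual corrected imagery and likely use a deeper network.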

self-coding neural network; aviation data; geophysical remote sensing data; data classification

于刘


Shanghai Aviation Industry (Group) Co., Ltd., Shanghai 201206, China


2024

Computer Measurement & Control
China Association for Computer Automated Measurement and Control Technology

CSTPCD
Impact factor: 0.546
ISSN:1671-4598
Year, Volume (Issue): 2024, 32(3)