
Hyperspectral Image Classification Based on Multi-Scale Graph Convolution

In recent years, convolutional neural networks have made remarkable progress in hyperspectral image classification, but they can only perform regular grid operations on images and cannot aggregate features adaptively. This paper therefore proposes a hyperspectral image classification method based on a segmented-forest multi-scale graph convolutional neural network, which consists of four steps. First, principal component analysis is used for dimensionality reduction, a multi-scale segmented forest is constructed from the spatial information of the image, and the relationships between subtrees are established. Then, a U-net architecture based on graph convolutional networks is proposed, in which pooling and unpooling convert graph-structured features between scales. The network aggregates features adaptively through graph convolution and fuses multi-scale features via skip connections between the encoder and decoder. Finally, semi-supervised node classification is performed with SoftMax. Experiments on public hyperspectral datasets achieve good classification accuracy, demonstrating the effectiveness of the method.
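
As a concrete illustration of the first step, the sketch below applies PCA band reduction to a hyperspectral cube. It is a minimal example assuming NumPy and scikit-learn; the cube size and the choice of 30 components are illustrative assumptions, not values from the paper.

# Minimal PCA dimensionality-reduction sketch (assumed sizes, not the paper's).
import numpy as np
from sklearn.decomposition import PCA

def reduce_bands(cube, n_components=30):
    # Flatten the (height, width, bands) cube to (pixels, bands), project the
    # spectral axis onto the leading principal components, and reshape back.
    h, w, b = cube.shape
    flat = cube.reshape(-1, b)
    reduced = PCA(n_components=n_components).fit_transform(flat)
    return reduced.reshape(h, w, n_components)

# Random data standing in for a real scene (e.g. a 145 x 145 x 200 cube).
cube = np.random.rand(145, 145, 200)
print(reduce_bands(cube).shape)  # (145, 145, 30)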
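
The adaptive aggregation at the heart of the method can be sketched with the standard graph-convolution propagation rule, followed by the SoftMax-based semi-supervised node loss the abstract mentions. This is a NumPy sketch, not the paper's implementation: the segmented-forest pooling/unpooling and the encoder-decoder skip connections are omitted, and the toy graph, feature sizes, and labelled-node mask are assumptions chosen only to show neighbour aggregation over a graph rather than a fixed grid.

import numpy as np

def gcn_layer(A, H, W):
    # H' = D^-1/2 (A + I) D^-1/2 H W: each node aggregates the features of
    # its graph neighbours instead of a regular grid window.
    A_hat = A + np.eye(A.shape[0])                 # adjacency with self-loops
    d = 1.0 / np.sqrt(A_hat.sum(axis=1))           # D^-1/2 as a vector
    return (A_hat * d[:, None] * d[None, :]) @ H @ W

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Toy graph: 5 nodes (think subtrees of the segmented forest), 4-dim features,
# 3 classes; adjacency, features and weights are random placeholders.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float)
probs = softmax(gcn_layer(A, rng.random((5, 4)), rng.random((4, 3))))

# Semi-supervised training evaluates the loss only on labelled nodes:
labels = np.array([0, 1, 2, 1, 0])
labelled = np.array([0, 2])                        # indices with known labels
loss = -np.log(probs[labelled, labels[labelled]]).mean()
print(probs.shape, loss)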

Keywords: hyperspectral image; multiscale; segmented forest; graph convolutional neural network; subtree

Wen Xin, Li Lu, Fan Junfang, Hu Zhifeng, Zhou Feng, Wu Yaping


School of Automation, Beijing Information Science and Technology University, Beijing 100192, China

Key Laboratory of Modern Measurement and Control Technology, Ministry of Education, Beijing Information Science and Technology University, Beijing 100192, China

China Coal Aerial Survey and Remote Sensing Group Co., Ltd., Xi'an 710100, China

Hubei Institute of Surveying and Mapping Engineering, Wuhan 430070, China



Funding: High-Level Research and Innovation Team Building Support Program of Beijing Municipal Universities; Beijing Natural Science Foundation

BPHR20220123; 4214072

2024

Laser & Infrared (激光与红外)
North China Research Institute of Electro-Optics

Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 0.723
ISSN: 1001-5078
Year, Volume (Issue): 2024, 54(8)