Few-Shot Object Detection on Remote Sensing Images Based on Feature Weighting and Fusion
Object detectors based on deep convolutional neural networks require a large number of labeled samples for training. To address the poor generalization of detectors trained on insufficient samples, this paper proposes a few-shot object detection method for remote sensing images via feature weighting and fusion, based on meta-feature modulation. First, a feature learning module with a bottleneck structure (C3) is embedded in the meta-feature extraction network to increase network depth and receptive field. Second, a path aggregation network (PAN) is used for meta-feature fusion, which effectively enhances the network's perception of multi-scale remote sensing objects. Then, prototype vectors learned by a lightweight convolutional neural network are used to weight the meta-features, which transfers model knowledge from the base classes to the novel classes while keeping the model lightweight, allowing rapid fine-tuning for detection of novel-class objects. Experimental results show that on the NWPU VHR-10 and DIOR datasets, the proposed method improves the mean average precision on novel classes of remote sensing objects by 29.40% and 11.78%, respectively, compared with the FSODM method. Moreover, visualization results demonstrate that the method performs better on few-shot remote sensing object detection.
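The feature-weighting step described above follows the common meta-feature reweighting idea: each class prototype vector modulates the shared meta-feature map channel-wise, producing one class-specific feature map per support class. A minimal NumPy sketch of that operation, with all shapes and names hypothetical (the paper's actual layer sizes are not given here):

```python
import numpy as np

def reweight_meta_features(meta_features, prototypes):
    """Channel-wise modulation of meta-features by per-class prototype vectors.

    meta_features: (C, H, W) feature map from the meta-feature extractor.
    prototypes:    (N, C) one learned prototype vector per support class.
    Returns:       (N, C, H, W) class-specific feature maps, one per class.
    """
    c, h, w = meta_features.shape
    n, c2 = prototypes.shape
    assert c == c2, "prototype dimension must match feature channels"
    # Broadcast each prototype over the spatial dimensions and multiply:
    # channel i of class k's output is prototypes[k, i] * meta_features[i].
    return prototypes[:, :, None, None] * meta_features[None, :, :, :]

# Toy example: 4-channel meta-features, 2 support classes.
feats = np.ones((4, 8, 8))
protos = np.array([[1.0, 0.0, 2.0, 0.5],
                   [0.5, 1.0, 0.0, 1.0]])
out = reweight_meta_features(feats, protos)
print(out.shape)  # (2, 4, 8, 8)
```

Each of the N reweighted maps would then be passed to a shared detection head, so adding a novel class only requires learning its prototype vector rather than retraining the detector.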

Keywords: remote sensing dataset; few-shot object detection; C3-Darknet feature extraction network; multi-feature fusion; feature weighting

SONG Yunkai, WU Yuanxu, YE Yunyao, XIAO Jinsheng


School of Electronic Information, Wuhan University, Wuhan 430072, Hubei, China


Funding: National Social Science Fund of China (20BKG031)

2024

Journal: Software Guide (软件导刊)
Publisher: Hubei Information Society
Impact factor: 0.524
ISSN: 1672-7800
Year, volume (issue): 2024, 23(4)