Detection Sample Augmentation Based on Building Index and Adversarial Network
In deep-learning-based large-scale target detection for remote sensing images, samples of some ground objects are difficult to obtain, which leads to poor training results. Therefore, the morphological building index and a generative adversarial network were combined for sample augmentation, reducing the model overfitting caused by insufficient detection samples. The morphological building index, which captures texture and structure information, was extracted and superimposed on the original samples to strengthen the texture and spatial characteristics of buildings. Meanwhile, the existing samples were used to train a generative adversarial network to augment certain target categories, and the generated samples were composited with those enhanced by the morphological building index to expand the original sample set. Compared with augmentation strategies such as flipping, cropping, and color changes, this method improved detection accuracy by 2%-5% on YOLOv5, EfficientDet, and other models. Experiments show that the sample augmentation method combining the building index and a generative adversarial network significantly improves small-sample remote sensing target detection accuracy for special categories of interest such as power stations.
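The abstract outlines a two-step pipeline: superimpose a morphological building index (MBI) channel on each sample, then composite GAN-generated object chips into the enhanced scenes. The paper gives no implementation details here, so the sketch below is only illustrative: it uses a simplified isotropic-disk variant of MBI (the full index uses directional linear structuring elements), and the helper names `simplified_mbi`, `stack_mbi_channel`, and `composite_chip` are invented for this example.

```python
# Illustrative sketch only, not the authors' code. Assumes float images
# with values in [0, 1]; RGB scenes of shape (H, W, 3).
import random
import numpy as np
from skimage.morphology import disk, white_tophat

def simplified_mbi(rgb, scales=(3, 7, 11, 15)):
    """Approximate morphological building index.

    Simplification: isotropic disk footprints replace the directional
    linear structuring elements of the full MBI.
    """
    brightness = rgb.max(axis=2)  # per-pixel max over bands
    wth = [white_tophat(brightness, footprint=disk(s)) for s in scales]
    # Differential morphological profile between consecutive scales.
    dmp = [np.abs(wth[i + 1] - wth[i]) for i in range(len(wth) - 1)]
    return np.mean(dmp, axis=0)

def stack_mbi_channel(rgb, mbi):
    """Superimpose the index on the original sample as an extra channel."""
    mbi = (mbi - mbi.min()) / (mbi.max() - mbi.min() + 1e-8)
    return np.dstack([rgb, mbi])  # (H, W, 4)

def composite_chip(scene, chip):
    """Paste a GAN-generated object chip at a random position.

    chip must have the same channel count as scene. Returns the new
    sample and an (x_min, y_min, x_max, y_max) detection box.
    """
    (H, W), (h, w) = scene.shape[:2], chip.shape[:2]
    x0, y0 = random.randint(0, W - w), random.randint(0, H - h)
    out = scene.copy()
    out[y0:y0 + h, x0:x0 + w] = chip  # naive paste, no blending
    return out, (x0, y0, x0 + w, y0 + h)
```

In practice one would also blend chip edges and reject pastes that overlap existing annotations; the flip/crop/color-jitter strategies mentioned in the abstract are the baselines this pipeline is compared against.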

image processing; target detection; sample augmentation; building index; generative adversarial network

Wang Wei, Lu Donghua, Gao Yan, Zhang Yiting


National Key Laboratory of Remote Sensing Information and Image Analysis Technology, Beijing Research Institute of Uranium Geology, Beijing 100029, China


Project of the National Key Laboratory of Remote Sensing Information and Image Analysis Technology, Beijing Research Institute of Uranium Geology

6142A01210101-1

2024

Science Technology and Engineering
Chinese Society of Technology Economics

Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 0.338
ISSN: 1671-1815
Year, Volume (Issue): 2024, 24(3)