
Remote Sensing Image Sample Augmentation Method Based on Pix2pix Network

Deep-learning-based land classification of remote sensing imagery requires massive datasets as training samples, but labeled image datasets are often too small to meet training requirements, and augmenting existing samples is therefore an effective technique. Traditional data augmentation only alters image properties such as color and sharpness, and the number of samples it can add is limited. To automatically produce a much larger and more diverse set of samples, this study designs a remote sensing image sample augmentation method based on the Pix2pix network. The Pix2pix generator produces virtual images from unmanned aerial vehicle (UAV) and Google image labels, and the discriminator compares the virtual images with the real images; after many rounds of adversarial training, the network outputs image-label sample pairs, achieving augmentation. The results show that the generated images are visually very similar to the originals: relative to the original images, the average cosine similarity of the UAV and Google imagery is 0.85 and 0.96, respectively, and the average histogram similarity is 0.50 and 0.61, indicating that this is an effective method for remote sensing image sample augmentation.
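The abstract evaluates the generated samples with two measures against the original images: average cosine similarity and average histogram similarity. The record does not specify how these were computed, so the sketch below is only an assumed, minimal Python/OpenCV implementation of two plausible definitions: cosine similarity over flattened pixel vectors and per-channel histogram correlation. The function names and the placeholder file names (real.png, generated.png) are illustrative, not from the paper.

```python
# Minimal sketch (not the authors' code) of the two similarity metrics
# mentioned in the abstract, under the assumptions stated above.
import cv2
import numpy as np


def cosine_similarity(img_a: np.ndarray, img_b: np.ndarray) -> float:
    """Cosine similarity between two images treated as flat pixel vectors."""
    a = img_a.astype(np.float64).ravel()
    b = img_b.astype(np.float64).ravel()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))


def histogram_similarity(img_a: np.ndarray, img_b: np.ndarray, bins: int = 256) -> float:
    """Average per-channel histogram correlation (assumed comparison method)."""
    scores = []
    for ch in range(img_a.shape[2]):
        h_a = cv2.calcHist([img_a], [ch], None, [bins], [0, 256])
        h_b = cv2.calcHist([img_b], [ch], None, [bins], [0, 256])
        cv2.normalize(h_a, h_a)
        cv2.normalize(h_b, h_b)
        scores.append(cv2.compareHist(h_a, h_b, cv2.HISTCMP_CORREL))
    return float(np.mean(scores))


if __name__ == "__main__":
    # Placeholder file names: a real image and its Pix2pix-generated counterpart.
    real = cv2.imread("real.png")
    fake = cv2.imread("generated.png")
    fake = cv2.resize(fake, (real.shape[1], real.shape[0]))
    print("cosine similarity:   ", cosine_similarity(real, fake))
    print("histogram similarity:", histogram_similarity(real, fake))
```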

Data augmentation; Pix2pix network; Deep learning; Remote sensing image

Xie Weiyi, Xu Xijie, Rui Xiaoping, Zou Yarong


School of Earth Sciences and Engineering, Hohai University, Nanjing 211100, Jiangsu, China

School of Electronic Engineering, Queen Mary, University of London, London E1 4NS, UK

National Satellite Ocean Application Service, Ministry of Natural Resources, Beijing 100081, China

Key Laboratory of Space Ocean Remote Sensing and Application, Ministry of Natural Resources, Beijing 100081, China


Sample augmentation; Pix2pix network; Deep learning; Remote sensing image

2024

遥感技术与应用 (Remote Sensing Technology and Application)
Remote Sensing Joint Center, Chinese Academy of Sciences


Indexed in: CSTPCD; Peking University Core Journals (北大核心)
Impact factor: 0.961
ISSN: 1004-0323
Year, Volume (Issue): 2024, 39(5)