Extraction of rice from remote sensing images based on deep heterogeneous transfer learning
In order to achieve high-quality construction and reuse of rice extraction models on heterogeneous remote sensing images with only unlabeled samples in the target domain, a deep heterogeneous feature transfer learning model based on temporal and spatial constraints was constructed. Firstly, unlabeled sample groups in the source and target domains were constructed based on spatial location, and their deep features were extracted. Secondly, to reduce negative feature transfer and realize precise transfer of heterogeneous features, a heterogeneous feature transfer model was constructed using a composite loss function comprising a corresponding-sample feature conversion loss, a corresponding-sample feature regularization loss, and a sample reconstruction loss. Finally, to improve classification accuracy, a semi-supervised classification model was established, in which a hinge loss was introduced to eliminate the impact of wrong pseudo-labels. The results showed that the proposed method could realize sample feature transfer between images at different resolutions; compared with the case without feature transfer, accuracy improved by 27.68 percentage points and the F1 score improved by 17.3 percentage points.
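The composite objective described in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the linear mappings `W` and `V`, the L2 form of the regularization term, and the weightings `alpha`/`beta`/`gamma` are assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes: 4 spatially paired samples, 16-dim source
# deep features, 8-dim target deep features.
f_src = rng.normal(size=(4, 16))
f_tgt = rng.normal(size=(4, 8))
W = 0.1 * rng.normal(size=(16, 8))   # assumed source-to-target feature mapping
V = 0.1 * rng.normal(size=(8, 16))   # assumed mapping back for reconstruction

def conversion_loss(f_src, f_tgt, W):
    """Mean squared distance between mapped source features and the
    corresponding (same-location) target features."""
    return np.mean((f_src @ W - f_tgt) ** 2)

def regularization_loss(W, V):
    """L2 penalty on the mapping weights (one common choice for a
    feature-regularization term; the paper's exact form is not given)."""
    return np.sum(W ** 2) + np.sum(V ** 2)

def reconstruction_loss(f_src, W, V):
    """Map source features to the target space and back, then penalize
    the difference from the original features."""
    return np.mean((f_src @ W @ V - f_src) ** 2)

def composite_loss(f_src, f_tgt, W, V, alpha=1.0, beta=0.01, gamma=1.0):
    """Weighted sum of the three terms named in the abstract."""
    return (alpha * conversion_loss(f_src, f_tgt, W)
            + beta * regularization_loss(W, V)
            + gamma * reconstruction_loss(f_src, W, V))

def hinge_loss(scores, labels, margin=1.0):
    """Hinge loss on pseudo-labeled samples (labels in {-1, +1});
    predictions beyond the margin contribute zero loss, which limits
    the influence of confidently wrong pseudo-labels."""
    return np.mean(np.maximum(0.0, margin - labels * scores))
```

Minimizing `composite_loss` over `W` and `V` (e.g. by gradient descent) would align the two feature spaces; the hinge loss then scores the semi-supervised classifier on pseudo-labeled target samples.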

Keywords: unlabeled sample; extraction of rice; high resolution remote sensing images; deep heterogeneous transfer learning

邱儒琼、何丽华、李孟璠


National Engineering Research Center of Geographic Information System, China University of Geosciences (Wuhan), Wuhan 430074, China

Hubei Provincial Development Planning Research Institute Co., Ltd., Wuhan 430071, China

Hubei Geographical National Conditions Monitoring Center, Wuhan 430071, China


Funding: Scientific Research Project of the Department of Natural Resources of Hubei Province (ZRZY2021KJ03)

2024

Hubei Agricultural Sciences
Sponsored by: Hubei Academy of Agricultural Sciences, Huazhong Agricultural University, Yangtze University, Huanggang Normal University

CSTPCD
Impact factor: 0.442
ISSN:0439-8114
Year, Volume (Issue): 2024, 63(8)