
A multi-temporal remote sensing image cloud removal algorithm combining U-Net and STGAN

Cloud cover in optical remote sensing images can degrade or even completely obscure some of the ground-cover information in a scene, limiting Earth observation, change detection, and land-cover classification, which makes cloud removal an important and urgent task. To restore ground regions occluded by clouds, a two-stage cloud removal algorithm based on multi-temporal remote sensing images is proposed. The first stage is cloud segmentation, in which U-Net is used directly to extract clouds and remove thin clouds. The second stage is image restoration, in which a spatiotemporal generative network (STGAN) removes thick clouds; the generative model of STGAN adopts an improved multi-input U-Net that extracts key features from a sequence of seven images of the same location at once to restore the corresponding irregular regions covered by thick clouds. The thin-cloud processing in the first stage helps the subsequent STGAN capture more ground information. Experimental results show that, compared with traditional cloud removal methods and deep learning algorithms such as Pix2Pix, the proposed algorithm achieves significant improvements both in visual quality and in objective quality metrics such as peak signal-to-noise ratio (PSNR) and structural similarity (SSIM), which benefits the further use of optical remote sensing images.
Cloud removal in multitemporal remote sensing imagery combining U-Net and spatiotemporal generative networks
Cloud occlusion often occurs in optical remote sensing images. It can reduce or even completely obscure some of the ground-cover information in an image, which limits Earth observation, change detection, and land-cover classification; cloud removal is therefore an important task that urgently needs to be solved. Thin and thick clouds usually coexist in optical remote sensing images, and cloud removal algorithms for single-frame images are only suitable for thin-cloud occlusion. Removing clouds with multi-temporal remote sensing images of the same area acquired at different times has therefore become a major issue. This study aims to make full use of cloud-free acquisitions of the same location to replace cloud-occluded images and restore the ground areas obscured by clouds. To this end, a two-stage cloud removal algorithm for multi-temporal remote sensing images based on U-Net and a spatiotemporal generative network (STGAN) is proposed. The first stage is cloud segmentation, which directly uses a U-Net model to extract clouds and remove thin clouds. The second stage is image restoration, which uses STGAN to remove thick clouds: the seven frames of ground images obtained after thin-cloud removal are fed into the STGAN model to produce a single, detail-rich cloud-free image. The generative model of STGAN adopts an improved multi-input U-Net that extracts key features from the seven frames of the same location at once to restore the corresponding irregular areas covered by thick clouds. The thin-cloud processing in the first stage helps the subsequent STGAN capture more ground information. The proposed algorithm overcomes the inability of U-Net alone to handle thick-cloud occlusion and captures more ground information than using STGAN alone, yielding a better cloud removal effect. Experimental results on our dataset show that using only the first-stage U-Net model or only the second-stage STGAN model is inferior to the proposed two-stage algorithm in both subjective visual quality and objective quantitative metrics such as peak signal-to-noise ratio and structural similarity, which verifies the effectiveness of the proposed method. Compared with traditional cloud removal methods such as RPCA and TRPCA and with deep learning algorithms such as Pix2Pix, the proposed algorithm achieves a significant improvement, which verifies its advancement. The algorithm makes full use of the spatiotemporal information in multi-temporal cloudy satellite images of the same area acquired at different times and achieves good cloud removal performance, which benefits the further use of optical remote sensing images. Although the proposed algorithm achieves a relatively good cloud removal effect, it also has limitations: its performance is not ideal for image sequences in which large areas are covered by thick clouds. In follow-up research, the spatiotemporal features of the image-sequence frames will be explored to better reconstruct large areas covered by thick clouds.
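To make the two-stage pipeline concrete, the following is a minimal PyTorch sketch of the design described above: a U-Net first predicts a cloud mask and a thin-cloud-corrected version of each frame, and a multi-input U-Net generator then fuses the seven corrected frames (stacked along the channel axis) into one cloud-free image. The module names, channel counts, network depth, and the channel-wise stacking strategy are illustrative assumptions, not the authors' implementation; the GAN discriminator, loss functions, and training loop are omitted.

```python
# Minimal sketch (PyTorch assumed) of a two-stage cloud removal pipeline.
# Sizes and names are illustrative, not the published configuration.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    """Two 3x3 conv + ReLU layers, the basic U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )


class TinyUNet(nn.Module):
    """A shallow U-Net with one skip connection (depth reduced for brevity)."""

    def __init__(self, in_ch, out_ch, base=32):
        super().__init__()
        self.enc1 = conv_block(in_ch, base)
        self.enc2 = conv_block(base, base * 2)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.dec1 = conv_block(base * 2, base)
        self.head = nn.Conv2d(base, out_ch, 1)

    def forward(self, x):
        e1 = self.enc1(x)                                     # full-resolution features
        e2 = self.enc2(self.pool(e1))                         # downsampled features
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))   # skip connection
        return self.head(d1)


class TwoStageCloudRemoval(nn.Module):
    """Stage 1: per-frame U-Net cloud segmentation / thin-cloud correction.
    Stage 2: STGAN-style generator (multi-input U-Net) that fuses the seven
    corrected frames into a single cloud-free image."""

    def __init__(self, bands=3, frames=7):
        super().__init__()
        self.frames = frames
        # Stage 1 predicts a 1-channel cloud mask plus a corrected frame.
        self.seg_net = TinyUNet(bands, 1 + bands)
        # Stage 2 stacks all corrected frames along the channel axis.
        self.generator = TinyUNet(frames * bands, bands)

    def forward(self, seq):
        # seq: (batch, frames, bands, H, W) multi-temporal image sequence
        b, t, c, h, w = seq.shape
        out = self.seg_net(seq.reshape(b * t, c, h, w))
        mask = torch.sigmoid(out[:, :1])                      # per-pixel cloud probability
        corrected = out[:, 1:].reshape(b, t * c, h, w)        # thin-cloud-removed frames
        cloud_free = self.generator(corrected)                # single cloud-free image
        return cloud_free, mask.reshape(b, t, 1, h, w)


if __name__ == "__main__":
    model = TwoStageCloudRemoval()
    x = torch.randn(2, 7, 3, 128, 128)                        # toy 7-frame RGB sequence
    img, mask = model(x)
    print(img.shape, mask.shape)                              # (2,3,128,128) (2,7,1,128,128)
```

In a full system, the stage-1 mask would also drive thin-cloud removal (e.g., masking or correcting cloud pixels before stacking), and the generator would be trained adversarially against a discriminator as in STGAN; those components are left out of this sketch.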

remote sensing images; multi-temporal; cloud removal; image restoration; U-Net; STGAN

王卓、马骏、郭毅、周川杰、柏彬、李峰


School of Software, Henan University, Kaifeng 475100, China

Qian Xuesen Laboratory of Space Technology, China Academy of Space Technology, Beijing 100094, China

Western Sydney University, Sydney 2751, Australia

Beijing Institute of Remote Sensing Information, Beijing 100192, China



National Key Research and Development Program of China

2020YFA04100

2024

National Remote Sensing Bulletin (遥感学报)
Sponsored by: Environmental Remote Sensing Branch, Geographical Society of China; Institute of Remote Sensing Applications, Chinese Academy of Sciences


Indexed in: CSTPCD; PKU Core Journals (北大核心)
Impact factor: 2.921
ISSN: 1007-4619
Year, Volume (Issue): 2024, 28(8)