
Remote Sensing Image Fusion Based on Joint Spectral and Spatial Dual-scale Detail Injection

An adaptive remote sensing image fusion method based on guided-filter enhancement and joint spectral-scale and spatial-scale detail injection is designed. First, with the panchromatic image as the guidance image, guided filtering is applied to the multispectral image in a region-adaptive manner, sharpening texture-rich regions and smoothing spectrum-rich regions to obtain an enhanced multispectral image. Second, the classical component-substitution and multi-resolution-analysis methods are used to extract the spectral-scale and spatial-scale difference details, respectively, which are combined through mutual information into a dual-scale detail image. Third, the spectral correlation between the original multispectral image and its intensity component, together with the edge information of the enhanced multispectral image, is used as the injection constraint on the detail image to obtain the detail-injection image. Finally, the multispectral image is added to the joint spatial- and spectral-dual-scale detail-injection image to obtain the fused high-resolution multispectral image. Experiments on four types of remote sensing data, IKONOS, QuickBird, WorldView-4, and GF-2, show that, compared with other fusion methods, the proposed method performs well in both subjective visual quality and objective quantitative metrics, verifying its effectiveness.
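As a reading aid, and not part of the original abstract, the injection rule summarized above can be sketched in the following form; the symbols (D_CS, D_MRA, w, E, I, g_k) are illustrative names introduced here, and the paper's exact definitions may differ:

D = w\,D_{\mathrm{CS}} + (1 - w)\,D_{\mathrm{MRA}}
F_k = \widetilde{\mathrm{MS}}_k + g_k \odot E \odot D, \qquad g_k = \widetilde{\mathrm{MS}}_k / I

where D_CS and D_MRA are the spectral-scale and spatial-scale difference details, w is a weight derived from their normalized mutual information, E is the edge matrix detected on the enhanced multispectral image, I is the intensity component, \widetilde{MS}_k is the k-th band of the upsampled multispectral image, g_k is the band-wise spectral-preservation coefficient, and \odot denotes element-wise multiplication.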
Remote Sensing Image Fusion Based on Joint Spectral and Spatial Dual-scale Detail Injection
Remote sensing image fusion is a crucial process that significantly enhances the quality of remote sensing data by integrating multispectral and panchromatic images. However, this integration poses challenges at both the spectral and spatial scales. Traditional fusion methods, such as Intensity-Hue-Saturation (IHS) transformation among Component Substitution (CS) methods and wavelet-transform fusion among Multi-Resolution Analysis (MRA) methods, each have their own advantages and disadvantages when generating high-resolution multispectral images: the former enhances spatial resolution significantly but introduces noticeable spectral distortion, while the latter preserves spectral information well but provides limited spatial-resolution enhancement. To fully leverage the strengths of both types of methods, we introduce a novel fusion method that combines spectral-scale and spatial-scale details. The proposed method consists of three stages.

The first stage is multispectral image enhancement preprocessing, which takes into account the different demands of different land features in remote sensing images for spectral and spatial information. The panchromatic image is used as the guidance image, and guided filtering is applied to enhance the multispectral image. After enhancement, the texture-rich regions of the multispectral image are sharpened and their gradient information is increased, while the spectrum-rich regions are smoothed by averaging filtering to preserve spectral information. The enhanced multispectral image is the basis for controlling the amount of detail injected in the subsequent stage.

The second stage is detail injection, the key component of the proposed method, which includes detail extraction, detail-injection coefficient calculation, and spectral-preservation coefficient calculation. First, spectral-scale difference details and spatial-scale difference details are extracted from the multispectral and panchromatic images using a classical component-substitution method and a multi-scale analysis method, respectively. Because the extracted details contain some redundant information, normalized mutual information is used to calculate the ratio between the two types of details, and a dual-scale detail image is generated by linear weighting. To ensure that the injected details align with the original panchromatic image, edge detection is performed on the enhanced multispectral image using gradient operators, and the resulting edge matrix serves as a constraint on detail injection. In addition, a spectral-preservation coefficient is calculated from the spectral correlation between the original multispectral image and its intensity component, so that spectral relationships are maintained during fusion. Finally, the dual-scale detail image, the edge-detection matrix, and the spectral-preservation coefficient are multiplied to obtain the final amount of detail to be injected.

The third stage is image fusion, in which the upsampled original multispectral image is added pixel-wise to the detail-injection image from the second stage to generate the fused high-resolution multispectral image.

To validate the effectiveness of the proposed method, experiments were conducted on four types of remote sensing datasets: IKONOS, QuickBird, WorldView-4, and GF-2. The test images contain various terrain elements, such as vegetation, buildings, and water bodies, to reflect the different requirements of different land features for spectral and spatial information. Across the four sets of experiments, the fused true-color composites produced by the proposed method are similar in color to the original multispectral images, with clear edges and rich textures. In terms of objective quality evaluation, the experiments on the IKONOS and WorldView-4 datasets achieved the best or second-best results for Dλ, Ds, and QNR; for the remaining two indicators there was no significant advantage, but the subjective visual quality was clearly better than that of the comparison methods. In conclusion, the proposed method, which combines spectral-scale and spatial-scale detail injection, addresses the shortcoming that single-scale detail extraction captures insufficient detail information, adapts better to the characteristics of different land features, and improves the detail and accuracy of the fusion results.
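The Python sketch below is not the authors' code; it is a minimal, self-contained illustration of the three-stage pipeline described in the abstract, assuming co-registered floating-point images with the multispectral image already upsampled to the panchromatic grid. The helper names (guided_filter, nmi_weight, fuse), the mean-based intensity component, the Gaussian low-pass stand-in for the MRA decomposition, and the ratio-based spectral-preservation coefficient are all assumptions introduced here for illustration.

import numpy as np
from scipy import ndimage


def guided_filter(guide, src, radius=4, eps=1e-3):
    """He-style guided filter on 2-D float arrays (guide = PAN, src = one MS band)."""
    size = 2 * radius + 1
    mean = lambda x: ndimage.uniform_filter(x, size=size)
    mean_i, mean_p = mean(guide), mean(src)
    var_i = mean(guide * guide) - mean_i * mean_i
    cov_ip = mean(guide * src) - mean_i * mean_p
    a = cov_ip / (var_i + eps)
    b = mean_p - a * mean_i
    return mean(a) * guide + mean(b)


def nmi_weight(a, b, bins=64):
    """Histogram-based normalized mutual information in [0, 1], used as a blending weight."""
    pxy, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    entropy = lambda p: -np.sum(p[p > 0] * np.log(p[p > 0]))
    hx, hy, hxy = entropy(px), entropy(py), entropy(pxy)
    return 2.0 * (hx + hy - hxy) / (hx + hy + 1e-12)


def fuse(pan, ms_up, radius=4, eps=1e-3):
    """Illustrative joint spectral/spatial dual-scale detail-injection fusion."""
    bands = ms_up.shape[2]

    # Stage 1: enhance each upsampled MS band with the PAN-guided filter.
    ms_enh = np.stack(
        [guided_filter(pan, ms_up[..., k], radius, eps) for k in range(bands)], axis=2
    )

    # Stage 2a: spectral-scale (CS-style) and spatial-scale (MRA-style) difference details.
    intensity = ms_up.mean(axis=2)                        # simple intensity component
    d_cs = pan - intensity                                # PAN minus intensity
    d_mra = pan - ndimage.gaussian_filter(pan, sigma=2)   # PAN minus low-pass PAN
    w = nmi_weight(d_cs, d_mra)
    detail = w * d_cs + (1.0 - w) * d_mra                 # dual-scale detail image

    # Stage 2b: edge constraint from the enhanced MS (normalized gradient magnitude).
    lum = ms_enh.mean(axis=2)
    grad = np.hypot(ndimage.sobel(lum, axis=0), ndimage.sobel(lum, axis=1))
    edge = grad / (grad.max() + 1e-12)

    # Stage 2c + 3: band-wise spectral-preservation coefficient, then pixel-wise addition.
    fused = np.empty_like(ms_up, dtype=float)
    for k in range(bands):
        g = ms_up[..., k] / (intensity + 1e-12)
        fused[..., k] = ms_up[..., k] + g * edge * detail
    return fused

A typical call would be fused = fuse(pan, ms_up), with pan of shape (H, W) and ms_up of shape (H, W, B); the injected amount per band is the product of the spectral-preservation coefficient, the edge matrix, and the dual-scale detail image, mirroring the second stage described above.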

Guided filter; Mutual information; Gradient detection; Detail injection; Remote sensing image fusion

王淑香、金飞、林雨准、芮杰、左溪冰、刘潇、杨小兵


Institute of Geospatial Information, Information Engineering University, Zhengzhou 450001, China

Unit 32020 of the Chinese People's Liberation Army, Wuhan 430000, China


2024

Acta Photonica Sinica
Chinese Optical Society; Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences


Indexed by: CSTPCD; Peking University Core Journals
Impact factor: 0.948
ISSN: 1004-4213
Year, Volume (Issue): 2024, 53(10)