
Infrared Image Enhancement Algorithm Based on Adaptive Multi-Feature Fusion

Aiming at the problems of unclear texture, low brightness, and high noise in infrared images, an adaptive multi-feature fusion algorithm for infrared image enhancement is proposed. First, an automatic linear mapping method is used to extract the effective features of 14-bit infrared images and map them to 16-bit images, improving their visualization. Second, a combination of the Generalized Unsharp Masking (GUM) algorithm and the Multi-Scale Retinex with Color Restoration (MSRCR) enhancement algorithm is introduced to obtain effective information at different image scales and improve image contrast. Finally, an adaptive weight map is designed and, combined with the characteristics of the image pyramid structure, used to fuse the complementary information of different feature layers, which raises image brightness and enriches texture detail. Experimental results show that the algorithm effectively improves the contrast and visual quality of infrared images; compared with several existing algorithms, the average gradient (AG) increases by about 0.6%, the peak signal-to-noise ratio (PSNR) by about 10%, the edge-information efficiency by about 11%, and the image sharpness by about 10%.
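The three-stage pipeline summarized in the abstract can be sketched roughly as follows. This is a minimal illustration only: the function names, the min-max stretch, the 3x3 box-blur low-pass filter, and the pixel-wise weighted fusion are all simplifying assumptions, not the paper's GUM/MSRCR formulation or its pyramid construction.

```python
import numpy as np

def linear_map_14_to_16(raw):
    """Hypothetical automatic linear mapping: stretch the occupied
    range of a raw 14-bit infrared frame onto the full 16-bit range."""
    lo, hi = raw.min(), raw.max()
    scale = 65535.0 / max(hi - lo, 1)
    return ((raw - lo) * scale).astype(np.uint16)

def unsharp_mask(img, amount=1.5):
    """Simplified unsharp mask (not the paper's generalized GUM):
    boost the difference between the image and a 3x3 box blur."""
    f = img.astype(np.float64)
    pad = np.pad(f, 1, mode='edge')
    blur = sum(pad[i:i + f.shape[0], j:j + f.shape[1]]
               for i in range(3) for j in range(3)) / 9.0
    return f + amount * (f - blur)

def fuse(layers, weights):
    """Pixel-wise weighted fusion of feature layers under normalized
    adaptive weight maps (pyramid decomposition omitted for brevity)."""
    w = np.stack(weights)
    w = w / np.clip(w.sum(axis=0), 1e-6, None)
    return (np.stack(layers) * w).sum(axis=0)
```

In the actual algorithm the fusion step would operate level by level on Laplacian pyramids of each feature layer, with the weight maps downsampled to match each level.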

Keywords: feature extraction, weight map, pyramid, multi-scale fusion
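The abstract quotes gains in average gradient (AG) and PSNR. Minimal reference implementations of these two metrics, using their standard definitions rather than anything specific to the paper:

```python
import numpy as np

def average_gradient(img):
    """Average gradient (AG): mean magnitude of horizontal/vertical
    intensity differences; a higher AG indicates sharper detail."""
    f = img.astype(np.float64)
    gx = np.diff(f, axis=1)[:-1, :]   # trim to a common shape
    gy = np.diff(f, axis=0)[:, :-1]
    return np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0))

def psnr(ref, test, peak=65535.0):
    """Peak signal-to-noise ratio in dB, defaulting to a 16-bit peak."""
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    return float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```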

Di Ruohai, Wan Lele, Li Liangliang, Sun Mengyu, Li Xiaoyan, Wang Peng


School of Electronic Information Engineering, Xi'an Technological University, Xi'an 710021, Shaanxi, China

School of Mechanical Engineering, Xi'an Technological University, Xi'an 710021, Shaanxi, China

School of Optoelectronic Engineering, Xi'an Technological University, Xi'an 710021, Shaanxi, China


Funding: National Natural Science Foundation of China; Key R&D Program of the Shaanxi Provincial Department of Science and Technology; National Key R&D Program of China; 2022 Shaanxi Universities Youth Innovation Team Project; Shandong Provincial Key Laboratory of Smart Transportation (in preparation) Project; 2023 Shaanxi Provincial University Engineering Research Center Project

Grant Nos.: 62171360; 2022GY-110; 2022YFF0604900

2024

Journal: Infrared (红外)
Publisher: Shanghai Institute of Technical Physics, Chinese Academy of Sciences

Impact factor: 0.317
ISSN: 1672-8785
Year, volume (issue): 2024, 45(7)