
Underwater image enhancement based on prior knowledge of underwater scenes

In view of the problems of image blur, low contrast, and color distortion caused by light absorption and scattering in water, an underwater image enhancement method based on prior knowledge of underwater scenes is proposed. First, using prior knowledge of underwater scenes and combining the physical model of underwater imaging with the optical characteristics of underwater scenes, ten predefined attenuation coefficients are used to synthesize underwater image datasets covering different types and levels of degradation. Then, a lightweight convolutional neural network (CNN) model built on initial residual and dense concatenation structures is designed to enhance underwater images; the lightweight network structure, combined with effective training data, reduces the computational cost of the enhancement model while effectively improving the visual quality of degraded underwater images. Finally, a normalization post-processing step is applied to further improve the enhancement result. Simulation results show that the proposed method is feasible and effective, and that it can be applied to different real underwater scenes with strong robustness.
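The synthesis step is based on the simplified underwater imaging model I_c(x) = J_c(x) t_c(x) + B_c (1 - t_c(x)), with transmission t_c(x) = exp(-beta_c d(x)). The sketch below is a minimal illustration of this model; the ten attenuation-coefficient sets used in the paper are not listed on this page, so the BETA values, the background light, and the function names are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

# Hypothetical per-channel attenuation coefficients (R, G, B) for a few water
# types; the paper uses ten predefined sets, which are not reproduced here.
BETA = {
    "clear":   np.array([0.30, 0.05, 0.04]),
    "coastal": np.array([0.50, 0.15, 0.12]),
    "turbid":  np.array([0.90, 0.45, 0.40]),
}

def synthesize_underwater(clean, depth, beta, background=(0.05, 0.45, 0.55)):
    """Degrade a clean RGB image with the simplified underwater imaging model.

    clean:      H x W x 3 float array in [0, 1] (in-air image J)
    depth:      H x W float array of scene depth d(x) in metres
    beta:       per-channel attenuation coefficients (beta_R, beta_G, beta_B)
    background: homogeneous veiling light B_c for each channel (assumed values)
    """
    t = np.exp(-depth[..., None] * beta[None, None, :])   # transmission t_c(x)
    b = np.asarray(background)[None, None, :]              # background light B_c
    return clean * t + b * (1.0 - t)                       # I = J t + B (1 - t)

if __name__ == "__main__":
    # Example: degrade a random "clean" image with a linear depth ramp.
    rng = np.random.default_rng(0)
    clean = rng.uniform(0.0, 1.0, size=(240, 320, 3))
    depth = np.tile(np.linspace(1.0, 10.0, 320), (240, 1))
    degraded = synthesize_underwater(clean, depth, BETA["coastal"])
    print(degraded.shape, degraded.min(), degraded.max())
```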
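The enhancement network is described only at the level of "initial residual" and dense concatenation. The following PyTorch sketch shows one plausible lightweight block of that kind; the channel widths, block counts, and exact wiring are illustrative assumptions and are not taken from the paper.

```python
import torch
import torch.nn as nn

class ResidualDenseBlock(nn.Module):
    """Toy block: dense concatenation of conv features plus a residual skip."""

    def __init__(self, channels=16, growth=16, layers=3):
        super().__init__()
        self.convs = nn.ModuleList()
        in_ch = channels
        for _ in range(layers):
            self.convs.append(nn.Sequential(
                nn.Conv2d(in_ch, growth, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
            ))
            in_ch += growth                                  # dense concatenation widens the input
        self.fuse = nn.Conv2d(in_ch, channels, kernel_size=1)  # 1x1 fusion back to `channels`

    def forward(self, x):
        feats = [x]
        for conv in self.convs:
            feats.append(conv(torch.cat(feats, dim=1)))
        return x + self.fuse(torch.cat(feats, dim=1))        # residual connection

class LightweightEnhancer(nn.Module):
    """Head conv -> a few residual dense blocks -> tail conv predicting enhanced RGB."""

    def __init__(self, blocks=3, channels=16):
        super().__init__()
        self.head = nn.Conv2d(3, channels, kernel_size=3, padding=1)
        self.body = nn.Sequential(*[ResidualDenseBlock(channels) for _ in range(blocks)])
        self.tail = nn.Conv2d(channels, 3, kernel_size=3, padding=1)

    def forward(self, x):
        return self.tail(self.body(self.head(x)))

if __name__ == "__main__":
    # Example forward pass on a dummy degraded image batch.
    net = LightweightEnhancer()
    out = net(torch.rand(1, 3, 240, 320))
    print(out.shape)  # torch.Size([1, 3, 240, 320])
```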
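The normalization post-processing is likewise not specified in detail on this page; a simple per-channel min-max stretch is one common realization and is sketched below as an assumption, not the paper's exact procedure.

```python
import numpy as np

def minmax_normalize(img, eps=1e-6):
    """Per-channel min-max normalization of an H x W x 3 image into [0, 1].

    One plausible realization of the normalization post-processing mentioned
    in the abstract; the paper's exact formulation may differ.
    """
    lo = img.min(axis=(0, 1), keepdims=True)
    hi = img.max(axis=(0, 1), keepdims=True)
    return (img - lo) / (hi - lo + eps)
```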

deep learning; convolutional neural network; underwater scene prior; underwater image synthesis; underwater image enhancement; initial residual; normalization process; structural similarity loss

Chen Xin, Qian Xu, Zhou Jiajia, Wu Yang


The Second Military Representative Office in Harbin, Shenyang Bureau of Naval Equipment Department, Harbin 150001, Heilongjiang, China

College of Intelligent Systems Science and Engineering, Harbin Engineering University, Harbin 150001, Heilongjiang, China


National Natural Science Foundation of China (51609048, 51909044, 52071108)

Journal: Applied Science and Technology (应用科技), Harbin Engineering University
Indexed in: CSTPCD
Impact factor: 0.693
ISSN: 1009-671X
Year, Volume (Issue): 2024, 51(2)