
Polarization Information Restoration of Underwater Images Based on Deep Neural Network

Polarization information such as the degree of polarization and the angle of polarization reflects the polarization characteristics of a target, and targets can be distinguished by differences in their polarization states; polarization information therefore has important applications in fields such as underwater detection and recognition. Traditional underwater polarization imaging methods can suppress scattering in underwater images and restore the intensity image, but they neglect the restoration of polarization information, causing it to be lost. We therefore propose an underwater-image polarization information restoration neural network based on the channel attention mechanism, which extends the capability of underwater polarization imaging by restoring the degree-of-polarization and angle-of-polarization information. A content and style loss function based on polarization parameters is also designed to further improve the restoration. Experimental results show that the network effectively suppresses scattered light in underwater images while recovering the degree of linear polarization and the angle of polarization well. The proposed method is of significance for the further application of polarization imaging in turbid water.
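The degree of linear polarization (DoLP) and angle of polarization (AoP) that the method restores are standard quantities derived from the Stokes parameters. As a minimal NumPy sketch (the standard four-angle formulation; the paper's actual acquisition scheme is not specified here and may differ):

```python
import numpy as np

def stokes_from_intensities(i0, i45, i90, i135):
    """First three Stokes parameters from intensity images captured
    behind a linear polarizer at 0, 45, 90, and 135 degrees."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # horizontal vs. vertical component
    s2 = i45 - i135                      # +45 vs. -45 diagonal component
    return s0, s1, s2

def dolp_aop(s0, s1, s2, eps=1e-8):
    """Degree of linear polarization and angle of polarization."""
    dolp = np.sqrt(s1**2 + s2**2) / (s0 + eps)
    aop = 0.5 * np.arctan2(s2, s1)       # radians, quadrant-aware
    return dolp, aop

# Fully horizontally polarized light: DoLP -> 1, AoP -> 0
s0, s1, s2 = stokes_from_intensities(np.array([1.0]), np.array([0.5]),
                                     np.array([0.0]), np.array([0.5]))
dolp, aop = dolp_aop(s0, s1, s2)
```

These are the two maps the network is trained to recover alongside the intensity image S0.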
Polarization Information Restoration of Underwater Images Based on Deep Neural Network
Objective  Underwater imaging is an important means of exploring oceans, lakes, and other underwater environments, and it matters for many fields such as coastal defense, ocean exploration, underwater rescue, and aquaculture. However, real water bodies contain many suspended particles that scatter and absorb the signal light from the target, so images obtained by underwater imaging often suffer from quality degradation such as severe contrast reduction and loss of detail. Because the signal light and the backscattered light differ in their polarization characteristics, polarization imaging technology has been introduced into underwater imaging: polarization information is used to suppress scattered light and enhance signal light, compensating for detection performance otherwise restricted by the environment. Although existing polarization-based descattering methods for underwater imaging can enhance image contrast and improve image quality, they focus only on intensity information and ignore polarization information restoration, so the polarization information is lost. In fact, polarization information such as the degree of linear polarization (DoLP) and the angle of polarization (AoP) reflects the polarization characteristics of the target; it can be used to distinguish targets by their different polarization states and has important applications in underwater detection and recognition. We therefore propose a neural network based on the channel attention mechanism that extends the function of underwater polarization imaging by restoring polarization information.

Methods  The proposed method uses a convolutional neural network to restore polarization information. The network consists of three parts: a shallow feature extraction module (SFE), a series of residual dense blocks (RDBs), and a channel-attention-based global feature fusion module (CAGFF). Specifically, the SFE employs a U-Net and two convolutional layers to extract shallow features containing the polarization features of the input images. These shallow features, rich in polarization information, are then fed into a series of RDBs, which are built mainly from densely and residually connected convolutional layers. The RDB outputs are passed to the CAGFF, which consists of a channel attention module and a convolutional layer. Finally, a polarization-informed content and style loss (CSL) is designed to train the network; it uses intensity images to compute the content loss and polarization information to compute the style loss.

Results and Discussions  Polarization imaging experiments on different objects in underwater environments show that our method can restore polarization information and improve underwater imaging quality. In the ablation experiments, the four contributions of the network are removed step by step to verify the effectiveness of each. Visually, the network with all improvements accurately recovers both the intensity image and the polarization information (Fig. 5); removing any component degrades the restored results and blurs details. Quantitatively, our network yields the highest peak signal-to-noise ratio (PSNR) among the compared network structures (Table 2). Compared with other representative underwater image enhancement methods, the proposed method performs best on intensity, DoLP, and AoP images (Fig. 6), and the PSNRs of two different scenes show its advantage in reconstructing DoLP and AoP images. In addition, several groups of experiments on different parameters are conducted to verify their influence. First, the number of channel attention modules is determined by comparing network performance for different values (Fig. 7). Then, a comparative experiment on the RDB parameters shows that their values are positively correlated with network performance (Fig. 8). Finally, the weight of the loss function is fully discussed (Fig. 9).

Conclusions  Aiming to remove scattered light and restore polarization information, we propose a neural network based on the channel attention mechanism. Built on the residual dense network, our network makes four main contributions: polarization input, the SFE with a U-Net, the CAGFF, and the polarization-informed CSL function. With these contributions, the network can efficiently extract and exploit polarization features at different levels to restore polarization information. We also build a dataset of underwater polarimetric images, in which the input images and ground-truth images are captured in turbid water and clear water, respectively. Based on this dataset, we carry out a series of experiments on different objects in underwater environments. The ablation results show that each of our contributions is effective and that removing any one of them degrades the results. Compared with other underwater polarization imaging methods, our method suppresses the influence of scattered light on polarization imaging and significantly improves image contrast and clarity. In particular, it successfully restores the DoLP and AoP, extending the function of underwater polarization imaging.
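The abstract names a channel attention module inside the CAGFF but does not detail it. A generic squeeze-and-excitation style channel attention, written as a NumPy sketch with assumed layer sizes (the paper's actual module may differ), works as follows: pool each channel to a scalar, pass through a small bottleneck, and gate the channels with sigmoid weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def channel_attention(feat, w1, b1, w2, b2):
    """Squeeze-and-excitation style channel attention.
    feat: (C, H, W) feature map; w1/b1, w2/b2: bottleneck FC layers."""
    squeeze = feat.mean(axis=(1, 2))                     # global average pool -> (C,)
    hidden = np.maximum(0.0, w1 @ squeeze + b1)          # ReLU bottleneck
    excite = 1.0 / (1.0 + np.exp(-(w2 @ hidden + b2)))   # sigmoid gates in (0, 1)
    return feat * excite[:, None, None]                  # rescale each channel

C, r = 8, 2                                              # channels, reduction ratio (assumed)
w1 = rng.standard_normal((C // r, C)); b1 = np.zeros(C // r)
w2 = rng.standard_normal((C, C // r)); b2 = np.zeros(C)
feat = rng.standard_normal((C, 4, 4))
out = channel_attention(feat, w1, b1, w2, b2)
```

Because the gates lie in (0, 1), the module can only attenuate channels, letting the fusion stage emphasize the feature channels that carry the most polarization information.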
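The polarization-informed content and style loss (CSL) is only summarized above: the content term is computed on intensity images and the style term on polarization information. One plausible reading, sketched in NumPy, pairs a pixel-wise MSE content term with a Gram-matrix style term; the paper's exact formulation and weights are not given here, so `alpha` and `beta` are assumed values.

```python
import numpy as np

def content_loss(pred_intensity, gt_intensity):
    """Mean-squared error between restored and ground-truth intensity images."""
    return np.mean((pred_intensity - gt_intensity) ** 2)

def gram(features):
    """Gram matrix of a (C, H, W) feature map, normalized by spatial size."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (h * w)

def style_loss(pred_polar_feat, gt_polar_feat):
    """Style term: match second-order statistics of polarization features."""
    return np.mean((gram(pred_polar_feat) - gram(gt_polar_feat)) ** 2)

def csl(pred_i, gt_i, pred_p, gt_p, alpha=1.0, beta=0.1):
    """Combined content-and-style loss; alpha/beta are assumed weights."""
    return alpha * content_loss(pred_i, gt_i) + beta * style_loss(pred_p, gt_p)
```

The Gram matrix discards spatial layout and keeps channel correlations, which fits the abstract's use of a style term for polarization statistics rather than exact pixel values.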
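The quantitative comparisons above are reported as PSNR. For reference, the standard definition used for images scaled to [0, max_val]:

```python
import numpy as np

def psnr(pred, gt, max_val=1.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((pred - gt) ** 2)
    if mse == 0:
        return float("inf")                  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

gt = np.zeros((8, 8))
noisy = gt + 0.1                             # uniform error 0.1 -> MSE 0.01 -> 20 dB
value = psnr(noisy, gt)
```

Higher PSNR means the restored intensity, DoLP, or AoP map is closer to its clear-water ground truth.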

polarimetric imaging; underwater imaging; polarization information restoration; deep learning; attention mechanism

Liu Hedong, Han Yilin, Li Xiaobo, Cheng Zhenzhou, Liu Tiegen, Zhai Jingsheng, Hu Haofeng


School of Precision Instruments and Optoelectronics Engineering, Tianjin University, Tianjin 300072, China

School of Marine Science and Technology, Tianjin University, Tianjin 300072, China


2024

Acta Optica Sinica
Chinese Optical Society; Shanghai Institute of Optics and Fine Mechanics, Chinese Academy of Sciences


Indexed in: CSTPCD; PKU Core Journals
Impact factor: 1.931
ISSN:0253-2239
Year, Volume (Issue): 2024, 44(12)