
Design of a lightweight pupil segmentation algorithm based on improved Mobile-UNet

Eccentric photography vision screening devices are an important means of rapid refractive-state detection, and pupil image segmentation is a key component of their imaging algorithms. To address the limited computing resources of embedded devices and the low accuracy of pupil segmentation, a lightweight pupil image segmentation algorithm based on an improved Mobile-UNet is proposed. The algorithm improves on U-Net, first lightening the network with inverted residual linear bottleneck modules. Grouped convolution reduces the parameter count, channel shuffle opens information paths between the groups, and a parallel attention mechanism with adaptive parameter fusion is introduced to improve segmentation performance. In addition, the loss function is optimized to strengthen attention to pupil boundaries. Experimental results show that, compared with MobileNetV2, the model's parameter count is reduced by 90% while its floating-point operations increase by 19%, with significantly better segmentation performance; compared with U-Net, model complexity is greatly reduced and segmentation performance improves. Compared with other algorithms, the model shows advantages in both complexity and segmentation performance, achieving lightweight and efficient segmentation.
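To make the grouped-convolution and channel-shuffle step concrete, here is a minimal NumPy sketch; the paper's actual implementation is not given, so the function names, group count, and channel widths are illustrative assumptions. Splitting a convolution into g groups divides its weight count by g, and the shuffle interleaves channels so that later layers see information from every group:

```python
import numpy as np

def conv_weight_count(c_in, c_out, k=3, groups=1):
    # A k x k convolution with g groups holds (c_in / g) * c_out * k * k
    # weights (bias omitted), so grouping divides the parameter count by g.
    return (c_in // groups) * c_out * k * k

def channel_shuffle(x, groups):
    # Reorder channels so each group mixes channels from all input groups:
    # (N, C, H, W) -> split C into (groups, C // groups), transpose, flatten.
    n, c, h, w = x.shape
    return (x.reshape(n, groups, c // groups, h, w)
             .transpose(0, 2, 1, 3, 4)
             .reshape(n, c, h, w))

# Grouping a 3x3, 64 -> 64 convolution into 4 groups cuts weights 4x.
print(conv_weight_count(64, 64, groups=1))  # 36864
print(conv_weight_count(64, 64, groups=4))  # 9216

# With 2 groups, channels [0..7] interleave as [0, 4, 1, 5, 2, 6, 3, 7].
x = np.arange(8).reshape(1, 8, 1, 1)
print(channel_shuffle(x, 2).flatten().tolist())
```

Without the shuffle, each group's outputs would depend only on its own slice of input channels; the interleaving restores cross-group information flow at no parameter cost.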
Design of lightweight pupil segmentation algorithm based on improved Mobile-UNet
Eccentric photography vision screening equipment is an important means of rapid detection of refractive state, and pupil image segmentation is an important part of its imaging algorithm. Aiming at the problems of limited computing resources of embedded devices and low pupil segmentation accuracy, a lightweight pupil image segmentation algorithm based on an improved Mobile-UNet is proposed. The algorithm improves on U-Net and is preliminarily lightened by using inverted residual linear bottleneck modules. Grouped convolution is used to reduce parameters, channel shuffle is used to open channels between groups, and a parallel attention mechanism with adaptive parameter fusion is introduced to improve segmentation performance. In addition, the loss function is optimized to enhance attention to the boundary. The experimental results show that, compared with MobileNetV2, the number of model parameters is reduced by 90% and the number of floating-point operations increases by 19%, while segmentation performance is significantly improved; compared with U-Net, the complexity of the model is greatly reduced and segmentation performance is improved. Compared with other algorithms, this model has advantages in both complexity and segmentation performance, achieving lightweight and efficient segmentation.
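The inverted residual linear bottleneck named in the abstract is the MobileNetV2-style expand / depthwise / project pattern, which keeps the expensive 3x3 convolution depthwise. A rough weight-count sketch, assuming an expansion factor of 6 as in MobileNetV2 (the paper's exact configuration is not stated here), shows why the block is light:

```python
def conv_params(c_in, c_out, k=3, groups=1):
    # Weight count of a k x k convolution with `groups` groups (bias omitted).
    return (c_in // groups) * c_out * k * k

def inverted_residual_params(c, expand=6):
    # MobileNetV2-style block: 1x1 expand -> 3x3 depthwise -> 1x1 linear project.
    hidden = c * expand
    return (conv_params(c, hidden, k=1)                      # pointwise expansion
            + conv_params(hidden, hidden, k=3, groups=hidden)  # depthwise 3x3
            + conv_params(hidden, c, k=1))                   # linear projection

# At 64 channels, the expanded width is 384. A plain 3x3 convolution at that
# width would need ~1.33M weights; the depthwise 3x3 inside the block needs
# only 3456, so the whole block stays around 53K weights.
print(conv_params(384, 384, k=3))              # 1327104
print(conv_params(384, 384, k=3, groups=384))  # 3456
print(inverted_residual_params(64))            # 52608
```

The bottleneck thus does its spatial filtering cheaply at high width and spends most of its budget on the two 1x1 convolutions, which is what makes it a natural first step in lightening a U-Net encoder.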

pupil image; embedded devices; lightweight network; deep learning; image segmentation

Hu Qiaowei, Tan Hong, Liu Xinjuan, Hu Nan, Fang Erxi


School of Electronic and Information Engineering, Soochow University, Suzhou 215000, Jiangsu, China

Engineering Training Center, Soochow University, Suzhou 215000, Jiangsu, China


National Key R&D Program of China; Natural Science Foundation of Jiangsu Province

SQ2022YFB3200085; BK20181431

2024

Laser Journal
Chongqing Institute of Optics and Mechanics


Indexed in: CSTPCD; Chinese Core Journals (Peking University)
Impact factor: 0.74
ISSN:0253-2743
Year, volume (issue): 2024, 45(1)