Retinal vessel segmentation based on enhanced feature extraction
Sun Guodong 1, Yan Fengting 1, Shi Zhicai 2
Author information
- 1. Shanghai University of Engineering Science, Shanghai 201620, China
- 2. Shanghai Key Laboratory of Integrated Administration Technologies for Information Security, Shanghai 200240, China
Abstract
The segmentation accuracy of retinal vessels has an important impact on the early diagnosis of ophthalmic diseases and diabetes. To address the poor segmentation performance of existing methods in microvascular and lesion regions, this paper proposes a segmentation model with enhanced extraction of vessel features. The model introduces a multi-scale feature extraction residual module (MFE-residual) and a multi-level residual dilated convolution layer in the encoder to expand the receptive field, learn multi-level image features, and improve the model's utilization of vessel information; a lightweight attention mechanism and a multi-channel attention module are incorporated at the downsampling and skip-connection stages, respectively, to improve the model's recognition of vessels and reduce the likelihood of mis-segmentation. Experiments are conducted on two publicly available datasets, DRIVE and STARE, to verify the segmentation capability of the improved model. The results show accuracies of 0.9652 and 0.9715 and sensitivities of 0.8205 and 0.8256 on the two datasets, respectively, giving the model an advantage in segmentation performance over other algorithms.
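The dilated (atrous) convolutions mentioned in the abstract enlarge the receptive field without adding parameters: a kernel of size k with dilation rate r spans (k − 1)·r + 1 input positions while still using only k weights. A minimal 1-D NumPy sketch (function and variable names are illustrative, not from the paper's implementation):

```python
import numpy as np

def dilated_conv1d(x, kernel, rate):
    """1-D dilated convolution with valid padding: taps are `rate` apart."""
    k = len(kernel)
    span = (k - 1) * rate + 1          # effective receptive field of one layer
    out_len = len(x) - span + 1
    return np.array([
        sum(kernel[j] * x[i + j * rate] for j in range(k))
        for i in range(out_len)
    ])

x = np.arange(10, dtype=float)          # toy 1-D "feature map"
kernel = np.array([1.0, 1.0, 1.0])      # 3 weights in both cases

plain = dilated_conv1d(x, kernel, rate=1)    # receptive field 3
dilated = dilated_conv1d(x, kernel, rate=2)  # receptive field 5, same 3 weights
```

Stacking such layers with growing rates (as in multi-level residual dilated convolution stacks) compounds the receptive field, which is what lets the encoder capture both fine microvessels and wider context.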
Keywords
image processing / retinal vessel segmentation / U-Net / convolutional neural network (CNN) / multi-scale feature extraction / multi-channel attention module
Funding
New Generation Artificial Intelligence Major Project (2020AAA0109300)
Open Research Project of Shanghai Key Laboratory of Integrated Administration Technologies for Information Security (AGK2019004)
Publication year
2024