
Multi-classification of Retinal OCT Images Based on an Improved MobileNetV2 Neural Network

Optical coherence tomography (OCT) is a non-contact, high-resolution ophthalmic imaging technique that has become an important reference for the clinical diagnosis of eye diseases. Because early detection and clinical diagnosis of retinopathy are crucial, the time-consuming and laborious manual classification of these diseases needs to be improved upon. To this end, this study proposes a multi-classification recognition method for retinal OCT images based on an improved MobileNetV2 neural network. The method uses feature fusion to process the images and adds an attention mechanism to the network model, and together these two changes greatly improve the classification accuracy of OCT images. Compared with the original algorithm, the classification performance is significantly better: the accuracy, recall, precision, and F1 score of the proposed model reach 98.3%, 98.44%, 98.94%, and 98.69%, respectively, exceeding the accuracy of manual classification. Such a method not only speeds up the diagnostic process, reduces the burden on doctors, and improves diagnostic quality in practice, but also provides a new direction for ophthalmic medical research.
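To make the described architecture concrete, the following is a minimal PyTorch sketch, not the authors' exact model: it attaches a squeeze-and-excitation style channel-attention block to the MobileNetV2 feature extractor and adds a small linear classification head. The specific attention design, the feature-fusion preprocessing, and the class count (assumed here to be 4, e.g. CNV/DME/drusen/normal) are illustrative assumptions rather than details taken from the paper.

```python
# Minimal sketch (assumptions noted above; requires torchvision >= 0.13).
import torch
import torch.nn as nn
from torchvision import models


class SEAttention(nn.Module):
    """Channel attention: squeeze (global pooling) then excitation (two FC layers)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # re-weight feature channels


class OCTClassifier(nn.Module):
    """MobileNetV2 features -> channel attention -> global pooling -> linear head."""
    def __init__(self, num_classes: int = 4, pretrained: bool = True):
        super().__init__()
        backbone = models.mobilenet_v2(weights="DEFAULT" if pretrained else None)
        self.features = backbone.features          # output: (B, 1280, H/32, W/32)
        self.attention = SEAttention(1280)          # illustrative attention placement
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Dropout(0.2),
            nn.Linear(1280, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.attention(self.features(x)))


if __name__ == "__main__":
    model = OCTClassifier(num_classes=4, pretrained=False)
    logits = model(torch.randn(1, 3, 224, 224))  # OCT B-scan resized to 224x224, 3 channels
    print(logits.shape)  # torch.Size([1, 4])
```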

Keywords: retina; optical coherence tomography (OCT); attention mechanism; feature fusion; image classification

姚娟、乔焕、方玲玲


School of Computer and Artificial Intelligence, Liaoning Normal University, Dalian 116081, China


Funding: Natural Science Foundation of Liaoning Province (2021-MS-272); Department of Education of Liaoning Province project (LJKQZ2021088)

2024

Computer Systems & Applications (计算机系统应用)
Institute of Software, Chinese Academy of Sciences

CSTPCD
Impact factor: 0.449
ISSN: 1003-3254
Year, volume (issue): 2024, 33(5)