Depth Estimation of Gastrointestinal Endoscopy Images Using Improved Attention
To address blurred key image information and poor adaptability in gastrointestinal endoscopy diagnosis and treatment systems, this study proposes a cycle-consistent generative adversarial network (CycleGAN) combined with an improved attention mechanism to accurately estimate the depth information of the digestive tract. Built on CycleGAN, the network combines a dual attention mechanism and introduces a residual gating mechanism and a non-local module to more comprehensively capture the feature structure and global correlations of the input data, thereby improving the quality and adaptability of the generated depth images. Meanwhile, a dual-scale feature fusion network is employed as the discriminator to improve its discrimination ability and balance its performance against the generator. Experimental results show that the proposed method yields good prediction performance in gastrointestinal endoscopy scenes: compared with other unsupervised methods, its average accuracy on the stomach, small intestine, and colon datasets improves by 7.39%, 10.17%, and 10.27%, respectively. In addition, it accurately estimates relative depth and provides precise boundary information on a laboratory human gastric organ model.
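The abstract names two architectural ingredients: a non-local module with residual gating inside the CycleGAN generator, and a dual-scale feature fusion discriminator. The paper's actual implementation is not given here, so the following is only a minimal PyTorch sketch of what such a gated non-local block could look like; all class and parameter names (NonLocalBlock, gate, inter) are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class NonLocalBlock(nn.Module):
    # Hypothetical sketch: a non-local (self-attention) block whose output
    # is blended back into the input through a learnable residual gate,
    # in the spirit of the residual gating + non-local module described
    # in the abstract.
    def __init__(self, channels: int):
        super().__init__()
        inter = max(channels // 2, 1)                            # reduced embedding width
        self.theta = nn.Conv2d(channels, inter, kernel_size=1)   # query projection
        self.phi = nn.Conv2d(channels, inter, kernel_size=1)     # key projection
        self.g = nn.Conv2d(channels, inter, kernel_size=1)       # value projection
        self.out = nn.Conv2d(inter, channels, kernel_size=1)
        # Residual gate: starts at 0 so the block initially acts as identity
        # and gradually learns how much global context to inject.
        self.gate = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q = self.theta(x).flatten(2).transpose(1, 2)   # (B, HW, C')
        k = self.phi(x).flatten(2)                     # (B, C', HW)
        v = self.g(x).flatten(2).transpose(1, 2)       # (B, HW, C')
        attn = torch.softmax(q @ k / (k.shape[1] ** 0.5), dim=-1)  # (B, HW, HW)
        ctx = (attn @ v).transpose(1, 2).reshape(b, -1, h, w)      # global context
        return x + self.gate * self.out(ctx)           # gated residual fusion
```

Similarly, the dual-scale feature fusion discriminator could be sketched as two PatchGAN-style branches, one on the full-resolution depth map and one on a downsampled copy, with their patch-score maps fused; again, every name below (DualScaleDiscriminator, branch widths) is an assumption rather than the paper's architecture.

```python
import torch.nn.functional as F

class DualScaleDiscriminator(nn.Module):
    # Hypothetical sketch: judge the depth map at full and half resolution,
    # then average the patch-level scores so both coarse structure and
    # fine boundary detail influence the real/fake decision.
    def __init__(self, in_ch: int = 1, base: int = 64):
        super().__init__()
        def branch() -> nn.Sequential:
            return nn.Sequential(
                nn.Conv2d(in_ch, base, 4, stride=2, padding=1),
                nn.LeakyReLU(0.2, inplace=True),
                nn.Conv2d(base, base * 2, 4, stride=2, padding=1),
                nn.InstanceNorm2d(base * 2),
                nn.LeakyReLU(0.2, inplace=True),
                nn.Conv2d(base * 2, 1, 4, stride=1, padding=1),  # patch scores
            )
        self.fine, self.coarse = branch(), branch()

    def forward(self, depth: torch.Tensor) -> torch.Tensor:
        fine = self.fine(depth)
        coarse = self.coarse(F.avg_pool2d(depth, 2))
        coarse = F.interpolate(coarse, size=fine.shape[2:], mode="bilinear",
                               align_corners=False)
        return 0.5 * (fine + coarse)   # dual-scale score fusion
```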

gastrointestinal endoscopy image; depth estimation; generative adversarial network (GAN); attention mechanism; dual-scale feature

Lin Feifan, Li Ling, Xu Qiang


School of Physics and Optoelectronic Engineering, Xiangtan University, Xiangtan 411105, China

Suzhou Zhongke Huaying Health Technology Co., Ltd., Suzhou 215123, China


National Key Research and Development Program of China (2020YFC2003802); Suzhou Science and Technology Program (SYC2022109)


Computer Systems & Applications
Institute of Software, Chinese Academy of Sciences

CSTPCD
Impact factor: 0.449
ISSN: 1003-3254
Year, Volume (Issue): 2024, 33(1)