
SwinT-Unet: 基于双通道自注意力机制的超声图像分割方法
SwinT-Unet: Ultrasound Image Segmentation Based on a Two-Channel Self-Attention Mechanism
Ultrasound image segmentation plays a key role in disease diagnosis and treatment, but accurately segmenting the regions of interest is still a challenging task due to the low contrast, noise interference, and variability in shape, size, and location of the lesions in ultrasound images. To address this problem, we propose a dual-channel self-attention mechanism U-shaped network (SwinT-Unet), which utilizes Swin-Transformer and Unet encoders to simultaneously extract features. To effectively fuse the different-level features extracted by the Swin-Transformer and Unet encoders, we also propose a Gated Dual-layer Feature Fusion (GDFF) module, which achieves effective fusion of global and local features through a gating mechanism, thereby improving the accuracy and robustness of the segmentation results. We conduct experiments on two different ultrasound image datasets, and the results show that our proposed model outperforms existing convolutional neural network and Transformer-based models in terms of segmentation accuracy and robustness. Our paper provides a new method for ultrasound image segmentation, and offers more accurate and reliable support for clinical medical diagnosis and treatment.
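The abstract states only that GDFF blends the global (Swin-Transformer) and local (Unet) features "through a gating mechanism" without giving its exact form. A common formulation of such a gate computes a per-element weight from the concatenated features and takes a convex combination; the sketch below illustrates that idea in plain NumPy. The 1×1-convolution-as-matrix-multiply, the sigmoid gate, and the weight shapes are assumptions for illustration, not details taken from the paper:

```python
import numpy as np

def sigmoid(x):
    # Element-wise logistic function mapping values into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(f_global, f_local, w, b):
    """Blend global (Swin-Transformer) and local (Unet) feature maps with
    a learned gate.  Shapes: features (C, H, W); w (C, 2C); b (C,).
    Illustrative sketch only -- the paper's GDFF internals are not given
    in the abstract, so this exact form is an assumption."""
    concat = np.concatenate([f_global, f_local], axis=0)      # (2C, H, W)
    # A 1x1 "convolution" over channels, written as an einsum matmul.
    z = np.einsum('cd,dhw->chw', w, concat) + b[:, None, None]
    g = sigmoid(z)                                            # gate in (0, 1)
    # Convex combination: each output element lies between the two inputs.
    return g * f_global + (1.0 - g) * f_local

# Hypothetical usage with random features and weights:
rng = np.random.default_rng(0)
C, H, W = 4, 8, 8
fg = rng.standard_normal((C, H, W))       # stand-in for Swin features
fl = rng.standard_normal((C, H, W))       # stand-in for Unet features
w = 0.1 * rng.standard_normal((C, 2 * C))
b = np.zeros(C)
fused = gated_fusion(fg, fl, w, b)        # (C, H, W)
```

Because the gate output is strictly between 0 and 1, the fused map is a per-element interpolation of the two encoder streams, which is what lets the module trade off global context against local detail at each spatial position.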

Keywords: ultrasound image; Unet; Swin-Transformer; image segmentation; medical image

SONG Yantao, LU Yunli (宋艳涛, 路云里)

Institute of Big Data Science and Industry, Shanxi University, Taiyuan 030006, Shanxi, China

School of Computer and Information Technology, Shanxi University, Taiyuan 030006, Shanxi, China


2024

Acta Electronica Sinica (电子学报)
Chinese Institute of Electronics (中国电子学会)

Indexing: CSTPCD; Peking University Core Journal (北大核心)
Impact factor: 1.237
ISSN: 0372-2112
Year, volume (issue): 2024, 52(11)