
EATNet: An extensive attention-based approach for cervical precancerous lesions diagnosis in histopathological images

© 2024

Grading of cervical precancerous lesions is an important prerequisite for determining their treatment plan. However, lesion classification is difficult for several reasons: whole slide histopathological images (WSIs) are enormous while the regions of interest are small, pixel-level annotation data are scarce, and lesion diagnosis is subjective in the absence of definite quantified standards. Most existing methods split high-resolution images into patches and rely on patch-based local feature representations to make image-level decisions, which destroys contextual information and weakens the ability to learn clinically relevant representations. To overcome these challenges, this study proposes an Extensive ATtention Network (EATNet) for diagnosing cervical precancerous lesions in histopathological images. EATNet extends the bag-of-words strategy by splitting a whole slide histopathological image into several bags and instances, learning representations end-to-end from gigapixel images. Instance-level and bag-level attention blocks are designed to encode rich global dependencies and to produce discriminative WSI descriptors using only slide-level labels. Experiments on two public cervical and endometrial datasets demonstrate performance superior to prevalent methods, with AUCs of 92%–94%.
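The abstract's two-level scheme (pool instances into bag descriptors, then pool bags into a slide descriptor, with only a slide-level label) can be illustrated with a minimal attention-pooling sketch. This is not the paper's implementation: EATNet uses multi-head self-attention blocks, whereas the function below uses a single tanh-attention pooling step, and all shapes and parameter names (`v`, `w`, feature dimension 8) are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(feats, v, w):
    """Pool a set of feature vectors (n, d) into one descriptor (d,).

    Each row gets an attention score; the output is the
    attention-weighted sum of the rows. `v` (d, d) and `w` (d,)
    are illustrative learnable parameters.
    """
    scores = np.tanh(feats @ v) @ w          # (n,) raw attention scores
    alpha = softmax(scores)                  # (n,) weights summing to 1
    return alpha @ feats                     # (d,) pooled descriptor

rng = np.random.default_rng(0)
d = 8
# A toy "WSI" split into 3 bags of 5 instance feature vectors each.
bags = [rng.normal(size=(5, d)) for _ in range(3)]
v = rng.normal(size=(d, d))
w = rng.normal(size=d)

# Instance-level attention: pool each bag into one bag descriptor.
bag_descs = np.stack([attention_pool(b, v, w) for b in bags])   # (3, d)

# Bag-level attention: pool bag descriptors into one slide descriptor,
# which a classifier head would then map to the slide-level label.
slide_desc = attention_pool(bag_descs, v, w)                    # (d,)
```

Because only the final slide descriptor is supervised, the attention weights at both levels are learned implicitly from slide-level labels, which is what makes the bag-of-instances strategy viable without pixel-level annotations.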

Bag-of-instances; Cervical precancerous lesions; Diagnostic classification; Histopathological images; Multi-head self-attention

Xu J., Shi L., Gao Y., Zhang Y., Zhao G.


School of Computer and Artificial Intelligence Zhengzhou University

School of Cyber Science and Engineering Zhengzhou University

Songshan Lab

2025

Biomedical signal processing and control


SCI
ISSN:1746-8094
Year, Volume (Issue): 2025, 99 (Jan.)