Sao Joao University Hospital Researcher Provides New Insights into Artificial Intelligence (P268 Artificial Intelligence and Panendoscopy Automatic Detection of Pleomorphic Lesions in Multibrand Device-Assisted Enteroscopy)

Investigators discuss new findings in artificial intelligence. According to news reporting originating from Porto, Portugal, by NewsRx correspondents, research stated, “Device-assisted enteroscopy (DAE) stands as the sole diagnostic and therapeutic procedure capable of thoroughly examining the entire gastrointestinal (GI) tract. Nevertheless, its diagnostic yield falls short of ensuring a cost-effective panendoscopy, and there is still significant interobserver variability during the procedure.” Our news editors obtained a quote from the research from Sao Joao University Hospital: “Multi-layered convolutional neural networks (CNN) have proven beneficial in numerous medical applications, yet there is a noticeable gap in research regarding their implementation in DAE. The aim of this study is to develop and validate a multidevice CNN for panendoscopic detection of pleomorphic lesions (vascular lesions, hematic residues, protruding lesions, ulcers and erosions) during DAE. In a retrospective analysis of 338 DAE procedures conducted at two specialized centers, frames from 152 single-balloon enteroscopies (Fujifilm®), 172 double-balloon enteroscopies (Olympus®), and 14 motorized spiral enteroscopies (Olympus®) were used to construct and validate the CNN. The dataset, comprising 40,655 images, was divided into a training dataset (90% of images, n=36,599) and a validation dataset (10% of images, n=4,066). We conducted 5-fold cross-validation on the training dataset. Primary outcomes were sensitivity, specificity, accuracy, and area under the precision-recall curve (AUC-PR). During the training dataset’s 5-fold cross-validation, the model demonstrated a mean sensitivity of 88.7% (88.0-89.5%), specificity of 98.0% (97.8-98.1%), PPV of 92.6% (92.0-93.1%), and NPV of 97.0% (96.8-97.2%), with a mean accuracy of 96.0% (95.8-96.2%). On the validation dataset, the CNN presented 88.9% sensitivity, 98.9% specificity, 95.8% PPV, 97.1% NPV, 96.8% accuracy and an AUC-PR of 0.97. The CNN processed 124 frames per second.”
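To make the reported figures concrete, the sketch below shows, under stated assumptions, how a frame-level dataset of this kind could be split 90/10, cross-validated in five folds on the training portion, and scored with the same metrics the study reports (sensitivity, specificity, PPV, NPV, accuracy, AUC-PR). It is an illustrative Python/scikit-learn example, not the authors' code: the labels and scores are random stand-ins, and the CNN itself is represented only by a placeholder comment.

```python
# Illustrative sketch only (not the study's code): 90/10 split, 5-fold
# cross-validation on the training portion, and the metrics reported in
# the abstract. Labels and scores are random stand-ins.
import numpy as np
from sklearn.model_selection import train_test_split, StratifiedKFold
from sklearn.metrics import confusion_matrix, average_precision_score

def detection_metrics(y_true, y_pred, y_score):
    """Sensitivity, specificity, PPV, NPV, accuracy and AUC-PR for binary labels."""
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "auc_pr": average_precision_score(y_true, y_score),
    }

rng = np.random.default_rng(42)
labels = rng.integers(0, 2, size=40_655)   # 1 = pleomorphic lesion, 0 = normal mucosa
frame_ids = np.arange(labels.size)

# 90% training / 10% validation split, as described in the study.
train_idx, val_idx = train_test_split(
    frame_ids, test_size=0.10, stratify=labels, random_state=42)

# 5-fold cross-validation within the training dataset.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for fold, (tr, te) in enumerate(cv.split(train_idx, labels[train_idx]), start=1):
    # In the real pipeline a CNN would be trained on train_idx[tr] and
    # evaluated on train_idx[te]; here we only illustrate the bookkeeping.
    pass

# Hypothetical validation-set scores from a trained model (random stand-ins).
val_scores = rng.random(val_idx.size)
val_preds = (val_scores >= 0.5).astype(int)
print(detection_metrics(labels[val_idx], val_preds, val_scores))
```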

Sao Joao University Hospital, Porto, Portugal, Europe, Artificial Intelligence, Emerging Technologies, Machine Learning

2024

Robotics & Machine Learning Daily News

ISSN:
Year, Volume (Issue): 2024 (Feb. 9)