Abstract
By a News Reporter-Staff News Editor at Robotics & Machine Learning Daily News: New research on machine learning is the subject of a report. According to news reporting from Nanjing by NewsRx journalists, the research stated: "Mid-infrared (MIR) spectroscopy can characterize the content and structural changes of macromolecular components in different breast tissues, enabling feature extraction and model training by machine learning to achieve accurate classification and recognition of different breast tissues." The one-dimensional convolutional neural network (1D-CNN) stands out in the field of deep learning for its ability to efficiently process sequential data such as spectroscopic signals. The correspondents obtained a quote from the research at Nanjing University of Aeronautics and Astronautics: "In this study, a self-developed MIR hollow optical fiber attenuated total reflection (HOF-ATR) probe was coupled with a Fourier transform infrared (FTIR) spectrometer to collect MIR spectra of breast tissue in situ, and staging analysis was conducted on the changes in macromolecular content and structure in breast cancer tissues. For the first time, a three-class classification model based on 1D-CNN was established to recognize normal, paracancerous and cancerous breast tissues. The results show that the 1D-CNN model based on baseline correction (BC) and data augmentation achieves higher classification accuracy, with a total accuracy of 95.09%, outperforming the machine learning models SVM-DA (90.00%), SVR (88.89%), PCA-FDA (67.78%) and PCA-KNN (70.00%)."
Abstract
By a News Reporter-Staff News Editor at Robotics & Machine Learning Daily News: New research on machine learning is the subject of a report. According to news reporting from Nanjing, People's Republic of China, by NewsRx journalists, the research stated, "Mid-infrared (MIR) spectroscopy can characterize the content and structural changes of macromolecular components in different breast tissues, which can be used for feature extraction and model training by machine learning to achieve accurate classification and recognition of different breast tissues. In parallel, the one-dimensional convolutional neural network (1D-CNN) stands out in the field of deep learning for its ability to efficiently process sequential data, such as spectroscopic signals." The news correspondents obtained a quote from the research from Nanjing University of Aeronautics and Astronautics: "In this study, MIR spectra of breast tissue were collected in situ by coupling the self-developed MIR hollow optical fiber attenuated total reflection (HOF-ATR) probe with a Fourier transform infrared (FTIR) spectrometer. Staging analysis was conducted on the changes in macromolecular content and structure in breast cancer tissues. For the first time, a three-class classification model was established based on 1D-CNN for recognizing normal, paracancerous and cancerous tissues. The final prediction results reveal that the 1D-CNN model based on baseline correction (BC) and data augmentation yields more precise classification results, with a total accuracy of 95.09%, exhibiting superior discrimination ability compared with the machine learning models SVM-DA (90.00%), SVR (88.89%), PCA-FDA (67.78%) and PCA-KNN (70.00%)."
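The core operation that lets a 1D-CNN process a spectrum is a kernel sliding along the wavenumber axis, responding to local band shapes. The following is a minimal illustrative sketch of that operation in plain Python; it is not the authors' model, and the toy spectrum, kernel weights, and pooling size are assumptions chosen only to show the mechanism (convolution, ReLU activation, max-pooling) that a trained 1D-CNN would apply with learned filters.

```python
# Illustrative sketch of the building blocks of a 1D-CNN layer.
# Not the published model: the spectrum, kernel, and pool size below
# are hypothetical values chosen to make the mechanism visible.

def conv1d(signal, kernel, stride=1):
    """'Valid' 1D convolution (cross-correlation) of a spectrum with a kernel."""
    k = len(kernel)
    return [
        sum(signal[i + j] * kernel[j] for j in range(k))
        for i in range(0, len(signal) - k + 1, stride)
    ]

def relu(xs):
    """Rectified linear activation applied elementwise."""
    return [max(0.0, x) for x in xs]

def max_pool(xs, size=2):
    """Non-overlapping max-pooling to downsample the feature map."""
    return [max(xs[i:i + size]) for i in range(0, len(xs) - size + 1, size)]

# Toy "spectrum": a flat baseline with a single absorbance peak.
spectrum = [0.0] * 8 + [0.5, 1.0, 0.5] + [0.0] * 8

# A gradient-like kernel responds where absorbance changes, mimicking how
# learned filters pick out band positions and shapes along the spectrum.
features = max_pool(relu(conv1d(spectrum, [-1.0, 0.0, 1.0])))
print(features)  # the pooled map activates only near the peak's rising edge
```

In the actual model many such kernels are learned from data, several convolution/pooling layers are stacked, and the final feature map feeds a fully connected softmax layer that outputs the three class probabilities (normal, paracancerous, cancerous).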