Barcode Detection Algorithm Based on Improved Swin Transformer

Barcode detection is a technology widely used across industries to identify, verify, and inspect the quality and accuracy of barcodes. However, traditional detection methods perform poorly in complex situations such as large scale variation, typically suffer from low detection speed and efficiency and a limited range of application, and usually focus only on local image information, ignoring the characteristics of barcodes at different scales. To address these problems, a barcode detection algorithm based on an improved Swin Transformer is proposed. First, local perception and multi-scale feature extraction mechanisms are introduced, giving the model better robustness so that it can cope with barcodes of different sizes and shapes. Then, the ideas of the FCOS-based detection framework are incorporated. Finally, the improved algorithm is trained and tested on a labeled barcode dataset. Experimental results show that, compared with the YOLOv4 algorithm, the improved model raises precision and recall by 5.78% and 3.18%, respectively; its overall performance exceeds that of other mainstream algorithms, effectively improving barcode detection capability and achieving high detection accuracy.
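The abstract pairs a Swin Transformer backbone with the anchor-free FCOS detection framework. As a rough illustration of what FCOS-style detection involves (this is a generic sketch of the published FCOS formulation, not the paper's implementation; all function names here are invented for the example), each cell of a multi-scale feature map is mapped back to an image-space point, and the head predicts per-point distances (l, t, r, b) to the four sides of the box, plus a center-ness score:

```python
import numpy as np

def grid_points(h, w, stride):
    """Map each feature-map cell back to image coordinates (cell centres)."""
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    pts = np.stack([xs, ys], axis=-1).reshape(-1, 2).astype(float)
    return (pts + 0.5) * stride

def fcos_decode(points, ltrb):
    """Decode predicted distances (l, t, r, b) at each point
    into (x1, y1, x2, y2) boxes."""
    x, y = points[:, 0], points[:, 1]
    l, t, r, b = ltrb.T
    return np.stack([x - l, y - t, x + r, y + b], axis=1)

def fcos_centerness(ltrb):
    """Center-ness target from FCOS:
    sqrt(min(l, r) / max(l, r) * min(t, b) / max(t, b))."""
    l, t, r, b = ltrb.T
    return np.sqrt((np.minimum(l, r) / np.maximum(l, r)) *
                   (np.minimum(t, b) / np.maximum(t, b)))
```

For example, a point at (100, 100) with predicted distances (10, 20, 30, 40) decodes to the box (90, 80, 130, 140); a point at the exact centre of its box (equal l/r and t/b) gets center-ness 1.0, down-weighting low-quality boxes predicted far from object centres.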

target detection; deep learning; barcode; Swin Transformer; FCOS

王正家、庄健、肖喆、许款款、何涛


School of Mechanical Engineering, Hubei University of Technology, Wuhan 430068, China

Hubei Provincial Key Laboratory of Modern Manufacturing Quality Engineering, Wuhan 430068, China

Key Laboratory of Metallurgical Equipment and Its Control, Ministry of Education, Wuhan 430081, China


National Natural Science Foundation of China (Grant No. 51275158)

2024

Machine Design & Research (机械设计与研究)
Shanghai Jiao Tong University

Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 0.531
ISSN: 1006-2343
Year, Volume (Issue): 2024, 40(3)