
A safety equipment wearing detection method integrating FasterNet and RepVGG

To address the large parameter counts of detection models in industrial scenarios and the tendency of safety ropes to be confused with the background, an object detection algorithm integrating FasterNet and RepVGG (you only look once version 5 FasterNet-RepVGG-CutMix-BiFPN, YOLOv5s-FRCB) was proposed. The lightweight FasterNet Block and RepVGG Block structures replaced part of the convolutional layers of YOLOv5s, significantly reducing the number of parameters and accelerating detection to meet real-time requirements; the strong feature connectivity of BiFPN enhanced the model's feature learning capability; and an improved CutMix data augmentation method randomly embedded targets into the input images and updated the corresponding labels, alleviating label class imbalance and improving generalization. Experiments on a self-built safety equipment wearing detection dataset showed that YOLOv5s-FRCB reached an mAP of 96.3% while reducing the model's memory footprint by 34%, making it an efficient and practical method for safety equipment wearing detection; YOLOv5s-FRCB further reduced computation while maintaining accuracy.
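The improved CutMix-style augmentation described in the abstract (randomly embedding targets into the input image and updating the labels) can be illustrated with a minimal sketch. The function name paste_target and the [class, x1, y1, x2, y2] pixel-box label format below are assumptions for illustration, not the authors' actual implementation.

```python
import random
import numpy as np

def paste_target(image, labels, patch, patch_class):
    """Paste a target patch into `image` and append its box to `labels`.

    image       : HxWx3 uint8 array (training image)
    labels      : list of [class_id, x1, y1, x2, y2] boxes in pixel coordinates
    patch       : hxwx3 uint8 array cropped around a rare-class target (e.g. a safety rope)
    patch_class : class id of the pasted target
    """
    H, W = image.shape[:2]
    h, w = patch.shape[:2]
    if h >= H or w >= W:           # patch does not fit, skip augmentation
        return image, labels

    # Random top-left corner such that the patch stays inside the image.
    x1 = random.randint(0, W - w)
    y1 = random.randint(0, H - h)
    x2, y2 = x1 + w, y1 + h

    out = image.copy()
    out[y1:y2, x1:x2] = patch      # embed the target region into the training image

    # Update the labels so the pasted object is supervised during training,
    # which helps rebalance under-represented classes.
    new_labels = labels + [[patch_class, x1, y1, x2, y2]]
    return out, new_labels

# Example with dummy data (illustrative only):
img = np.zeros((640, 640, 3), dtype=np.uint8)
rope_patch = np.full((80, 40, 3), 255, dtype=np.uint8)
img, boxes = paste_target(img, [], rope_patch, patch_class=2)
```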

safety equipment wearing detection; object detection; lightweight; data augmentation; real-time detection

Zhang Man, Sun Kaijun, Li Xiang, Sun Jizhou


School of Computer and Software Engineering, Huaiyin Institute of Technology, Huai'an 223003, Jiangsu, China


2024

Journal of Shandong University (Engineering Science)
Shandong University

Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 0.634
ISSN: 1672-3961
Year, volume (issue): 2024, 54(6)