
Video-based Identification Model for Off-duty Behavior of Construction Guardians

To guard against the safety hazards created when a guardian leaves their post during construction work and petrochemical production operations, a retrained YOLOv8 model is used to intelligently analyze surveillance video of the construction site and identify guardian off-duty behavior. A judgment workflow for identifying guardian absence is designed: after the construction video is ingested, YOLOv8 detects personnel, safety vests, construction markers, and construction-area markers; the system judges whether construction work is in progress, determines the construction area and the zone where the guardian should be, detects the guardian, and checks whether the guardian is inside the construction area; as soon as the guardian leaves the post, an alarm is issued immediately, so that the presence of a guardian in the construction area is determined automatically. In this workflow, a variable-region recognition method is introduced into YOLOv8 so that only targets inside the region are recognized, which prevents targets outside the region from interfering with the judgment. The model identifies guardian absence during construction accurately and efficiently, with an identification success rate of 90%, meeting the needs of construction safety supervision.
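The judgment flow described above can be sketched in code. The snippet below is a minimal illustration, not the authors' implementation: it assumes a custom-retrained Ultralytics YOLOv8 weights file (here called guardian.pt) whose classes include person, vest and area_marker, approximates the variable recognition region as the convex hull of the detected area-marker centres, uses a crude box-overlap test as the "person wears a vest" check, and treats the video path site.mp4 and the printed alarm as placeholders.

```python
# Minimal sketch of the guardian off-duty check described in the abstract above.
# Assumptions (not from the paper): a retrained YOLOv8 weights file "guardian.pt"
# whose classes include "person", "vest" and "area_marker"; OpenCV reads the video.
import cv2
import numpy as np
from ultralytics import YOLO

model = YOLO("guardian.pt")  # hypothetical custom-trained weights


def detect(frame):
    """Run YOLOv8 on one frame and return (class_name, box_centre, box) tuples."""
    result = model(frame, verbose=False)[0]
    detections = []
    for box in result.boxes:
        name = result.names[int(box.cls)]
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        centre = ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
        detections.append((name, centre, (x1, y1, x2, y2)))
    return detections


def construction_area(detections):
    """Approximate the variable recognition region as the convex hull of the
    detected area-marker centres; fewer than 3 markers means no active work area."""
    pts = [centre for name, centre, _ in detections if name == "area_marker"]
    if len(pts) < 3:
        return None
    return cv2.convexHull(np.array(pts, dtype=np.float32))


def boxes_overlap(a, b):
    """Axis-aligned box overlap, used as a crude 'person wears vest' check."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2


def guardian_present(detections, area):
    """A guardian is a vest-wearing person whose box centre lies inside the area."""
    vests = [box for name, _, box in detections if name == "vest"]
    for name, centre, box in detections:
        if name != "person":
            continue
        wears_vest = any(boxes_overlap(box, v) for v in vests)
        inside = cv2.pointPolygonTest(area, centre, False) >= 0
        if wears_vest and inside:
            return True
    return False


cap = cv2.VideoCapture("site.mp4")  # hypothetical surveillance video source
while True:
    ok, frame = cap.read()
    if not ok:
        break
    dets = detect(frame)
    area = construction_area(dets)
    # Work is treated as "in progress" only when an area can be formed from markers.
    if area is not None and not guardian_present(dets, area):
        print("ALARM: no guardian inside the construction area")  # placeholder alert
cap.release()
```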

computer vision; video analysis; construction operation; guardian off-duty; behavior recognition

苏洪全、谭蕾、姜浩、郑亚强、马庆、董强、王新星


Kunlun Digital Technology Co., Ltd. (昆仑数智科技有限责任公司), Beijing 100000, China


2024

化工管理 (Chemical Enterprise Management)
China Chemical Enterprise Management Association (中国化工企业管理协会)


Impact factor: 0.336
ISSN: 1008-4800
Year, Volume (Issue): 2024 (3)