Research on pavement pothole detection algorithm based on improved YOLOv5
As one of the major defects of pavement structures, potholes are of great significance to the driving safety of autonomous vehicles and the operation of mobile robots. Pavement pothole detection is a challenging computer vision task that requires processing diverse data samples under varied working conditions. Weather factors such as fog, rain, and snow can negatively affect the quality and visibility of road images, which in turn increases the difficulty of data preprocessing and feature extraction. Traditional object detection algorithms usually struggle to adapt to these scene variations, so training datasets fail to adequately reflect the diversity and complexity of road potholes, reducing the generalization ability and accuracy of the detection model. In practical applications, these methods are prone to missed and false detections, which degrade the efficiency and quality of road condition identification and evaluation. This paper proposes an improved pavement pothole detection algorithm based on YOLOv5 that raises both the precision and the recall of the model. By introducing the simple yet effective BiFPN (bidirectional feature pyramid network), which fuses multi-scale features with attention-style learnable weights, and by replacing the activation function and loss function with more suitable choices, the number of parameters is reduced and the detection model is simplified. Experimental results show that, compared with the original model, the improved algorithm increases precision by 7.2% and recall by 5.5%, and achieves a mean average precision (mAP) of 80.8%, which is 2.1% higher than the original YOLOv5 model. In summary, compared with commonly used traditional algorithms, the improved algorithm significantly improves detection accuracy and reduces the missed detection rate without sacrificing running speed, giving it good value for mobile deployment and reference value for related research.
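The abstract's BiFPN module fuses multi-scale features with learnable, attention-like weights. As a minimal illustrative sketch (not the authors' implementation), the "fast normalized fusion" used by BiFPN keeps each weight non-negative via ReLU and normalizes the weights to sum to roughly one before combining feature maps; the function name, NumPy stand-in for tensors, and the epsilon value below are assumptions for illustration only.

```python
import numpy as np

def fast_normalized_fusion(features, raw_weights, eps=1e-4):
    """Fuse same-shaped feature maps with non-negative learnable weights,
    in the style of BiFPN's fast normalized fusion (illustrative sketch)."""
    w = np.maximum(np.asarray(raw_weights, dtype=float), 0.0)  # ReLU: weights stay >= 0
    w = w / (w.sum() + eps)                                    # normalize; eps avoids division by zero
    # Weighted sum of the input feature maps
    return sum(wi * f for wi, f in zip(w, features))

# Example: fuse two 4x4 feature maps, the second weighted three times as heavily
f1 = np.ones((4, 4))
f2 = 2.0 * np.ones((4, 4))
fused = fast_normalized_fusion([f1, f2], [1.0, 3.0])
```

In a real network the raw weights would be trainable parameters updated by backpropagation; here they are plain floats to keep the sketch self-contained.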