YOLO-EZ: An efficient and lightweight model for grape disease detection
With the rapid development of deep learning technology, computer vision has demonstrated significant potential in agricultural disease detection. To address the problem of grape disease detection, this paper proposes YOLO-EZ, a model that integrates an attention mechanism with a bidirectional weighted feature pyramid network. Built on the lightweight MobileViTv3 backbone, YOLO-EZ enhances feature extraction through SCConv attention convolution and optimizes feature fusion with BiFPN, significantly improving the recognition accuracy of grape disease features. Extensive experiments on the grape disease dataset show that YOLO-EZ achieves a precision of 92.8%, a recall of 89.9%, an mAP50 of 93.6%, and an mAP50-95 of 73.3%, outperforming several advanced comparative models. Moreover, YOLO-EZ maintains high performance while reducing the number of parameters, with a model size of only 5.8 MB, making it suitable for deployment on mobile and edge computing devices and demonstrating its feasibility and efficiency in practical applications.
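To make the described composition concrete, the following is a minimal, illustrative PyTorch sketch of the general pattern the abstract names: attention-refined backbone features fed into a weighted bidirectional fusion step. The module names, channel sizes, and structure here are hypothetical placeholders for illustration only and are not the authors' implementation of SCConv, MobileViTv3, or BiFPN.

```python
# Hypothetical sketch: attention refinement + BiFPN-style weighted fusion.
# Not the paper's code; names and shapes are illustrative assumptions.
import torch
import torch.nn as nn


class SpatialChannelAttention(nn.Module):
    """Stand-in for an SCConv-style attention block: reweights channels, then spatial positions."""
    def __init__(self, channels):
        super().__init__()
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        self.spatial_gate = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x):
        x = x * self.channel_gate(x)      # channel reweighting
        return x * self.spatial_gate(x)   # spatial reweighting


class WeightedBiFusion(nn.Module):
    """BiFPN-style fusion of two same-resolution feature maps with learnable, normalized positive weights."""
    def __init__(self, channels):
        super().__init__()
        self.w = nn.Parameter(torch.ones(2))
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, a, b):
        w = torch.relu(self.w)
        fused = (w[0] * a + w[1] * b) / (w.sum() + 1e-4)  # fast normalized weighted sum
        return self.conv(fused)


if __name__ == "__main__":
    # Toy usage: refine two backbone feature maps, then fuse them.
    feat_a = torch.randn(1, 64, 40, 40)
    feat_b = torch.randn(1, 64, 40, 40)
    attn = SpatialChannelAttention(64)
    fuse = WeightedBiFusion(64)
    out = fuse(attn(feat_a), attn(feat_b))
    print(out.shape)  # torch.Size([1, 64, 40, 40])
```

The normalized weighted sum in `WeightedBiFusion` reflects the general idea behind BiFPN's learnable fusion weights; how YOLO-EZ wires these blocks across scales is detailed in the paper itself.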