To address the need for crop leaf disease detection in complex scenarios, this study designs a method suited to the limited computing power of mobile devices that ensures both detection accuracy and real-time performance, offering significant practical value. Building on a comparison with existing mainstream detection models, this paper proposes TG_YOLOv5, an object detection and recognition algorithm that applies multi-dimensional feature fusion to the YOLOv5 model. A triplet attention mechanism and a C3CBAM module are incorporated into the network structure to capture interactions between the spatial and channel dimensions of the input feature maps, enhancing the model's feature extraction capability without significantly increasing the parameter count. The lightweight GhostConv module is used to construct the backbone network, reducing model complexity, shrinking the model size, and improving detection speed. Experiments on a self-constructed rice leaf disease dataset show that TG_YOLOv5 achieves a mean Average Precision (mAP) of 98.3% and a recall of 97.2%, representing improvements of 1.2% in mAP and 4.3% in recall over the traditional YOLOv5 algorithm. The trained lightweight model is then deployed on a Raspberry Pi and accelerated with the MNN inference engine, which increases detection speed by 73.8%. The model also achieves satisfactory detection accuracy and speed on apple and tomato datasets, validating its generalization ability. This research provides a theoretical foundation for remote, real-time detection of rice diseases in agriculture.
Keywords: Attention Mechanism; Deep Learning; Multi-dimensional Feature Fusion; Object Detection; YOLOv5.
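To illustrate the idea behind the GhostConv lightweight module mentioned above, the following is a minimal NumPy sketch of a "ghost" convolution: an expensive primary convolution produces only half of the output channels, and cheap per-channel (depthwise) operations generate the remaining "ghost" feature maps. This is an illustrative sketch of the general Ghost-module concept, not the authors' implementation; the function name `ghost_conv` and the specific kernel shapes are assumptions for demonstration.

```python
import numpy as np

def ghost_conv(x, w_primary, w_cheap):
    """Minimal sketch of a ghost convolution (illustrative, not the paper's code).

    x         : input feature map, shape (C_in, H, W)
    w_primary : pointwise (1x1) kernels, shape (m, C_in) -> m primary maps
    w_cheap   : depthwise 3x3 kernels, shape (m, 3, 3)   -> m ghost maps
    Returns (2m, H, W): primary maps concatenated with their cheap "ghosts".
    """
    _, H, W = x.shape
    m = w_primary.shape[0]
    # Primary branch: a 1x1 convolution is just channel mixing per pixel.
    primary = np.tensordot(w_primary, x, axes=([1], [0]))  # (m, H, W)
    # Cheap branch: 3x3 depthwise convolution on each primary map (zero padding),
    # which costs far less than a full convolution over all channels.
    padded = np.pad(primary, ((0, 0), (1, 1), (1, 1)))
    ghost = np.zeros_like(primary)
    for c in range(m):
        for i in range(H):
            for j in range(W):
                ghost[c, i, j] = np.sum(padded[c, i:i + 3, j:j + 3] * w_cheap[c])
    # Output channel count doubles while only m channels paid the full conv cost.
    return np.concatenate([primary, ghost], axis=0)  # (2m, H, W)
```

For example, a `(2, 4, 4)` input with `m = 2` primary kernels yields a `(4, 4, 4)` output, half of which was produced by cheap depthwise operations; this is the source of GhostConv's parameter and speed savings on devices such as the Raspberry Pi.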