ORIGINAL RESEARCH article

Front. Plant Sci.
Sec. Sustainable and Intelligent Phytoprotection
Volume 15 - 2024 | doi: 10.3389/fpls.2024.1488185
This article is part of the Research Topic Precision Information Identification and Integrated Control: Pest Identification, Crop Health Monitoring, and Field Management.

A method of identification and localization of tea buds based on lightweight improved YOLOV5

Provisionally accepted
Yuan Hong Wang 1, Jinzhu Lu 1*, Qi Wang 1, Zongmei Gao 2
  • 1 School of Mechanical Engineering, Xihua University, Sichuan, China
  • 2 Department of Biological Systems Engineering, College of Agricultural, Human, and Natural Resource Sciences, Washington State University, Pullman, Washington, United States

The final, formatted version of the article will be published soon.

    The low degree of intelligence and standardization in tea bud picking, together with laborious, time-consuming manual harvesting, poses significant challenges to the sustainable development of the high-quality tea industry, and there is an urgent need to investigate the key technologies of intelligent tea-picking robots. However, model complexity demands substantial hardware computing resources, which limits the deployment of tea bud detection models on tea-picking robots. In this study, we therefore propose YOLOV5M-SBSD, a lightweight tea bud detection model, to address these issues. A Fuding white tea bud image dataset was established by collecting Fuding white tea images. The lightweight network ShuffleNetV2 was then used to replace the YOLOV5 backbone; the up-sampling algorithm of YOLOV5 was optimized with the CARAFE module, which enlarges the receptive field of the network while remaining lightweight; BiFPN was adopted for more efficient multi-scale feature fusion; and the parameter-free attention module SimAM was introduced to enhance the feature extraction ability of the model without adding extra computation. The improved model, denoted YOLOV5M-SBSD, was compared with other mainstream target detection models and evaluated on the tea bud dataset. The experimental results show a tea bud recognition precision of 88.7% (0.5% higher than the original YOLOV5M), a recall of 86.9%, and an average precision of 93.1% (0.2% higher than YOLOV5M), while the model size is reduced by 82.89% and the parameters and GFLOPs are reduced by 83.7% and 85.6%, respectively. The improved algorithm thus achieves higher detection accuracy while reducing computation and parameter counts. It also reduces dependence on hardware, provides a reference for deploying tea bud detection models in the natural environment of tea gardens, and has theoretical and practical significance for the identification and localization performed by intelligent tea-bud-picking robots.
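    The abstract names SimAM, a parameter-free attention module. As a minimal illustrative sketch (not the authors' code), SimAM can be written in PyTorch following its published energy-based formulation; the regularizer value and where the module is inserted into YOLOV5M-SBSD are assumptions, since this page does not specify them.

```python
import torch
import torch.nn as nn

class SimAM(nn.Module):
    """Parameter-free attention: each activation is reweighted by an
    energy-based saliency score, adding no learnable parameters."""
    def __init__(self, e_lambda: float = 1e-4):  # regularizer; value assumed
        super().__init__()
        self.e_lambda = e_lambda

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width)
        n = x.shape[2] * x.shape[3] - 1
        # squared deviation of each activation from its channel-wise mean
        d = (x - x.mean(dim=(2, 3), keepdim=True)).pow(2)
        # inverse energy: more distinctive activations receive larger weights
        e_inv = d / (4 * (d.sum(dim=(2, 3), keepdim=True) / n + self.e_lambda)) + 0.5
        return x * torch.sigmoid(e_inv)

# Shape check: a feature map passes through unchanged in shape,
# with each activation rescaled by its saliency weight.
feats = torch.randn(1, 256, 40, 40)
assert SimAM()(feats).shape == feats.shape
```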

    Keywords: tea buds, target detection, YOLOV5M-SBSD, lightweight modeling, deep learning

    Received: 29 Aug 2024; Accepted: 05 Nov 2024.

    Copyright: © 2024 WANG, Lu, WANG and GAO. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence: Jinzhu Lu, School of Mechanical Engineering, Xihua University, Sichuan, China

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.