
ORIGINAL RESEARCH article

Front. Anim. Sci.
Sec. Precision Livestock Farming
Volume 5 - 2024 | doi: 10.3389/fanim.2024.1499253

Individual behavior tracking of heifers by using object detection algorithm YOLOv4

Provisionally accepted
  • 1 Friedrich-Loeffler-Institute, Greifswald-Insel Riems, Germany
  • 2 Institute of Epidemiology, Friedrich-Loeffler-Institute, Greifswald-Insel Riems, Germany
  • 3 Animal Health Management, Faculty of Agriculture and Food Science, University of Applied Sciences Neubrandenburg, Neubrandenburg, Germany
  • 4 Animal Health and Animal Welfare, Faculty of Agricultural and Environmental Science, University of Rostock, Rostock, Germany

The final, formatted version of the article will be published soon.

    Standing and lying times of animals are often used as indicators to assess welfare and health status. Changes in standing and lying times due to health problems or discomfort can reduce productivity. Since manual evaluation is time-consuming and cost-intensive, video surveillance offers an opportunity to obtain unbiased insight. The objective of this study was to identify individual heifers in group housing and to track their body posture (‘standing’/‘lying’) by training a real-time monitoring system based on the convolutional neural network YOLOv4. For this purpose, videos of three groups of five heifers were used and two models were trained. First, a body posture model was trained to localize the heifers and classify their posture. To this end, 860 images were extracted from the videos and each heifer was labeled ‘standing’ or ‘lying’ according to its posture. The second model was trained for individual animal identification. Here, only videos of one group of five heifers were used and 200 images were extracted. Each heifer was assigned its own number and labeled accordingly in the image set. In both cases, the image set was split into a training set and a test set at a ratio of 80%:20%. For each model, the neural network YOLOv4 was adapted as a detector and trained on its own training set (685 and 160 images, respectively). Detection accuracy was validated on the corresponding test set (175 and 40 images, respectively). The body posture model achieved an accuracy of 99.54%, and the individual animal identification model an accuracy of 99.79%. The combination of both models enables an individual, real-time evaluation of ‘standing’ and ‘lying’ times for each animal. In practical dairy farming, such a model supports the early detection of behavioral changes while saving working time.
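    The abstract describes combining two YOLOv4 detectors per video frame, one for body posture and one for animal identity. The sketch below is a minimal, hypothetical illustration of how such a combination could be wired up with OpenCV's Darknet importer; the configuration/weight file names, class lists, video source, thresholds, and the IoU-based matching of posture boxes to identity boxes are assumptions for illustration only, not the authors' implementation. Matching by box overlap is just one plausible way to associate the two models' outputs for the same animal.

    ```python
    # Hypothetical sketch (not the authors' code): combine a posture detector
    # ('standing'/'lying') with an identity detector (one class per heifer)
    # using OpenCV's Darknet/YOLOv4 importer. Paths and class orders are assumed.
    import cv2

    def load_model(cfg_path, weights_path, input_size=(416, 416)):
        """Load a Darknet YOLOv4 model as an OpenCV DetectionModel."""
        net = cv2.dnn.readNetFromDarknet(cfg_path, weights_path)
        model = cv2.dnn_DetectionModel(net)
        model.setInputParams(size=input_size, scale=1 / 255.0, swapRB=True)
        return model

    def iou(a, b):
        """Intersection over union of two boxes given as (x, y, w, h)."""
        ax1, ay1, ax2, ay2 = a[0], a[1], a[0] + a[2], a[1] + a[3]
        bx1, by1, bx2, by2 = b[0], b[1], b[0] + b[2], b[1] + b[3]
        iw = max(0, min(ax2, bx2) - max(ax1, bx1))
        ih = max(0, min(ay2, by2) - max(ay1, by1))
        inter = iw * ih
        union = a[2] * a[3] + b[2] * b[3] - inter
        return inter / union if union > 0 else 0.0

    posture_model = load_model("posture.cfg", "posture.weights")      # assumed paths
    identity_model = load_model("identity.cfg", "identity.weights")   # assumed paths
    posture_names = ["standing", "lying"]                             # assumed class order
    identity_names = [f"heifer_{i}" for i in range(1, 6)]             # assumed class order

    cap = cv2.VideoCapture("barn_camera.mp4")                         # assumed video source
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Run both detectors on the same frame.
        p_classes, p_scores, p_boxes = posture_model.detect(
            frame, confThreshold=0.5, nmsThreshold=0.4)
        i_classes, i_scores, i_boxes = identity_model.detect(
            frame, confThreshold=0.5, nmsThreshold=0.4)
        # Match each identity box to the posture box with the largest overlap,
        # yielding one (animal id, posture) pair per detected heifer.
        for i_cls, i_box in zip(i_classes, i_boxes):
            best = max(zip(p_classes, p_boxes),
                       key=lambda pb: iou(i_box, pb[1]),
                       default=None)
            if best is not None and iou(i_box, best[1]) > 0.5:
                print(identity_names[int(i_cls)], posture_names[int(best[0])])
    cap.release()
    ```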

    Keywords: animal behavior, computer vision, YOLOv4, heifer, individual detection, Holstein Friesian

    Received: 20 Sep 2024; Accepted: 10 Dec 2024.

    Copyright: © 2024 Jahn, Schmidt, Bachmann, Louton, Homeier-Bachmann and Schütz. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence: Sarah Jahn, Friedrich-Loeffler-Institute, Greifswald-Insel Riems, Germany

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.