Journal of Animal Science and Technology
Korean Society of Animal Science and Technology
Article

Automatic detection of trapping events of postnatal piglets in loose housing pens: comparison of YOLO versions 4, 5, and 8

Taeyong Yun1, Jinsul Kim2, Jinhyeon Yun1,3, Tai-Won Um1
1Department of Data Science, Chonnam National University, Gwangju 61186, Korea.
2School of Electronics and Computer Engineering, Chonnam National University, Gwangju 61186, Korea.
3Department of Animal Science, Chonnam National University, Gwangju 61186, Korea.

© Copyright 2024 Korean Society of Animal Science and Technology. This is an Open-Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Received: Jan 08, 2024; Revised: Aug 08, 2024; Accepted: Oct 29, 2024

Published Online: Oct 29, 2024

Abstract

In recent years, the pig industry has experienced an alarming surge in piglet mortality shortly after farrowing due to crushing by the sow. This issue has been exacerbated by the adoption of hyperprolific sows and the transition to loose housing pens, adversely affecting both animal welfare and productivity. In response to these challenges, researchers have progressively turned to the artificial intelligence of things (AIoT) to address various issues within the livestock sector. The primary objective of this study was to conduct a comparative analysis of different versions of object detection algorithms, aiming to identify the optimal AIoT system for monitoring piglet crushing events based on performance and practicality. The methodology involved extracting relevant footage depicting instances of piglet crushing from recorded farrowing pen videos, which was subsequently condensed into edited clips of 2-3 min. These clips were categorized into three classes: no trapping, trapping, and crushing. Data augmentation techniques, including rotation, flipping, and adjustments to saturation and contrast, were applied to enhance the dataset. This study employed three deep learning object detection algorithms, YOLOv4-Tiny, YOLOv5s, and YOLOv8s, followed by a performance analysis. The average precision (AP) for trapping detection across the models yielded values of 0.963 for YOLOv4-Tiny and 0.995 for both YOLOv5s and YOLOv8s. Notably, trapping detection performance was similar between YOLOv5s and YOLOv8s. However, YOLOv5s proved to be the best choice considering its model size of 13.6 MB, compared to YOLOv4-Tiny's 22.4 MB and YOLOv8s's 21.4 MB. Considering both performance metrics and model size, YOLOv5s emerges as the most suitable model for detecting trapping within an AIoT framework.
Future endeavors may leverage this research to refine and expand the scope of AIoT applications in addressing challenges within the pig industry, ultimately contributing to advancements in both animal husbandry practices and technological solutions.
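The augmentation step described above (rotation, flipping, and saturation/contrast adjustment) can be sketched as a minimal NumPy routine. This is an illustrative reconstruction, not the authors' implementation: the function name `augment`, the parameter ranges, and the restriction to 90-degree rotations are assumptions for the sake of a self-contained example; a practical pipeline would typically use a library such as Albumentations for arbitrary-angle rotation.

```python
import numpy as np

def augment(image, rng):
    """Illustrative augmentation: rotation, flipping, and
    saturation/contrast jitter. `image` is an H x W x 3 float
    array with values in [0, 1]. Parameter ranges are assumed."""
    out = image.copy()
    # Random quarter-turn rotation (0-3 turns); note that for
    # non-square images this swaps height and width.
    out = np.rot90(out, k=rng.integers(0, 4))
    # Random horizontal flip.
    if rng.random() < 0.5:
        out = out[:, ::-1, :]
    # Contrast: scale pixel deviations from the mean intensity.
    c = rng.uniform(0.8, 1.2)
    out = (out - out.mean()) * c + out.mean()
    # Saturation: blend each pixel with its grayscale value.
    s = rng.uniform(0.8, 1.2)
    gray = out.mean(axis=2, keepdims=True)
    out = gray + (out - gray) * s
    return np.clip(out, 0.0, 1.0)

rng = np.random.default_rng(0)
img = rng.random((4, 4, 3))  # dummy square "frame"
aug = augment(img, rng)
print(aug.shape)
```

Each transformed copy would then be added to the training set alongside the original frame, enlarging the dataset the YOLO models are trained on.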

Keywords: Piglet crushing; Deep learning object-detection algorithm; YOLO; Trapping; AIoT