Vol. 119 No. 1 (2025)
Research Papers

Realtime Monitoring of Animal Behavior Using Deep Learning Models

Romesa Rao
Institute of Computing, Muhammad Nawaz Shareef University of Agriculture, Multan, Pakistan
Salman Qadri
Institute of Computing, Muhammad Nawaz Shareef University of Agriculture, Multan, Pakistan
Rao Kashif
Faculty of Engineering & Computing, National University of Modern Languages, Pakistan

Published 2025-07-08

Keywords

  • Pose Estimation,
  • Behavior Classifier,
  • ResNet-50,
  • Trajectory Analysis,
  • Random Forest,
  • Decision Tree

How to Cite

Rao, R., Qadri, S., & Kashif, R. (2025). Realtime Monitoring of Animal Behavior Using Deep Learning Models. Journal of Agriculture and Environment for International Development (JAEID), 119(1), 125–148. https://doi.org/10.36253/jaeid-16397

Abstract

Accurate monitoring of animal health and behavior is crucial for improving welfare and productivity in livestock management. Traditional observation methods are time-consuming and prone to subjective bias. To address these challenges, we propose an automated system for recognizing behavioral patterns using deep learning-based pose estimation. Specifically, we use ResNet-50, a deep convolutional neural network, to detect key anatomical landmarks such as the nose, eyes, ears, and body center. By tracking these keypoints, we generate movement trajectories that help identify behavioral patterns. For behavior classification, we initially applied a decision tree algorithm, achieving an accuracy of 60%. To improve performance, we implemented a random forest classifier, which raised the accuracy to 96%. The system classifies seven key behaviors: "stand," "sit," "eat," "drink," "aggressive," "sit with legs tied," and "let go of the tail." The random forest model achieved the highest accuracy in detecting "standing" and "aggressive" behaviors, while accuracy was lower for "eating." In addition, our pose estimation model showed high precision and recall, indicating robust keypoint detection with minimal deviation from ground-truth annotations. This automated system reduces the need for manual observation and provides a reliable tool for monitoring animal behavior. Its potential applications extend to animal studies and livestock management, offering a scalable and user-friendly solution for real-time behavior analysis.
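The abstract does not include implementation details. As an illustrative sketch only, the snippet below shows one way trajectory features derived from the tracked keypoints (nose, eyes, ears, body center) could be fed to a random forest classifier for the seven behaviors; the feature choices, placeholder data, and function names are assumptions for illustration, not the authors' actual pipeline.

# Hypothetical sketch: classify behaviors from pose-keypoint trajectories
# with a random forest. Feature choices and placeholder data are assumptions,
# not the authors' exact pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

BEHAVIORS = ["stand", "sit", "eat", "drink", "aggressive",
             "sit with legs tied", "let go of the tail"]

def trajectory_features(keypoints):
    """keypoints: (frames, 5, 2) array of (x, y) for nose, eyes, ears, body center.
    Returns simple per-window statistics (mean position, spread, movement)."""
    disp = np.diff(keypoints, axis=0)               # frame-to-frame displacement
    speed = np.linalg.norm(disp, axis=-1)           # per-keypoint speed
    return np.concatenate([
        keypoints.mean(axis=0).ravel(),             # average keypoint positions
        keypoints.std(axis=0).ravel(),              # spatial spread
        speed.mean(axis=0), speed.max(axis=0),      # movement statistics
    ])

# X: one feature vector per video window; y: integer behavior labels.
rng = np.random.default_rng(0)
windows = rng.random((700, 30, 5, 2))               # placeholder keypoint data
X = np.array([trajectory_features(w) for w in windows])
y = rng.integers(0, len(BEHAVIORS), size=len(windows))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))

In practice, the keypoint coordinates per frame would come from the ResNet-50 pose-estimation stage rather than the placeholder arrays used here.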
