Read My Ears! Horse Ear Movement Detection for Equine Affective State Assessment

Abstract

The Equine Facial Action Coding System (EquiFACS) enables the systematic annotation of facial movements through distinct Action Units (AUs). It serves as a crucial tool for assessing affective states in horses by identifying subtle facial expressions associated with discomfort. However, the field of horse affective state assessment is constrained by the scarcity of annotated data, as manually labelling facial AUs is both time-consuming and costly. To address this challenge, automated annotation systems are essential for leveraging existing datasets and improving affective state detection tools. In this work, we study different methods for detecting and localizing specific ear AUs in horse videos. We leverage past work on deep learning-based video feature extraction combined with recurrent neural networks for the video classification task, as well as a classic optical flow based approach. We achieve 87.5% classification accuracy for ear movement presence on a public horse video dataset, demonstrating the potential of our approach. We discuss future directions for developing these systems, with the aim of bridging the gap between automated AU detection and practical applications in equine welfare and veterinary diagnostics. Our code will be made publicly available at this https URL.
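The classical motion-based baseline mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it is a hypothetical NumPy-only example that uses frame-differencing motion energy inside an assumed ear region of interest as a stand-in for an optical flow magnitude signal, with a hypothetical threshold:

```python
import numpy as np

def motion_energy(frames, roi):
    """Mean absolute inter-frame difference inside a region of interest.

    frames: array of shape (T, H, W); roi: (y0, y1, x0, x1).
    A crude proxy for optical flow magnitude (hypothetical, for illustration).
    """
    y0, y1, x0, x1 = roi
    crop = frames[:, y0:y1, x0:x1].astype(np.float32)
    diffs = np.abs(np.diff(crop, axis=0))  # per-frame motion maps
    return diffs.mean(axis=(1, 2))         # one energy value per frame pair

def classify_ear_movement(frames, roi, threshold=5.0):
    """Binary 'ear movement present' decision (threshold is an assumption)."""
    return bool(motion_energy(frames, roi).mean() > threshold)

# Toy demo with synthetic frames: a static patch vs. a shifting one.
rng = np.random.default_rng(0)
static = np.repeat(rng.integers(0, 255, (1, 32, 32)), 8, axis=0)
moving = static.copy()
for t in range(8):
    moving[t] = np.roll(moving[t], t, axis=1)  # simulate lateral ear motion

roi = (0, 32, 0, 32)
print(classify_ear_movement(static, roi))  # no motion -> False
print(classify_ear_movement(moving, roi))  # large frame differences -> True
```

A real pipeline would replace the frame difference with dense optical flow (e.g. Farnebäck) and track the ear region rather than using a fixed crop.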

@article{alves2025_2505.03554,
  title={Read My Ears! Horse Ear Movement Detection for Equine Affective State Assessment},
  author={João Alves and Pia Haubro Andersen and Rikke Gade},
  journal={arXiv preprint arXiv:2505.03554},
  year={2025}
}