Authors
1 Department of Biosystems Engineering, Faculty of Agriculture, University of Tabriz, Tabriz, Iran
2 Faculty member, University of Tabriz
3 Department of Animal Science, Faculty of Agriculture, University of Tabriz, Tabriz, Iran
Introduction
Poultry production serves as a crucial sector in the global food supply chain by offering a cost-effective and protein-rich source of nutrition to meet the demands of a rapidly growing population. As global consumption of poultry meat continues to increase, ensuring the health and welfare of broiler chickens has become more essential than ever. One key indicator of flock health and environmental quality in broiler production systems is animal behavior. Monitoring behaviors such as feeding, drinking, and general activity can provide valuable insights into welfare status, environmental conditions, and management efficiency. However, conventional methods for behavior monitoring typically rely on manual observation, which is labor-intensive, inconsistent, and impractical for large-scale or continuous surveillance in commercial settings.
Advancements in computer vision and deep learning have opened new possibilities for automating behavior analysis in livestock farming. In particular, object detection models based on convolutional neural networks (CNNs) have shown high accuracy in detecting animals and recognizing specific postures or activities. This study explores the application of YOLOv11s, a recent lightweight yet powerful object detection model, to recognize and classify broiler behaviors from top-view images captured in a real poultry farm environment. The primary aim is to develop an efficient, real-time monitoring system that can automatically detect key behaviors, reduce human intervention, and ultimately support precision poultry farming with enhanced animal welfare management.
Materials and Methods
This study was conducted at the research poultry farm of the University of Tabriz. Images of broiler chickens were captured from 48 separate pens to monitor their spatial distribution and behaviors. The top-view imaging setup helped minimize occlusions and provided a clear view of the broilers' positions and actions. The collected images were annotated manually by experts into three behavioral categories: normal, feeding, and drinking. Preprocessing steps were applied to the raw images to enhance model training; these included resizing, normalization, and data augmentation techniques such as horizontal flipping and brightness adjustment. The dataset was divided into training, validation, and testing subsets, accounting for class imbalance and ensuring a representative distribution of behaviors. Particular attention was given to annotation quality to support accurate model learning.
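The exact preprocessing tooling is not specified in the text; the following is a minimal sketch, assuming OpenCV and NumPy, of how the described resizing, normalization, horizontal flipping, and brightness adjustment could be applied, with an illustrative random train/validation/test split. The file paths, image size, split ratios, and brightness range are assumptions, not values reported by the authors.

```python
import random
from pathlib import Path

import cv2
import numpy as np

def preprocess(path, size=640):
    """Resize to a fixed input size and scale pixel values to [0, 1]."""
    img = cv2.imread(str(path))
    img = cv2.resize(img, (size, size))
    return img.astype(np.float32) / 255.0

def augment(img):
    """Horizontal flip and brightness adjustment, as described in the text.
    Note: bounding-box annotations must be mirrored when the image is flipped."""
    if random.random() < 0.5:
        img = cv2.flip(img, 1)            # horizontal flip
    factor = random.uniform(0.8, 1.2)     # brightness factor (illustrative range)
    return np.clip(img * factor, 0.0, 1.0)

# Illustrative split; the paper does not report the exact ratios
images = sorted(Path("dataset/images").glob("*.jpg"))
random.shuffle(images)
n = len(images)
train = images[: int(0.7 * n)]
val = images[int(0.7 * n): int(0.85 * n)]
test = images[int(0.85 * n):]
```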
For behavior detection, the YOLOv11s model, a lightweight variant of the YOLOv11 object detection architecture, was employed due to its high inference speed and accuracy. The model was customized and trained with fine-tuned hyperparameters to achieve optimal performance. The training process was monitored using metrics such as loss convergence and mean Average Precision (mAP) at an IoU threshold of 0.5. This setup enabled effective, real-time detection of broiler behaviors in smart poultry farming systems.
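The training configuration is not detailed in the abstract; a minimal sketch, assuming the Ultralytics Python API (where the small YOLO11 variant ships as "yolo11s.pt") and a hypothetical dataset configuration file, could look like the following. The epoch count, image size, batch size, and patience value are illustrative rather than the authors' settings.

```python
from ultralytics import YOLO

# Load the small YOLO11 variant with pretrained weights (Ultralytics naming)
model = YOLO("yolo11s.pt")

# Fine-tune on the broiler dataset; hyperparameter values here are illustrative
model.train(
    data="broiler.yaml",   # hypothetical config listing the normal, feeding, drinking classes
    epochs=100,
    imgsz=640,
    batch=16,
    patience=20,           # early stopping, consistent with the regularization described later
)

# Validate and report mAP at IoU 0.5
metrics = model.val()
print(metrics.box.map50)
```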
Results and Discussion
The YOLOv11s model was successfully trained to detect and classify three distinct broiler behaviors: feeding, drinking, and normal activity, based on top-view images collected from 48 pens in a research-oriented poultry facility. The model achieved a mean Average Precision (mAP) of 90% at an IoU threshold of 0.5, demonstrating high overall accuracy in behavior recognition. It performed particularly well in detecting feeding behavior (99% precision) and drinking behavior (89% precision), while performance for normal behavior was slightly lower (82%) due to the underrepresentation of this class during annotation. Although mild overfitting was observed during training, this issue was alleviated through regularization techniques such as dropout and early stopping, allowing the model to generalize well to unseen data.
Spatial analysis of the detections revealed clear clustering of feeding and drinking behaviors around the feeders and drinkers, respectively, while instances of normal activity were more evenly distributed across the pen space. This distribution pattern not only validates the behavioral labels but also suggests that the system can be used to assess flock dynamics and detect anomalies in movement or engagement. Furthermore, the lightweight architecture of YOLOv11s enabled real-time inference with minimal computational overhead, making it a practical solution for continuous on-farm monitoring. These results are consistent with trends reported in prior studies using deep learning for animal behavior detection, while offering a novel, efficient, and scalable approach tailored to precision poultry farming.
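The abstract does not describe how the spatial analysis was implemented; one hedged way to reproduce this kind of summary is to collect the bounding-box centers of each behavior class from the model's predictions and compare them against a known feeder or drinker position. The weights path, class order, image folder, and feeder coordinates below are assumptions for illustration only.

```python
import numpy as np
from ultralytics import YOLO

model = YOLO("runs/detect/train/weights/best.pt")        # illustrative path to trained weights
names = {0: "normal", 1: "feeding", 2: "drinking"}       # assumed class index order

centers = {v: [] for v in names.values()}
for r in model.predict(source="pen_images/", stream=True):
    for box, cls in zip(r.boxes.xywh.cpu().numpy(), r.boxes.cls.cpu().numpy()):
        centers[names[int(cls)]].append(box[:2])          # (x, y) centre of each detection

feeder_xy = np.array([320.0, 100.0])                      # hypothetical feeder position in pixels
for behavior, pts in centers.items():
    if pts:
        d = np.linalg.norm(np.array(pts) - feeder_xy, axis=1).mean()
        print(f"{behavior}: mean distance to feeder = {d:.1f} px")
```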
Conclusion
This study demonstrated the effectiveness of the YOLOv11s deep learning model for automated recognition of broiler behaviors from top-view images. The model accurately classified three key behaviors: feeding, drinking, and normal activity, using lightweight processing suitable for real-time monitoring. High detection accuracy, particularly for feeding (99%) and drinking (89%), along with acceptable performance for normal behavior (82%), confirmed the reliability of the proposed system in practical scenarios. The spatial patterns of the behaviors also aligned with expected distributions, reinforcing the validity of the detection results.
The findings suggest that integrating YOLOv11s into smart poultry farming systems can enhance real-time flock observation, reduce human labor, and improve management decision-making. The lightweight nature of the model makes it suitable for edge deployment, enabling cost-effective and scalable solutions for precision livestock monitoring. Overall, this research highlights the feasibility of using deep learning for intelligent broiler behavior analysis, contributing to improved welfare, health assessment, and efficient poultry production.
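The abstract does not state which deployment format was used; as one possible route to the edge deployment mentioned above, the trained weights could be exported to an edge-friendly format such as ONNX via the Ultralytics export call. The weights path and the choice of ONNX are assumptions for illustration.

```python
from ultralytics import YOLO

model = YOLO("runs/detect/train/weights/best.pt")  # illustrative path to trained weights
model.export(format="onnx", imgsz=640)             # ONNX chosen here only as an example edge format
```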
Acknowledgement
The authors would like to express their sincere appreciation to the staff of the KhalatPoushan Research Station at the University of Tabriz and the students of the poultry research facility for providing support and resources during the data collection process.