Intelligent Monitoring of Broiler Behavior in Poultry Farms Using Deep Learning

Article type: Research Article

Authors

1 Department of Biosystem Engineering, Faculty of Agriculture, University of Tabriz, Tabriz, Iran

2 Faculty Member, University of Tabriz

3 Department of Animal Science, Faculty of Agriculture, University of Tabriz, Tabriz, Iran

Abstract

Poultry production plays a vital role in supplying affordable protein to feed the world's growing population. Optimal management of poultry houses is one of the key factors in efficient, high-quality poultry meat production. The spatial distribution of broiler chickens can serve as a management indicator for identifying the health of, or problems within, the flock. Currently, the multiple daily inspections of chicken distribution in poultry houses are performed manually, a process that is not only time-consuming and exhausting but also prone to human error. Machine vision methods based on artificial intelligence are among the approaches capable of overcoming the challenges of manual methods in poultry management systems. In this study, deep learning, a branch of artificial intelligence, was investigated with the aim of detecting the status of broiler chickens. After top-view imaging of 48 pens (holding enclosures) at the research poultry farm of the University of Tabriz, several preprocessing steps were applied to the images to improve the training process and increase the accuracy of the proposed model. Next, after refining and customizing the model architecture and the training procedure, the small, lightweight variant of YOLO version 11 (YOLOv11s), one of the newest and most prominent deep learning models, was used to train and evaluate broiler behavior recognition. This model and variant were chosen to achieve high accuracy together with sufficient speed for real-time processing. The gradual decrease in training and validation loss indicated that the model reached a balance and was trained optimally. YOLOv11s achieved a mean average precision of 90% (mAP@0.5) and delivered notable performance in recognizing broiler behaviors, performing particularly well in detecting feeding (99% precision) and drinking (89% precision). However, the lower accuracy in detecting normal behavior (82%) was mainly due to insufficient labeling of this behavior. Overall, the results showed that YOLOv11s, by balancing accuracy and speed, is an efficient option for real-time monitoring of broiler behavior in smart poultry management systems.


Article Title [English]

Intelligent Monitoring of Broiler Behavior in Poultry Farms Using Deep Learning

Authors [English]

  • Hossein Akhtari 1
  • Hossein Navid 2
  • Abasalt Bazrafshan 1
  • Majid Olyayee 3
  • Ali Ghaffarnezhad 1
1 Department of Biosystem Engineering, Faculty of Agriculture, University of Tabriz, Tabriz, Iran
2 Department of Biosystem Engineering, Faculty of Agriculture, University of Tabriz, Tabriz, Iran
3 Department of Animal Science, Faculty of Agriculture, University of Tabriz, Tabriz, Iran
Abstract [English]

Introduction
Poultry production serves as a crucial sector in the global food supply chain, offering a cost-effective, protein-rich source of nutrition to meet the demands of a rapidly growing population. As global consumption of poultry meat continues to increase, ensuring the health and welfare of broiler chickens has become more essential than ever. One key indicator of flock health and environmental quality in broiler production systems is animal behavior. Monitoring behaviors such as feeding, drinking, and general activity can provide valuable insights into welfare status, environmental conditions, and management efficiency. However, conventional methods for behavior monitoring typically rely on manual observation, which is labor-intensive, inconsistent, and impractical for large-scale or continuous surveillance in commercial settings.
Advancements in computer vision and deep learning have opened new possibilities for automating behavior analysis in livestock farming. In particular, object detection models based on convolutional neural networks (CNNs) have shown high accuracy in detecting animals and recognizing specific postures or activities. This study explores the application of YOLOv11s, a recent lightweight yet powerful object detection model, to recognize and classify broiler behaviors from top-view images captured in a real poultry farm environment. The primary aim is to develop an efficient, real-time monitoring system that can automatically detect key behaviors, reduce human intervention, and ultimately support precision poultry farming with enhanced animal welfare management.
 
Materials and Methods
This study was conducted at the research poultry farm of the University of Tabriz. Images of broiler chickens were captured from 48 separate pens to monitor their spatial distribution and behaviors. The top-view imaging setup helped minimize occlusions and provided a clear view of the broilers' positions and actions. The collected images were manually annotated by experts into three behavioral categories: normal, feeding, and drinking. Preprocessing steps were applied to the raw images to enhance model training, including resizing, normalization, and data augmentation techniques such as horizontal flipping and brightness adjustment. The dataset was divided into training, validation, and testing subsets, accounting for class imbalance and ensuring a representative distribution of behaviors. Particular attention was given to annotation quality to support accurate model learning.
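A class-imbalance-aware split as described above is commonly done per class (stratified). The following is a minimal sketch in plain Python, assuming image-level behavior labels; the function name, ratios, and seed are illustrative, not taken from the study's actual pipeline:

```python
import random
from collections import defaultdict

def stratified_split(samples, train=0.7, val=0.15, seed=42):
    """Split (image_id, label) pairs so each behavior class keeps
    roughly the same proportion in the train/val/test subsets."""
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for image_id, label in samples:
        by_label[label].append(image_id)
    splits = {"train": [], "val": [], "test": []}
    for label, ids in by_label.items():
        rng.shuffle(ids)                      # shuffle within each class
        n_train = int(len(ids) * train)
        n_val = int(len(ids) * val)
        splits["train"] += [(i, label) for i in ids[:n_train]]
        splits["val"] += [(i, label) for i in ids[n_train:n_train + n_val]]
        splits["test"] += [(i, label) for i in ids[n_train + n_val:]]
    return splits
```

Splitting per class ensures that an underrepresented behavior (such as "normal" in this dataset) still appears in all three subsets rather than being absorbed entirely into the training set by chance.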
For behavior detection, the YOLOv11s model, a lightweight version of the YOLOv11 object detection architecture, was employed due to its high inference speed and accuracy. The model was customized and trained with fine-tuned hyperparameters to achieve optimal performance. The training process was monitored using metrics such as loss convergence and mean Average Precision (mAP) at an IoU threshold of 0.5. This setup enabled effective, real-time detection of broiler behaviors in smart poultry farming systems.
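The mAP@0.5 metric used above hinges on the intersection-over-union (IoU) between a predicted box and a ground-truth box: a prediction counts as a true positive only if its IoU reaches 0.5. A minimal sketch of that test (function names are illustrative, not from the study's code):

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two boxes given as (x1, y1, x2, y2)."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)  # overlap area, 0 if disjoint
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def is_true_positive_at_05(pred_box, truth_box):
    """A detection matches a ground-truth box at the mAP@0.5 threshold."""
    return iou(pred_box, truth_box) >= 0.5
```

Per-class precision (as reported for feeding, drinking, and normal behavior) is then the fraction of a class's predictions that pass this IoU test with the correct class label.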
 
Results and Discussion
The YOLOv11s model was successfully trained to detect and classify three distinct broiler behaviors (feeding, drinking, and normal activity) based on top-view images collected from 48 pens in a research-oriented poultry facility. The model achieved a mean Average Precision (mAP) of 90% at an IoU threshold of 0.5, demonstrating high overall accuracy in behavior recognition. It performed particularly well in detecting feeding behavior (99% precision) and drinking behavior (89% precision), while performance for normal behavior was slightly lower (82%) due to the underrepresentation of this class during annotation. Although mild overfitting was observed during training, it was alleviated through regularization techniques such as dropout and early stopping, allowing the model to generalize well to unseen data.
Spatial analysis of detections revealed clear clustering of feeding and drinking behaviors around feeders and drinkers, respectively, while instances of normal activity were more evenly distributed across the pen space. This distribution pattern not only validates the behavioral labels but also suggests that the system can be used to assess flock dynamics and detect anomalies in movement or engagement. Furthermore, the lightweight architecture of YOLOv11s enabled real-time inference with minimal computational overhead, making it a practical solution for continuous on-farm monitoring. These results are consistent with trends reported in prior studies using deep learning for animal behavior detection, while offering a novel, efficient, and scalable approach tailored for precision poultry farming.
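The clustering check described above can be approximated by assigning each detection's center to its nearest pen landmark and comparing that assignment against the predicted behavior label. The sketch below is a hedged illustration only; the landmark names and coordinates are hypothetical, not measurements from the study's pens:

```python
import math

def nearest_landmark(center, landmarks):
    """Assign a detection center (x, y) to the closest pen landmark.

    landmarks maps a name (e.g. "feeder", "drinker") to its (x, y)
    position in the same image coordinate frame as the detections.
    """
    return min(landmarks, key=lambda name: math.dist(center, landmarks[name]))
```

Counting how often feeding detections land nearest the feeder (and drinking detections nearest the drinker) gives a simple plausibility score for the behavioral labels, in the spirit of the spatial validation reported here.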
 
Conclusion
This study demonstrated the effectiveness of the YOLOv11s deep learning model for automated recognition of broiler behaviors from top-view images. The model accurately classified three key behaviors (feeding, drinking, and normal activity) using lightweight processing suitable for real-time monitoring. High detection accuracy, particularly for feeding (99%) and drinking (89%), along with acceptable performance for normal behavior (82%), confirmed the reliability of the proposed system in practical scenarios. Spatial patterns of the behaviors also aligned with expected distributions, reinforcing the validity of the detection results. The findings suggest that integrating YOLOv11s into smart poultry farming systems can enhance real-time flock observation, reduce human labor, and improve management decision-making. The lightweight nature of the model makes it suitable for edge deployment, enabling cost-effective and scalable solutions for precision livestock monitoring. Overall, this research highlights the feasibility of using deep learning for intelligent broiler behavior analysis, contributing to better welfare, health assessment, and efficient poultry production.
 
Acknowledgement
The authors would like to express their sincere appreciation to the staff of the KhalatPoushan Research Station at the University of Tabriz and the students of the poultry research facility for providing support and resources during the data collection process.

کلیدواژه‌ها [English]

  • Machine vision
  • Broiler behavior
  • Automated monitoring
  • Smart agriculture
  • Smart poultry
  • Deep learning