Poultry esophagus detection using RetinaNet and Mask Region-based Convolutional Neural Network object detection models
Main Author:
Format: Thesis
Language: English
Published: 2022
Subjects:
Online Access: http://eprints.utm.my/id/eprint/99648/1/NorAziahAmirahMMJIIT2022.pdf
Summary: The Syariah Compliance Automated Chicken Processing System (SYCUT) monitors the slaughtering process to ensure that chickens are slaughtered in accordance with Islamic sharia. SYCUT uses vision inspection technology to determine whether slaughtered chickens are halal. This technology includes a detection module that checks whether the esophagus of each chicken has been cut correctly. The researcher originally employed the Viola-Jones object detection framework to train the detection module, but it performed poorly on images in which the esophagus was bloodied, blurred, or occluded, resulting in a low detection rate. In addition, the conventional method requires image preprocessing tools such as low-pass filtering and Otsu's thresholding to improve image quality before detection, which adds computational cost. In this study, the researcher divided the input images into categories to reduce misclassification and aid data annotation, and then proposed a deep learning-based poultry esophagus detection system to improve the existing SYCUT algorithm. The proposed approach uses the RetinaNet and Mask R-CNN models, which perform object detection and segmentation on a single image. The proposed method was compared with the previous conventional SYCUT algorithm and detected bloodied and occluded images more accurately, improving overall esophagus detection performance from 68.65 per cent to 92.77 per cent. SYCUT performs efficiently even in uncontrolled working environments owing to the effectiveness of the developed deep learning method. However, a limitation of this deep learning approach is that it requires a large amount of training data, and this research improves detection only for certain image types, such as bloodied and occluded images. Future work should improve the precision-recall of the system and implement real-time esophagus detection in real or simulated environments.
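For illustration, below is a minimal sketch of how a Mask R-CNN detector of the kind described in the summary might be applied to a single poultry image. The thesis does not specify its implementation framework; this sketch assumes PyTorch/torchvision, and the checkpoint filename, class count, and score threshold are hypothetical placeholders, not details from the thesis.

```python
# Minimal sketch: running a Mask R-CNN detector on one poultry image.
# Assumptions (not from the thesis): torchvision's Mask R-CNN with a
# ResNet-50 FPN backbone, a hypothetical fine-tuned checkpoint
# "esophagus_maskrcnn.pth", and two classes (background + esophagus).
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image


def load_model(checkpoint_path="esophagus_maskrcnn.pth", num_classes=2):
    # Build the architecture without pretrained weights, then load the
    # (hypothetical) fine-tuned state dict.
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(
        weights=None, num_classes=num_classes
    )
    state = torch.load(checkpoint_path, map_location="cpu")
    model.load_state_dict(state)
    model.eval()
    return model


def detect_esophagus(model, image_path, score_threshold=0.5):
    image = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        output = model([image])[0]  # dict with boxes, labels, scores, masks
    keep = output["scores"] >= score_threshold
    return {
        "boxes": output["boxes"][keep],    # [N, 4] bounding boxes
        "masks": output["masks"][keep],    # [N, 1, H, W] soft segmentation masks
        "scores": output["scores"][keep],  # detection confidences
    }


if __name__ == "__main__":
    model = load_model()
    result = detect_esophagus(model, "sample_cut.jpg")
    # A detection above the threshold indicates a visible esophagus region;
    # SYCUT's actual compliance decision logic is not reproduced here.
    print(f"{len(result['scores'])} esophagus region(s) detected")
```

Because the detector returns both bounding boxes and per-instance masks in one forward pass, no separate preprocessing stage (such as the low-pass filtering and Otsu's thresholding required by the conventional method) is needed before detection.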