Abstract: In large-scale broiler farms, broiler behavior is usually observed and analyzed by feeders or professional veterinarians to assess the birds' health and the state of the rearing environment. However, this method is time-consuming and subjective. Moreover, in caged environments, the high stocking density and severe mutual occlusion make the visual features of behavior inconspicuous, so traditional detection algorithms cannot accurately identify the behavioral characteristics of chickens. Therefore, an improved object detection algorithm for behavior detection of caged white-feather broilers was proposed. The algorithm consists of two modules: a multi-scale detail feature fusion module (MDF) and an object relation inference module (ORI). The MDF module fully exploits the multi-scale detail features contained in the shallow feature maps of the feature extraction network and fuses them into the feature maps responsible for detection at the corresponding scales, achieving effective transmission and supplementation of detail features. The ORI module reasons over the positional relationships between objects, enabling the model to exploit potential inter-object relations to assist detection. To verify the effectiveness of the proposed algorithm, extensive comparative experiments were conducted on both an authoritative public object detection dataset and a self-built behavior detection dataset collected in a real large-scale caged white-feather broiler breeding environment. The experimental results showed that the proposed algorithm achieved the best detection accuracy compared with other state-of-the-art models on both the COCO dataset and the self-built dataset.
For the detection of feeding, drinking, moving, and mouth-opening behaviors, which are key indicators of broiler health, the algorithm achieved accuracy rates of 99.6%, 98.7%, 99.2%, and 98.3%, respectively.
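The MDF idea described above (injecting shallow, high-resolution detail features into the deeper map used for detection at the matching scale) can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact design: the function name, the stride-based down-sampling, and the element-wise addition are all assumptions made for clarity.

```python
import numpy as np

def fuse_detail_features(shallow, deep, stride=2):
    """Hypothetical sketch of multi-scale detail feature fusion (MDF):
    down-sample a high-resolution shallow feature map so it matches the
    spatial size of the deeper detection-scale map, then add it in.
    Feature maps are (C, H, W) arrays; the plain strided sub-sampling and
    additive fusion are illustrative assumptions only."""
    # Sub-sample the shallow map so its resolution matches the deep map.
    pooled = shallow[:, ::stride, ::stride]
    # Crop in case of off-by-one size mismatches after striding.
    pooled = pooled[:, :deep.shape[1], :deep.shape[2]]
    # Element-wise addition injects detail cues into the detection map.
    return deep + pooled
```

In a real network the shallow map would typically pass through a learned projection (e.g. a 1x1 convolution) before fusion; the sketch omits that to keep the data flow visible.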
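The ORI module's use of positional relationships between objects can likewise be sketched as a relation-weighting step: pairwise geometric cues between detected boxes are turned into attention weights so each object's prediction can be refined by its neighbors. The distance-based kernel below is an illustrative assumption, not the paper's actual relation formula.

```python
import numpy as np

def relation_weights(boxes):
    """Hypothetical sketch of positional relation inference (ORI):
    convert pairwise centre distances between detected boxes into
    softmax attention weights. `boxes` is an (N, 4) array of
    [x1, y1, x2, y2]; the negative-distance logit is an assumption."""
    centers = np.stack([(boxes[:, 0] + boxes[:, 2]) / 2,
                        (boxes[:, 1] + boxes[:, 3]) / 2], axis=1)
    # Pairwise Euclidean distances between object centres.
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    # Closer objects receive larger pre-softmax relation logits.
    logits = -d
    np.fill_diagonal(logits, -np.inf)  # an object does not attend to itself
    # Row-wise softmax over the other objects.
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)
```

The resulting weight matrix could then reweight per-object features or scores, letting context from nearby birds (e.g. a cluster at the feeder) support the classification of a partially occluded one.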