Journal of South China University of Technology (Natural Science Edition), 2025, Vol. 53, Issue (7): 60-69. doi: 10.12141/j.issn.1000-565X.240591

• Electronics, Communication & Automation Technology •

Real-Time Feeding Target Recognition Method Based on SAM Optimization

ZHANG Qin, WENG Kaihang   

  1. School of Mechanical and Automotive Engineering, South China University of Technology, Guangzhou 510640, Guangdong, China
  • Received: 2024-12-20 Online: 2025-07-25 Published: 2025-02-28
  • About author: ZHANG Qin (b. 1964), female, Ph.D., professor, mainly engaged in research on robots and their applications. E-mail: zhangqin@scut.edu.cn
  • Supported by:
    the Natural Science Foundation of Hainan Province (324MS095)

Abstract:

Feeding-assistance robots are key equipment for promoting the modernization and transformation of animal husbandry. Rapid and accurate identification of feeding targets is essential for intelligent feed pushing, and balancing segmentation accuracy against operational efficiency is crucial to the overall performance of recognition algorithms, making it an important topic in intelligent livestock management. To address the mismatch between segmentation accuracy and processing efficiency in current dairy cow feeding target recognition methods, this paper proposes a real-time feeding target recognition method (RTFTR) based on an optimized Segment Anything Model (SAM). Built on the SAM-det architecture, RTFTR first introduces a lightweight image encoder and object detector, together with a parallelized buffer-queue design, to balance the operational efficiency of each module and raise inference speed. It then employs a High-Quality (HQ) token mechanism to strengthen the decoding capacity of the feature space, optimizes the mask decoder, and applies stage-wise training tailored to feeding targets to improve segmentation accuracy. Experimental results show that the proposed method maintains inference efficiency while improving segmentation accuracy. In the task of cow feeding target recognition, the method achieves segmentation accuracies of 98.7% for cows, 96.4% for feed, and 99.2% for the feed bunk, with an overall average accuracy of 98.1% and a processing speed of 52.9 f/s, meeting the requirements of cow feeding target recognition in complex environments and under limited robotic computational resources.
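The abstract does not detail the parallelized buffer-queue design; the sketch below is a minimal, hypothetical Python illustration of how such a two-stage producer-consumer pipeline could decouple a lightweight detector stage from a mask-decoder stage so that the slower stage never stalls frame acquisition. All names (run_detector, run_decoder, detect_fn, segment_fn) are illustrative placeholders and do not come from the paper.

    # Minimal sketch (not the authors' code): buffer queues decouple a lightweight
    # detector stage from a mask-decoder stage; both stages run in parallel threads.
    import queue
    import threading

    FRAME_QUEUE = queue.Queue(maxsize=4)   # frames waiting for detection
    BOX_QUEUE = queue.Queue(maxsize=4)     # (frame, boxes) waiting for mask decoding
    STOP = object()                        # sentinel that shuts the pipeline down

    def run_detector(detect_fn):
        """Consume frames, run the lightweight object detector, forward box prompts."""
        while True:
            frame = FRAME_QUEUE.get()
            if frame is STOP:
                BOX_QUEUE.put(STOP)
                break
            BOX_QUEUE.put((frame, detect_fn(frame)))

    def run_decoder(segment_fn, on_result):
        """Consume (frame, boxes) pairs and decode a mask for each prompted box."""
        while True:
            item = BOX_QUEUE.get()
            if item is STOP:
                break
            frame, boxes = item
            on_result(frame, [segment_fn(frame, box) for box in boxes])

    def run_pipeline(frames, detect_fn, segment_fn, on_result):
        """Feed frames through the two-stage pipeline with bounded buffer queues."""
        workers = [
            threading.Thread(target=run_detector, args=(detect_fn,)),
            threading.Thread(target=run_decoder, args=(segment_fn, on_result)),
        ]
        for w in workers:
            w.start()
        for frame in frames:
            FRAME_QUEUE.put(frame)      # blocks when the buffer is full (back-pressure)
        FRAME_QUEUE.put(STOP)
        for w in workers:
            w.join()

    if __name__ == "__main__":
        # Dummy stand-ins so the sketch runs end to end without any model weights.
        frames = [f"frame_{i}" for i in range(8)]
        detect = lambda frame: [(0, 0, 10, 10)]                # placeholder detector
        segment = lambda frame, box: f"mask({frame}, {box})"   # placeholder mask decoder
        run_pipeline(frames, detect, segment,
                     lambda frame, masks: print(frame, masks))

The bounded queues give back-pressure: whichever stage is momentarily slower simply causes the upstream stage to wait, which is one plausible way to balance the operational efficiency of the modules as the abstract describes.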

Key words: feeding-assistance robot, segment anything model, cow feeding, target recognition, segmentation accuracy
