Research on the Relative Position Detection Method between Orchard Robots and Fruit Tree Rows

Baoxing Gu, Qin Liu, Yi Gao, Guangzhao Tian, Baohua Zhang, Haiqing Wang, He Li, Liqing Chen, Yuwei Wang, Peng Chen, Bolin Cai

Sensors (2023)

Abstract
The relative position of an orchard robot with respect to the rows of fruit trees is an important parameter for achieving autonomous navigation, and current methods for estimating the inter-row position parameters of orchard robots yield low accuracy. To address this problem, this paper proposes a machine-vision-based method for detecting the relative position of an orchard robot and the fruit tree rows. First, fruit tree trunks are identified with an improved YOLOv4 model; second, the camera coordinates of each trunk are calculated using the principle of binocular (stereo) triangulation, and the ground projection coordinates of the trunk are obtained through coordinate conversion; finally, the midpoints of the projected coordinates on the two sides are paired, the navigation path is obtained by least-squares linear fitting, and the position parameters of the orchard robot are calculated from the fitted path. The experimental results show that the average accuracy and average recall of the improved YOLOv4 model for trunk detection are 5.92% and 7.91% higher, respectively, than those of the original YOLOv4 model. The average errors of the heading angle and lateral deviation estimates obtained with the proposed method are 0.57 degrees and 0.02 m, respectively. The method can accurately calculate heading angle and lateral deviation values at different positions between rows and provides a reference for the autonomous visual navigation of orchard robots.
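The geometric pipeline summarized in the abstract (binocular triangulation of trunk positions, ground-plane projection, midpoint pairing of the two rows, least-squares path fitting, and heading-angle / lateral-deviation computation) can be illustrated with a minimal sketch. This is not the authors' code: the focal length, baseline, principal point, and trunk pixel coordinates below are illustrative assumptions.

```python
# Minimal sketch of the inter-row position estimation steps described in the abstract.
# All camera parameters and trunk detections are assumed values for illustration only.
import numpy as np

FOCAL_PX = 800.0    # assumed focal length in pixels
BASELINE_M = 0.12   # assumed stereo baseline in metres
CX, CY = 640, 360   # assumed principal point

def triangulate(u_left, u_right, v):
    """Recover camera-frame (X, Y, Z) of a trunk from matched pixel columns."""
    disparity = u_left - u_right
    z = FOCAL_PX * BASELINE_M / disparity      # depth along the optical axis
    x = (u_left - CX) * z / FOCAL_PX           # lateral offset
    y = (v - CY) * z / FOCAL_PX                # vertical offset
    return np.array([x, y, z])

def ground_projection(p_cam):
    """Drop the height component: project the camera point onto the ground plane."""
    return np.array([p_cam[0], p_cam[2]])      # (lateral, forward)

# Hypothetical matched trunk detections (u_left, u_right, v) for the two tree rows.
left_row  = [triangulate(300, 260, 400), triangulate(320, 285, 380)]
right_row = [triangulate(980, 940, 400), triangulate(955, 920, 380)]

# Pair trunks across the rows and take midpoints of their ground projections.
midpoints = np.array([
    (ground_projection(l) + ground_projection(r)) / 2.0
    for l, r in zip(left_row, right_row)
])

# Least-squares fit x = a*z + b through the midpoints (z = forward distance).
a, b = np.polyfit(midpoints[:, 1], midpoints[:, 0], 1)

heading_angle_deg = np.degrees(np.arctan(a))   # angle between robot heading and path
lateral_deviation_m = b                        # lateral offset of robot from the path

print(f"heading angle: {heading_angle_deg:.2f} deg, "
      f"lateral deviation: {lateral_deviation_m:.3f} m")
```

At the robot's position (forward distance z = 0), the fitted line's intercept gives the lateral deviation and its slope gives the heading angle, which mirrors how the abstract derives the two position parameters from the fitted navigation path.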
Keywords
orchard robot, autonomous navigation, positional parameters, machine vision, YOLO