Trajectory Tracking of Variable Centroid Objects Based on Fusion of Vision and Force Perception

IEEE TRANSACTIONS ON CYBERNETICS (2023)

Abstract
Compared with the dynamic throwing and catching of traditional rigid objects by a robot, the in-flight trajectory of a thrown nonrigid object, especially a variable centroid object, is more challenging to predict and track. This article proposes a variable centroid trajectory tracking network (VCTTN) that fuses vision and force information by introducing force data from the throwing process into the vision neural network. A model-free robot control system based on VCTTN is developed to achieve highly precise prediction and tracking using only partial in-flight vision. A dataset of flight trajectories of variable centroid objects thrown by a robot arm is collected to train VCTTN. The experimental results show that trajectory prediction and tracking with the vision-force VCTTN are superior to those obtained with traditional vision-only perception and exhibit excellent tracking performance.
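The abstract describes fusing partial in-flight vision with force data from the throwing process inside a neural network that predicts the object's trajectory. As a rough illustration only, the sketch below shows one way such a vision-force fusion could be structured in PyTorch; the recurrent encoders, feature dimensions, concatenation fusion, and prediction horizon are assumptions for illustration, not the authors' actual VCTTN architecture.

```python
# Illustrative sketch (not the authors' VCTTN): fuse a visual feature sequence
# with a force/torque sequence to regress future in-flight 3-D positions.
# All layer sizes and the concatenation-based fusion are assumptions.
import torch
import torch.nn as nn

class VisionForceFusionNet(nn.Module):
    def __init__(self, vision_dim=64, force_dim=6, hidden_dim=128, horizon=10):
        super().__init__()
        # Separate recurrent encoders for the two modalities.
        self.vision_enc = nn.LSTM(vision_dim, hidden_dim, batch_first=True)
        self.force_enc = nn.LSTM(force_dim, hidden_dim, batch_first=True)
        # Fused features regress a short horizon of future 3-D positions.
        self.head = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, horizon * 3),
        )
        self.horizon = horizon

    def forward(self, vision_seq, force_seq):
        # vision_seq: (B, T_v, vision_dim) partial in-flight visual features
        # force_seq:  (B, T_f, force_dim) wrist force/torque from the throw
        _, (hv, _) = self.vision_enc(vision_seq)
        _, (hf, _) = self.force_enc(force_seq)
        fused = torch.cat([hv[-1], hf[-1]], dim=-1)
        return self.head(fused).view(-1, self.horizon, 3)

# Usage example with random tensors standing in for real sensor data.
if __name__ == "__main__":
    net = VisionForceFusionNet()
    vision = torch.randn(4, 20, 64)   # 20 visual frames per sample
    force = torch.randn(4, 50, 6)     # 50 force/torque samples per throw
    print(net(vision, force).shape)   # torch.Size([4, 10, 3])
```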
Keywords
Robots,Trajectory,Robot sensing systems,Manipulators,Trajectory tracking,Planning,Cameras,Neural network,trajectory tracking,variable centroid object,vision-force fusion