Slip Detection with Combined Tactile and Visual Information

2018 IEEE International Conference on Robotics and Automation (ICRA), 2018

Cited by 128
Abstract
Slip detection plays a vital role in robotic manipulation and has long been a challenging problem in the robotics community. In this paper, we propose a new method based on a deep neural network (DNN) to detect slip. The training data are acquired with a GelSight tactile sensor and a camera mounted on a gripper while a robot arm grasps and lifts 94 daily objects with different grasping forces and grasping positions. The DNN is trained to classify whether a slip occurred or not. To evaluate the performance of the DNN, we test it on 10 unseen objects in 152 grasps. A detection accuracy as high as 88.03% is achieved. It is anticipated that the accuracy can be further improved with a larger dataset. This method helps robots make stable grasps and can be widely applied to automatic force control, grasping strategy selection, and fine manipulation.
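Since the abstract describes a learned classifier over synchronized tactile and visual image sequences, a minimal sketch of one plausible setup is given below. This is an assumption-laden illustration in PyTorch (two small CNN encoders whose per-frame features are fused and fed to an LSTM); the class name `SlipDetector`, the backbone, feature sizes, and sequence length are hypothetical and not the paper's actual architecture.

```python
# Hypothetical sketch: a two-stream network that fuses sequences of tactile
# (GelSight) images and external-camera images and classifies slip vs. no slip.
# Architecture details (backbones, hidden sizes, sequence length) are assumptions,
# not the paper's reported configuration.
import torch
import torch.nn as nn


class SlipDetector(nn.Module):
    def __init__(self, feat_dim=256, hidden_dim=128):
        super().__init__()
        # One CNN encoder per modality (same structure, separate weights).
        self.tactile_cnn = self._make_encoder(feat_dim)
        self.visual_cnn = self._make_encoder(feat_dim)
        # LSTM over the fused per-frame features captures temporal cues of slip.
        self.lstm = nn.LSTM(input_size=2 * feat_dim, hidden_size=hidden_dim,
                            batch_first=True)
        self.classifier = nn.Linear(hidden_dim, 2)  # slip / no slip

    @staticmethod
    def _make_encoder(feat_dim):
        return nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )

    def forward(self, tactile_seq, visual_seq):
        # tactile_seq, visual_seq: (batch, time, 3, H, W)
        b, t = tactile_seq.shape[:2]
        tac = self.tactile_cnn(tactile_seq.flatten(0, 1)).view(b, t, -1)
        vis = self.visual_cnn(visual_seq.flatten(0, 1)).view(b, t, -1)
        fused = torch.cat([tac, vis], dim=-1)   # per-frame feature fusion
        out, _ = self.lstm(fused)
        return self.classifier(out[:, -1])      # logits from the last time step


if __name__ == "__main__":
    model = SlipDetector()
    tactile = torch.randn(4, 8, 3, 64, 64)  # 8-frame tactile image sequence
    visual = torch.randn(4, 8, 3, 64, 64)   # synchronized camera frames
    logits = model(tactile, visual)
    print(logits.shape)  # torch.Size([4, 2])
```

In a setup like this, the binary slip label from each grasp-and-lift trial would supervise the classifier with a standard cross-entropy loss; how the tactile and visual streams are actually encoded and fused in the paper is described in the full text.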
Keywords
robot arm,grasping positions,slip detection,visual information,robotic manipulation,deep neural network,GelSight tactile sensor,grasping forces,DNN training,grasp stability,tactile information,gripper,camera-based tactile sensor,image sequences,image capture