RFTrans: Leveraging Refractive Flow of Transparent Objects for Surface Normal Estimation and Manipulation

IEEE ROBOTICS AND AUTOMATION LETTERS (2024)

Abstract
Transparent objects are ubiquitous in daily life, making it important to teach robots to interact with them. However, this is challenging because reflective and refractive effects cause depth cameras to fail to provide accurate geometry measurements. To address this problem, this paper introduces RFTrans, an RGB-D-based method for surface normal estimation and manipulation of transparent objects. By leveraging refractive flow as an intermediate representation, the proposed method circumvents the drawbacks of directly predicting geometry (e.g., surface normals) from images and helps bridge the sim-to-real gap. It integrates RFNet, which predicts refractive flow, object mask, and boundaries, followed by F2Net, which estimates surface normals from the refractive flow. To enable manipulation, a global optimization module takes in these predictions, refines the raw depth, and constructs a point cloud with normals. An off-the-shelf analytical grasp planning algorithm is then applied to generate grasp poses. We build a synthetic dataset with physically plausible ray-tracing rendering techniques to train the networks. Results show that the proposed method, trained only on the synthetic dataset, consistently outperforms the baseline method by a large margin on both synthetic and real-world benchmarks. Finally, a real-world robot grasping task achieves an 83% success rate, demonstrating that refractive flow can enable direct sim-to-real transfer.
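The abstract describes a two-stage pipeline: RFNet maps an RGB image to refractive flow, an object mask, and boundaries, and F2Net maps the refractive flow to per-pixel surface normals, after which a global optimization step and an analytical grasp planner complete the system. The sketch below is a minimal, hypothetical PyTorch illustration of that data flow only; the module architectures, layer sizes, and head names are placeholder assumptions rather than the authors' implementation, and the depth refinement and grasp planning stages are omitted.

```python
# Hypothetical sketch of the RFTrans-style data flow (not the authors' code).
import torch
import torch.nn as nn

class RFNet(nn.Module):
    """Placeholder RFNet: predicts refractive flow (2ch), object mask (1ch),
    and boundary map (1ch) from an RGB image."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        )
        self.flow_head = nn.Conv2d(32, 2, 1)
        self.mask_head = nn.Conv2d(32, 1, 1)
        self.boundary_head = nn.Conv2d(32, 1, 1)

    def forward(self, rgb):
        feat = self.backbone(rgb)
        flow = self.flow_head(feat)
        mask = torch.sigmoid(self.mask_head(feat))
        boundary = torch.sigmoid(self.boundary_head(feat))
        return flow, mask, boundary

class F2Net(nn.Module):
    """Placeholder F2Net: estimates a unit surface normal (3ch) per pixel
    from the predicted refractive flow."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 1),
        )

    def forward(self, flow):
        normals = self.net(flow)
        return nn.functional.normalize(normals, dim=1)  # unit-length normals

# Usage: RGB frame in, refractive flow and normals out. Depth refinement and
# grasp pose generation (global optimization + analytical planner) would follow.
rgb = torch.rand(1, 3, 240, 320)
rfnet, f2net = RFNet(), F2Net()
flow, mask, boundary = rfnet(rgb)
normals = f2net(flow)
print(flow.shape, normals.shape)  # torch.Size([1, 2, 240, 320]) torch.Size([1, 3, 240, 320])
```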
Keywords
Perception for grasping and manipulation, RGB-D perception