Hierarchical Planning for Long-Horizon Manipulation with Geometric and Symbolic Scene Graphs

2021 IEEE International Conference on Robotics and Automation (ICRA 2021)

Cited by 82 | Views 84
Abstract
We present a visually grounded hierarchical planning algorithm for long-horizon manipulation tasks. Our algorithm offers a joint framework of neuro-symbolic task planning and low-level motion generation conditioned on the specified goal. At the core of our approach is a two-level scene graph representation, namely a geometric scene graph and a symbolic scene graph. This hierarchical representation serves as a structured, object-centric abstraction of manipulation scenes. Our model uses graph neural networks to process these scene graphs for predicting high-level task plans and low-level motions. We demonstrate that our method scales to long-horizon tasks and generalizes well to novel task goals. We validate our method on a kitchen storage task in both physical simulation and the real world. Experiments show that our method achieves an over 70% success rate and a nearly 90% subgoal completion rate on the real robot, while being four orders of magnitude faster in computation time than a standard search-based task-and-motion planner.
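The abstract describes processing an object-centric scene graph with a graph neural network to score high-level actions. The following is a minimal, hedged sketch of that general pattern, not the authors' implementation: the node dimensions, randomly initialized weights, edge list, and action names are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's code): a geometric scene graph whose
# nodes hold per-object feature vectors and whose directed edges encode
# pairwise spatial relations. One round of GNN-style message passing
# aggregates neighbor information; a linear head then scores a small set of
# symbolic actions. Dimensions, weights, and actions are assumptions.

rng = np.random.default_rng(0)

NODE_DIM, MSG_DIM = 6, 8
ACTIONS = ["pick(obj)", "place(obj, container)", "open(container)"]

# Toy scene: 3 objects, each with a NODE_DIM-dimensional feature vector.
nodes = rng.normal(size=(3, NODE_DIM))
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]  # directed spatial-relation edges

# Randomly initialized matrices stand in for learned parameters.
W_msg = rng.normal(size=(2 * NODE_DIM, MSG_DIM))
W_upd = rng.normal(size=(NODE_DIM + MSG_DIM, NODE_DIM))
W_out = rng.normal(size=(NODE_DIM, len(ACTIONS)))

def relu(x):
    return np.maximum(x, 0.0)

# Message passing: each edge (i, j) sends a message computed from both
# endpoint features; each node sums its incoming messages.
incoming = np.zeros((len(nodes), MSG_DIM))
for i, j in edges:
    incoming[j] += relu(np.concatenate([nodes[i], nodes[j]]) @ W_msg)

# Node update: combine the old features with the aggregated messages.
updated = relu(np.concatenate([nodes, incoming], axis=1) @ W_upd)

# Task-plan head: pool node embeddings and score the symbolic actions.
scene_embedding = updated.mean(axis=0)
scores = scene_embedding @ W_out
print("predicted next action:", ACTIONS[int(np.argmax(scores))])
```

In the paper's full system, such a predicted symbolic action would then condition a separate low-level motion generator; here the sketch only shows the graph-to-action scoring step.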
Keywords
search-based task-and-motion planner, long-horizon manipulation tasks, visually grounded hierarchical planning algorithm, kitchen storage task, high-level task plans, graph neural networks, manipulation scenes, hierarchical representation, symbolic scene graph, geometric scene graph, two-level scene graph representation, low-level motion generation, neuro-symbolic task planning