Multi-Uncertainty Captured Multi-Robot Lidar Odometry and Mapping Framework for Large-Scale Environments

Unmanned Systems (2023)

Abstract
Multi-robot simultaneous localization and mapping (MR-SLAM) is of great importance for enhancing the efficiency of large-scale environment exploration. Despite remarkable advances in cooperation schemes, there is a critical lack of approaches that handle the multiple uncertainties inherent to MR-SLAM in large-scale environments. This paper proposes a multi-uncertainty captured multi-robot lidar odometry and mapping (MUC-LOAM) framework to quantify and exploit the uncertainties of feature points and robot mutual poses in large-scale environments. A hybrid weighting strategy for pose update is integrated into MUC-LOAM to handle feature uncertainty arising from distance changes and dynamic objects. A Bayesian Neural Network (BNN) is devised to capture mutual pose uncertainty, and covariance propagation through the quaternion-to-Euler-angle conversion is then leveraged to filter out unreliable mutual poses. A further covariance propagation through the coordinate transformations in nonlinear optimization improves the accuracy of map merging. The feasibility and enhanced robustness of the proposed framework for large-scale exploration are validated on public datasets and in real-world experiments.
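
The abstract does not spell out how the quaternion covariance is propagated into Euler-angle space before gating mutual poses; a standard first-order approach maps the quaternion covariance through the Jacobian of the quaternion-to-Euler conversion, i.e. cov_euler ≈ J cov_q J^T. The Python sketch below illustrates that idea only; the function names, the finite-difference Jacobian, the covariance values, and the 5-degree rejection gate are illustrative assumptions, not the authors' implementation.

import numpy as np

def quat_to_euler(q):
    """Convert a unit quaternion [w, x, y, z] to ZYX Euler angles (roll, pitch, yaw)."""
    w, x, y, z = q
    roll = np.arctan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    pitch = np.arcsin(np.clip(2.0 * (w * y - z * x), -1.0, 1.0))
    yaw = np.arctan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return np.array([roll, pitch, yaw])

def propagate_quat_cov_to_euler(q, cov_q, eps=1e-6):
    """First-order propagation cov_euler ≈ J cov_q J^T, where J is a
    finite-difference Jacobian of the quaternion-to-Euler conversion at q."""
    base = quat_to_euler(q)
    J = np.zeros((3, 4))
    for i in range(4):
        dq = q.copy()
        dq[i] += eps
        J[:, i] = (quat_to_euler(dq) - base) / eps
    return J @ cov_q @ J.T

# Hypothetical mutual-pose quaternion and its covariance from a BNN-style estimator
q_mutual = np.array([0.99, 0.05, 0.03, 0.12])
q_mutual /= np.linalg.norm(q_mutual)
cov_q = np.diag([1e-4, 4e-4, 4e-4, 9e-4])

cov_euler = propagate_quat_cov_to_euler(q_mutual, cov_q)
# Gate: reject the mutual pose if any Euler-angle standard deviation
# exceeds an assumed 5-degree threshold.
if np.sqrt(np.diag(cov_euler)).max() > np.deg2rad(5.0):
    print("mutual pose rejected as unreliable")

The same propagation pattern would apply to the map-merging step mentioned in the abstract, with the Jacobian taken with respect to the coordinate transformation instead of the angle conversion.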
Keywords
Lidar odometry and mapping, multi-robot systems, uncertainty capture, Bayesian neural network