Fusion Drive: End-to-End Multi Modal Sensor Fusion for Guided Low-Cost Autonomous Vehicle

2020 17th International Conference on Ubiquitous Robots (UR)

Abstract
In this paper, we present a supervised learning-based mixed-input sensor fusion neural network, referred to as Fusion Drive, for autonomous navigation on a designed track. The proposed method combines RGB images and LiDAR laser sensor data for a low-cost embedded navigation system that follows the track while avoiding both learned and previously unobserved obstacles. The proposed network merges separate CNN-based sensor-processing branches into a single combined network that learns throttle and steering-angle labels end-to-end. The trained network outputs navigation commands that reproduce behavior learned from human demonstrations. Experiments performed with the validation dataset and in a real environment exhibit the desired behavior, and the recorded performance shows improvement over similar approaches.
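The abstract describes a mixed-input architecture: per-sensor CNN branches whose features are combined into one network that regresses throttle and steering end-to-end. A minimal sketch of that fusion pattern is shown below; the branch sizes, weights, and the dense stand-ins for the paper's CNN feature extractors are all illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense_relu(x, w, b):
    """Fully connected layer followed by ReLU."""
    return np.maximum(w @ x + b, 0.0)

# Hypothetical stand-ins for the paper's per-sensor CNN branches;
# feature dimensions (32 and 16) are arbitrary choices for the sketch.
def image_branch(rgb_flat):
    """Extract features from a flattened RGB camera frame."""
    w = rng.standard_normal((32, rgb_flat.size)) * 0.01
    return dense_relu(rgb_flat, w, np.zeros(32))

def lidar_branch(scan):
    """Extract features from a 1-D LiDAR range scan."""
    w = rng.standard_normal((16, scan.size)) * 0.01
    return dense_relu(scan, w, np.zeros(16))

def fusion_head(rgb_flat, scan):
    """Concatenate branch features and regress [steering, throttle].

    tanh bounds both commands to [-1, 1], a common convention for
    normalized actuator outputs (an assumption here, not from the paper).
    """
    fused = np.concatenate([image_branch(rgb_flat), lidar_branch(scan)])
    w = rng.standard_normal((2, fused.size)) * 0.01
    steering, throttle = np.tanh(w @ fused)
    return steering, throttle

# Example forward pass with dummy sensor data: a 64x64 RGB frame
# and a 360-beam LiDAR scan.
steering, throttle = fusion_head(rng.random(64 * 64 * 3), rng.random(360))
```

In training, the two commands would be fit against human-demonstration labels with a regression loss; here only the untrained forward pass is sketched.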