Estimation Techniques In Robust Vision-Based Landing Of Aerial Vehicles

IFAC-PapersOnLine (2017)

Abstract
This paper describes recent advances in autonomous visual landing of aircraft in three operational scenarios. The first proposes a robust visual target and an algorithm that tracks it. The second explores the case where a prepared visual target cannot be used and the vehicle must land on an arbitrary target. Both the first and second methods are evaluated with a mobile target. The third addresses the problem of landing on an unprepared static target in GPS-denied environments. A key thread throughout all approaches is the estimation not only of the system states but also of the error covariance of the target and vehicle. The error covariance may then be used to determine the status of the estimation during the approach and to engage a contingency maneuver if necessary. The approaches are validated in high-fidelity simulation and in flight testing. Landing pad tracking is shown to be accurate and robust to changes in viewpoint and distance. GPS-denied landing is found to have low error and to be robust to landing zone appearance. (C) 2017, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved.
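
To illustrate the covariance-gated contingency idea mentioned in the abstract, below is a minimal Python sketch of a Kalman measurement update on a relative target state, followed by a check that continues the approach only while the position uncertainty stays below a threshold. The state layout, measurement model, noise values, threshold, and function names (kalman_update, landing_ok) are illustrative assumptions, not the authors' implementation.

# Sketch: covariance-gated approach/abort decision (assumed values throughout).
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update for the relative target state."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - H @ x)                  # corrected state estimate
    P = (np.eye(len(x)) - K @ H) @ P         # corrected error covariance
    return x, P

def landing_ok(P, pos_idx=(0, 1, 2), max_std=0.5):
    """Continue the approach only if the 1-sigma position uncertainty (m)
    stays below an assumed threshold; otherwise flag a contingency."""
    pos_std = np.sqrt(np.diag(P)[list(pos_idx)])
    return bool(np.all(pos_std < max_std))

# Example: relative position/velocity state; vision measures position only.
x = np.zeros(6)                              # [px, py, pz, vx, vy, vz]
P = np.diag([4.0, 4.0, 9.0, 1.0, 1.0, 1.0])  # initial error covariance
H = np.hstack([np.eye(3), np.zeros((3, 3))]) # position-only measurement model
R = 0.2 * np.eye(3)                          # assumed vision measurement noise

z = np.array([0.3, -0.1, 5.2])               # one vision fix of the landing pad
x, P = kalman_update(x, P, z, H, R)
print("continue approach" if landing_ok(P) else "engage contingency maneuver")

In the same spirit, the paper's particle-filter and adaptive-estimation variants (see Keywords) would expose an analogous uncertainty measure to drive the same decision; the threshold used here is purely for illustration.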
Keywords
Vision Navigation, Autonomous Landing, Kalman Filter, Particle Filter, Adaptive Estimation, Unmanned Aircraft