OOSTraj: Out-of-Sight Trajectory Prediction With Vision-Positioning Denoising
CoRR (2024)
Abstract
Trajectory prediction is fundamental in computer vision and autonomous
driving, particularly for understanding pedestrian behavior and enabling
proactive decision-making. Existing approaches in this field often assume
precise and complete observational data, neglecting the challenges associated
with out-of-view objects and the noise inherent in sensor data due to limited
camera range, physical obstructions, and the absence of ground truth for
denoised sensor data. Such oversights are critical safety concerns, as they can
result in missing essential, non-visible objects. To bridge this gap, we
present a novel method for out-of-sight trajectory prediction that leverages a
vision-positioning technique. Our approach denoises noisy sensor observations
in an unsupervised manner and precisely maps sensor-based trajectories of
out-of-sight objects into visual trajectories. This method has demonstrated
state-of-the-art performance in out-of-sight noisy sensor trajectory denoising
and prediction on the Vi-Fi and JRDB datasets. By enhancing trajectory
prediction accuracy and addressing the challenges of out-of-sight objects, our
work significantly contributes to improving the safety and reliability of
autonomous driving in complex environments. Our work represents the first
initiative towards Out-Of-Sight Trajectory prediction (OOSTraj), setting a new
benchmark for future research. The code is available at .