L-RadSet: A Long-range Multimodal Dataset with 4D Radar for Autonomous Driving and Its Application

Jingzhong Li, Yuyi Wang, Lin Yang, Jun Lin, Gaoqiang Kang, Zhen Shi, Yuxuan Chen, Yue Jin, Kanta Akiyama

IEEE Transactions on Intelligent Vehicles (2024)

Abstract
A long-range and robust perception system plays a crucial role in advancing research and deployment of autonomous driving. 4D radar, as an emerging range sensor, offers greater resilience to adverse weather conditions than lidar and provides the elevation measurement that 3D radar lacks. Existing 4D radar datasets, emphasizing robust and multimodal perception, typically combine camera, lidar, and 4D radar. However, they often lack long-range capability due to limited annotations. Furthermore, their single short-focus camera configuration fails to effectively match a long-range 4D radar. To overcome these limitations, we present a novel long-range multimodal dataset. It encompasses high-resolution and long-range sensors, including forward-facing cameras, a 360° lidar, and a front-mounted 4D radar, along with detailed annotations for 3D objects across 11.2K key frames in different scenarios and weather conditions. Notably, our dataset introduces, for the first time, three cameras with different focal lengths, enabling simultaneous capture of images with varying perception ranges. It serves as a valuable resource for developing accurate long-range perception algorithms. Remarkably, our dataset achieves the longest annotation range among comparable 4D radar datasets, spanning up to 220 meters. It supports applications such as 3D object detection and tracking, and facilitates the study of multimodal tasks. Through rigorous experiments, we validate the efficacy of our dataset and offer valuable insights into long-range 3D object detection.
Keywords
autonomous driving, 4D radar dataset, 3D object detection, long-range perception