Low-Power Depth Estimation for Time-of-Flight Imaging

2017 24th IEEE International Conference on Image Processing (ICIP), 2017

Cited by 29 | Viewed 43
Abstract
Depth sensing is used in a variety of applications that range from augmented reality to robotics. One way to measure depth is with a time-of-flight (TOF) camera, which obtains depth by emitting light and measuring its round-trip time. However, for many battery-powered devices, the illumination source of the TOF camera requires a significant amount of power and further limits battery life. To minimize the power required for depth sensing, we present an algorithm that exploits the apparent motion across images collected alongside the TOF camera to obtain a new depth map without illuminating the scene. Our technique is best suited for estimating the depth of rigid objects and produces low-latency, 640 x 480 depth maps at 30 frames per second on a low-power embedded platform by using block matching at a sparse set of points and least-squares minimization. We evaluated our technique on an RGB-D dataset, where it produced depth maps with a mean relative error of 0.85% while reducing the total power required for depth sensing by 3x.
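To make the pipeline described in the abstract concrete, the sketch below illustrates the general idea of sparse block matching followed by least-squares estimation of a small rigid camera motion from the resulting flow. It is a minimal illustration, not the authors' implementation: the function names, the pinhole intrinsics (fx, fy, cx, cy), the SAD cost, and the linearized (small-angle) rigid-flow model are all assumptions introduced here.

```python
# Hedged sketch of sparse block matching + linearized rigid-motion least squares.
# All names, parameters, and the small-angle motion model are illustrative
# assumptions, not taken from the paper.
import numpy as np

def block_match(prev_img, cur_img, pts, patch=8, search=12):
    """For each sparse point, find the displacement that minimizes the sum of
    absolute differences (SAD) between blocks in the previous and current images."""
    h, w = prev_img.shape
    flows = []
    for (x, y) in pts:
        x, y = int(x), int(y)
        if x - patch < 0 or y - patch < 0 or x + patch >= w or y + patch >= h:
            flows.append((0.0, 0.0))
            continue
        ref = prev_img[y - patch:y + patch, x - patch:x + patch].astype(np.float32)
        best_cost, best_dxy = np.inf, (0.0, 0.0)
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                yy, xx = y + dy, x + dx
                if xx - patch < 0 or yy - patch < 0 or xx + patch >= w or yy + patch >= h:
                    continue
                cand = cur_img[yy - patch:yy + patch, xx - patch:xx + patch].astype(np.float32)
                cost = np.abs(ref - cand).sum()
                if cost < best_cost:
                    best_cost, best_dxy = cost, (float(dx), float(dy))
        flows.append(best_dxy)
    return np.array(flows)

def estimate_rigid_motion(pts, flows, depth, fx, fy, cx, cy):
    """Solve a linear least-squares system for a small 6-DoF camera motion
    (tx, ty, tz, wx, wy, wz) from sparse flow, assuming a rigid scene and
    a previously measured depth map."""
    A, b = [], []
    for (x, y), (du, dv) in zip(pts, flows):
        Z = depth[int(y), int(x)]
        if Z <= 0:
            continue
        xn, yn = (x - cx) / fx, (y - cy) / fy  # normalized image coordinates
        # Standard rigid-flow Jacobian rows: translation terms, then rotation terms.
        A.append([-fx / Z, 0.0, fx * xn / Z, fx * xn * yn, -fx * (1 + xn * xn), fx * yn])
        A.append([0.0, -fy / Z, fy * yn / Z, fy * (1 + yn * yn), -fy * xn * yn, -fy * xn])
        b.extend([du, dv])
    motion, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return motion
```

The estimated motion could then be used to reproject the last TOF depth map into the current frame, which is the step that lets the illumination source stay off between TOF captures.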
Keywords
time-of-flight camera, low-power depth estimation, RGB-D, depth map