FUSEDAR: Adaptive Environment Lighting Reconstruction for Visually Coherent Mobile AR Rendering

2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW 2022)

Abstract
Obtaining accurate omnidirectional environment lighting for high-quality rendering in mobile augmented reality is challenging due to the practical limitations of mobile devices and the inherent spatial variance of lighting. In this paper, we present FusedAR, a novel adaptive environment lighting reconstruction method designed from the outset around mobile characteristics — for example, it exploits the natural behavior of mobile users, who tend to point the camera sensors perpendicular to the observation-rendering direction. Our initial evaluation shows that FusedAR achieves better rendering effects than both a recent deep learning-based AR lighting estimation system [8] and environment lighting captured by 360° cameras.
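The abstract describes reconstructing omnidirectional lighting by combining what the camera directly observes with estimated lighting for unobserved directions. As a rough illustration of that general idea (this is a minimal sketch, not the paper's algorithm; all names and the per-pixel blending scheme are assumptions), one could blend an observed and an estimated equirectangular environment map using an observation-confidence mask:

```python
import numpy as np

def fuse_environment(observed, estimated, confidence):
    """Blend observed and estimated lighting per pixel.

    observed, estimated: (H, W, 3) float arrays (linear radiance).
    confidence: (H, W) array in [0, 1]; 1 where the camera directly
    observed the environment, 0 where only the estimate is available.
    """
    w = confidence[..., None]            # broadcast over color channels
    return w * observed + (1.0 - w) * estimated

# Toy usage: a 4x8 map where only the left half was observed.
H, W = 4, 8
observed = np.ones((H, W, 3))            # bright observed region
estimated = np.zeros((H, W, 3))          # dark fallback estimate
confidence = np.zeros((H, W))
confidence[:, : W // 2] = 1.0            # camera covered the left half
fused = fuse_environment(observed, estimated, confidence)
```

A real system would additionally handle temporal accumulation and the spatial variance of lighting that the abstract highlights; the mask-based blend above only conveys the fusion concept.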
Keywords
Computing methodologies, Computer graphics, Graphics systems and interfaces, Mixed / augmented reality, Human-centered computing, Human computer interaction (HCI), Interaction paradigms, Ubiquitous and mobile computing, Empirical studies in ubiquitous and mobile computing