PACE: Data-Driven Virtual Agent Interaction in Dense and Cluttered Environments
IEEE Conference on Virtual Reality and 3D User Interfaces (2023)
Abstract
We present PACE, a novel method for modifying motion-captured virtual agents
to interact with and move throughout dense, cluttered 3D scenes. Our approach
changes a given motion sequence of a virtual agent as needed to adjust to the
obstacles and objects in the environment. We first take the individual frames
of the motion sequence most important for modeling interactions with the scene
and pair them with the relevant scene geometry, obstacles, and semantics such
that interactions in the agent's motion match the affordances of the scene
(e.g., standing on a floor or sitting in a chair). We then optimize the motion
of the human by directly altering the high-DOF pose at each frame in the motion
to better account for the unique geometric constraints of the scene. Our
formulation uses novel loss functions that maintain a realistic flow and
natural-looking motion. We compare our method with prior motion generating
techniques and highlight the benefits of our method with a perceptual study and
physical plausibility metrics. Human raters preferred our method over the prior
approaches. Specifically, they preferred our method 57.1% of the time over
the state-of-the-art method using existing motions, and 81.0% of the time
versus a state-of-the-art motion synthesis method. Additionally, our method
performs significantly higher on established physical plausibility and
interaction metrics. Specifically, we outperform competing methods by over
1.2% in terms of the non-collision metric and by over 18% in terms of the contact
metric. We have integrated our interactive system with Microsoft HoloLens and
demonstrate its benefits in real-world indoor scenes. Our project website is
available at https://gamma.umd.edu/pace/.
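The abstract describes directly optimizing the high-DOF pose at each frame under loss functions that discourage scene penetration while keeping the motion natural. The toy sketch below illustrates that general idea only; the loss terms, weights, spherical obstacle, and finite-difference optimizer are all my own assumptions, not the paper's actual formulation.

```python
import numpy as np

# Toy setup (assumed, not from the paper): a motion is a
# (num_frames, num_joints, 3) array of joint positions.

def total_loss(motion, obstacle_center, obstacle_radius):
    """Collision hinge penalty plus a temporal smoothness penalty."""
    # Collision: squared penetration depth of joints inside a sphere.
    dists = np.linalg.norm(motion - obstacle_center, axis=-1)
    collision = np.sum(np.maximum(0.0, obstacle_radius - dists) ** 2)
    # Smoothness: squared second differences (frame-to-frame acceleration).
    accel = motion[2:] - 2.0 * motion[1:-1] + motion[:-2]
    return collision + 0.1 * np.sum(accel ** 2)

def optimize_motion(motion, obstacle_center, obstacle_radius,
                    steps=100, lr=0.05, eps=1e-4):
    """Adjust every pose coordinate by gradient descent on the loss.

    Gradients are forward finite differences, so this only scales to
    toy-sized motions; a real system would use analytic gradients.
    """
    motion = motion.astype(float).copy()
    for _ in range(steps):
        base = total_loss(motion, obstacle_center, obstacle_radius)
        grad = np.zeros_like(motion)
        for idx in np.ndindex(motion.shape):
            motion[idx] += eps
            grad[idx] = (total_loss(motion, obstacle_center,
                                    obstacle_radius) - base) / eps
            motion[idx] -= eps
        motion -= lr * grad
    return motion
```

For example, a two-joint trajectory whose first joint drifts straight through a sphere of radius 0.5 gets pushed out of the obstacle, with the smoothness term keeping neighboring frames close so penetration is traded against naturalness rather than removed by an abrupt jump.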