Eliminating Warping Shakes for Unsupervised Online Video Stitching
arXiv (2024)
Abstract
In this paper, we retarget video stitching to an emerging issue, termed
warping shake, that arises when extending image stitching to video stitching.
It refers to the temporal instability of warped content in non-overlapping
regions, which persists even though image stitching endeavors to preserve
natural structures. Consequently, even when the input videos to be stitched
are stable, the stitched video will in most cases exhibit undesired warping
shakes that degrade the visual experience. To eliminate these shakes, we
propose StabStitch to
simultaneously realize video stitching and video stabilization in a unified
unsupervised learning framework. Starting from the camera paths in video
stabilization, we first derive an expression for the stitching trajectories in
video stitching by carefully integrating spatial and temporal warps. A warp
smoothing model is then presented to optimize these trajectories, jointly
accounting for content alignment, trajectory smoothness, spatial consistency,
and online collaboration. To establish an evaluation benchmark and
train the learning framework, we build a video stitching dataset with a rich
diversity of camera motions and scenes. Compared with existing stitching
solutions, StabStitch exhibits significant superiority in scene robustness and
inference speed, in addition to stitching and stabilization performance,
contributing to a robust, real-time online video stitching system. The code
and dataset will be available at https://github.com/nie-lang/StabStitch.
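For intuition only, the following is a minimal Python/NumPy sketch of the
trajectory idea, assuming each warp is a 3x3 homography: it composes per-frame
spatial and temporal warps into a stitching trajectory and smooths it with a
simple moving-average filter. The function names (stitching_trajectory,
smooth_trajectory) and the averaging filter are hypothetical stand-ins; the
paper's actual warp smoothing model is learned and additionally enforces
content alignment, spatial consistency, and online collaboration.

    import numpy as np

    def stitching_trajectory(spatial_warps, temporal_warps):
        # Hypothetical composition (not the paper's exact formulation):
        # spatial_warps[t]  aligns frame t of the target video to the reference;
        # temporal_warps[t] maps frame t-1 to frame t within the target video.
        trajectory = []
        motion = np.eye(3)
        for S, T in zip(spatial_warps, temporal_warps):
            motion = motion @ T            # accumulate inter-frame motion
            trajectory.append(S @ motion)  # compose with the spatial alignment
        return trajectory

    def smooth_trajectory(trajectory, window=5):
        # Moving-average smoothing of the 9 homography parameters, as a crude
        # stand-in for the learned warp smoothing model.
        params = np.stack([H.ravel() for H in trajectory])   # shape (N, 9)
        kernel = np.ones(window) / window
        smoothed = np.apply_along_axis(
            lambda col: np.convolve(col, kernel, mode="same"), 0, params)
        out = []
        for p in smoothed:
            H = p.reshape(3, 3)
            out.append(H / H[2, 2])  # renormalize (assumes H[2,2] != 0)
        return out

    # Example: stable inputs with small random warp perturbations.
    N = 20
    rng = np.random.default_rng(0)
    spatial = [np.eye(3) + 0.01 * rng.standard_normal((3, 3)) for _ in range(N)]
    temporal = [np.eye(3) + 0.001 * rng.standard_normal((3, 3)) for _ in range(N)]
    smooth = smooth_trajectory(stitching_trajectory(spatial, temporal))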