Tobias Bertel, Christian Richardt

"MegaParallax: 360° Panoramas with Motion Parallax"

In ACM SIGGRAPH 2018 Posters

We propose a novel approach for creating high-quality 360° panoramas with motion parallax from a single input
video sweep.
We start from an input video captured with a consumer camera and register each video frame on a circle using
structure-from-motion. Using flow-based blending, we synthesise novel views (green camera) on the fly between each pair
of captured images (black), which produces convincing motion parallax.
Our results show correct perspective, improving on Megastereo [Richardt et al. 2013], and avoid the ghosting
artefacts of the Unstructured Lumigraph [Buehler et al. 2001].
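To make the view-synthesis step concrete, the following is a minimal sketch of flow-based blending between two neighbouring captured frames, assuming Python with OpenCV and NumPy. The function name blend_views, the Farnebäck flow parameters, and the symmetric cross-fade are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of flow-based blending (assumed details, not the authors' code).
import cv2
import numpy as np

def blend_views(img_a, img_b, alpha):
    """Synthesise an in-between view at fractional position alpha in [0, 1]
    between two neighbouring captured images: warp each image part-way
    along the optical flow, then cross-fade the motion-compensated results."""
    gray_a = cv2.cvtColor(img_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(img_b, cv2.COLOR_BGR2GRAY)

    # Dense optical flow in both directions (Farnebäck, typical parameter values).
    flow_ab = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None, 0.5, 3, 15, 3, 5, 1.2, 0)
    flow_ba = cv2.calcOpticalFlowFarneback(gray_b, gray_a, None, 0.5, 3, 15, 3, 5, 1.2, 0)

    h, w = gray_a.shape
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))

    # Backward-warp each image part-way towards the novel view, approximating
    # the flow at the in-between view by the flow at each source pixel.
    warped_a = cv2.remap(img_a, xs - alpha * flow_ab[..., 0],
                         ys - alpha * flow_ab[..., 1], cv2.INTER_LINEAR)
    warped_b = cv2.remap(img_b, xs - (1.0 - alpha) * flow_ba[..., 0],
                         ys - (1.0 - alpha) * flow_ba[..., 1], cv2.INTER_LINEAR)

    # Cross-fade the two motion-compensated images.
    return cv2.addWeighted(warped_a, 1.0 - alpha, warped_b, alpha, 0.0)
```

Warping both images part-way towards the novel view before cross-fading is what suppresses the ghosting that plain linear blending of the unwarped images would produce.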

Abstract: 
Capturing 360° panoramas has become straightforward now that this functionality is implemented on every phone. However, it remains difficult to capture immersive 360° panoramas with motion parallax, which provide different views for different viewpoints. Alternatives such as omnidirectional stereo panoramas provide different views for each eye (binocular disparity), but do not support motion parallax, while Casual 3D Photography [Hedman et al. 2017] reconstructs textured 3D geometry that provides motion parallax but suffers from reconstruction artefacts. We propose a new image-based approach for capturing and rendering high-quality 360° panoramas with motion parallax. We use novel-view synthesis with flow-based blending to turn a standard monoscopic video into an enriched 360° panoramic experience that can be explored in real time. Our approach makes it possible for casual consumers to capture and view high-quality 360° panoramas with motion parallax.
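For the real-time exploration described above, each desired viewing ray has to be mapped to the pair of registered cameras that bracket it on the capture circle, together with a blending weight. A simplified sketch of that lookup follows; select_camera_pair and cam_angles are hypothetical names, and the purely angular parametrisation of the circle is an assumption.

```python
# Hypothetical helper for picking the bracketing camera pair on the capture circle.
import numpy as np

def select_camera_pair(cam_angles, desired_angle):
    """Given the sorted angular positions (radians) of the registered cameras
    on the capture circle, return the indices (i, j) of the two cameras
    bracketing desired_angle and the interpolation weight alpha in [0, 1]."""
    angles = np.asarray(cam_angles, dtype=np.float64)
    desired = desired_angle % (2.0 * np.pi)
    # First camera at or past the desired angle, wrapping around the circle.
    j = int(np.searchsorted(angles, desired)) % len(angles)
    i = (j - 1) % len(angles)
    # Angular span between the two cameras, handling the wrap-around at 2*pi.
    span = (angles[j] - angles[i]) % (2.0 * np.pi)
    alpha = ((desired - angles[i]) % (2.0 * np.pi)) / span
    return i, j, alpha
```

An in-between view can then be synthesised as blend_views(frames[i], frames[j], alpha) using the blending sketch above.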

Submission video
