Tobias Bertel*, Moritz Mühlhausen*, Moritz Kappel, Paul M. Bittner, Christian Richardt, Marcus Magnor

"Depth Augmented Omnidirectional Stereo for 6-DoF VR Photography" 

in IEEE VR 2020 posters

We create a 6-DoF VR experience from a single omnidirectional stereo (ODS) pair of a scene.
Our approach takes an ODS panorama as input (1), along with the radius of the viewing circle.
We determine disparities between the left and right eye views using optical flow (2).
The disparities (3) yield per-pixel depth (4), which is back-projected into a point cloud (5) used to generate a DASP.
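The disparity-to-depth step above can be sketched as follows. This is an illustrative assumption, not the authors' exact implementation: in the standard ODS model, left-eye and right-eye rays are tangent to the viewing circle of radius r, so a point at depth d produces an angular disparity of 2·arcsin(r/d), which inverts to d = r / sin(disparity/2).

```python
import numpy as np

def disparity_to_depth(disparity_rad, radius):
    """Convert angular ODS disparity (radians) to depth.

    Assumes the standard ODS geometry: left/right-eye rays are
    tangent to a viewing circle of the given radius, so a point
    at depth d yields disparity 2*arcsin(radius/d).
    (Hypothetical helper; not the paper's actual code.)
    """
    # Clamp tiny disparities to avoid division by zero for
    # points at (effectively) infinite depth.
    disparity_rad = np.maximum(disparity_rad, 1e-6)
    return radius / np.sin(disparity_rad / 2.0)
```

For example, with a viewing-circle radius of 3.2 cm (half a typical interpupillary distance), a point at 2 m distance produces a disparity of 2·arcsin(0.016) ≈ 0.032 rad, which maps back to 2 m.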

We present an end-to-end pipeline that enables head-motion parallax for omnidirectional stereo (ODS) panoramas. Given an ODS panorama containing a left-eye and a right-eye view, our method estimates dense horizontal disparity fields between the stereo image pair. From these, we compute a depth augmented stereo panorama (DASP) by explicitly reconstructing the scene geometry from the viewing circle corresponding to the ODS representation. The generated DASP supports motion parallax within the ODS viewing circle, and our approach operates directly on existing ODS panoramas. Experiments on multiple real-world ODS panoramas demonstrate the robustness and versatility of our approach.
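The geometry reconstruction described above amounts to back-projecting a depth panorama into 3D. A minimal sketch, assuming a simple equirectangular layout with rays cast from the panorama centre (the actual DASP construction casts rays tangent to the ODS viewing circle, so this is a simplification for illustration):

```python
import numpy as np

def depth_panorama_to_pointcloud(depth):
    """Back-project an equirectangular depth panorama to a point cloud.

    Simplified sketch (hypothetical helper): rays originate at the
    panorama centre, whereas a true DASP uses rays tangent to the
    ODS viewing circle.
    """
    h, w = depth.shape
    # Pixel-centre longitude (phi) and latitude (theta) angles.
    phi = (np.arange(w) + 0.5) / w * 2.0 * np.pi - np.pi
    theta = (np.arange(h) + 0.5) / h * np.pi - np.pi / 2.0
    phi, theta = np.meshgrid(phi, theta)
    # Spherical-to-Cartesian back-projection scaled by depth.
    x = depth * np.cos(theta) * np.sin(phi)
    y = depth * np.sin(theta)
    z = depth * np.cos(theta) * np.cos(phi)
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)
```

A constant depth panorama back-projects onto a sphere of that radius, which is a quick sanity check for the projection model.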