
Object Scene Flow


This work investigates the estimation of dense three-dimensional motion fields, commonly referred to as scene flow. While great progress has been made in recent years, large displacements and adverse imaging conditions as observed in natural outdoor environments are still very challenging for current approaches to reconstruction and motion estimation. In this paper, we propose a unified random field model which reasons jointly about 3D scene flow as well as the location, shape and motion of vehicles in the observed scene. We formulate the problem as the task of decomposing the scene into a small number of rigidly moving objects sharing the same motion parameters. Thus, our formulation effectively introduces long-range spatial dependencies which commonly employed local rigidity priors are lacking. Our inference algorithm then estimates the association of image segments and object hypotheses together with their three-dimensional shape and motion. We demonstrate the potential of the proposed approach by introducing a novel challenging scene flow benchmark which allows for a thorough comparison of the proposed scene flow approach with respect to various baseline models. In contrast to previous benchmarks, our evaluation is the first to provide stereo and optical flow ground truth for dynamic real-world urban scenes at large scale. Our experiments reveal that rigid motion segmentation can be utilized as an effective regularizer for the scene flow problem, improving upon existing two-frame scene flow methods. At the same time, our method yields plausible object segmentations without requiring an explicitly trained recognition model for a specific object class.
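To make the central idea concrete, the sketch below illustrates how decomposing a scene into a few rigidly moving objects that share motion parameters induces dense scene flow. This is not the authors' implementation; it only shows how a per-object rigid motion (R, t), together with a depth map, camera intrinsics K, and a label map assigning pixels to objects, determines 3D scene flow and the corresponding optical flow. All names and the toy intrinsics are illustrative assumptions.

# A minimal sketch (not the authors' code) of the rigid-motion parametrization of
# scene flow: every pixel inherits the SE(3) motion (R, t) of the object it is
# assigned to. `K`, `depth`, and `labels` are assumed illustrative inputs.
import numpy as np

def backproject(depth, K):
    """Lift every pixel (u, v) with depth Z to the 3D point X = Z * K^{-1} [u, v, 1]^T."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).astype(np.float64)  # (h, w, 3)
    rays = pix @ np.linalg.inv(K).T                                      # K^{-1} [u, v, 1]
    return rays * depth[..., None]                                       # (h, w, 3) points

def rigid_scene_flow(depth, labels, motions, K):
    """Dense 3D scene flow: each pixel moves with the rigid motion of its object.

    motions: dict mapping object label -> (R, t), R a 3x3 rotation, t a 3-vector.
    Returns the per-pixel 3D displacement field of shape (h, w, 3).
    """
    X = backproject(depth, K)
    flow3d = np.zeros_like(X)
    for obj, (R, t) in motions.items():
        mask = labels == obj
        flow3d[mask] = X[mask] @ R.T + t - X[mask]   # (R X + t) - X
    return flow3d

def project_flow(depth, labels, motions, K):
    """Optical flow induced by the rigid motions: project X and R X + t, then subtract."""
    X = backproject(depth, K)
    X2 = X + rigid_scene_flow(depth, labels, motions, K)
    def proj(P):
        p = P @ K.T
        return p[..., :2] / p[..., 2:3]
    return proj(X2) - proj(X)

if __name__ == "__main__":
    # Toy example: a static background (label 0) and one translating "vehicle" (label 1).
    K = np.array([[721.5, 0.0, 609.6], [0.0, 721.5, 172.9], [0.0, 0.0, 1.0]])
    depth = np.full((8, 8), 10.0)
    labels = np.zeros((8, 8), dtype=int)
    labels[:, 4:] = 1
    motions = {0: (np.eye(3), np.array([0.0, 0.0, 0.0])),
               1: (np.eye(3), np.array([1.0, 0.0, -0.5]))}
    print(project_flow(depth, labels, motions, K)[0, 5])  # flow at a vehicle pixel

In the paper this parametrization is embedded in a unified random field model, and inference jointly estimates the association of image segments to object hypotheses together with the objects' 3D shape and motion.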

Author(s): Moritz Menze and Christian Heipke and Andreas Geiger
Journal: ISPRS Journal of Photogrammetry and Remote Sensing
Year: 2018
Project(s): Objectflow
BibTeX Type: Article (article)
Electronic Archiving: grant_archive

BibTeX

@article{Menze2018JPRS,
  title = {Object Scene Flow},
  journal = {ISPRS Journal of Photogrammetry and Remote Sensing},
  abstract = {This work investigates the estimation of dense three-dimensional motion fields, commonly
  referred to as scene flow. While great progress has been made in recent years,
  large displacements and adverse imaging conditions as observed in natural outdoor
  environments are still very challenging for current approaches to reconstruction and
  motion estimation. In this paper, we propose a unified random field model which reasons
  jointly about 3D scene flow as well as the location, shape and motion of vehicles
  in the observed scene. We formulate the problem as the task of decomposing the scene
  into a small number of rigidly moving objects sharing the same motion parameters.
  Thus, our formulation effectively introduces long-range spatial dependencies which
  commonly employed local rigidity priors are lacking. Our inference algorithm then
  estimates the association of image segments and object hypotheses together with their
  three-dimensional shape and motion. We demonstrate the potential of the proposed
  approach by introducing a novel challenging scene flow benchmark which allows for a
  thorough comparison of the proposed scene flow approach with respect to various baseline
  models. In contrast to previous benchmarks, our evaluation is the first to provide
  stereo and optical flow ground truth for dynamic real-world urban scenes at large scale.
  Our experiments reveal that rigid motion segmentation can be utilized as an effective
  regularizer for the scene flow problem, improving upon existing two-frame scene flow
  methods. At the same time, our method yields plausible object segmentations without requiring an explicitly trained recognition model for a specific object class.},
  year = {2018},
  slug = {menze2018jprs},
  author = {Menze, Moritz and Heipke, Christian and Geiger, Andreas}
}