Patchwork Stereo: Scalable, Structure-Aware 3D Reconstruction in Man-Made Environments

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In this paper, we address the problem of Multi-View Stereo (MVS) reconstruction of highly regular man-made scenes from calibrated, wide-baseline views and a sparse Structure-from-Motion (SfM) point cloud. We introduce a novel patch-based formulation via energy minimization which combines top-down segmentation hypotheses using appearance and vanishing line detections, as well as an arrangement of creased planar structures which are extracted automatically through a robust analysis of available SfM points and image features. The method produces a compact piecewise-planar depth map and a mesh which are aligned with the scene's structure. Experiments show that our approach not only reaches similar levels of accuracy w.r.t. state-of-the-art pixel-based methods while using far fewer images, but also produces a much more compact, structure-aware mesh in a runtime shorter by several orders of magnitude.
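The abstract's "robust analysis of available SfM points" to extract planar structures can be illustrated with a generic RANSAC plane fit on a sparse 3D point cloud. This is a minimal sketch of that idea, not the authors' actual arrangement-of-creased-planes method; the function name, thresholds, and synthetic data below are illustrative assumptions.

```python
import numpy as np

def ransac_plane(points, n_iters=200, threshold=0.01, rng=None):
    """Fit one dominant plane (n, d) with n . p + d = 0 to 3D points via RANSAC.

    Illustrative only: the paper extracts an arrangement of creased planes
    from SfM points; this sketch recovers a single plane robustly.
    """
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = None
    for _ in range(n_iters):
        # Minimal sample: 3 points define a candidate plane.
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:  # degenerate (near-collinear) sample, skip
            continue
        normal /= norm
        d = -normal @ sample[0]
        # Inliers: points within `threshold` of the candidate plane.
        inliers = np.abs(points @ normal + d) < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (normal, d)
    return best_model, best_inliers

# Synthetic SfM-like cloud: noisy points on the plane z = 0, plus outliers.
rng = np.random.default_rng(0)
plane_pts = np.column_stack([rng.uniform(-1, 1, (100, 2)),
                             rng.normal(0.0, 0.002, 100)])
outliers = rng.uniform(-1, 1, (20, 3))
pts = np.vstack([plane_pts, outliers])
model, inliers = ransac_plane(pts, rng=1)
```

In a full pipeline such candidate planes would then be scored against image appearance and vanishing-line evidence, as the abstract's energy-minimization formulation describes.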

Original language: English
Title of host publication: Proceedings - 2017 IEEE Winter Conference on Applications of Computer Vision, WACV 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 292-301
Number of pages: 10
ISBN (Electronic): 9781509048229
DOIs
Publication status: Published - 11 May 2017
Externally published: Yes
Event: 17th IEEE Winter Conference on Applications of Computer Vision, WACV 2017 - Santa Rosa, United States
Duration: 24 Mar 2017 - 31 Mar 2017

Publication series

Name: Proceedings - 2017 IEEE Winter Conference on Applications of Computer Vision, WACV 2017

Conference

Conference: 17th IEEE Winter Conference on Applications of Computer Vision, WACV 2017
Country/Territory: United States
City: Santa Rosa
Period: 24/03/17 - 31/03/17
