Information

  • Publication Type: Journal Paper (without talk)
  • Date: September 2014
  • ISSN: 1077-2626
  • Journal: IEEE Transactions on Visualization and Computer Graphics
  • Number: 9
  • Volume: 20
  • Pages: 1280 – 1292
  • Keywords: image-based rendering, large-scale models, color, surface representation

Abstract

In this paper, we introduce a novel scene representation for the visualization of large-scale point clouds accompanied by a set of high-resolution photographs. Many real-world applications deal with very densely sampled point-cloud data, which are augmented with photographs that often reveal lighting variations and inaccuracies in registration. Consequently, the high-quality representation of the captured data, i.e., both point clouds and photographs together, is a challenging and time-consuming task. We propose a two-phase approach, in which the first (preprocessing) phase generates multiple overlapping surface patches and handles the problem of seamless texture generation locally for each patch. The second phase stitches these patches at render-time to produce a high-quality visualization of the data. As a result of the proposed localization of the global texturing problem, our algorithm is more than an order of magnitude faster than equivalent mesh-based texturing techniques. Furthermore, since our preprocessing phase requires only a minor fraction of the whole dataset at once, we provide maximum flexibility when dealing with growing datasets.
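The abstract describes the two-phase approach only at a high level. As a rough, hypothetical illustration of that structure (not the authors' implementation; the names, the grid-based patch placement, and the per-patch color averaging that stands in for local texture generation are all assumptions made for this sketch), a minimal Python version of the idea might look like this:

# Minimal sketch of the two-phase idea, assuming a toy planar scan:
# phase 1 splits the point cloud into overlapping patches and "textures" each
# patch locally; phase 2 blends the overlapping patches per query point at
# render time. All names (Patch, build_patches, stitch_color, ...) are illustrative.

import numpy as np

class Patch:
    def __init__(self, center, radius, points, color):
        self.center = center   # patch center in world space
        self.radius = radius   # patches overlap, so radius exceeds the grid spacing
        self.points = points   # points assigned to this patch
        self.color = color     # stand-in for the patch's locally generated texture

def build_patches(points, colors, spacing=1.0, overlap=1.5):
    """Phase 1 (preprocessing): group points into overlapping patches and
    'texture' each patch locally (here: simply average its point colors)."""
    patches = []
    mins, maxs = points.min(axis=0), points.max(axis=0)
    # place patch centers on a coarse grid over the data
    for cx in np.arange(mins[0], maxs[0] + spacing, spacing):
        for cy in np.arange(mins[1], maxs[1] + spacing, spacing):
            center = np.array([cx, cy, 0.0])
            radius = spacing * overlap
            mask = np.linalg.norm(points - center, axis=1) < radius
            if mask.any():
                patch_color = colors[mask].mean(axis=0)  # local "texturing"
                patches.append(Patch(center, radius, points[mask], patch_color))
    return patches

def stitch_color(query, patches):
    """Phase 2 (render time): blend the textures of all patches covering the
    query position, weighted so contributions fade toward patch borders."""
    total_w, blended = 0.0, np.zeros(3)
    for p in patches:
        d = np.linalg.norm(query - p.center)
        if d < p.radius:
            w = 1.0 - d / p.radius   # simple falloff gives seam-free transitions
            blended += w * p.color
            total_w += w
    return blended / total_w if total_w > 0 else None

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.uniform(0, 4, size=(2000, 3)); pts[:, 2] = 0.0   # toy planar scan
    cols = rng.uniform(0, 1, size=(2000, 3))                   # toy per-point colors
    patches = build_patches(pts, cols)
    print(len(patches), "patches;", stitch_color(np.array([2.0, 2.0, 0.0]), patches))

Because each patch is textured independently in phase 1, only a small portion of the dataset needs to be in memory at once, which is the property the abstract credits for the method's speed and its flexibility with growing datasets.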

Additional Files and Images

Additional images and videos

video: [33 MB]

Additional files

ackn: Acknowledgments datasets
draft: [13 MB]

BibTeX

@article{arikan-2014-pcvis,
  title =      "Large-Scale Point-Cloud Visualization through Localized
               Textured Surface Reconstruction",
  author =     "Murat Arikan and Reinhold Preiner and Claus Scheiblauer and
               Stefan Jeschke and Michael Wimmer",
  year =       "2014",
  abstract =   "In this paper, we introduce a novel scene representation for
               the visualization of large-scale point clouds accompanied by
               a set of high-resolution photographs. Many real-world
               applications deal with very densely sampled point-cloud
               data, which are augmented with photographs that often reveal
               lighting variations and inaccuracies in registration.
               Consequently, the high-quality representation of the
               captured data, i.e., both point clouds and photographs
               together, is a challenging and time-consuming task. We
               propose a two-phase approach, in which the first
               (preprocessing) phase generates multiple overlapping surface
               patches and handles the problem of seamless texture
               generation locally for each patch. The second phase stitches
               these patches at render-time to produce a high-quality
               visualization of the data. As a result of the proposed
               localization of the global texturing problem, our algorithm
               is more than an order of magnitude faster than equivalent
               mesh-based texturing techniques. Furthermore, since our
               preprocessing phase requires only a minor fraction of the
               whole dataset at once, we provide maximum flexibility when
               dealing with growing datasets.",
  month =      sep,
  issn =       "1077-2626",
  journal =    "IEEE Transactions on Visualization & Computer Graphics",
  number =     "9",
  volume =     "20",
  pages =      "1280--1292",
  keywords =   "image-based rendering, large-scale models, color, surface
               representation",
  URL =        "https://www.cg.tuwien.ac.at/research/publications/2014/arikan-2014-pcvis/",
}