Real-time consensus-based scene reconstruction using commodity graphics hardware

Ruigang Yang, Greg Welch, Gary Bishop

Research output: Contribution to journal › Article › peer-review

119 Scopus citations

Abstract

We present a novel use of commodity graphics hardware that effectively combines a plane-sweeping algorithm with view synthesis for real-time, online 3D scene acquisition and view synthesis. Using real-time imagery from a few calibrated cameras, our method can generate new images from nearby viewpoints, estimate a dense depth map from the current viewpoint, or create a textured triangular mesh. We can do this without prior geometric information and without any user interaction, in real time and online. The heart of our method is using programmable pixel-shader technology to square intensity differences between reference image pixels, and then to choose final colors (or depths) that correspond to the minimum difference, i.e. the most consistent color. In this paper we describe the method, place it in the context of related work in computer graphics and computer vision, and present results.
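
The consistency test described in the abstract maps naturally to a brute-force plane sweep. The sketch below is a CPU-side NumPy approximation of that idea, not the paper's pixel-shader implementation: for each candidate depth plane, all reference images are warped onto that plane, the per-pixel squared intensity differences from the mean color are summed, and the color and depth with the lowest cost (the most consistent hypothesis) are kept. The function name plane_sweep_consensus, the warp_to_plane callback, and the use of the mean as the consensus color are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def plane_sweep_consensus(ref_images, warp_to_plane, depth_planes):
    """Hypothetical CPU sketch of a consensus plane sweep.

    ref_images    : list of HxWx3 float arrays from calibrated cameras
    warp_to_plane : callable(image, depth) -> HxWx3 array projecting a
                    reference image onto the candidate depth plane as seen
                    from the desired viewpoint (warping assumed given)
    depth_planes  : iterable of candidate depths swept through the volume
    """
    h, w, _ = ref_images[0].shape
    best_cost = np.full((h, w), np.inf)
    best_color = np.zeros((h, w, 3))
    best_depth = np.zeros((h, w))

    for d in depth_planes:
        # Warp every reference image onto the current depth plane.
        warped = np.stack([warp_to_plane(img, d) for img in ref_images])

        # Consensus color: mean across cameras. Cost: summed squared
        # intensity differences from that mean (lower = more consistent).
        mean_color = warped.mean(axis=0)
        cost = ((warped - mean_color) ** 2).sum(axis=(0, 3))

        # Keep the most consistent color/depth seen so far at each pixel.
        better = cost < best_cost
        best_cost[better] = cost[better]
        best_color[better] = mean_color[better]
        best_depth[better] = d

    return best_color, best_depth
```

In the paper, the equivalent inner loop runs per fragment in a programmable pixel shader, so the sweep over depth planes and the squared-difference scoring happen entirely on the graphics card rather than on the CPU as sketched here.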

Original language: English
Article number: 1167864
Pages (from-to): 225-234
Number of pages: 10
Journal: Proceedings - Pacific Conference on Computer Graphics and Applications
Volume: 2002-January
State: Published - 2002

Keywords

  • Cameras
  • Computer graphics
  • Computer vision
  • Hardware
  • Heart
  • Image generation
  • Image reconstruction
  • Layout
  • Mesh generation
  • Pixel

ASJC Scopus subject areas

  • Software
  • Computer Graphics and Computer-Aided Design
  • Modeling and Simulation
