Growing depth image superpixels for foliage modeling

Daniel Morris, Saif Imran, Jin Chen, David M. Kramer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


This paper presents a method for segmenting depth images into superpixels without requiring color images. Typically, superpixel methods cluster pixels based on proximity in a multidimensional color space. However, building superpixels from time-of-flight depth images poses a number of new challenges: pixels do not have color channels for similarity comparisons, the resolution of depth cameras is low compared to color cameras, and there is significant depth noise. To address these, we propose a superpixel method that approximates a depth image with a set of planar facets. Facets are grown from seed points to cover the scene. Facet boundaries tend to coincide with high-curvature regions and depth discontinuities, typically giving an over-segmentation of the scene. This work is motivated by automated foliage modeling, and the data we consider are of dense 3D foliage. Superpixel results are shown on foliage and are quantified using labeled data.
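The abstract's facet-growing idea can be sketched roughly as follows: back-project the depth image to 3D points, fit a plane at each seed, and flood-fill neighboring pixels whose points lie close to that plane. This is a minimal illustration, not the authors' implementation; the camera intrinsics (`fx`, `fy`), the seed spacing, and the point-to-plane threshold are all hypothetical choices.

```python
import numpy as np
from collections import deque

def backproject(depth, fx=500.0, fy=500.0):
    """Convert a depth image (meters) to an HxWx3 point cloud
    using hypothetical pinhole intrinsics centered on the image."""
    h, w = depth.shape
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.dstack([x, y, depth])

def fit_plane(points):
    """Least-squares plane through an (N, 3) array of points.
    Returns (unit normal, centroid)."""
    c = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - c)
    return vt[-1], c  # smallest singular vector is the plane normal

def grow_facets(depth, seed_step=8, dist_thresh=0.01):
    """Grow planar facets from a grid of seeds.  A 4-connected pixel
    joins a facet while its 3D point lies within dist_thresh (meters)
    of the facet's plane, so growth stops at depth discontinuities."""
    pts = backproject(depth)
    h, w = depth.shape
    labels = -np.ones((h, w), dtype=int)
    label = 0
    for sy in range(seed_step // 2, h, seed_step):
        for sx in range(seed_step // 2, w, seed_step):
            if labels[sy, sx] != -1:
                continue
            # Initial plane from a small patch around the seed.
            patch = pts[max(sy - 1, 0):sy + 2, max(sx - 1, 0):sx + 2]
            normal, centroid = fit_plane(patch.reshape(-1, 3))
            q = deque([(sy, sx)])
            labels[sy, sx] = label
            while q:  # breadth-first growth over 4-neighbors
                y, x = q.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and labels[ny, nx] == -1:
                        d = abs(np.dot(pts[ny, nx] - centroid, normal))
                        if d < dist_thresh:
                            labels[ny, nx] = label
                            q.append((ny, nx))
            label += 1
    return labels
```

On a synthetic depth image with a step discontinuity (e.g. one half at 1 m, the other at 2 m), facets grown on one side do not cross into the other, since points across the step are far from the seed's plane. The paper formulates this with energy minimization rather than a fixed threshold, so this sketch only conveys the seed-and-grow structure.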

Original language: English
Title of host publication: Proceedings - 2016 13th Conference on Computer and Robot Vision, CRV 2016
Editors: Juan Guerrero
Number of pages: 4
ISBN (Electronic): 9781509024919
State: Published - Dec 28 2016
Event: 13th Conference on Computer and Robot Vision, CRV 2016 - Victoria, Canada
Duration: Jun 1 2016 – Jun 3 2016

Publication series

Name: Proceedings - 2016 13th Conference on Computer and Robot Vision, CRV 2016


Conference: 13th Conference on Computer and Robot Vision, CRV 2016

Bibliographical note

Publisher Copyright:
© 2016 IEEE.


Keywords

  • Depth image
  • Energy minimization
  • RGB-D
  • Segmentation
  • Superpixels
  • Time-of-flight camera

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Signal Processing


