Forest understory trees can be segmented accurately within sufficiently dense airborne laser scanning point clouds

Hamid Hamraz, Marco A. Contreras, Jun Zhang

Research output: Contribution to journal › Article › peer-review

49 Scopus citations


Airborne laser scanning (LiDAR) point clouds over large forested areas can be processed to segment individual trees and subsequently extract tree-level information. Existing segmentation procedures typically detect more than 90% of overstory trees, yet they barely detect 60% of understory trees because of occlusion by higher canopy layers. Although understory trees provide limited financial value, they are an essential component of ecosystem functioning, offering habitat for numerous wildlife species and influencing stand development. Here we model the occlusion effect in terms of point density. We estimate the fractions of points representing different canopy layers (one overstory and multiple understory) and also pinpoint the density required for reasonable tree segmentation (where accuracy plateaus). We show that at a density of 170 pt/m² understory trees can likely be segmented as accurately as overstory trees. Given the advancements of LiDAR sensor technology, point clouds will affordably reach this required density. Using modern computational approaches for big data, the denser point clouds can efficiently be processed to ultimately allow accurate remote quantification of forest resources. The methodology can also be adopted for other similar remote sensing or advanced imaging applications such as geological subsurface modelling or biomedical tissue analysis.
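The abstract's core idea, that occlusion can be expressed as the fraction of LiDAR returns reaching each canopy layer, can be illustrated with a minimal sketch. The layer height breaks, the synthetic point heights, and the `layer_fractions` helper below are all illustrative assumptions, not the authors' method; only the 170 pt/m² total-density figure comes from the abstract.

```python
import numpy as np

def layer_fractions(heights, layer_breaks):
    """Fraction of points falling into each height layer, top layer first.

    layer_breaks are the heights (m) separating canopy layers; the bins
    partition all points, so the fractions sum to 1.
    """
    heights = np.asarray(heights)
    edges = [np.inf] + sorted(layer_breaks, reverse=True) + [-np.inf]
    return [float(np.mean((heights <= hi) & (heights > lo)))
            for hi, lo in zip(edges[:-1], edges[1:])]

# Synthetic return heights: a dense overstory and sparser, occluded
# understory layers (means/spreads chosen purely for illustration).
rng = np.random.default_rng(0)
heights = np.concatenate([
    rng.normal(25, 3, 700),   # overstory returns
    rng.normal(12, 2, 200),   # mid-understory returns
    rng.normal(4, 1, 100),    # low-understory returns
])

# Hypothetical layer breaks at 18 m and 8 m above ground.
fracs = layer_fractions(heights, layer_breaks=[18, 8])

# Effective point density reaching the understory, given a total cloud
# density of 170 pt/m² (the sufficiency threshold reported in the paper).
total_density = 170.0
understory_density = total_density * sum(fracs[1:])
```

The point of the sketch is that understory segmentation accuracy depends on the layer-wise density `total_density * frac`, not on the nominal cloud density, which is why a much denser total cloud is needed before the occluded layers are sampled well enough to segment.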

Original language: English
Article number: 6770
Journal: Scientific Reports
Issue number: 1
State: Published - Dec 1 2017

Bibliographical note

Publisher Copyright:
© 2017 The Author(s).

ASJC Scopus subject areas

  • General

