Towards automatic photometric correction of casually illuminated documents

George V. Landon, Yun Lin, W. Brent Seales

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

14 Scopus citations

Abstract

Creating uniform lighting for archival-quality document acquisition remains a non-trivial problem. We propose a novel method for automatic photometric correction of nonplanar documents by estimating a single point light source using a simple light probe. By adding a simple piece of folded white paper with a known 3D surface to the scene, we are able to extract the 3D position of the light source, automatically perform white balance correction, and determine areas of poor illumination. Furthermore, the method is designed to integrate into an already implemented document digitization pipeline. To justify our claims, we provide an accuracy analysis of our correction technique using simulated ground-truth data, which allows individual sources of error to be isolated and compared. These techniques are then applied to real documents that have been acquired using a 3D scanner.
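The abstract describes the approach only at a high level. As a rough, hedged illustration (not the authors' implementation), the sketch below shows one common way such a correction can be set up: fit a single point light source to intensities observed on a Lambertian probe with known 3D geometry, then divide out the predicted shading on the scanned document surface. The function names, the Lambertian reflectance model, and the inverse-square falloff are assumptions made for this sketch.

```python
# Minimal sketch (not the paper's code): point-light estimation from a light
# probe with known geometry, followed by shading correction of a 3D-scanned
# document. Assumes Lambertian reflectance, inverse-square falloff, no ambient.
import numpy as np
from scipy.optimize import least_squares

def shading(light_pos, points, normals):
    """Per-point shading under a point light: max(cos(theta), 0) / distance^2."""
    d = light_pos - points                              # vectors from surface to light, (N, 3)
    dist = np.linalg.norm(d, axis=1)
    cos_t = np.clip(np.sum(normals * d, axis=1) / dist, 0.0, None)
    return cos_t / dist**2

def estimate_light(probe_pts, probe_normals, probe_intensity,
                   init_pos=(0.0, 0.0, 1.0), init_power=1.0):
    """Fit the 3D light position and a combined power/albedo scale by least squares."""
    def residuals(x):
        pos, power = x[:3], x[3]
        return power * shading(pos, probe_pts, probe_normals) - probe_intensity
    fit = least_squares(residuals, np.array([*init_pos, init_power]))
    return fit.x[:3], fit.x[3]

def correct_document(doc_pts, doc_normals, doc_intensity, light_pos, power, eps=1e-6):
    """Divide out the predicted shading to approximate uniform illumination."""
    pred = power * shading(light_pos, doc_pts, doc_normals)
    return doc_intensity / np.maximum(pred, eps)
```

In practice, the probe points and normals would come from the known fold geometry of the white paper, and the document points and normals from the 3D scan; regions where the predicted shading is near zero are natural candidates for the "poorly illuminated" areas the abstract mentions.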

Original language: English
Title of host publication: 2007 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, CVPR'07
DOIs
State: Published - 2007
Event: 2007 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, CVPR'07 - Minneapolis, MN, United States
Duration: Jun 17, 2007 - Jun 22, 2007

Publication series

Name: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
ISSN (Print): 1063-6919

Conference

Conference: 2007 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, CVPR'07
Country/Territory: United States
City: Minneapolis, MN
Period: 6/17/07 - 6/22/07

ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition
