Abstract
Time-of-flight range sensors have error characteristics that are complementary to those of passive stereo. They provide real-time depth estimates in conditions where passive stereo does not work well, such as on white walls. Conversely, these sensors are noisy and often perform poorly on the textured scenes where stereo excels. We explore these complementary characteristics and introduce a method for combining the results of both approaches that achieves better accuracy than either alone. In our fusion framework, the depth probability distribution functions from the two sensor modalities are formulated and optimized. Robust and adaptive fusion is built on a pixel-wise reliability weighting function calculated for each method. In addition, since time-of-flight devices have primarily been used as individual sensors, they are typically poorly calibrated; we introduce a calibration method that substantially improves upon the manufacturer's calibration. An extensive set of experimental results demonstrates that the proposed techniques improve both accuracy and robustness.
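The fusion step can be pictured with a simple per-pixel model. Below is a minimal sketch of reliability-weighted depth fusion, assuming each modality contributes a Gaussian depth likelihood scaled by a per-pixel reliability weight, so that the fused depth reduces to a precision-weighted average; this only illustrates the weighting idea and is not the paper's actual formulation, which builds full depth probability distribution functions and optimizes them globally. The function name `fuse_depth_maps`, the constant noise levels, and the toy weights are all hypothetical.

```python
import numpy as np

def fuse_depth_maps(z_tof, z_stereo, w_tof, w_stereo,
                    sigma_tof=0.02, sigma_stereo=0.05):
    """Pixel-wise fusion of two depth maps (hypothetical helper).

    Each sensor is modeled as a Gaussian N(z_i, sigma_i^2) whose precision
    is scaled by a reliability weight w_i in [0, 1]; the weighted product
    of the two Gaussians has its mean at a precision-weighted average.
    """
    prec_tof = w_tof / sigma_tof ** 2           # reliability-scaled precision (ToF)
    prec_stereo = w_stereo / sigma_stereo ** 2  # reliability-scaled precision (stereo)
    prec_fused = prec_tof + prec_stereo
    # Guard against pixels where both sensors are deemed unreliable.
    return (prec_tof * z_tof + prec_stereo * z_stereo) / np.maximum(prec_fused, 1e-12)

# Toy example: stereo is trusted on textured pixels, ToF on textureless ones.
z_tof = np.array([[1.00, 2.10], [1.50, 3.00]])      # depths in meters
z_stereo = np.array([[1.04, 2.00], [1.46, 2.90]])
w_tof = np.array([[0.9, 0.2], [0.8, 0.5]])          # hypothetical reliabilities
w_stereo = np.array([[0.1, 0.9], [0.3, 0.5]])
print(fuse_depth_maps(z_tof, z_stereo, w_tof, w_stereo))
```

In the paper the reliability weights feed a global optimization rather than an independent per-pixel average; the closed-form product of Gaussians above is simply the easiest way to see why down-weighting the unreliable sensor at each pixel improves the combined estimate.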
Original language | English |
---|---|
Article number | 5567112 |
Pages (from-to) | 1400-1414 |
Number of pages | 15 |
Journal | IEEE Transactions on Pattern Analysis and Machine Intelligence |
Volume | 33 |
Issue number | 7 |
DOIs | |
State | Published - 2011 |
Bibliographical note
Funding Information: Ruigang Yang was supported by the University of Kentucky Research Foundation, the US Department of Homeland Security, and US National Science Foundation (NSF) grants HCC-0448185 and CPA-0811647. James E. Davis was supported by NSF CCF-0746690. Zhigeng Pan was supported by the China NSFC 60533080 and the China 863 Program 2006AA01Z335. The authors thank Qing Zhang and Xueqing Xiang for collecting part of the data. This work was done when Jiejie Zhu was with the University of Kentucky as a postdoctoral researcher.
Funding
Funders | Funder number |
---|---|
National Science Foundation (NSF) | CCF-0746690, HCC-0448185, CPA-0811647 |
U.S. Department of Homeland Security | |
University of Kentucky Research Foundation | |
National Natural Science Foundation of China (NSFC) | 60533080 |
National High-Tech R&D Program of China (863 Program) | 2006AA01Z335 |
Keywords
- Time-of-Flight sensor
- global optimization
- multisensor fusion
- stereo vision
ASJC Scopus subject areas
- Software
- Computer Vision and Pattern Recognition
- Computational Theory and Mathematics
- Artificial Intelligence
- Applied Mathematics