TY - GEN
T1 - Fusion of time-of-flight depth and stereo for high accuracy depth maps
AU - Zhu, Jiejie
AU - Wang, Liang
AU - Yang, Ruigang
AU - Davis, James
PY - 2008
Y1 - 2008
N2 - Time-of-flight range sensors have error characteristics which are complementary to passive stereo. They provide real-time depth estimates in conditions where passive stereo does not work well, such as on white walls. In contrast, these sensors are noisy and often perform poorly on the textured scenes for which stereo excels. We introduce a method for combining the results from both methods that performs better than either alone. A depth probability distribution function from each method is calculated and then merged. In addition, stereo methods have long used global methods such as belief propagation and graph cuts to improve results, and we apply these methods to this sensor. Since time-of-flight devices have primarily been used as individual sensors, they are typically poorly calibrated. We introduce a method that substantially improves upon the manufacturer's calibration. We show that these techniques lead to improved accuracy and robustness.
UR - http://www.scopus.com/inward/record.url?scp=51949105325&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=51949105325&partnerID=8YFLogxK
DO - 10.1109/CVPR.2008.4587761
M3 - Conference contribution
AN - SCOPUS:51949105325
SN - 9781424422432
T3 - 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR
BT - 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR
T2 - 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR
Y2 - 23 June 2008 through 28 June 2008
ER -