Time-of-flight range sensors and passive stereo have inherently complementary characteristics. To fuse them into high-accuracy depth maps that vary over time, we extend traditional spatial MRFs to dynamic MRFs with temporal coherence. This new model allows both spatial and temporal relationships to be propagated among local neighbors. By efficiently finding a maximum of the posterior probability using loopy belief propagation, we show that our approach improves the accuracy and robustness of depth estimates for dynamic scenes.
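The MAP inference step described in the abstract can be illustrated with a generic min-sum loopy belief propagation routine on a 4-connected grid MRF. This is a simplified stand-in, not the paper's actual energy: the function name, the truncated-linear smoothness term, and all parameters are assumptions for illustration.

```python
import numpy as np

def loopy_bp_map(unary, lam=1.0, trunc=2.0, n_iters=10):
    """Approximate MAP labels on a 4-connected grid MRF via min-sum loopy BP.

    unary : (H, W, L) per-pixel data costs. The pairwise term is a generic
    truncated-linear smoothness cost, lam * min(|p - q|, trunc) -- an assumed
    placeholder, not the paper's fusion energy. Returns (H, W) label indices.
    """
    H, W, L = unary.shape
    lab = np.arange(L, dtype=float)
    pair = lam * np.minimum(np.abs(lab[:, None] - lab[None, :]), trunc)

    # msgs[d, y, x] is the message arriving at pixel (y, x) from direction d.
    UP, DOWN, LEFT, RIGHT = 0, 1, 2, 3
    msgs = np.zeros((4, H, W, L))

    for _ in range(n_iters):
        new = np.zeros_like(msgs)
        total = msgs.sum(axis=0)
        # For each send direction: (reverse message to exclude,
        # direction the message lands as, source slice, destination slice).
        for excl, land, sl_src, sl_dst in [
            (UP,    DOWN,  np.s_[1:, :],  np.s_[:-1, :]),   # send up
            (DOWN,  UP,    np.s_[:-1, :], np.s_[1:, :]),    # send down
            (LEFT,  RIGHT, np.s_[:, 1:],  np.s_[:, :-1]),   # send left
            (RIGHT, LEFT,  np.s_[:, :-1], np.s_[:, 1:]),    # send right
        ]:
            base = unary + total - msgs[excl]              # exclude reverse message
            m = (base[..., :, None] + pair).min(axis=-2)   # min over source labels
            m -= m.min(axis=-1, keepdims=True)             # normalize for stability
            new[land][sl_dst] = m[sl_src]
        msgs = new

    belief = unary + msgs.sum(axis=0)   # combine data costs with all messages
    return belief.argmin(axis=-1)       # per-pixel MAP label
```

With strong enough smoothness, an isolated pixel whose data term weakly disagrees with its neighbors is overruled by the incoming messages, which is the behavior the fusion model relies on to regularize noisy per-pixel depth evidence.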
Number of pages: 11
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
State: Published - 2010
Bibliographical note
Funding Information:
This work is supported in part by the University of Kentucky Research Foundation, the US Department of Homeland Security, US National Science Foundation Grants HCC-0448185 and CPA-0811647, and the Open Project Program of the State Key Lab of CAD&CG (grant no.: A0812), Zhejiang University, China. This work was performed while Jiejie Zhu was with the University of Kentucky.
Keywords
- Data fusion
- Global optimization
- Time-of-flight sensor
ASJC Scopus subject areas
- Computer Vision and Pattern Recognition
- Computational Theory and Mathematics
- Artificial Intelligence
- Applied Mathematics