TY - JOUR
T1 - Virtual mirror rendering with stationary RGB-D cameras and stored 3-D background
AU - Shen, Ju
AU - Su, Po Chang
AU - Cheung, Sen Ching Samson
AU - Zhao, Jian
PY - 2013
Y1 - 2013
N2 - Mirrors are indispensable objects in our lives. The capability of simulating a mirror on a computer display, augmented with virtual scenes and objects, opens the door to many interesting and useful applications, from fashion design to medical interventions. Realistic simulation of a mirror is challenging as it requires accurate viewpoint tracking and rendering, wide-angle viewing of the environment, as well as real-time performance to provide immediate visual feedback. In this paper, we propose a virtual mirror rendering system using a network of commodity structured-light RGB-D cameras. The depth information provided by the RGB-D cameras can be used to track the viewpoint and render the scene from different perspectives. Missing and erroneous depth measurements are common problems with structured-light cameras. A novel depth denoising and completion algorithm is proposed in which the noise removal and interpolation procedures are guided by the foreground/background label at each pixel. The foreground/background label is estimated using a probabilistic graphical model that considers color, depth, background modeling, depth noise modeling, and spatial constraints. The wide viewing angle of the mirror system is realized by combining the dynamic scene, captured by the static camera network, with a 3-D background model created off-line using a color-depth sequence captured by a movable RGB-D camera. To ensure a real-time response, a scalable client-and-server architecture is used in which the 3-D point cloud processing, the viewpoint estimation, and the mirror image rendering are all done on the client side. The mirror image and the viewpoint estimate are then sent to the server for final mirror view synthesis and viewpoint refinement. Experimental results are presented to show the accuracy and effectiveness of each component and the entire system.
AB - Mirrors are indispensable objects in our lives. The capability of simulating a mirror on a computer display, augmented with virtual scenes and objects, opens the door to many interesting and useful applications, from fashion design to medical interventions. Realistic simulation of a mirror is challenging as it requires accurate viewpoint tracking and rendering, wide-angle viewing of the environment, as well as real-time performance to provide immediate visual feedback. In this paper, we propose a virtual mirror rendering system using a network of commodity structured-light RGB-D cameras. The depth information provided by the RGB-D cameras can be used to track the viewpoint and render the scene from different perspectives. Missing and erroneous depth measurements are common problems with structured-light cameras. A novel depth denoising and completion algorithm is proposed in which the noise removal and interpolation procedures are guided by the foreground/background label at each pixel. The foreground/background label is estimated using a probabilistic graphical model that considers color, depth, background modeling, depth noise modeling, and spatial constraints. The wide viewing angle of the mirror system is realized by combining the dynamic scene, captured by the static camera network, with a 3-D background model created off-line using a color-depth sequence captured by a movable RGB-D camera. To ensure a real-time response, a scalable client-and-server architecture is used in which the 3-D point cloud processing, the viewpoint estimation, and the mirror image rendering are all done on the client side. The mirror image and the viewpoint estimate are then sent to the server for final mirror view synthesis and viewpoint refinement. Experimental results are presented to show the accuracy and effectiveness of each component and the entire system.
KW - 3-D scene scanning
KW - Markov random field
KW - Mirrors
KW - RGB-D system
KW - client-server systems
KW - depth image denoising
KW - image denoising
KW - image reconstruction
UR - http://www.scopus.com/inward/record.url?scp=84880546020&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84880546020&partnerID=8YFLogxK
U2 - 10.1109/TIP.2013.2268941
DO - 10.1109/TIP.2013.2268941
M3 - Article
C2 - 23782808
AN - SCOPUS:84880546020
SN - 1057-7149
VL - 22
SP - 3433
EP - 3448
JO - IEEE Transactions on Image Processing
JF - IEEE Transactions on Image Processing
IS - 9
M1 - 6532397
ER -