Abstract
So far, extending light field rendering to dynamic scenes has been treated trivially as the rendering of static light fields stacked in time. This type of approach requires input video sequences in strict synchronization and allows only discrete exploration in the temporal domain, at steps determined by the capture rate. In this paper we propose a novel framework, space-time light field rendering, which allows continuous exploration of a dynamic scene in both the spatial and temporal domains with unsynchronized input video sequences. In order to synthesize novel views from any viewpoint at any time instant, we develop a two-stage rendering algorithm. We first interpolate in the temporal domain to generate globally synchronized images, using a robust spatial-temporal image registration algorithm followed by edge-preserving image morphing. We then interpolate these software-synchronized images in the spatial domain to synthesize the final view. Our experimental results show that our approach is robust and capable of maintaining photo-realistic results.
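The abstract describes a two-stage pipeline: temporal interpolation to software-synchronize the cameras, then spatial interpolation across views. Below is a minimal sketch of that structure, not the paper's method: dense Farnebäck optical flow with flow-based morphing stands in for the paper's robust spatial-temporal registration and edge-preserving morphing, and a simple weighted per-camera blend stands in for true light field interpolation. The functions `synchronize_frame` and `render_view`, and the camera record fields `t0`, `fps`, and `frames`, are hypothetical names introduced for illustration.

```python
import numpy as np
import cv2

def synchronize_frame(frame_a, frame_b, t):
    """Stage 1 (sketch): interpolate one camera's stream at fractional
    time t in [0, 1] between two captured frames. Flow-based morphing
    stands in for the paper's edge-preserving morphing."""
    ga = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gb = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(ga, gb, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = ga.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Assuming linear motion, a pixel of the intermediate frame maps
    # back by t * flow into frame_a and forward by (1 - t) * flow
    # into frame_b (flow evaluated at the target pixel; approximate).
    map_a = np.stack([grid_x - t * flow[..., 0],
                      grid_y - t * flow[..., 1]], axis=-1).astype(np.float32)
    map_b = np.stack([grid_x + (1 - t) * flow[..., 0],
                      grid_y + (1 - t) * flow[..., 1]], axis=-1).astype(np.float32)
    warped_a = cv2.remap(frame_a, map_a, None, cv2.INTER_LINEAR)
    warped_b = cv2.remap(frame_b, map_b, None, cv2.INTER_LINEAR)
    return cv2.addWeighted(warped_a, 1 - t, warped_b, t, 0)

def render_view(cameras, query_time, view_weights):
    """Stage 2 (sketch): blend the software-synchronized images in the
    spatial domain. Real light field rendering weights per ray; a
    per-camera blend keeps the sketch short."""
    out = None
    for cam, weight in zip(cameras, view_weights):
        # Each unsynchronized camera has its own start time and rate,
        # so map the global query time to a local fractional index.
        idx = (query_time - cam["t0"]) * cam["fps"]
        i = int(np.floor(idx))
        synced = synchronize_frame(cam["frames"][i],
                                   cam["frames"][i + 1], idx - i)
        term = weight * synced.astype(np.float32)
        out = term if out is None else out + term
    return np.clip(out, 0, 255).astype(np.uint8)
```

The key design point the sketch preserves is the ordering: per-camera temporal interpolation happens first, so the spatial stage can treat all inputs as if they had been captured in strict synchronization.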
Original language | English |
---|---|
Pages | 125-132 |
Number of pages | 8 |
State | Published - 2005 |
Event | I3D 2005: ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games - Washington, DC, United States |
Duration | Apr 3 2005 → Apr 6 2005 |
Conference
Conference | I3D 2005: ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games |
---|---|
Country/Territory | United States |
City | Washington, DC |
Period | 4/3/05 → 4/6/05 |
Keywords
- Epipolar constraints
- Image-based rendering
- Space-time light field
ASJC Scopus subject areas
- Software
- Human-Computer Interaction
- Computer Graphics and Computer-Aided Design