Towards space-time light field rendering

Huamin Wang, Ruigang Yang

Research output: Paper › peer review

10 Citations (Scopus)

Abstract

So far, extending light field rendering to dynamic scenes has been treated trivially as the rendering of static light fields stacked in time. This type of approach requires input video sequences in strict synchronization and allows only discrete exploration in the temporal domain, determined by the capture rate. In this paper we propose a novel framework, space-time light field rendering, which allows continuous exploration of a dynamic scene in both the spatial and temporal domains with unsynchronized input video sequences. In order to synthesize novel views from any viewpoint at any time instant, we develop a two-stage rendering algorithm. We first interpolate in the temporal domain to generate globally synchronized images, using a robust spatial-temporal image registration algorithm followed by edge-preserving image morphing. We then interpolate those software-synchronized images in the spatial domain to synthesize the final view. Our experimental results show that our approach is robust and capable of maintaining photo-realistic results.
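The two-stage pipeline described above can be sketched in a few lines. This is a minimal illustration, not the paper's method: the paper's first stage uses spatial-temporal registration and edge-preserving morphing, for which plain cross-fading stands in here, and the distance-weighted blend in the second stage is a hypothetical stand-in for light field interpolation. All function names and the toy data are assumptions for illustration.

```python
import numpy as np

def temporal_interpolate(frame_a, frame_b, t_a, t_b, t):
    """Stage 1 (simplified): synthesize a frame at global time t from two
    consecutive frames of one unsynchronized camera. Cross-fading is a
    stand-in for the paper's registration + edge-preserving morphing."""
    w = (t - t_a) / (t_b - t_a)
    return (1.0 - w) * frame_a + w * frame_b

def spatial_interpolate(synced_frames, cam_positions, target_pos):
    """Stage 2 (simplified): blend the software-synchronized views with
    weights that fall off with camera-to-viewpoint distance."""
    d = np.linalg.norm(cam_positions - target_pos, axis=1)
    w = 1.0 / (d + 1e-6)
    w /= w.sum()
    # Contract the weight vector against the stack of frames.
    return np.tensordot(w, synced_frames, axes=1)

# Toy data: two cameras, 4x4 grayscale frames, unsynchronized timestamps.
rng = np.random.default_rng(0)
frames = {c: (rng.random((4, 4)), rng.random((4, 4))) for c in range(2)}
times = {0: (0.0, 1.0), 1: (0.1, 1.1)}  # each camera's capture times differ
t_query = 0.5

# Stage 1: bring every camera to the same global time instant.
synced = np.stack([
    temporal_interpolate(frames[c][0], frames[c][1], *times[c], t_query)
    for c in range(2)
])

# Stage 2: interpolate the synchronized views at a novel viewpoint.
cams = np.array([[0.0, 0.0], [1.0, 0.0]])
novel_view = spatial_interpolate(synced, cams, np.array([0.5, 0.0]))
print(novel_view.shape)  # (4, 4)
```

The key design point the sketch preserves is the ordering: temporal synchronization happens per camera first, so the spatial stage only ever blends images that correspond to the same time instant.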

Original language: English
Pages: 125-132
Number of pages: 8
Status: Published - 2005
Event: I3D 2005: ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games - Washington, DC, United States
Duration: Apr 3, 2005 - Apr 6, 2005

Conference

Conference: I3D 2005: ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games
Country/Territory: United States
City: Washington, DC
Period: 4/3/05 - 4/6/05

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
  • Computer Graphics and Computer-Aided Design
