Towards Space-time Light Field Rendering

Huamin Wang, Georgia Institute of Technology
Ruigang Yang, University of Kentucky

ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games (I3D) 2005


Full Paper (PDF, 2.8 MB)

Video (DivX, 4.1 MB)


Abstract 

So far, extending light field rendering to dynamic scenes has been treated simply as the rendering of static light fields stacked in time. This type of approach requires the input video sequences to be strictly synchronized and allows only discrete exploration of the temporal domain, at instants determined by the capture rate. In this paper we propose a novel framework, space-time light field rendering, which allows continuous exploration of a dynamic scene in both the spatial and temporal domains using unsynchronized input video sequences.
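To make the timing problem concrete: with unsynchronized cameras, each sequence has its own start offset and frame rate, so a query at an arbitrary global time falls between two captured frames of every camera. The sketch below shows how such bracketing frames and the fractional blend weight could be located; the function name and the offset/rate parameterization are hypothetical illustrations, not part of the paper.

```python
import math

def bracketing_frames(t, start_offset, fps):
    """For a camera with its own start offset (seconds) and frame rate,
    return the indices of the two captured frames that bracket global
    time t, plus the fractional position alpha in [0, 1) between them."""
    local = (t - start_offset) * fps   # continuous frame index for this camera
    i = math.floor(local)              # last frame captured at or before t
    return i, i + 1, local - i

# Example: a camera that started 0.013 s late, running at 30 fps.
# A query at t = 1.0 s falls about 61% of the way from frame 29 to frame 30:
# bracketing_frames(1.0, 0.013, 30.0) -> (29, 30, ~0.61)
```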

To synthesize novel views from any viewpoint at any time instant, we develop a two-stage rendering algorithm. We first interpolate in the temporal domain to generate globally synchronized images, using a robust spatial-temporal image registration algorithm followed by edge-preserving image morphing. We then interpolate these software-synchronized images in the spatial domain to synthesize the final view. Experimental results show that our approach is robust and capable of maintaining photo-realistic results.
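The registration and edge-preserving morphing used in the paper are considerably more sophisticated; purely as a sketch of the two-stage structure, the Python below uses dense optical flow (OpenCV's Farneback method) as a stand-in for stage one and inverse-distance blending of the two nearest cameras as a stand-in for stage two. All names and numeric parameters here are hypothetical illustrations, not the paper's implementation.

```python
import numpy as np
import cv2

def temporal_morph(frame_a, frame_b, alpha):
    """Stage 1 (sketch): synthesize a frame at fractional time alpha between
    two consecutive frames of one camera, via flow-based warping and a
    cross-dissolve (a crude stand-in for edge-preserving morphing)."""
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = gray_a.shape
    gx, gy = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    # Approximate backward warps: sample each source frame part-way along
    # the flow (the flow field is anchored at the first frame's pixels).
    warp_a = cv2.remap(frame_a, gx - alpha * flow[..., 0],
                       gy - alpha * flow[..., 1], cv2.INTER_LINEAR)
    warp_b = cv2.remap(frame_b, gx + (1 - alpha) * flow[..., 0],
                       gy + (1 - alpha) * flow[..., 1], cv2.INTER_LINEAR)
    return cv2.addWeighted(warp_a, 1.0 - alpha, warp_b, alpha, 0.0)

def spatial_blend(synced_images, cam_positions, novel_pos):
    """Stage 2 (sketch): blend the software-synchronized images of the two
    cameras nearest the novel viewpoint, weighted by inverse distance.
    A real light field renderer would resample rays per pixel instead."""
    d = np.linalg.norm(cam_positions - novel_pos, axis=1)
    i, j = np.argsort(d)[:2]
    w_i = d[j] / (d[i] + d[j])          # nearer camera gets the higher weight
    return cv2.addWeighted(synced_images[i], w_i,
                           synced_images[j], 1.0 - w_i, 0.0)
```

A full pipeline would call temporal_morph once per camera, with alpha taken from the bracketing computation above, and then hand the software-synchronized set to spatial_blend (or to a proper light field resampler).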


Results

Figure 1: Synthesized results of the book sequence. Top: traditional light field rendering (LFR) with unsynchronized frames. Bottom: our space-time LFR.