Space-Time Light Field Rendering
Jul 1, 2007

Huamin Wang
Mingxuan Sun
Ruigang Yang

Abstract
In this paper, we propose a novel framework called space-time light field rendering, which allows continuous exploration of a dynamic scene in both space and time. Compared to existing light field capture/rendering systems, it offers the capability of using unsynchronized video inputs and the added freedom of controlling the visualization in the temporal domain, such as smooth slow motion and temporal integration. In order to synthesize novel views from any viewpoint at any time instant, we develop a two-stage rendering algorithm. We first interpolate in the temporal domain to generate globally synchronized images using a robust spatial-temporal image registration algorithm followed by edge-preserving image morphing. We then interpolate these software-synchronized images in the spatial domain to synthesize the final view. In addition, we introduce a very accurate and robust algorithm to estimate subframe temporal offsets among input video sequences. Experimental results from unsynchronized videos with or without time stamps show that our approach is capable of maintaining photorealistic quality from a variety of real scenes.
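To make the two-stage pipeline concrete, here is a minimal sketch in Python. It is not the paper's implementation: the temporal stage below uses a plain cross-dissolve as a stand-in for the spatial-temporal registration and edge-preserving morphing, and the spatial stage uses fixed per-camera blending weights as a stand-in for geometry-aware light field interpolation. The function names, the `offsets` list of estimated subframe offsets, and the toy data are all hypothetical.

```python
import numpy as np

def synchronize_frame(video, offset, t):
    """Temporal stage: interpolate a frame at global time t from one
    unsynchronized video. The paper does spatial-temporal registration
    plus edge-preserving morphing; a cross-dissolve stands in here."""
    local_t = t - offset                       # map global time to this camera's clock
    i = int(np.floor(local_t))
    a = local_t - i                            # subframe blending weight in [0, 1)
    f0 = video[i]
    f1 = video[min(i + 1, len(video) - 1)]
    return (1.0 - a) * f0 + a * f1             # stand-in for the morph

def render_view(videos, offsets, t, view_weights):
    """Spatial stage: blend the software-synchronized images into the
    novel view. Real light field rendering would also warp each image
    by camera geometry; fixed per-camera weights stand in here."""
    synced = [synchronize_frame(v, o, t) for v, o in zip(videos, offsets)]
    w = np.asarray(view_weights, dtype=float)
    w /= w.sum()                               # normalize the blending weights
    return sum(wi * img for wi, img in zip(w, synced))

# Toy example: three 10-frame grayscale "videos" with subframe offsets.
rng = np.random.default_rng(0)
videos = [rng.random((10, 48, 64)) for _ in range(3)]
offsets = [0.0, 0.37, 0.81]                    # hypothetical estimated offsets
novel = render_view(videos, offsets, t=4.5, view_weights=[0.2, 0.5, 0.3])
print(novel.shape)                             # (48, 64)
```

The key design point the sketch preserves is the ordering: every camera is first brought to a common time instant (using the estimated subframe offsets), and only then are the synchronized images combined spatially, so temporal and spatial interpolation never mix artifacts.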
Publication
IEEE Transactions on Visualization and Computer Graphics, 13(4)
Keywords
Robustness
Motion Control
Lighting Control
Control Systems
Visualization
Control System Synthesis
Rendering (Computer Graphics)
Image Generation
Image Registration
Image-Based Rendering
Space-Time Light Field
Epipolar Constraint
Image Morphing