Light-field Technology

What is Light-field Technology?

A light-field describes the intensity and direction of all light rays passing through a given region of space. If all light rays in a scene are captured, it becomes possible to generate a perspective from any position, including the reconstruction of all depth-of-field information. Although it is not possible to capture a complete light-field with current technologies, a large portion can still be acquired under special conditions. This is a major advantage when editing scenes during subsequent post-production. Researchers in the Moving Picture Technologies department at Fraunhofer IIS are already demonstrating how scenes can be optimized using a limited number of perspectives and how light-field technology opens up new creative opportunities.

Currently, researchers rely on a setup of professional cameras in a planar arrangement that capture slightly different perspectives of a scene and from which information such as depth can be extracted. This makes it possible to shift the perspective and implement special effects such as virtual camera panning and zooming, all from a single light-field recording.
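As a rough illustration of how depth can be extracted from such a planar camera arrangement, the classic pinhole-stereo relation depth = focal length × baseline / disparity converts the pixel offset of a scene point between two neighbouring cameras into metric depth. This is a simplified sketch, not the Fraunhofer IIS algorithm; all names and numbers below are illustrative:

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Pinhole stereo model: depth = f * B / d.

    disparity_px    -- per-pixel horizontal offset between two
                       neighbouring cameras of the array (pixels)
    focal_length_px -- focal length expressed in pixels
    baseline_m      -- spacing between the two cameras in metres
    """
    disparity_px = np.asarray(disparity_px, dtype=np.float64)
    depth = np.full(disparity_px.shape, np.inf)  # zero disparity = infinitely far
    valid = disparity_px > 0
    depth[valid] = focal_length_px * baseline_m / disparity_px[valid]
    return depth

# Illustrative numbers: 1000 px focal length, 10 cm camera spacing,
# 20 px measured disparity -> the scene point is 5 m away.
print(depth_from_disparity([20.0], 1000.0, 0.10)[0])  # 5.0
```

Small disparities correspond to distant points, which is why wide camera spacing (a large baseline) improves depth resolution for far-away objects.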

Light-field Data Opens up New Possibilities in Movie Production

On-location retakes are time-consuming and expensive. What if the focus was incorrectly set during shooting or the perspective has to be changed? The use of intelligent algorithms and multicamera systems that provide multiple simultaneous views opens the door to a world of new post-production possibilities.

For recording, a setup of high-quality cameras in a planar arrangement can be used. The material can be processed and the scenes edited as needed during post-production. The algorithms developed by Fraunhofer IIS offer a wide range of editing features, such as the matrix effect, in which the actor is essentially "frozen" in the scene while the camera moves around them. The camera perspective can be shifted or expanded as if the camera had been physically moved, even though it remained stationary during shooting: the multiple views allow the camera position to be shifted virtually. Without changing the position of the cameras, the object can also be moved closer to or further away from the viewer, the so-called dolly zoom effect. The system can generate HDR images from the various perspectives or create 3D views from the existing depth information. Plug-ins can be integrated into post-production software for added creativity during the editing process.
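The virtual camera shift described above can be sketched in a few lines: once a depth map is available, each pixel is displaced horizontally in proportion to its disparity (inverse depth), so near objects move more than distant ones, producing the same parallax a real camera move would. This is a minimal forward-warping toy, not the Realception® implementation; occlusion handling and hole filling are deliberately omitted:

```python
import numpy as np

def shift_view(image, depth, focal_px, shift_m):
    """Forward-warp `image` to a camera shifted sideways by `shift_m` metres.

    Each pixel moves horizontally by disparity = f * shift / depth,
    so nearby pixels shift further than distant ones (parallax).
    Holes stay zero; overlapping pixels simply overwrite in scan order
    (a real implementation would z-buffer so closer pixels win).
    """
    h, w = depth.shape
    out = np.zeros_like(image)
    disparity = np.rint(focal_px * shift_m / depth).astype(int)
    for y in range(h):
        for x in range(w):
            nx = x - disparity[y, x]
            if 0 <= nx < w:
                out[y, nx] = image[y, x]
    return out

# Toy scanline at uniform depth: every pixel shifts by one position.
img = np.array([[1.0, 2.0, 3.0, 4.0]])
dep = np.ones((1, 4))
print(shift_view(img, dep, focal_px=1.0, shift_m=1.0))  # [[2. 3. 4. 0.]]
```

With a dense depth map per view, the same principle extends to the matrix and dolly-zoom effects: the virtual camera path is just a sequence of such warps blended across the real camera positions.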

The light-field recording system is also suitable for the production of stop-motion films where the camera remains stationary. Recent developments show the initial results of applying light-field technology to moving picture content.



Realception® Plug-in Suite

Realception® is a plug-in suite for professional movie post-production based on light-field and multi-camera data.

Here you can find more information about our creation and editing Plug-in Suite.

Download of light-field data

Are you interested in light-field technology?
Fraunhofer IIS now offers a light-field data set for free download.


EU-Project RealVision - Hyperrealistic Imaging Experience

The RealVision network brings together leading universities, industrially focused research centres and companies to train a new generation of highly skilled scientists and entrepreneurs in the area of hyper-realistic imaging, encoding and display technologies for the creation of high-quality, accurate imagery of realistic digital video.

ETN-FPI - European Training Network on Full Parallax Imaging

ETN-FPI is a four-year (2015-2019) Innovative Training Network that has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie Grant Agreement. It brings together eight beneficiaries and eight partner organizations from Finland, Sweden, Germany, the United Kingdom, Spain, Hungary and Singapore, with the aim of training a new generation of researchers in the area of full parallax imaging. This will be achieved by hiring 15 talented early-stage researchers and training them to become future research leaders in this area.


On-line Light-field Video Processing

Fraunhofer IIS and the Max Planck Institute for Informatics introduce a real-time multi-view correspondence algorithm that extracts multi-view depth maps from sparse, wide-baseline light-field video in order to produce high-quality novel views for virtual effects such as virtual apertures or virtual camera positions.
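At its core, multi-view correspondence means testing, for every pixel, which displacement (disparity) best aligns it with the other views. A deliberately naive winner-takes-all matcher for a single scanline pair illustrates the principle; the actual Fraunhofer/MPI algorithm uses far more robust matching costs and aggregation, and everything below is illustrative:

```python
import numpy as np

def scanline_disparity(left, right, max_disp):
    """Toy winner-takes-all matcher for one scanline pair.

    For each left-image pixel, test every disparity hypothesis
    0..max_disp and keep the one with the smallest absolute
    intensity difference to the corresponding right-image pixel.
    """
    w = left.shape[0]
    disp = np.zeros(w, dtype=int)
    for x in range(w):
        costs = []
        for d in range(max_disp + 1):
            xr = x - d  # matching pixel in the right view
            costs.append(abs(left[x] - right[xr]) if xr >= 0 else np.inf)
        disp[x] = int(np.argmin(costs))  # winner takes all
    return disp

# A bright feature at x=3 in the left view appears at x=1 in the
# right view, i.e. its true disparity is 2.
left = np.array([0.0, 0.0, 0.0, 9.0, 0.0, 0.0])
right = np.array([0.0, 9.0, 0.0, 0.0, 0.0, 0.0])
print(scanline_disparity(left, right, 3)[3])  # 2
```

Textureless pixels match everywhere, which is why practical algorithms aggregate costs over windows and across many views before deciding.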


"Coming Home" − First Live-Action-Clip with Light-field Processing

On set, more and more cameras and camera arrays are used to generate additional views and data for visual effects or depth map information. Together with Stuttgart Media University (HdM), the Computational Imaging team of Fraunhofer's Moving Picture Technologies department set up a project to evaluate and demonstrate the potential of shooting with multi-camera arrays and light-field processing.

The project resulted in a short live-action pilot clip called "Coming Home", which impressively shows the possibilities and performance of light-field processing in a real production workflow. The breakdown movie demonstrates the requirements and conditions on set.



Breakdown Coming Home



Publications

"Light-field acquisition system allowing camera viewpoint and depth of field compositing in post-production."
Frederik Zilly, Michael Schöberl, Peter Schäfer, Matthias Ziegler, Joachim Keinert, Siegfried Foessel.
IBC 2013, Amsterdam, 13–17 September 2013.

"Acquisition system for dense lightfield of large scenes."
Matthias Ziegler, Ron op het Veld, Joachim Keinert, Frederik Zilly.
3DTV-Con 2017, Copenhagen, 7–9 June 2017.

"Immersive Virtual Reality for live-action video using camera arrays."
Matthias Ziegler, Joachim Keinert, Nina Holzer, Thorsten Wolf, Tobias Jaschke, Ron op het Veld, Faezeh Sadat Zakeri, Siegfried Foessel.
IBC 2017, Amsterdam, 15–19 September 2017.