Free-viewpoint Indoor Neural Relighting from Multi-view Stereo


Julien Philip (Inria, Université Côte d'Azur and Adobe Research)
Sébastien Morgenthaler (Inria, Université Côte d'Azur)
Michaël Gharbi (Adobe Research)
George Drettakis (Inria, Université Côte d'Azur)

Abstract

We introduce a neural relighting algorithm for captured indoor scenes that allows interactive free-viewpoint navigation. Our method allows illumination to be changed synthetically while coherently rendering cast shadows and complex glossy materials. We start with multiple images of the scene and a 3D mesh obtained by multi-view stereo (MVS) reconstruction. We assume that lighting is well explained as the sum of a view-independent diffuse component and a view-dependent glossy term concentrated around the mirror reflection direction. We design a convolutional network around input feature maps that facilitate learning an implicit representation of scene materials and illumination, enabling both relighting and free-viewpoint navigation. We generate these input maps by exploiting the best elements of both image-based and physically-based rendering: we sample the input views to estimate diffuse scene irradiance, and compute the new illumination caused by user-specified light sources using path tracing. To facilitate the network's understanding of materials and synthesize plausible glossy reflections, we reproject the views and compute mirror images. We train the network on a synthetic dataset in which each scene is also reconstructed with MVS. We show results of our algorithm relighting real indoor scenes and performing free-viewpoint navigation with complex and realistic glossy reflections, which so far remained out of reach for view-synthesis techniques.
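
As a concrete illustration of the mirror-image step above, the sketch below computes per-pixel mirror reflection directions from position and normal maps rendered off the MVS mesh. This is a minimal sketch under assumed conventions: the buffer names, the function mirror_directions, and the NumPy formulation are illustrative, not the paper's actual implementation.

import numpy as np

def mirror_directions(cam_pos, positions, normals):
    """Per-pixel mirror reflection directions, r = d - 2 (d . n) n.

    cam_pos   : (3,) world-space camera center (illustrative convention)
    positions : (H, W, 3) world-space points rendered from the MVS mesh
    normals   : (H, W, 3) surface normals at those points
    """
    d = positions - cam_pos                                   # incident view rays
    d = d / np.linalg.norm(d, axis=-1, keepdims=True)         # normalize
    n = normals / np.linalg.norm(normals, axis=-1, keepdims=True)
    # Reflect each incident direction about the surface normal.
    return d - 2.0 * np.sum(d * n, axis=-1, keepdims=True) * n

One plausible way to obtain mirror images like those described in the abstract is to intersect these reflected rays with the reconstructed mesh and sample the reprojected input views at the hit points; the exact resampling strategy used by the method is detailed in the paper.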

Bibtex Citation


@Article{PMGD21,
   author       = "Philip, Julien and Morgenthaler, S\'ebastien and Gharbi, Micha{\"e}l and Drettakis, George",
   title        = "Free-viewpoint Indoor Neural Relighting from Multi-view Stereo",
   journal      = "ACM Transactions on Graphics",
   year         = "2021",
   url          = "http://www-sop.inria.fr/reves/Basilic/2021/PMGD21"
}


Paper | Video | Supplementals | Test Data | Code

Acknowledgments

This research was funded in part by the ERC Advanced grant FUNGRAPH No 788065 (http://fungraph.inria.fr). The authors are grateful to the OPAL infrastructure from Université Côte d'Azur for providing resources and support. The authors thank G. Riegler and J-H. Wu for help with comparisons. Thanks to A. Bousseau and P. Shirley for proofreading earlier drafts. Finally, the authors thank the anonymous reviewers for their valuable feedback.

Paper

Free-viewpoint Indoor Neural Relighting from Multi-view Stereo (PDF)
GraphDeco publication page

Video

Code

The code is released as a SIBR framework project at: gitlab.inria.fr/sibr/projects/indoor_relighting


Supplemental Materials and Results

Real Captured Scenes

For each captured scene we provide:
(1) A video of the interpolation between a view and its relit counterparts
(2) A view-synthesis path rendered in the original and relit condition(s)
(3) Comparisons for view synthesis.

Bedroom 1
Bedroom 2
Hall
Kitchen
Living room
Sofa

Synthetic Scenes

For these synthetic scenes we provide:
(1) Comparisons for view synthesis.

Synthetic Living Room
Synthetic Kitchen

Test Data Download

For each captured scene we provide a dataset containing the images, mesh, and camera calibration, compatible with the SIBR framework.

Bedroom 1
Bedroom 2
Hall
Kitchen
Living room
Sofa

We also provide the RAW capture for some datasets.

Bedroom1 | Bedroom2 | Hall | Sofa91