Abstract

This paper studies how to flexibly integrate reconstructed 3D models into practical 3D modeling pipelines such as 3D scene creation and rendering. Owing to technical limitations, existing 3D reconstruction techniques yield only rough 3D models (R3DMs) for most real objects. As a result, physically-based rendering (PBR) produces low-quality images or videos for scenes constructed from R3DMs. One promising solution is to represent real-world objects as Neural Fields such as NeRFs, which can generate photo-realistic renderings of an object from desired viewpoints. However, a drawback is that views synthesized through Neural Fields Rendering (NFR) cannot reflect the simulated lighting details on R3DMs in PBR pipelines, especially when object interactions during 3D scene creation cause local shadows. To resolve this dilemma, we propose a lighting transfer network (LighTNet) that bridges NFR and PBR so that each can benefit from the other. LighTNet reasons about a simplified image composition model, remedies the uneven-surface issue caused by R3DMs, and is empowered by several perceptually motivated constraints and a new Lab angle loss that enhances the contrast between lighting strength and colors. Comparisons demonstrate that LighTNet is superior at synthesizing impressive lighting, and is promising for pushing NFR further into practical 3D modeling workflows.
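The Lab angle loss is only named above, not specified. As a rough, non-authoritative sketch of what an angular loss in Lab space could look like, the function below treats each pixel's (L, a, b) triple as a vector and penalizes the angle between prediction and target; the function name, the per-pixel angular formulation, and the assumption that inputs are already in Lab space are ours, not details from the paper:

```python
import numpy as np

def lab_angle_loss(pred_lab, target_lab, eps=1e-8):
    """Illustrative angular loss between images in Lab space.

    Each pixel's (L, a, b) triple is treated as a 3-vector; the loss
    is the mean angle (in radians) between the predicted and target
    vectors. The angle depends only on the vector's direction, which
    loosely separates color/lighting direction from overall strength.
    """
    p = pred_lab.reshape(-1, 3).astype(np.float64)
    t = target_lab.reshape(-1, 3).astype(np.float64)
    cos = np.sum(p * t, axis=1) / (
        np.linalg.norm(p, axis=1) * np.linalg.norm(t, axis=1) + eps)
    return float(np.mean(np.arccos(np.clip(cos, -1.0, 1.0))))
```

For identical images the loss is near zero, while pixels whose Lab vectors point in orthogonal directions contribute an angle of pi/2.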


[Teaser figure: interior design and real-scene results]


Lighting Transfer Network (LighTNet)

We propose a Lighting Transfer Network (LighTNet) to bridge NFR and PBR, such that they can benefit from each other. LighTNet takes the “Shading” rendered by a PBR system and an image synthesized by NFR techniques (e.g., NeRF) as input, and outputs photo-realistic renderings with rich lighting details.
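As a rough intuition for transferring PBR shading onto an NFR rendering under a simplified Lambertian-style composition model I = A ⊙ S, one can recover a pseudo-albedo from the NFR image and re-light it with the PBR shading. This analytic sketch is our illustration only; LighTNet itself is a learned network, and the function name and decomposition below are assumptions:

```python
import numpy as np

def transfer_lighting(nfr_image, nfr_shading, pbr_shading, eps=1e-6):
    """Illustrative lighting transfer under I = A * S.

    nfr_image:   photo-realistic rendering from the neural field
    nfr_shading: shading under which nfr_image was observed
    pbr_shading: shading simulated by the PBR engine (with shadows)
    Returns the NFR appearance re-lit by the PBR shading.
    """
    albedo = nfr_image / (nfr_shading + eps)          # pseudo-albedo A = I / S
    return np.clip(albedo * pbr_shading, 0.0, 1.0)    # relit image A * S'
```

With this view, shadows present in the PBR shading (e.g., from object-to-object interactions) darken the corresponding regions of the re-lit NFR image; the learned network additionally handles the uneven-surface artifacts of R3DMs that this naive division cannot.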


Rendering with LighTNet and R3DMs

We can represent real-world objects as individual NeRFs paired with R3DMs, and freely composite them to create unlimited 3D scenes. After artists edit the lighting, LighTNet transfers the direct and indirect lighting effects on the R3DMs to the corresponding NFR instances.



Generalizing to Real-Lighting

We reconstruct several real objects and use them to compose new scenes. Here, “NeRF” denotes the 2D instance synthesized by NeRF. Our LighTNet successfully preserves the lighting details; note in particular the shadows caused by object-to-object interactions.


Qualitative Comparisons

We present qualitative comparisons against reformulated Pix2Pix and SSVBRDF baselines.


Acknowledgements

We are very grateful for the support provided by the Tao Technology Department, Alibaba Group and Alibaba Homestyler.
The website template was borrowed from Michaël Gharbi.