
Deep Light Direction Reconstruction from single RGB images

WSCG 2021: Full Papers Proceedings: 29th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, pp. 31-40 (2021)
DOI: 10.24132/CSRN.2021.3101.4

Abstract

In augmented reality applications, consistent illumination between virtual and real objects is important for creating an immersive user experience. Consistent illumination can be achieved by parameterising the virtual illumination model so that it matches the real-world lighting conditions. In this study, we developed a method to reconstruct the general light direction from red-green-blue (RGB) images of real-world scenes using a modified VGG-16 neural network. We reconstructed the general light direction as azimuth and elevation angles. To avoid inaccurate results caused by the coordinate uncertainty that occurs at steep elevation angles, we further introduced stereographically projected coordinates. Unlike recent deep-learning-based approaches for reconstructing the light source direction, our approach does not require depth information and thus does not rely on special red-green-blue-depth (RGB-D) images as input.
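
The abstract gives no implementation details, so the following sketches are illustrative only. The first is a minimal sketch of stereographically projected target coordinates, assuming an azimuth/elevation convention in which elevation is measured from the ground plane and the light direction lies on the upper hemisphere; the function names and projection convention are hypothetical, not taken from the paper. The point of such a projection is that azimuth becomes ill-defined near 90 degrees elevation, whereas the projected (u, v) pair remains well conditioned there.

import numpy as np

def angles_to_stereographic(azimuth, elevation):
    # Unit direction on the sphere (assumed convention: azimuth in the
    # ground plane, elevation measured upwards from that plane).
    x = np.cos(elevation) * np.cos(azimuth)
    y = np.cos(elevation) * np.sin(azimuth)
    z = np.sin(elevation)
    # Stereographic projection from the pole opposite the upper
    # hemisphere onto the equatorial plane.
    return x / (1.0 + z), y / (1.0 + z)

def stereographic_to_angles(u, v):
    # Inverse mapping from the projected plane back to angles.
    s = u * u + v * v
    x = 2.0 * u / (1.0 + s)
    y = 2.0 * v / (1.0 + s)
    z = (1.0 - s) / (1.0 + s)
    return np.arctan2(y, x), np.arcsin(z)

Likewise, one plausible reading of "modified VGG-16" is a VGG-16 backbone whose classifier is replaced by a small regression head that predicts the two projected coordinates. The sketch below follows that assumption; the head sizes, weight initialisation, and training setup are not the authors' configuration.

import torch.nn as nn
from torchvision import models

class LightDirectionNet(nn.Module):
    # VGG-16 convolutional backbone with a two-value regression head.
    def __init__(self):
        super().__init__()
        backbone = models.vgg16()  # pretrained weights could be loaded here
        self.features = backbone.features
        self.avgpool = backbone.avgpool
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(512 * 7 * 7, 256),
            nn.ReLU(inplace=True),
            nn.Linear(256, 2),  # (u, v) in the stereographic plane
        )

    def forward(self, x):
        return self.head(self.avgpool(self.features(x)))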
