GS-ID: Illumination Decomposition on Gaussian Splatting

HKUST-GZ, HKUST, Tencent Hunyuan

ICCV 2025

Abstract

We present GS-ID, a novel end-to-end framework that achieves comprehensive illumination decomposition by integrating adaptive light aggregation with diffusion-based material priors. In addition to a learnable environment map that captures ambient illumination, we model complex local lighting conditions by adaptively aggregating a set of anisotropic and spatially-varying spherical Gaussian mixtures during optimization. To better model shadow effects, we associate a learnable unit vector with each splat to represent how multiple light sources cast shadows, further enhancing lighting and material estimation. Together with intrinsic priors from diffusion models, GS-ID significantly reduces light-geometry-material ambiguity and achieves state-of-the-art illumination decomposition performance. Experiments also show that GS-ID effectively supports various downstream applications such as relighting and scene composition.
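
To make the spherical Gaussian mixture (SGM) light model concrete, below is a minimal PyTorch sketch of evaluating a mixture of spherical Gaussian lobes along a batch of query directions. The function name and parameterization are illustrative assumptions (isotropic lobes for brevity), not the paper's exact anisotropic, spatially-varying formulation.

    import torch

    def eval_sg_mixture(dirs, lobe_axes, sharpness, amplitudes):
        """Evaluate a mixture of isotropic spherical Gaussians (illustrative only).

        dirs:       (N, 3) unit query directions.
        lobe_axes:  (K, 3) unit lobe axes of the K lobes.
        sharpness:  (K,)   lobe sharpness (larger = narrower lobe).
        amplitudes: (K, 3) RGB amplitude of each lobe.
        Returns (N, 3) radiance accumulated over all lobes.
        """
        # Cosine between every query direction and every lobe axis: (N, K)
        cos_theta = dirs @ lobe_axes.t()
        # SG kernel a * exp(lambda * (cos(theta) - 1)), summed over lobes
        weights = torch.exp(sharpness[None, :] * (cos_theta - 1.0))   # (N, K)
        return weights @ amplitudes                                    # (N, 3)

    # Toy usage: 4 lobes queried along 1024 random directions
    dirs = torch.nn.functional.normalize(torch.randn(1024, 3), dim=-1)
    axes = torch.nn.functional.normalize(torch.randn(4, 3), dim=-1)
    radiance = eval_sg_mixture(dirs, axes, torch.full((4,), 10.0), torch.rand(4, 3))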


Methodology


We first reconstruct a coarse 3DGS scene with normal priors from a diffusion model, and then leverage material priors to estimate the illumination and intrinsics via joint optimization. Intrinsics are stored as G-Buffer maps and used for deferred shading to accelerate training. We represent the illumination using an adaptive light model consisting of a set of spherical Gaussian mixtures (SGMs) and a learnable environment map.
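
A minimal sketch of the deferred-shading step, assuming a G-Buffer with per-pixel albedo and normals, a few directional light proxies standing in for the SGM lobes, and a constant ambient term standing in for the environment map; the Lambertian shading model and all names here are illustrative assumptions, not the paper's implementation.

    import torch

    def deferred_lambertian_shade(albedo, normals, light_dirs, light_rgb, ambient_rgb):
        """Shade a G-Buffer with a few directional lights plus an ambient term.

        albedo:      (H, W, 3) per-pixel base color from the G-Buffer.
        normals:     (H, W, 3) per-pixel unit normals from the G-Buffer.
        light_dirs:  (K, 3)    unit directions toward each light (SGM lobe proxies).
        light_rgb:   (K, 3)    RGB intensity of each light.
        ambient_rgb: (3,)      constant ambient radiance (environment-map proxy).
        """
        # Lambertian term clamp(n . l, 0) per pixel and per light -> (H, W, K)
        n_dot_l = torch.einsum('hwc,kc->hwk', normals, light_dirs).clamp(min=0.0)
        # Accumulate direct lighting over the K lights -> (H, W, 3)
        direct = torch.einsum('hwk,kc->hwc', n_dot_l, light_rgb)
        return albedo * (direct + ambient_rgb)

Because shading is deferred to screen space, each training iteration only shades the visible pixels stored in the G-Buffer rather than every splat, which is the source of the speedup mentioned above.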

BibTeX


@misc{du2024gsidilluminationdecompositiongaussian,
      title={GS-ID: Illumination Decomposition on Gaussian Splatting via Diffusion Prior and Parametric Light Source Optimization},
      author={Kang Du and Zhihao Liang and Zeyu Wang},
      year={2024},
      eprint={2408.08524},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2408.08524},
}