Leveraging Intrinsic Components for Few-Shot Neural Radiance Fields in Unconstrained Illumination

Authors
Lee, Seokyeong; Choi, Junyong; Kim, Seungryong; Kim, Ig-Jae; Cho, Junghyun
Issue Date
2025-09
Publisher
Institute of Electrical and Electronics Engineers Inc.
Citation
IEEE Access, v.13, pp.169150 - 169165
Abstract
Neural Radiance Fields (NeRF) have demonstrated strong performance in novel view synthesis under idealized conditions such as dense multi-view observations, consistent illumination, and known camera poses. However, these assumptions often do not hold in real-world scenarios, where inputs are sparse and lighting varies significantly. This paper presents a novel regularization framework for few-shot NeRF reconstruction under unconstrained illumination. By leveraging intrinsic, illumination-invariant representations (e.g., albedo), our method enforces cross-view appearance consistency, leading to more stable synthesis. To further improve applicability, we propose a lightweight variant that achieves comparable improvements with significantly reduced computational cost. We also establish new benchmarks that reflect diverse illumination and viewpoint conditions. Extensive experiments show that our method improves robustness and rendering quality across challenging real-world scenes, without relying on dense inputs or manual supervision.
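The abstract describes a cross-view appearance-consistency regularizer built on illumination-invariant intrinsics such as albedo. The sketch below is a minimal, hypothetical illustration of how such a term might be written in PyTorch; the function name, tensor layout, and pairing of corresponding rays are assumptions for illustration only, not the paper's actual implementation.

```python
# Minimal, illustrative sketch of a cross-view albedo-consistency loss,
# assuming a NeRF-style model that predicts a per-point albedo alongside
# its view/illumination-dependent color. All names and shapes here are
# hypothetical; the paper's actual formulation may differ.
import torch


def albedo_consistency_loss(albedo_view_a: torch.Tensor,
                            albedo_view_b: torch.Tensor,
                            weights_a: torch.Tensor,
                            weights_b: torch.Tensor) -> torch.Tensor:
    """Penalize disagreement between albedo predictions for the same
    surface points observed from two views under different lighting.

    albedo_view_*: (N, S, 3) per-sample albedo along N rays with S samples.
    weights_*:     (N, S)    volume-rendering weights for those samples.
    """
    # Composite per-ray albedo with the rendering weights. Because albedo
    # is illumination-invariant, the composited values should agree across
    # views that observe the same surface, even under different lighting.
    a = (weights_a[..., None] * albedo_view_a).sum(dim=1)  # (N, 3)
    b = (weights_b[..., None] * albedo_view_b).sum(dim=1)  # (N, 3)
    return torch.nn.functional.l1_loss(a, b)


if __name__ == "__main__":
    # Random tensors stand in for network outputs on paired rays.
    n_rays, n_samples = 1024, 64
    alb_a = torch.rand(n_rays, n_samples, 3)
    alb_b = torch.rand(n_rays, n_samples, 3)
    w_a = torch.softmax(torch.rand(n_rays, n_samples), dim=-1)
    w_b = torch.softmax(torch.rand(n_rays, n_samples), dim=-1)
    print(albedo_consistency_loss(alb_a, alb_b, w_a, w_b))
```

In a few-shot setting, such a term would typically be added to the standard photometric rendering loss with a weighting coefficient, so that sparse, differently lit views still constrain a shared, lighting-independent appearance.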
Keywords
View synthesis; Reconstruction; Color; Rendering (computer graphics); Image color analysis; Geometry; Benchmark testing; Robustness; Optimization; Cameras; Illumination decomposition; Neural radiance fields; Lighting
URI
https://pubs.kist.re.kr/handle/201004/153372
DOI
10.1109/ACCESS.2025.3610908
Appears in Collections:
KIST Article > 2025