
SimpleNeRF: Regularizing Sparse Input Neural Radiance Fields with Simpler Solutions

Somraj, N and Karanayil, A and Soundararajan, R (2023) SimpleNeRF: Regularizing Sparse Input Neural Radiance Fields with Simpler Solutions. In: Proceedings - SIGGRAPH Asia 2023 Conference Papers, SA 2023.

Pro_STG_Asl_2023_Con_Pap_2023.pdf - Published Version

Official URL: https://doi.org/10.1145/3610548.3618188


Neural Radiance Fields (NeRF) show impressive performance for the photo-realistic free-view rendering of scenes. However, NeRFs require dense sampling of images in the given scene, and their performance degrades significantly when only a sparse set of views is available. Researchers have found that supervising the depth estimated by the NeRF helps train it effectively with fewer views. The depth supervision is obtained either using classical approaches or neural networks pre-trained on a large dataset. While the former may provide only sparse supervision, the latter may suffer from generalization issues. In contrast to these earlier approaches, we seek to learn the depth supervision by designing augmented models and training them along with the NeRF. We design augmented models that encourage simpler solutions by exploring the role of positional encoding and view-dependent radiance in training the few-shot NeRF. The depth estimated by these simpler models is used to supervise the NeRF depth estimates. Since the augmented models can be inaccurate in certain regions, we design a mechanism to choose only reliable depth estimates for supervision. Finally, we add a consistency loss between the coarse and fine multi-layer perceptrons of the NeRF to ensure better utilization of hierarchical sampling. We achieve state-of-the-art view-synthesis performance on two popular datasets by employing the above regularizations. The source code for our model can be found on our project page: https://nagabhushansn95.github.io/publications/2023/SimpleNeRF.html © 2023 ACM.
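The reliability-masked depth supervision described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation (see the project page for the released source code); the function name, the use of reprojection error as the reliability signal, and the threshold value are all assumptions made for illustration: the augmented (simpler) model's depth supervises the main NeRF only at pixels where the augmented estimate appears trustworthy.

```python
import numpy as np

def masked_depth_loss(depth_main, depth_aug,
                      reproj_err_main, reproj_err_aug,
                      threshold=0.1):
    """Illustrative sketch (not the paper's exact loss): supervise the
    main NeRF's depth with the augmented model's depth, masked to pixels
    where the augmented depth is deemed reliable."""
    # Hypothetical reliability test: the augmented model's reprojection
    # error is small in absolute terms and no worse than the main model's.
    reliable = (reproj_err_aug < threshold) & (reproj_err_aug <= reproj_err_main)
    if not reliable.any():
        return 0.0
    # In a real training loop the augmented depth would be treated as a
    # fixed target (stop-gradient); here we just compute the mean squared
    # error over the reliable pixels.
    diff = depth_main[reliable] - depth_aug[reliable]
    return float(np.mean(diff ** 2))
```

Masking the loss this way lets the simpler model act as a teacher only where it is plausibly correct, which is the key idea behind choosing "only reliable depth estimates for supervision".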

Item Type: Conference Paper
Publication: Proceedings - SIGGRAPH Asia 2023 Conference Papers, SA 2023
Publisher: Association for Computing Machinery, Inc
Additional Information: The copyright for this article belongs to publisher, Association for Computing Machinery, Inc.
Keywords: Rendering (computer graphics); Classical approach; Dense sampling; Neural rendering; Novel view synthesis; Performance; Photo-realistic; SimpleNeRF; Sparse input neural radiance field; Sparse set; View rendering; Large dataset
Department/Centre: Others
Date Deposited: 01 Mar 2024 09:54
Last Modified: 01 Mar 2024 09:54
URI: https://eprints.iisc.ac.in/id/eprint/84015
