Splat and Replace:
3D Reconstruction with Repetitive Elements

SIGGRAPH Conference Papers 2025

Nicolás Violante 1,2      Andreas Meuleman 1,2      Alban Gauthier 1,2      Frédo Durand 3      Thibault Groueix 4      George Drettakis 1,2
1Inria      2Université Côte d'Azur      3MIT      4Adobe

Abstract

We leverage repetitive elements in 3D scenes to improve novel view synthesis. Neural Radiance Fields (NeRF) and 3D Gaussian Splatting (3DGS) have greatly improved novel view synthesis, but renderings of unseen and occluded parts remain of low quality when the training views do not cover them sufficiently. Our key observation is that our environment is often full of repetitive elements. We propose to leverage these repetitions to improve the reconstruction of scene parts that suffer from poor coverage or occlusion. Our method segments each repeated instance in a 3DGS reconstruction, registers the instances to one another, and allows information to be shared among them. It improves geometry while also accounting for appearance variations across instances. We demonstrate our method on a variety of synthetic and real scenes containing typical repetitive elements, leading to a substantial improvement in novel view synthesis quality.
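To make the high-level pipeline concrete, the sketch below illustrates the three abstract steps (segment, register, share information) on simplified inputs. It is a minimal illustration under strong assumptions, not the authors' implementation: it takes instances that have already been segmented into point sets (e.g., Gaussian centers) with known point-to-point correspondences, registers them with a plain Kabsch rigid alignment, and "shares information" by averaging geometry in a canonical frame; the names rigid_register and share_geometry are hypothetical and introduced only for this sketch.

    # Hedged sketch of the segment -> register -> share pipeline (NOT the paper's method).
    # Assumes each repeated instance is a (N, 3) array of points with known correspondences.
    import numpy as np

    def rigid_register(source, target):
        # Kabsch alignment: find R, t such that target ~= R @ source + t.
        mu_s, mu_t = source.mean(axis=0), target.mean(axis=0)
        S, T = source - mu_s, target - mu_t
        H = S.T @ T                              # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = mu_t - R @ mu_s
        return R, t

    def share_geometry(instances):
        # Register every instance to the first one, average them into a canonical
        # template, then map the shared template back into each instance's pose.
        reference = instances[0]
        transforms = [rigid_register(inst, reference) for inst in instances]
        canonical = np.mean(
            [(R @ inst.T).T + t for inst, (R, t) in zip(instances, transforms)], axis=0
        )
        return [(R.T @ (canonical - t).T).T for R, t in transforms]

    # Toy usage: three noisy, differently posed copies of the same point set.
    rng = np.random.default_rng(0)
    base = rng.normal(size=(100, 3))
    instances = []
    for _ in range(3):
        Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random rotation
        if np.linalg.det(Q) < 0:
            Q[:, 0] *= -1
        instances.append((Q @ base.T).T + rng.normal(size=3) + 0.05 * rng.normal(size=(100, 3)))
    shared = share_geometry(instances)

In the sketch, pooling is a simple per-point average in the canonical frame; the paper additionally handles appearance variations across instances, which this geometric toy example does not model.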


BibTeX

@article{violante2025splatandreplace,
  title={Splat and Replace: 3D Reconstruction with Repetitive Elements},
  author={Violante, Nicolás and Meuleman, Andreas and Gauthier, Alban and Durand, Fredo and Groueix, Thibault and Drettakis, George},
  journal={SIGGRAPH Conference Papers},
  year={2025}
}

Acknowledgments and Funding

This work was funded by the European Research Council (ERC) Advanced Grant NERPHYS, number 101141721 (https://project.inria.fr/nerphys). The authors are grateful to the OPAL infrastructure of the Université Côte d'Azur for providing resources and support, as well as Adobe and NVIDIA for software and hardware donations. The authors thank the anonymous reviewers for their valuable feedback.