Simultaneous Localization and Mapping (SLAM) with 3D Gaussian Splatting (3DGS) enables fast, differentiable rendering and high-fidelity reconstruction across diverse real-world scenes. However, existing 3DGS-SLAM approaches handle measurement reliability only implicitly, leaving pose estimation and global alignment susceptible to drift in low-texture regions, on transparent surfaces, and in areas with complex reflectance.
To this end, we introduce VarSplat, an uncertainty-aware 3DGS-SLAM system that explicitly learns per-splat appearance variance. By applying the law of total variance to alpha compositing, we compute a corresponding differentiable per-pixel uncertainty map in a single, efficient rasterization pass. This variance map guides tracking, submap registration, and loop detection to focus on reliable regions, yielding more stable optimization.
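As a rough illustration of how per-splat variance can be propagated through alpha compositing via the law of total variance, the sketch below composites depth-ordered splats along a ray. The function name, the treatment of residual transmittance as a zero-color, zero-variance background, and the scalar (single-channel) setting are all assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def composite_with_variance(alphas, colors, variances):
    """Alpha-composite per-splat scalar colors and propagate per-splat
    appearance variance to a per-pixel variance (illustrative sketch).

    Splats are assumed sorted front-to-back; residual transmittance is
    treated as a background component with zero color and variance.
    """
    # Compositing weights: w_i = alpha_i * prod_{j < i} (1 - alpha_j)
    transmittance = np.concatenate(([1.0], np.cumprod(1.0 - alphas)[:-1]))
    w = alphas * transmittance
    # Expected pixel color: E[C] = sum_i w_i * c_i
    mean = np.sum(w * colors)
    # Law of total variance:
    # Var[C] = sum_i w_i * (v_i + c_i^2) - E[C]^2
    second_moment = np.sum(w * (variances + colors**2))
    var = second_moment - mean**2
    return mean, var
```

Because both terms are sums over the same compositing weights already produced by the rasterizer, the variance map comes essentially for free in the same forward pass, and every operation is differentiable with respect to the per-splat parameters.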
Experimental results on Replica (synthetic) and TUM-RGBD, ScanNet, and ScanNet++ (real-world) show that VarSplat improves robustness and achieves competitive or superior tracking, mapping, and novel view synthesis quality compared to existing dense RGB-D SLAM methods.

VarSplat builds upon 3D Gaussian Splatting SLAM systems and introduces three main contributions: (i) explicit learning of per-splat appearance variance, (ii) a differentiable per-pixel uncertainty map rendered in a single rasterization pass via the law of total variance, and (iii) uncertainty-guided tracking, submap registration, and loop detection.
If you find VarSplat useful, please cite:

@inproceedings{tran2026varsplat,
  title     = {VarSplat: Uncertainty-aware 3D Gaussian Splatting for Robust RGB-D SLAM},
  author    = {Tran, Anh Thuan and Kosecka, Jana},
  booktitle = {CVPR},
  year      = {2026}
}