USplat4D: Uncertainty Matters in Dynamic
Gaussian Splatting for Monocular 4D Reconstruction

¹Texas A&M University   ²Mercedes-Benz Research & Development North America

We show a challenging case from the DyCheck dataset, where a person casually rotates a backpack while being captured by a moving monocular camera. The goal is to reconstruct the dynamic object at arbitrary viewpoints and timestamps. State-of-the-art dynamic Gaussian Splatting methods, e.g., Shape of Motion (SoM), struggle at extreme novel views far from the input trajectory, such as opposite-side views (view 1) or views with large angular offsets (view 2). We propose USplat4D, an Uncertainty-aware dynamic Gaussian Splatting model that produces more accurate and consistent 4D reconstructions.

Abstract

Reconstructing dynamic 3D scenes from monocular input is fundamentally under-constrained, with ambiguities arising from occlusion and extreme novel views. While dynamic Gaussian Splatting offers an efficient representation, vanilla models optimize all Gaussian primitives uniformly, ignoring whether they are well or poorly observed. This limitation leads to motion drifts under occlusion and degraded synthesis when extrapolating to unseen views. We argue that uncertainty matters: Gaussians with recurring observations across views and time act as reliable anchors to guide motion, whereas those with limited visibility are treated as less reliable. To this end, we introduce USplat4D, a novel Uncertainty-aware dynamic Gaussian Splatting framework that propagates reliable motion cues to enhance 4D reconstruction. Our key insight is to estimate time-varying per-Gaussian uncertainty and leverage it to construct a spatio-temporal graph for uncertainty-aware optimization. Experiments on diverse real and synthetic datasets show that explicitly modeling uncertainty consistently improves dynamic Gaussian Splatting models, yielding more stable geometry under occlusion and high-quality synthesis at extreme viewpoints.
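To make the key insight concrete, below is a minimal PyTorch sketch of how per-Gaussian uncertainty can steer optimization. It is an illustrative assumption, not the exact USplat4D formulation: uncertainty is approximated from per-Gaussian observation counts, a k-nearest-neighbor graph is built over canonical Gaussian centers, and a rigidity-style motion term is weighted by neighbor confidence so that well-observed Gaussians act as motion anchors for poorly observed ones. All function names and the specific weighting scheme are hypothetical.

```python
import torch

def visibility_uncertainty(vis_counts, eps=1e-6):
    """Map per-Gaussian visibility counts to an uncertainty in [0, 1].
    Gaussians observed across many frames/views get low uncertainty.
    (Illustrative proxy; the paper's uncertainty is time-varying.)"""
    vis = vis_counts.float()
    conf = vis / (vis.max() + eps)   # normalize to [0, 1]
    return 1.0 - conf                # high visibility -> low uncertainty

def knn_graph(positions, k=8):
    """Build a simple k-nearest-neighbor graph over canonical Gaussian centers."""
    d = torch.cdist(positions, positions)         # (N, N) pairwise distances
    d.fill_diagonal_(float("inf"))                 # exclude self-edges
    return d.topk(k, largest=False).indices        # (N, k) neighbor indices

def uncertainty_weighted_rigidity_loss(traj, knn_idx, uncertainty):
    """Encourage each Gaussian's motion to follow its more reliable neighbors.

    traj:        (T, N, 3) Gaussian centers over T timesteps
    knn_idx:     (N, k) neighbor indices from the canonical-frame graph
    uncertainty: (N,) per-Gaussian uncertainty in [0, 1]
    """
    disp = traj[1:] - traj[:-1]                    # (T-1, N, 3) per-step motion
    neighbor_disp = disp[:, knn_idx]               # (T-1, N, k, 3) neighbors' motion
    neighbor_conf = (1.0 - uncertainty)[knn_idx]   # (N, k) edge weights from confidence
    diff = disp.unsqueeze(2) - neighbor_disp       # deviation from each neighbor's motion
    per_edge = (diff ** 2).sum(-1)                 # (T-1, N, k) squared deviation
    return (neighbor_conf * per_edge).mean()       # reliable neighbors dominate the loss

if __name__ == "__main__":
    T, N = 4, 64
    traj = torch.randn(T, N, 3)                    # toy trajectories
    vis_counts = torch.randint(0, 20, (N,))        # toy per-Gaussian observation counts
    u = visibility_uncertainty(vis_counts)
    knn_idx = knn_graph(traj[0], k=6)
    print(float(uncertainty_weighted_rigidity_loss(traj, knn_idx, u)))
```

In a full pipeline, a term of this kind would be combined with the usual photometric and tracking losses, so that motion cues propagate from low-uncertainty anchors to Gaussians with limited visibility.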


More Results

We compare against SoM and MoSca on the DyCheck dataset and show the results below.

Backpack-View 1
Backpack-View 2
Haru-sit-View 1

We compare against SoM on the DAVIS dataset and show the results below.

We compare against MoSca on the DAVIS dataset and show the results below.

We compare against SoM and MoSca on the Objaverse dataset and show the results below.

Rotational Views

Front Views

Tracking Results


Validation View

Acknowledgements

We thank the authors of Shape of Motion and MoSca for their great work and for sharing their code and results. We also thank the authors of Gaussian Splatting and gsplat for their contributions to the Gaussian Splatting implementation.

BibTeX

@article{usplat4d2025,
  title   = {USplat4D: Uncertainty Matters in Dynamic Gaussian Splatting for Monocular 4D Reconstruction},
  author  = {Guo, Fengzhi and Hsu, Chih-Chuan and Ding, Sihao and Zhang, Cheng},
  journal = {arXiv preprint arXiv:},
  year    = {2025}
}