SGS-SLAM: Semantic Gaussian Splatting For Neural Dense SLAM

136 citations (#72 of 2387 papers in ECCV 2024)

Abstract

We present SGS-SLAM, the first semantic visual SLAM system based on Gaussian Splatting. It incorporates appearance, geometry, and semantic features through multi-channel optimization, addressing the oversmoothing limitations of neural implicit SLAM systems in high-quality rendering, scene understanding, and object-level geometry. We introduce a unique semantic feature loss that effectively compensates for the shortcomings of traditional depth and color losses in object optimization. Through a semantic-guided keyframe selection strategy, we prevent erroneous reconstructions caused by cumulative errors. Extensive experiments demonstrate that SGS-SLAM delivers state-of-the-art performance in camera pose estimation, map reconstruction, precise semantic segmentation, and object-level geometric accuracy, while ensuring real-time rendering capabilities.
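The multi-channel optimization described above couples photometric, geometric, and semantic supervision in one objective. Below is a minimal PyTorch sketch of such a combined loss, assuming a weighted sum of an L1 color term, an L1 depth term, and a cross-entropy semantic term; the function name, weights, and tensor layout are illustrative assumptions, not taken from the paper.

```python
import torch.nn.functional as F


def multi_channel_loss(render, target, lambda_c=0.5, lambda_d=1.0, lambda_s=1.0):
    """Weighted sum of color, depth, and semantic terms for one rendered frame.

    `render` and `target` are dicts of tensors: "color" (H, W, 3),
    "depth" (H, W), and "semantic" -- per-pixel class logits (H, W, C)
    for the render, integer class labels (H, W) for the target.
    The lambda_* weights are placeholders, not the paper's values.
    """
    # Photometric term: L1 distance between rendered and observed RGB.
    loss_color = F.l1_loss(render["color"], target["color"])

    # Geometric term: L1 distance on depth, restricted to valid sensor readings.
    valid = target["depth"] > 0
    loss_depth = F.l1_loss(render["depth"][valid], target["depth"][valid])

    # Semantic term: cross-entropy between rendered class logits and labels,
    # which keeps supervising object boundaries where color and depth are flat.
    loss_semantic = F.cross_entropy(
        render["semantic"].reshape(-1, render["semantic"].shape[-1]),
        target["semantic"].reshape(-1).long(),
    )

    return lambda_c * loss_color + lambda_d * loss_depth + lambda_s * loss_semantic
```

The semantic term plays the role the abstract attributes to the semantic feature loss: it continues to provide gradients at object boundaries and in texture-poor regions where color and depth residuals alone give weak supervision.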

Citation History

Jan 25, 2026: 129
Jan 30, 2026: 131 (+2)
Feb 13, 2026: 136 (+5)