SEE-CSOM: Sharp-Edged and Efficient Continuous Semantic Occupancy Mapping for Mobile Robots

Yinan Deng, Meiling Wang, Yi Yang, Danwei Wang, Yufeng Yue*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

Generating an accurate and continuous semantic occupancy map is a key component of autonomous robotics. Most existing continuous semantic occupancy mapping methods neglect the potential differences between voxels, which results in an overinflated reconstructed map. Moreover, these methods have high computational complexity because of their fixed and large query range. To address the challenges of overinflation and inefficiency, this article proposes a novel sharp-edged and efficient continuous semantic occupancy mapping algorithm (SEE-CSOM). The main contribution of this work is the design of the Redundant Voxel Filter Model (RVFM) and the Adaptive Kernel Length Model (AKLM) to improve the performance of the map. RVFM applies context entropy to filter out redundant voxels with a low degree of confidence, so that object representations have accurate boundaries with sharp edges. AKLM adaptively adjusts the kernel length using class entropy, which reduces the amount of data used for training. A multientropy kernel inference function is then formulated to integrate the two models and generate the continuous semantic occupancy map. The algorithm has been verified on indoor and outdoor public datasets and implemented on a real robot platform, validating significant improvements in accuracy and efficiency.
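The abstract describes two entropy-based mechanisms: filtering low-confidence voxels with context entropy (RVFM) and adapting the kernel length with class entropy (AKLM). The sketch below is a minimal, illustrative interpretation of those two ideas only; it is not the authors' implementation, and every function name, threshold, and formula used here (class_entropy, adaptive_kernel_length, is_redundant, l_min, l_max, h_thresh) is an assumption made for illustration.

```python
import numpy as np

def class_entropy(p):
    """Shannon entropy of a semantic class distribution p (assumed normalized)."""
    p = np.clip(np.asarray(p, dtype=float), 1e-12, 1.0)
    return float(-(p * np.log(p)).sum())

def adaptive_kernel_length(p, l_min=0.1, l_max=0.3):
    """AKLM-style idea (assumed form): shrink the kernel length for confident
    voxels (low class entropy), reducing the data queried for inference."""
    h_norm = class_entropy(p) / np.log(len(p))      # normalized entropy in [0, 1]
    return l_min + (l_max - l_min) * h_norm

def is_redundant(neighbor_probs, h_thresh=0.8):
    """RVFM-style idea (assumed form): if the entropy of the neighborhood-averaged
    class distribution (a stand-in for 'context entropy') is high, treat the voxel
    as low-confidence and redundant, so it is filtered out to keep edges sharp."""
    mean_p = np.mean(np.asarray(neighbor_probs, dtype=float), axis=0)
    h_norm = class_entropy(mean_p) / np.log(mean_p.size)
    return h_norm > h_thresh

# Example: a confident voxel gets a kernel length near l_min, while a voxel
# whose neighbors disagree about its class is flagged as redundant.
confident = [0.98, 0.01, 0.01]
disagreeing_neighbors = [[0.34, 0.33, 0.33], [0.30, 0.40, 0.30], [0.33, 0.33, 0.34]]
print(adaptive_kernel_length(confident))    # ~0.12, well below l_max
print(is_redundant(disagreeing_neighbors))  # True
```

Under this reading, a confident voxel queries only a small neighborhood for kernel inference, while a voxel whose surroundings disagree about its class is dropped rather than inflated into the map, which is consistent with the accuracy and efficiency gains the abstract claims.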

Original language: English
Pages (from-to): 1718-1728
Number of pages: 11
Journal: IEEE Transactions on Industrial Electronics
Volume: 71
Issue number: 2
Publication status: Published - 1 Feb 2024

Keywords

  • Bayesian rule
  • Mobile robots
  • kernel inference
  • semantic mapping
