Abstract
Wearable auxiliary devices for visually impaired people are a highly attractive research topic. Although many proposed wearable navigation devices can assist visually impaired people with obstacle avoidance and navigation, these devices cannot feed back detailed information about obstacles or help visually impaired users understand their environment. In this paper, we propose a wearable navigation device for the visually impaired that integrates semantic visual SLAM (Simultaneous Localization and Mapping) with a newly launched, powerful mobile computing platform. The system uses an RGB-D (color and depth) camera based on structured light as the sensor and the mobile computing platform as the control center. We also focus on combining SLAM technology with the extraction of semantic information from the environment, which allows the computing platform to understand the surroundings in real time and relay them to the visually impaired user in the form of voice broadcasts. Finally, we tested the performance of the proposed semantic visual SLAM system on this device. The results indicate that the system can run in real time on a wearable navigation device with sufficient accuracy.
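As a rough illustration of the pipeline the abstract describes (pose tracking fused with per-frame semantic labels, feeding voice announcements for nearby obstacles), the following is a minimal Python sketch. The class name, label table, and `update` interface are invented for illustration; they stand in for the real SLAM front end and segmentation network, which the paper does not specify at this level of detail:

```python
class SemanticSLAM:
    """Toy sketch of a semantic visual SLAM loop: update a simplified
    2-D pose, attach semantic labels to a map, and emit voice-style
    messages for nearby obstacles. All components are stand-ins."""

    # Hypothetical label table; a real system would use the output
    # classes of its semantic segmentation network.
    LABELS = {0: "floor", 1: "door", 2: "chair"}

    def __init__(self):
        self.pose = (0.0, 0.0)   # simplified planar pose (x, y)
        self.semantic_map = {}   # object name -> poses where it was seen

    def update(self, detections, odometry):
        """detections: list of (label_id, distance_m) pairs, standing in
        for segmentation output fused with depth; odometry: (dx, dy)
        pose increment, standing in for visual tracking."""
        x, y = self.pose
        self.pose = (x + odometry[0], y + odometry[1])
        messages = []
        for label_id, distance_m in detections:
            name = self.LABELS.get(label_id, "object")
            # Anchor the observation in the semantic map at the current pose.
            self.semantic_map.setdefault(name, []).append(self.pose)
            if distance_m < 1.5:  # announce only close obstacles
                messages.append(f"{name} ahead, {distance_m:.1f} meters")
        return messages
```

For example, `SemanticSLAM().update([(2, 1.2)], (0.5, 0.0))` returns `["chair ahead, 1.2 meters"]`, while a distant detection is added to the semantic map but not announced.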
| Original language | English |
|---|---|
| Article number | 1536 |
| Pages (from-to) | 1-14 |
| Number of pages | 14 |
| Journal | Sensors |
| Volume | 21 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - 2 Feb 2021 |
Keywords
- Assistance for visually impaired people
- Localization
- SLAM
- Semantic map
- Semantic segmentation
- Wearable device
Title: A wearable navigation device for visually impaired people based on the real-time semantic visual SLAM system