Using visual feedback to improve hand movement accuracy in confined-occluded spaces in virtual reality

Yu Wang, Ziran Hu, Shouwen Yao*, Hui Liu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Accurate and informative hand-object collision feedback is vital for hand manipulation in virtual reality (VR). However, to the best of our knowledge, hand movement performance in fully occluded and confined VR spaces under visual collision feedback remains underexplored. In this paper, we first studied the effects of several popular visual feedback methods for hand-object collision on hand movement performance. To test these effects, we conducted a within-subject user study (n=18) using a target-reaching task in a confined box. Results indicated that users achieved the best task performance with see-through visualization, and the most accurate movement with a hybrid of proximity-based gradation and deformation. Through further analysis, we concluded that integrating see-through visualization with a proximity-based visual cue could offer the best compromise between speed and accuracy for hand movement in enclosed VR spaces. On this basis, we designed a visual collision feedback method based on projector decals, which incorporates the advantages of see-through visualization and color gradation. Finally, we present demonstrations of potential uses of the proposed visual cue.
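The proximity-based gradation cue described above can be illustrated with a minimal sketch: the hand's tint shifts continuously from a "safe" color to a "contact" color as the distance to the nearest occluding surface shrinks. The function name, the linear ramp, and the 0.10 m warning radius below are illustrative assumptions, not details taken from the paper.

```python
def proximity_color(distance_m, warn_radius_m=0.10):
    """Map hand-to-surface distance (metres) to an RGB tint.

    Hypothetical sketch of a proximity-based gradation cue: green when the
    hand is at or beyond the warning radius, red at contact, linearly
    blended in between. The 0.10 m radius is an assumed example value.
    """
    # Normalise distance into [0, 1]: 1 = safe, 0 = touching the surface.
    t = max(0.0, min(1.0, distance_m / warn_radius_m))
    # Linear blend between red (1, 0, 0) at contact and green (0, 1, 0) when safe.
    return (1.0 - t, t, 0.0)

# Example: halfway inside the warning radius gives an amber-like blend.
print(proximity_color(0.05))  # (0.5, 0.5, 0.0)
```

In a real VR application the same mapping would typically run per frame against the signed distance from the tracked hand to the confining geometry, with the resulting color applied to the hand avatar's material.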

Original language: English
Pages (from-to): 1485-1501
Number of pages: 17
Journal: Visual Computer
Volume: 39
Issue number: 4
DOIs
Publication status: Published - Apr 2023

Keywords

  • 3D occlusion management
  • Hand movement performance
  • Occluded interaction
  • See-through visualization
  • Visual collision feedback
