TY - GEN
T1 - Functional-Penetrated Interactive System Towards Virtual-Real Fusion Environments
AU - Weng, Dongdong
AU - He, Wenjie
AU - Guo, Shushan
AU - Li, Dong
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
AB - Current mixed reality technology cannot fully map information from the physical world into the virtual environment, so users must remove the head-mounted display to handle tasks in the real environment. This breaks the immersion created by the virtual reality system and reduces work efficiency. To let users interact directly with the physical world from within the virtual environment without destroying immersion, this paper proposes a functional-penetrated interactive system for virtual-real fusion environments. The system uses deep neural networks to extract visual information about objects from the real scene captured by an RGB camera, performs virtual-real registration, renders the result in real time in the virtual environment, and thereby enables the user to manipulate physical objects from within the virtual environment. In addition, the system allows the user to dynamically adjust the virtual-real fusion ratio according to the user's needs, balancing the immersion of the fused environment against the interaction efficiency of the system. Experimental results show that the system provides users with more realistic physical feedback, together with greater interaction freedom and universality.
KW - deep learning
KW - function penetration
KW - human-computer interaction
KW - virtual-real fusion
UR - http://www.scopus.com/inward/record.url?scp=85171781926&partnerID=8YFLogxK
U2 - 10.1109/ICSPS58776.2022.00151
DO - 10.1109/ICSPS58776.2022.00151
M3 - Conference contribution
AN - SCOPUS:85171781926
T3 - Proceedings - 2022 14th International Conference on Signal Processing Systems, ICSPS 2022
SP - 842
EP - 850
BT - Proceedings - 2022 14th International Conference on Signal Processing Systems, ICSPS 2022
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 14th International Conference on Signal Processing Systems, ICSPS 2022
Y2 - 18 November 2022 through 20 November 2022
ER -