Text Input Through Swipe Gestures Based Personalized AI Agent

Xiangyu Qi, Dongdong Weng*, Jie Hao, Zihao Li

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Text input in virtual reality (VR) is a common technique in interactive systems, but it can suffer from low input efficiency and high task load when users interact with VR hardware and applications. The development of artificial intelligence interaction technologies offers new approaches to addressing these text input challenges. In this paper, we propose a swipe gesture-based text input method built on a personalized AI agent framework, combining portable devices (e.g., smartphones) with VR input and incorporating user profile information, input habits, and conversational intent. By integrating the GPT-3.5 model to train a personalized AI agent, we emphasize the importance of understanding and responding to human behavior and capabilities from the agent's perspective, enabling text prediction based on specific contexts. The keyboard layout is a disk divided into eight equal regions: the outer circle is subdivided into key areas containing letters, and the inner circle serves as the input buffer area. By resolving word ambiguities in user input and leveraging the context-awareness and text-prediction capabilities of large language models, the system generates complete sentences from keywords. This reduces the number of manual inputs required of the user, improving text input efficiency and enhancing the overall user experience.
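The pipeline described in the abstract (ambiguous swipe regions resolved into words, then expanded into sentences by an LLM conditioned on the user's profile and conversation) can be sketched roughly as below. This is a minimal illustrative sketch, not the authors' implementation: the eight-region letter groupings, the toy vocabulary, the helper names, and the prompt wording are all assumptions, and the actual GPT-3.5 call is omitted.

```python
# Illustrative sketch: decoding a swipe over an 8-region radial keyboard
# and building a disambiguation prompt for an LLM. All groupings, names,
# and prompt text are assumptions for illustration only.
from itertools import product

# Hypothetical letter groupings for the 8 outer key regions (indices 0-7).
REGIONS = ["abc", "def", "ghi", "jkl", "mno", "pqr", "stu", "vwxyz"]

# A tiny stand-in for the system's lexicon.
VOCABULARY = {"hi", "if", "it", "hat", "ice", "input"}

def decode_swipe(region_sequence):
    """Expand a sequence of swiped region indices into candidate words.

    Each region is ambiguous (it covers several letters), so we enumerate
    the letter combinations and keep those that appear in the lexicon.
    """
    letter_sets = [REGIONS[i] for i in region_sequence]
    candidates = {"".join(combo) for combo in product(*letter_sets)}
    return sorted(candidates & VOCABULARY)

def build_disambiguation_prompt(keywords, profile, history):
    """Compose a prompt asking the LLM to expand the entered keywords into
    a full sentence, conditioned on user profile and conversational context."""
    return (
        f"User profile: {profile}\n"
        f"Recent conversation: {history}\n"
        f"Keywords entered via swipe keyboard: {', '.join(keywords)}\n"
        "Write the complete sentence the user most likely intends."
    )

if __name__ == "__main__":
    # Swiping twice over region 2 ('ghi') yields candidates such as 'hi'.
    words = decode_swipe([2, 2])
    print("Candidate words:", words)
    prompt = build_disambiguation_prompt(
        keywords=words,
        profile="VR researcher; prefers concise replies",
        history="Colleague asked whether the demo is ready.",
    )
    print(prompt)
    # Sending `prompt` to GPT-3.5 (e.g. via a chat-completion request) would
    # return the predicted full sentence; that call is intentionally omitted.
```

Under this sketch, the user's swipe only selects coarse regions; the lexicon filter and the LLM together resolve which word and which full sentence were intended, which is what reduces the number of manual inputs.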

Original language: English
Title of host publication: Proceedings - 2024 3rd International Conference on Automation, Robotics and Computer Engineering, ICARCE 2024
Editors: Jinyang Xu
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 435-439
Number of pages: 5
ISBN (Electronic): 9798331529505
DOIs
Publication status: Published - 2024
Event: 3rd International Conference on Automation, Robotics and Computer Engineering, ICARCE 2024 - Virtual, Online
Duration: 17 Dec 2024 - 18 Dec 2024

Publication series

Name: Proceedings - 2024 3rd International Conference on Automation, Robotics and Computer Engineering, ICARCE 2024

Conference

Conference: 3rd International Conference on Automation, Robotics and Computer Engineering, ICARCE 2024
City: Virtual, Online
Period: 17/12/24 - 18/12/24

Keywords

  • Human-computer interaction (HCI)
  • LLM
  • Text Input
  • Virtual Reality
