Multi-modal human-computer interaction system in cockpit

Jie Ren, Yanyan Cui, Jing Chen*, Yuanyuan Qiao, Luhui Wang

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

5 Citations (Scopus)

Abstract

To explore new human-machine interaction methods, a multi-modal human-machine interaction coordinated control system is proposed. The system realizes basic flight control by coordinating information from multiple modes, including changes in the pilot's field of view, touch control, and voice control. Building on the existing human-computer interaction interface for flight operation, it introduces new forms of human-computer interaction into the cockpit and investigates multi-mode collaborative control, mainly covering eye-movement interaction, touch interaction, gesture interaction, and voice interaction. Finally, the project produced a complete multi-mode cooperative control software and hardware system.
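The abstract does not detail how the four input channels are coordinated. As one illustrative sketch only (all class and field names are hypothetical, not from the paper), a coordinator might collect recognized events from each modality and resolve conflicts once per control cycle, e.g. by letting the highest-confidence recognition win:

```python
from dataclasses import dataclass
from typing import List, Optional


# Hypothetical event type; the modalities mirror those listed in the
# abstract (eye movement, touch, gesture, voice).
@dataclass
class ModalEvent:
    modality: str      # e.g. "eye", "touch", "gesture", "voice"
    command: str       # e.g. "select_waypoint"
    confidence: float  # recognizer confidence in [0, 1]


class MultiModalController:
    """Collects events from all channels and resolves one command per cycle."""

    def __init__(self) -> None:
        self.pending: List[ModalEvent] = []

    def submit(self, event: ModalEvent) -> None:
        # Each recognizer (eye tracker, touch screen, gesture camera,
        # speech engine) pushes its candidate commands here.
        self.pending.append(event)

    def resolve(self) -> Optional[ModalEvent]:
        # Simple conflict policy for the sketch: the highest-confidence
        # event wins the current control cycle; the queue is then cleared.
        if not self.pending:
            return None
        best = max(self.pending, key=lambda e: e.confidence)
        self.pending.clear()
        return best
```

A real cockpit system would need richer fusion (temporal alignment, cross-modal confirmation such as "look at a target and say 'select'"), but the pattern of per-cycle arbitration over independently recognized events is a common starting point.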

Original language: English
Article number: 012212
Journal: Journal of Physics: Conference Series
Volume: 1693
Issue number: 1
Publication status: Published - 16 Dec 2020
Event: 2020 3rd International Conference on Computer Information Science and Artificial Intelligence, CISAI 2020 - Hulun Buir, Inner Mongolia, China
Duration: 25 Sept 2020 - 27 Sept 2020
