Multimodal Emotion Recognition and Intention Understanding in Human-Robot Interaction

Luefeng Chen*, Zhentao Liu, Min Wu, Kaoru Hirota, Witold Pedrycz

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

5 Citations (Scopus)

Abstract

Emotion recognition and intention understanding are important components of human-robot interaction. In multimodal emotion recognition and intention understanding, feature extraction and the selection of recognition methods bear on the computation involved in affective computing and on the diversity of human-robot interaction. By studying multimodal emotion recognition and intention understanding, we therefore aim to create an emotional and human-friendly human-robot interaction environment. This chapter introduces the characteristics of multimodal emotion recognition and intention understanding, presents emotion feature extraction and recognition methods for different modalities, proposes an intention understanding method, and finally applies them in practice to achieve human-robot interaction.
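To illustrate the kind of multimodal combination the abstract describes, the sketch below shows decision-level fusion, one common scheme in multimodal emotion recognition: each modality's classifier outputs a probability distribution over emotion classes, and the fused prediction is a confidence-weighted average. This is a minimal illustration under assumed emotion labels and weights, not the chapter's actual method.

```python
# Minimal sketch of decision-level multimodal emotion fusion
# (illustrative only; labels, weights, and scores are assumed).

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def fuse_decisions(modality_probs, weights):
    """Weighted average of per-modality emotion distributions.

    modality_probs: list of probability vectors, one per modality.
    weights: relative confidence assigned to each modality.
    """
    if len(modality_probs) != len(weights):
        raise ValueError("one weight per modality required")
    total = sum(weights)
    fused = [0.0] * len(EMOTIONS)
    for probs, w in zip(modality_probs, weights):
        for i, p in enumerate(probs):
            fused[i] += (w / total) * p
    return fused

def predict_emotion(modality_probs, weights):
    """Return the emotion label with the highest fused probability."""
    fused = fuse_decisions(modality_probs, weights)
    return EMOTIONS[max(range(len(fused)), key=fused.__getitem__)]

# Example: facial-expression and speech classifiers disagree;
# the speech modality is weighted higher, so it dominates.
face = [0.50, 0.10, 0.10, 0.30]    # leans "happy"
speech = [0.20, 0.10, 0.60, 0.10]  # leans "angry"
print(predict_emotion([face, speech], weights=[0.4, 0.6]))  # → angry
```

Feature-level fusion (concatenating per-modality feature vectors before a single classifier) is the usual alternative; decision-level fusion is shown here because it keeps each modality's recognizer independent.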

Original language: English
Title of host publication: Studies in Systems, Decision and Control
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 255-288
Number of pages: 34
DOIs
Publication status: Published - 2021
Externally published: Yes

Publication series

Name: Studies in Systems, Decision and Control
Volume: 329
ISSN (Print): 2198-4182
ISSN (Electronic): 2198-4190

Keywords

  • Human-robot interaction
  • Intention understanding
  • Multimodal emotion recognition
