Modeling of performance creative evaluation driven by multimodal affective data

Yufeng Wu, Longfei Zhang*, Gangyi Ding, Tong Xue, Fuquan Zhang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

Performance creative evaluation can be achieved through affective data, and using affective features to evaluate creative performances is an emerging research direction. This paper proposes a "Performance Creative-Multimodal Affective (PC-MulAff)" model that evaluates performance creativity based on multimodal affective features. Multimedia acquisition equipment is used to collect physiological data from the audience, including multimodal affective data such as facial expressions, heart rate and eye movement. Affective features are computed from the multimodal data and combined with director annotations, and a "Performance Creative-Affective Acceptance (PC-Acc)" measure is defined on these features to evaluate the quality of the performance creative. The PC-MulAff model is verified on several performance data sets. The experimental results show that the model achieves high evaluation quality across different performance forms: in the creative evaluation of dance performances, its accuracy is 7.44% and 13.95% higher than that of single-text and single-video evaluation, respectively.
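The abstract does not reproduce the PC-Acc formula, so the sketch below is only a minimal illustration, under assumed feature names, normalizations, and weights, of how per-segment multimodal features (facial valence, heart-rate change, gaze) might be fused and compared against a director annotation to yield an acceptance-style score. It is not the authors' published method.

```python
# Illustrative sketch only: the exact PC-Acc definition is not given in the abstract,
# so the feature names, normalizations, and weighted fusion below are assumptions.
from dataclasses import dataclass
from typing import Sequence


@dataclass
class AffectiveSegment:
    """Per-segment affective features for one audience member (hypothetical schema)."""
    facial_valence: float    # mean expression valence in [-1, 1]
    heart_rate_delta: float  # deviation from resting heart rate, in bpm
    gaze_on_stage: float     # fraction of time gaze stays on the performance area, in [0, 1]
    director_label: float    # director annotation of intended response, in [0, 1]


def pc_acc(segments: Sequence[AffectiveSegment],
           weights: tuple = (0.4, 0.3, 0.3)) -> float:
    """Compute a toy 'affective acceptance' score in [0, 1] (assumed fusion rule)."""
    w_face, w_hr, w_gaze = weights
    scores = []
    for s in segments:
        face = (s.facial_valence + 1.0) / 2.0          # rescale [-1, 1] -> [0, 1]
        hr = min(abs(s.heart_rate_delta) / 30.0, 1.0)  # arousal proxy, clipped at 30 bpm
        fused = w_face * face + w_hr * hr + w_gaze * s.gaze_on_stage
        # Agreement with the director's intent: 1 when the fused response matches the label.
        scores.append(1.0 - abs(fused - s.director_label))
    return sum(scores) / len(scores) if scores else 0.0
```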

Original language: English
Pages (from-to): 90-100
Number of pages: 11
Journal: International Journal of Interactive Multimedia and Artificial Intelligence
Volume: 6
Issue number: 7
DOIs
Publication status: Published - 2021

Keywords

  • Affective Acceptance
  • Data-driven
  • Multimedia Acquisition
  • Multimodal Affective Feature
  • Performance Creative Evaluation
