CMMA: Benchmarking Multi-Affection Detection in Chinese Multi-Modal Conversations

Yazhou Zhang, Yang Yu, Qing Guo, Benyou Wang, Dongming Zhao, Sagar Uprety, Dawei Song, Qiuchi Li*, Jing Qin

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

Abstract

Human communication has a multi-modal and multi-affect nature. The interrelatedness of different emotions and sentiments poses a challenge to jointly detecting multiple human affects from multi-modal cues. Recent advances in this field have employed multi-task learning paradigms to model the inter-relatedness across tasks, but the scarcity of publicly available resources limits the potential of such work. To fill this gap, we build the first Chinese Multi-modal Multi-Affect conversation (CMMA) dataset, which contains 3,000 multi-party conversations and 21,795 multi-modal utterances collected from TV series of various styles. CMMA contains a wide variety of affect labels, including sentiment, emotion, sarcasm, and humor, as well as novel inter-task correlation values between certain pairs of tasks. Moreover, it provides topic and speaker information for each conversation, which promotes better modeling of conversational context. On this dataset, we empirically analyze the influence of different data modalities and conversational contexts on different affect analysis tasks, and demonstrate the practical benefit of inter-task correlations. The full dataset will be publicly available for research.

Original language: English
Journal: Advances in Neural Information Processing Systems
Volume: 36
Publication status: Published - 2023
Event: 37th Conference on Neural Information Processing Systems, NeurIPS 2023 - New Orleans, United States
Duration: 10 Dec 2023 - 16 Dec 2023

