CEAP-360VR: A Continuous Physiological and Behavioral Emotion Annotation Dataset for 360 VR Videos

Tong Xue*, Abdallah El Ali, Tianyi Zhang, Gangyi Ding, Pablo Cesar

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

23 Citations (Scopus)

Abstract

Watching 360 videos using Virtual Reality (VR) head-mounted displays (HMDs) provides interactive and immersive experiences, where videos can evoke different emotions. Existing emotion self-report techniques within VR, however, are either retrospective or interrupt the immersive experience. To address this, we introduce the Continuous Physiological and Behavioral Emotion Annotation Dataset for 360 Videos (CEAP-360VR). We conducted a controlled study (N=32) where participants used a Vive Pro Eye HMD to watch eight validated affective 360 video clips and annotated their valence and arousal (V-A) continuously. We collected (a) behavioral signals (head and eye movements; pupillometry); (b) physiological responses (heart rate, skin temperature, electrodermal activity); (c) momentary emotion self-reports; (d) within-VR discrete emotion ratings; and (e) motion sickness, presence, and workload ratings. We show the consistency of continuous annotation trajectories and verify their mean V-A annotations. We find high consistency between viewed 360 video regions across subjects, with higher consistency for eye than head movements. We furthermore run baseline classification experiments, where Random Forest classifiers with 2s segments show good accuracies for subject-independent models: 66.80% (V) and 64.26% (A) for binary classification; 49.92% (V) and 52.20% (A) for 3-class classification. Our open dataset allows further experiments with continuous emotion self-reports collected in 360 VR environments, which can enable automatic assessment of immersive Quality of Experience (QoE) and momentary affective states.
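As an illustration of the kind of baseline reported in the abstract, the minimal sketch below trains a subject-independent Random Forest on 2 s segments using leave-one-subject-out cross-validation. It is not the authors' pipeline: the file name, column names, and preprocessing are hypothetical placeholders and would need to be adapted to the actual CEAP-360VR release.

```python
# Minimal sketch of a subject-independent Random Forest baseline on 2 s segments,
# in the spirit of the experiments described in the abstract.
# Assumptions: "ceap360vr_segments_2s.csv" is a hypothetical table with one row
# per 2 s segment, aggregated physiological/behavioral features, a per-segment
# binary valence/arousal label, and a subject identifier.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

df = pd.read_csv("ceap360vr_segments_2s.csv")  # placeholder path

label_cols = ("subject_id", "valence_label", "arousal_label")
feature_cols = [c for c in df.columns if c not in label_cols]

X = df[feature_cols].to_numpy()
y = df["valence_label"].to_numpy()      # use "arousal_label" for the arousal task
groups = df["subject_id"].to_numpy()    # grouping by subject keeps splits subject-independent

clf = RandomForestClassifier(n_estimators=200, random_state=42)
logo = LeaveOneGroupOut()               # leave-one-subject-out cross-validation
scores = cross_val_score(clf, X, y, cv=logo, groups=groups, scoring="accuracy")
print(f"Mean subject-independent accuracy: {scores.mean():.2%}")
```

The same loop extends to the 3-class setting by using a three-level label column; only the label vector changes, not the splitting strategy.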

Original language: English
Pages (from-to): 243-255
Number of pages: 13
Journal: IEEE Transactions on Multimedia
Volume: 25
DOI
Publication status: Published - 2023
