TY - JOUR
T1 - CEAP-360VR
T2 - A Continuous Physiological and Behavioral Emotion Annotation Dataset for 360 VR Videos
AU - Xue, Tong
AU - El Ali, Abdallah
AU - Zhang, Tianyi
AU - Ding, Gangyi
AU - Cesar, Pablo
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2023
Y1 - 2023
AB - Watching 360 videos using Virtual Reality (VR) head-mounted displays (HMDs) provides interactive and immersive experiences, where videos can evoke different emotions. Existing emotion self-report techniques within VR, however, are either retrospective or interrupt the immersive experience. To address this, we introduce the Continuous Physiological and Behavioral Emotion Annotation Dataset for 360 Videos (CEAP-360VR). We conducted a controlled study (N=32) where participants used a Vive Pro Eye HMD to watch eight validated affective 360 video clips and annotated their valence and arousal (V-A) continuously. We collected (a) behavioral signals (head and eye movements; pupillometry), (b) physiological responses (heart rate, skin temperature, electrodermal activity), (c) momentary emotion self-reports, (d) within-VR discrete emotion ratings, and (e) motion sickness, presence, and workload ratings. We show the consistency of continuous annotation trajectories and verify their mean V-A annotations. We find high consistency between viewed 360 video regions across subjects, with higher consistency for eye than for head movements. We furthermore run baseline classification experiments, where Random Forest classifiers with 2s segments show good accuracies for subject-independent models: 66.80% (V) and 64.26% (A) for binary classification; 49.92% (V) and 52.20% (A) for 3-class classification. Our open dataset allows further experiments with continuous emotion self-reports collected in 360 VR environments, which can enable automatic assessment of immersive Quality of Experience (QoE) and momentary affective states.
KW - 360 video
KW - HMD
KW - continuous annotation
KW - dataset
KW - emotion
KW - head and eye movement
KW - physiological signals
KW - virtual reality
UR - http://www.scopus.com/inward/record.url?scp=85118688427&partnerID=8YFLogxK
DO - 10.1109/TMM.2021.3124080
M3 - Article
AN - SCOPUS:85118688427
SN - 1520-9210
VL - 25
SP - 243
EP - 255
JO - IEEE Transactions on Multimedia
JF - IEEE Transactions on Multimedia
ER -