Abstract
To evoke a place illusion, virtual reality relies on the integration of coherent sensory information from multiple modalities. This integrative view of perception can be contradicted when the quality evaluation of virtual reality is split into multiple uni-modal tests. We show that the type and cross-modal consistency of visual content affect overall audio quality in a six-degrees-of-freedom virtual environment with expert and naïve participants. The effect is observed both in their movement patterns and in the direct quality scores given to three real-time binaural audio rendering technologies. Our experiments show that the visual content has a statistically significant effect on perceived audio quality.
Original language | English |
---|---|
Publication status | Published - 2018 |
Event | 145th Audio Engineering Society International Convention, AES 2018 - New York, United States |
Duration | 18 Oct 2018 → 21 Oct 2018 |
Conference
Conference | 145th Audio Engineering Society International Convention, AES 2018 |
---|---|
Country/Territory | United States |
City | New York |
Period | 18/10/18 → 21/10/18 |