A Taxonomy of Sounds in Virtual Reality

  • Dhruv Jain,
  • Eyal Ofek,
  • Mike Sinclair,
  • John Porter,
  • Chris Yoon,
  • Swetha Machanavajhala,
  • Meredith Ringel Morris

ACM Designing Interactive Systems (DIS 2021)



Abstract

Virtual reality (VR) leverages the human senses of sight, hearing, and touch to convey virtual experiences. For d/Deaf and hard of hearing (DHH) people, information conveyed through sound may not be accessible. To inform the future design of accessible VR sound representations for DHH users, this paper contributes a consistent language and structure for representing sounds in VR. Across two studies, we report on the design and evaluation of a novel taxonomy for VR sounds. In Study 1, we interviewed 10 VR sound designers to develop our taxonomy along two dimensions: sound source and intent. To evaluate this taxonomy, we conducted a second study (Study 2) in which eight HCI researchers used our taxonomy to document sounds in 33 VR apps. We found that our taxonomy successfully categorized nearly all sounds (265 of 267) in these apps. We also uncovered additional insights for designing accessible visual and haptic sound substitutes for DHH users.

ACM DIS 2021 Presentation