{"id":546402,"date":"2018-10-29T15:51:40","date_gmt":"2018-10-29T22:51:40","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-research-item&p=546402"},"modified":"2020-10-16T12:38:20","modified_gmt":"2020-10-16T19:38:20","slug":"wave-acoustics-in-a-mixed-reality-shell","status":"publish","type":"msr-research-item","link":"https:\/\/www.microsoft.com\/en-us\/research\/publication\/wave-acoustics-in-a-mixed-reality-shell\/","title":{"rendered":"Wave Acoustics in a Mixed Reality Shell"},"content":{"rendered":"
We demonstrate the first integration of wave acoustics in a virtual reality operating system. The Windows Mixed Reality shell hosts third-party applications inside a 3D virtual home, propagating sound from these applications throughout the environment to provide a natural user interface. Rather than applying manually designed reverberation volumes or ray-traced geometric acoustics, we use wave acoustics, which robustly capture cues like diffracted occlusion and reverberation propagating through portals while reducing the design and maintenance burden. Failing to model such cues can produce jarring inconsistencies, such as applications in an adjoining room being heard clearly through the wall, breaking immersion. We describe our rendering implementation, materials-based design techniques, reverberation tuning, dynamic range management, and temporal smoothing, which together ensure a natural listening experience across unpredictable audio content and user motion.<\/p>\n
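The abstract mentions temporal smoothing of acoustic parameters so that unpredictable user motion does not cause audible jumps. The sketch below is a minimal illustration of that general idea using a standard one-pole exponential smoother, not the shell's actual implementation; the function name, the dB parameterization, and the time constant are assumptions for illustration only.

```python
import math

def smooth_parameter(current_db, target_db, dt, tau=0.1):
    """One-pole exponential smoother for an acoustic parameter in dB.

    Moves the currently rendered value toward the target computed by the
    acoustics system, so abrupt changes (e.g. a wall coming between the
    listener and a source as the user walks) do not produce audible jumps.
    tau is an assumed smoothing time constant in seconds.
    """
    alpha = 1.0 - math.exp(-dt / tau)  # per-update smoothing coefficient
    return current_db + alpha * (target_db - current_db)

# Hypothetical example: occlusion gain stepping from 0 dB toward -20 dB,
# updated every 10 ms; the gain glides smoothly instead of jumping.
gain = 0.0
for _ in range(50):
    gain = smooth_parameter(gain, -20.0, dt=0.01)
```

After 0.5 s of updates the smoothed gain has converged close to the -20 dB target; shorter time constants track faster at the cost of more audible parameter motion.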