{"id":558039,"date":"2019-01-02T20:14:06","date_gmt":"2019-01-03T04:14:06","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&p=558039"},"modified":"2023-01-13T13:31:43","modified_gmt":"2023-01-13T21:31:43","slug":"audio-devices","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/audio-devices\/","title":{"rendered":"Audio Devices"},"content":{"rendered":"

This research explores audio devices that contain one or more microphones and\/or loudspeakers and operate with audible sound or ultrasound. The goal is to optimize the design and placement of microphones and loudspeakers by considering the characteristics of the acoustical and geometrical environment, along with the perceived soundscape. This research informs products such as Microsoft HoloLens and Kinect for Xbox.<\/p>\n

Microphone enclosure acoustical design<\/h3>\n

Placing a transducer into a device changes its acoustical parameters, which makes the device enclosure itself a subject of acoustical design. We have designed enclosures for various transducers for research purposes and assisted Microsoft engineering teams with the design of their products, including the acoustical design optimization of the microphones in Kinect for Xbox 360. By optimizing the microphones' positions and enclosures, we achieved a directivity pattern better than that of open-air microphones.<\/p>\n

Our acoustics measurement tool: the anechoic chamber<\/h3>\n

The Microsoft Research anechoic chamber set up for measuring human spatial hearing<\/p><\/div>\n

In our work on acoustical design and spatial audio, we have built a specialized setup in the Microsoft Research anechoic chamber. An arc with sixteen loudspeakers and sixteen microphones can be rotated around the device under test, letting us measure 3D directivity or radiation patterns quickly and easily. We have also used this setup to measure the 3D directivity patterns of human hearing, known as Head-Related Transfer Functions (HRTFs).<\/p>\n

Microphone arrays<\/h3>\n

A single microphone placed in front of a human talker captures not only speech but also noise and reverberation. These distortions can be reduced through beamforming: placing several microphones close together to form a microphone array, then processing their signals jointly to achieve higher directivity. Over the years, we have designed and built several prototypes of linear and circular microphone arrays. The experience gained and the algorithms designed were used in several Microsoft products: the RoundTable device, microphone array support in Windows, Kinect for Xbox, and Microsoft HoloLens.<\/p>\n
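The beamforming idea above can be illustrated with a minimal delay-and-sum sketch in Python with NumPy. The array geometry, sampling rate, and signals below are hypothetical, and product implementations are far more sophisticated; this only shows how aligning and averaging channels boosts sound from the steered direction.

```python
import numpy as np

def delay_and_sum(signals, mic_positions, steer_angle, fs, c=343.0):
    """Delay-and-sum beamformer for a linear array along the x-axis.

    signals: (n_mics, n_samples); mic_positions in meters;
    steer_angle in radians from broadside. Fractional delays are
    applied in the frequency domain, then the channels are averaged.
    """
    n_mics, n_samples = signals.shape
    delays = mic_positions * np.sin(steer_angle) / c      # per-mic delay, s
    freqs = np.fft.rfftfreq(n_samples, 1.0 / fs)
    spectra = np.fft.rfft(signals, axis=1)
    # Advance each channel so the steered wavefront aligns, then average.
    phase = np.exp(2j * np.pi * freqs[None, :] * delays[:, None])
    return np.fft.irfft(np.mean(spectra * phase, axis=0), n_samples)

# Hypothetical example: a 2 kHz plane wave from 30 degrees hitting an
# 8-mic line array with 4 cm spacing, simulated in the frequency domain.
fs, n, c = 16000, 1024, 343.0
pos = np.arange(8) * 0.04
t = np.arange(n) / fs
tone = np.sin(2 * np.pi * 2000.0 * t)
freqs = np.fft.rfftfreq(n, 1.0 / fs)
delays = pos * np.sin(np.deg2rad(30)) / c
spec = np.fft.rfft(tone)
signals = np.fft.irfft(
    spec[None, :] * np.exp(-2j * np.pi * freqs[None, :] * delays[:, None]), n)

on_target = delay_and_sum(signals, pos, np.deg2rad(30), fs)
off_target = delay_and_sum(signals, pos, np.deg2rad(-60), fs)
# Steering toward the source preserves it; steering away attenuates it.
```

With more microphones or a larger aperture the main lobe narrows, which is one reason array geometry is co-designed with the enclosure.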

Loudspeaker arrays<\/h3>\n

In a similar way, multiple loudspeakers can be combined to form a beam of sound toward a chosen direction, reaching one listener while reducing the signal level for others. We can even create multiple beams simultaneously, sending different audio streams to different listeners in the same room. We have built several prototypes of linear loudspeaker arrays, targeting mainly media-room scenarios. One of the latest developments in this area is cross-talk cancellation: delivering a different signal to each ear of the listener, so that a loudspeaker array can act as virtual headphones. By combining cross-talk cancellation with spatial audio reproduction and tracking of the position and orientation of the user\u2019s head, sound sources behind, above, and below the listener can be reproduced virtually.<\/p>\n
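A cross-talk canceller can be sketched as a per-frequency inversion of the 2x2 matrix of speaker-to-ear transfer functions. The snippet below is a minimal illustration with synthetic, well-conditioned transfer functions; real systems measure or model these responses, handle ill-conditioned frequencies, and adapt as the head moves. All names and values here are hypothetical.

```python
import numpy as np

def crosstalk_filters(H, beta=1e-6):
    """Regularized per-frequency inverse of the speaker-to-ear matrix.

    H: (n_freqs, 2, 2) complex, indexed H[f, ear, speaker]. Returns C
    such that H @ C is close to the identity, so binaural signals
    filtered by C and played over the two loudspeakers arrive
    separated at the two ears.
    """
    Hh = np.conj(np.swapaxes(H, 1, 2))
    # Tikhonov-regularized least-squares inverse: (H^H H + beta I)^-1 H^H
    return np.linalg.solve(Hh @ H + beta * np.eye(2)[None, :, :], Hh)

# Synthetic, well-conditioned transfer functions for 64 frequency bins.
rng = np.random.default_rng(0)
H = np.eye(2)[None, :, :] + 0.3 * (
    rng.standard_normal((64, 2, 2)) + 1j * rng.standard_normal((64, 2, 2)))
C = crosstalk_filters(H)
residual = H @ C - np.eye(2)[None, :, :]   # small when inversion succeeds
```

The regularization term `beta` trades separation accuracy for robustness at frequencies where the two loudspeaker paths are nearly indistinguishable.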

Ultrasound probing devices<\/h3>\n

Hand gesture images obtained with the ultrasound sensing device for gesture recognition.<\/p><\/div>\n

Similar to how bats and dolphins use echolocation, we are researching the use of beamforming in the ultrasound band to construct images of objects. We focus sound in a given direction with a loudspeaker array, listen in the same direction with a microphone array, and capture the reflections from objects in that direction. By scanning the space, we can construct an image of the objects in front of the ultrasound sensing device. The short wavelength of ultrasound allows detection of even small objects. This low-power ultrasound probing device can generate three types of images: reflection, distance, and Doppler, making it suitable for applications such as body-position retrieval and gesture recognition.<\/p>\n
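The distance image mentioned above comes from time-of-flight: each beam direction yields a round-trip delay from the emitted pulse to its echo. Below is a minimal single-direction sketch with a hypothetical pulse, sampling rate, and target distance (an audible-band toy for simplicity; the actual device operates in the ultrasound band and scans many directions).

```python
import numpy as np

def echo_distance(tx, rx, fs, c=343.0):
    """Estimate target distance from one pulse-echo pair.

    Matched-filter the received signal rx with the emitted pulse tx;
    the lag of the correlation peak is the round-trip delay in
    samples, so the distance is c * delay / 2.
    """
    corr = np.correlate(rx, tx, mode="full")
    lag = np.argmax(np.abs(corr)) - (len(tx) - 1)
    return c * lag / fs / 2.0

# Hypothetical setup: a 1 ms 20-40 kHz chirp, one target 0.5 m away.
fs, c = 192000, 343.0
t = np.arange(int(0.001 * fs)) / fs
k = (40000.0 - 20000.0) / 0.001                  # chirp rate, Hz/s
tx = np.sin(2 * np.pi * (20000.0 * t + 0.5 * k * t**2))
delay = int(round(2 * 0.5 / c * fs))             # round-trip delay, samples
rx = np.zeros(delay + len(tx) + 100)
rx[delay:delay + len(tx)] = 0.2 * tx             # attenuated echo
d = echo_distance(tx, rx, fs)                    # close to 0.5 m
```

Repeating this estimate for every steered beam direction produces the distance image; the reflection image uses echo strength, and the Doppler image uses the frequency shift of the echo.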

Devices for augmented and virtual reality (AR\/VR)<\/h3>\n

64-element spherical microphone array with two wide-angle cameras for video and audio capture for AR\/VR scenarios.<\/p><\/div>\n

With recent advancements in augmented and virtual reality, it is becoming more important to capture audio as a three-dimensional sound field. For these scenarios, we have built a variety of cylindrical and spherical microphone arrays with 16 or 64 microphone elements. By processing these signals, we can achieve a device-independent representation of the sound field. This representation allows the audio to be rendered later on an arbitrary system of loudspeakers or on headphones, using spatial audio. Some of our spherical microphone arrays also integrate wide-angle cameras, turning them into devices that can broadcast AR\/VR audio and video in real time.<\/p>\n","protected":false},"excerpt":{"rendered":"

Our research explores audio devices that contain one or more microphones and\/or one or more loudspeakers, and which support audible audio or ultrasound waves.<\/p>\n","protected":false},"featured_media":558057,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","footnotes":""},"research-area":[243062],"msr-locale":[268875],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-558039","msr-project","type-msr-project","status-publish","has-post-thumbnail","hentry","msr-research-area-audio-acoustics","msr-locale-en_us","msr-archive-status-active"],"msr_project_start":"","related-publications":[346982,166735,165800,155562,155563,155571,687735,155559,155573,165801],"related-downloads":[],"related-videos":[],"related-groups":[],"related-events":[],"related-opportunities":[],"related-posts":[],"related-articles":[],"tab-content":[],"slides":[],"related-researchers":[{"type":"user_nicename","display_name":"Sebastian Braun","user_id":37688,"people_section":"Audio and Acoustics Research Group","alias":"sebraun"},{"type":"user_nicename","display_name":"Hannes Gamper","user_id":31943,"people_section":"Audio and Acoustics Research Group","alias":"hagamper"},{"type":"user_nicename","display_name":"David Johnston","user_id":31562,"people_section":"Audio and Acoustics Research Group","alias":"davidjo"},{"type":"user_nicename","display_name":"Ivan Tashev","user_id":32127,"people_section":"Audio and Acoustics Research Group","alias":"ivantash"},{"type":"user_nicename","display_name":"John Romualdez","user_id":32375,"people_section":"Collaborators","alias":"johnrom"},{"type":"guest","display_name":"Becky Gagnon","user_id":632013,"people_section":"Collaborators","alias":""},{"type":"guest","display_name":"Md Tamzeed Islam","user_id":814600,"people_section":"Past Interns","alias":""},{"type":"guest","display_name":"Amit Das","user_id":558048,"people_section":"Past 
Interns","alias":""},{"type":"guest","display_name":"Supreeth Krishna Rao","user_id":558051,"people_section":"Past Interns","alias":""},{"type":"guest","display_name":"Ivan Dokmanic","user_id":558054,"people_section":"Past Interns","alias":""}],"msr_research_lab":[],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/558039"}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-project"}],"version-history":[{"count":10,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/558039\/revisions"}],"predecessor-version":[{"id":665970,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/558039\/revisions\/665970"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/558057"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=558039"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=558039"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=558039"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=558039"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=558039"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}