{"id":510287,"date":"2018-10-30T14:45:27","date_gmt":"2018-10-30T21:45:27","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&p=510287"},"modified":"2018-12-03T13:24:38","modified_gmt":"2018-12-03T21:24:38","slug":"prism","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/prism\/","title":{"rendered":"Project PRISM"},"content":{"rendered":"

Early in 2016, our team began developing augmented reality (AR) headsets by adding stereo cameras to a virtual reality (VR) system. The user sees the real world through the video feed from the cameras, and synthetic imagery is composited with the video to create a low-cost AR system.

This form of AR has several advantages over optical see-through AR:

1. Low cost: the camera modules add roughly $40 in parts cost to a VR headset

2. Wide field of view, exceeding 100 degrees both horizontally and vertically

3. Works in full sunlight, with truly opaque synthetic imagery. This improves the color rendition and contrast of synthetic imagery even indoors.

Our first prototypes used off-the-shelf machine vision cameras attached to a commercial VR headset with a display resolution of approximately one megapixel per eye at 60 Hz or 90 Hz (Figs. 1, 2). Two USB 3.0 cables transmit the camera images to the host CPU.

We wrote our own real-time image signal processing pipeline to demosaic the Bayer image, correct for camera lens distortion, and correct color. Photon-to-photon latency is approximately 50 ms. This delay can be corrected surprisingly well with a late-stage reprojection homography.
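The per-frame stages are standard; below is a minimal sketch in Python with OpenCV, assuming an RGGB Bayer layout, precomputed lens calibration, and a 3×3 color correction matrix. The function and parameter names are illustrative, not the project's actual code.

```python
import cv2
import numpy as np

def process_frame(raw, K, dist, ccm):
    """One frame through the ISP: demosaic -> undistort -> color correct.

    raw  : 2D Bayer mosaic from the sensor (RGGB assumed here)
    K    : 3x3 camera intrinsic matrix from calibration
    dist : lens distortion coefficients (k1, k2, p1, p2, k3)
    ccm  : 3x3 color correction matrix, sensor RGB -> display RGB
    """
    # Demosaic the Bayer pattern into a full RGB image.
    rgb = cv2.cvtColor(raw, cv2.COLOR_BayerRG2RGB)

    # Remove lens distortion using the calibrated camera model.
    rgb = cv2.undistort(rgb, K, dist)

    # Apply the color correction matrix to every pixel.
    out = rgb.reshape(-1, 3).astype(np.float32) @ ccm.T
    return np.clip(out, 0, 255).reshape(rgb.shape).astype(np.uint8)
```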

\"

Figure 1\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Figure 2<\/p><\/div>\n

The cameras were synchronized to the displays by a custom sync board (Fig. 3), designed and built in the Microsoft Research (MSR) Labs hardware lab. It was triggered by the vertical refresh signal from the display circuit on the head-mounted display (HMD). This signal is not available on the exterior of any of the commercial headsets we used, so we opened the HMD case and probed pins on the display circuit board to find one that had the correct display update frequency.
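Identifying the right pin amounts to a frequency check: the vertical refresh line should pulse at exactly the display rate. A small sketch of that test, assuming rising-edge timestamps captured from a candidate pin with a logic analyzer (hypothetical tooling, not the lab's actual setup):

```python
import numpy as np

def looks_like_vsync(edge_times_s, display_hz=90.0, tol=0.02):
    """Return True if rising-edge timestamps (in seconds) from a
    probed pin match the expected display refresh rate."""
    periods = np.diff(edge_times_s)       # time between successive edges
    freq = 1.0 / np.median(periods)       # median rejects occasional glitches
    return abs(freq - display_hz) / display_hz < tol
```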

\"\"

Figure 3<\/p><\/div>\n

These early prototypes were promising enough to encourage us to build a second-generation headset, which we began in the summer of 2017. The biggest weaknesses of the first generation were the bulk and weight of the large-form-factor machine vision cameras and the low resolution of the displays. Both have been addressed in the second generation.

The new headsets have a custom camera control circuit board designed and built by the MSR Labs hardware lab. It uses a rolling-shutter image sensor, the OmniVision OV4689, which can capture 4 megapixels at 90 Hz within a cell-phone form factor. The cameras are again synchronized to the VR headset displays by tapping the sync signal from the display controller board, but the sync circuitry is now contained entirely within the camera controller board.

The new camera module is much smaller and lighter than the smallest off-the-shelf machine vision system of equivalent resolution and frame rate (Figs. 4, 5). Image quality is also surprisingly good given the small 2-micron-square pixels.

\"The

Figure 4\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Figure 5<\/p><\/div>\n

The sensor video is transferred to the host computer over USB 3.0 cables via a Cypress CX3 MIPI-to-USB bridge chip. Bandwidth limitations in this chip restricted 90 Hz capture to 1920×1440, not quite the full 2688×1520 resolution the sensor is capable of.
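The arithmetic behind that limit is easy to sanity-check. Assuming 10-bit RAW Bayer output and the roughly 3.2 Gbps of usable USB 3.0 throughput left after 8b/10b encoding and protocol overhead (both assumptions, not figures from the project):

```python
# Approximate sensor data rates, ignoring blanking and framing overhead.
BITS_PER_PIXEL = 10   # 10-bit RAW Bayer (assumed)
FPS = 90

def gbps(width, height):
    return width * height * BITS_PER_PIXEL * FPS / 1e9

print(f"1920x1440 @ 90 Hz: {gbps(1920, 1440):.2f} Gbps")  # ~2.49 Gbps
print(f"2688x1520 @ 90 Hz: {gbps(2688, 1520):.2f} Gbps")  # ~3.68 Gbps

# USB 3.0 signals at 5 Gbps, but after 8b/10b encoding and protocol
# overhead roughly 3.2 Gbps remain: the full-resolution stream does
# not fit, while 1920x1440 does.
```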

The MSR Labs Central Engineering team wrote custom camera-controller firmware so that the camera frame rate, exposure, and other sensor parameters can be set in software on the host PC. They also wrote custom USB drivers to handle the high-data-rate video coming over the USB 3.0 cables. Photon-to-photon latency is approximately 50 ms, again corrected with a late-stage reprojection homography.
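Because the correction mainly compensates for head rotation accumulated during the pipeline delay, it reduces to a pure homography between image planes: H = K · RΔ · K⁻¹, where RΔ is the rotation between capture and display. A minimal sketch, with the rotation source and intrinsics as assumptions:

```python
import cv2
import numpy as np

def late_stage_reproject(frame, K, R_delta):
    """Warp a stale camera frame toward the current head pose.

    For pure rotation the image-plane mapping is the homography
    H = K @ R_delta @ inv(K), with R_delta the head rotation
    accumulated between capture and display (e.g. from the
    headset's tracker -- an assumption, not the project's API).
    """
    H = K @ R_delta @ np.linalg.inv(K)
    h, w = frame.shape[:2]
    return cv2.warpPerspective(frame, H, (w, h))
```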

\"headset

Figure 6\u00a0\u00a0\u00a0 \u00a0 \u00a0\u00a0 \u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 Figure 7<\/p><\/div>\n

The higher resolution significantly improves the view of the real world, and the lightweight camera modules make the headset far more comfortable than the first-generation prototypes. In addition, the inside-out tracking of the Mixed Reality headset did away with the mobility limitations of our previous prototypes. The system uses a backpack computer and is fully mobile (Figs. 6, 7).

Approximately 40 of these headsets have been produced and are being used for research in various groups inside Microsoft. Calibrating this many headsets manually would be unreasonably time-consuming, so we created a robotic camera calibration system to do it automatically (Fig. 8).
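The robot's job is to present a calibration target to the cameras at many repeatable poses; the intrinsics then fall out of standard target-based calibration. A sketch of the per-camera step using OpenCV and a checkerboard target (the target geometry and names are illustrative, not the rig's actual parameters):

```python
import cv2
import numpy as np

CORNERS = (9, 6)      # inner checkerboard corners (illustrative)
SQUARE_MM = 25.0      # square size in millimeters (illustrative)

# 3D corner positions in the target's own coordinate frame.
objp = np.zeros((CORNERS[0] * CORNERS[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:CORNERS[0], 0:CORNERS[1]].T.reshape(-1, 2) * SQUARE_MM

def calibrate(images):
    """Recover intrinsics K and distortion coefficients from views of
    the target captured as the robot steps through preset poses."""
    obj_pts, img_pts = [], []
    size = images[0].shape[1::-1]            # (width, height)
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, CORNERS)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    _, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
    return K, dist
```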

\"Calibrating

Figure 8<\/p><\/div>\n

We are actively developing future prototypes with latency in the 5-10 ms range. Higher-resolution prototypes will be made as the resolution of off-the-shelf HMDs increases.
