{"id":381242,"date":"2017-05-04T14:06:35","date_gmt":"2017-05-04T21:06:35","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&p=381242"},"modified":"2019-08-19T10:01:47","modified_gmt":"2019-08-19T17:01:47","slug":"sparse-haptic-proxy","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/sparse-haptic-proxy\/","title":{"rendered":"Sparse Haptic Proxy"},"content":{"rendered":"
\u00a0<\/b>\"\"<\/h6>\n
A user of virtual reality may find themselves in different virtual worlds, such as a spy game (middle) or a space simulator (right), yet all of them provide tangible feedback using the same physical geometry in the real world (left).<\/h6>\n

We propose a class of passive haptics that we call Sparse Haptic Proxy: a set of geometric primitives that simulate touch feedback in elaborate virtual reality scenes. Unlike previous passive haptics that replicate the virtual environment in physical space, a Sparse Haptic Proxy simulates a scene\u2019s detailed geometry by redirecting the user\u2019s hand to a matching primitive of the proxy. To bridge the divergence of the scene from the proxy, we augment an existing Haptic Retargeting technique with on-the-fly target remapping: we predict users\u2019 intentions during interaction in the virtual space by analyzing their gaze and hand motions, and consequently redirect their hand to a matching part of the proxy. We conducted three user studies on haptic retargeting and built our system on three main results: 1) The maximum angle participants found acceptable for retargeting their hand is 40\u00b0, with an average rating of 4.6 out of 5. 2) Tracking participants\u2019 eye gaze reliably predicts their touch intentions (97.5%), even while the user\u2019s hand-eye coordination is simultaneously being manipulated for retargeting. 3) Participants preferred minimized retargeting distances over better-matching surfaces of our Sparse Haptic Proxy when receiving haptic feedback for single-finger touch input. We demonstrate our system with two virtual scenes: a flight cockpit and a room quest game. While their scene geometries differ substantially, both use the same sparse haptic proxy to provide haptic feedback to the user during task completion.<\/p>\n
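To illustrate the idea, the following is a minimal sketch (our own illustration with assumed function names and tracked positions, not the project\u2019s implementation) of the two ingredients described above: predicting the intended virtual target from the gaze ray, and warping the rendered hand so that it meets the virtual surface exactly when the real hand reaches the matching proxy primitive.<\/p>\n
<pre>import numpy as np

# Sketch only: gaze-based target prediction plus a hand-warping redirection
# of the rendered hand. Positions are assumed to come from the VR tracker
# as 3D numpy arrays in a shared coordinate frame.

def predict_target(gaze_origin, gaze_dir, virtual_targets):
    # Predict the intended target as the virtual object closest to the gaze ray.
    d = gaze_dir / np.linalg.norm(gaze_dir)
    def ray_distance(p):
        v = p - gaze_origin
        return np.linalg.norm(v - np.dot(v, d) * d)
    return min(virtual_targets, key=ray_distance)

def warped_hand(real_hand, reach_start, proxy_target, virtual_target):
    # Offset the rendered hand in proportion to the real hand's progress toward
    # the proxy primitive, so virtual and physical contact coincide.
    total = np.linalg.norm(proxy_target - reach_start)
    progress = np.clip(np.linalg.norm(real_hand - reach_start) / max(total, 1e-6), 0.0, 1.0)
    return real_hand + progress * (virtual_target - proxy_target)<\/pre>\n
In a scheme like this the redirection builds gradually over the reach rather than appearing as a sudden jump; the study results above suggest keeping the resulting angular offset between the virtual and physical targets below roughly 40\u00b0.<\/p>\n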

\u00a0<\/span>\"\"<\/p>\n

User operations in the virtual world (touching a log, picking up a key, rolling a dial) versus the user\u2019s touch in the real world.<\/h6>\n


","protected":false},"excerpt":{"rendered":"

A user of virtual reality may find themselves in different virtual worlds, such as a spy game (middle) or a space simulator (right), yet all of them provide tangible feedback using the same physical geometry in the real world (left). We propose a class of passive haptics that we call Sparse Haptic Proxy: a […]<\/p>\n","protected":false},"featured_media":0,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"footnotes":""},"research-area":[13554],"msr-locale":[268875],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-381242","msr-project","type-msr-project","status-publish","hentry","msr-research-area-human-computer-interaction","msr-locale-en_us","msr-archive-status-active"],"msr_project_start":"2017-05-04","related-publications":[381233],"related-downloads":[],"related-videos":[],"related-groups":[],"related-events":[],"related-opportunities":[],"related-posts":[],"related-articles":[],"tab-content":[],"slides":[],"related-researchers":[{"type":"user_nicename","display_name":"Andy Wilson","user_id":31159,"people_section":"Group 1","alias":"awilson"}],"msr_research_lab":[],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/381242"}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-project"}],"version-history":[{"count":4,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/381242\/revisions"}],"predecessor-version":[{"id":391229,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/381242\/revisions\/391229"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=381242"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=381242"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=381242"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=381242"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=381242"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}