Rubberneck Navigation: Manipulating Frames of Reference for 3D Navigation with 2D Input Devices

Desney Tan, Mathew J. Conway, Ken Hinckley, Takeo Igarashi, Dennis R. Proffitt, George G. Robertson, Randy Pausch

Unpublished manuscript, September 1999
We present a technique that lets users control both viewpoint motion and orientation with a 2D input device and a single button, providing cognitively simple and unobtrusive navigation in desktop 3D virtual environments. In this task, the user must control three frames of reference (environment, body, and head) with a single 2D input device. The observation underlying our solution is that users' control of their frames of reference can be separately, but transparently, constrained in two dissociated modes of use: wayfinding and travel. When users are stationary, we couple the head to the body, allowing them to control the orientation of their entire being and to specify a 3D path for future viewpoint motion. While in motion along this path, we decouple control of the head from the body, giving users the flexibility to look around the environment while the body continues to move along the specified path, a process we term rubbernecking (Figure 1). With a button click, users may toggle between modes and may re-specify paths at any time.
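The abstract describes a simple two-state control scheme: one button toggles between a stationary wayfinding mode, in which the head is rigidly coupled to the body, and a travel mode, in which head orientation is decoupled so the user can look around while the body follows a pre-specified path. The Python sketch below illustrates only that coupling logic; the class, its yaw/pitch camera model, the waypoint-list path, and the recoupling policy on mode exit are illustrative assumptions, not the paper's implementation.

# A minimal sketch (not the authors' code) of the mode toggle and
# head/body frame coupling described in the abstract. All names here
# are assumptions for illustration.

from dataclasses import dataclass, field

@dataclass
class NavigationController:
    # Body frame: position and heading of the user's "entire being".
    position: tuple = (0.0, 0.0, 0.0)
    body_yaw: float = 0.0
    # Head frame: orientation relative to the body; it only diverges
    # from the body while travelling.
    head_yaw: float = 0.0
    head_pitch: float = 0.0
    # Pre-specified 3D path the body follows during travel.
    path: list = field(default_factory=list)
    travelling: bool = False  # False = stationary/wayfinding, True = travel

    def toggle_mode(self):
        """Button click: switch between wayfinding and travel."""
        self.travelling = not self.travelling
        if not self.travelling:
            # One plausible recoupling policy when returning to the
            # stationary mode: fold the head rotation back into the body
            # heading so the coupled frames agree with the current view.
            self.body_yaw += self.head_yaw
            self.head_yaw = self.head_pitch = 0.0

    def on_mouse_delta(self, dx, dy):
        """Route the 2D input to whichever frame the current mode exposes."""
        if self.travelling:
            # Travel: head decoupled; the mouse looks around
            # ("rubbernecking") while the body keeps following the path.
            self.head_yaw += dx
            self.head_pitch += dy
        else:
            # Stationary: head coupled to body; the mouse turns the
            # whole being.
            self.body_yaw += dx

    def set_path(self, waypoints):
        """Specify, or re-specify, the 3D path for future viewpoint motion."""
        self.path = list(waypoints)

    def update(self):
        """Advance the body along the path each frame while travelling."""
        if self.travelling and self.path:
            self.position = self.path.pop(0)  # naive waypoint hopping

    def view_direction(self):
        """World-space gaze: body heading composed with the head offset."""
        return self.body_yaw + self.head_yaw, self.head_pitch

# Example: specify a path, start travelling, look around mid-flight.
nav = NavigationController()
nav.set_path([(0, 0, 1), (0, 0, 2), (0, 0, 3)])
nav.toggle_mode()            # button click: enter travel mode
nav.on_mouse_delta(30, -5)   # rubberneck without leaving the path
nav.update()
print(nav.position, nav.view_direction())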

Download: https://www.microsoft.com/en-us/research/wp-content/uploads/2016/12/Rubber-Neck-Navigation-Unpublished-Manuscript-1999.pdf