{"id":306041,"date":"2010-10-04T05:45:01","date_gmt":"2010-10-04T12:45:01","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?p=306041"},"modified":"2019-09-30T17:43:20","modified_gmt":"2019-10-01T00:43:20","slug":"uist-showcases-novel-interfaces","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/uist-showcases-novel-interfaces\/","title":{"rendered":"UIST Showcases Novel Interfaces"},"content":{"rendered":"

By Janie Chang, Writer, Microsoft Research

Hallway conversations at UIST 2010 can sound like planning discussions for science-fiction-movie special effects, buzzing with terms such as “wearable computing,” “augmented reality,” and “smart rooms.”

UIST, the Association for Computing Machinery’s (ACM’s) Symposium on User Interface Software and Technology, brings together researchers and practitioners of new user interfaces. Taking place in New York City from Oct. 3 to 6, the event features presentations and demos that cover topics ranging from traditional graphical and web user interfaces to virtual and augmented reality to tools that assist the motor-impaired. Sponsored by the ACM’s special-interest groups on computer-human interaction and computer graphics, UIST is an opportunity for attendees to stay current with the latest breakthroughs in user-interaction techniques.

Mixing Real and Virtual Worlds

Significant hardware advances in recent years have brought radical changes to the natural-user-interface (NUI) research projects attendees are seeing during UIST. One of the many intriguing projects coming out of Microsoft Research Redmond’s Adaptive Systems and Interaction group is being presented by Andy Wilson, senior researcher, and Hrvoje Benko, researcher, in a paper titled Combining Multiple Depth Cameras and Projectors for Interactions On, Above, and Between Surfaces. The paper describes the interactions and algorithms unique to LightSpace, a small “smart room” instrumented with depth cameras and projectors, designed to explore a variety of computational strategies related to interactive displays and the space they inhabit.

\"LightSpace

LightSpace configuration: All projectors and cameras are suspended at a central location above the users, leaving the rest of the space open and configurable.<\/p><\/div>\n

Latest-generation depth cameras simplify traditionally difficult computer-vision problems and enable inexpensive real-time 3-D modeling of surface geometry. In LightSpace, cameras and projectors are calibrated to 3-D real-world coordinates, enabling projection of graphics onto any surface visible by both camera and projector, including floors, walls, furniture, and people.
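To make the calibration idea concrete, the sketch below shows how a 3-D point measured in world coordinates, for example by a calibrated depth camera, could be mapped to the projector pixel that illuminates it by treating the projector as an inverse pinhole camera. This is a minimal illustration under assumed calibration values; the intrinsic matrix, rotation, and translation are invented for the example and are not LightSpace’s actual parameters.

```python
import numpy as np

# Hypothetical calibration for one projector (illustrative values only,
# not LightSpace's real parameters).
K_proj = np.array([[1400.0,    0.0, 640.0],   # projector intrinsics
                   [   0.0, 1400.0, 400.0],
                   [   0.0,    0.0,   1.0]])
R = np.eye(3)                     # world-to-projector rotation (assumed identity)
t = np.array([0.0, 0.0, 2.5])     # projector offset from world origin, metres (assumed)

def world_to_projector_pixel(p_world):
    """Map a 3-D world-space point to projector pixel coordinates."""
    p_proj = R @ p_world + t               # transform into the projector's frame
    u, v, w = K_proj @ p_proj              # perspective projection
    return np.array([u / w, v / w])        # pixel where graphics should be drawn

# A point on a tabletop, in metres, as a depth camera might report it.
print(world_to_projector_pixel(np.array([0.3, -0.2, 1.0])))
```

Once every camera and projector is expressed in the same world coordinates, the same mapping applies to any surface a projector can reach, which is what lets graphics land on tables, walls, and even people.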

Benko and Wilson used LightSpace to study how depth cameras enable new interactive experiences, such as transferring projected objects from one display to another by simultaneously touching the object and the destination display. Alternatively, a user can “pick up” an object by sweeping it into a hand, see a representation of the object in the hand while walking to an interactive wall display, and “drop” the object onto the wall by touching the wall with the other hand.

\"Picking

Picking up objects from a table is accomplished by swiping them into a hand. The object\u2014in this case, a red ball\u2014is then represented visually with an icon in the hand.<\/p><\/div>\n
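How might the system decide that a user is simultaneously touching an object and a destination display? One plausible approach, sketched below purely as an illustration and not as the authors’ implementation, is to run connected-component labeling over the depth data: if the pixels occupied by the user’s body form a single component that touches both surface regions, the object can be transferred. The function and masks here are hypothetical.

```python
import numpy as np
from scipy import ndimage

def surfaces_bridged(occupancy, region_a, region_b):
    """Return True if the occupied pixels (e.g. a user's arm and body seen by
    the depth cameras) form one connected component touching both surface
    regions. All arguments are boolean masks of the same shape."""
    labels, _ = ndimage.label(occupancy)            # connected-component labeling
    in_a = set(np.unique(labels[region_a])) - {0}   # components touching surface A
    in_b = set(np.unique(labels[region_b])) - {0}   # components touching surface B
    return bool(in_a & in_b)                        # same component spans both?

# Toy example: a one-pixel-wide "arm" links the table region to the wall region.
occ = np.zeros((6, 10), dtype=bool)
occ[2, 1:9] = True                                  # the bridging body
table = np.zeros_like(occ); table[:, :2] = True     # surface holding the object
wall = np.zeros_like(occ); wall[:, 8:] = True       # destination surface
print(surfaces_bridged(occ, table, wall))           # True -> transfer the object
```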

The research is a continuation of many previous projects in which the pair combined projectors and cameras to provide interactivity in space without requiring users to wear tracking gear or displays. The LightSpace project began in January 2010, and a first working demonstration of their “smart room” was presented in March during TechFest 2010, Microsoft Research’s annual technology showcase for Microsoft employees.

“Andy and I are fascinated,” Benko says, “with the idea that all the intelligence in the room actually just consists of smart projectors and cameras that simulate various interactive surfaces and provide connectivity between them. We treat our projector-camera units as smart lighting that can sense the user and the user’s actions, as well as provide information on whatever surface the user desires. One of the most fascinating aspects of LightSpace is that, in the absence of an available surface, in mid-air, we use the human body as a canvas to project information.”

\"through-body

LightSpace enables users to move an image from one surface to another using through-body transition.<\/p><\/div>\n

Wilson already sees a number of future directions exploring the range of new interactions that LightSpace enables.

“What other kinds of interactions can be embedded in 3-D space?” he speculates. “What if users were able to reconfigure their workspace, moving tables and wall surfaces into place as needed? LightSpace could find and track these interactive surfaces on the fly.

“Or think about conference-room settings. Imagine sitting down to a meeting and having all your relevant documents appear before you, then sharing them with other meeting participants effortlessly with a quick flicking gesture, or opening up a document on a wall by touching the document on the table and pointing at a spot on the wall, rather than fumbling with video cables.”

When asked whether LightSpace could be the early stages of a Star Trek holodeck, Wilson laughs out loud.

“We would be lying if we said that science fiction had no influence on our work!” he says. “Yes, the holodeck is a great vision, but until we have room-sized holographic video, fantastic wireless tactile displays, and all the artificial intelligence implied by the holodeck, we need to carefully design our interactive systems with current technical limitations in mind. But you know, designing around technical limitations is all part of the fun and challenge, and the solutions can be quite inventive!”

Productive Collaboration

Microsoft Research has a long history of external collaboration, and this year’s contributions to the conference reflect that ongoing commitment. One example is the paper Jogging over a Distance between Europe and Australia, a collaboration among Florian “Floyd” Mueller of Australia’s University of Melbourne and Scotland’s Distance Lab; Frank Vetere and Martin R. Gibbs of the University of Melbourne; Darren Edge, researcher at Microsoft Research Asia; Stefan Agamanolis of Distance Lab; and Jennifer G. Sheridan of London Knowledge Lab. The paper is one of four at UIST 2010 that reflect joint work between Microsoft Research and external collaborators.

The Jogging over a Distance paper is the result of Mueller’s ongoing Ph.D. work, which led to a collaboration with Edge and Microsoft Research Asia when he interned at that facility after winning a Microsoft Research Asia fellowship.

“The fellowship,” Mueller observes, “gave me the opportunity to draw on the insights of world leaders in the field, and my ongoing collaboration with Darren Edge has provided a new depth of understanding and focus on the role that design plays in exertion interfaces. The goal of this research is to bring people in different locations closer together through the power of sports. Teaming up with Microsoft Research Asia also allowed me to better understand the cross-cultural implications such research can have in bridging distances between countries.”

Microsoft has been a longtime supporter of UIST, and this year is no exception. The company is a symposium sponsor, and the Microsoft Applied Sciences Group provided a set of “adaptive keyboards” to students who entered the UIST 2010 Student Innovation Contest. Program participation is also strong, with six of the 38 accepted papers this year authored or co-authored by Microsoft Research staff members. This reflects the company’s commitment to bringing natural user interfaces into mainstream products such as Kinect, with the vision of building an entirely new breed of applications that will change the nature of software interfaces the way graphical user interfaces changed the market more than 20 years ago.

Microsoft Research Contributions to UIST 2010

A Framework for Robust and Flexible Handling of Inputs with Uncertainty
Julia Schwarz, Carnegie Mellon University; Scott Hudson, Carnegie Mellon University; and Andy Wilson, Microsoft Research Redmond.

Combining Multiple Depth Cameras and Projectors for Interactions On, Above, and Between Surfaces
Andy Wilson, Microsoft Research Redmond; and Hrvoje Benko, Microsoft Research Redmond.

Content-Aware Dynamic Timeline for Video Browsing
Suporn Pongnumkul, University of Washington; Jue Wang, Adobe; Gonzalo Ramos, Microsoft; and Michael Cohen, Microsoft Research Redmond.

Gestalt: Integrated Support for Implementation and Analysis in Machine Learning Processes
Kayur Patel, University of Washington; Naomi Bancroft, University of Washington; Steven Drucker, Microsoft Research Redmond; James Fogarty, University of Washington; Amy Ko, University of Washington; and James Landay, University of Washington.

Jogging over a Distance between Europe and Australia
Florian “Floyd” Mueller, University of Melbourne and Distance Lab; Frank Vetere, University of Melbourne; Martin R. Gibbs, University of Melbourne; Darren Edge, Microsoft Research Asia; Stefan Agamanolis, Distance Lab; and Jennifer G. Sheridan, London Knowledge Lab.

Pen + Touch = New Tools
Ken Hinckley, Microsoft Research Redmond; Koji Yatani, Microsoft Research and the University of Toronto; Michel Pahud, Microsoft Research Redmond; Nicole Coddington, Microsoft; Jenny Rodenhouse, Microsoft; Andy Wilson, Microsoft Research Redmond; Hrvoje Benko, Microsoft Research Redmond; and Bill Buxton, Microsoft Research.
