{"id":305885,"date":"2011-05-09T08:45:12","date_gmt":"2011-05-09T15:45:12","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?p=305885"},"modified":"2016-10-15T13:23:01","modified_gmt":"2016-10-15T20:23:01","slug":"chi-11-enhancing-human-condition","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/chi-11-enhancing-human-condition\/","title":{"rendered":"CHI \u201911: Enhancing the Human Condition"},"content":{"rendered":"

By Janie Chang, Writer, Microsoft Research

The Association for Computing Machinery’s Conference on Human Factors in Computing Systems (CHI 2011), being held May 7-12 in Vancouver, British Columbia, provides a showcase of the latest advances in human-computer interaction (HCI).

“The ongoing challenge,” says Desney S. Tan, CHI 2011 general conference chair and senior researcher at Microsoft Research Redmond, “is to make computing more accessible by integrating technology seamlessly into our everyday tasks, to understand and enhance the human condition like never before.”

Microsoft Research has a consistent record of support for CHI through sponsorships and research contributions. This year, Microsoft researchers authored or co-authored 40 conference papers and notes, approximately 10 percent of the total accepted.

This comes as no surprise to Tan.

“Microsoft Research’s goal,” he says, “is to further the state of the art in computer science and technology. As the realms of human and technology become more and more intertwined, Microsoft Research has focused more and more of our effort at the intersection of human and computer, and this is evident from our researchers’ level of participation.”

One unusual contribution comes from Bill Buxton, Microsoft Research principal researcher. Items from Buxton’s impressive accumulation of interactive devices are on display in an exhibit titled “The Future Revealed in the Past: Selections from Bill Buxton’s Collection of Interactive Devices.”

Effects of Community Size and Contact Rate in Synchronous Social Q&A, by Ryen White and Matthew Richardson of Microsoft Research Redmond and Yandong Liu of Carnegie Mellon University, received one of 13 best-paper awards during the conference, as did Your Noise is My Command: Sensing Gestures Using the Body as an Antenna by former Microsoft Research intern Gabe Cohn and visiting faculty member Shwetak Patel, both from the University of Washington, along with Dan Morris and Tan of Microsoft Research Redmond. One of two best-notes awards went to Interactive Generator: A Self-Powered Haptic Feedback Device, co-authored by Akash Badshah of the Phillips Exeter Academy, a residential high school in Exeter, N.H.; Sidhant Gupta, Cohn, and Patel of the University of Washington; and Nicolas Villar and Steve Hodges of Microsoft Research Cambridge.

The Touch-Sensitive Home

Imagine being freed of physical attachments to input devices because your body is the input device. One approach is to put sensors on the body. The challenge then is to separate actual “signal” from “noise,” such as ambient electromagnetic interference, which overwhelms sensors and makes signal processing difficult. In Your Noise is My Command: Sensing Gestures Using the Body as an Antenna, the researchers turned the problem on its head.

“Can we use that electrical noise as a source of information about where a user is and what that user is doing?” Morris recalls asking. “These are the first experiments to assess whether this is feasible.”

\"human

The human body behaves as an antenna in the presence of noise radiated by power lines and appliances. By analyzing this noise, the entire home becomes an interaction surface.<\/p><\/div>\n

The human body is literally an antenna, picking up signals while moving through the noisy electrical environment of a typical home. The researchers tested whether it is possible to identify signals with enough precision to tell what the user is touching and from where. To measure those signals, the researchers placed a simple sensor on each study participant and recorded the electrical signals collected by those sensors. Laptop computers carried in each person’s backpack collected data as the participants performed a series of “gestures,” such as touching spots on walls and appliances or moving through different rooms.

Next came determining whether analysis of this data provided the ability to distinguish between gestures and locations. It was possible in many cases to recognize participants’ actions based solely on the ambient noise picked up by their bodies. For example, once a participant “taught” the algorithms about the noise environment around a particular light switch by demonstrating gestures around the switch, it was possible to determine which of five spots near that switch the user was touching, with an accuracy of better than 90 percent. Similarly, researchers could identify in which room a participant was present at any given time with an accuracy exceeding 99 percent, because the electrical noise environment of each room is distinct.
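
The post does not detail the recognition pipeline, but the demonstrate-then-recognize workflow it describes can be pictured as a simple matcher: the user touches a known spot a few times to “teach” the system its noise fingerprint, and later touches are classified against those stored examples. The sketch below is a minimal illustration under assumed details (spectral-magnitude features and nearest-neighbor matching; all class and function names are hypothetical), not the authors’ actual method.

```python
# Minimal illustration of the demonstrate-then-recognize idea described above:
# a body-worn sensor records a short window of ambient electrical noise, the
# user "teaches" the system a few labeled touches, and new touches are matched
# against the stored examples. The spectral features, nearest-neighbor matching,
# and all names here are assumptions for illustration, not the paper's pipeline.
import numpy as np

def noise_features(samples: np.ndarray, n_bins: int = 64) -> np.ndarray:
    """Summarize a window of sensor voltage as coarse spectral magnitudes."""
    spectrum = np.abs(np.fft.rfft(samples))
    # Pool the spectrum into a fixed number of bins so windows of slightly
    # different lengths still yield comparable feature vectors (assumes the
    # window is long enough that no bin is empty).
    return np.array([chunk.mean() for chunk in np.array_split(spectrum, n_bins)])

class SpotClassifier:
    """Nearest-neighbor matcher over demonstrated touch locations."""

    def __init__(self) -> None:
        self.examples: list[tuple[np.ndarray, str]] = []

    def teach(self, samples: np.ndarray, label: str) -> None:
        # Called while the user demonstrates touches on a known spot.
        self.examples.append((noise_features(samples), label))

    def classify(self, samples: np.ndarray) -> str:
        # Return the label of the closest stored noise "fingerprint".
        query = noise_features(samples)
        nearest = min(self.examples, key=lambda ex: np.linalg.norm(query - ex[0]))
        return nearest[1]
```

A toy matcher like this only conveys the shape of the problem; the reported accuracies (better than 90 percent for the five spots near a light switch, above 99 percent for room identification) come from the authors’ own, more sophisticated analysis.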

“It was quite a gratifying series of results,” Morris says. “Now, we are considering how we can package this up into a real-time, interactive system and what innovative scenarios we can enable when we turn your entire home into a touch-sensitive surface.”

The Patient as Medical Display Surface

Reports from the World Health Organization and the American Medical Association confirm that patient noncompliance is a major obstacle to successful medical outcomes in treatment of chronic conditions. Doctor-patient communication has been identified as one of the most important factors for improving compliance. The paper AnatOnMe: Facilitating Doctor-Patient Communication Using a Projection-Based Handheld Device focuses on understanding how lightweight, handheld projection technologies can be used to enhance doctor-patient communication during face-to-face exchanges in clinical settings.

\"three

Three presentation surfaces: a) body, b) model, and c) wall.<\/p><\/div>\n

Focusing on physical therapy, co-authors Tao Ni of Virginia Tech, a former Microsoft Research Redmond intern; Amy K. Karlson of Microsoft Research Redmond; and Daniel Wigdor, formerly of Microsoft Research Redmond and now at the University of Toronto, spoke with doctors to understand general communication challenges and design requirements, then built and studied a handheld projection system that flexibly supports the key aspects of information exchange. Doctors can direct handheld projectors at walls or curtains to create an “anywhere” display, or at a patient to overlay useful medical information directly atop the appropriate portion of the anatomy for an augmented-reality view, or “virtual X-ray.”
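
The post describes the system’s behavior (an “anywhere” display on a wall or curtain, or an augmented-reality overlay on the patient) without saying how the device is driven. The sketch below is only one plausible way to organize those modes; the surface and content names, the button-cycling idea, and the opacity heuristic are all assumptions, not AnatOnMe’s design.

```python
# Purely illustrative sketch of the interaction described above: the clinician
# aims the handheld projector at one of three presentation surfaces and picks
# what to show. All names and the opacity heuristic are hypothetical; this is
# not AnatOnMe's implementation.
from enum import Enum, auto

class Surface(Enum):
    BODY = auto()    # project onto the patient's anatomy ("virtual X-ray" overlay)
    MODEL = auto()   # project onto a physical anatomical model
    WALL = auto()    # project onto a wall or curtain as an "anywhere" display

class Content(Enum):
    XRAY = auto()
    EXERCISE_PLAN = auto()

class HandheldProjectorUI:
    def __init__(self) -> None:
        self.surface = Surface.WALL
        self.content = Content.XRAY

    def cycle_surface(self) -> None:
        # A single hardware button could step through wall -> model -> body.
        members = list(Surface)
        self.surface = members[(members.index(self.surface) + 1) % len(members)]

    def frame_to_project(self) -> dict:
        # On the body, imagery is blended semi-transparently so it reads as an
        # overlay on the anatomy; on the wall or model it is drawn opaquely.
        opacity = 0.6 if self.surface is Surface.BODY else 1.0
        return {"surface": self.surface.name, "content": self.content.name,
                "opacity": opacity}
```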

Reviews and formal lab studies with physical therapists and patients established that handheld projections delivered high value and a more engaging, informative experience than what is traditionally available.

“This is an interesting new space,” Karlson says, “because, despite the prevalence of technology in many medical settings, technology has been relatively absent from face-to-face communication and education opportunities between doctors and patients.

“The coolest part was hearing the positive reactions from study participants when we projected medical imagery directly onto their arms and legs. We got, ‘Wow!’ ‘Cool!’ and ‘I feel like I am looking directly through my skin!’ There seems to be something quite compelling and unique about viewing medical imagery on one’s own body.”

Touch-Free Interactions in the Operating Room

The growth of image-guided procedures in surgical settings has led to an increased need to interact with digital images. In a collaboration with Lancaster University funded by Microsoft Research Connections, Rose Johnson of the Open University in Milton Keynes, U.K.; Kenton O’Hara, Abigail Sellen, and Antonio Criminisi of Microsoft Research Cambridge; and Claire Cousins of Addenbrooke’s Hospital in Cambridge, U.K., address the problem of enabling rich, flexible, but touch-free interaction with patient data in surgical settings. The resulting paper, Exploring the Potential for Touchless Interaction in Image-Guided Interventional Radiology, received a CHI 2011 Honorable Mention paper award.

During treatments such as interventional radiology, images are critical in guiding surgeons’ work; yet because of sterility issues, surgeons must avoid touching input devices such as mice or keyboards. They must navigate digital images “by proxy,” using other members of the surgical team to find the right image, pan, or zoom. This can be onerous and time-consuming.

\"X-ray

This view toward an X-ray table from a computer area shows a surgical team and the complex collaborative environment that touch-free interactions must address.<\/p><\/div>\n

The research team began fieldwork with the goal of understanding surgeons’ working practices and is now collaborating with surgical teams to develop and evaluate a system. Touchless-interaction solutions such as Kinect for Xbox 360 offer opportunities for surgeons to regain control of navigating through data. There are many challenges, though, in terms of enabling collaborative control of the interface, as well as achieving fluid engagement and disengagement with the system, because the system needs to know which gestures are “for the system” and which are not.
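
One common way to separate gestures that are “for the system” from incidental movement is an explicit clutch; the sketch below illustrates that idea under assumed inputs (per-frame hand and shoulder heights from a skeleton tracker of the kind Kinect provides). The dwell time, the raise-hand convention, and all names are hypothetical rather than the team’s design.

```python
# Illustrative "clutch" for touch-free control in a sterile setting: gestures
# are forwarded to the image viewer only while the surgeon has explicitly
# engaged the system, here by holding a hand above shoulder height for a short
# dwell. A hypothetical sketch assuming a skeleton-tracking stream (hand and
# shoulder heights per frame); it is not the team's implementation.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HandFrame:
    hand_y: float       # vertical position of the tracked hand (metres)
    shoulder_y: float   # vertical position of the same-side shoulder
    timestamp: float    # seconds

class EngagementClutch:
    DWELL_SECONDS = 1.0  # how long the hand must stay raised before engaging

    def __init__(self) -> None:
        self.engaged = False
        self._raise_started: Optional[float] = None

    def update(self, frame: HandFrame) -> bool:
        hand_raised = frame.hand_y > frame.shoulder_y
        if not hand_raised:
            # Dropping the hand disengages and cancels any pending dwell, so
            # incidental movements over the patient are not read as commands.
            self.engaged = False
            self._raise_started = None
        elif not self.engaged:
            if self._raise_started is None:
                self._raise_started = frame.timestamp
            elif frame.timestamp - self._raise_started >= self.DWELL_SECONDS:
                self.engaged = True
        return self.engaged
```

Only while update() returns True would pan and zoom gestures be interpreted; everything else is treated as ordinary movement around the table.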

“The most intriguing aspect of this project,” Sellen says, “is the potential to make a real impact on patient care and clinical outcome by reducing the time it takes to do complicated procedures and giving surgeons more control of the data they depend on. From a technical side, it is exciting to see where technologies like Kinect can realize their value outside of the gaming domain.”

Use Both Hands

Touch interfaces are great for impromptu casual interactions, but it is not easy to select a point precisely with your finger or to move an image without rotating it unless there are on-screen menus or handles. In the world of touch, though, such options are not desirable, because they introduce clutter. Rock & Rails: Extending Multi-touch Interactions with Shape Gestures to Enable Precise Spatial Manipulations, by Wigdor; Hrvoje Benko of Microsoft Research Redmond; and John Pella, Jarrod Lombardo, and Sarah Williams of Microsoft, proposes a solution by using recognized hand poses on the surface in combination with touch.

“Rock and Rails” is an extension of the touch-interaction vocabulary. It maintains the direct-touch input paradigm but enables users to make fluid, high-degree-of-freedom manipulations while simultaneously providing easy mechanisms to increase precision, specify manipulation constraints, and avoid occlusions. The tool set provides mechanisms for positioning, isolating orientation, and scaling operations using system-recognized hand postures, while enabling traditional, simple, direct-touch manipulations.
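
The core mechanism named here, a posture from one hand constraining the manipulation performed by the other, can be pictured as a filter over an ordinary two-finger transform: compute the translation, rotation, and scale implied by the moving touches, then zero out the degrees of freedom the posture locks. The posture names and the constraint mapping below are assumptions for illustration, not the paper’s implementation.

```python
# Illustrative sketch: a recognized hand posture from the non-dominant hand
# restricts which degrees of freedom the other hand's two-finger manipulation
# may change. Posture names and the mapping are hypothetical, not Rock & Rails.
import math

def _vec(a, b):
    return (b[0] - a[0], b[1] - a[1])

def two_touch_transform(p0_old, p1_old, p0_new, p1_new):
    """Translation, rotation (radians), and uniform scale implied by two moving touches."""
    old_v, new_v = _vec(p0_old, p1_old), _vec(p0_new, p1_new)
    rotation = math.atan2(new_v[1], new_v[0]) - math.atan2(old_v[1], old_v[0])
    scale = math.hypot(*new_v) / max(math.hypot(*old_v), 1e-6)
    translation = _vec(p0_old, p0_new)
    return translation, rotation, scale

def apply_posture_constraint(posture, translation, rotation, scale):
    """Keep only the degrees of freedom the recognized posture permits.

    Returns (translation, rotation, (scale_x, scale_y)).
    """
    sx = sy = scale
    if posture == "rotate_only":        # hypothetical posture: isolate rotation
        translation, sx, sy = (0.0, 0.0), 1.0, 1.0
    elif posture == "resize_only":      # hypothetical posture: isolate scaling
        translation, rotation = (0.0, 0.0), 0.0
    elif posture == "scale_x_only":     # hypothetical posture: 1-D scale along x
        translation, rotation, sy = (0.0, 0.0), 0.0, 1.0
    # posture is None: unconstrained direct manipulation
    return translation, rotation, (sx, sy)
```

In use, the non-dominant hand’s contact shape would select the posture, the dominant hand’s two touches would produce the raw transform, and the filtered result would be applied to the selected object, mirroring the rotate, resize, and 1-D scale constraints shown in the figure below.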

\"Rock

The Rock & Rails paper augments a) traditional direct-manipulation gestures with independently recognized hand postures used to restrict manipulations conducted with the other hand: b) rotate, c) resize, and d) 1-D scale. This enables fluid selection of degrees of freedom and, thus, rapid, high-precision manipulation of on-screen content.<\/p><\/div>\n

The project was a collaborative effort between Microsoft Research and the Microsoft Surface team, so the researchers were able to test their work on real-world designers, the intended audience.

“One of the best moments of the project,” Benko recalls, “was when we realized our gestures could be made ‘persistent’ on the screen. We had transitioned from the model where you had to keep the pose of the hand in order to signal a particular option, to a more relaxed mode where the user could ‘create’ or ‘pin’ a proxy representation of a gesture. This allows users to perform all sorts of wacky combinations of operations without needing to hold the gesture for a long period of time.”

These are just a few of Microsoft Research’s current investigations into how to enhance the ways people can interact with computing devices.

“HCI is all about discovering and inventing technologies that deeply transform people’s lives,” Tan concludes. “Microsoft Research is committed to advancing the state of the art in human-computer interaction.”
