{"id":305885,"date":"2011-05-09T08:45:12","date_gmt":"2011-05-09T15:45:12","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?p=305885"},"modified":"2016-10-15T13:23:01","modified_gmt":"2016-10-15T20:23:01","slug":"chi-11-enhancing-human-condition","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/chi-11-enhancing-human-condition\/","title":{"rendered":"CHI \u201911: Enhancing the Human Condition"},"content":{"rendered":"
By Janie Chang, Writer, Microsoft Research<\/em><\/p>\n The Association for Computing Machinery\u2019s Conference on Human Factors in Computing Systems<\/a> (CHI 2011), being held May 7-12 in Vancouver, British Columbia, provides a showcase of the latest advances in human-computer interaction (HCI).<\/p>\n \u201cThe ongoing challenge,\u201d says Desney S. Tan<\/a>, CHI 2011 general conference chair and senior researcher at Microsoft Research Redmond<\/a>, \u201cis to make computing more accessible by integrating technology seamlessly into our everyday tasks, to understand and enhance the human condition like never before.\u201d<\/p>\n Microsoft Research has a consistent record of support for CHI through sponsorships and research contributions. This year, Microsoft researchers authored or co-authored 40 conference papers and notes, approximately 10 percent of the total accepted.<\/p>\n This comes as no surprise to Tan.<\/p>\n \u201cMicrosoft Research\u2019s goal,\u201d he says, \u201cis to further the state of the art in computer science and technology. As the realms of human and technology become more and more intertwined, Microsoft Research has focused more and more of our effort at the intersection of human and computer, and this is evident from our researchers\u2019 level of participation.\u201d<\/p>\n One unusual contribution comes from Bill Buxton<\/a>, Microsoft Research principal researcher. Items from Buxton\u2019s impressive accumulation of interactive devices are on display in an exhibit titled \u201cThe Future Revealed in the Past: Selections from Bill Buxton\u2019s Collection of Interactive Devices.\u201d<\/p>\n Effects of Community Size and Contact Rate in Synchronous Social Q&A<\/em><\/a>, by Ryen White<\/a> and Matthew Richardson<\/a> of Microsoft Research Redmond and Yandong Liu of Carnegie Mellon University, received one of 13 best-paper awards during the conference, as did Your Noise is My Command: Sensing Gestures Using the Body as an Antenna<\/em><\/a> by former Microsoft Research intern Gabe Cohn and visiting faculty member Shwetak Patel, both from the University of Washington, along with Dan Morris<\/a> and Tan of Microsoft Research Redmond. One of two best-notes awards went to Interactive Generator: A Self-Powered Haptic Feedback Device<\/em><\/a>, co-authored by Akash Badshah, of the Phillips Exeter Academy, a residential high school in Exeter, N.H.; Sidhant Gupta, Cohn, and Patel of the University of Washington; and Nicolas Villar<\/a> and Steve Hodges<\/a> of Microsoft Research Cambridge<\/a>.<\/p>\n Imagine being freed of physical attachments to input devices because your body is<\/em> the input device. One approach is to put sensors on the body. The challenge then is to separate actual \u201csignal\u201d from \u201cnoise,\u201d such as ambient electromagnetic interference, which overwhelms sensors and makes signal processing difficult. In Your Noise is My Command: Sensing Gestures Using the Body as an Antenna<\/em>, the researchers turned the problem on its head.<\/p>\n \u201cCan we use that electrical noise as a source of information about where a user is and what that user is doing?\u201d Morris recalls asking. \u201cThese are the first experiments to assess whether this is feasible.\u201d<\/p>\nThe Touch-Sensitive Home<\/h2>\n