{"id":380660,"date":"2017-05-04T09:00:17","date_gmt":"2017-05-04T16:00:17","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?p=380660"},"modified":"2017-05-08T17:41:13","modified_gmt":"2017-05-09T00:41:13","slug":"chi-1-interacting-touch-virtually-physically","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/chi-1-interacting-touch-virtually-physically\/","title":{"rendered":"CHI Part 1: Interacting through touch, virtually and physically"},"content":{"rendered":"

A preview of the ACM CHI Conference on Human Factors in Computing Systems

By Justin Cranshaw, Researcher, Microsoft Research

\"CHI<\/p>\n

Human-computer interaction (HCI) research at Microsoft provides a human-centered perspective on the future of computing, reshaping the roles computers play in our lives and propelling society forward through the tools, systems, and methods we investigate and invent. Microsoft Research’s HCI team has expertise in a wide range of domain specialties and disciplines, providing guidance to much of the work we do at Microsoft, in both product and research.

One of the premier international venues for sharing academic work in HCI is the ACM CHI Conference on Human Factors in Computing Systems (or CHI, as it’s more commonly called). This year’s conference will be held May 6–11 in Denver, where 73 of us at Microsoft contributed to 37 papers, with 8 papers receiving awards. In this post and the next, I’ll share some of the major themes I see in the work my Microsoft colleagues are presenting at CHI this year.

For decades, “touch” has been a central focus of HCI research into more natural methods of computational input. This year, CHI features some of the latest Microsoft advances in touch-related research, including work that pairs multi-touch sensing with digital pen input to mirror how people naturally use their hands. For example, in the physical world, when people hold (“touch”) a document, they will often frame a particular region of interest between the thumb and forefinger of their non-preferred hand, while they mark up the page with a pen in their preferred hand.

One paper, As We May Ink? Learning from Everyday Analog Pen Use to Improve Digital Ink Experiences, by Yann Riche, Nathalie Henry Riche, Ken Hinckley, Sarah Fuelling, Sarah Williams, and Sheri Panabaker, analyzes how people use freeform ink, on both “analog” paper and “digital” computer screens, in their everyday lives. The study revealed a key insight: ink is laden with many latent expressive attributes (such as the placement, size, and orientation of notations) that far exceed any textual transcription of the handwriting alone. In this and many other ways, ink uniquely affords the capture of fleeting thoughts, weaving together symbolic and figurative content in a way that one could never achieve with a keyboard.

A second paper, WritLarge: Ink Unleashed by Unified Scope, Action, & Zoom, by Haijun Xia, Ken Hinckley, Michel Pahud, Xiao Tu, and William A.S. Buxton, which earned an Honorable Mention award, introduces a system that puts these fundamental insights into action (on the mammoth 84-inch Microsoft Surface Hub, no less) to unleash the latent expressive power of ink. Using multi-touch, the user can simply frame a portion of their “whiteboard” session between thumb and forefinger, and then act on that selection (copying, sharing, organizing, or otherwise transforming the content) using the pen wielded by the opposite hand.
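To make that division of labor concrete, here is a minimal sketch in Python of the general “frame with touch, act with pen” pattern described above: two touch points from the non-preferred hand frame a region, and a pen action from the preferred hand then applies to whatever ink falls inside it. The event model, function names, and stroke representation are hypothetical, not the paper’s implementation.

```python
# Hypothetical sketch of a "frame with touch, act with pen" pattern.
# Two touch points define a selection rectangle; pen actions apply to it.

def framing_rect(touch_points):
    """Return the (x0, y0, x1, y1) rectangle framed by exactly two touches."""
    if len(touch_points) != 2:
        return None
    (ax, ay), (bx, by) = touch_points
    return (min(ax, bx), min(ay, by), max(ax, bx), max(ay, by))

def strokes_in_rect(rect, ink_strokes):
    """Select ink strokes whose points all fall inside the framed region."""
    x0, y0, x1, y1 = rect
    return [s for s in ink_strokes
            if all(x0 <= x <= x1 and y0 <= y <= y1 for x, y in s)]

def on_pen_action(action, touch_points, ink_strokes):
    """Apply a pen-triggered action (copy, share, organize, ...) to framed ink."""
    rect = framing_rect(touch_points)
    if rect is None:
        return            # no framing gesture in progress; the pen just inks
    selection = strokes_in_rect(rect, ink_strokes)
    action(selection)     # e.g., copy_to_clipboard(selection)
```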

A third paper, Thumb + Pen Interaction on Tablets, by Ken Pfeuffer, Ken Hinckley, Michel Pahud, and William A.S. Buxton, brings this notion of “pen + touch” interaction to scenarios where the device sits in the lap, such as when using a Surface tablet on the couch, where the non-preferred hand must often hold the device itself. In this case, the thumb remains available and sufficiently mobile to manipulate many controls, enabling a whole new space of “thumb + pen” interactions, such as letting the user readily shift the pen between annotation and cell-selection functions on an Excel spreadsheet.
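As a rough illustration of the idea (and not the paper’s actual implementation), the sketch below assumes a hypothetical spreadsheet object and shows how a thumb resting on an on-screen control could switch what the pen does on contact.

```python
# Hypothetical sketch of thumb + pen mode switching on a hand-held tablet.
# The thumb of the hand holding the device picks the pen's function;
# the pen in the preferred hand then acts in that mode.

class ThumbPenController:
    MODES = {"annotate", "select_cells"}

    def __init__(self):
        self.mode = "annotate"                # default: pen lays down ink

    def on_thumb_control(self, control_id):
        """Thumb taps or dwells on a control near the held edge of the screen."""
        if control_id in self.MODES:
            self.mode = control_id

    def on_pen_down(self, x, y, spreadsheet):
        """Route the pen stroke according to the thumb-selected mode."""
        if self.mode == "annotate":
            spreadsheet.start_ink_stroke(x, y)       # hypothetical API
        else:
            spreadsheet.begin_cell_selection(x, y)   # hypothetical API
```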

HCI research on “touch” is not just about enabling new ways to interact with the virtual world through physical devices. Simulating the physicality of touch is also an important dimension for driving natural interactions in virtual worlds. Since the beginning of virtual reality (VR) development, the goal has been to immerse users in a virtual world that feels as real to them as the physical one. While computer graphics, VR display headsets, and 3D audio rendering have progressed significantly over the last decade, our ability to simulate physicality in the virtual world is still in its infancy.

To help crack this problem, a team of researchers at Microsoft Research (Eyal Ofek, Christian Holz, Hrvoje Benko, and Andy Wilson, along with Lung-Pan Cheng, a visiting intern from HPI, Germany) has proposed a solution that can help us feel our environment while in VR. The authors built a system that tricks your senses by redirecting your hands, so that whenever you see yourself touching a virtual object, your hands are actually touching physical geometry. Such a technology has the potential to make virtual worlds come alive in a tactile way.
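A common way to realize this kind of redirection, often called hand redirection or body warping, is to blend a small offset into the rendered hand position as the hand approaches a physical prop, so the virtual hand lands on the virtual object exactly when the real hand lands on the prop. The sketch below is a minimal illustration of that general idea, not the authors’ system; the vector math and coordinate conventions are assumptions.

```python
import numpy as np

def redirected_hand_position(real_hand, start, physical_target, virtual_target):
    """Warp the rendered hand so it reaches the virtual target exactly when
    the real hand reaches the physical prop.

    The offset between the two targets is blended in gradually along the
    reach, which keeps the per-frame distortion small enough to go unnoticed.
    All positions are 3D numpy arrays in the same coordinate frame.
    """
    total = np.linalg.norm(physical_target - start)
    progress = 1.0 if total == 0 else min(
        np.linalg.norm(real_hand - start) / total, 1.0)
    offset = virtual_target - physical_target
    return real_hand + progress * offset
```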