{"id":744157,"date":"2021-05-10T13:02:32","date_gmt":"2021-05-10T20:02:32","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?p=744157"},"modified":"2021-05-12T08:19:54","modified_gmt":"2021-05-12T15:19:54","slug":"research-collection-hands-on-research-and-prototyping-for-haptics","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/research-collection-hands-on-research-and-prototyping-for-haptics\/","title":{"rendered":"Hands-on research and prototyping for haptics"},"content":{"rendered":"\n

While many of us think of human-computer interaction as a job for the eyes, ears, and mind, we rarely consider the importance and complexity of our tactile interactions with computers. Haptics, the sense of touch or tactile sensation, permeates our computing experience, though we are often so habituated to it that it goes unnoticed. Consider the resistance and key travel that moderate your typing speed and confirm that your key presses have registered, the palpable "click" on a mouse or touchpad that confirms an action, the patterns of vibration in your pocket or on your wrist that signal a notification or incoming call, or the subtle vibration some mobile phones produce as you type on a touchscreen keyboard.

These forms of feedback confirm our interactions with computers, but haptics also provides "feed-forward" that orients us to computer interfaces: the ridges that help your fingers find the keyboard's home row without looking, the shapes and orientations of the buttons on a game controller, or the camera bulge and physical buttons that tell you which way is "up" when you fumble for your mobile device in the dark.

The addition of haptics also blurs the distinction between input and output devices. The most familiar example is the touch screen, which lets us more directly "grasp" and manipulate on-screen objects. The rapid proliferation of touch screens also shows how quickly we acclimate to new tactile interfaces: some young children, having used tablets from a very early age, assume that any screen (or even a printed magazine) is touch-capable, and are disappointed or frustrated when it isn't.

One of the earliest explorations of haptics in human-computer interaction was a rotary knob implemented in 1973 that functioned as both an input and an output device: it could change its physical characteristics depending on what it was controlling at the time. For instance, it could vary its resistance from free-wheeling to locked, or "snap" back to an initial position when turned. The device was introduced alongside one of the first implementations of a capacitive touch screen, as a proposed means of controlling the CERN particle accelerator.
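To make the idea concrete, here is a minimal sketch, in Python, of how such a dual input/output knob might be modeled in software. Everything in it (the mode names, the stiffness and damping values, the resisting_torque function) is a hypothetical illustration rather than the actual 1973 implementation: a control loop reads the knob's angle and velocity and commands a motor torque that depends on the active haptic mode, producing the free-wheeling, locked, and snap-back behaviors described above.

```python
from enum import Enum


class KnobMode(Enum):
    FREE = "free"        # free-wheeling: no resistance at all
    LOCKED = "locked"    # stiff resistance that holds the knob in place
    SNAP_BACK = "snap"   # spring force that returns the knob to a home angle


def resisting_torque(mode, angle, velocity, home_angle=0.0,
                     lock_stiffness=50.0, spring_stiffness=5.0, damping=0.2):
    """Torque the actuator should apply (sign opposes displacement and motion).

    Illustrative only: a real force-feedback knob would run a loop like this
    at high frequency on its motor controller.
    """
    if mode is KnobMode.FREE:
        return 0.0
    if mode is KnobMode.LOCKED:
        # Very stiff virtual spring plus damping, so the knob feels rigid.
        return -lock_stiffness * (angle - home_angle) - damping * velocity
    if mode is KnobMode.SNAP_BACK:
        # Gentler spring that pulls the knob back toward its home position.
        return -spring_stiffness * (angle - home_angle) - damping * velocity
    return 0.0


# Example: knob displaced 0.3 rad from home, still moving away at 1 rad/s.
print(resisting_torque(KnobMode.SNAP_BACK, angle=0.3, velocity=1.0))
```

Changing the stiffness and damping terms at run time is what would let the same physical knob feel like a different control for each function it drives.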

While haptics in computing could someday encompass the entirety of tactile sensations, particularly in virtual reality (from vibrations and textures you feel in your feet, to sensations of heat, cold, or a light breeze on your arm), a great deal of research so far has focused on the hands. Our hands are the tactile interface for most of the computing technology we use (as well as for the physical tools we have used for thousands of years), and their importance to our survival is reflected in our physiology: 54 of the human body's 206 bones are in our hands.

In virtual environments, further developments in haptics will enable a more realistic physical experience. The illusion created by current VR experiences "breaks" when the user interacts with a virtual object that offers only visual or auditory signals of its apparent weight, shape, or texture. Future haptic devices could simulate the heft of a rock in a virtual forest and the sense of momentum when it is thrown or caught, the roughness of a stone wall, the roundness and pliability of a rubber ball, or the recoil of a weapon fired in a game.

Microsoft researchers have been exploring haptics for many years, and their work has generally covered three overlapping areas: the psychology of perception (and the illusions we can apply to manipulate it), the development of prototype haptic input and output devices, and user studies on the effectiveness of haptic techniques. This research could be applied to compensate for or augment other senses (such as through the CaneController for people with visual impairments), or to create a greater sense of immersion and agency in VR and AR environments.

Explore more