{"id":327749,"date":"2016-11-28T09:34:39","date_gmt":"2016-11-28T17:34:39","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&p=327749"},"modified":"2021-05-09T12:02:32","modified_gmt":"2021-05-09T19:02:32","slug":"muscle-computer-interfaces-mucis","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/muscle-computer-interfaces-mucis\/","title":{"rendered":"Muscle-Computer Interfaces (muCIs)"},"content":{"rendered":"

<p>Many human-computer interaction technologies are currently mediated by physical transducers such as mice, keyboards, pens, dials, and touch-sensitive surfaces. While these transducers have enabled powerful interaction paradigms and leverage our human expertise in interacting with physical objects, they tether computation to a physical artifact that must be within reach of the user.<\/p>\n

<p>As computing and displays integrate more seamlessly into our environment and are used in situations where the user is not always focused on the computing task, it is important to consider mechanisms for acquiring human input that do not require direct manipulation of a physical implement. We explore the feasibility of muscle-computer input: an interaction methodology that directly senses and decodes human muscular activity rather than relying on physical device actuation or on user actions that are externally visible or audible.<\/p>\n
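To make the decoding step concrete, here is a deliberately minimal sketch of the idea: windowed electromyography (EMG) samples are reduced to an energy feature and mapped to a gesture label. This is an illustrative assumption only, not the project's actual pipeline; the real system used multi-channel forearm sensing and learned classifiers, and the threshold below is a hypothetical placeholder.

```python
import math
import random

def rms(window):
    """Root-mean-square amplitude of one window of EMG samples."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def classify(window, threshold=0.5):
    """Label a window as 'contraction' or 'rest' by its RMS energy.

    The fixed threshold is a stand-in for illustration; a practical
    muscle-computer interface would learn per-user, per-channel
    decision boundaries from calibration data.
    """
    return "contraction" if rms(window) > threshold else "rest"

# Synthetic stand-ins for band-passed EMG samples (not real sensor data):
# rest is low-amplitude noise, a contraction is high-amplitude activity.
random.seed(0)
rest_window = [random.gauss(0.0, 0.05) for _ in range(256)]
active_window = [random.gauss(0.0, 0.8) for _ in range(256)]

print(classify(rest_window))    # low RMS energy
print(classify(active_window))  # high RMS energy
```

Even this toy version shows the contrast with transducer-based input: the signal originates in the user's own muscle activity rather than in the actuation of an external device.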

<h2>Video<\/h2>\n

<p>https:\/\/www.youtube.com\/watch?v=pktVSTwC8qo<\/p>\n

<p>https:\/\/www.youtube.com\/watch?v=0phjl804onU<\/p>\n