{"id":171348,"date":"2014-04-26T08:23:42","date_gmt":"2014-04-26T08:23:42","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/project\/type-hover-swipe-in-96-bytes-a-motion-sensing-mechanical-keyboard\/"},"modified":"2017-06-20T11:02:39","modified_gmt":"2017-06-20T18:02:39","slug":"type-hover-swipe-in-96-bytes-a-motion-sensing-mechanical-keyboard","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/type-hover-swipe-in-96-bytes-a-motion-sensing-mechanical-keyboard\/","title":{"rendered":"Type\u2013Hover\u2013Swipe in 96 Bytes: A Motion Sensing Mechanical Keyboard"},"content":{"rendered":"

We present a new type of augmented mechanical keyboard that senses rich and expressive motion gestures performed both on and directly above the device. A low-resolution matrix of infrared (IR) proximity sensors is interspersed with the keys of a regular mechanical keyboard, yielding coarse but high frame-rate motion data. We extend a machine learning algorithm, traditionally used for static classification only, to robustly support dynamic, temporal gestures. We propose the use of motion signatures, a technique that uses pairs of motion history images and a random forest classifier to robustly recognize a large set of motion gestures. Our technique achieves a mean per-frame classification accuracy of 75.6% in leave\u2013one\u2013subject\u2013out and 89.9% in half-test\/half-training cross-validation. We detail the hardware and the gesture recognition algorithm, provide accuracy results, and demonstrate a large set of gestures designed to be performed with the device. We conclude with qualitative feedback from users and a discussion of limitations and areas for future work.<\/p>\n