{"id":353711,"date":"2020-02-22T16:14:14","date_gmt":"2017-01-17T02:27:49","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-research-item&p=353711"},"modified":"2020-02-22T16:14:15","modified_gmt":"2020-02-23T00:14:15","slug":"smartphone-based-gaze-gesture-communication-people-motor-disabilities-2","status":"publish","type":"msr-research-item","link":"https:\/\/www.microsoft.com\/en-us\/research\/publication\/smartphone-based-gaze-gesture-communication-people-motor-disabilities-2\/","title":{"rendered":"Smartphone-Based Gaze Gesture Communication for People with Motor Disabilities"},"content":{"rendered":"

Current eye-tracking input systems for people with ALS or other motor impairments are expensive, not robust under sunlight, and require frequent re-calibration and substantial, relatively immobile setups. Eye-gaze transfer (e-tran) boards, a low-tech alternative, are challenging to master and offer slow communication rates. To mitigate the drawbacks of these two status quo approaches, we created an eye gesture communication system that runs on a smartphone and is designed to be low-cost, robust, portable, and easy to learn, with higher communication bandwidth than an e-tran board. Our system interprets eye gestures in real time, decodes these gestures into predicted utterances, and facilitates communication, with different user interfaces for speakers and interpreters. Our evaluations demonstrate that the system is robust, achieves good user satisfaction, and is faster than an e-tran board; we also identify avenues for further improvement to low-cost, low-effort gaze-based communication technologies.
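The abstract does not specify how gestures are decoded into predicted utterances, so as a rough illustration of the general idea, the sketch below shows one common way such an ambiguous gesture-to-text decoder can work: each directional gaze gesture selects a letter group (in the spirit of an e-tran board), and a candidate word list disambiguates the sequence by frequency. The gesture vocabulary, letter grouping, and vocabulary counts here are all assumptions, not the paper's actual method.

```python
# Hypothetical sketch: decoding a sequence of directional gaze gestures
# into candidate words. All mappings and counts below are illustrative
# assumptions; the paper's real gesture set and decoder are not given here.

from collections import defaultdict

# Assumed mapping: each of four gaze directions selects a letter group.
GESTURE_GROUPS = {
    "up": "abcdefg",
    "right": "hijklm",
    "down": "nopqrst",
    "left": "uvwxyz",
}

# Tiny stand-in vocabulary with made-up unigram counts for ranking.
VOCAB = {"hello": 120, "help": 300, "water": 250, "yes": 500, "no": 480}

# Invert the group mapping: letter -> gesture that selects it.
LETTER_TO_GESTURE = {
    ch: gesture for gesture, letters in GESTURE_GROUPS.items() for ch in letters
}

def encode(word):
    """Gesture sequence a speaker would produce to spell a word."""
    return tuple(LETTER_TO_GESTURE[ch] for ch in word)

# Index the vocabulary by gesture sequence so decoding is lookup + ranking.
INDEX = defaultdict(list)
for word, count in VOCAB.items():
    INDEX[encode(word)].append((word, count))

def decode(gestures):
    """Return candidate utterances for a gesture sequence, most frequent first."""
    candidates = INDEX.get(tuple(gestures), [])
    return [word for word, _ in sorted(candidates, key=lambda wc: -wc[1])]

if __name__ == "__main__":
    # "help": h -> right, e -> up, l -> right, p -> down
    print(decode(["right", "up", "right", "down"]))  # ['help']
```

Because each gesture is ambiguous over several letters, a decoder like this returns a ranked list of candidates rather than a single word, which is consistent with the abstract's framing of "predicted utterances" that an interpreter or the speaker can confirm.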