Gaze & Tongue: A Subtle, Hands-Free Interaction for Head-Worn Devices
- Tan Gemicioglu
- R. Michael Winters
- Yu-Te Wang
- Thom Gable
- Ann Paradiso
- Ivan Tashev
CHI EA '23: Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems | Published by ACM
Finalist - Best Interactivity Demo
Gaze tracking allows hands-free and voice-free interaction with computers, and has recently gained wider use in virtual and augmented reality headsets. However, it traditionally relies on dwell time for selection tasks, which suffers from the Midas Touch problem. Tongue gestures are subtle, accessible, and can be sensed non-intrusively using an IMU at the back of the ear, PPG, and EEG. We demonstrate a novel interaction method combining gaze tracking with tongue gestures, enabling gaze-based selection that is faster than dwell time and supports multiple selection options. We showcase its use as a point-and-click interface in three hands-free games and a musical instrument.
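The core idea of the interaction can be illustrated with a minimal sketch: gaze provides the pointer, and a detected tongue gesture acts as the click, replacing a dwell-time timer. The sketch below is purely illustrative and assumes hypothetical inputs (timestamped gaze samples, tongue-gesture event timestamps, and circular screen targets); it is not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class Target:
    # Hypothetical circular on-screen target for illustration only.
    name: str
    x: float
    y: float
    radius: float

    def contains(self, gx: float, gy: float) -> bool:
        return (gx - self.x) ** 2 + (gy - self.y) ** 2 <= self.radius ** 2

def select_on_tongue_gesture(gaze_samples, tongue_events, targets):
    """Gaze points, tongue gesture clicks: for each tongue-gesture
    timestamp, select the target under the most recent gaze sample.
    gaze_samples: list of (t, x, y); tongue_events: list of timestamps.
    """
    selections = []
    for te in sorted(tongue_events):
        # Latest gaze sample at or before the tongue event.
        prior = [s for s in gaze_samples if s[0] <= te]
        if not prior:
            continue  # No gaze data yet; ignore the gesture.
        _, gx, gy = prior[-1]
        for tgt in targets:
            if tgt.contains(gx, gy):
                selections.append(tgt.name)
                break
    return selections
```

Unlike dwell-time selection, nothing is selected merely by looking, which avoids the Midas Touch problem: a selection requires an explicit tongue event.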