{"id":727519,"date":"2021-03-01T00:01:55","date_gmt":"2021-03-01T08:01:55","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&p=727519"},"modified":"2022-02-04T13:57:59","modified_gmt":"2022-02-04T21:57:59","slug":"project-deepeyes","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/project-deepeyes\/","title":{"rendered":"Project DeepEyes"},"content":{"rendered":"
\n\t
\n\t\t
\n\t\t\t\"Project\t\t<\/div>\n\t\t\n\t\t
\n\t\t\t\n\t\t\t
\n\t\t\t\t\n\t\t\t\t
\n\t\t\t\t\t\n\t\t\t\t\t
\n\t\t\t\t\t\t
\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\n

Project DeepEyes
\"Project
A typical gaze tracking use case<\/figcaption><\/figure><\/div>\n\n\n\n

Gaze-tracking is a novel way of interacting with computers that enables new scenarios, such as allowing people with motor-neuron disabilities to control their computers, or doctors to interact with patient information without touching a screen or keyboard. There are also emerging applications of gaze-tracking in interactive gaming, user experience research, human attention analysis and behavioral studies. Accurate gaze estimation may involve accounting for head pose, head position, eye rotation and distance from the object, as well as operating conditions such as illumination, occlusion, background noise and various biological characteristics of the user. Conventional gaze-tracking works by using a specialized assembly of an infrared light source and an infrared camera that highlights the pupils and measures the rotation of the eyeball relative to the pupil center. The current generation of this technology is fraught with issues, from sunlight interference to inconsistency across the diversity of eye shapes and colors. There are also several challenges to the universal proliferation of gaze-tracking as an accessibility technology, specifically its affordability, reliability, interoperability and ease of use.
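To make the conventional hardware approach concrete, the sketch below illustrates the classic pupil-centre/corneal-reflection (PCCR) idea behind IR gaze trackers: the vector between the pupil center and the infrared glint is mapped to a screen point through a per-user calibrated polynomial. This is an illustrative assumption-laden example, not code from Project DeepEyes; the function name and the second-order mapping are chosen for illustration only.

```python
import numpy as np

def pccr_gaze(pupil_center, glint_center, coeffs):
    """Illustrative PCCR mapping (assumed, not the project's method):
    map the pupil-to-glint vector to a screen point with a calibrated
    second-order polynomial, as many IR trackers do."""
    dx = pupil_center[0] - glint_center[0]
    dy = pupil_center[1] - glint_center[1]
    features = np.array([1.0, dx, dy, dx * dy, dx**2, dy**2])
    # coeffs is a (2, 6) matrix fitted during a per-user on-screen calibration
    return coeffs @ features  # -> (screen_x, screen_y)
```

The need for IR hardware and per-user calibration in this scheme is exactly what makes it costly and fragile in sunlight or across diverse eyes, which motivates the camera-only approach described next.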

We believe this is about to change. Machine learning using Artificial Neural Networks is showing promise at processing images from standard 'selfie cameras' without needing IR filters or emitters. Project DeepEyes aims to develop no-cost, high-quality eye tracking for every person with motor-neuron disabilities. Here in the Enable Group, we are applying advanced Artificial Intelligence techniques to develop state-of-the-art deep neural networks, machine learning algorithms and inclusive designs toward hardware-agnostic gaze-trackers as an accessibility technology. Specifically, we are investigating the use of the front-facing camera already present in most computing devices to predict where a person is looking on a screen, and using this signal to interact with the device. We are trying to solve the aforementioned challenges in affordability, reliability, interoperability and ease of use through our ongoing research and development of a hardware-agnostic gaze-tracker. Our approach uses a novel deep neural network architecture as an appearance-based method for constrained gaze-tracking, working from facial imagery captured by the ordinary RGB cameras ubiquitous in modern computing devices.
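As a rough illustration of what an appearance-based method looks like in practice, the following is a minimal sketch, assuming PyTorch, of a small convolutional network that regresses a 2-D point of regard on the screen from an RGB face crop. The architecture, input size and layer choices are assumptions for exposition and do not describe Project DeepEyes' actual model.

```python
import torch
import torch.nn as nn

class GazeNet(nn.Module):
    """Toy appearance-based gaze regressor: RGB face crop -> (x, y) on screen."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # global pooling over the face crop
        )
        self.head = nn.Linear(128, 2)          # normalized screen coordinates

    def forward(self, face_crop):              # face_crop: (N, 3, 112, 112)
        return self.head(self.features(face_crop).flatten(1))

# Training would minimize an L1/L2 loss between predicted and ground-truth gaze
# points collected while the user looks at known targets on the screen.
```

Because such a network learns directly from camera appearance, it needs no IR emitters or filters, which is what makes a hardware-agnostic, no-cost gaze-tracker plausible.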

Existing computer vision and machine learning datasets do not represent the diversity of people with various motor-neuron disorders and diseases. This leads to accuracy problems when handling breathing masks, ptosis (droopy eyelids), epiphora (watery eyes), and dry eyes caused by medications that control sialorrhea (excessive saliva). To tackle this 'data-desert' challenge, we are working with our partners across the world to prepare inclusive datasets on which AI models can be trained.

Our research shows promise that one day soon any computer, tablet, or phone will be controllable using just your eyes, thanks to the predictive capabilities of deep neural networks.

Vision