{"id":1001,"date":"2020-04-02T15:00:00","date_gmt":"2020-04-02T22:00:00","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/education\/blog\/2020\/04\/02\/student-develops-visual-accommodations-with-microsoft-azure-kinect\/"},"modified":"2025-04-08T08:41:36","modified_gmt":"2025-04-08T15:41:36","slug":"student-develops-visual-accommodations-with-microsoft-azure-kinect","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/education\/blog\/2020\/04\/student-develops-visual-accommodations-with-microsoft-azure-kinect\/","title":{"rendered":"Student develops visual accommodations with Microsoft Azure Kinect"},"content":{"rendered":"
The world is built for people with sight. Most people rely on their vision to help them throughout the day, from cooking to reading to walking around. However, for people who have lost their sight or whose vision has become impaired, everyday tasks are harder. People with visual impairments must rely on other senses to get around, and they face many obstacles people with full sight don\u2019t have to worry about. Recently, I was able to use my passion for computer science (CS) to explore and address some of the issues that people with visual impairments might face.<\/p>\n
I began learning about computer science in ninth grade when I entered the high school program at my school, Porter-Gaud, an independent school in Charleston, South Carolina. There, Mr. Bergman, the CS teacher, taught us Python, which we used to create games that addressed issues we cared about. Then, in tenth grade, Mr. Renton, another CS teacher, introduced us to virtual reality (VR) and helped us create VR games, using Unity and C# to develop our applications and a Windows Mixed Reality headset to run them.<\/p>\n
This year, with that computer science knowledge and Mr. Renton\u2019s guidance, I set out to address two challenges people with visual impairments face: avoiding collisions while walking around, and knowing who is nearby and what they look like.<\/p>\n
Using the Azure Kinect<\/strong><\/a> camera and C#, I developed a device that alerts the user when they walk too close to an obstacle, beeping more and more frequently as the obstacle gets nearer. Think of it as a car\u2019s backup alarm for humans. Using the Microsoft Face API<\/a><\/strong>, I also coded a program that takes a picture of the user\u2019s surroundings when a voice command is spoken or a button on the Xbox Adaptive Controller<\/a><\/strong> is pressed. The computer then reads aloud the location, age, and emotion of anyone in view, so that anyone who uses the device, regardless of the quality of their vision, can know who and what is around them. I chose the adaptive controller because its large buttons are easy for a person with a visual impairment to find and press. I also added the option to use voice commands, via the .NET SpeechRecognitionEngine class, to make the device even easier to use.<\/p>\n
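To give a feel for the proximity-alert idea, here is a minimal C# sketch that maps a depth reading to a beep rate. The method name, the 2-meter cutoff, and the beep parameters are illustrative assumptions, not the project\u2019s actual code; in the real device, the Azure Kinect depth camera supplies the distance to the nearest obstacle.<\/p>\n
<pre><code>\/\/ Hypothetical sketch, not the actual project code.\n\/\/ The pause between tones shrinks as the obstacle gets closer.\nstatic void BeepForDistance(ushort nearestMm)\n{\n    \/\/ Ignore invalid readings and anything beyond 2 m (illustrative cutoff).\n    if (nearestMm == 0 || nearestMm &gt; 2000) return;\n    Console.Beep(1000, 100);                      \/\/ 1 kHz tone for 100 ms\n    System.Threading.Thread.Sleep(nearestMm \/ 4); \/\/ 2000 mm -&gt; 500 ms pause\n}<\/code><\/pre>\n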