{"id":472518,"date":"2018-03-28T07:22:53","date_gmt":"2018-03-28T14:22:53","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?p=472518"},"modified":"2018-05-23T10:33:55","modified_gmt":"2018-05-23T17:33:55","slug":"when-psychology-meets-technology-with-dr-daniel-mcduff","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/podcast\/when-psychology-meets-technology-with-dr-daniel-mcduff\/","title":{"rendered":"When Psychology Meets Technology with Dr. Daniel McDuff"},"content":{"rendered":"
\"\"<\/a>

Microsoft Researcher Dr. Daniel McDuff. Photography by Maryatt Photography.

Episode 17, March 28, 2018

One of the most intriguing areas of machine learning research is affective computing, where scientists are working to bridge the gap between human emotions and computers. It is here, at the intersection of psychology and computer science, that we find Dr. Daniel McDuff, who has been designing systems, from hardware to algorithms, that can sense human behavior and respond to human emotions.

Today, Dr. McDuff talks about why we need computers to understand us, outlines the pros and cons of designing emotionally sentient agents, explains the technology behind CardioLens, a pair of augmented reality glasses that can take your heart rate by looking at your face, and addresses the challenges of maintaining trust and privacy when we're surrounded by devices that want to know not just what we're doing, but how we're feeling.

Related: