{"id":192215,"date":"2015-05-08T00:00:00","date_gmt":"2015-05-08T17:11:23","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/msr-research-item\/human-robot-collaboration\/"},"modified":"2016-07-15T15:23:32","modified_gmt":"2016-07-15T22:23:32","slug":"human-robot-collaboration","status":"publish","type":"msr-video","link":"https:\/\/www.microsoft.com\/en-us\/research\/video\/human-robot-collaboration\/","title":{"rendered":"Human-Robot Collaboration"},"content":{"rendered":"
In order for robots to collaborate with humans, they must infer helpful actions in the physical world by observing the human’s language, gestures, and actions. A particular challenge for robots that operate under uncertainty is identifying and recovering from failures in perception, actuation, and language interpretation. I will describe our approaches to automatic failure recovery using language and probabilistic methods. First, I will describe how a robot can use a probabilistic language grounding framework to employ information-theoretic dialog strategies, asking targeted questions to reduce uncertainty about different parts of a natural language command. Second, I will show how to invert a model for interpreting language in order to generate targeted natural language requests for help from a human partner, enabling a robot team to actively solicit help from a person when it encounters problems. Third, I will describe steps toward incremental interpretation of language and gesture as an enabling technology for robots that use coordination actions to establish common ground with their human partners. This approach points the way toward more general models of human-robot collaboration: robots that build world models from both linguistic and non-linguistic input, follow complex grounded natural language commands, and engage in fluid, flexible collaboration with their human partners.<\/p>\n<\/div>\n
<\/p>\n","protected":false},"excerpt":{"rendered":"
In order for robots to collaborate with humans, they must infer helpful actions in the physical world by observing the human’s language, gesture, and actions. A particular challenge for robots that operate under uncertainty is identifying and recovering from failures in perception, actuation, and language interpretation. I will describe our approaches to automatic failure recovery […]<\/p>\n","protected":false},"featured_media":199002,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","footnotes":""},"research-area":[],"msr-video-type":[206954],"msr-locale":[268875],"msr-post-option":[],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-192215","msr-video","type-msr-video","status-publish","has-post-thumbnail","hentry","msr-video-type-microsoft-research-talks","msr-locale-en_us"],"msr_download_urls":"","msr_external_url":"https:\/\/youtu.be\/Gj22xlvGpMk","msr_secondary_video_url":"","msr_video_file":"","_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/192215"}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-video"}],"version-history":[{"count":0,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/192215\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/199002"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=192215"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=192215"},{"taxonomy":"msr-video-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video-type?post=192215"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=192215"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=192215"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=192215"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=192215"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}