{"id":190977,"date":"2014-06-11T00:00:00","date_gmt":"2014-06-11T16:33:28","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/msr-research-item\/ml-day-2014-learning-to-act-in-multiagent-sequential-environments\/"},"modified":"2016-07-15T15:17:38","modified_gmt":"2016-07-15T22:17:38","slug":"ml-day-2014-learning-to-act-in-multiagent-sequential-environments","status":"publish","type":"msr-video","link":"https:\/\/www.microsoft.com\/en-us\/research\/video\/ml-day-2014-learning-to-act-in-multiagent-sequential-environments\/","title":{"rendered":"ML Day 2014 – Learning to Act in Multiagent Sequential Environments"},"content":{"rendered":"

From routing to online auctions, many decision-making tasks for learning agents are carried out in the presence of other decision makers. I will give a brief overview of results developed in the context of adapting reinforcement-learning algorithms to work effectively in multiagent environments. Of particular interest is the idea that even simple scenarios, such as the well-known Prisoner's Dilemma, require agents to work together, bearing some individual risk, to arrive at mutually beneficial outcomes.
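As a concrete illustration of that tension, the sketch below (not from the talk; the payoff values, learning rate, and exploration schedule are illustrative assumptions) pits two independent epsilon-greedy Q-learners against each other in the repeated Prisoner's Dilemma. Left to best-respond myopically, they typically settle on mutual defection rather than the mutually beneficial cooperative outcome.

# Minimal sketch: two independent epsilon-greedy Q-learners repeatedly
# playing the Prisoner's Dilemma. Payoffs and hyperparameters below are
# assumptions chosen for illustration, not taken from the presentation.
import random

# Standard Prisoner's Dilemma payoffs as (row player, column player).
# Actions: 0 = cooperate, 1 = defect.
PAYOFFS = {
    (0, 0): (3, 3),  # mutual cooperation
    (0, 1): (0, 5),  # cooperator exploited
    (1, 0): (5, 0),  # exploiter rewarded
    (1, 1): (1, 1),  # mutual defection
}

def play(episodes=20000, alpha=0.1, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    # Stateless Q-values: each agent's running estimate of the payoff
    # for cooperating (index 0) and defecting (index 1).
    q = [[0.0, 0.0], [0.0, 0.0]]
    for _ in range(episodes):
        actions = []
        for agent in range(2):
            if rng.random() < epsilon:
                actions.append(rng.randrange(2))  # explore
            else:
                actions.append(0 if q[agent][0] >= q[agent][1] else 1)  # exploit
        rewards = PAYOFFS[tuple(actions)]
        for agent in range(2):
            a = actions[agent]
            q[agent][a] += alpha * (rewards[agent] - q[agent][a])
    return q

if __name__ == "__main__":
    for i, (q_coop, q_defect) in enumerate(play()):
        print(f"agent {i}: Q(cooperate)={q_coop:.2f}  Q(defect)={q_defect:.2f}")
    # Independent learners usually end up valuing defection more highly,
    # which is the point of the abstract: reaching the mutually beneficial
    # outcome requires accepting some individual risk rather than
    # best-responding greedily to the other agent.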

<\/p>\n","protected":false},"excerpt":{"rendered":"

Video: https://youtu.be/Rvnfvsd-Mis