{"id":629145,"date":"2020-01-02T15:07:25","date_gmt":"2020-01-02T23:07:25","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-group&p=629145"},"modified":"2022-09-02T14:43:49","modified_gmt":"2022-09-02T21:43:49","slug":"deep-learning-language-montreal","status":"publish","type":"msr-group","link":"https:\/\/www.microsoft.com\/en-us\/research\/theme\/deep-learning-language-montreal\/","title":{"rendered":"Deep Learning & Language | Montr\u00e9al"},"content":{"rendered":"
\n\t
\n\t\t
\n\t\t\t\"Theme:\t\t<\/div>\n\t\t\n\t\t
\n\t\t\t\n\t\t\t
\n\t\t\t\t\n\t\t\t\t
\n\t\t\t\t\t\n\t\t\t\t\t
\n\t\t\t\t\t\t
\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\t\t<\/span>\n\t\t\t\t\t\t\t\t\tReturn to Microsoft Research Lab \u2013 Montr\u00e9al\t\t\t\t\t\t\t\t<\/a>\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\n

Deep Learning & Language | Montr\u00e9al<\/h1>\n\n\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/div>\n<\/section>\n\n\n\n\n\n

Recent advances in Deep Learning have fostered state-of-the-art performance in increasingly complex real-world tasks. A defining property of deep learning, training end-to-end using high-dimensional data and large models<\/em>, continues to drive advances that empower people in surprising ways.<\/p>\n\n\n\n

Despite the incredible success of deep neural networks, we are far from \u201csolving\u201d AI. Deep learning-based AI systems still make glaring mistakes unlike those of humans. Deep models can be biased and unfair, and they are often fragile\u2014they do not generalize well to new environments or tasks that diverge from the training setting.<\/p>\n\n\n\n

We focus on advancing the state of the art in deep learning and beyond to answer fundamental questions on intelligence. We pursue research along the following directions:<\/p>\n\n\n\n

  • Learning representations of high-dimensional data using unsupervised or self-supervised methods, including representations with complex structure.<\/li>
  • Analyzing and mitigating biases in learned representations.<\/li>
  • Meta-learning to generalize to out-of-distribution data and tasks and to adapt models rapidly.<\/li>
  • Grounded representation learning that connects different data modalities, such as natural language and video.<\/li>
  • Learning to discover, extract, and model common sense knowledge, especially in natural language understanding.<\/li>
  • \u201cThe Agent Perspective\u201d and learning through interaction: building learning systems that perform experiments, observe their outcomes, perceive actively, encounter multiple related learning problems, and optimize objectives motivated by the environment rather than chosen arbitrarily.<\/li><\/ul>\n\n\n\n

    Our team strives to build machines that learn from and understand the world. We are a leader in using deep learning to solve complex language-understanding problems and in training machines to model reasoning and decision-making capabilities. Based in Montr\u00e9al, a global hub for AI, our team brings together experts in various machine-learning domains and collaborates deeply with the surrounding academic community.<\/p>\n\n\n","protected":false},"excerpt":{"rendered":"

    Within Machine Learning and AI, we are focused on advancing the state of the art in Deep Learning and beyond to answer questions related to modeling intelligence. For us, this means training models and agents that learn useful representations from high-dimensional data and perform useful tasks, such that their performance generalizes to new or unknown distributions and tasks.<\/p>\n","protected":false},"featured_media":627627,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"msr_group_start":"","footnotes":""},"research-area":[13556],"msr-group-type":[243688],"msr-locale":[268875],"msr-impact-theme":[],"class_list":["post-629145","msr-group","type-msr-group","status-publish","has-post-thumbnail","hentry","msr-research-area-artificial-intelligence","msr-group-type-theme","msr-locale-en_us"],"msr_group_start":"","msr_detailed_description":"","msr_further_details":"","msr_hero_images":[],"msr_research_lab":[437514],"related-researchers":[{"type":"user_nicename","display_name":"Marc-Alexandre C\u00f4t\u00e9","user_id":37197,"people_section":"Section name 0","alias":"macote"},{"type":"user_nicename","display_name":"Alessandro Sordoni","user_id":37230,"people_section":"Section name 0","alias":"alsordon"},{"type":"user_nicename","display_name":"Eric Yuan","user_id":37167,"people_section":"Section name 
0","alias":"eryua"}],"related-publications":[549276,606396,549297,606402,692049,556338,622881,758296,558498,629274,445071,785476,577473,629280,455568,854574,581509,629289,466575,584533,629295,481284,585919,629301,485619,595489,629307,493214,596701,629313,547374,598555],"related-downloads":[],"related-videos":[],"related-projects":[],"related-events":[],"related-opportunities":[],"related-posts":[],"tab-content":[],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/629145"}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-group"}],"version-history":[{"count":12,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/629145\/revisions"}],"predecessor-version":[{"id":875169,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/629145\/revisions\/875169"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/627627"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=629145"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=629145"},{"taxonomy":"msr-group-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group-type?post=629145"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=629145"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=629145"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}