{"id":372368,"date":"2017-03-20T11:42:31","date_gmt":"2017-03-20T18:42:31","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-group&p=372368"},"modified":"2025-11-20T11:25:32","modified_gmt":"2025-11-20T19:25:32","slug":"fate","status":"publish","type":"msr-group","link":"https:\/\/www.microsoft.com\/en-us\/research\/theme\/fate\/","title":{"rendered":"FATE: Fairness, Accountability, Transparency & Ethics in AI"},"content":{"rendered":"
\n\t
\n\t\t
\n\t\t\t\t\t<\/div>\n\t\t\n\t\t
\n\t\t\t\n\t\t\t
\n\t\t\t\t\n\t\t\t\t
\n\t\t\t\t\t\n\t\t\t\t\t
\n\t\t\t\t\t\t
\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\t\t<\/span>\n\t\t\t\t\t\t\t\t\tMicrosoft Research Lab \u2013 New York City\t\t\t\t\t\t\t\t<\/a>\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\n

FATE: Fairness, Accountability, Transparency, and Ethics in AI<\/h1>\n\n\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/div>\n<\/section>\n\n\n\n\n\n
\"FATE
FATE and friends<\/figcaption><\/figure>\n\n\n\n

We study the complex societal implications of artificial intelligence (AI), machine learning (ML), and natural language processing (NLP). Our aim is to facilitate computational techniques that are both innovative and responsible. We prioritize issues of fairness, accountability, transparency, and ethics as they relate to AI, ML, and NLP, drawing on fields with a sociotechnical orientation, such as HCI, information science, sociology, anthropology, science and technology studies, media studies, political science, and law.<\/p>\n\n\n\n

We work closely with other research groups within Microsoft, such as the Sociotechnical Alignment Center<\/a> and the Social Media Collective (opens in new tab)<\/span><\/a>. We publish our work<\/a> in a variety of academic and other venues.<\/p>\n\n\n\n

NEWS: <\/strong>We are hiring interns (opens in new tab)<\/span><\/a> and postdocs (opens in new tab)<\/span><\/a> to start in summer 2026!  Apply by December 15, 2025<\/strong> for full consideration.<\/p>\n\n\n","protected":false},"excerpt":{"rendered":"

We study the complex societal implications of artificial intelligence (AI), machine learning (ML), and natural language processing (NLP). Our aim is to facilitate computational techniques that are both innovative and responsible, while prioritizing issues of fairness, accountability, transparency, and ethics as they relate to AI, ML, and NLP by drawing on fields with a sociotechnical orientation, such as HCI, information science, sociology, anthropology, science and technology studies, media studies, political science, and law.<\/p>\n","protected":false},"featured_media":0,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr_group_start":"","footnotes":""},"research-area":[13556,13554,13559],"msr-group-type":[243688],"msr-locale":[268875],"msr-impact-theme":[],"class_list":["post-372368","msr-group","type-msr-group","status-publish","hentry","msr-research-area-artificial-intelligence","msr-research-area-human-computer-interaction","msr-research-area-social-sciences","msr-group-type-theme","msr-locale-en_us"],"msr_group_start":"","msr_detailed_description":"","msr_further_details":"","msr_hero_images":[],"msr_research_lab":[199571,437514],"related-researchers":[{"type":"user_nicename","display_name":"Agathe Balayn","user_id":43641,"people_section":"Group 1","alias":"balaynagathe"},{"type":"user_nicename","display_name":"Solon Barocas","user_id":36051,"people_section":"Group 1","alias":"solon"},{"type":"user_nicename","display_name":"Kate Crawford","user_id":32494,"people_section":"Group 1","alias":"kate"},{"type":"user_nicename","display_name":"Miro Dud\u00edk","user_id":32867,"people_section":"Group 1","alias":"mdudik"},{"type":"user_nicename","display_name":"Darya Moldavskaya","user_id":43569,"people_section":"Group 1","alias":"dmoldavskaya"},{"type":"user_nicename","display_name":"Matthew Vogel","user_id":43560,"people_section":"Group 
1","alias":"mavoge"},{"type":"user_nicename","display_name":"Hanna Wallach","user_id":34779,"people_section":"Group 1","alias":"wallach"},{"type":"user_nicename","display_name":"Jenn Wortman Vaughan","user_id":32235,"people_section":"Group 1","alias":"jenn"}],"related-publications":[579202,563964,1081281,896697,1049067,1049058,1049034,1030986,1030965,999801,996486,996246,964446,951228,950433,934374,923319,923313,920544,919683,919101,919095,919089,912675,904230,894141,1134889,1161857,1147541,1147535,1147506,1147483,1147480,1147478,1147476,1144124,1138603,1136227,1108629,1134768,1134753,1126449,1115496,1115490,1115484,1114086,1114080,1114071,1113531,1113519,647091,687096,687048,684486,683982,683943,663921,663915,661503,660249,654621,651078,647379,687102,647082,645714,644547,644349,625320,620094,600966,594775,559506,559500,559494,815431,876204,863979,863790,854574,851563,851524,851488,840211,833854,817582,815839,559482,766810,763549,763528,750562,749203,748450,726181,710506,694200,688662,687108],"related-downloads":[],"related-videos":[665562,736096,747361,748126,1003980],"related-projects":[716050,721693,721708],"related-events":[741676,729871,728776,632442,632154],"related-opportunities":[1156240,1162988],"related-posts":[469611,491609,583696,563847,572,494648,640473,659955,680358,965166,969147,1032900,1156686],"tab-content":[],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/372368","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-group"}],"version-history":[{"count":40,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/372368\/revisions"}],"predecessor-version":[{"id":1156241,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/372368\/revisions\/1156241"}],"wp
:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=372368"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=372368"},{"taxonomy":"msr-group-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group-type?post=372368"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=372368"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=372368"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}