{"id":372368,"date":"2017-03-20T11:42:31","date_gmt":"2017-03-20T18:42:31","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-group&p=372368"},"modified":"2024-11-20T11:52:01","modified_gmt":"2024-11-20T19:52:01","slug":"fate","status":"publish","type":"msr-group","link":"https:\/\/www.microsoft.com\/en-us\/research\/theme\/fate\/","title":{"rendered":"FATE: Fairness, Accountability, Transparency & Ethics in AI"},"content":{"rendered":"
\n\t
\n\t\t
\n\t\t\t\t\t<\/div>\n\t\t\n\t\t
\n\t\t\t\n\t\t\t
\n\t\t\t\t\n\t\t\t\t
\n\t\t\t\t\t\n\t\t\t\t\t
\n\t\t\t\t\t\t
\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\t\t<\/span>\n\t\t\t\t\t\t\t\t\tMicrosoft Research Lab \u2013 New York City\t\t\t\t\t\t\t\t<\/a>\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\n

FATE: Fairness, Accountability, Transparency, and Ethics in AI<\/h1>\n\n\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/div>\n<\/section>\n\n\n\n\n\n
\"FATE
FATE and friends<\/figcaption><\/figure>\n\n\n\n

We study the complex societal implications of artificial intelligence (AI), machine learning (ML), and natural language processing (NLP). Our aim is to facilitate computational techniques that are both innovative and responsible. We prioritize issues of fairness, accountability, transparency, and ethics as they relate to AI, ML, and NLP, drawing on fields with a sociotechnical orientation, such as HCI, information science, sociology, anthropology, science and technology studies, media studies, political science, and law.<\/p>\n\n\n\n

We are committed to working closely with external research institutions, including the AI Now Institute<\/a> at New York University, Data & Society<\/a>, and the Partnership on AI<\/a>, as well as other research groups within Microsoft, such as the Sociotechnical Alignment Center<\/a> and the Social Media Collective<\/a>. We publish our work<\/a> in a variety of academic and other venues.<\/p>\n\n\n","protected":false},"excerpt":{"rendered":"

We study the complex societal implications of artificial intelligence (AI), machine learning (ML), and natural language processing (NLP). Our aim is to facilitate computational techniques that are both innovative and responsible, while prioritizing issues of fairness, accountability, transparency, and ethics as they relate to AI, ML, and NLP by drawing on fields with a sociotechnical orientation, such as HCI, information science, sociology, anthropology, science and technology studies, media studies, political science, and law.<\/p>\n","protected":false},"featured_media":0,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr_group_start":"","footnotes":""},"research-area":[13556,13554,13559],"msr-group-type":[243688],"msr-locale":[268875],"msr-impact-theme":[],"class_list":["post-372368","msr-group","type-msr-group","status-publish","hentry","msr-research-area-artificial-intelligence","msr-research-area-human-computer-interaction","msr-research-area-social-sciences","msr-group-type-theme","msr-locale-en_us"],"msr_group_start":"","msr_detailed_description":"","msr_further_details":"","msr_hero_images":[],"msr_research_lab":[199571,437514],"related-researchers":[{"type":"user_nicename","display_name":"Agathe Balayn","user_id":43641,"people_section":"Group 1","alias":"balaynagathe"},{"type":"user_nicename","display_name":"Solon Barocas","user_id":36051,"people_section":"Group 1","alias":"solon"},{"type":"user_nicename","display_name":"Su Lin Blodgett","user_id":39766,"people_section":"Group 1","alias":"sublodge"},{"type":"user_nicename","display_name":"Alex Chouldechova","user_id":42390,"people_section":"Group 1","alias":"alexandrac"},{"type":"user_nicename","display_name":"A. 
Feder Cooper","user_id":43563,"people_section":"Group 1","alias":"acoo"},{"type":"user_nicename","display_name":"Kate Crawford","user_id":32494,"people_section":"Group 1","alias":"kate"},{"type":"user_nicename","display_name":"Miro Dud\u00edk","user_id":32867,"people_section":"Group 1","alias":"mdudik"},{"type":"user_nicename","display_name":"Q. Vera Liao","user_id":40912,"people_section":"Group 1","alias":"veraliao"},{"type":"user_nicename","display_name":"Alexandra Olteanu","user_id":38323,"people_section":"Group 1","alias":"aloltea"},{"type":"user_nicename","display_name":"Lesia Semenova","user_id":43557,"people_section":"Group 1","alias":"lsemenova"},{"type":"user_nicename","display_name":"Matthew Vogel","user_id":43560,"people_section":"Group 1","alias":"mavoge"},{"type":"user_nicename","display_name":"Hanna Wallach","user_id":34779,"people_section":"Group 1","alias":"wallach"},{"type":"user_nicename","display_name":"Jennifer Wortman Vaughan","user_id":32235,"people_section":"Group 
1","alias":"jenn"}],"related-publications":[579202,563964,748450,964446,645714,851488,683982,919089,559482,749203,996246,647082,851524,684486,919095,559494,750562,996486,647091,851563,687048,919101,559500,763528,999801,647379,854574,687096,919683,559506,763549,1030965,651078,863790,687102,920544,594775,766810,1030986,654621,863979,687108,923313,600966,815431,1049034,660249,876204,688662,923319,620094,815839,1049058,661503,894141,694200,934374,625320,817582,1049067,663915,896697,710506,950433,644349,833854,1081281,663921,904230,726181,951228,644547,840211,683943,912675],"related-downloads":[493514],"related-videos":[],"related-projects":[],"related-events":[],"related-opportunities":[1104927],"related-posts":[],"tab-content":[],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/372368"}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-group"}],"version-history":[{"count":35,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/372368\/revisions"}],"predecessor-version":[{"id":1105791,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group\/372368\/revisions\/1105791"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=372368"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=372368"},{"taxonomy":"msr-group-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-group-type?post=372368"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=372368"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-
us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=372368"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}