{"id":663915,"date":"2020-06-02T15:37:49","date_gmt":"2020-06-02T22:37:49","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-research-item&p=663915"},"modified":"2020-07-03T07:53:24","modified_gmt":"2020-07-03T14:53:24","slug":"language-technology-is-power-a-critical-survey-of-bias-in-nlp","status":"publish","type":"msr-research-item","link":"https:\/\/www.microsoft.com\/en-us\/research\/publication\/language-technology-is-power-a-critical-survey-of-bias-in-nlp\/","title":{"rendered":"Language (Technology) is Power: A Critical Survey of \u201cBias\u201d in NLP"},"content":{"rendered":"

Abstract

We survey 146 papers analyzing "bias" in NLP systems, finding that their motivations are often vague, inconsistent, and lacking in normative reasoning, despite the fact that analyzing "bias" is an inherently normative process. We further find that these papers' proposed quantitative techniques for measuring or mitigating "bias" are poorly matched to their motivations and do not engage with the relevant literature outside of NLP. Based on these findings, we describe the beginnings of a path forward by proposing three recommendations that should guide work analyzing "bias" in NLP systems. These recommendations rest on a greater recognition of the relationships between language and social hierarchies, encouraging researchers and practitioners to articulate their conceptualizations of "bias" (i.e., what kinds of system behaviors are harmful, in what ways, to whom, and why, as well as the normative reasoning underlying these statements) and to center work around the lived experiences of members of communities affected by NLP systems, while interrogating and reimagining the power relations between technologists and such communities.

Press summary

Language plays many roles in communication: positive roles like conveying knowledge or relating to others, and problematic roles like transmitting stereotypes and serving as barriers to entry. A large body of recent research focuses on these problematic uses of language, which natural language processing (NLP) systems pick up on, generally scoped as "bias" in NLP. We survey 146 papers on "bias" in NLP systems, finding vague and inconsistent conceptualizations of "bias" that describe a wide range of system behaviors. These conceptualizations are often given without explicit statements of why the system behaviors are harmful, in what ways, and to whom, and without the social values underlying these statements. For example, some papers on "racial bias" focus on race-based stereotypes expressed in written text, while others focus on how NLP systems perform differently on language written by speakers from different racialized communities. This is a problem because it limits the ability of the NLP community to recognize and measure progress, or even to debate whether particular "debiasing" approaches mitigate harms or actually create new ones.
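To make that contrast concrete, the sketch below shows two quite different quantities that papers in the survey both label "racial bias": stereotypical associations encoded in word representations versus a performance gap across dialects. This is an illustrative sketch only; the function names, inputs, and metrics are our own assumptions, not measurements proposed in the paper.

```python
import numpy as np

def cosine(u, v):
    # Cosine similarity between two word vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def stereotype_association(embeddings, group_terms, attribute_terms):
    """'Bias' as stereotypes in representations: mean similarity between
    terms referring to a group and a set of stereotyped attribute terms."""
    return float(np.mean([cosine(embeddings[g], embeddings[a])
                          for g in group_terms for a in attribute_terms]))

def performance_gap(predict, labeled_texts_by_dialect):
    """'Bias' as differential performance: accuracy gap on text written by
    speakers from different racialized communities (hypothetical grouping)."""
    accuracy = {dialect: float(np.mean([predict(text) == label for text, label in examples]))
                for dialect, examples in labeled_texts_by_dialect.items()}
    return max(accuracy.values()) - min(accuracy.values())
```

A paper that reports only one of these numbers is making an implicit claim about which behavior is harmful and to whom; the survey's point is that such claims are rarely stated explicitly.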

We provide three recommendations for research on "bias" in NLP systems. First, we recommend that the NLP community discuss "bias" in the context of the broader social conditions in which language is used, including the ways that beliefs about language shape society. For example, society's anti-Black beliefs stigmatize African-American English (AAE) as ungrammatical or offensive; this stigmatization is reinforced by toxicity detection systems that incorrectly flag AAE as more toxic than Mainstream U.S. English. Second, we urge the NLP community to be precise about what social values are being called upon when describing "bias," and in what ways particular system behaviors are harmful. Third, because language is a social phenomenon, we encourage the NLP community to draw more heavily upon research traditions in HCI, social computing, sociolinguistics, and beyond to understand how NLP systems impact different communities in different ways. The NLP community currently has the power to decide how NLP systems are developed and deployed; to be more equitable, development and deployment decisions must be made accessible and accountable to the communities most likely to be impacted by these systems.
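As one illustration of the kind of disparity described above, the sketch below audits a toxicity classifier by comparing how often text that is known to be non-toxic is nonetheless flagged, across AAE and Mainstream U.S. English. It is a hypothetical sketch under our own assumptions (the classifier interface and the labeled text sets are placeholders), not an evaluation from the paper.

```python
from typing import Callable, Iterable

def false_flag_rate(classify: Callable[[str], bool],
                    non_toxic_texts: Iterable[str]) -> float:
    """Share of texts known to be non-toxic that the classifier still flags as toxic."""
    texts = list(non_toxic_texts)
    return sum(classify(t) for t in texts) / len(texts)

def dialect_disparity(classify: Callable[[str], bool],
                      aae_texts: Iterable[str],
                      msae_texts: Iterable[str]) -> float:
    """Difference in false-flag rates; positive values mean AAE is disproportionately flagged."""
    return false_flag_rate(classify, aae_texts) - false_flag_rate(classify, msae_texts)
```

Even a simple audit like this presupposes answers to the normative questions the paper raises, such as whose judgments define "non-toxic" and which communities' language is sampled.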

Press coverage

“Microsoft Researchers Say NLP Bias Studies Must Consider Role of Social Hierarchies like Racism” (opens in new tab)<\/span><\/a><\/p>\n","protected":false},"excerpt":{"rendered":"
