{"id":791462,"date":"2021-11-16T08:00:16","date_gmt":"2021-11-16T16:00:16","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-research-item&p=791462"},"modified":"2021-11-01T11:32:22","modified_gmt":"2021-11-01T18:32:22","slug":"panel-discussion-content-moderation-beyond-the-ban-reducing-borderline-toxic-misleading-and-low-quality-content","status":"publish","type":"msr-video","link":"https:\/\/www.microsoft.com\/en-us\/research\/video\/panel-discussion-content-moderation-beyond-the-ban-reducing-borderline-toxic-misleading-and-low-quality-content\/","title":{"rendered":"Panel discussion: Content moderation beyond the ban: Reducing borderline, toxic, misleading, and low-quality content"},"content":{"rendered":"

Public debate about content moderation focuses almost exclusively on removal, such as what is deleted and who is suspended. But what about content that is identified as \u201cborderline,\u201d which almost\u2014but not quite\u2014violates the guidelines? Faced with an expanding sense of responsibility, many platform companies have started identifying this type of content, as well as content that may be toxic, misleading, or harmful in the aggregate. Rather than remove it, they can minimize its effect by taking some of the following approaches: reduce its visibility in recommendations, limit its discoverability in search, add labels or warnings, or provide fact-checks or additional context. Tarleton Gillespie (Senior Principal Researcher at Microsoft) and Zoe Darm\u00e9 (Senior Manager of Search at Google) host a panel that includes Sarita Schoenebeck (Associate Professor, School of Information, University of Michigan), Ryan Calo (Professor of Law, University of Washington), and Charlotte Willner (Founding Executive Director of the Trust & Safety Professional Association).<\/p>\n

Join us as they discuss these techniques and the questions they raise, such as: How is such content being identified? Are these approaches effective? How do users respond? How can platforms be transparent and accountable for such interventions? What are the ethical and practical implications of these approaches?<\/p>\n

Learn more about the 2021 Microsoft Research Summit: https:\/\/Aka.ms\/researchsummit (opens in new tab)<\/span><\/a><\/p>\n","protected":false},"excerpt":{"rendered":"

Public debate about content moderation focuses almost exclusively on removal, such as what is deleted and who is suspended. But what about content that is identified as \u201cborderline,\u201d which almost\u2014but not quite\u2014violates the guidelines? Faced with an expanding sense of responsibility, many platform companies have started identifying this type of content, as well as content […]<\/p>\n","protected":false},"featured_media":791465,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"footnotes":""},"research-area":[13556,13559],"msr-video-type":[261263,261278],"msr-locale":[268875],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-791462","msr-video","type-msr-video","status-publish","has-post-thumbnail","hentry","msr-research-area-artificial-intelligence","msr-research-area-social-sciences","msr-video-type-research-summit-2021","msr-video-type-responsible-ai","msr-locale-en_us"],"msr_download_urls":"","msr_external_url":"https:\/\/youtu.be\/wOEfp69Uko4","msr_secondary_video_url":"","msr_video_file":"","_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/791462"}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-video"}],"version-history":[{"count":1,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/791462\/revisions"}],"predecessor-version":[{"id":791474,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/791462\/revisions\/791474"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/791465"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=791462"}],"wp:term":[{"taxonomy":"msr-resear
ch-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=791462"},{"taxonomy":"msr-video-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video-type?post=791462"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=791462"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=791462"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=791462"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}