{"id":924204,"date":"2023-03-01T15:00:28","date_gmt":"2023-03-01T23:00:28","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/"},"modified":"2023-03-03T09:01:06","modified_gmt":"2023-03-03T17:01:06","slug":"transparency-and-simplicity-in-criminal-risk-assessment","status":"publish","type":"msr-research-item","link":"https:\/\/www.microsoft.com\/en-us\/research\/publication\/transparency-and-simplicity-in-criminal-risk-assessment\/","title":{"rendered":"Transparency and Simplicity in Criminal Risk Assessment"},"content":{"rendered":"
In \u201cThe Age of Secrecy and Unfairness in Recidivism Prediction,\u201d Rudin, Wang, and Coker (2020, hereafter, RWC) contend that the current focus on questions of algorithmic fairness is misplaced. Rather, they argue, we must insist first and foremost that the algorithms used in high-stakes settings such as criminal justice are transparent, prioritizing transparency over other forms of fairness. The authors make their case by taking us on a deep dive into what can be learned about a proprietary risk assessment tool, COMPAS. Through their methodical statistical detective work and reflections on stories such as that of Mr. Rodr\u00edguez, who was denied parole on the basis of an erroneously calculated COMPAS score, they make clear that without transparency \u201calgorithms may not do what we think they do,[…] they may not do what we want,\u201d and we may be left with limited recourse against them.<\/div>\n
<\/div>\n
I agree that algorithmic transparency is in many ways of fundamental importance. However, to ensure that algorithms \u2018do\u2019 what we want them to, we need more than the transparency afforded by having a simple model whose formula is publicly accessible. A key issue here is that algorithms do not do very much. In settings such as criminal justice, secret (or not-so-secret) algorithms commonly influence and inform decisions about individuals, but they do not make them. Recidivism risk, which may be accurately estimated through the simple models lauded by the authors, is only one part of the decision-making equation. As I will discuss further in this response, a big part of the challenge in assessing an algorithm\u2019s fitness for purpose is that there is no decision-making equation \u2026<\/div>\n","protected":false},"excerpt":{"rendered":"

In \u201cThe Age of Secrecy and Unfairness in Recidivism Prediction,\u201d Rudin, Wang, and Coker (2020, hereafter, RWC) contend that the current focus on questions of algorithmic fairness is misplaced. Rather, they argue, we must insist first and foremost that the algorithms used in high-stakes settings such as criminal justice are transparent, prioritizing transparency over other […]<\/p>\n","protected":false},"featured_media":0,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"footnotes":""},"msr-content-type":[3],"msr-research-highlight":[],"research-area":[13556],"msr-publication-type":[193715],"msr-product-type":[],"msr-focus-area":[],"msr-platform":[],"msr-download-source":[],"msr-locale":[268875],"msr-field-of-study":[247348,247705,255958],"msr-conference":[],"msr-journal":[],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-924204","msr-research-item","type-msr-research-item","status-publish","hentry","msr-research-area-artificial-intelligence","msr-locale-en_us","msr-field-of-study-business","msr-field-of-study-risk-analysis-engineering","msr-field-of-study-transparency-behavior"],"msr_publishername":"","msr_edition":"","msr_affiliation":"","msr_published_date":"2020-3-30","msr_host":"","msr_duration":"","msr_version":"","msr_speaker":"","msr_other_contributors":"","msr_booktitle":"","msr_pages_string":"","msr_chapter":"","msr_isbn":"","msr_journal":"Harvard Data Science 
Review","msr_volume":"2","msr_number":"","msr_editors":"","msr_series":"","msr_issue":"1","msr_organization":"","msr_how_published":"","msr_notes":"","msr_highlight_text":"","msr_release_tracker_id":"","msr_original_fields_of_study":"","msr_download_urls":"","msr_external_url":"","msr_secondary_video_url":"","msr_longbiography":"","msr_microsoftintellectualproperty":0,"msr_main_download":"","msr_publicationurl":"","msr_doi":"","msr_publication_uploader":[{"type":"doi","viewUrl":"false","id":"false","title":"10.1162\/99608F92.B9343EEC","label_id":"243106","label":0},{"type":"url","viewUrl":"false","id":"false","title":"https:\/\/hdsr.mitpress.mit.edu\/pub\/xlfiu7za","label_id":"243109","label":0}],"msr_related_uploader":"","msr_attachments":[],"msr-author-ordering":[{"type":"user_nicename","value":"Alex Chouldechova","user_id":42390,"rest_url":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/microsoft-research\/v1\/researchers?person=Alex Chouldechova"}],"msr_impact_theme":[],"msr_research_lab":[],"msr_event":[],"msr_group":[],"msr_project":[],"publication":[],"video":[],"download":[],"msr_publication_type":"article","_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/924204"}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-research-item"}],"version-history":[{"count":1,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/924204\/revisions"}],"predecessor-version":[{"id":924210,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-item\/924204\/revisions\/924210"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=924204"}],"wp:term":[{"taxonomy":"msr-content-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp
-json\/wp\/v2\/msr-content-type?post=924204"},{"taxonomy":"msr-research-highlight","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-highlight?post=924204"},{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=924204"},{"taxonomy":"msr-publication-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-publication-type?post=924204"},{"taxonomy":"msr-product-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-product-type?post=924204"},{"taxonomy":"msr-focus-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-focus-area?post=924204"},{"taxonomy":"msr-platform","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-platform?post=924204"},{"taxonomy":"msr-download-source","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-download-source?post=924204"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=924204"},{"taxonomy":"msr-field-of-study","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-field-of-study?post=924204"},{"taxonomy":"msr-conference","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-conference?post=924204"},{"taxonomy":"msr-journal","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-journal?post=924204"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=924204"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=924204"}],"curies":[{"
name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}