{"id":703,"date":"2021-05-13T20:11:29","date_gmt":"2021-05-13T20:11:29","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/haxtoolkit\/?post_type=pattern&p=703"},"modified":"2023-10-25T07:53:40","modified_gmt":"2023-10-25T14:53:40","slug":"g11-d-map-system-input-attributes-to-system-outputs","status":"publish","type":"pattern","link":"https:\/\/www.microsoft.com\/en-us\/haxtoolkit\/pattern\/g11-d-map-system-input-attributes-to-system-outputs\/","title":{"rendered":"Pattern 11D: Map system input attributes to system outputs"},"content":{"rendered":"\n\n
<p>Provide an explanation that enables the user to infer a connection between user behaviors and the system\u2019s decision(s). Often used together with G11-E: Map user behaviors to system outputs.<\/p>\n\n\n\n<h4>Use when<\/h4>\n\n\n\n<p>The user needs insights into why the system did what it did.<\/p>\n\n\n\n<h4>How<\/h4>\n\n\n\n<p>Collaborate with an AI\/ML practitioner to collect information about which input attributes informed the system:<\/p>\n\n\n\n<p>The explanation might cover a specific system decision (see G11-A: Local explanations) or general system behavior (see G11-B: Global explanations).<\/p>\n\n\n\n<p>The explanation might include, for each displayed attribute, information about its importance in the system\u2019s decision making.<\/p>\n\n\n\n<p>The content of <em>local<\/em> explanations can also include a set of attributes most influential to the system\u2019s output.<\/p>\n\n\n\n<p>The content of <em>global<\/em> explanations can also include the types of attributes the system uses to determine its decisions.<\/p>\n\n\n\n<h4>User benefits<\/h4>\n\n\n\n<h4>Common pitfalls<\/h4>\n\n\n\n