Pattern 11A: Local explanations

Problem

The user needs an explanation of why the system did what it did, and it is important that they understand a particular system decision.

Solution

Make an explanation available for one specific action or decision the AI system made.

Use when

  • The user wants transparency into the system’s reasoning about a specific action.
  • Policy or regulations require the system to provide an explanation.

How

Get information about how the AI system made the decision. See Patterns 11B through 11G for different explanation styles.

If a local explanation isn’t possible (e.g., because the AI system doesn’t pass information to the UI that would support one), consider advocating with your team to pursue known methods for generating such explanations.
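
One well-known family of such methods is post-hoc local explainers such as LIME, which approximate a model’s behavior around a single prediction. The sketch below is a minimal illustration of that idea, assuming a scikit-learn classifier on tabular data; the specific model, dataset, and number of features shown are placeholders, not part of this pattern.

    # Minimal sketch: generate a local explanation for one prediction with LIME.
    # Assumes: pip install lime scikit-learn
    from lime.lime_tabular import LimeTabularExplainer
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier

    data = load_breast_cancer()
    model = RandomForestClassifier(random_state=0).fit(data.data, data.target)

    explainer = LimeTabularExplainer(
        data.data,                          # training data defines feature distributions
        feature_names=list(data.feature_names),
        class_names=list(data.target_names),
        mode="classification",
    )

    # Explain one specific decision -- this is what makes the explanation "local".
    instance = data.data[0]
    explanation = explainer.explain_instance(
        instance, model.predict_proba, num_features=5
    )
    for feature, weight in explanation.as_list():
        print(f"{feature}: {weight:+.3f}")  # per-feature contribution to this one decision

Each printed line describes how one feature pushed this particular prediction, which is exactly the information a local-explanation UI would surface next to the decision.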

Ensure that the representation communicates that the explanation is specific to one system decision. For example, use proximity or other principles of grouping to make that association clear to the user.
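
As a hypothetical illustration of that grouping principle, the snippet below composes one decision and its explanation into a single unit of output so the user reads them together. The function name, card format, and feature weights are assumptions made for the sketch, not part of the pattern.

    # Hypothetical sketch: group one decision with its local explanation so
    # proximity signals that the explanation applies to this decision only.
    def render_decision_card(decision, reasons):
        lines = [f"Decision: {decision}", "Why the system decided this:"]
        for feature, weight in reasons:
            direction = "supported" if weight > 0 else "weighed against"
            lines.append(f"  - {feature} ({direction} the decision)")
        return "\n".join(lines)

    # Example with placeholder feature weights (e.g., taken from a LIME explanation).
    print(render_decision_card(
        "Application flagged for manual review",
        [("income < 30000", 0.42), ("account age > 5 years", -0.13)],
    ))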

User benefits

  • Enables the user to understand specific AI system decisions.
  • Enables the user to quickly update their understanding of how the system behaves.
  • Enables the user to understand the system’s reasoning, which in turn helps them predict the system’s behavior and troubleshoot when that behavior is undesirable.

Common pitfalls

References

Examples

[Six example cards for Guideline 11 > Pattern 11A; thumbnails not reproduced here.]