Pattern G10-A: Disambiguate before acting

Problem

The AI system is uncertain about the user's intent and, as a result, about what action to take next.

Solution

Resolve the system's uncertainty by eliciting clarification from the user before taking action.

Use when

  • The system is able to compute its own uncertainty.
  • The system is able to generate a list of probable alternate interpretations but is uncertain about which one is correct.

How

Collaborate with an AI/ML practitioner to:

  • Identify multiple probable options to show the user.
  • Determine the uncertainty threshold that triggers asking the user. When system confidence is sufficiently high, consider acting without interrupting the user to ask for clarification; when system confidence is extremely low, consider suppressing the feature altogether. A sketch of this threshold logic follows this list.
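
A minimal sketch of that threshold logic, assuming hypothetical cutoff values (ACT_THRESHOLD, SUPPRESS_THRESHOLD) and candidate interpretations that the model has already scored; the actual thresholds and candidate generation need to be worked out with the AI/ML practitioner.

```python
from enum import Enum, auto

# Hypothetical cutoffs; appropriate values depend on the model and the cost of acting on a wrong guess.
ACT_THRESHOLD = 0.90       # above this, act on the top interpretation without interrupting the user
SUPPRESS_THRESHOLD = 0.20  # below this, suppress the feature altogether


class Decision(Enum):
    ACT = auto()        # confidence high enough to act on the top interpretation
    CLARIFY = auto()    # ambiguous: ask the user to disambiguate
    SUPPRESS = auto()   # confidence too low to offer anything useful


def decide(candidates: list[tuple[str, float]]) -> Decision:
    """Gate on the top candidate's confidence.

    Candidates are (interpretation, confidence) pairs sorted by confidence, descending.
    """
    if not candidates:
        return Decision.SUPPRESS
    top_confidence = candidates[0][1]
    if top_confidence >= ACT_THRESHOLD:
        return Decision.ACT
    if top_confidence < SUPPRESS_THRESHOLD:
        return Decision.SUPPRESS
    return Decision.CLARIFY
```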

To elicit clarification, ask the user a clarifying question or prompt the user to select from one or more probable options.

  • Clarifying questions may take forms such as “Did you mean…?”, “Do you want…?”, or “I didn’t understand that…”.
  • Prompt the user to select from multiple probable options by using selection controls, as in the sketch below.
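
A console-style sketch of such a selection control, reusing the hypothetical scored-candidates list from the previous sketch; a real product would use the platform's native selection widgets rather than text input.

```python
def ask_user_to_disambiguate(candidates: list[tuple[str, float]]) -> str | None:
    """Present the probable interpretations and let the user pick one, or none of them."""
    print("Did you mean:")
    for i, (interpretation, _confidence) in enumerate(candidates, start=1):
        print(f"  {i}. {interpretation}")
    print(f"  {len(candidates) + 1}. None of these")

    choice = input("Enter a number: ").strip()
    if choice.isdigit() and 1 <= int(choice) <= len(candidates):
        return candidates[int(choice) - 1][0]
    return None  # do not loop endlessly; fall back to another pattern (see common pitfalls)
```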

If working with a fixed number of options, consider keeping them in a consistent order. If working with a large and open set of options, place the most probable option first.
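
One way to express that ordering rule in code, assuming a hypothetical FIXED_ORDER list for the closed-set case (the example values are illustrative only).

```python
# Hypothetical closed set of options, always presented in this canonical order.
FIXED_ORDER = ["Reply", "Forward", "Archive"]


def order_options(candidates: list[tuple[str, float]], fixed_set: bool) -> list[str]:
    """Keep a closed option set in a consistent order; sort an open set by probability."""
    if fixed_set:
        predicted = {name for name, _ in candidates}
        return [name for name in FIXED_ORDER if name in predicted]
    return [name for name, _ in sorted(candidates, key=lambda c: c[1], reverse=True)]
```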

Consider enabling the user to request an updated set of probable options if the initial set is not useful.
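
A sketch of one way to support a “show me other options” request, assuming the system keeps more scored candidates than it initially displays and tracks which ones the user has already seen or rejected.

```python
def next_batch(candidates: list[tuple[str, float]],
               already_shown: set[str],
               batch_size: int = 3) -> list[str]:
    """Return the next most probable interpretations the user has not yet seen.

    Assumes candidates are sorted by confidence in descending order.
    """
    remaining = [name for name, _ in candidates if name not in already_shown]
    return remaining[:batch_size]
```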

User benefits

  • The user avoids system errors by explicitly communicating intent before the system takes action.
  • The user still benefits from system capabilities even when the system has low confidence.

Common pitfalls

  • Disambiguation dialogues interrupt the user’s flow. Use them cautiously.
  • If disambiguation fails, avoid endless loops of eliciting clarification. Consider using pattern G10-B: Avoid cold starts by eliciting user preferences.
  • The probable options shown to the user are irrelevant or not sufficiently diverse.
  • The ordering of probable options is confusing to the user.
  • It is not clear to the user how to make a selection.
  • It is not clear to the user how to solicit additional probable options.

References

  • Jaime Teevan, Eytan Adar, Rosie Jones, and Michael A. S. Potts. 2007. Information re-retrieval: repeat queries in Yahoo’s logs. In Proceedings of the 30th annual international ACM SIGIR conference on Research and development in information retrieval (SIGIR ’07). Association for Computing Machinery, New York, NY, USA, 151–158. DOI:https://doi.org/10.1145/1277741.1277770

Examples