{"id":671,"date":"2021-05-12T20:58:51","date_gmt":"2021-05-12T20:58:51","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/haxtoolkit\/?post_type=pattern&p=671"},"modified":"2023-10-25T07:36:56","modified_gmt":"2023-10-25T14:36:56","slug":"g10-a-disambiguate-before-acting","status":"publish","type":"pattern","link":"https:\/\/www.microsoft.com\/en-us\/haxtoolkit\/pattern\/g10-a-disambiguate-before-acting\/","title":{"rendered":"Pattern 10A: Disambiguate before acting"},"content":{"rendered":"\n\n
<p>The AI system is uncertain about the user\u2019s intent and which actions to take next.<\/p>\n\n\n\n
<p>Resolve the system\u2019s uncertainty by eliciting clarification from the user before taking action.<\/p>\n\n\n\n
<p>Collaborate with an AI\/ML practitioner to:<\/p>\n\n\n\n
<p>To elicit clarification, ask the user a <strong>clarifying question<\/strong> or prompt the user to <strong>select from one or more probable options<\/strong>.<\/p>\n\n\n\n<p>If working with a fixed number of options, consider keeping them in a consistent order. If working with a large and open set of options, place the most probable option first.<\/p>\n\n\n\n<p>Consider enabling the user to request an updated set of probable options if the initial set is not useful.<\/p>\n\n\n\n\n
<h4>User benefits<\/h4>\n\n\n\n
\n
<h4>Common pitfalls<\/h4>\n\n\n\n
\n
<h4>References<\/h4>\n\n\n\n