Gaze and Attention Management for Embodied Conversational Agents
- Tomislav Pejsa
- Sean Andrist
- Michael Gleicher
- Bilge Mutlu
ACM Transactions on Interactive Intelligent Systems (TiiS), Vol. 5(1)
To facilitate natural interactions between humans and embodied conversational agents (ECAs), we need to endow the latter with the same nonverbal cues that humans use to communicate. Gaze cues in particular are integral to communicating and managing attention in social interactions, and they can trigger important social and cognitive processes, such as establishing affiliation between people or learning new information. The fundamental building blocks of gaze behavior are gaze shifts: coordinated movements of the eyes, head, and body toward objects and information in the environment. In this article, we present a novel computational model for synthesizing gaze shifts in ECAs that supports parametric control over coordinated eye, head, and upper-body movements. We employed the model in three studies with human participants. In the first study, we validated the model by showing that participants can accurately interpret the agent's gaze direction. In the second and third studies, we showed that adjusting how much the head and upper body participate in a gaze shift controls the strength of the attention signal conveyed, thereby strengthening or weakening its social and cognitive effects. The second study showed that manipulating eye–head coordination enables an agent to convey more information or establish stronger affiliation with participants in a teaching task, while the third study demonstrated how manipulating upper-body coordination enables the agent to communicate increased interest in objects in the environment.
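To give a rough sense of what such parametric control might look like, the sketch below distributes a single yaw rotation across the torso, head, and eyes according to per-part alignment parameters. This is a minimal illustrative example under simplified assumptions (yaw only, a static end pose, no movement timing or velocity profiles), not the published model; the names `plan_gaze_shift`, `head_alignment`, and `torso_alignment` are hypothetical, chosen here for illustration.

```python
from dataclasses import dataclass
import math

@dataclass
class GazeShift:
    eye_yaw: float    # eye-in-head rotation (degrees)
    head_yaw: float   # head rotation toward the target (degrees)
    torso_yaw: float  # torso rotation toward the target (degrees)

def plan_gaze_shift(target_yaw: float,
                    head_alignment: float = 0.5,
                    torso_alignment: float = 0.0,
                    eye_limit: float = 45.0) -> GazeShift:
    """Distribute a yaw-only gaze shift across the torso, head, and eyes.

    The alignment parameters lie in [0, 1]: 0 keeps the body part as
    still as the eyes' oculomotor range allows, 1 rotates it all the
    way toward the target. (Illustrative parameterization, not the
    paper's actual API.)
    """
    sign = math.copysign(1.0, target_yaw)
    # Minimum head rotation that still lets the eyes reach the target.
    min_head_yaw = sign * max(0.0, abs(target_yaw) - eye_limit)
    # Blend between the minimal and the fully aligned head pose.
    head_yaw = min_head_yaw + head_alignment * (target_yaw - min_head_yaw)
    # Torso alignment rotates the torso toward the target, never past
    # the head.
    torso_yaw = torso_alignment * head_yaw
    # The eyes cover whatever rotation the head leaves uncovered.
    eye_yaw = target_yaw - head_yaw
    return GazeShift(eye_yaw, head_yaw, torso_yaw)

# Example: a 60-degree shift with modest head and no torso participation
# keeps the head mostly still and lets the eyes do most of the work,
# which reads as a weaker attention signal than a fully aligned shift.
print(plan_gaze_shift(60.0, head_alignment=0.2, torso_alignment=0.0))
```

In this framing, raising `head_alignment` or `torso_alignment` recruits more of the body into the shift, which corresponds to the stronger attention signals the second and third studies manipulate.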