Directions Robot: In-the-Wild Experiences and Lessons Learned

We introduce Directions Robot, a system we have fielded for
studying open-world human-robot interaction. The system brings
together models for situated spoken language interaction with
directions-generation and a gesturing humanoid robot. We describe
the perceptual, interaction, and output generation competencies of
this system. We then discuss experiences and lessons drawn from
data collected in an initial in-the-wild deployment, and highlight
several challenges with managing engagement, providing
directions, and handling out-of-domain queries that arise in open-world,
multiparty settings.