About the project
One of the essential functions for service robots, as for other AI systems, is the conversational capability that facilitates interaction with humans. What distinguishes service robots from other AI systems is that robots perform physical actions. A central issue in service-robot HRI is therefore the generation of gestures that accompany the interaction. We are working on a basic library for gesture generation, a gesture authoring tool called LabanSuite for building customized gesture libraries from human gestures, and a handy DIY kit for playing back such gestures.
Research topics
- Gesture-generation system from textual input. Appropriate co-speech gestures convey intentions and emotions effectively and thus facilitate interaction with humans. We have been developing gesture-generation systems that produce gestures based on the semantics of the text. Gestures are drawn from gesture libraries that store human co-speech gestures in a hardware-independent format (see the first sketch after this list). Combined with recent conversational agents, this allows a natural human-robot interface for arbitrary robot hardware.
- Gesture authoring tool and handy DIY kit. We have been developing an open-source project, LabanSuite, for authoring human gestures captured by a Microsoft Kinect sensor. Gestures are encoded as Labanotation, a hardware-independent human dance notation (see the second sketch after this list). We also provide a handy DIY kit for building your own service robot and playing back gestures.
- Robot ego-noise reduction. Recent advances in speech recognition have improved the listening capability of service robots. However, speech recorded through a robot's microphones often includes the robot's own ego-noise, such as cooling-fan and actuator noise generated during operation, which degrades recognition performance. We have been developing noise filters that are aware of the ego-noise profile (see the third sketch after this list).
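The first sketch below illustrates the general idea of semantics-driven gesture selection: words in the input text are mapped to semantic concepts, and each concept is looked up in a library of hardware-independent gesture descriptions. All names here (the library entries, the keyword table, the `select_gestures` function) are hypothetical and not the project's actual API.

```python
# Minimal sketch of text-to-gesture selection (hypothetical names; not the
# actual LabanSuite API). The library stores hardware-independent gesture
# descriptions that a robot-specific driver can later turn into joint commands.

GESTURE_LIBRARY = {
    # concept -> hardware-independent gesture description (e.g., a Labanotation score)
    "greeting": "labanotation/wave_right_arm.json",
    "agreement": "labanotation/nod.json",
    "emphasis": "labanotation/beat_both_arms.json",
}

KEYWORD_TO_CONCEPT = {
    "hello": "greeting", "hi": "greeting",
    "yes": "agreement", "sure": "agreement",
    "really": "emphasis", "very": "emphasis",
}


def select_gestures(text: str) -> list[str]:
    """Pick gesture scores whose concepts match words in the input text."""
    words = text.lower().split()
    concepts = {KEYWORD_TO_CONCEPT[w] for w in words if w in KEYWORD_TO_CONCEPT}
    return [GESTURE_LIBRARY[c] for c in concepts]


if __name__ == "__main__":
    print(select_gestures("hello i am very happy to help you"))
    # e.g. ['labanotation/wave_right_arm.json', 'labanotation/beat_both_arms.json']
```

In a full system, a conversational agent or semantic parser would replace the keyword table, but the library lookup stays the same: because the stored gestures are hardware-independent, the same selection logic serves any robot with a suitable playback driver.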
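The second sketch is a toy illustration, not the LabanSuite implementation, of the core idea behind Labanotation encoding: a limb direction measured by the Kinect is quantized into one of the discrete Labanotation direction and level symbols, which makes the recorded gesture hardware-independent.

```python
# Toy quantization of a limb direction vector into Labanotation-style
# (direction, level) symbols. Thresholds and sector layout are illustrative.

import math


def to_laban_direction(dx: float, dy: float, dz: float) -> tuple[str, str]:
    """Map a limb direction vector (x: right, y: up, z: forward) to a
    coarse (direction, level) pair in Labanotation terms."""
    # Level from the vertical component.
    elevation = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    if elevation > 30:
        level = "high"
    elif elevation < -30:
        level = "low"
    else:
        level = "middle"

    # Horizontal direction quantized into 45-degree sectors.
    azimuth = math.degrees(math.atan2(dx, dz)) % 360
    sectors = ["forward", "right forward", "right", "right backward",
               "backward", "left backward", "left", "left forward"]
    direction = sectors[int((azimuth + 22.5) // 45) % 8]
    return direction, level


print(to_laban_direction(0.7, 0.6, 0.2))  # ('right', 'high')
```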
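The third sketch shows one common way a noise filter can exploit a known ego-noise profile: spectral subtraction. A magnitude-spectrum profile is estimated from frames recorded while the robot is running but no one is speaking, then subtracted from incoming microphone frames before speech recognition. The function names and parameters are illustrative assumptions, not the project's actual filter.

```python
# Hedged sketch of ego-noise-aware spectral subtraction (illustrative only).

import numpy as np


def estimate_noise_profile(noise_frames: np.ndarray) -> np.ndarray:
    """Average magnitude spectrum of ego-noise-only frames (frames x samples)."""
    return np.abs(np.fft.rfft(noise_frames, axis=1)).mean(axis=0)


def spectral_subtract(speech_frames: np.ndarray, noise_profile: np.ndarray,
                      over_subtraction: float = 1.5) -> np.ndarray:
    """Subtract the ego-noise profile from each frame's magnitude spectrum."""
    spectra = np.fft.rfft(speech_frames, axis=1)
    magnitude = np.abs(spectra)
    phase = np.angle(spectra)
    cleaned = np.maximum(magnitude - over_subtraction * noise_profile, 0.0)
    return np.fft.irfft(cleaned * np.exp(1j * phase),
                        n=speech_frames.shape[1], axis=1)


# Usage: build the profile from idle recordings (fans and actuators running),
# then filter microphone frames before passing them to the speech recognizer.
rng = np.random.default_rng(0)
noise_only = 0.1 * rng.standard_normal((50, 512))
noisy_speech = np.sin(np.linspace(0, 20 * np.pi, 512)) + 0.1 * rng.standard_normal((20, 512))
profile = estimate_noise_profile(noise_only)
cleaned_frames = spectral_subtract(noisy_speech, profile)
```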