Learning Multi-Task Action Abstractions as a Sequence Compression Problem
- Ruijie Zheng,
- Ching-An Cheng,
- Furong Huang,
- Andrey Kolobov
CoRL 2023 Pre-Training for Robot Learning Workshop
Temporal abstractions, along with belief state representations, have long been recognized as a powerful knowledge-sharing mechanism for decision-making, in domains ranging from computer games to robotics. In this work, we propose a novel approach that views the induction of temporal action abstractions as a sequence compression problem. In doing so, it brings well-established NLP tools such as byte pair encoding (BPE) to bear on the seemingly distant problem of learning variable-timespan action primitives over continuous control spaces in robotics. We empirically demonstrate that, given a multitask set of demonstrations, our technique constructs high-level actions that significantly boost the performance of behavior-cloned policies on unseen downstream tasks for a given amount of data.
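To illustrate the core idea, the following is a minimal, generic sketch of byte pair encoding applied to action sequences. It is not the paper's actual method: it assumes that continuous low-level actions have already been discretized into symbolic tokens (here, single characters), and then repeatedly merges the most frequent adjacent token pair so that each merge defines a longer, variable-timespan "skill".

```python
from collections import Counter

def bpe_merge(seqs, num_merges):
    """Generic BPE: repeatedly merge the most frequent adjacent token pair.

    seqs: list of token sequences (each token is a discretized action).
    Returns the learned merges and the compressed sequences, where each
    composite token stands for a multi-step action primitive.
    """
    seqs = [list(s) for s in seqs]
    merges = []
    for _ in range(num_merges):
        # Count all adjacent pairs across the demonstration corpus.
        pairs = Counter()
        for s in seqs:
            for a, b in zip(s, s[1:]):
                pairs[(a, b)] += 1
        if not pairs:
            break
        (a, b), _ = pairs.most_common(1)[0]
        new_tok = f"{a}+{b}"
        merges.append(((a, b), new_tok))
        # Replace every occurrence of the pair with the composite token.
        merged = []
        for s in seqs:
            out, i = [], 0
            while i < len(s):
                if i + 1 < len(s) and (s[i], s[i + 1]) == (a, b):
                    out.append(new_tok)
                    i += 2
                else:
                    out.append(s[i])
                    i += 1
            merged.append(out)
        seqs = merged
    return merges, seqs

# Toy "demonstrations": each character stands for one discretized action.
demos = ["abcabcxy", "abcz"]
merges, compressed = bpe_merge(demos, num_merges=2)
```

After two merges on this toy corpus, the recurring subsequence `abc` collapses into a single composite token, shortening every demonstration that contains it; a downstream behavior-cloning policy would then predict these composite tokens instead of individual low-level steps.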