ChaCha for Online AutoML
- Qingyun Wu
- Chi Wang
- John Langford
- Paul Mineiro
- Marco Rossi
2021 International Conference on Machine Learning (ICML 2021)
We propose the ChaCha (Champion-Challengers) algorithm for making an online choice of hyperparameters in online learning settings. ChaCha determines a champion and schedules a set of 'live' challengers over time based on sample complexity bounds. It is guaranteed to have sublinear regret after the optimal configuration is added into consideration by an application-dependent oracle based on the champions. Empirically, we show that ChaCha provides good performance across a wide array of datasets when optimizing over featurization and hyperparameter decisions.
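To give an informal sense of the champion-challengers idea, here is a highly simplified, hypothetical Python sketch. It is not ChaCha's actual scheduling or promotion rule (ChaCha budgets and rotates live challengers using sample complexity bounds), and every name below is illustrative.

```python
# Hypothetical champion-challenger sketch; not ChaCha's actual algorithm.
import random


class Config:
    """A hyperparameter configuration with a running progressive loss."""

    def __init__(self, params):
        self.params = params
        self.total_loss = 0.0
        self.n_samples = 0

    def avg_loss(self):
        return self.total_loss / self.n_samples if self.n_samples else float("inf")


def online_loss(config, example):
    # Placeholder: in practice, the online learner's progressive validation loss.
    return random.random() * sum(config.params.values())


def champion_challenger_loop(configs, stream, live_budget=3, promote_margin=0.05):
    champion, challengers = configs[0], configs[1:]
    for example in stream:
        # Only a small set of "live" challengers is trained at any time.
        # (ChaCha chooses and rotates this set via sample complexity bounds.)
        live = challengers[:live_budget]
        for c in [champion] + live:
            c.total_loss += online_loss(c, example)
            c.n_samples += 1
        # Promote a challenger that clearly outperforms the current champion.
        best = min(live, key=Config.avg_loss, default=None)
        if best and best.avg_loss() < champion.avg_loss() - promote_margin:
            champion = best
            challengers = [c for c in configs if c is not best]
    return champion
```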
Publication downloads
FLAML: A Fast Library for AutoML and Tuning
December 15, 2020
FLAML is a Python library designed to automatically produce accurate machine learning models with low computational cost. It frees users from selecting learners and hyperparameters for each learner. FLAML is powered by a new, cost-effective hyperparameter optimization and learner selection method invented by Microsoft Research.
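A minimal usage sketch of FLAML's batch AutoML interface is shown below; the dataset, time budget, and task are illustrative, and the argument names follow FLAML's public `AutoML` API at the time of writing.

```python
from sklearn.datasets import load_iris
from flaml import AutoML

X, y = load_iris(return_X_y=True)

automl = AutoML()
# A small time budget (in seconds) bounds the total cost of tuning.
automl.fit(X_train=X, y_train=y, task="classification", time_budget=30)

print(automl.best_estimator)  # name of the best learner found, e.g. "lgbm"
print(automl.best_config)     # hyperparameters of the best model found
predictions = automl.predict(X)
```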
ChaCha for Online AutoML
In this work, we propose the ChaCha (Champion-Challengers) algorithm for making an online choice of hyperparameters in online learning settings. This work was published at ICML 2021. Python notebook demo: https://github.com/microsoft/FLAML/blob/main/notebook/autovw.ipynb
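Below is a minimal sketch of the online interface used in the linked notebook, which runs ChaCha through FLAML's `AutoVW` wrapper around Vowpal Wabbit; names such as `max_live_model_num` and `AutoVW.AUTOMATIC` are taken from that notebook and may differ across FLAML versions.

```python
from flaml import AutoVW

# Tune the namespace interactions of a Vowpal Wabbit learner online;
# max_live_model_num caps the number of "live" challengers trained at once.
autovw = AutoVW(
    max_live_model_num=5,
    search_space={"interactions": AutoVW.AUTOMATIC},
)

# Examples in Vowpal Wabbit's text format (a single illustrative example here).
vw_examples = ["1 | price:0.23 sqft:0.25 age:0.05"]

for example in vw_examples:
    prediction = autovw.predict(example)  # predict before learning (progressive validation)
    autovw.learn(example)                 # update the champion and live challengers
```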