Just-In-Time Learning for Fast and Flexible Inference
- Ali Eslami
- Daniel Tarlow
- Pushmeet Kohli
- John Winn
NIPS'14: Proceedings of the 27th International Conference on Neural Information Processing Systems
Published by MIT Press, Cambridge, MA
Much research in machine learning has centered on the search for inference algorithms that are both general-purpose and efficient. The problem is extremely challenging, and general-purpose inference remains computationally expensive. We address this problem by observing that in most specific applications of a model, only a small subset of all possible inference computations is typically needed. Motivated by this observation, we introduce just-in-time learning, a framework for fast and flexible inference that learns to speed up inference at run-time. Through a series of experiments, we show how this framework allows us to combine the flexibility of sampling with the efficiency of deterministic message-passing.
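As a rough illustration of the just-in-time idea (a minimal sketch, not the authors' implementation), the code below uses a learned regressor to stand in for an expensive sampling-based message computation. When the regressor's own uncertainty exceeds a threshold, it falls back to the slow sampling oracle and adds that result to its training set, so inference speeds up over the course of a run. All names, the oracle, and the tree-disagreement uncertainty estimate are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical sketch of just-in-time learning: a learned regressor replaces
# an expensive sampling-based message computation, falling back to the
# exact-but-slow oracle whenever its predictive uncertainty is high.

def slow_sampling_oracle(incoming, n_samples=10_000):
    """Expensive Monte Carlo estimate of an outgoing message (placeholder)."""
    samples = np.random.normal(incoming[0], np.exp(incoming[1]), n_samples)
    return np.array([samples.mean(), np.log(samples.std())])

class JustInTimeApproximator:
    def __init__(self, uncertainty_threshold=0.1):
        self.threshold = uncertainty_threshold
        self.inputs, self.outputs = [], []
        self.model = RandomForestRegressor(n_estimators=20)

    def _predict_with_uncertainty(self, x):
        # Disagreement across the forest's trees serves as a crude
        # uncertainty estimate for the predicted message.
        preds = np.stack([t.predict(x[None]) for t in self.model.estimators_])
        return preds.mean(axis=0)[0], preds.std(axis=0).max()

    def compute_message(self, incoming):
        if len(self.inputs) >= 5:
            pred, unc = self._predict_with_uncertainty(incoming)
            if unc < self.threshold:
                return pred                       # fast path: trust the regressor
        exact = slow_sampling_oracle(incoming)    # slow path: consult the oracle
        self.inputs.append(incoming)
        self.outputs.append(exact)
        self.model.fit(np.stack(self.inputs), np.stack(self.outputs))
        return exact
```

In this sketch, repeated calls to `compute_message` with similar incoming messages increasingly take the fast path, which mirrors the abstract's claim of combining sampling's flexibility with the efficiency of deterministic message-passing.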