Bayesian Conditional Random Fields
- Yuan Qi,
- Martin Szummer,
- Tom Minka
Journal of Machine Learning Research (JMLR), AI & Statistics, pp. 269-276
We propose Bayesian Conditional Random Fields (BCRFs) for classifying interdependent and structured data, such as sequences, images, or webs. BCRFs are a Bayesian approach to training and inference with conditional random fields, which have previously been trained by maximum likelihood (ML) (Lafferty et al., 2001). Our framework avoids overfitting and offers the full advantages of a Bayesian treatment. Unlike the ML approach, we estimate the posterior distribution of the model parameters during training and average predictions over this posterior during inference. We apply two extensions of expectation propagation (EP), power EP and the novel transformed EP method, to incorporate the partition function. For algorithmic stability and accuracy, we flatten the approximation structures to avoid two-level approximations. We demonstrate the superior prediction accuracy of BCRFs over conditional random fields trained with ML or maximum a posteriori (MAP) estimation on synthetic and real datasets.
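As a sketch of the contrast the abstract draws, written in standard CRF notation (Lafferty et al., 2001) rather than the paper's own: a CRF with feature functions $f_k$ and weights $\mathbf{w}$ defines

$$p(\mathbf{t} \mid \mathbf{x}, \mathbf{w}) = \frac{1}{Z(\mathbf{w}, \mathbf{x})} \exp\Big(\sum_k w_k f_k(\mathbf{t}, \mathbf{x})\Big).$$

ML or MAP training selects a single point estimate $\hat{\mathbf{w}}$ and predicts with $p(\mathbf{t} \mid \mathbf{x}, \hat{\mathbf{w}})$, whereas a Bayesian treatment averages predictions over the parameter posterior given the training data $\mathcal{D}$:

$$p(\mathbf{t} \mid \mathbf{x}, \mathcal{D}) = \int p(\mathbf{t} \mid \mathbf{x}, \mathbf{w})\, p(\mathbf{w} \mid \mathcal{D})\, d\mathbf{w}.$$

Because the partition function $Z(\mathbf{w}, \mathbf{x})$ itself depends on the parameters, this posterior resists standard EP approximation, which is where the power EP and transformed EP extensions described in the abstract come in.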