PAC-Bayesian learning with asymmetric cost

2011 IEEE Signal Processing Workshop on Statistical Signal Processing

Published by IEEE


PAC-Bayes generalization bounds offer a theoretical foundation for learning classifiers with low generalization error and for predicting their performance on unseen data. Current formulations implicitly assume that the relative costs of misclassifying positive and negative examples match the class skew of the training dataset. We present a learning approach based on minimizing an asymmetric generalization bound, which enables PAC-Bayesian model selection under a class-specific performance constraint.
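The idea in the abstract can be illustrated with a minimal sketch: weight the class-conditional empirical error rates asymmetrically, bound the risk of each candidate (stochastic) classifier with a McAllester-style PAC-Bayes term, and select the candidate with the smallest bound among those whose bounded false-positive rate stays below a target. The cost weights (`cost_fn`, `cost_fp`), the constraint threshold (`max_fp_bound`), and the candidate fields are all hypothetical illustration choices, not the paper's actual formulation; applying the complexity term to a cost-weighted risk is a simplification for exposition.

```python
import math

def pac_bayes_bound(emp_risk, kl, n, delta=0.05):
    """McAllester-style upper bound on true risk:
    emp_risk + sqrt((KL(Q||P) + ln(2*sqrt(n)/delta)) / (2n)).
    Simplified illustrative form; not the bound derived in the paper."""
    return emp_risk + math.sqrt(
        (kl + math.log(2.0 * math.sqrt(n) / delta)) / (2.0 * n)
    )

def asymmetric_risk(fn_rate, fp_rate, cost_fn, cost_fp):
    """Cost-weighted combination of class-conditional error rates."""
    return cost_fn * fn_rate + cost_fp * fp_rate

def select_model(candidates, n, cost_fn=5.0, cost_fp=1.0, max_fp_bound=0.30):
    """Pick the candidate with the smallest bounded asymmetric risk,
    subject to a class-specific constraint on its bounded FP rate.
    Each candidate: dict with 'name', 'fn_rate', 'fp_rate', 'kl'."""
    best, best_bound = None, float("inf")
    for c in candidates:
        # Class-specific performance constraint: bound the FP rate alone.
        if pac_bayes_bound(c["fp_rate"], c["kl"], n) > max_fp_bound:
            continue
        risk = asymmetric_risk(c["fn_rate"], c["fp_rate"], cost_fn, cost_fp)
        bound = pac_bayes_bound(risk, c["kl"], n)
        if bound < best_bound:
            best, best_bound = c, bound
    return best
```

Under this toy selection rule, a candidate with few false negatives can win even at a higher false-positive rate, as long as its bounded FP rate respects the constraint; with symmetric costs the skew of the training data would decide instead.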