Variational Gram Functions: Convex Analysis and Optimization
- Amin Jalali
- Maryam Fazel
- Lin Xiao
SIAM Journal on Optimization
We introduce a new class of convex penalty functions, called variational Gram functions (VGFs), that can promote pairwise relations, such as orthogonality, among a set of vectors in a vector space. These functions can serve as regularizers in convex optimization problems arising from hierarchical classification, multitask learning, and estimating vectors with disjoint supports, among other applications. We study the convexity of VGFs and give characterizations of their convex conjugates, subdifferentials, proximal operators, and related quantities. We discuss efficient optimization algorithms for regularized loss minimization problems where the loss admits a common, yet simple, variational representation and the regularizer is a VGF. These algorithms enjoy a simple kernel trick and an efficient line search, as well as computational advantages over first-order methods based on the subdifferential or proximal maps. We also establish a general representer theorem for such learning problems. Lastly, numerical experiments on a hierarchical classification problem are presented to demonstrate the effectiveness of VGFs and the associated optimization algorithms.
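For a concrete picture (a sketch consistent with the abstract's description, not a substitute for the definitions in the paper): collecting the vectors \(x_1, \ldots, x_m\) as the columns of a matrix \(X\), a variational Gram function takes a supremum of linear functions of the Gram matrix \(X^{\mathsf{T}} X\),
\[
\Omega_{\mathcal{M}}(X) \;=\; \max_{M \in \mathcal{M}} \, \langle X^{\mathsf{T}} X, \, M \rangle \;=\; \max_{M \in \mathcal{M}} \, \sum_{i,j} M_{ij} \, \langle x_i, x_j \rangle,
\]
where \(\mathcal{M}\) is a compact set of symmetric matrices (an assumed ingredient here; the paper studies which choices of \(\mathcal{M}\) make \(\Omega_{\mathcal{M}}\) convex). As an illustration of how such a penalty can promote orthogonality, taking \(\mathcal{M} = \{ M : M_{ii} = 0, \, |M_{ij}| \le 1 \}\) yields \(\Omega_{\mathcal{M}}(X) = \sum_{i \ne j} |\langle x_i, x_j \rangle|\), which penalizes the pairwise inner products of the columns of \(X\).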