Solving sparse principal component analysis with global support

Mathematical Programming, Vol. 199, pp. 421-459


Row-sparse principal component analysis (rsPCA), also known as
principal component analysis (PCA) with global support, is the problem of
finding the top-r leading principal components such that all of these principal
components are linear combinations of a common subset of k variables. rsPCA is a
popular dimension-reduction tool in statistics that enhances interpretability
compared to regular PCA. Existing methods for solving
rsPCA are either greedy heuristics (in the special case
r = 1) with guarantees under restrictive statistical models, or algorithms with
stationary-point convergence for some regularized reformulation of rsPCA.
Crucially, none of the existing computational methods can efficiently guarantee
the quality of the solutions obtained by comparing against dual bounds.
In this work we first propose a convex relaxation based on operator norms
that provably approximates the feasible region of rsPCA within an O(log r)
factor. To prove this result we use a novel random sparsification procedure
that relies on the Pietsch-Grothendieck factorization theorem and may be of
independent interest. We also propose a simpler relaxation that is second-order
cone representable and gives a (1 + sqrt(r))-approximation of the feasible region.
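To make the rsPCA problem concrete, here is a minimal sketch (not from the paper) of the standard combinatorial view: once the support of k variables is fixed, the best objective is the sum of the top-r eigenvalues of the corresponding principal submatrix of the covariance matrix, so tiny instances can be solved by exhaustive search over supports. The function names are illustrative, not from the paper.

```python
from itertools import combinations
import numpy as np

def rspca_value(A, support, r):
    """Best rsPCA objective when all r principal components are
    restricted to the variables in `support`: the sum of the top-r
    eigenvalues of the corresponding principal submatrix of A."""
    sub = A[np.ix_(support, support)]
    eigvals = np.linalg.eigvalsh(sub)  # ascending order
    return eigvals[-r:].sum()

def rspca_brute_force(A, k, r):
    """Exhaustively search all k-subsets of the n variables (tractable
    only for tiny n); returns the best objective value and support."""
    n = A.shape[0]
    best_val, best_S = -np.inf, None
    for S in combinations(range(n), k):
        val = rspca_value(A, list(S), r)
        if val > best_val:
            best_val, best_S = val, S
    return best_val, best_S
```

The exponential number of k-subsets is exactly what makes dual bounds from convex relaxations valuable: they certify solution quality without enumerating supports.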

Using these relaxations we then propose a convex integer program that
provides a dual bound for the optimal value of rsPCA. Moreover, it also has
worst-case guarantees: it is within a multiplicative/additive factor of the
original optimal value, the multiplicative factor being O(log r) or O(r)
depending on the relaxation used. Finally, our experiments demonstrate both
the viability of computing these dual bounds on instances with up to 2000
attributes and their quality compared to available baselines.