"Sparsity, Feature Selection and
the Shapley-Folkman Theorem"
The Shapley-Folkman
theorem acts as a central limit theorem for convexity: it shows that Minkowski
sums of arbitrary bounded sets are increasingly close to their convex hull as
the number of terms in the sum increases. This produces simple and explicit
approximation bounds on a broad class of combinatorial optimization problems.
We use these results to show that several classical sparsity-constrained
optimization problems, such as feature selection, have low duality gaps in
meaningful data settings.
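For context, a standard statement of the Shapley-Folkman theorem is sketched below; the notation is mine and may differ from the exact form used in the talk.

```latex
% Shapley-Folkman: for bounded sets S_1, ..., S_n in R^d, any point of the
% convex hull of their Minkowski sum decomposes so that at most d of the
% summands need to be taken from the convex hulls rather than the sets.
\[
x \in \mathrm{conv}\Big(\sum_{i=1}^{n} S_i\Big)
\;\Longrightarrow\;
x = \sum_{i=1}^{n} x_i, \quad x_i \in \mathrm{conv}(S_i),
\]
\[
\text{with } x_i \in S_i \text{ for all but at most } d \text{ indices } i.
\]
```

Since the number of "non-extreme" terms is bounded by the dimension $d$ independently of $n$, the gap between the sum and its convex hull shrinks relatively as $n$ grows, which is the mechanism behind the duality-gap bounds mentioned above.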
Dr. Alexandre d'Aspremont received dual PhDs from Ecole Polytechnique and
Stanford University in optimization and finance. After a postdoc at U.C.
Berkeley, he joined the faculty at Princeton University as an assistant, then
associate, professor with joint appointments in the ORFE department and the
Bendheim Center for Finance. He returned to Europe in 2011 thanks to a grant
from the European Research Council and is now directeur de recherche at CNRS,
and professor at Ecole Normale Supérieure in Paris. He received the SIAM
Optimization prize for 2004-2007, an NSF CAREER award, and an ERC starting
grant. His research focuses on convex optimization and applications to machine
learning, statistics and finance.