Tuesday, March 25, 2014 at 4:15pm
Frank H. T. Rhodes Hall, 253
ORIE Colloquium: Rob Freund (MIT) - A First-Order View of Some Boosting Methods: Computational Guarantees and Connections to Regularization
Boosting methods are learning methods that combine weak models into more accurate and predictive models. Using the tools of first-order methods for convex optimization, we analyze some boosting methods in linear regression, namely Incremental Forward Stagewise Regression (FS-epsilon) and its adaptive shrinkage parameter variant (Forward Stagewise regression), as well as the LASSO; we also analyze the supervised classification learning method AdaBoost. We present a variety of (new) computational guarantees for these boosting methods, including convergence guarantees for empirical loss functions, generalization bounds (for the true unknown loss function), convergence of iterates to the empirical and the true model solutions, as well as sparsity guarantees and regularization bounds. An overarching theme of this work is that tools from first-order methods in convex optimization can be used to shed light on many properties of statistical boosting methods.
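For readers unfamiliar with FS-epsilon, the following is a minimal sketch (in Python) of the incremental forward stagewise update; it is not taken from the talk, and the function name, parameters, and the assumption of centered, standardized columns are illustrative only. At each step the predictor most correlated with the current residual gets its coefficient nudged by a fixed small shrinkage epsilon.

    import numpy as np

    def fs_epsilon(X, y, epsilon=0.01, n_iters=1000):
        # Illustrative sketch of Incremental Forward Stagewise Regression (FS-epsilon).
        # Assumes columns of X are centered and standardized.
        n, p = X.shape
        beta = np.zeros(p)
        residual = y.astype(float).copy()
        for _ in range(n_iters):
            correlations = X.T @ residual            # inner products with the current residual
            j = np.argmax(np.abs(correlations))      # most correlated predictor
            step = epsilon * np.sign(correlations[j])
            beta[j] += step                          # small coordinate update
            residual -= step * X[:, j]               # update the residual accordingly
        return beta

A call such as fs_epsilon(X, y, epsilon=0.01, n_iters=5000) traces out a coefficient path whose early iterates are sparse and heavily shrunk, which is the regularization behavior the talk connects to first-order optimization guarantees.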