[Ml-stat-talks] Wed 08/09: Proximal-Gradient Method for Structured Sparse Learning
chongw at CS.Princeton.EDU
Thu Aug 5 10:12:05 EDT 2010
On Monday, August 9, Xi Chen from the Machine Learning Department at CMU
will be giving a talk on "Proximal-Gradient Method for Structured
Sparse Learning". The talk will be at 2pm in Room 402 of the Computer
Science building.
Proximal-Gradient Method for Structured Sparse Learning
Xi Chen, Machine Learning Department, CMU
Time: 2pm-3pm, Monday, August 9th.
Place: CS 402
We study the problem of learning regression models regularized by a
structured sparsity-inducing penalty that encodes prior structural
information on either the covariate or the response side. We consider the
two most widely adopted structures, which cover almost all the
sparsity-related structures in the literature: (1) group structure (with
possibly overlapping groups), encoded by the L1/L2 mixed norm; (2) graph
structure, encoded by the graph-guided fusion penalty.
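As a rough illustration of the two penalty families named above, here is a minimal sketch in Python; the group index sets, edge list, and uniform weights are illustrative assumptions, not the exact formulation used in the talk.

```python
import numpy as np

def group_penalty(beta, groups):
    """L1/L2 mixed-norm penalty: the sum of L2 norms over index groups.

    `groups` is a list of index lists; groups are allowed to overlap.
    Unit group weights are assumed here for simplicity.
    """
    return sum(np.linalg.norm(beta[g]) for g in groups)

def graph_fusion_penalty(beta, edges):
    """Graph-guided fusion penalty: sum over edges (i, j) of |beta_i - beta_j|.

    Real applications typically weight each edge (e.g. by correlation
    strength and sign); unweighted edges are an assumption of this sketch.
    """
    return sum(abs(beta[i] - beta[j]) for i, j in edges)
```

The overlap in `groups` is what makes the group penalty hard to handle with a simple per-group soft-thresholding step, which motivates the optimization method discussed in the talk.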
For both structures, developing an efficient optimization method has
remained a challenge. Existing methods cast the problems as second-order
cone programs (SOCP) or quadratic programs (QP) and solve them with
interior-point methods (IPM). However, this approach is computationally
expensive even for problems of moderate size. In this talk, we present an
efficient proximal-gradient method that solves these two seemingly
different problems in a single optimization framework. The method exploits
the structure of the non-smooth structured-sparsity-inducing penalty,
introduces a smooth approximation of it, and minimizes the resulting
approximate objective. It achieves a faster convergence rate than the
standard first-order approach, the subgradient method, and is far more
scalable than IPM for SOCP and QP. We demonstrate the correctness of the
model and the efficiency and scalability of the optimization method on
both simulated and real genetic datasets.
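The smoothing idea can be sketched as follows, assuming the structured penalty can be written as a maximum of linear functions, Omega(beta) = max over ||a||_inf <= 1 of a^T C beta, where the matrix C encodes the group or graph structure. Subtracting a small quadratic term in a makes the penalty smooth with a closed-form gradient, after which an accelerated (FISTA-style) gradient loop applies. The function names, step-size bound, and loop below are illustrative, not the talk's exact algorithm.

```python
import numpy as np

def smoothed_penalty_grad(beta, C, mu):
    """Gradient of the Nesterov-smoothed penalty.

    The smoothed penalty is max_a { a^T C beta - (mu/2)||a||^2 } over the
    box ||a||_inf <= 1; its maximizer is C beta / mu clipped to [-1, 1],
    and the gradient in beta is C^T a*.
    """
    a_star = np.clip(C @ beta / mu, -1.0, 1.0)
    return C.T @ a_star

def smoothing_proximal_gradient(X, y, C, lam=0.1, mu=1e-2, n_iter=300):
    """Minimize 0.5*||y - X beta||^2 + lam * Omega_mu(beta) by an
    accelerated gradient method on the smooth surrogate.

    A minimal sketch under the assumptions stated above.
    """
    p = X.shape[1]
    # Lipschitz bound on the gradient: spectral norms of X'X and of the
    # smoothed penalty's Hessian bound, lam*||C||^2/mu.
    L = np.linalg.norm(X, 2) ** 2 + lam * np.linalg.norm(C, 2) ** 2 / mu
    beta = np.zeros(p)
    w = beta.copy()
    t = 1.0
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) + lam * smoothed_penalty_grad(w, C, mu)
        beta_new = w - grad / L
        # FISTA momentum update
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        w = beta_new + ((t - 1.0) / t_new) * (beta_new - beta)
        beta, t = beta_new, t_new
    return beta
```

The appeal of this scheme is that each iteration costs only matrix-vector products with X and C, which is what makes it scale far beyond what interior-point solvers can handle.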
Xi Chen is currently a third-year Ph.D. student in the Machine Learning
Department at Carnegie Mellon University (http://www.cs.cmu.edu/~xichen),
working on structured sparse learning problems in both parametric and
non-parametric settings, with applications to text and biology. His
advisor is Jaime G. Carbonell, and he also works actively with Prof.
Eric Xing and, to a lesser extent, John Lafferty. He is currently doing a
summer research internship at the NEC lab in Princeton.