Abstract
Solving l1 regularized optimization problems is common in the fields of computational biology, signal processing, and machine learning. Such l1 regularization is utilized to find sparse minimizers of convex functions. A well-known example is the least absolute shrinkage and selection operator (LASSO) problem, where the l1 norm regularizes a quadratic function. A multilevel framework is presented for solving such l1 regularized sparse optimization problems efficiently. We take advantage of the expected sparseness of the solution, and create a hierarchy of problems of similar type, which is traversed in order to accelerate the optimization process. This framework is applied to two problems: (1) the sparse inverse covariance estimation problem, and (2) l1 regularized logistic regression. In the first problem, the inverse of an unknown covariance matrix of a multivariate normal distribution is estimated, under the assumption that it is sparse. To this end, an l1 regularized log-determinant optimization problem needs to be solved. This task is especially challenging for large-scale datasets due to time and memory limitations. In the second problem, the l1 regularization is added to the logistic regression classification objective to reduce overfitting to the data and obtain a sparse model. Numerical experiments demonstrate the efficiency of the multilevel framework in accelerating existing iterative solvers for both of these problems.
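To make the LASSO setting concrete, the following is a minimal sketch of one standard iterative solver for it, ISTA (proximal gradient descent with soft-thresholding). This illustrates the kind of l1 regularized problem and baseline solver the abstract refers to; it is not the authors' multilevel method, and the problem sizes and parameters below are hypothetical.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||x||_1: shrinks each entry toward zero.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista_lasso(A, b, lam, n_iters=500):
    """Solve min_x 0.5*||Ax - b||^2 + lam*||x||_1 by ISTA.

    Illustrative baseline solver only, not the multilevel framework.
    """
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)         # gradient of the quadratic term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Recover a sparse vector from a small number of noiseless measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[[3, 40, 77]] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = ista_lasso(A, b, lam=0.1)
```

The solution `x_hat` is sparse: the soft-thresholding step zeroes out most coordinates, which is precisely the sparsity-inducing effect of the l1 term that the multilevel hierarchy exploits.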
| Original language | American English |
|---|---|
| Pages (from-to) | S566-S592 |
| Journal | SIAM Journal on Scientific Computing |
| Volume | 38 |
| Issue number | 5 |
| DOIs | |
| State | Published - 1 Jan 2016 |
Keywords
- Block coordinate descent
- Covariance selection
- Multilevel methods
- Proximal Newton
- Sparse inverse covariance estimation
- Sparse optimization
- l1 regularized logistic regression
All Science Journal Classification (ASJC) codes
- Computational Mathematics
- Applied Mathematics