Abstract
We consider the graphical lasso, a popular optimization problem for learning the sparse representations of high-dimensional datasets, which is well known to be computationally expensive for large-scale problems. A recent line of results has shown, under mild assumptions, that the sparsity pattern of the graphical lasso estimator can be retrieved by soft-thresholding the sample covariance matrix. Based on this result, a closed-form solution has been obtained that is optimal when the thresholded sample covariance matrix has an acyclic structure. In this paper, we prove an extension of this result to the generalized graphical lasso (GGL), in which additional sparsity constraints are imposed based on prior knowledge. Furthermore, we describe a recursive closed-form solution for the problem when the thresholded sample covariance matrix is chordal. Building upon this result, we describe a novel Newton-Conjugate Gradient algorithm that can efficiently solve GGL problems with general structures. Assuming that the thresholded sample covariance matrix is sparse with a sparse Cholesky factorization, we prove that the algorithm converges to an ε-accurate solution in O(n log(1/ε)) time and O(n) memory. The algorithm is highly efficient in practice: we solve instances with as many as 200,000 variables to 7-9 digits of accuracy in less than an hour on a standard laptop computer running MATLAB.
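To illustrate the thresholding step the abstract refers to, here is a minimal MATLAB sketch (not the authors' implementation) of soft-thresholding the off-diagonal entries of a sample covariance matrix to recover a candidate sparsity pattern. The problem size `n`, data matrix `X`, and threshold `lambda` are all illustrative assumptions.

```matlab
% Minimal sketch (assumed setup, not the paper's code): soft-threshold
% the off-diagonal entries of the sample covariance matrix at the
% regularization level lambda to obtain a candidate sparsity pattern
% for the graphical lasso estimator.
n = 500;                                 % assumed number of variables
X = randn(2000, n);                      % assumed data: 2000 samples of n variables
S = cov(X);                              % sample covariance matrix
lambda = 0.1;                            % assumed regularization level
T = sign(S) .* max(abs(S) - lambda, 0);  % elementwise soft-thresholding
T(1:n+1:end) = diag(S);                  % keep the diagonal unthresholded
pattern = (T ~= 0);                      % estimated sparsity pattern (graph adjacency)
```

Under the assumptions discussed in the paper, the support of `pattern` matches the sparsity pattern of the graphical lasso estimator; when the corresponding graph is acyclic or chordal, the closed-form and recursive closed-form solutions described in the abstract apply.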
Original language | English (US) |
---|---|
Article number | 8598839 |
Pages (from-to) | 12658-12672 |
Number of pages | 15 |
Journal | IEEE Access |
Volume | 7 |
DOIs | |
State | Published - 2019 |
Externally published | Yes |
Keywords
- Optimization
- graphical models
- numerical algorithms
ASJC Scopus subject areas
- General Computer Science
- General Materials Science
- General Engineering