Graphical LASSO Algorithm

GLASSOFAST is an efficient implementation of the Graphical LASSO (GLASSO) algorithm for solving the covariance selection problem.

Consider observations \(X_1, X_2, \dots, X_n\) from a multivariate Gaussian distribution \(X \sim \mathcal{N}(0, \Sigma)\). We are interested in estimating the precision matrix \(\Theta = \Sigma^{-1}\).

GLASSOFAST solves the \(\ell_1\)-regularized inverse covariance matrix estimation problem:

\begin{equation}
\underset{X \succ 0}{\arg\min}\;\Bigl\{ -\log\det X + \operatorname{tr}(SX) + \lambda \sum_{i,j=1}^{p} |x_{ij}| \Bigr\}
\end{equation}

where \(S\) is the empirical covariance matrix of the observations and \(x_{ij}\) denotes the \((i,j)\)-th element of matrix \(X\). The \(i\)-th column (and, by symmetry, the \(i\)-th row) of \(X\) is denoted by \(\mathbf{x}_i\).

This algorithm solves a generalized form of the optimization problem (1), in which the regularization term is \(\sum_{i,j}\lambda_{ij}|x_{ij}|\), allowing a different penalty weight for each matrix entry.
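To make problem (1) concrete, here is a minimal sketch using scikit-learn's GraphicalLasso estimator, which solves the same objective (it is an independent implementation, not GLASSOFAST itself); the dimensions, the true precision matrix, and the value alpha = 0.05 are arbitrary choices for illustration.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Simulate n observations from a zero-mean multivariate Gaussian
# with a known sparse precision matrix Theta.
p = 5
theta = np.eye(p)
theta[0, 1] = theta[1, 0] = 0.4     # one off-diagonal conditional dependency
sigma = np.linalg.inv(theta)        # true covariance Sigma = Theta^{-1}
X = rng.multivariate_normal(np.zeros(p), sigma, size=1000)

# Solve the l1-regularized problem (1); alpha plays the role of lambda.
model = GraphicalLasso(alpha=0.05).fit(X)
theta_hat = model.precision_        # estimate of Theta = Sigma^{-1}
print(np.round(theta_hat, 2))
```

With enough samples and a suitable \(\lambda\), the zero pattern of the estimated \(\Theta\) recovers the conditional independence structure of the Gaussian, which is the point of covariance selection.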

The algorithm is organized as nested loops:

  • The outermost loop is controlled by convergence on the whole matrix.
  • The second loop iterates through the rows of the matrix.
  • The two innermost loops solve a lasso problem for a given row using coordinate descent (see the sketch after this list).
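To make this loop structure concrete, below is a simplified Python sketch of the classic coordinate-descent graphical lasso. It mirrors the loop organization described above but is not the actual GLASSOFAST source; the function name, tolerances, and bookkeeping are illustrative choices, and the low-level optimizations that make GLASSOFAST fast are omitted.

```python
import numpy as np

def glasso(S, lam, max_outer=100, max_inner=100, tol=1e-4):
    """Simplified graphical lasso via coordinate descent (sketch)."""
    p = S.shape[0]
    W = S + lam * np.eye(p)              # working covariance estimate
    B = np.zeros((p, p))                 # lasso coefficients per column
    for _ in range(max_outer):           # loop 1: convergence on whole matrix
        W_old = W.copy()
        for j in range(p):               # loop 2: iterate over rows/columns
            idx = np.arange(p) != j
            W11 = W[np.ix_(idx, idx)]
            s12 = S[idx, j]
            beta = B[idx, j]
            for _ in range(max_inner):   # loops 3-4: lasso by coordinate descent
                beta_old = beta.copy()
                for k in range(p - 1):
                    # Partial residual excluding coordinate k, then soft-threshold.
                    r = s12[k] - W11[k] @ beta + W11[k, k] * beta[k]
                    beta[k] = np.sign(r) * max(abs(r) - lam, 0.0) / W11[k, k]
                if np.max(np.abs(beta - beta_old)) < tol:
                    break
            B[idx, j] = beta
            W[idx, j] = W[j, idx] = W11 @ beta
        if np.max(np.abs(W - W_old)) < tol:
            break
    # Recover the precision matrix Theta from W and the stored coefficients.
    Theta = np.zeros((p, p))
    for j in range(p):
        idx = np.arange(p) != j
        theta_jj = 1.0 / (W[j, j] - W[idx, j] @ B[idx, j])
        Theta[j, j] = theta_jj
        Theta[idx, j] = -theta_jj * B[idx, j]
    return Theta, W

if __name__ == "__main__":
    X = np.random.default_rng(1).normal(size=(200, 4))
    S = np.cov(X, rowvar=False)          # empirical covariance matrix
    Theta, W = glasso(S, lam=0.1)
    print(np.round(Theta, 2))
```

Warm-starting each row's coefficients from the previous outer iteration (the B matrix above) is what makes revisiting rows cheap; each lasso subproblem then typically needs only a few coordinate-descent sweeps.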

In the reported benchmarks, GLASSOFAST runs faster than QUIC, GLASSO 1.7, and GLASSO 1.3. The choice of regularization parameter can also have a large impact on performance.

References

  1. M. A. Sustik and B. Calderhead, "GLASSOFAST: An efficient GLASSO implementation," UTCS Technical Report TR-12-29, November 6, 2012.
  2. O. Banerjee, L. El Ghaoui and A. d'Aspremont, "Model Selection Through Sparse Maximum Likelihood Estimation for Multivariate Gaussian or Binary Data," Journal of Machine Learning Research, 9, pp. 485-516, March 2008.