Graphical LASSO Algorithm
GLASSOFAST is an efficient implementation of the Graphical LASSO (GLASSO) algorithm for solving the covariance selection problem.
Consider observations \(X_1, X_2, \dots, X_n\) from a multivariate Gaussian distribution \(X\sim \mathcal{N}(0,\Sigma)\). We are interested in estimating the precision matrix \(\Theta = \Sigma^{-1}\).
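To make the setup concrete, the following sketch draws observations from a multivariate Gaussian and forms the empirical covariance matrix \(S\), which is the input to the estimation problem below. The dimension, sample size, and ground-truth covariance are illustrative choices, not taken from the original text.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 4    # dimension (illustrative)
n = 500  # number of observations (illustrative)

# A simple ground-truth covariance: identity plus one correlated pair.
Sigma = np.eye(p)
Sigma[0, 1] = Sigma[1, 0] = 0.5

# Draw observations X_1, ..., X_n ~ N(0, Sigma).
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# Empirical covariance matrix S (mean is known to be zero here).
S = X.T @ X / n
```

The matrix \(S\) plays the role of \(S\) in the objective function stated next.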
GLASSOFAST solves the \(\ell_1\)-regularized inverse covariance matrix estimation problem:
\begin{equation}
\underset{X \succ 0}{\arg\min}\;\Big\{-\log\det X + \operatorname{tr}(SX) + \lambda \sum_{i,j=1}^{p} |x_{ij}|\Big\}
\end{equation}
where \(x_{ij}\) denotes the \((i,j)\)-th element of the matrix \(X\), and \(\textbf{x}_i\) denotes the \(i\)th column (equivalently, by symmetry, the \(i\)th row) of \(X\).
This algorithm solves a generalized form of optimization problem (1), in which the regularization term is \(\sum_{i,j}\lambda_{ij}|x_{ij}|\), i.e. each element of \(X\) may have its own penalty \(\lambda_{ij}\).
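The generalized objective can be evaluated directly, which is useful for checking a candidate solution. This is a minimal sketch assuming NumPy; the function name is ours, and `Lam` is the matrix of element-wise penalties \(\lambda_{ij}\) (a constant matrix recovers the scalar-\(\lambda\) problem):

```python
import numpy as np

def glasso_objective(X, S, Lam):
    """Generalized l1-regularized objective:
    -log det X + tr(S X) + sum_ij Lam_ij * |x_ij|.
    X: candidate precision matrix (symmetric positive definite),
    S: empirical covariance, Lam: matrix of penalties lambda_ij."""
    sign, logdet = np.linalg.slogdet(X)  # numerically stable log-determinant
    assert sign > 0, "X must be positive definite"
    return -logdet + np.trace(S @ X) + np.sum(Lam * np.abs(X))
```

For example, with \(X = S = I_p\) and zero penalties the objective reduces to \(\operatorname{tr}(I) = p\).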
The algorithm is organized as nested loops:
- The outermost loop runs until convergence over the whole matrix.
- The second loop iterates over the rows of the matrix.
- The two innermost loops solve a lasso problem for the given row using coordinate descent.
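The innermost loops above amount to a coordinate-descent lasso solver. Below is a minimal sketch of that technique, not the actual GLASSOFAST code: it solves a quadratic lasso subproblem of the form \(\min_x \tfrac{1}{2}x^\top A x - b^\top x + \lambda\|x\|_1\) by cycling over coordinates and applying soft-thresholding (the function names and stopping rule are our own choices).

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator: closed-form solution of the
    # one-dimensional lasso subproblem.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(A, b, lam, n_iter=100, tol=1e-8):
    """Coordinate descent for min_x 0.5 x^T A x - b^T x + lam*||x||_1.
    A must be symmetric positive definite."""
    p = len(b)
    x = np.zeros(p)
    for _ in range(n_iter):
        x_old = x.copy()
        for j in range(p):
            # Partial residual excluding coordinate j.
            r = b[j] - A[j] @ x + A[j, j] * x[j]
            x[j] = soft_threshold(r, lam) / A[j, j]
        if np.max(np.abs(x - x_old)) < tol:  # converged on this subproblem
            break
    return x
```

With \(A = I\) each coordinate update is an exact soft-thresholding of \(b_j\), e.g. `lasso_cd(np.eye(2), np.array([3.0, -0.5]), 1.0)` yields `[2.0, 0.0]`.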
GLASSOFAST is faster than QUIC, GLASSO 1.7, and GLASSO 1.3. The choice of regularization parameter can also have a large impact on running time.
References
- M. A. Sustik and B. Calderhead, "GLASSOFAST: An efficient GLASSO implementation," UTCS Technical Report TR-12-29, November 6, 2012.
- O. Banerjee, L. El Ghaoui and A. d'Aspremont, "Model Selection Through Sparse Maximum Likelihood Estimation for Multivariate Gaussian or Binary Data," Journal of Machine Learning Research, 9, pp. 485-516, March 2008.