An efficient algorithm for sparse inverse covariance matrix estimation based on dual formulation

Li, Peili, Xiao, Yunhai
Computational Statistics & Data Analysis, 2018, v. 128, pp. 292-307
algorithms, data collection, models, multivariate analysis, system optimization, variance covariance matrix
Estimating a large and sparse inverse covariance matrix plays a fundamental role in modern multivariate analysis, because the zero entries capture the conditional independence between pairs of variables given all other variables. This estimation task can be realized by penalizing the maximum likelihood estimation with an adaptive group lasso penalty imposed directly on the elements of the inverse, which allows the estimate to have a blockwise sparse structure that is particularly useful in some applications. In this paper, we are particularly interested in the implementation of optimization algorithms for minimizing a class of log-determinant models. The considered minimization model, on the one hand, contains a large number of popular sparse models as special cases; on the other hand, it poses additional challenges, especially in high-dimensional settings. Instead of targeting the challenging optimization problem directly, we employ the symmetric Gauss–Seidel (sGS) iteration based alternating direction method of multipliers (ADMM) to tackle the 3-block nonsmooth dual program. By choosing an appropriate proximal term, it is shown that the implemented sGS-ADMM is equivalent to the 2-block ADMM, so its convergence follows directly from existing theoretical results. Numerical experiments on synthetic and real data sets, including performance comparisons with the directly extended ADMM, demonstrate that the implemented algorithm is effective in estimating large and sparse inverse covariance matrices.
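The paper's sGS-ADMM on the 3-block dual is not reproduced here; as a point of reference, the sketch below shows the standard 2-block ADMM for the simplest special case of the log-determinant model, the l1-penalized (graphical lasso) problem min_X -logdet(X) + tr(SX) + lam*||X||_1. The function name, parameters, and stopping rule are illustrative choices, not the authors' implementation.

```python
import numpy as np

def graphical_lasso_admm(S, lam, rho=1.0, n_iter=500):
    """2-block ADMM sketch for min_X -logdet(X) + tr(S X) + lam*||X||_1.

    S   : empirical covariance matrix (symmetric, p x p)
    lam : l1 penalty weight (applied to all entries here, for simplicity)
    rho : ADMM penalty parameter
    """
    p = S.shape[0]
    Z = np.eye(p)            # sparse copy of the estimate
    U = np.zeros((p, p))     # scaled dual variable
    for _ in range(n_iter):
        # X-update: the stationarity condition -X^{-1} + S + rho*(X - Z + U) = 0
        # is solved in closed form via an eigendecomposition.
        w, Q = np.linalg.eigh(rho * (Z - U) - S)
        x = (w + np.sqrt(w ** 2 + 4.0 * rho)) / (2.0 * rho)
        X = (Q * x) @ Q.T
        # Z-update: elementwise soft-thresholding (prox of the l1 norm).
        A = X + U
        Z = np.sign(A) * np.maximum(np.abs(A) - lam / rho, 0.0)
        # Dual update.
        U += X - Z
    return Z
```

With a very small `lam`, the estimate should approach the unpenalized maximum likelihood solution, i.e. the inverse of S; larger values of `lam` zero out off-diagonal entries and produce the sparse structure the abstract describes.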