MARS: A Second-Order Reduction Algorithm for High-Dimensional Sparse Precision Matrices Estimation

Qian Li, Binyan Jiang, Defeng Sun; 24(134):1−44, 2023.

Abstract

The estimation of precision matrices, also known as inverse covariance matrices, is crucial in statistical data analysis and machine learning. However, computing these matrices becomes extremely challenging when dealing with high-dimensional data, as the number of parameters increases quadratically with the dimension. In this paper, we propose an innovative algorithm called MARS, which uses a second-order approach to solve subproblems and generate a solution path for precision matrix estimation under the $\ell_1$ penalized D-trace loss. Each iteration of our algorithm significantly reduces the number of variables based on the Karush-Kuhn-Tucker (KKT) conditions and the sparse structure of the estimated precision matrix from the previous iteration. This allows MARS to handle datasets with very high dimensions that surpass the capabilities of existing methods. Additionally, we develop a semismooth Newton augmented Lagrangian algorithm with a global linear convergence rate to solve the subproblems more efficiently. Theoretical properties of our algorithm have been established, including its asymptotically superlinear convergence rate. We demonstrate the high efficiency and promising performance of MARS through extensive simulation studies and real data applications, comparing it to several state-of-the-art solvers.
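The objective underlying MARS is the ℓ1-penalized D-trace loss mentioned above. As a minimal sketch (not the paper's second-order MARS algorithm itself, which uses a semismooth Newton augmented Lagrangian method), the snippet below evaluates this objective and performs a simple proximal-gradient step on it; the function names and the choice to leave the diagonal unpenalized are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def dtrace_loss(Theta, S, lam):
    """Penalized D-trace loss: 0.5*tr(Theta S Theta) - tr(Theta)
    plus lam * ||Theta||_1 over the off-diagonal entries (illustrative convention)."""
    smooth = 0.5 * np.trace(Theta @ S @ Theta) - np.trace(Theta)
    penalty = lam * (np.abs(Theta).sum() - np.abs(np.diag(Theta)).sum())
    return smooth + penalty

def soft_threshold(X, t):
    # Elementwise soft-thresholding: the proximal operator of t * ||.||_1.
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def prox_gradient_step(Theta, S, lam, step):
    """One proximal-gradient step on the penalized D-trace loss
    (a first-order stand-in for the second-order subproblem solves in MARS)."""
    # Gradient of the smooth part: 0.5*(S Theta + Theta S) - I.
    grad = 0.5 * (S @ Theta + Theta @ S) - np.eye(S.shape[0])
    Z = Theta - step * grad
    Theta_new = soft_threshold(Z, step * lam)
    # Keep the diagonal unpenalized (assumed convention).
    np.fill_diagonal(Theta_new, np.diag(Z))
    return Theta_new
```

In this sketch the soft-thresholding step is what produces exact zeros in the iterates; the KKT-based reduction in MARS exploits precisely this sparsity to shrink the set of active variables at each iteration of the solution path.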
