Does the $\ell_1$-norm Learn a Sparse Graph under Laplacian Constrained Graphical Models?

Jiaxi Ying and 2 other authors


Abstract: We investigate the problem of learning a sparse graph under Laplacian constrained Gaussian graphical models. Our goal is to estimate the Laplacian constrained precision matrix via penalized maximum likelihood estimation. While the classical graphical lasso employs $\ell_1$-norm regularization to promote sparsity, we find that it fails to yield a sparse solution in this problem. Empirical evidence shows that the number of nonzero graph weights grows with the regularization parameter, and we prove that a sufficiently large regularization parameter leads to a complete graph, in which every pair of vertices is connected by an edge. To overcome this issue, we introduce a nonconvex sparsity penalty and propose a new estimator obtained by solving a sequence of weighted $\ell_1$-norm penalized sub-problems. We establish non-asymptotic guarantees on both the optimization error and the statistical error, showing that the proposed estimator recovers the true edges with high probability. We also develop a projected gradient descent algorithm that solves each sub-problem at a linear convergence rate. Finally, we extend the method to learn disconnected graphs by imposing an additional rank constraint, propose a numerical algorithm based on the alternating direction method of multipliers, and establish the convergence of its iterate sequence. Numerical experiments on synthetic and real-world datasets validate the effectiveness of the proposed method.
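To make the estimator described in the abstract concrete, below is a minimal sketch of one way a sequence of weighted $\ell_1$-norm sub-problems could be solved by projected gradient descent on the nonnegative edge weights. Everything here is an illustrative assumption rather than the paper's implementation: the function names (`estimate_laplacian`, `laplacian`), the MCP-style reweighting rule, the fixed step size, and the use of $\log\det(L + J)$ with $J = \frac{1}{p}\mathbf{1}\mathbf{1}^\top$ as a stand-in for the generalized determinant of the singular Laplacian.

```python
import numpy as np

def laplacian(w, p, iu):
    """Assemble L = D - W from the upper-triangular edge weights w >= 0."""
    W = np.zeros((p, p))
    W[iu] = w
    W += W.T
    return np.diag(W.sum(axis=1)) - W

def estimate_laplacian(S, lam=0.1, gamma=2.0, n_outer=10, n_inner=500, step=1e-3):
    """Hypothetical sketch: estimate a Laplacian-constrained precision
    matrix from the sample covariance S by solving a sequence of
    weighted l1-penalized sub-problems with projected gradient descent."""
    p = S.shape[0]
    iu = np.triu_indices(p, k=1)                 # edge index pairs (i, j), i < j
    J = np.ones((p, p)) / p                      # rank-one term: log det(L + J)
                                                 # stands in for the generalized
                                                 # determinant of the singular L
    w = np.ones(len(iu[0]))                      # start from a complete graph
    # linear part of the gradient: d tr(S L) / dw_ij = S_ii + S_jj - 2 S_ij
    c = S[iu[0], iu[0]] + S[iu[1], iu[1]] - 2.0 * S[iu]
    reg = lam * np.ones_like(w)                  # per-edge l1 penalty weights

    for _ in range(n_outer):                     # one weighted-l1 sub-problem each
        for _ in range(n_inner):                 # projected gradient descent
            M = np.linalg.inv(laplacian(w, p, iu) + J)
            grad = c - (M[iu[0], iu[0]] + M[iu[1], iu[1]] - 2.0 * M[iu]) + reg
            w = np.maximum(w - step * grad, 0.0) # project onto w >= 0
        # MCP-style reweighting (an assumption): edges with small weights keep
        # the full penalty while large weights are penalized less, pushing
        # small weights to exactly zero across outer iterations
        reg = np.maximum(lam - w / gamma, 0.0)
    return laplacian(w, p, iu)
```

As a usage sketch, given an $n \times p$ data matrix `X`, one would call `estimate_laplacian(np.cov(X, rowvar=False))`. The fixed step size and the connectivity assumption behind inverting $L + J$ are simplifications; a careful implementation would use a line search and guard against disconnection during the inner iterations.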

Submission history

From: Jiaxi Ying

[v1] Fri, 26 Jun 2020 12:06:10 UTC (4,897 KB)
[v2] Tue, 5 Sep 2023 17:14:54 UTC (5,122 KB)