An Inexact Augmented Lagrangian Algorithm for Training Leaky ReLU Neural Network with Group Sparsity
Authors: Wei Liu, Xin Liu, Xiaojun Chen; Journal: 24(212):1−43, 2023.
Abstract
The use of leaky ReLU networks with a group sparse regularization term has become widespread in recent years. However, training such networks yields a nonsmooth, nonconvex optimization problem, and deterministic approaches for computing a stationary point are lacking. In this paper, we address this issue by introducing auxiliary variables and additional constraints to resolve the multi-layer composite term in the original optimization problem. We show that the new model has a nonempty and bounded solution set and that its feasible set satisfies the Mangasarian-Fromovitz constraint qualification. Furthermore, we establish the relationship between the new model and the original problem. Notably, we propose an inexact augmented Lagrangian algorithm for solving the new model and prove that the algorithm converges to a KKT point. Numerical experiments show that our algorithm trains sparse leaky ReLU neural networks more efficiently than several well-known algorithms.
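To make the outer/inner structure concrete, the following is a minimal sketch of a generic inexact augmented Lagrangian loop, not the paper's algorithm or its constrained reformulation. It solves a toy equality-constrained problem (minimize ||x||^2 subject to x1 + x2 = 1); the subproblem solver, step size, and tolerance schedule are illustrative assumptions:

```python
import numpy as np

def f_grad(x):
    return 2.0 * x                      # gradient of f(x) = ||x||^2

def c(x):
    return x[0] + x[1] - 1.0            # equality constraint c(x) = 0

c_grad = np.array([1.0, 1.0])           # gradient of c (constant here)

def inexact_al(rho=10.0, outer=20, step=0.05):
    """Inexact augmented Lagrangian loop on the toy problem (illustrative only)."""
    x, lam = np.zeros(2), 0.0
    for k in range(outer):
        tol = max(0.5 ** k, 1e-8)       # inner accuracy tightens each outer round
        for _ in range(5000):           # inexact subproblem solve: gradient descent
            g = f_grad(x) + (lam + rho * c(x)) * c_grad
            if np.linalg.norm(g) <= tol:
                break
            x = x - step * g
        lam += rho * c(x)               # first-order multiplier update
    return x, lam

x, lam = inexact_al()                   # converges to x = (0.5, 0.5), lam = -1
```

The defining feature mirrored here is that each subproblem is solved only to a tolerance that shrinks across outer iterations, rather than exactly, before the multiplier update.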