Optimal Convergence Rates for Distributed Nystroem Approximation
Jian Li, Yong Liu, Weiping Wang; 24(141):1–39, 2023.
Abstract
Distributed kernel ridge regression (DKRR) has demonstrated significant potential in handling complex tasks. However, DKRR relies solely on local samples and may therefore fail to capture global characteristics of the data. Moreover, the existing optimal learning guarantees hold only in expectation and only when the target regression function lies exactly in the kernel space. In this study, we introduce distributed learning with globally-shared Nystroem centers (DNystroem), which exploits global information across the local clients. We analyze the statistical properties of DNystroem both in expectation and in probability, and obtain several state-of-the-art results with minimax optimal learning rates. Notably, the optimal convergence rates for DNystroem hold even in the non-attainable case, while the statistical results allow for more partitions and require fewer Nystroem centers. Finally, we conduct experiments on several real-world datasets to validate the effectiveness of the proposed algorithm, and the empirical results align with our theoretical findings.
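To make the idea concrete, below is a minimal sketch of distributed kernel ridge regression with globally-shared Nystroem centers, not the paper's exact estimator: each client solves a Nystroem-approximated KRR problem over the same shared centers, and the server averages the local coefficient vectors. The function names (`local_nystroem_krr`, `dnystroem_fit`), the RBF kernel choice, the uniform sampling of centers, the small diagonal jitter, and all parameter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and B.
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def local_nystroem_krr(X, y, Z, lam, gamma=1.0):
    # One client's Nystroem KRR over the shared centers Z (assumed form):
    #   min_a ||K_nm a - y||^2 / n + lam * a' K_mm a
    # with closed-form solution (K_mn K_nm + lam * n * K_mm)^{-1} K_mn y.
    n = X.shape[0]
    K_nm = rbf_kernel(X, Z, gamma)   # n x m cross-kernel
    K_mm = rbf_kernel(Z, Z, gamma)   # m x m center kernel
    A = K_nm.T @ K_nm + lam * n * K_mm
    b = K_nm.T @ y
    # Small jitter for numerical stability (an implementation choice).
    return np.linalg.solve(A + 1e-10 * np.eye(Z.shape[0]), b)

def dnystroem_fit(partitions, Z, lam, gamma=1.0):
    # Averaging local coefficients is well-defined precisely because every
    # client expands its estimator over the same globally-shared centers Z.
    alphas = [local_nystroem_krr(X, y, Z, lam, gamma) for X, y in partitions]
    return np.mean(alphas, axis=0)

def dnystroem_predict(X_test, Z, alpha_bar, gamma=1.0):
    return rbf_kernel(X_test, Z, gamma) @ alpha_bar

# Toy usage: 3 local clients, 40 shared centers drawn from the pooled data.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (600, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(600)
Z = X[rng.choice(600, size=40, replace=False)]       # shared Nystroem centers
partitions = [(X[i::3], y[i::3]) for i in range(3)]  # disjoint local samples
alpha_bar = dnystroem_fit(partitions, Z, lam=1e-3)
y_hat = dnystroem_predict(X, Z, alpha_bar)
print("train MSE:", np.mean((y_hat - y) ** 2))
```

The sketch highlights the key design point the abstract describes: because the Nystroem centers are shared globally rather than sampled per client, the averaged estimator incorporates information beyond each client's local sample.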