FedSoL: Unifying Global Alignment and Local Generality in Federated Learning (arXiv:2308.12532v1 [cs.LG])

Federated Learning (FL) aggregates locally trained models from individual clients into a global model, allowing models to be trained while preserving data privacy. However, FL often suffers from performance degradation when client data distributions are heterogeneous. Previous FL algorithms address this by adding proximal restrictions that encourage global alignment, but these restrictions constrain local learning and interfere with the original local objectives. Another line of work improves the generality of local learning by seeking local models within a smooth loss landscape; however, it does not guarantee stable global alignment because it does not consider the global objective. In this study, we propose Federated Stability on Learning (FedSoL), which combines global alignment and local generality. In FedSoL, local learning seeks a parameter region that is robust against proximal perturbations. This strategy induces an implicit proximal-restriction effect while preserving the original local objective for parameter updates. Our experiments show that FedSoL consistently achieves state-of-the-art performance across various scenarios.
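The core idea, local updates that are robust to proximal perturbations, can be sketched in a few lines. The sketch below is an illustrative assumption, not the paper's implementation: it perturbs the weights in the direction that increases a proximal term (the distance to the global model), then applies the gradient of the unmodified local objective evaluated at the perturbed point. The function names, toy quadratic objective, and hyperparameters are all hypothetical.

```python
import numpy as np

def fedsol_local_step(w, w_global, local_grad_fn, rho=0.05, lr=0.1):
    """One simplified FedSoL-style local update (sketch).

    Rather than adding a proximal penalty to the local loss, perturb
    the weights toward higher proximal loss 0.5 * ||w - w_global||^2
    (a SAM-style ascent step), then take the local-objective gradient
    at that perturbed point and apply it to the original weights.
    """
    # Gradient of the proximal term with respect to w
    prox_grad = w - w_global
    norm = np.linalg.norm(prox_grad) + 1e-12  # avoid division by zero
    # Perturbation in the direction of increasing proximal loss
    w_perturbed = w + rho * prox_grad / norm
    # Local objective gradient evaluated at the perturbed weights
    g = local_grad_fn(w_perturbed)
    # Update the ORIGINAL weights, keeping the local objective intact
    return w - lr * g

# Hypothetical toy local objective: quadratic with minimum at w_local_opt
w_global = np.zeros(3)
w_local_opt = np.array([1.0, -1.0, 0.5])
grad_fn = lambda w: w - w_local_opt  # gradient of 0.5*||w - w_local_opt||^2

w = w_global.copy()
for _ in range(100):
    w = fedsol_local_step(w, w_global, grad_fn)
```

Because the perturbation follows the proximal gradient while the update follows the local gradient, the iterate settles near the local optimum but is biased slightly back toward the global model, which is the implicit alignment effect described above.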
by instadatahelp | Aug 27, 2023 | AI Blogs