FedSOL: Stabilized Orthogonal Learning with Proximal Restrictions in Federated Learning

arXiv:2308.12532 · CVPR 2024 · 16 citations
Abstract

Federated Learning (FL) aggregates locally trained models from individual clients to construct a global model. While FL enables learning a model with data privacy, it often suffers from significant performance degradation when clients have heterogeneous data distributions. This data heterogeneity causes the model to forget the global knowledge acquired from previously sampled clients after being trained on local datasets. Although the introduction of proximal objectives in local updates helps to preserve global knowledge, it can also hinder local learning by interfering with local objectives. To address this problem, we propose a novel method, Federated Stabilized Orthogonal Learning (FedSOL), which adopts an orthogonal learning strategy to balance the two conflicting objectives. FedSOL is designed to identify gradients of local objectives that are inherently orthogonal to directions affecting the proximal objective. Specifically, FedSOL targets parameter regions where learning on the local objective is minimally influenced by proximal weight perturbations. Our experiments demonstrate that FedSOL consistently achieves state-of-the-art performance across various scenarios.
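
The perturbation idea in the abstract can be made concrete: perturb the weights along the gradient of a proximal loss (distance to the global model), then take the local-objective gradient at that perturbed point, so the resulting update is insensitive to the proximal direction. Below is a minimal PyTorch sketch of one such local step; the function name fedsol_local_step, the hyperparameter rho, and the squared-L2 form of the proximal loss are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def fedsol_local_step(model, global_model, batch, optimizer, rho=0.1):
    """One FedSOL-style local update (illustrative sketch, not the paper's code).

    Perturbs the weights along the proximal-loss gradient (SAM-style),
    evaluates the local objective at the perturbed point, then updates
    the original weights with that gradient.
    """
    x, y = batch

    # Proximal loss: squared L2 distance to the global model's parameters
    # (an assumed form of the proximal objective).
    prox_loss = sum(
        ((p - g.detach()) ** 2).sum()
        for p, g in zip(model.parameters(), global_model.parameters())
    )
    prox_loss.backward()

    # Normalize the proximal gradient and move a distance rho along it.
    grad_norm = torch.sqrt(
        sum((p.grad ** 2).sum() for p in model.parameters() if p.grad is not None)
    )
    eps = []
    with torch.no_grad():
        for p in model.parameters():
            e = rho * p.grad / (grad_norm + 1e-12)
            p.add_(e)      # step to the perturbed point
            eps.append(e)
    model.zero_grad()

    # Local objective evaluated at the perturbed weights: its gradient is
    # an update direction that stays stable under the proximal perturbation.
    local_loss = F.cross_entropy(model(x), y)
    local_loss.backward()

    # Undo the perturbation, then apply the perturbed-point gradient.
    with torch.no_grad():
        for p, e in zip(model.parameters(), eps):
            p.sub_(e)
    optimizer.step()
    optimizer.zero_grad()
    return local_loss.item()
```

In a full federated round, each client would repeat this step over its local batches and send the resulting weights to the server for aggregation, as in standard FedAvg-style training.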
