Interaction-Aware Gaussian Weighting for Clustered Federated Learning

3 citations · #1367 of 3340 papers in ICML 2025

Abstract

Federated Learning (FL) has emerged as a decentralized paradigm for training models while preserving privacy. However, conventional FL struggles with data heterogeneity and class imbalance, which degrade model performance. Clustered FL balances personalization and decentralized training by grouping clients with analogous data distributions, enabling improved accuracy while adhering to privacy constraints. This approach effectively mitigates the adverse impact of heterogeneity in FL. In this work, we propose a novel clustering method for FL, FedGWC (Federated Gaussian Weighting Clustering), which groups clients based on their data distribution, allowing a more robust and personalized model to be trained on each identified cluster. FedGWC identifies homogeneous clusters by transforming individual empirical losses to model client interactions with a Gaussian reward mechanism. Additionally, we introduce the Wasserstein Adjusted Score, a new clustering metric for FL that evaluates cluster cohesion with respect to the individual class distributions. Our experiments on benchmark datasets show that FedGWC outperforms existing FL algorithms in cluster quality and classification accuracy, validating the efficacy of our approach.
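
The abstract does not spell out FedGWC's equations, so the following is a minimal illustrative sketch of the general idea it describes: per-client empirical losses are turned into pairwise "rewards" with a Gaussian kernel, and clients with similar loss behaviour are grouped together. The bandwidth, the loss-trajectory distance, and the average-linkage clustering step are all assumptions made for this sketch, not the paper's actual method.

```python
# Illustrative sketch only: the Gaussian reward, interaction matrix, and
# clustering step below are assumptions mirroring the abstract, not FedGWC.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform


def gaussian_interaction_matrix(client_losses, bandwidth=1.0):
    """Map per-client empirical loss histories to pairwise Gaussian rewards.

    client_losses: array of shape (n_clients, n_rounds).
    Returns an (n_clients, n_clients) similarity matrix with values in [0, 1].
    """
    losses = np.asarray(client_losses, dtype=float)
    # Pairwise squared distances between loss trajectories.
    diffs = losses[:, None, :] - losses[None, :, :]
    sq_dist = (diffs ** 2).sum(axis=-1)
    # Gaussian kernel: similar loss behaviour yields a reward close to 1.
    return np.exp(-sq_dist / (2.0 * bandwidth ** 2))


def cluster_clients(interaction, n_clusters=2):
    """Group clients from the interaction matrix via average-linkage clustering."""
    distance = 1.0 - interaction              # similarity -> distance
    np.fill_diagonal(distance, 0.0)
    condensed = squareform(distance, checks=False)
    tree = linkage(condensed, method="average")
    return fcluster(tree, t=n_clusters, criterion="maxclust")


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two synthetic groups of clients with different loss scales (heterogeneity).
    group_a = rng.normal(0.4, 0.05, size=(5, 10))
    group_b = rng.normal(1.2, 0.05, size=(5, 10))
    losses = np.vstack([group_a, group_b])
    W = gaussian_interaction_matrix(losses, bandwidth=0.5)
    print(cluster_clients(W, n_clusters=2))   # e.g. [1 1 1 1 1 2 2 2 2 2]
```

Similarly, the exact definition of the Wasserstein Adjusted Score is not given here; the snippet below only sketches the underlying idea of scoring cluster cohesion against class distributions, by averaging the 1-D Wasserstein distance between each client's class histogram and its cluster's mean histogram (lower means more cohesive). The function name and aggregation are hypothetical.

```python
# Hedged sketch of a Wasserstein-based cohesion score, not the paper's metric.
import numpy as np
from scipy.stats import wasserstein_distance


def cluster_cohesion(class_histograms, labels):
    """class_histograms: (n_clients, n_classes) label counts per client."""
    hists = np.asarray(class_histograms, dtype=float)
    hists = hists / hists.sum(axis=1, keepdims=True)   # normalise per client
    classes = np.arange(hists.shape[1])
    scores = []
    for c in np.unique(labels):
        members = hists[labels == c]
        centre = members.mean(axis=0)
        scores.extend(
            wasserstein_distance(classes, classes, u_weights=h, v_weights=centre)
            for h in members
        )
    return float(np.mean(scores))
```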

Citation History

Jan 28, 2026: 0 citations
Feb 13, 2026: 3 citations