Cluster Based Heterogeneous Federated Foundation Model Adaptation and Fine-Tuning

5 citations · #862 of 3028 papers in AAAI 2025

Abstract

In recent years, the distributed training of foundation models (FMs) has seen a surge in popularity. In particular, federated learning enables collaborative model training among edge clients while safeguarding the privacy of their data. However, federated training of FMs across resource-constrained and highly heterogeneous edge devices encounters several challenges. These include the difficulty of deploying FMs on clients with limited computational resources and the high computation and communication costs associated with fine-tuning and collaborative training. To address these challenges, we propose FedCKMS, a Cluster-Aware Framework with Knowledge-Aware Model Search. Specifically, FedCKMS incorporates three key components. The first component is multi-factor heterogeneity-aware clustering, which groups clients based on both data distribution and resource limitations and selects an appropriate model for each cluster. The second component is knowledge-aware model architecture search, which enables each client to identify the optimal sub-model of the cluster model, facilitating adaptive deployment that accommodates highly heterogeneous computational resources across clients. The final component is cluster-aware knowledge transfer, which enables knowledge sharing between clusters and the server, addressing model heterogeneity and reducing communication overhead. Extensive experiments demonstrate that FedCKMS outperforms state-of-the-art baselines by 3-10% in accuracy.
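
The abstract describes the first component, multi-factor heterogeneity-aware clustering, only at a high level. The sketch below is a minimal illustration of that idea under assumed details not taken from the paper: each client is summarized by a normalized label histogram (as a data-distribution proxy) concatenated with normalized compute and memory budgets, and clients are grouped with plain k-means via scikit-learn. The names client_profile and cluster_clients, and the reference budgets, are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

def client_profile(label_hist, flops_budget, mem_budget,
                   flops_ref=1e10, mem_ref=3.2e10):
    """Build a multi-factor profile: data-distribution factor (normalized
    label histogram) plus resource factor (budgets scaled by fixed
    reference values so clients are comparable). Illustrative only."""
    dist = np.asarray(label_hist, dtype=float)
    dist = dist / dist.sum()
    resources = np.array([flops_budget / flops_ref, mem_budget / mem_ref])
    return np.concatenate([dist, resources])

def cluster_clients(profiles, num_clusters):
    """Group clients by their combined profiles; each resulting cluster
    could then be assigned a model sized to its members' resources."""
    km = KMeans(n_clusters=num_clusters, n_init=10, random_state=0)
    return km.fit_predict(np.stack(profiles))

# Example: six clients, three classes, heterogeneous label skew and budgets.
profiles = [
    client_profile([90, 5, 5],   flops_budget=1e9, mem_budget=2e9),
    client_profile([80, 10, 10], flops_budget=1e9, mem_budget=2e9),
    client_profile([10, 80, 10], flops_budget=8e9, mem_budget=16e9),
    client_profile([5, 90, 5],   flops_budget=8e9, mem_budget=16e9),
    client_profile([33, 33, 34], flops_budget=4e9, mem_budget=8e9),
    client_profile([30, 40, 30], flops_budget=4e9, mem_budget=8e9),
]
print(cluster_clients(profiles, num_clusters=3))
```

In this toy setup, clients with similar label skew and similar budgets land in the same cluster; FedCKMS additionally selects a cluster-level model and lets each client search for a sub-model of it, which the sketch does not attempt to reproduce.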

Citation History

Jan 27, 2026: 0 citations · Feb 7, 2026: 5 citations (+5)