FedHARM: Harmonizing Model Architectural Diversity in Federated Learning
Anestis Kastellos*, Athanasios Psaltis, Charalampos Z Patrikakis, Petros Daras
Abstract
"In the domain of Federated Learning (FL), the issue of managing variability in model architectures surpasses a mere technical barrier, representing a crucial aspect of the field’s evolution, especially considering the ever-increasing number of model architectures emerging in the literature. This focus on architecture variability emerges from the unique nature of FL, where diverse devices or participants, each with their own data and computational constraints, collaboratively train a shared model. The proposed FL system architecture facilitates the deployment of diverse convolutional neural network (CNN) architectures across distinct clients, while outperforming the state-of-the-art FL methodologies. F edHARM 1 capitalizes on the strengths of different architectures while limiting their weaknesses by converging each local client on a shared dataset to achieve superior performance on the test set. 1 Code: https://github.com/Kastellos/FedHARM"