Distributed Active Client Selection With Noisy Clients Using Model Association Scores

Kwang In Kim

Abstract


Active client selection (ACS) strategically identifies clients for model updates during each training round of federated learning. In scenarios with limited communication resources, ACS is a superior alternative to random client selection, significantly improving the convergence rate. However, existing ACS methods struggle with clients that provide noisy updates, such as those trained on noisy labels. To address this challenge, we present a new ACS algorithm for scenarios with unknown noisy clients. Our algorithm constructs a client sampling distribution based on the global association among model updates, which quantifies how well a client's model update aligns with those from other clients. By leveraging these associations, we efficiently identify and mitigate the impact of clients whose substantial noise could disrupt training. The approach is simple, computationally efficient, and requires no hyperparameter tuning. Experiments on six benchmark datasets show that conventional ACS methods fail to outperform random selection, whereas our approach significantly accelerates convergence under the same communication budget.
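To make the core idea concrete, the following is a minimal sketch of an association-based sampling distribution. The abstract does not specify the paper's actual association score, so this sketch assumes a simple instantiation: cosine similarity between flattened client updates, averaged over all other clients, with negatively associated (likely noisy) clients clipped to zero weight. The function names `association_scores` and `sampling_distribution` are illustrative, not from the paper.

```python
import numpy as np

def association_scores(updates: np.ndarray) -> np.ndarray:
    """Mean cosine similarity of each client's update with all other clients.

    updates: array of shape (n_clients, d), one flattened model update per row.
    Returns one score per client; high scores mean the update agrees with peers.
    """
    norms = np.linalg.norm(updates, axis=1, keepdims=True)
    unit = updates / (norms + 1e-12)          # normalize rows to unit length
    sim = unit @ unit.T                        # pairwise cosine similarities
    np.fill_diagonal(sim, 0.0)                 # exclude self-similarity
    return sim.sum(axis=1) / (len(updates) - 1)

def sampling_distribution(scores: np.ndarray) -> np.ndarray:
    """Turn association scores into a client sampling distribution."""
    weights = np.clip(scores, 0.0, None)       # negative association -> weight 0
    if weights.sum() == 0.0:                   # degenerate case: fall back to uniform
        weights = np.ones_like(weights)
    return weights / weights.sum()
```

Under this assumption, a client whose update points away from the consensus direction (for example, one trained on heavily corrupted labels) receives near-zero sampling probability, while well-aligned clients share the remaining mass; no hyperparameters are required beyond the similarity measure itself.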
