Lecturer
Department of Computer Science
University of York
CSE/139, YO10 5GH, UK
Authors
Yuanshao Zhu, Christos Markos, Ruihui Zhao, Yefeng Zheng, and James J.Q. Yu*
Publication
Proc. International Joint Conference on Neural Networks, Shenzhen, China, July 2021
Abstract
Federated Learning (FL) is a privacy-oriented framework that allows distributed edge devices to jointly train a shared global model without transmitting their sensed data to centralized servers. FL aims to balance the naturally conflicting objectives of collecting massive amounts of data and protecting sensitive information. However, the data stored locally on each edge device are typically not independent and identically distributed (non-IID). Such data heterogeneity poses a severe statistical challenge to the optimization and convergence of the global model. In response, we propose Federated One-vs-All (FedOVA), an efficient FL algorithm that first decomposes a multi-class classification problem into simpler binary classification problems and then combines their respective outputs using ensemble learning. Experiments on several public datasets show that FedOVA achieves higher accuracy and faster convergence than federated averaging and data-sharing baselines. Furthermore, our approach scales to practical FL settings with large numbers of clients (up to 1000).
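For intuition, below is a minimal sketch of the one-vs-all decomposition and ensemble step described in the abstract, shown centrally with scikit-learn on a toy dataset. The federated training of each binary classifier (the core of FedOVA) is omitted, and all identifiers are illustrative rather than taken from the paper's code.

    # Sketch of one-vs-all (OvA) decomposition plus ensemble combination.
    # NOTE: this trains the binary classifiers centrally; FedOVA instead
    # trains them under a federated learning protocol, omitted here.
    import numpy as np
    from sklearn.datasets import load_digits
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    classes = np.unique(y_train)
    binary_models = []
    for c in classes:
        # Recast the multi-class task as "class c vs. the rest".
        clf = LogisticRegression(max_iter=1000)
        clf.fit(X_train, (y_train == c).astype(int))
        binary_models.append(clf)

    # Ensemble step: each binary model scores its own class; predict the
    # class whose one-vs-all model is most confident.
    scores = np.column_stack(
        [m.predict_proba(X_test)[:, 1] for m in binary_models]
    )
    y_pred = classes[np.argmax(scores, axis=1)]
    print("OvA ensemble accuracy:", (y_pred == y_test).mean())

The decomposition makes each per-class subproblem binary, so clients with only a subset of classes can still contribute useful updates to the corresponding binary models.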