Authors
Chenhan Zhang, Shiyao Zhang, Shui Yu, and James J.Q. Yu*
Published in
Proc. IEEE Wireless Communications and Networking Conference, Austin, TX, USA, April 2022
Abstract
Existing Federated Learning (FL) systems incur substantial communication overhead when employing GNN-based models for traffic forecasting, since these models commonly contain an enormous number of parameters that must be transmitted within the FL system. In this paper, we propose an FL framework, namely Clustering-based hierarchical and Two-step-optimized FL (CTFL), to overcome this practical problem. CTFL employs a divide-and-conquer strategy, clustering clients based on the closeness of their local model parameters. Furthermore, we incorporate the particle swarm optimization algorithm into CTFL, which adopts a two-step strategy for optimizing local models. This technique allows only one representative local model update from each cluster to be uploaded to the central server, thus reducing the communication overhead associated with model update transmission in FL. Comprehensive case studies on two real-world datasets and two state-of-the-art GNN-based models demonstrate the proposed framework's outstanding training efficiency and prediction accuracy; the hyperparameter sensitivity of CTFL is also investigated.
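The following is a minimal, illustrative sketch of the cluster-then-upload idea summarized above: clients are grouped by the closeness of their local model parameters, and only one representative update per cluster is transmitted to the server. All names and values (num_clients, param_dim, n_clusters, the KMeans-based grouping, and the FedAvg-style weighted aggregation) are assumptions for illustration, not the paper's exact procedure; in particular, the two-step particle swarm optimization of local models is omitted here.

```python
# Hypothetical sketch of cluster-based representative uploading in one FL round.
# Not the CTFL implementation; simulated in a single process for brevity.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

num_clients = 20   # assumed number of FL clients
param_dim = 128    # assumed flattened size of a GNN model update
n_clusters = 4     # assumed number of client clusters

# Stand-in for each client's locally optimized model update (flattened parameters).
local_updates = rng.normal(size=(num_clients, param_dim))

# Step 1: cluster clients by the closeness of their local model parameters.
kmeans = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(local_updates)

# Step 2: from each cluster, pick the client whose update lies closest to the
# cluster centroid; only these representatives transmit their updates.
representatives = []
for c in range(n_clusters):
    members = np.where(kmeans.labels_ == c)[0]
    dists = np.linalg.norm(
        local_updates[members] - kmeans.cluster_centers_[c], axis=1
    )
    representatives.append(members[np.argmin(dists)])

# Step 3: the server aggregates only the representative updates, weighting each
# by its cluster size (a FedAvg-style aggregation; details differ in the paper).
cluster_sizes = np.bincount(kmeans.labels_, minlength=n_clusters)
weights = cluster_sizes / cluster_sizes.sum()
global_update = np.average(local_updates[representatives], axis=0, weights=weights)

print("representative clients:", representatives)
print("global update shape:", global_update.shape)
```

In this toy setup, communication per round drops from num_clients model updates to n_clusters updates; in a real deployment the clustering and representative selection would be driven by information exchanged with the server rather than computed centrally on raw client parameters.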