Smart City Gnosys

Smart city article details

Title Quantization and Knowledge Distillation for Efficient Federated Learning on Edge Devices
ID_Doc 43973
Authors Qu X.; Wang J.; Xiao J.
Year 2020
Published Proceedings - 2020 IEEE 22nd International Conference on High Performance Computing and Communications, IEEE 18th International Conference on Smart City and IEEE 6th International Conference on Data Science and Systems, HPCC-SmartCity-DSS 2020
DOI http://dx.doi.org/10.1109/HPCC-SmartCity-DSS50907.2020.00129
Abstract Federated learning enables distributed machine learning over decentralized data on edge devices. Because communication is a critical bottleneck for federated learning, we apply model compression techniques to make it more efficient. First, we propose an adaptive quantized federated averaging algorithm that reduces communication cost by dynamically quantizing neural network weights. Then, we design a federated knowledge distillation method that produces high-quality small models from limited labeled data. Adaptive quantized federated learning can significantly speed up model training while retaining model accuracy. With only a small fraction of the data labeled, our federated knowledge distillation reaches the accuracy achieved by supervised learning on the entire labeled data set. © 2020 IEEE.
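The abstract's first contribution, quantizing client weights before communication and averaging them at the server, can be illustrated with a generic unbiased stochastic quantizer. This is a minimal sketch assuming a uniform quantization grid and plain FedAvg aggregation; the paper's exact adaptive bit-width policy is not specified in the abstract, so the `num_bits` choice here is left to the caller.

```python
import numpy as np

def quantize(weights, num_bits=8):
    """Stochastically quantize a weight vector onto a uniform grid.

    Returns the integer codes plus the (lo, step) pair needed to
    dequantize. Stochastic rounding keeps the quantizer unbiased,
    a common choice for communication-efficient federated learning.
    """
    lo, hi = float(weights.min()), float(weights.max())
    levels = 2 ** num_bits - 1
    step = (hi - lo) / levels if hi > lo else 1.0
    scaled = (weights - lo) / step
    floor = np.floor(scaled)
    # round up with probability equal to the fractional part (unbiased)
    prob = scaled - floor
    q = floor + (np.random.rand(*weights.shape) < prob)
    return q.astype(np.uint8), lo, step

def dequantize(q, lo, step):
    """Map integer codes back to approximate float weights."""
    return lo + q.astype(np.float64) * step

def federated_average(client_updates):
    """Plain FedAvg over dequantized client updates.

    client_updates: list of (codes, lo, step) tuples, one per client.
    """
    return np.mean([dequantize(*u) for u in client_updates], axis=0)
```

With 8-bit codes each transmitted weight shrinks from 32 bits to 8, a 4x communication saving, while the per-weight reconstruction error stays below one quantization step.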
Author Keywords Federated Edge Intelligence; Federated Learning; Knowledge Distillation; Quantization
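The second contribution, distilling a large model's knowledge into small client models with limited labels, follows the standard soft-target formulation. The sketch below shows a Hinton-style distillation loss blending a temperature-scaled KL term against teacher outputs with a hard-label cross-entropy term; the temperature `T` and mixing weight `alpha` are illustrative choices, not values taken from the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax, computed stably."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Soft-target distillation loss plus hard-label cross-entropy.

    alpha weights the KL term against the teacher's softened outputs;
    the T**2 factor keeps soft-target gradients comparable in scale
    as the temperature changes (standard in distillation).
    """
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    kl = np.sum(
        p_teacher * (np.log(p_teacher + 1e-12) - np.log(p_student + 1e-12)),
        axis=-1,
    ).mean()
    ce = -np.log(
        softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12
    ).mean()
    return alpha * (T ** 2) * kl + (1 - alpha) * ce
```

In a federated setting the teacher logits would come from the aggregated global model on unlabeled local data, so only the small fraction of labeled samples contributes to the cross-entropy term, matching the abstract's limited-label scenario.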


Similar Articles


Id | Similarity | Authors | Title | Published
58065 | 0.87 | Zaman S.; Talukder S.; Hossain M.Z.; Puppala S.M.T.; Imteaj A. | Towards Communication-Efficient Federated Learning Through Particle Swarm Optimization and Knowledge Distillation | Proceedings - 2024 IEEE 48th Annual Computers, Software, and Applications Conference, COMPSAC 2024 (2024)
26330 | 0.855 | Janaki G.; Umanandhini D. | Federated Learning Approaches for Decentralized Data Processing in Edge Computing | Proceedings of the 5th International Conference on Smart Electronics and Communication, ICOSEC 2024 (2024)