Smart City Gnosys
Smart city article details
| Title | Quantization and Knowledge Distillation for Efficient Federated Learning on Edge Devices |
|---|---|
| ID_Doc | 43973 |
| Authors | Qu X.; Wang J.; Xiao J. |
| Year | 2020 |
| Published | Proceedings - 2020 IEEE 22nd International Conference on High Performance Computing and Communications, IEEE 18th International Conference on Smart City and IEEE 6th International Conference on Data Science and Systems, HPCC-SmartCity-DSS 2020 |
| DOI | http://dx.doi.org/10.1109/HPCC-SmartCity-DSS50907.2020.00129 |
| Abstract | Federated learning enables distributed machine learning for decentralized data on edge devices. As communication is a critical bottleneck for federated learning, we utilize model compression techniques for efficient federated learning. First, we propose an adaptive quantized federated average algorithm to reduce the communication cost by dynamically quantizing neural networks' weights. Then, we design a federated knowledge distillation method to achieve high-quality small models with limited labeled data. Adaptive quantized federated learning can significantly speed up model training while retaining model accuracy. With a small fraction of data as labeled data, our federated knowledge distillation can reach a fixed accuracy achieved by supervised learning with the entire labeled data set. © 2020 IEEE. |
| Author Keywords | Federated Edge Intelligence; Federated Learning; Knowledge Distillation; Quantization |
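
The abstract describes quantizing neural-network weights before communication and averaging them on the server. The paper's exact adaptive scheme is not given here, so the following is a minimal sketch of quantized federated averaging using uniform symmetric quantization, with the bit width as a free parameter; all function names (`quantize`, `dequantize`, `federated_average`) are illustrative, not the authors' API.

```python
import numpy as np

def quantize(w, num_bits):
    """Uniform symmetric quantization of a weight array to num_bits.

    A sketch, not the paper's exact scheme: weights are scaled so the
    largest magnitude maps to the integer range [-(2^(b-1)-1), 2^(b-1)-1].
    """
    qmax = 2 ** (num_bits - 1) - 1
    scale = np.max(np.abs(w)) / qmax if np.any(w) else 1.0
    q = np.round(w / scale).astype(np.int32)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from integers and a scale factor."""
    return q.astype(np.float64) * scale

def federated_average(client_weights, num_bits):
    """Each client sends quantized weights; the server dequantizes and averages."""
    decoded = [dequantize(*quantize(w, num_bits)) for w in client_weights]
    return np.mean(decoded, axis=0)
```

One plausible reading of "dynamically quantizing" is that the bit width passed to `federated_average` varies per training round (e.g. more bits early, fewer once the model stabilizes); the sketch leaves that policy to the caller.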
