Secure federated learning based on coded distributed computing
Federated learning (FL) enables multiple learning devices to exchange their training results and collaboratively develop a shared learning model without revealing their local data, thereby preserving data privacy. However, contemporary FL models have several drawbacks, including limited security against malicious learning devices that generate arbitrarily erroneous training results. Recently, a promising concept, coded distributed computing (CDC), has been proposed for maintaining the security of various distributed systems by adding computational redundancy to the datasets exchanged in these systems. Although the CDC concept has already been adopted in several applications, it has yet to be applied to FL systems. Accordingly, in this paper, we develop the first integrated FL-CDC model, which represents a low-complexity approach for enhancing the security of FL systems. We implement the model for predicting traffic slowness in vehicular applications and verify that the model can effectively secure the system even when the number of malicious devices is large.
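The abstract describes CDC as adding computational redundancy so that erroneous results from malicious devices can be tolerated. The sketch below is only an illustration of that general idea, not the paper's FL-CDC construction: it uses simple replication (the most basic redundancy code) and a coordinate-wise median at the server, and the gradient model, shard sizes, and malicious-device behaviour are all assumptions made for the example.

```python
# Illustrative sketch only: replication-based redundancy used to filter
# arbitrarily erroneous updates before aggregation. This is NOT the paper's
# FL-CDC scheme; all quantities below are assumptions for demonstration.
import numpy as np

rng = np.random.default_rng(0)

def local_gradient(shard_x, shard_y, w):
    # Plain least-squares gradient on the device's assigned data shard (assumed model).
    return 2 * shard_x.T @ (shard_x @ w - shard_y) / len(shard_y)

# Toy dataset split into shards; each shard is replicated to r devices.
n_shards, r, d = 4, 3, 5
w = np.zeros(d)
shards = [(rng.normal(size=(20, d)), rng.normal(size=20)) for _ in range(n_shards)]

# Each replica computes its shard's gradient; some replicas are "malicious"
# and return arbitrary values (here: large random noise).
malicious = {(0, 1), (2, 0)}  # (shard index, replica index) pairs, assumed
replica_grads = np.empty((n_shards, r, d))
for s, (x, y) in enumerate(shards):
    for k in range(r):
        g = local_gradient(x, y, w)
        if (s, k) in malicious:
            g = rng.normal(scale=100.0, size=d)  # arbitrarily erroneous result
        replica_grads[s, k] = g

# Redundancy lets the server reject outliers: a coordinate-wise median over the
# r replicas of each shard tolerates a minority of bad replicas per shard.
per_shard = np.median(replica_grads, axis=1)
global_grad = per_shard.mean(axis=0)
w -= 0.01 * global_grad
print("aggregated gradient norm:", np.linalg.norm(global_grad))
```

In this toy setup the median over replicas suppresses the injected outliers, whereas plain averaging would not; more elaborate CDC codes aim for the same robustness with less replication overhead.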
Funding
Guangdong Provincial Department of Education, Characteristic Innovation Project No. 2021KTSCX110
National Natural Science Foundation of China (NSFC): project no. 61950410603
History
School
- Science
Department
- Computer Science
Published in
2021 IEEE Globecom Workshops (GC Wkshps)
Source
2021 IEEE Globecom Workshops (GC Wkshps)
Publisher
IEEE
Version
- AM (Accepted Manuscript)
Rights holder
© IEEE
Publisher statement
© 2022 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Publication date
2022-01-24
Copyright date
2022
ISBN
9781665423908; 9781665423915
Publisher version
Language
- en