Title | FedPDD: A Privacy-preserving Double Distillation Framework for Cross-silo Federated Recommendation |
Author | |
Corresponding Author | Gao, Dashan |
DOI | |
Publication Years | 2023 |
Conference Name | International Joint Conference on Neural Networks (IJCNN) |
ISSN | 2161-4393 |
Source Title | |
Volume | 2023-June |
Conference Date | June 18-23, 2023 |
Conference Place | Broadbeach, Australia |
Publication Place | 345 E 47TH ST, NEW YORK, NY 10017 USA
|
Publisher | IEEE |
Abstract | Cross-platform recommendation aims to improve recommendation accuracy by gathering heterogeneous features from different platforms. However, such cross-silo collaboration between platforms is restricted by increasingly stringent privacy protection regulations, so data cannot be aggregated for training. Federated learning (FL) is a practical solution to the data silo problem in recommendation scenarios. Existing cross-silo FL methods transmit model information to collaboratively build a global model by leveraging the data of overlapped users. In practice, however, the number of overlapped users is often very small, which largely limits the performance of such approaches. Moreover, transmitting model information during training incurs high communication costs and may cause serious privacy leakage. In this paper, we propose FedPDD, a novel privacy-preserving double distillation framework for cross-silo federated recommendation that transfers knowledge efficiently when overlapped users are limited. Specifically, our double distillation strategy enables each local model to learn not only explicit knowledge from the other party but also implicit knowledge from its own past predictions. To ensure privacy and efficiency, we employ an offline training scheme that reduces communication needs and privacy leakage risk, and we adopt differential privacy to further protect the transmitted information. Experiments on two real-world recommendation datasets, HetRec-MovieLens and Criteo, demonstrate the effectiveness of FedPDD compared with state-of-the-art approaches. |
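The double distillation idea described in the abstract can be sketched as follows. This is an illustrative reconstruction from the abstract alone, not the paper's implementation: the function names, loss weights `alpha` and `beta`, temperature `T`, and noise scale `sigma` are all assumptions. Each silo's local model combines a supervised loss with two distillation terms (explicit knowledge from the other party's predictions, implicit knowledge from the model's own past predictions), and Gaussian noise stands in for the differential-privacy protection applied to transmitted predictions.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the class axis."""
    z = z / T
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def kl(p, q):
    """Mean KL divergence KL(p || q) over a batch of distributions."""
    return np.mean(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=1))

def double_distillation_loss(logits, labels, peer_logits, past_logits,
                             alpha=0.5, beta=0.5, T=2.0):
    """Supervised CE + explicit distillation from the other platform's
    predictions + implicit self-distillation from past predictions."""
    probs = softmax(logits)
    ce = -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12))
    student = softmax(logits, T)
    # Explicit knowledge: match the other party's soft predictions.
    explicit = kl(softmax(peer_logits, T), student)
    # Implicit knowledge: match the model's own earlier soft predictions.
    implicit = kl(softmax(past_logits, T), student)
    return ce + alpha * explicit + beta * implicit

def dp_noise(logits, sigma=0.1, rng=None):
    """Gaussian-mechanism noise added before predictions leave the silo."""
    rng = rng if rng is not None else np.random.default_rng()
    return logits + rng.normal(0.0, sigma, size=logits.shape)
```

In this sketch the parties exchange (noised) predictions rather than model parameters, which is what allows the offline, low-communication training scheme the abstract mentions.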
SUSTech Authorship | First; Corresponding |
Language | English |
URL | [Source Record] |
Indexed By | |
Funding Project | Guangdong Province Focus Research Project [2019KZDZX2014]; Guangdong Province Research Fund [2019QN01X277]; National Natural Science Foundation of China [71971106, 72001099] |
WOS Research Area | Computer Science; Engineering |
WOS Subject | Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Engineering, Electrical & Electronic |
WOS Accession No | WOS:001046198706045 |
Scopus EID | 2-s2.0-85169540296 |
Data Source | Scopus |
Citation statistics | Cited Times [WOS]: 0 |
Document Type | Conference paper |
Identifier | http://kc.sustech.edu.cn/handle/2SGJ60CL/560081 |
Affiliation | 1. SUSTech; HKUST, Dept. of CSE, Hong Kong; 2. WeBank, Shenzhen, China; 3. SUSTech, Dept. of Finance, Shenzhen, China |
First Author Affiliation | Southern University of Science and Technology |
Corresponding Author Affiliation | Southern University of Science and Technology |
First Author's First Affiliation | Southern University of Science and Technology |
Recommended Citation GB/T 7714 | Wan, Sheng, Gao, Dashan, Gu, Hanlin, et al. FedPDD: A Privacy-preserving Double Distillation Framework for Cross-silo Federated Recommendation[C]. New York, NY, USA: IEEE, 2023. |
Files in This Item: | There are no files associated with this item. |
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.