FedX: Unsupervised Federated Learning with Cross Knowledge Distillation
2022-07-19
Sungwon Han, Sungwon Park, Fangzhao Wu, Sundong Kim, Chuhan Wu, Xing Xie, Meeyoung Cha
- Code: github.com/sungwon-han/fedx (official, in paper, PyTorch) ★ 73
Abstract
This paper presents FedX, an unsupervised federated learning framework. Our model learns unbiased representations from decentralized and heterogeneous local data. It employs two-sided knowledge distillation with contrastive learning as a core component, allowing the federated system to function without requiring clients to share any data features. Furthermore, its adaptable architecture can be used as an add-on module for existing unsupervised algorithms in federated settings. Experiments show that our model significantly improves the performance of five existing unsupervised algorithms, by 1.58--5.52 percentage points.
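To make the "contrastive learning plus two-sided knowledge distillation" idea concrete, here is a minimal PyTorch sketch. It combines a standard NT-Xent contrastive loss with a relational distillation term that aligns the local model's batch-similarity distribution with the global (server) model's. The function names (`nt_xent`, `relational_distill`, `fedx_local_loss`), the specific loss forms, and the equal weighting are illustrative assumptions, not the paper's exact formulation; see the official repository for the real implementation.

```python
import torch
import torch.nn.functional as F


def nt_xent(z1, z2, temperature=0.1):
    """NT-Xent contrastive loss between two views: row i of z1 is the
    positive pair for row i of z2; all other rows act as negatives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature        # (N, N) cosine similarities
    labels = torch.arange(z1.size(0))         # positives on the diagonal
    return F.cross_entropy(logits, labels)


def relational_distill(z_student, z_teacher, temperature=0.1):
    """Relational distillation: match the student's similarity
    distribution over the batch to the teacher's via KL divergence,
    so no raw data features need to be shared."""
    z_s = F.normalize(z_student, dim=1)
    z_t = F.normalize(z_teacher, dim=1)
    log_p_s = F.log_softmax(z_s @ z_s.t() / temperature, dim=1)
    p_t = F.softmax(z_t @ z_t.t() / temperature, dim=1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean")


def fedx_local_loss(local_emb1, local_emb2, global_emb):
    """Hypothetical client-side objective: contrastive learning on two
    augmented views plus distillation from the global model's embeddings.
    The symmetric (global-from-local) direction would mirror this term."""
    return nt_xent(local_emb1, local_emb2) + relational_distill(
        local_emb1, global_emb
    )
```

In a federated round, each client would compute this loss on embeddings from its local encoder and the frozen global encoder, then send only model updates to the server, consistent with the no-data-sharing constraint described above.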