ZTE Communications ›› 2022, Vol. 20 ›› Issue (3): 17-26. DOI: 10.12142/ZTECOM.202203003
• Special Topic •
A Survey of Federated Learning on Non-IID Data
HAN Xuming1, GAO Minghan2, WANG Limin3, HE Zaobo1, WANG Yanze1
Received: 2022-06-10
Online: 2022-09-13
Published: 2022-09-14
About the authors:
HAN Xuming received his PhD degree from Jilin University, China. He is now a professor and PhD supervisor at Jinan University, China. He has led about 10 major scientific research projects, authored some 80 journal and conference papers, and published four academic monographs. His research interests include artificial intelligence, federated learning, and machine learning.
GAO Minghan is currently a graduate student at Changchun University of Technology, China. His research interests include federated learning, multitasking optimization, and clustering.
WANG Limin

Cite this article: HAN Xuming, GAO Minghan, WANG Limin, HE Zaobo, WANG Yanze. A Survey of Federated Learning on Non-IID Data [J]. ZTE Communications, 2022, 20(3): 17-26.
URL: https://zte.magtechjournal.com/EN/10.12142/ZTECOM.202203003
Figure 4 Heterogeneity-reducing and adaptability-enhancing strategies: (a) heterogeneity-reducing strategies and (b) adaptability-enhancing strategies
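Both strategy families in Figure 4 modify the same baseline: FedAvg aggregation (McMahan et al.), in which the server averages client models weighted by local dataset size. As a point of reference, here is a minimal NumPy sketch of that step; the function name and the flat-vector view of model parameters are illustrative assumptions, not code from any surveyed system.

```python
# Minimal FedAvg-style aggregation sketch (illustrative; short parameter
# vectors stand in for full model weights).
import numpy as np

def fedavg_aggregate(client_weights, client_sizes):
    """Average client parameter vectors, weighted by local dataset size."""
    stacked = np.stack(client_weights)                    # (n_clients, n_params)
    coeffs = np.array(client_sizes) / sum(client_sizes)   # n_k / n
    return coeffs @ stacked                               # sum_k (n_k / n) * w_k

# Example: three clients with unequal data volumes.
w = [np.array([1.0, 2.0]), np.array([2.0, 0.0]), np.array([0.0, 4.0])]
print(fedavg_aggregate(w, [50, 30, 20]))                  # -> [1.1 1.8]
```

Under non-IID data, these size-based coefficients no longer reflect how representative each client is, which is what the strategies summarized below try to correct.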
Table 1 Summary of specific methods based on heterogeneity-reducing strategies

| Methods | Ways | Advantages | Disadvantages |
|---|---|---|---|
| Data preprocessing | Direct | Easy to implement | May reveal privacy; proxy dataset required |
| | Indirect | Strong privacy | Contextual information may be required; more complex to implement |
| Client selection | Context-based | Faster model convergence | May reveal privacy |
| | Deep-learning-based | No context required; better effect | Higher time and space costs |
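To make the context-based client selection row of Table 1 concrete, the toy greedy heuristic below picks the subset of clients whose pooled label histogram is closest to uniform. This is a simplified sketch of the general idea only; practical selectors must also respect the privacy risk flagged in the table, since clients reveal label statistics to the server.

```python
# Toy context-based client selection: greedily add the client that keeps the
# pooled label distribution closest to uniform (illustrative sketch only).
import numpy as np

def select_clients(label_hists, k):
    """label_hists: (n_clients, n_classes) per-client label counts."""
    n_clients, n_classes = label_hists.shape
    uniform = np.full(n_classes, 1.0 / n_classes)
    chosen, pooled = [], np.zeros(n_classes)
    for _ in range(k):
        best, best_dist = None, np.inf
        for c in range(n_clients):
            if c in chosen:
                continue
            cand = pooled + label_hists[c]
            dist = np.abs(cand / cand.sum() - uniform).sum()  # L1 gap to uniform
            if dist < best_dist:
                best, best_dist = c, dist
        chosen.append(best)
        pooled += label_hists[best]
    return chosen

hists = np.array([[90, 10], [10, 90], [50, 50], [80, 20]], dtype=float)
print(select_clients(hists, 2))  # -> [2, 3]: balanced first, then best complement
```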
Figure 6 Method settings for adaptability-enhancing strategies: (a) federated multitask learning, (b) federated clustering learning, and (c) federated knowledge distillation
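Of the three settings in Figure 6, federated knowledge distillation exchanges model outputs rather than model weights: clients (or the server) match softened predictions on some shared data. The sketch below shows the core one-way distillation loss; the temperature value, logit shapes, and proxy batch are illustrative assumptions.

```python
# Toy sketch of one-way federated distillation on a shared proxy batch: the
# client model (student) matches the server model's (teacher's) softened
# outputs instead of exchanging weights.
import numpy as np

def softmax(z, T=1.0):
    z = z / T                                        # temperature-softened logits
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def distill_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between softened teacher and student distributions."""
    p = softmax(teacher_logits, T)                   # soft targets
    q = softmax(student_logits, T)
    return np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=1))

rng = np.random.default_rng(1)
teacher = rng.normal(size=(8, 10))     # server logits on the proxy batch
student = rng.normal(size=(8, 10))     # client logits on the same batch
print(distill_loss(student, teacher))  # scalar loss the client would minimize
```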
Table 2 Summary of specific methods based on adaptability-enhancing strategies

| Methods | Ways | Advantages | Disadvantages |
|---|---|---|---|
| Federated multitask learning | Client-based | Easy to implement | May isolate heterogeneous clients |
| | Subtask-division-based | Allows part-time participation | Sensitive to data quality |
| Federated clustering learning | Model-loss-based | Easy to implement; predictable effect | Number of clusters must be preset; high communication overhead |
| | Client-similarity-based | No need to preset the number of clusters | Lacks theoretical analysis |
| Federated knowledge distillation | One-way distillation | Strong privacy | Performs poorly with heterogeneous models; contextual information may be required |
| | Mutual distillation | Robust to heterogeneous models; suitable for a large number of clients | Possible negative transfer; lacks theoretical analysis |
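The model-loss-based clustering row of Table 2 can be illustrated with an IFCA-style assignment step: each client evaluates every cluster model on its local data and joins the cluster with the lowest loss. The linear model and squared loss below are simplifying assumptions for the sketch.

```python
# Sketch of model-loss-based cluster assignment: a client joins the cluster
# whose current model fits its local data best (lowest local loss).
import numpy as np

def assign_cluster(cluster_models, X, y):
    """Return the index of the cluster model with the lowest local loss."""
    losses = [np.mean((X @ w - y) ** 2) for w in cluster_models]
    return int(np.argmin(losses))

rng = np.random.default_rng(0)
models = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]      # two cluster models
X = rng.normal(size=(32, 2))
y = X @ np.array([0.0, 1.0]) + 0.01 * rng.normal(size=32)  # client matches cluster 1
print(assign_cluster(models, X, y))                        # -> 1
```

The drawback noted in the table follows directly: every round, each client must download and evaluate all cluster models, multiplying communication cost by the preset number of clusters.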