Application of Federated Learning and Decentralized Learning for Large Model Training
- DOI
- 10.2991/978-94-6463-823-3_104
- Keywords
- Application; Federated Learning (FL); Decentralized Learning (DL); Large Model Training
- Abstract
With the increasing scale of large model training, the traditional centralized training paradigm faces serious challenges in data privacy, communication efficiency, and device heterogeneity. The data privacy problem stems from the inability to centrally store sensitive information (e.g., in the medical and financial fields), communication efficiency is limited by the bandwidth pressure of transmitting massive numbers of parameters, and device heterogeneity makes it difficult to match hardware resources to the model architecture. To address these problems, Federated Learning (FL) and Decentralized Learning (DL) have received widespread attention as two distributed training paradigms. This paper systematically compares the two along the dimensions of privacy protection, communication optimization, adaptation to non-independent and identically distributed (non-IID) data, and support for model heterogeneity, and finds that FL is more suitable for privacy-sensitive but device-homogeneous scenarios, whereas DL has more potential in distributed, autonomous, and heterogeneous environments. Future research needs to explore hybrid architecture design, breakthroughs in non-convex optimization theory, and cross-modal collaboration mechanisms to promote the application of large model training techniques in complex scenarios.
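To make the contrast between the two paradigms concrete, the following is a minimal, illustrative Python sketch (not the paper's implementation): an FL-style round aggregates all local models on a central server and broadcasts the result, while a DL-style round has each worker average only with its neighbors via a mixing matrix. The function names (`federated_round`, `gossip_round`), the ring topology, and the mixing matrix `W` are hypothetical choices for illustration.

```python
import numpy as np

# Toy setup: each of N workers holds a local model vector of dimension D.
rng = np.random.default_rng(0)
N, D = 4, 3
models = rng.normal(size=(N, D))  # local model parameters per worker


def federated_round(models, weights=None):
    """FL-style round: a central server computes a weighted average
    of client models (FedAvg-style) and broadcasts it back to all clients."""
    if weights is None:
        weights = np.full(len(models), 1.0 / len(models))
    global_model = weights @ models                  # server-side aggregation
    return np.tile(global_model, (len(models), 1))   # broadcast to clients


def gossip_round(models, W):
    """DL-style round: each worker averages only with its neighbors
    according to a doubly stochastic mixing matrix W (no central server)."""
    return W @ models


# Hypothetical ring topology: each worker mixes with its two neighbors.
W = np.zeros((N, N))
for i in range(N):
    W[i, i] = 0.5
    W[i, (i - 1) % N] = 0.25
    W[i, (i + 1) % N] = 0.25

fl_models = federated_round(models)   # exact consensus in a single round
dl_models = models.copy()
for _ in range(10):                   # gossip reaches consensus gradually
    dl_models = gossip_round(dl_models, W)

print("FL consensus gap:", np.max(np.abs(fl_models - fl_models.mean(0))))
print("DL consensus gap:", np.max(np.abs(dl_models - dl_models.mean(0))))
```

The sketch highlights the trade-off discussed in the abstract: FL reaches exact agreement in one communication round but depends on a central aggregator, whereas DL trades slower consensus for fully decentralized, neighbor-only communication.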
- Copyright
- © 2025 The Author(s)
- Open Access
- This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
Cite this article
TY - CONF
AU - Xu Zixuan
PY - 2025
DA - 2025/08/31
TI - Application of Federated Learning and Decentralized Learning for Large Model Training
BT - Proceedings of the 2025 3rd International Conference on Image, Algorithms, and Artificial Intelligence (ICIAAI 2025)
PB - Atlantis Press
SP - 1080
EP - 1089
SN - 2352-538X
UR - https://doi.org/10.2991/978-94-6463-823-3_104
DO - 10.2991/978-94-6463-823-3_104
ID - Zixuan2025
ER -