Decentralized Federated Large Model Tuning over Edge Networks with Zeroth-Order Optimization

Zihan Chen, Jihong Park, Boxiang He*, Yanli Yuan, Howard H. Yang

*Corresponding author of this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Peer-reviewed

Abstract

Federated tuning of large models is an emerging paradigm that pushes promising generative AI services to the network edge. However, as model sizes scale up, the conflict between their intensive resource demands and the naturally limited resources of edge networks significantly constrains the performance of tuning large models over edge networks. In view of this, we propose a fully decentralized federated large model tuning framework with zeroth-order (ZO) optimization, addressing the significant computation and communication costs of the federated learning (FL) process. The proposed framework offers superior performance in dynamic and infrastructure-less edge networks with a theoretical convergence guarantee. Extensive experiments demonstrate the efficacy and superiority of the proposed framework in terms of communication efficiency and robustness.
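The abstract does not spell out the tuning algorithm, but its two core ingredients, zeroth-order optimization and fully decentralized (server-free) aggregation, can be illustrated with a minimal sketch. The snippet below assumes a generic two-point ZO gradient estimator and uniform gossip mixing over a fixed neighbor topology; the function names (`zo_gradient_estimate`, `decentralized_zo_step`), the Gaussian perturbation scheme, and the mixing weights are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def zo_gradient_estimate(loss_fn, theta, mu=1e-3, num_samples=1, rng=None):
    """Two-point zeroth-order gradient estimate.

    Uses only forward loss evaluations (no backpropagation): perturb the
    parameters along random Gaussian directions and scale the loss difference.
    """
    rng = np.random.default_rng() if rng is None else rng
    grad = np.zeros_like(theta)
    for _ in range(num_samples):
        u = rng.standard_normal(theta.shape)               # random direction
        delta = loss_fn(theta + mu * u) - loss_fn(theta - mu * u)
        grad += (delta / (2.0 * mu)) * u                    # directional estimate
    return grad / num_samples


def decentralized_zo_step(thetas, neighbors, loss_fns, lr=0.1, mu=1e-3, rng=None):
    """One decentralized round: gossip-average with neighbors, then a local ZO step."""
    rng = np.random.default_rng() if rng is None else rng
    new_thetas = []
    for i, _ in enumerate(thetas):
        peers = neighbors[i] + [i]                          # uniform mixing weights (assumed)
        mixed = sum(thetas[j] for j in peers) / len(peers)
        grad = zo_gradient_estimate(loss_fns[i], mixed, mu=mu, rng=rng)
        new_thetas.append(mixed - lr * grad)
    return new_thetas


if __name__ == "__main__":
    # Toy example: three clients, each with a quadratic loss around a different target.
    targets = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
    loss_fns = [lambda th, t=t: float(np.sum((th - t) ** 2)) for t in targets]
    thetas = [np.zeros(2) for _ in targets]
    neighbors = {0: [1], 1: [0, 2], 2: [1]}                 # ring-like topology
    for _ in range(200):
        thetas = decentralized_zo_step(thetas, neighbors, loss_fns)
    print([np.round(th, 2) for th in thetas])               # clients drift toward consensus
```

As a side note, ZO methods in the literature often cut communication further by exchanging only random seeds and scalar loss differences instead of full parameter vectors; whether this paper adopts that technique is not stated in the abstract.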

Original language: English
Title of host publication: 2025 IEEE/CIC International Conference on Communications in China: Shaping the Future of Integrated Connectivity, ICCC 2025
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (electronic): 9798331544447
DOI
Publication status: Published - 2025
Event: 2025 IEEE/CIC International Conference on Communications in China, ICCC 2025 - Shanghai, China
Duration: 10 Aug 2025 - 13 Aug 2025

Publication series

Name: 2025 IEEE/CIC International Conference on Communications in China: Shaping the Future of Integrated Connectivity, ICCC 2025

Conference

Conference: 2025 IEEE/CIC International Conference on Communications in China, ICCC 2025
Country/Territory: China
City: Shanghai
Period: 10/08/25 - 13/08/25
