Decentralized Federated Large Model Tuning over Edge Networks with Zeroth-Order Optimization

Zihan Chen, Jihong Park, Boxiang He*, Yanli Yuan, Howard H. Yang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Federated tuning of large models is an emerging paradigm that pushes promising generative AI services to the network edge. However, as model sizes scale up, the conflict between their intensive resource demands and the naturally limited resources of edge networks significantly constrains the performance of tuning large models over edge networks. In view of this, we propose a fully decentralized federated large model tuning framework with zeroth-order (ZO) optimization, addressing the significant computation and communication costs of the federated learning (FL) process. The proposed framework offers superior performance in dynamic and infrastructure-less edge networks with a theoretical convergence guarantee. Extensive experiments demonstrate the efficacy of the proposed framework and its superior communication efficiency and robustness.
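The abstract's key idea, tuning without backpropagation, rests on zeroth-order gradient estimation: the gradient is approximated from forward-only loss evaluations along random directions, so a client need only exchange a scalar projected gradient and a random seed rather than a full gradient vector. As a minimal sketch (the two-point Gaussian estimator, learning rate, and toy loss below are illustrative assumptions, not the paper's exact algorithm):

```python
import numpy as np

def zo_gradient_estimate(loss_fn, theta, mu=1e-3, seed=0):
    """Two-point zeroth-order gradient estimate.

    Approximates the gradient of `loss_fn` at `theta` using two forward
    evaluations along one random Gaussian direction. Because the direction
    is reproducible from `seed`, a peer only needs the scalar `g_scalar`
    and the seed to reconstruct the full update -- the communication
    saving that ZO-based federated tuning exploits.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(theta.shape)  # random perturbation direction
    g_scalar = (loss_fn(theta + mu * z) - loss_fn(theta - mu * z)) / (2 * mu)
    return g_scalar * z  # rank-1 estimate of the gradient

# Toy usage: drive f(x) = ||x||^2 toward zero with ZO descent steps.
f = lambda x: float(np.sum(x ** 2))
theta = np.ones(4)
for step in range(200):
    theta -= 0.05 * zo_gradient_estimate(f, theta, seed=step)
```

In expectation the estimator recovers the true gradient (for this quadratic loss, E[(2x·z)z] = 2x), which is why the descent loop converges despite using only forward passes; the same estimator applied to a model's training loss avoids storing activations for backpropagation, matching the computation savings the abstract claims.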

Original language: English
Title of host publication: 2025 IEEE/CIC International Conference on Communications in China: Shaping the Future of Integrated Connectivity, ICCC 2025
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798331544447
DOIs
Publication status: Published - 2025
Event: 2025 IEEE/CIC International Conference on Communications in China, ICCC 2025 - Shanghai, China
Duration: 10 Aug 2025 - 13 Aug 2025

Publication series

Name: 2025 IEEE/CIC International Conference on Communications in China: Shaping the Future of Integrated Connectivity, ICCC 2025

Conference

Conference: 2025 IEEE/CIC International Conference on Communications in China, ICCC 2025
Country/Territory: China
City: Shanghai
Period: 10/08/25 - 13/08/25

Keywords

  • communication efficiency
  • decentralized learning
  • federated learning
  • large model tuning
  • zeroth-order optimization
