DeRelayL: Sustainable Decentralized Relay Learning

Haihan Duan, Tengfei Ma, Yuyang Qin, Runhao Zeng, Wei Cai, Victor C.M. Leung, Xiping Hu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

In the era of Big Data, large-scale machine learning models have revolutionized various fields, driving significant advancements. However, training such large-scale models demands financial and computational resources affordable only to a few technological giants and well-funded institutions. As a result, common users such as mobile users, the real creators of valuable data, are often excluded from fully benefiting by these barriers, while current methods for accessing large-scale models either limit user ownership or lack sustainability. This growing gap highlights the urgent need for a collaborative model training approach that allows common users to train and share models. Existing collaborative training paradigms, especially federated learning (FL), primarily focus on data privacy and group-based model aggregation rather than open, sustainable participation. To address this issue, this paper proposes a novel training paradigm named decentralized relay learning (DeRelayL), a sustainable learning system in which permissionless participants can contribute to model training in a relay-like manner and share the resulting model.

Original language: English
Journal: IEEE Transactions on Mobile Computing
DOIs
Publication status: Accepted/In press - 2025

Keywords

  • Blockchain
  • Decentralized Model Training
  • Federated Learning
  • Relay Learning
  • Sustainable Model Training
