Dynamic Byzantine-Robust Learning: Adapting to Switching Byzantine Workers

Ron Dorfman, Naseem Yehya, Kfir Y. Levy

Research output: Contribution to journal › Conference article › Peer-review

Abstract

Byzantine-robust learning has emerged as a prominent fault-tolerant distributed machine learning framework. However, most techniques focus on the static setting, wherein the identity of Byzantine workers remains unchanged throughout the learning process. This assumption fails to capture real-world dynamic Byzantine behaviors, which may include intermittent malfunctions or targeted, time-limited attacks. Addressing this limitation, we propose DynaBRO, a new method capable of withstanding any sub-linear number of identity changes across rounds. Specifically, when the number of such changes is O(√T) (where T is the total number of training rounds), DynaBRO nearly matches the state-of-the-art asymptotic convergence rate of the static setting. Our method utilizes a multi-level Monte Carlo (MLMC) gradient estimation technique applied at the server on top of robustly aggregated worker updates. By additionally leveraging an adaptive learning rate, we circumvent the need for prior knowledge of the fraction of Byzantine workers.
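The MLMC gradient estimation mentioned in the abstract can be illustrated with a minimal sketch. This is a generic multi-level Monte Carlo estimator, not the paper's exact construction; the names `sample_grad` and `j_max` are illustrative assumptions. The estimator draws a random level J with P(J = j) = 2^{-j}, averages 2^J coupled gradient samples, and adds a scaled telescoping correction so that, in expectation, it matches the accuracy of a large batch at the sample cost of a small one.

```python
import numpy as np

def mlmc_gradient(sample_grad, rng, j_max=6):
    """Sketch of a multi-level Monte Carlo (MLMC) gradient estimate.

    sample_grad() -> one stochastic gradient (np.ndarray); `j_max` is a
    hypothetical truncation level, not a parameter from the paper.
    """
    g0 = sample_grad()               # base (level-0) estimate: one sample
    j = rng.geometric(0.5)           # draw level J with P(J = j) = 2^{-j}
    if j > j_max:
        return g0                    # truncate rare deep levels
    grads = np.stack([sample_grad() for _ in range(2 ** j)])
    g_hi = grads.mean(axis=0)                    # mean of 2^j samples
    g_lo = grads[: 2 ** (j - 1)].mean(axis=0)    # coupled mean of first half
    # Telescoping correction, scaled by the inverse level probability 2^j.
    return g0 + (2 ** j) * (g_hi - g_lo)
```

Note that when gradients are noiseless, `g_hi` equals `g_lo`, the correction vanishes, and the estimator returns the exact gradient; with noise, the coupling of the two level means keeps the correction's variance controlled.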

Original language: English
Pages (from-to): 11501-11543
Number of pages: 43
Journal: Proceedings of Machine Learning Research
Volume: 235
State: Published - 2024
Event: 41st International Conference on Machine Learning, ICML 2024 - Vienna, Austria
Duration: 21 Jul 2024 - 27 Jul 2024

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
