Exploring Temporal Similarity for Joint Computation and Communication in Online Distributed Optimization

Juncheng Wang, Min Dong, Ben Liang*, Gary Boudreau, Ali Afana

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

Abstract

We consider online distributed optimization in a networked system, where multiple devices assisted by a server collaboratively minimize the accumulation of a sequence of global loss functions that can vary over time. To reduce the amount of communication, the devices send quantized and compressed local decisions to the server, resulting in noisy global decisions. Therefore, there exists a tradeoff between the optimization performance and the communication overhead. Existing works optimize computation and communication separately. In contrast, we jointly consider computation and communication over time, by proactively encouraging temporal similarity in the decision sequence to control the communication overhead. We propose an efficient algorithm, termed Online Distributed Optimization with Temporal Similarity (ODOTS), where the local decisions are both computation- and communication-aware. Furthermore, ODOTS uses a novel tunable virtual queue, which removes the commonly assumed Slater’s condition through a modified Lyapunov drift analysis. ODOTS delivers provable performance bounds on both the optimization objective and constraint violation. In addition, we consider a variant of ODOTS with multi-step local gradient descent updates, termed ODOTS-MLU, and show that it provides improved performance bounds. As an example application, we apply both ODOTS and ODOTS-MLU to enable communication-efficient federated learning. Our experimental results based on canonical image classification demonstrate that ODOTS and ODOTS-MLU obtain higher classification accuracy and lower communication overhead compared with the current best alternatives for both convex and non-convex loss functions.
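To make the setting described in the abstract concrete, the following is a minimal sketch of one round of the generic quantize-then-average pattern: each device runs multi-step local gradient descent and sends a quantized decision, and the server averages the quantized decisions into a noisy global decision. This is an illustrative toy, not the paper's ODOTS or ODOTS-MLU algorithm; all function names, losses, and parameters here are hypothetical.

```python
import numpy as np

def quantize(x, num_bits=4):
    """Uniform stochastic quantization of a vector to 2**num_bits levels.

    Stochastic rounding keeps the quantizer unbiased in expectation,
    so the server's average is a noisy but unbiased global decision.
    """
    levels = 2 ** num_bits - 1
    lo, hi = x.min(), x.max()
    if hi == lo:
        return x.copy()
    scaled = (x - lo) / (hi - lo) * levels
    floor = np.floor(scaled)
    rounded = floor + (np.random.rand(*x.shape) < (scaled - floor))
    return lo + rounded / levels * (hi - lo)

def local_update(x, grad_fn, num_steps=3, lr=0.1):
    """Multi-step local gradient descent from the current global decision."""
    for _ in range(num_steps):
        x = x - lr * grad_fn(x)
    return x

# One round with 4 devices holding simple quadratic local losses
# 0.5 * ||x - t_i||^2, whose gradients are x - t_i.
rng = np.random.default_rng(0)
targets = [rng.normal(size=5) for _ in range(4)]      # per-device data
grads = [lambda x, t=t: x - t for t in targets]

x_global = np.zeros(5)
# Each device updates locally, quantizes, and the server averages.
x_global = np.mean(
    [quantize(local_update(x_global, g)) for g in grads], axis=0
)
```

With fewer quantization bits the uplink is cheaper but the averaged global decision is noisier, which is the computation-communication tradeoff the paper's joint design is addressing.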
Original language: English
Number of pages: 17
Journal: IEEE Transactions on Networking
DOIs
Publication status: E-pub ahead of print - 22 Jan 2025

User-Defined Keywords

  • Online optimization
  • federated learning
  • temporal similarity
  • long-term constraint
  • multi-step gradient
