TY - JOUR
T1 - Exploring Temporal Similarity for Joint Computation and Communication in Online Distributed Optimization
AU - Wang, Juncheng
AU - Dong, Min
AU - Liang, Ben
AU - Boudreau, Gary
AU - Afana, Ali
N1 - This work was supported in part by Ericsson Canada, in part by the Natural Sciences and Engineering Research Council (NSERC) of Canada, and in part by the Hong Kong Research Grants Council (RGC) Early Career Scheme (ECS) under Grant 22200324.
Publisher Copyright:
© 2025 IEEE.
PY - 2025/1/22
Y1 - 2025/1/22
N2 - We consider online distributed optimization in a networked system, where multiple devices assisted by a server collaboratively minimize the accumulation of a sequence of global loss functions that can vary over time. To reduce the amount of communication, the devices send quantized and compressed local decisions to the server, resulting in noisy global decisions. Therefore, there exists a tradeoff between the optimization performance and the communication overhead. Existing works separately optimize computation and communication. In contrast, we jointly consider computation and communication over time, by proactively encouraging temporal similarity in the decision sequence to control the communication overhead. We propose an efficient algorithm, termed Online Distributed Optimization with Temporal Similarity (ODOTS), where the local decisions are both computation- and communication-aware. Furthermore, ODOTS uses a novel tunable virtual queue, which removes the commonly assumed Slater's condition through a modified Lyapunov drift analysis. ODOTS delivers provable performance bounds on both the optimization objective and constraint violation. In addition, we consider a variant of ODOTS with multi-step local gradient descent updates, termed ODOTS-MLU, and show that it provides improved performance bounds. As an example application, we apply both ODOTS and ODOTS-MLU to enable communication-efficient federated learning. Our experimental results based on canonical image classification demonstrate that ODOTS and ODOTS-MLU obtain higher classification accuracy and lower communication overhead compared with the current best alternatives for both convex and non-convex loss functions.
KW - Online optimization
KW - federated learning
KW - temporal similarity
KW - long-term constraint
KW - multi-step gradient
UR - https://ieeexplore.ieee.org/document/10849650/
U2 - 10.1109/TON.2025.3530460
DO - 10.1109/TON.2025.3530460
M3 - Journal article
SN - 2998-4157
JO - IEEE Transactions on Networking
JF - IEEE Transactions on Networking
ER -