Communication-Efficient Distributed PCA by Riemannian Optimization

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review

Abstract

In this paper, we study the leading eigenvector problem in a statistically distributed setting and propose a communication-efficient algorithm based on Riemannian optimization, which trades local computation for global communication. Theoretical analysis shows that the proposed algorithm converges linearly, in the number of communication rounds, to the centralized empirical risk minimization solution. When each local machine holds sufficiently many data points, the proposed algorithm achieves a significant reduction in communication cost over existing distributed PCA algorithms. The superior communication efficiency of the proposed algorithm is verified on both real-world and synthetic datasets.
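To illustrate the general idea of trading local computation for communication, the sketch below implements a generic distributed Riemannian gradient ascent for the leading eigenvector on the unit sphere: each machine sends only an O(d) gradient vector per round rather than its d×d local covariance. This is a minimal illustration under assumed settings (synthetic data, 4 machines, fixed step size), not the paper's exact algorithm or its convergence guarantees.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data split across 4 "machines" (hypothetical setup for illustration).
d, n_per, m = 20, 500, 4
A = rng.normal(size=(d, d))
true_cov = A @ A.T / d
data = [rng.multivariate_normal(np.zeros(d), true_cov, size=n_per) for _ in range(m)]

def local_riemannian_grad(X, w):
    # Riemannian gradient of the Rayleigh quotient f(w) = w' S w on the sphere,
    # with local covariance S = X'X / n: project the Euclidean gradient S w
    # onto the tangent space at w (up to a constant factor).
    g = X.T @ (X @ w) / len(X)
    return g - (w @ g) * w

w = rng.normal(size=d)
w /= np.linalg.norm(w)
eta = 0.5  # fixed step size (assumed; the paper tunes this differently)
for _ in range(500):
    # One communication round: each machine sends a d-vector, the server averages.
    g = sum(local_riemannian_grad(X, w) for X in data) / m
    w = w + eta * g
    w /= np.linalg.norm(w)  # retraction back onto the unit sphere

# Compare against the centralized solution (leading eigenvector of the
# pooled empirical covariance, which would require shipping all data).
S = sum(X.T @ X / len(X) for X in data) / m
_, evecs = np.linalg.eigh(S)          # eigenvalues in ascending order
v = evecs[:, -1]
align = abs(w @ v)                    # alignment with the centralized eigenvector
```

Here each round costs O(d) communication per machine, versus the O(d^2) needed to ship local covariance matrices; the local matrix-vector products are where the extra computation goes.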
Original language: English
Title of host publication: Proceedings of the 37th International Conference on Machine Learning, ICML 2020
Publisher: ML Research Press
Pages: 4465-4474
Number of pages: 10
Publication status: Published - Jul 2020
Event: 37th International Conference on Machine Learning, ICML 2020 - Virtual
Duration: 13 Jul 2020 → 18 Jul 2020
https://proceedings.mlr.press/v119/ (Conference Proceedings)

Publication series

Name: Proceedings of Machine Learning Research
Volume: 119
ISSN (Print): 2640-3498
Name: Proceedings of the International Conference on Machine Learning

Conference

Conference: 37th International Conference on Machine Learning, ICML 2020
Period: 13/07/20 → 18/07/20
Internet address: https://proceedings.mlr.press/v119/
