Simple Stochastic and Online Gradient Descent Algorithms for Pairwise Learning

Zhenhuan Yang, Yunwen Lei, Puyu Wang, Tianbao Yang, Yiming Ying*

*Corresponding author for this work

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review

15 Citations (Scopus)

Abstract

Pairwise learning refers to learning tasks where the loss function depends on a pair of instances. It instantiates many important machine learning tasks such as bipartite ranking and metric learning. A popular approach to handling streaming data in pairwise learning is the online gradient descent (OGD) algorithm, where one needs to pair the current instance with a buffering set of previous instances of sufficiently large size, and which therefore suffers from a scalability issue. In this paper, we propose simple stochastic and online gradient descent methods for pairwise learning. A notable difference from existing studies is that we pair the current instance only with the previous one in building a gradient direction, which is efficient in both storage and computational complexity. We develop novel stability results, optimization, and generalization error bounds for both convex and nonconvex as well as both smooth and nonsmooth problems. We introduce novel techniques to decouple the dependency of models and the previous instance in both the optimization and generalization analysis. Our study resolves an open question on developing meaningful generalization bounds for OGD using a buffering set with a very small fixed size. We also extend our algorithms and stability analysis to develop differentially private SGD algorithms for pairwise learning, which significantly improve the existing results.
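The core idea described in the abstract — at each step, build a gradient from the pair formed by the current instance and only the immediately previous one — can be sketched as follows. This is a hypothetical illustration, not the paper's exact method: the pairwise hinge surrogate for bipartite ranking (AUC maximization), the constant step size, and the toy data are all assumptions made for the example.

```python
import numpy as np

def pairwise_hinge_grad(w, x_pos, x_neg):
    """Subgradient of the pairwise hinge loss max(0, 1 - w·(x_pos - x_neg)),
    a common surrogate loss for bipartite ranking (illustrative choice)."""
    diff = x_pos - x_neg
    if 1.0 - np.dot(w, diff) > 0.0:
        return -diff
    return np.zeros_like(w)

def simple_sgd_pairwise(X, y, eta=0.1):
    """Single pass over the stream, pairing each instance with the previous
    one only, so storage and per-step cost are O(1) in the stream length."""
    w = np.zeros(X.shape[1])
    prev_x, prev_y = None, None
    for x, label in zip(X, y):
        # Update only when the adjacent pair has opposite labels.
        if prev_x is not None and label != prev_y:
            if label == 1:
                g = pairwise_hinge_grad(w, x, prev_x)
            else:
                g = pairwise_hinge_grad(w, prev_x, x)
            w -= eta * g
        prev_x, prev_y = x, label
    return w

# Toy stream: positives are shifted along the first coordinate.
rng = np.random.default_rng(0)
n, d = 200, 5
y = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, d)) + 2.0 * y[:, None] * np.eye(1, d, 0).ravel()
w = simple_sgd_pairwise(X, y)
print(w[0] > 0)  # the learned weight should favor the separating coordinate
```

In contrast to buffer-based OGD, no set of past instances is retained: only `prev_x` and `prev_y` are stored between steps, which is the storage saving the abstract highlights.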

Original language: English
Title of host publication: 35th Conference on Neural Information Processing Systems (NeurIPS 2021)
Editors: Marc'Aurelio Ranzato, Alina Beygelzimer, Yann Dauphin, Percy S. Liang, Jenn Wortman Vaughan
Publisher: Neural Information Processing Systems Foundation
Pages: 20160-20171
Number of pages: 12
Volume: 24
ISBN (Print): 9781713845393
Publication status: Published - 6 Dec 2021
Event: 35th Conference on Neural Information Processing Systems, NeurIPS 2021 - Virtual
Duration: 6 Dec 2021 to 14 Dec 2021
https://nips.cc/Conferences/2021 (conference website)
https://neurips.cc/Conferences/2021 (conference website)
https://papers.nips.cc/paper_files/paper/2021 (conference proceedings)
https://proceedings.neurips.cc/paper/2021 (conference proceedings)

Publication series

Name: Advances in Neural Information Processing Systems
Volume: 34
ISSN (Print): 1049-5258
Name: NeurIPS Proceedings

Conference

Conference: 35th Conference on Neural Information Processing Systems, NeurIPS 2021
Period: 6/12/21 to 14/12/21

Scopus Subject Areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
