Optimal Stochastic and Online Learning with Individual Iterates

Yunwen Lei, Peng Yang, Ke Tang*, Ding Xuan Zhou

*Corresponding author for this work

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review

3 Citations (Scopus)


Stochastic composite mirror descent (SCMD) is a simple and efficient method able to capture both the geometric and composite structures of optimization problems in machine learning. Existing strategies require taking either an average or a random selection of iterates to achieve optimal convergence rates, which, however, can either destroy the sparsity of solutions or slow down practical training. In this paper, we propose a theoretically sound strategy to select an individual iterate of the vanilla SCMD, which achieves optimal rates for both convex and strongly convex problems in a non-smooth learning setting. Outputting an individual iterate preserves the sparsity of solutions, which is crucial for a proper interpretation in sparse learning problems. We report experimental comparisons with several baseline methods to show the effectiveness of our method in achieving a fast training speed as well as in outputting sparse solutions.
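The abstract's point about composite steps and sparsity can be illustrated with a minimal sketch of stochastic composite mirror descent on a lasso-type objective. Assuming a Euclidean mirror map, each composite step reduces to soft-thresholding, which is what lets iterates contain exact zeros. The step size, regularization weight, and data below are illustrative assumptions; this is not the paper's exact algorithm or its iterate-selection rule.

```python
import numpy as np

def soft_threshold(w, t):
    # Proximal map of t * ||.||_1; can set coordinates exactly to zero.
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def scmd_sketch(X, y, lam=0.1, n_iter=2000, seed=0):
    """Sketch of stochastic composite mirror descent on the objective
    (1/2)(x_i^T w - y_i)^2 + lam * ||w||_1, with a Euclidean mirror map
    so the composite step is a soft-thresholding (proximal) update.
    Returns the final individual iterate rather than an iterate average."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, n_iter + 1):
        i = rng.integers(n)                  # sample one training example
        grad = (X[i] @ w - y[i]) * X[i]      # stochastic gradient of the smooth part
        eta = 0.01 / np.sqrt(t)              # decaying step size (illustrative choice)
        w = soft_threshold(w - eta * grad, eta * lam)
    return w

# Usage: noiseless sparse regression with 3 informative coordinates.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true
w_hat = scmd_sketch(X, y)
print(np.round(w_hat, 3))
```

Returning the last iterate (instead of averaging) is what the paper argues can be done without losing the optimal rate; averaging would mix many dense intermediate iterates and destroy the exact zeros that soft-thresholding produces.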

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 32
Editors: H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, R. Garnett
Number of pages: 11
ISBN (Electronic): 9781713807933
Publication status: Published - Dec 2019
Event: 33rd Conference on Neural Information Processing Systems, NeurIPS 2019 - Vancouver, Canada
Duration: 8 Dec 2019 – 14 Dec 2019


Conference: 33rd Conference on Neural Information Processing Systems, NeurIPS 2019

Scopus Subject Areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing


