Abstract
Stochastic composite mirror descent (SCMD) is a simple and efficient method that captures both the geometric and the composite structure of optimization problems in machine learning. Existing strategies require taking either an average or a random selection of iterates to achieve optimal convergence rates, which, however, can destroy the sparsity of solutions or slow down practical training. In this paper, we propose a theoretically sound strategy for selecting an individual iterate of the vanilla SCMD, which achieves optimal rates for both convex and strongly convex problems in a non-smooth learning setting. Outputting an individual iterate preserves the sparsity of solutions, which is crucial for a proper interpretation in sparse learning problems. We report experimental comparisons with several baseline methods to show the effectiveness of our method in achieving a fast training speed as well as in outputting sparse solutions.
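To make the setting concrete, below is a minimal sketch of SCMD in its simplest instantiation: with the Euclidean mirror map it reduces to proximal stochastic gradient descent, and with an ℓ1 composite term the proximal step is soft-thresholding, which keeps iterates sparse. This is an illustrative implementation under those assumptions, not the paper's exact algorithm or step-size schedule; the function names (`soft_threshold`, `scmd_l1`) and the least-squares loss are chosen here for illustration. Note that the sketch returns the final individual iterate rather than an average, in the spirit of the strategy the abstract describes.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1: shrinks each coordinate toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def scmd_l1(X, y, lam=0.1, T=1000, seed=0):
    """SCMD with the Euclidean mirror map (i.e., proximal SGD) on
    0.5 * (x_i^T w - y_i)^2 + lam * ||w||_1.
    Returns a single individual iterate instead of an iterate average,
    so the sparsity induced by the ell_1 term is preserved."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, T + 1):
        i = rng.integers(n)                            # sample one training example
        grad = (X[i] @ w - y[i]) * X[i]                # stochastic gradient of the smooth part
        eta = 1.0 / np.sqrt(t)                         # illustrative 1/sqrt(t) step size (convex case)
        w = soft_threshold(w - eta * grad, eta * lam)  # gradient step + prox keeps w sparse
    return w
```

Averaging iterates of such a scheme generally produces dense vectors even when every individual iterate is sparse, which is why returning a single, well-chosen iterate matters in sparse learning.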
Original language | English
---|---
Title of host publication | Advances in Neural Information Processing Systems 32
Editors | H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, R. Garnett
Number of pages | 11
Volume | 32
ISBN (Electronic) | 9781713807933
Publication status | Published - Dec 2019
Event | 33rd Conference on Neural Information Processing Systems, NeurIPS 2019, Vancouver, Canada, 8 Dec 2019 → 14 Dec 2019
Conference

Conference | 33rd Conference on Neural Information Processing Systems, NeurIPS 2019
---|---
Country/Territory | Canada
City | Vancouver
Period | 8/12/19 → 14/12/19
Internet address | https://neurips.cc/Conferences/2019, https://proceedings.neurips.cc/paper/2019
Scopus Subject Areas
- Computer Networks and Communications
- Information Systems
- Signal Processing