Abstract
Nonconvex regularization is widely used in low-rank matrix
learning. However, extending it to low-rank tensor learning remains
computationally expensive. To address this problem, we develop an
efficient solver for use with a nonconvex extension of the overlapped
nuclear norm regularizer. Based on the proximal average algorithm, the
proposed solver avoids expensive tensor folding/unfolding
operations. A special “sparse plus low-rank” structure is maintained
throughout the iterations and allows fast computation of the individual
proximal steps. Empirical convergence is further improved with the use
of adaptive momentum. We provide guarantees of convergence to critical
points for smooth losses, and also for objectives satisfying the
Kurdyka-Łojasiewicz condition. While the optimization problem is
nonconvex and nonsmooth, we show that its critical points still have
good statistical performance on the tensor completion problem.
Experiments on various synthetic and real-world data sets show that the
proposed algorithm is efficient in both time and space, and is more
accurate than the existing state of the art.
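To make the idea concrete, below is a minimal sketch (not the paper's optimized solver) of one proximal-average step for overlapped nuclear-norm regularization: take a gradient step on the smooth loss, then average the cheap per-mode proximal maps instead of computing the expensive prox of the summed regularizer. The helper names (`unfold`, `fold`, `svt`, `prox_average_step`) and the convex soft-thresholding rule are illustrative assumptions; the paper's nonconvex penalties would change the thresholding, and its efficient variant avoids the explicit folding/unfolding shown here.

```python
import numpy as np

def unfold(X, mode):
    """Mode-`mode` unfolding: move that axis to the front, then flatten."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of `unfold`: reshape, then move the leading axis back."""
    rest = [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape([shape[mode]] + rest), 0, mode)

def svt(M, tau):
    """Singular value thresholding: the prox of tau * (nuclear norm).
    A nonconvex penalty would replace this soft-thresholding rule."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def prox_average_step(X, grad, step, lam):
    """One proximal-average update: a gradient step on the smooth loss,
    followed by the *average* of the per-mode proxes rather than the
    expensive joint prox of the overlapped regularizer."""
    Z = X - step * grad
    K = X.ndim
    out = np.zeros_like(Z)
    for k in range(K):
        out += fold(svt(unfold(Z, k), step * lam), k, Z.shape) / K
    return out
```

In this naive form, each iteration costs one SVD per mode of the tensor; the “sparse plus low-rank” structure maintained by the proposed algorithm is what makes these individual proximal steps fast in practice.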
| Original language | English |
| --- | --- |
| Article number | 136 |
| Number of pages | 60 |
| Journal | Journal of Machine Learning Research |
| Volume | 23 |
| DOIs | |
| Publication status | Published - 22 Apr 2022 |
Scopus Subject Areas
- Software
- Control and Systems Engineering
- Statistics and Probability
- Artificial Intelligence
User-Defined Keywords
- Low-rank tensor
- Nonconvex regularization
- Overlapped nuclear norm
- Proximal algorithm
- Proximal average algorithm