TY - JOUR
T1 - Quantized Low-Rank Multivariate Regression With Random Dithering
AU - Chen, Junren
AU - Wang, Yueqi
AU - Ng, Michael K.
N1 - Funding information:
The work of Junren Chen and Yueqi Wang was supported by the Hong Kong PhD Fellowship from the Hong Kong Research Grants Council (HKRGC). The work of Michael K. Ng was supported in part by the HKRGC GRF under Grants 17201020 and 17300021, in part by the HKRGC CRF under Grant C7004-21GF, and in part by the Joint NSFC-RGC under Grant N-HKU76921.
Publisher Copyright:
© 2023 The Authors.
PY - 2023/10/12
Y1 - 2023/10/12
AB - Low-rank multivariate regression (LRMR) is an important statistical learning model that combines highly correlated tasks into a multiresponse regression problem with a low-rank prior on the coefficient matrix. In this paper, we study quantized LRMR, a practical setting where the responses and/or the covariates are discretized to finite precision. We focus on estimating the underlying coefficient matrix. To enable consistent estimators that can achieve arbitrarily small error, we employ uniform quantization with random dithering, i.e., we add appropriate random noise to the data before quantization. Specifically, uniform dither and triangular dither are used for the responses and the covariates, respectively. Based on the quantized data, we propose constrained Lasso and regularized Lasso estimators and derive non-asymptotic error bounds. With the aid of dithering, the estimators achieve the minimax optimal rate, while quantization only slightly worsens the multiplicative factor in the error rate. Moreover, we extend our results to a low-rank regression model with matrix responses. We corroborate and demonstrate our theoretical results via simulations on synthetic data, an image restoration task, and a real-data application.
KW - dithering
KW - low-rankness
KW - M-estimator
KW - multiresponse regression
KW - quantization
UR - http://www.scopus.com/inward/record.url?scp=85174852212&partnerID=8YFLogxK
DO - 10.1109/TSP.2023.3322813
M3 - Journal article
AN - SCOPUS:85174852212
SN - 1053-587X
VL - 71
SP - 3913
EP - 3928
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
ER -
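
For intuition, the dithered uniform quantization described in the abstract can be sketched in a few lines of NumPy. The snippet below is a minimal illustrative sketch, not code from the paper: the function names, the resolution delta = 0.5, and the toy signal are assumptions. It only demonstrates the key property that motivates dithering, namely that quantizing y + tau with a uniform dither tau yields an unbiased surrogate for y; the triangular dither (for covariates) is shown simply as the sum of two independent uniform dithers.

import numpy as np

rng = np.random.default_rng(0)

def uniform_quantize(x, delta):
    # Uniform quantizer with resolution delta: maps x to the midpoint
    # of its quantization cell, Q(x) = delta * (floor(x / delta) + 1/2).
    return delta * (np.floor(x / delta) + 0.5)

def uniform_dither(shape, delta, rng):
    # Uniform dither on [-delta/2, delta/2]; per the abstract, this
    # kind of dither is applied to the responses.
    return rng.uniform(-delta / 2, delta / 2, size=shape)

def triangular_dither(shape, delta, rng):
    # Triangular dither on [-delta, delta]: the sum of two independent
    # uniform dithers; per the abstract, applied to the covariates.
    return (rng.uniform(-delta / 2, delta / 2, size=shape)
            + rng.uniform(-delta / 2, delta / 2, size=shape))

# Toy check (assumed values): with uniform dither tau, the quantized
# value Q(y + tau) has expectation y, so averaging recovers the signal.
delta = 0.5                        # quantization resolution (assumed)
y = np.full(100_000, 0.3)          # constant toy "response" signal
tau = uniform_dither(y.shape, delta, rng)
print(uniform_quantize(y + tau, delta).mean())   # approximately 0.3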