Quantized Low-Rank Multivariate Regression With Random Dithering

Junren Chen*, Yueqi Wang*, Michael K. Ng

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

1 Citation (Scopus)

Abstract

Low-rank multivariate regression (LRMR) is an important statistical learning model that combines highly correlated tasks into a multiresponse regression problem with a low-rank prior on the coefficient matrix. In this paper, we study quantized LRMR, a practical setting where the responses and/or the covariates are discretized to finite precision. We focus on estimating the underlying coefficient matrix. To make possible a consistent estimator that achieves arbitrarily small error, we employ uniform quantization with random dithering, i.e., we add appropriate random noise to the data before quantization. Specifically, uniform dither and triangular dither are used for the responses and covariates, respectively. Based on the quantized data, we propose constrained Lasso and regularized Lasso estimators and derive non-asymptotic error bounds. With the aid of dithering, the estimators achieve the minimax optimal rate, while quantization only slightly worsens the multiplicative factor in the error bound. Moreover, we extend our results to a low-rank regression model with matrix responses. We corroborate and demonstrate our theoretical results via simulations on synthetic data, image restoration, and a real-data application.
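The abstract's key mechanism, uniform quantization with random dithering, can be illustrated with a small generic sketch (this is not the paper's code; the step size `delta` and the mid-rise quantizer form are illustrative assumptions). With uniform dither drawn over one quantization bin, the dithered quantizer is unbiased, E[Q(x + τ)] = x, which is what enables consistent estimation from quantized data; triangular dither (the sum of two independent uniforms), used here for the covariates, additionally makes the quantization-error variance independent of the input.

```python
import numpy as np

rng = np.random.default_rng(0)
delta = 0.5  # quantization step (finite precision); illustrative choice


def dithered_quantize(x, delta, dither):
    """Mid-rise uniform quantizer with step `delta`, applied to x + dither."""
    return delta * (np.floor((x + dither) / delta) + 0.5)


x = 0.3  # a fixed scalar input to quantize repeatedly

# Uniform dither tau ~ U(-delta/2, delta/2): averaging Q(x + tau)
# over the dither recovers x (unbiasedness).
tau_u = rng.uniform(-delta / 2, delta / 2, size=200_000)
est_uniform = dithered_quantize(x, delta, tau_u).mean()

# Triangular dither: sum of two independent uniforms on (-delta/2, delta/2).
# Unbiasedness still holds, and the error variance no longer depends on x.
tau_t = (rng.uniform(-delta / 2, delta / 2, size=200_000)
         + rng.uniform(-delta / 2, delta / 2, size=200_000))
est_triangular = dithered_quantize(x, delta, tau_t).mean()
```

Both `est_uniform` and `est_triangular` concentrate around `x = 0.3` as the number of dither draws grows, which is the sense in which dithering lets quantized measurements support estimators with arbitrarily small error.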

Original language: English
Pages (from-to): 3913-3928
Number of pages: 16
Journal: IEEE Transactions on Signal Processing
Volume: 71
Publication status: Published - 12 Oct 2023

Scopus Subject Areas

  • Signal Processing
  • Electrical and Electronic Engineering

User-Defined Keywords

  • dithering
  • low-rankness
  • M-estimator
  • Multiresponse regression
  • quantization
