Project Details
Description
Low-rank models are of central importance in many disciplines, such as data science, image processing, signal processing, and machine learning. An increasing number of real-world applications involve multi-dimensional data, which is naturally organized as a higher-order tensor, i.e., an array with more than two indices. This motivates the study of low-rank tensor models, with tensor decompositions as the fundamental tools. To distinguish dimensions that carry different physical meanings, we propose a tubal approach to designing tensor decompositions and tackling low-rank models.
Besides being viewed as a cube, a third-order tensor can also be regarded as a matrix whose entries are tubes, i.e., vectors along the third dimension. By introducing a multiplication on tubes, we define the corresponding tensor multiplications and decompositions in the tubal sense. We show in this proposal that any tube multiplication is determined by a weight tensor, and that the choice of weight tensor is crucial in applications. Several existing tensor decompositions, such as the tensor train, the t-SVD, and quaternion matrix decompositions, can be unified within this framework.
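For concreteness, the sketch below (not from the proposal; the function name `tube_mul` is illustrative) shows how a weight tensor W determines a tube multiplication via (a * b)_k = Σ_{i,j} W[i,j,k] a_i b_j, and how the particular weight encoding circular convolution recovers the t-product tubes underlying the t-SVD.

```python
# A minimal sketch, assuming tubes are vectors in R^p and a tube
# multiplication is the bilinear map (a * b)_k = sum_{i,j} W[i,j,k] a_i b_j
# determined by a weight tensor W. The W built here encodes circular
# convolution, the tube product used by the t-SVD.
import numpy as np

p = 4  # tube length (size of the third dimension)

# Weight tensor for circular convolution: W[i, j, k] = 1 iff (i + j) mod p == k.
W = np.zeros((p, p, p))
for i in range(p):
    for j in range(p):
        W[i, j, (i + j) % p] = 1.0

def tube_mul(a, b, W):
    """Weighted tube multiplication: (a * b)_k = sum_{i,j} W[i,j,k] a_i b_j."""
    return np.einsum("ijk,i,j->k", W, a, b)

a, b = np.random.rand(p), np.random.rand(p)

# For this W, the tube product agrees with circular convolution, which the
# FFT diagonalizes -- the computational backbone of the t-SVD.
via_weight = tube_mul(a, b, W)
via_fft = np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))
assert np.allclose(via_weight, via_fft)
```

Other choices of W induce other algebras on tubes; for instance, with p = 4 the structure tensor of quaternion multiplication yields the quaternion case mentioned above, a noncommutative ring.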
In this proposal, we apply the weighted tubal decompositions to the low-rank approximation model and the low-rank-plus-sparse model for tensors. When the weight tensor is chosen so that the tube space forms a ring, the weighted tubal singular value decomposition (SVD) is well defined. We plan to design efficient algorithms for the weighted tubal SVD, analogous to the matrix case, and to employ it in solving these two tensor models. When the tube space does not form a ring under the given weight tensor, the SVD cannot be defined, so we instead use weighted tubal low-rank decompositions for these models, which lead to nonconvex optimization problems; we will design and analyze efficient and reliable algorithms for this case as well. We also propose optimization methods that select the weight tensor adaptively for different applications, pursuing the optimal approximation in a prescribed sense. We plan to apply the low-rank approximation model and the low-rank-plus-sparse model to color image compression and seismic imaging, respectively.
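As a point of reference for the ring case, the following sketch (again illustrative, not the proposal's algorithm; the names `t_svd`, `Ah`, `Uh`, `Sh`, `Vh` are assumptions) computes the tubal SVD for the circular-convolution weight, i.e., the classic t-SVD: transform tubes to the Fourier domain, take an ordinary matrix SVD of each frontal slice, and transform back, enforcing conjugate symmetry so the factors come back real.

```python
# A minimal sketch of the tubal SVD under the circular-convolution weight
# (the classic t-SVD), computed slice-wise in the Fourier domain.
import numpy as np

def t_svd(A):
    """t-SVD of a real m x n x p tensor A: returns real U, S, V whose
    Fourier-domain frontal slices satisfy Ah = Uh @ Sh @ Vh^H."""
    m, n, p = A.shape
    Ah = np.fft.fft(A, axis=2)                     # FFT along the tube dimension
    Uh = np.zeros((m, m, p), dtype=complex)
    Sh = np.zeros((m, n, p), dtype=complex)
    Vh = np.zeros((n, n, p), dtype=complex)
    for k in range(p // 2 + 1):                    # SVD half the slices ...
        u, s, vh = np.linalg.svd(Ah[:, :, k])
        Uh[:, :, k], Vh[:, :, k] = u, vh.conj().T
        np.fill_diagonal(Sh[:, :, k], s)
        if 0 < k < (p + 1) // 2:                   # ... rest by conjugate symmetry
            Uh[:, :, p - k], Vh[:, :, p - k] = u.conj(), vh.T
            np.fill_diagonal(Sh[:, :, p - k], s)
    # Conjugate symmetry guarantees the inverse FFTs are real.
    return (np.real(np.fft.ifft(X, axis=2)) for X in (Uh, Sh, Vh))

# Check the factorization slice-wise in the Fourier domain.
A = np.random.rand(3, 4, 5)
U, S, V = t_svd(A)
Uh, Sh, Vh = (np.fft.fft(X, axis=2) for X in (U, S, V))
Ah_rec = np.einsum("imk,mnk,jnk->ijk", Uh, Sh, Vh.conj())
assert np.allclose(np.fft.fft(A, axis=2), Ah_rec)
```

Truncating S to its first r diagonal tubes gives the tubal-rank-r approximation used in the low-rank approximation model; the low-rank-plus-sparse model additionally splits off a sparse component.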
| Status | Finished |
| --- | --- |
| Effective start/end date | 1/09/19 → 28/02/23 |