Coupled segmentation and denoising/deblurring models for hyperspectral material identification

Fang Li, Kwok Po Ng, Robert J. Plemmons*

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

43 Citations (Scopus)

Abstract

A crucial aspect of spectral image analysis is the identification of the materials present in the object or scene being imaged and the quantification of their abundance in the mixture. An increasingly useful approach to extracting such underlying structure is to employ image classification and object identification techniques to compressively represent the original data cubes by a set of spatially orthogonal bases and a set of spectral signatures. Owing to the large quantity of data usually encountered in hyperspectral data sets, effective compressive representation is an important consideration, and noise and blur can present data analysis problems. In this paper, we develop image segmentation methods for hyperspectral space object material identification. We also couple the segmentation with a hyperspectral image denoising/deblurring model and propose this method as an alternative to tensor factorization methods proposed recently for space object material identification. The coupled model provides the segmentation result and the restored image simultaneously. Numerical results show the effectiveness of our proposed combined model in hyperspectral material identification.
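To make the compressive representation described in the abstract concrete, the following minimal NumPy sketch (not the authors' code; the array names, sizes, and random placeholder data are illustrative assumptions) builds spatially orthogonal bases from the indicator images of a segmentation and pairs each segment with a single spectral signature, reconstructing an approximate cube as their outer-product sum.

```python
import numpy as np

# Hedged sketch, assuming a cube H of shape (rows, cols, bands) and a
# segmentation map with K disjoint regions; values below are placeholders.
rows, cols, bands, K = 64, 64, 31, 4
rng = np.random.default_rng(0)
H = rng.random((rows, cols, bands))             # placeholder hyperspectral cube
labels = rng.integers(0, K, size=(rows, cols))  # placeholder segmentation map

# Spatially orthogonal bases: indicator images of the disjoint segments.
chi = np.stack([(labels == k).astype(float) for k in range(K)], axis=0)

# Spectral signatures: mean spectrum of each segment, shape (K, bands).
signatures = np.stack([H[labels == k].mean(axis=0) for k in range(K)], axis=0)

# Compressed reconstruction: each pixel's spectrum is replaced by its
# segment's signature, i.e. H_hat = sum_k chi_k (outer) s_k.
H_hat = np.einsum('kij,kb->ijb', chi, signatures)

print("relative error:", np.linalg.norm(H - H_hat) / np.linalg.norm(H))
```

Because the segments are disjoint, the indicator images chi_k are mutually orthogonal, so a cube of rows × cols × bands values is summarized by K indicator images and K spectral signatures; the coupled model in the paper additionally restores the noisy/blurred data while producing the segmentation.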

Original language: English
Pages (from-to): 153-173
Number of pages: 21
Journal: Numerical Linear Algebra with Applications
Volume: 19
Issue number: 1
DOIs
Publication status: Published - Jan 2012

Scopus Subject Areas

  • Algebra and Number Theory
  • Applied Mathematics

User-Defined Keywords

  • Compressive representation
  • Deblurring
  • Denoising
  • Hyperspectral image analysis
  • Segmentation
  • Tensors
