A Discriminative Matrix Feature of 2-D Object

Li Feng*, Yuanyan Tang, Jiming Liu, Tao Guo

*Corresponding author for this work

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review


This paper proposes a novel spatial feature extraction method combining wavelet analysis and sparse matrix techniques to solve the problem of identifying objects with subtle differences. This hybrid matrix feature has not been put forward before in the literature. The differences between slightly dissimilar objects are distinctions in the spatial orientations of the objects or in the local positions of points on their contours. The time-frequency localization of the wavelet transform distinguishes these differences and yields a sparse representation of the underlying objects. This sparsity allows us to rearrange the pixels in the wavelet-decomposed detail sub-images. Treating the three directional detail sub-images as sparse matrices, different sparse matrix reorderings are applied to them. The reordering produces a considerable increase in the distinction between slightly dissimilar objects; in consequence, the difficulty of discriminating between objects is largely reduced. A series of discriminative simulations is shown, verifying the feasibility and effectiveness of the proposed method.
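The pipeline sketched in the abstract (wavelet decomposition into directional detail sub-images, then a sparsity-based reordering of each detail matrix) might look roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes a one-level Haar transform, and `reorder_by_row_weight` is a hypothetical stand-in for the unspecified sparse matrix reordering used in the paper.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar wavelet transform.
    Returns the approximation LL and the three directional
    detail sub-images LH, HL, HH (even-sized input assumed)."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row lowpass
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row highpass
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return LL, LH, HL, HH

def reorder_by_row_weight(detail, thresh=1e-6):
    """Hypothetical sparse-matrix reordering (not from the paper):
    treat the detail sub-image as sparse (near-zero entries ignored)
    and permute rows and columns by descending count of significant
    entries, concentrating the nonzeros into one corner."""
    mask = np.abs(detail) > thresh
    row_perm = np.argsort(-mask.sum(axis=1), kind="stable")
    col_perm = np.argsort(-mask.sum(axis=0), kind="stable")
    return detail[np.ix_(row_perm, col_perm)]

# Toy 8x8 object: a small bright rectangle on a dark background.
img = np.zeros((8, 8))
img[2:5, 3:6] = 1.0
LL, LH, HL, HH = haar_dwt2(img)
feature = reorder_by_row_weight(LH)  # matrix feature for one direction
```

Because the detail sub-images of slightly dissimilar objects differ mainly in where their few significant coefficients sit, a reordering of this kind amplifies those positional differences into visibly different matrix patterns, which is the effect the abstract describes.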

Original language: English
Title of host publication: Proceedings of the Fifth Joint Conference on Information Sciences, JCIS 2000, Volume 2
Editors: P.P. Wang
Number of pages: 4
Publication status: Published - 2000
Event: Fifth Joint Conference on Information Sciences, JCIS 2000 - Atlantic City, NJ, United States
Duration: 27 Feb 2000 - 3 Mar 2000

Publication series

Name: Proceedings of the Joint Conference on Information Sciences


Conference: Fifth Joint Conference on Information Sciences, JCIS 2000
Country/Territory: United States
City: Atlantic City, NJ

Scopus Subject Areas

  • Computer Science (all)
