Density-Convoluted Support Vector Machines for High-Dimensional Classification

Boxiang Wang, Le Zhou, Yuwen Gu, Hui Zou*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The support vector machine (SVM) is a popular classification method that enjoys good performance in many real applications. The SVM can be viewed as a penalized minimization problem in which the objective function is the expectation of the hinge loss with respect to the standard non-smooth empirical measure corresponding to the true underlying measure. We extend this viewpoint and propose a smoothed SVM by substituting a kernel density estimator for the empirical measure in the expectation calculation. The resulting method is called the density-convoluted support vector machine (DCSVM). We argue that the DCSVM is particularly interesting, compared with the standard SVM, in the context of high-dimensional classification. We systematically study the rate of convergence of the elastic-net-penalized DCSVM under a general random design setting, and we develop a novel efficient algorithm for computing it. Simulation studies and ten benchmark datasets demonstrate the superior classification performance of the elastic-net DCSVM over its competitors; these numerical studies also show that computing the DCSVM can be more than 100 times faster than computing the SVM.
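The core idea described above, replacing the non-smooth empirical measure with a kernel density estimator, amounts to convolving the hinge loss with the kernel density. The sketch below is illustrative only (it is not the paper's implementation): assuming a Gaussian kernel with bandwidth h, the convolved hinge loss has a closed form in terms of the standard normal CDF and PDF, and the `hinge`/`dc_hinge` names are our own.

```python
import math

def hinge(v):
    """Ordinary hinge loss at margin v = y * f(x); non-smooth at v = 1."""
    return max(0.0, 1.0 - v)

def dc_hinge(v, h):
    """Hinge loss convolved with a N(0, h^2) density (illustrative sketch).

    Equals E[max(0, 1 - v + h*Z)] for Z ~ N(0, 1), which works out to
    t * Phi(t/h) + h * phi(t/h) with t = 1 - v. This is smooth in v and
    approaches the ordinary hinge loss as h -> 0.
    """
    t = 1.0 - v
    z = t / h
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal PDF
    return t * Phi + h * phi
```

Because the smoothed loss is differentiable everywhere, gradient-based solvers can be applied directly; by convexity of the hinge, `dc_hinge(v, h)` always lies on or above `hinge(v)` and converges to it as the bandwidth shrinks.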

Original language: English
Number of pages: 17
Journal: IEEE Transactions on Information Theory
DOIs
Publication status: E-pub ahead of print - 17 Nov 2022

Scopus Subject Areas

  • Statistics and Probability

User-Defined Keywords

  • Classification
  • DCSVM
  • Support vector machines
  • Ultra-high dimension
  • Kernel density smoother
