TY - GEN
T1 - A Practical Parameters Selection Method for SVM
AU - Zhu, Yongsheng
AU - Li, Chun Hung
AU - Zhang, Youyun
N1 - Publisher copyright:
© 2004 Springer-Verlag Berlin Heidelberg
PY - 2004/8/11
Y1 - 2004/8/11
N2 - The performance of the Support Vector Machine (SVM) is significantly affected by its model parameters. One commonly used parameter selection method for SVM, the grid search (GS) method, is very time consuming. This paper introduces Uniform Design (UD) and Support Vector Regression (SVR) to reduce the computation cost of the traditional GS method: the error bounds of the SVM are computed only on nodes selected by the UD method, and an SVR model is then trained on the computed results. Subsequently, the error-bound values of the SVM on the remaining nodes are estimated by the SVR function, and the optimized parameters are selected based on the estimated results. Experiments on seven standard datasets show that the parameters selected by the proposed method yield a test error rate similar to that obtained by the conventional GS method, while the computation cost can be reduced from at most O(n^m) to O(n), where m is the number of parameters and n is the number of levels of each parameter.
AB - The performance of the Support Vector Machine (SVM) is significantly affected by its model parameters. One commonly used parameter selection method for SVM, the grid search (GS) method, is very time consuming. This paper introduces Uniform Design (UD) and Support Vector Regression (SVR) to reduce the computation cost of the traditional GS method: the error bounds of the SVM are computed only on nodes selected by the UD method, and an SVR model is then trained on the computed results. Subsequently, the error-bound values of the SVM on the remaining nodes are estimated by the SVR function, and the optimized parameters are selected based on the estimated results. Experiments on seven standard datasets show that the parameters selected by the proposed method yield a test error rate similar to that obtained by the conventional GS method, while the computation cost can be reduced from at most O(n^m) to O(n), where m is the number of parameters and n is the number of levels of each parameter.
KW - Support Vector Machine
KW - Support Vector Regression
KW - Support Vector Machine Classifier
KW - Uniform Design
KW - Support Vector Machine Regression
UR - https://www.scopus.com/inward/record.uri?eid=2-s2.0-33749634049&doi=10.1007%2f978-3-540-28647-9_86&partnerID=40&md5=ed843d9e32d7c81aef220c91e224a7bc
U2 - 10.1007/978-3-540-28647-9_86
DO - 10.1007/978-3-540-28647-9_86
M3 - Conference proceeding
SN - 9783540228417
T3 - Lecture Notes in Computer Science
SP - 518
EP - 523
BT - Advances in Neural Networks - ISNN 2004
A2 - Yin, Fu Liang
A2 - Wang, Jun
A2 - Guo, Chengan
PB - Springer Berlin Heidelberg
ER -