On the Sparseness and Generalization Capability of Least Squares Support Vector Machines

by Aijun Yan, Xiaoqian Huang, Hongshan Shao

Published in Journal of Systems Science and Information by Walter de Gruyter GmbH.

2015  

Abstract

Compared with standard support vector machines (SVM), least squares support vector machines (LS-SVM) lose sparseness in the modeling process, which limits their generalization capability. An improved method using quadratic Rényi entropy pruning is presented to address this problem. First, kernel principal component analysis (KPCA) is used to denoise the training data. Next, a genetic algorithm is used to estimate and optimize the kernel function parameter and the penalty factor. Then, the subset with the largest quadratic entropy is selected for training and pruning, and this process is repeated until the cumulative error rate meets the required condition. Finally, comparative experiments on data classification and regression indicate that the proposed method is effective and can improve the sparseness and the generalization capability of the LS-SVM model.
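The KPCA denoising step described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; it uses scikit-learn's `KernelPCA` as a stand-in, and the RBF `gamma`, number of components, and ridge parameter `alpha` are illustrative assumptions: noisy samples are projected onto the leading kernel principal components and then mapped back, discarding the noise carried by the trailing components.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

# Hypothetical toy data: a noisy sine curve standing in for the training set.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)
X_clean = np.column_stack([t, np.sin(2.0 * np.pi * t)])
X_noisy = X_clean + rng.normal(scale=0.1, size=X_clean.shape)

# KPCA denoising: keep a few leading components, then invert the map.
# fit_inverse_transform=True learns the (ridge-regularized) pre-image map.
kpca = KernelPCA(
    n_components=2,            # assumed number of retained components
    kernel="rbf",
    gamma=5.0,                 # assumed RBF width parameter
    fit_inverse_transform=True,
    alpha=0.1,                 # ridge parameter for the inverse transform
)
Z = kpca.fit_transform(X_noisy)
X_denoised = kpca.inverse_transform(Z)  # same shape as the input
```

In the paper's pipeline the denoised samples, rather than the raw ones, would then be passed to the LS-SVM training stage.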
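The pruning criterion, selecting the subset with the largest quadratic Rényi entropy, can be sketched as follows. This is a hedged illustration, not the authors' code: it uses the standard Parzen-window estimate H₂(X) = −log((1/N²) Σᵢⱼ k(xᵢ, xⱼ)) with a Gaussian kernel, and the candidate-subset search (`select_max_entropy_subset`, `n_candidates`, the kernel width `sigma`) is an assumed simple randomized scheme.

```python
import numpy as np

def quadratic_renyi_entropy(X, sigma=1.0):
    """Parzen-window estimate of quadratic Renyi entropy:
    H2 = -log( (1/N^2) * sum_ij exp(-||x_i - x_j||^2 / (2 sigma^2)) )."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T  # pairwise squared distances
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    return -np.log(K.mean())

def select_max_entropy_subset(X, subset_size, n_candidates=50, seed=None):
    """Among randomly drawn candidate subsets, return the one with the
    largest quadratic Renyi entropy (a hypothetical search scheme)."""
    rng = np.random.default_rng(seed)
    best_idx, best_h = None, -np.inf
    for _ in range(n_candidates):
        idx = rng.choice(len(X), size=subset_size, replace=False)
        h = quadratic_renyi_entropy(X[idx])
        if h > best_h:
            best_h, best_idx = h, idx
    return best_idx, best_h

# Toy usage: the selected subset favors well-spread, informative samples.
X = np.random.default_rng(1).normal(size=(30, 2))
idx, h = select_max_entropy_subset(X, subset_size=5, seed=0)
```

Intuitively, a tightly clustered subset has a kernel matrix with entries near 1, giving entropy near 0, while a spread-out subset gives higher entropy; picking the maximum-entropy subset keeps the most informative support vectors during pruning.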

Archived Files and Locations

application/pdf  260.4 kB
file_oqv5w2mgp5dhtnb773lrjnn74i
www.degruyter.com (web)
web.archive.org (webarchive)
Type  article-journal
Stage   published
Date   2015-01-01
Work Entity
access all versions, variants, and formats of this work (e.g., pre-prints)
Catalog Record
Revision: 58bf871f-59fc-4f93-bb46-08265a20596e