Abstract
Suykens et al. (Neurocomputing, 2002, in press) describe a weighted least-squares formulation of the support vector machine for regression problems and present a simple algorithm for sparse approximation of the typically fully dense kernel expansions obtained using this method. In this paper, we present an improved method for achieving sparsity in least-squares support vector machines, which takes into account the residuals for all training patterns, rather than only those incorporated in the sparse kernel expansion. The superiority of this algorithm is demonstrated on the motorcycle and Boston housing data sets.
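The abstract does not spell out the pruning criterion, but the idea it describes can be sketched: greedily remove, at each step, the basis pattern whose deletion least increases the squared residuals summed over all training patterns, not only those retained in the expansion. The Python/NumPy sketch below is a minimal illustration of that reading, not the authors' implementation; the function names (`rbf_kernel`, `fit_reduced`, `prune_ls_svm`), the RBF kernel choice, the ridge penalty standing in for the LS-SVM regulariser, and all parameter values are illustrative assumptions.

```python
# Minimal sketch, NOT the authors' algorithm: greedy backward pruning of a
# least-squares kernel expansion, scoring each candidate deletion by the
# sum of squared residuals over ALL training patterns (as the abstract
# describes), rather than only over the retained basis patterns.
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """Gaussian RBF kernel matrix (an assumed kernel choice)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_reduced(X, y, basis, gamma=1.0, C=10.0):
    """Fit coefficients for an expansion over the retained basis patterns
    by regularised least squares on the FULL training set; a plain ridge
    penalty stands in for the LS-SVM regulariser here."""
    Phi = np.hstack([rbf_kernel(X, X[basis], gamma),
                     np.ones((len(y), 1))])          # bias column
    R = np.eye(Phi.shape[1]) / C
    R[-1, -1] = 0.0                                  # leave the bias unpenalised
    w = np.linalg.solve(Phi.T @ Phi + R, Phi.T @ y)
    return w[:-1], w[-1]                             # alpha, b

def prune_ls_svm(X, y, n_keep, gamma=1.0, C=10.0):
    """Greedily remove the basis pattern whose deletion least increases
    the training-set sum of squared residuals."""
    keep = list(range(len(y)))
    while len(keep) > n_keep:
        best_sse, worst = np.inf, None
        for i in keep:
            trial = [j for j in keep if j != i]
            alpha, b = fit_reduced(X, y, trial, gamma, C)
            resid = y - (rbf_kernel(X, X[trial], gamma) @ alpha + b)
            sse = float(resid @ resid)               # residuals on ALL patterns
            if sse < best_sse:
                best_sse, worst = sse, i
        keep.remove(worst)
    alpha, b = fit_reduced(X, y, keep, gamma, C)
    return keep, alpha, b

# Toy usage on synthetic 1-D regression data.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(60)
basis, alpha, b = prune_ls_svm(X, y, n_keep=10)
print(f"retained {len(basis)} of {len(y)} patterns")
```

The contrast the abstract draws is captured in the scoring step: candidate deletions are judged on residuals for every training pattern, rather than only for the patterns still incorporated in the sparse expansion.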
| Original language | English |
| --- | --- |
| Pages (from-to) | 1025-1031 |
| Number of pages | 7 |
| Journal | Neurocomputing |
| Volume | 48 |
| Issue number | 1-4 |
| DOIs | |
| Publication status | Published - Oct 2002 |