Improved sparse least-squares support vector machines

Gavin C. Cawley, Nicola L. C. Talbot

Research output: Contribution to journal › Article › peer-review

63 Citations (Scopus)

Abstract

Suykens et al. (Neurocomputing (2002), in press) describe a weighted least-squares formulation of the support vector machine for regression problems and present a simple algorithm for sparse approximation of the typically fully dense kernel expansions obtained using this method. In this paper, we present an improved method for achieving sparsity in least-squares support vector machines, which takes into account the residuals for all training patterns, rather than only those incorporated in the sparse kernel expansion. The superiority of this algorithm is demonstrated on the motorcycle and Boston housing data sets.
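The pruning idea described in the abstract can be illustrated with a small sketch. The code below is not the authors' exact algorithm; it is a hedged illustration, assuming an RBF kernel, a ridge-style regularisation parameter `gamma`, and a greedy backward-selection loop in which each candidate basis vector is scored by the sum-of-squares error over *all* training patterns (the key difference from pruning by smallest coefficient magnitude alone). All function names and parameters are illustrative.

```python
import numpy as np

def rbf_kernel(X, Z, width=1.0):
    # Gaussian RBF kernel matrix between rows of X and Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_reduced(K_nS, y, gamma=10.0):
    """Least-squares fit of a reduced kernel expansion
    f(x) = sum_{i in S} beta_i k(x, x_i) + b,
    with the fit evaluated over ALL n training points."""
    n, m = K_nS.shape
    A = np.hstack([K_nS, np.ones((n, 1))])          # design matrix [K | 1]
    # ridge-regularised normal equations
    w = np.linalg.solve(A.T @ A + np.eye(m + 1) / gamma, A.T @ y)
    resid = y - A @ w
    return w, resid @ resid                          # coefficients, SSE on all data

def prune(X, y, target_size, gamma=10.0, width=1.0):
    """Greedy backward pruning: at each step, drop the basis vector whose
    removal least increases the SSE measured over ALL training patterns,
    rather than scoring only the points kept in the sparse expansion."""
    K = rbf_kernel(X, X, width)
    S = list(range(len(y)))
    while len(S) > target_size:
        sse = [fit_reduced(K[:, [j for j in S if j != i]], y, gamma)[1]
               for i in S]
        S.pop(int(np.argmin(sse)))                   # cheapest deletion
    w, sse = fit_reduced(K[:, S], y, gamma)
    return S, w, sse
```

Because every candidate deletion is re-scored against the full training set, the surviving basis vectors are those that matter most for global fit quality, at the cost of an extra least-squares solve per candidate at each pruning step.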
Original language: English
Pages (from-to): 1025-1031
Number of pages: 7
Journal: Neurocomputing
Volume: 48
Issue number: 1-4
DOIs
Publication status: Published - Oct 2002