Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers

Gavin C. Cawley, Nicola L. C. Talbot

Research output: Contribution to journal › Article › peer-review

374 Citations (Scopus)

Abstract

Mika et al. (in: Neural Networks for Signal Processing, Vol. IX, IEEE Press, New York, 1999; pp. 41–48) apply the “kernel trick” to obtain a non-linear variant of Fisher's linear discriminant analysis method, demonstrating state-of-the-art performance on a range of benchmark data sets. We show that leave-one-out cross-validation of kernel Fisher discriminant classifiers can be implemented with a computational complexity of only $\mathcal{O}(\ell^3)$ operations rather than the $\mathcal{O}(\ell^4)$ of a naïve implementation, where $\ell$ is the number of training patterns. Leave-one-out cross-validation then becomes an attractive means of model selection in large-scale applications of kernel Fisher discriminant analysis, as it is significantly faster than the conventional k-fold cross-validation procedures in common use.
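The flavour of the $\mathcal{O}(\ell^3)$ saving can be illustrated with the standard least-squares view of a regularised kernel discriminant, where all leave-one-out residuals follow from a single factorisation via the usual hat-matrix identity for linear smoothers, rather than from $\ell$ separate refits. The sketch below is a minimal NumPy illustration of that general idea, not the authors' exact algorithm; the RBF kernel, the regularisation parameter `lam`, and the $\pm 1$ target coding are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """RBF kernel matrix between rows of X and Z (assumed kernel choice)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfd_loo_residuals(X, y, gamma=1.0, lam=1e-2):
    """Leave-one-out residuals for a regularised least-squares kernel discriminant.

    y holds +1/-1 targets. One O(l^3) factorisation yields the fitted
    coefficients and all l leave-one-out residuals at once, instead of
    refitting l times at O(l^3) each (O(l^4) overall).
    """
    l = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    A = K + lam * np.eye(l)            # regularised kernel system
    A_inv = np.linalg.inv(A)           # single O(l^3) step
    alpha = A_inv @ y                  # dual coefficients
    H = K @ A_inv                      # hat matrix: fitted values f = H y
    residuals = y - H @ y
    # Leave-one-out identity for linear smoothers: r_i^(-i) = r_i / (1 - H_ii)
    loo_residuals = residuals / (1.0 - np.diag(H))
    return alpha, loo_residuals

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
    _, loo = kfd_loo_residuals(X, y, gamma=0.5, lam=1e-2)
    loo_pred = y - loo                 # leave-one-out prediction for each pattern
    print("estimated LOO error rate:", np.mean(np.sign(loo_pred) != y))
```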
Original language: English
Pages (from-to): 2585-2592
Number of pages: 8
Journal: Pattern Recognition
Volume: 36
Issue number: 11
DOIs
Publication status: Published - Nov 2003
