Abstract
Mika et al. (in: Neural Networks for Signal Processing, Vol. IX, IEEE Press, New York, 1999; pp. 41–48) apply the “kernel trick” to obtain a non-linear variant of Fisher's linear discriminant analysis, demonstrating state-of-the-art performance on a range of benchmark data sets. We show that leave-one-out cross-validation of kernel Fisher discriminant classifiers can be implemented with a computational complexity of only $\mathcal{O}(\ell^3)$ operations, rather than the $\mathcal{O}(\ell^4)$ of a naïve implementation, where $\ell$ is the number of training patterns. Leave-one-out cross-validation then becomes an attractive means of model selection in large-scale applications of kernel Fisher discriminant analysis, being significantly faster than the $k$-fold cross-validation procedures in common use.
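To illustrate the complexity gap the abstract describes, the sketch below trains a kernel Fisher discriminant in its regularised least-squares form and computes all leave-one-out predictions two ways: naively, by retraining $\ell$ times ($\mathcal{O}(\ell^4)$ overall), and via the closed-form identity $y_i - f_{-i}(x_i) = \alpha_i / (G^{-1})_{ii}$ with $G = K + \lambda I$, which needs only the single $\mathcal{O}(\ell^3)$ factorisation already used for training. This is a hedged illustration, not a reproduction of the paper's derivation: the toy data, RBF kernel, and parameters `gamma` and `lam` are invented for the example, and the least-squares formulation is one standard route to the efficient LOO result.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class problem (illustrative data, not from the paper).
X = rng.normal(size=(40, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)

def rbf_kernel(A, B, gamma=1.0):
    """RBF kernel matrix between the rows of A and the rows of B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

ell = len(y)                      # number of training patterns
K = rbf_kernel(X, X)
lam = 1e-2                        # regularisation parameter (illustrative)

# Regularised least-squares form of the discriminant:
# solve (K + lam*I) alpha = y -- a single O(ell^3) factorisation.
G = K + lam * np.eye(ell)
Ginv = np.linalg.inv(G)
alpha = Ginv @ y

# Closed-form leave-one-out residuals: for this formulation the
# exact identity  y_i - f_{-i}(x_i) = alpha_i / (G^{-1})_{ii}
# yields all ell LOO predictions for only O(ell^2) further work.
loo_pred = y - alpha / np.diag(Ginv)

# Naive LOO for comparison: retrain ell times, O(ell^4) overall.
naive = np.empty(ell)
for i in range(ell):
    m = np.ones(ell, dtype=bool)
    m[i] = False
    a = np.linalg.solve(K[np.ix_(m, m)] + lam * np.eye(ell - 1), y[m])
    naive[i] = K[i, m] @ a

assert np.allclose(loo_pred, naive)
```

The final assertion confirms that the closed-form LOO predictions agree with explicit retraining, so model selection (e.g. over `gamma` and `lam`) can score each candidate at essentially the cost of a single fit.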
Original language | English
---|---
Pages (from-to) | 2585-2592
Number of pages | 8
Journal | Pattern Recognition
Volume | 36
Issue number | 11
DOIs |
Publication status | Published - Nov 2003