Abstract
Mika et al. (1999) introduce a non-linear formulation of Fisher's linear discriminant, based on the now-familiar "kernel trick", demonstrating state-of-the-art performance on a wide range of real-world benchmark datasets. In this paper, we show that the usual regularisation parameter can be adjusted so as to minimise the leave-one-out cross-validation error with a computational complexity of only O(l²) operations, where l is the number of training patterns, rather than the O(l⁴) operations required for a naive implementation of the leave-one-out procedure. This procedure is then used to form a component of an efficient hierarchical model selection strategy, in which the regularisation parameter is optimised in the inner loop while the kernel parameters are optimised in the outer loop.
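The efficiency gain described in the abstract can be illustrated with a small sketch. It rests on two standard facts rather than on the paper's exact derivation: regularised kernel Fisher discriminant analysis is equivalent (up to scaling of the output) to kernel ridge regression on ±1 labels, and for such least-squares models the leave-one-out residuals have the closed form r_i = (y_i − f(x_i)) / (1 − H_ii), where H = K(K + λI)⁻¹ is the hat matrix. With a one-off O(l³) eigendecomposition of the kernel matrix, each candidate value of the regularisation parameter λ then costs only O(l²), which is what makes λ cheap to tune in the inner loop of a model selection strategy. The function name `loo_error` and the RBF kernel used below are illustrative choices, not identifiers from the paper.

```python
import numpy as np

def loo_error(K, y, lam, eig=None):
    """Leave-one-out misclassification rate of a regularised kernel
    least-squares classifier (equivalent to KFD up to output scaling).

    K   : (l, l) symmetric PSD kernel matrix
    y   : (l,) labels in {-1, +1}
    lam : regularisation parameter
    eig : optional precomputed np.linalg.eigh(K), so that sweeping over
          many lam values reuses the one-off O(l^3) decomposition and
          each call costs only O(l^2).
    """
    if eig is None:
        eig = np.linalg.eigh(K)          # K = V diag(w) V^T
    w, V = eig
    # Hat matrix H = K (K + lam I)^{-1} = V diag(w / (w + lam)) V^T
    s = w / (w + lam)
    Hy = V @ (s * (V.T @ y))             # fitted values f(x_i), O(l^2)
    h = np.einsum('ij,j,ij->i', V, s, V)  # diag(H), O(l^2)
    # Closed-form leave-one-out residuals; the LOO prediction for
    # pattern i is y_i - r_i, without refitting the model l times.
    r = (y - Hy) / (1.0 - h)
    return np.mean(y * (y - r) <= 0.0)   # fraction of LOO sign errors
```

A naive implementation would refit the discriminant l times at O(l³) each, i.e. O(l⁴) in total; here the per-λ cost after the initial decomposition is O(l²), so a one-dimensional search over λ (the inner loop) is inexpensive compared with re-evaluating the kernel parameters (the outer loop).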
| Original language | English |
| --- | --- |
| Pages | 427-430 |
| Number of pages | 4 |
| DOIs | |
| Publication status | Published - Aug 2004 |
| Event | 17th International Conference on Pattern Recognition - Cambridge, United Kingdom. Duration: 23 Aug 2004 → 26 Aug 2004 |
Conference
| Conference | 17th International Conference on Pattern Recognition |
| --- | --- |
| Abbreviated title | ICPR-2004 |
| Country/Territory | United Kingdom |
| City | Cambridge |
| Period | 23/08/04 → 26/08/04 |