Abstract
Kernel logistic regression models, like their linear counterparts, can be trained using the efficient iteratively reweighted least-squares (IRWLS) algorithm. This approach suggests an approximate leave-one-out cross-validation estimator based on an existing method for exact leave-one-out cross-validation of least-squares models. Results compiled over seven benchmark datasets are presented for kernel logistic regression with model selection procedures based on both conventional k-fold and approximate leave-one-out cross-validation criteria, demonstrating that the proposed approach is viable.
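The sketch below illustrates the general idea described in the abstract: kernel logistic regression is fitted by IRWLS, and because the final Newton step is a weighted ridge regression on the working responses, the closed-form leave-one-out residual for least-squares models can be re-used to build an approximate leave-one-out criterion. This is a minimal illustration under assumed choices (an RBF kernel, a regulariser `lam` on the dual coefficients, and the hat-matrix leverage formula), not the paper's exact estimator or code.

```python
# Hypothetical sketch: kernel logistic regression via IRWLS, plus an
# approximate leave-one-out (LOO) criterion obtained by treating the
# converged model as a fixed weighted least-squares fit.
import numpy as np


def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian RBF kernel: K[i, j] = exp(-gamma * ||x_i - z_j||^2)."""
    sq = (X**2).sum(1)[:, None] + (Z**2).sum(1)[None, :] - 2.0 * X @ Z.T
    return np.exp(-gamma * np.maximum(sq, 0.0))


def sigmoid(f):
    return 1.0 / (1.0 + np.exp(-f))


def klr_irwls(K, y, lam=1.0, n_iter=25, tol=1e-8):
    """Fit f = K @ alpha by iteratively reweighted least squares.

    Each Newton step solves the weighted ridge system
        (K + lam * W^{-1}) alpha = z,
    where W = diag(p * (1 - p)) and z = f + (y - p) / w is the working response.
    """
    alpha = np.zeros(len(y))
    f = K @ alpha
    for _ in range(n_iter):
        p = sigmoid(f)
        w = np.clip(p * (1.0 - p), 1e-8, None)   # IRWLS weights
        z = f + (y - p) / w                      # working (adjusted) responses
        alpha_new = np.linalg.solve(K + lam * np.diag(1.0 / w), z)
        f_new = K @ alpha_new
        if np.max(np.abs(f_new - f)) < tol:
            return alpha_new, f_new
        alpha, f = alpha_new, f_new
    return alpha, f


def approx_loo_nll(K, y, f, lam=1.0):
    """Approximate LOO negative log-likelihood at the converged solution.

    Treating the last IRWLS step as a weighted ridge regression with fixed
    weights, the usual closed-form LOO residual
        z_i - f_i^{(-i)} = (z_i - f_i) / (1 - h_ii)
    is applied to the working responses z, where H = K (K + lam W^{-1})^{-1}
    is the smoother ("hat") matrix of that least-squares problem.
    """
    p = sigmoid(f)
    w = np.clip(p * (1.0 - p), 1e-8, None)
    z = f + (y - p) / w
    H = K @ np.linalg.inv(K + lam * np.diag(1.0 / w))
    h = np.diag(H)
    f_loo = z - (z - f) / np.clip(1.0 - h, 1e-12, None)  # approx. held-out outputs
    p_loo = np.clip(sigmoid(f_loo), 1e-12, 1.0 - 1e-12)
    return -np.mean(y * np.log(p_loo) + (1.0 - y) * np.log(1.0 - p_loo))


if __name__ == "__main__":
    # Toy non-linear problem; in the paper this role is played by the
    # seven benchmark datasets.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(80, 2))
    y = (X[:, 0] * X[:, 1] > 0).astype(float)
    K = rbf_kernel(X, X, gamma=0.5)
    # Model selection: choose the regularisation parameter minimising the
    # approximate LOO criterion instead of running k-fold cross-validation.
    for lam in (0.01, 0.1, 1.0, 10.0):
        alpha, f = klr_irwls(K, y, lam=lam)
        nll = approx_loo_nll(K, y, f, lam=lam)
        print(f"lam={lam:5.2f}  approx-LOO NLL={nll:.4f}")
```

The appeal of this style of estimator, as the abstract notes, is that the leverages needed for the approximate LOO score fall out of quantities already formed during the final IRWLS step, so model selection is far cheaper than refitting the model n times or running k-fold cross-validation.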
| Original language | English |
| --- | --- |
| Pages | 439-442 |
| Number of pages | 4 |
| DOIs | |
| Publication status | Published - Aug 2004 |
| Event | 17th International Conference on Pattern Recognition - Cambridge, United Kingdom<br>Duration: 23 Aug 2004 → 26 Aug 2004 |
Conference
| Conference | 17th International Conference on Pattern Recognition |
| --- | --- |
| Abbreviated title | ICPR-2004 |
| Country/Territory | United Kingdom |
| City | Cambridge |
| Period | 23/08/04 → 26/08/04 |