Kernel logistic regression models, like their linear counterparts, can be trained using the efficient iteratively reweighted least-squares (IRWLS) algorithm. This formulation suggests an approximate leave-one-out cross-validation estimator, based on an existing method for exact leave-one-out cross-validation of least-squares models. Results compiled over seven benchmark datasets are presented for kernel logistic regression with model selection procedures based on both conventional k-fold and approximate leave-one-out cross-validation criteria, demonstrating that the proposed approach is viable.
Number of pages: 4
Publication status: Published - Aug 2004
Event: Proceedings of the 17th International Conference on Pattern Recognition (ICPR-2004)
Duration: 23 Aug 2004 → 26 Aug 2004
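The IRWLS training procedure mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it fits kernel logistic regression by repeatedly solving the reweighted linear system (WK + λI)α = Wz, where z is the working response f + W⁻¹(y − p). The kernel choice (RBF), the function names, and all parameter values are assumptions for the example; the paper's approximate leave-one-out estimator, which reuses the final weighted least-squares system, is not shown.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian (RBF) kernel matrix; kernel choice is an assumption here.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def train_klr(K, y, lam=1.0, n_iter=25, tol=1e-8):
    """Kernel logistic regression via IRWLS (Newton's method).

    Each iteration solves the weighted least-squares system
        (W K + lam * I) alpha = W z,
    with working response z = f + (y - p) / w, which is the
    kernelised form of the IRWLS update for logistic regression.
    """
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        f = K @ alpha                       # current latent scores
        p = 1.0 / (1.0 + np.exp(-f))        # predicted probabilities
        w = np.clip(p * (1 - p), 1e-10, None)  # IRWLS weights, floored
        z = f + (y - p) / w                 # working response
        A = w[:, None] * K + lam * np.eye(n)
        alpha_new = np.linalg.solve(A, w * z)
        if np.max(np.abs(alpha_new - alpha)) < tol:
            return alpha_new
        alpha = alpha_new
    return alpha
```

Because each IRWLS step is an ordinary weighted least-squares solve, the classical closed-form leave-one-out residuals for least-squares models can be applied to the final system, which is the basis of the approximate leave-one-out criterion described in the abstract.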