Abstract
The generalised linear model (GLM) is the standard approach in classical statistics for regression tasks where it is appropriate to measure the data misfit using a likelihood drawn from the exponential family of distributions. In this paper, we apply the kernel trick to give a non-linear variant of the GLM, the generalised kernel machine (GKM), in which a regularised GLM is constructed in a fixed feature space implicitly defined by a Mercer kernel. The MATLAB Symbolic Math Toolbox is used to automatically create a suite of generalised kernel machines, including methods for automated model selection based on approximate leave-one-out cross-validation. In doing so, we provide a common framework encompassing a wide range of existing and novel kernel learning methods, and highlight their connections with earlier techniques from classical statistics. Examples including kernel ridge regression, kernel logistic regression and kernel Poisson regression are given to demonstrate the flexibility and utility of the generalised kernel machine.
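The simplest member of the family described above is kernel ridge regression, for which exact leave-one-out residuals are available in closed form, making the model-selection strategy mentioned in the abstract cheap to evaluate. The sketch below (not the authors' MATLAB code; function names, the RBF kernel choice, and the synthetic data are illustrative assumptions) fits a kernel ridge model in the dual and selects the regularisation parameter by minimising the leave-one-out mean squared error:

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) Mercer kernel: k(x, z) = exp(-gamma * ||x - z||^2)
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_ridge_loo(K, y, lam):
    # Dual coefficients: alpha = (K + lam * I)^{-1} y
    n = len(y)
    C_inv = np.linalg.inv(K + lam * np.eye(n))
    alpha = C_inv @ y
    # Closed-form leave-one-out residuals for ridge regression:
    # y_i - f_{-i}(x_i) = alpha_i / [C^{-1}]_{ii}
    loo_resid = alpha / np.diag(C_inv)
    return alpha, np.mean(loo_resid ** 2)

# Illustrative synthetic data (an assumption, not from the paper).
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
K = rbf_kernel(X, X, gamma=0.5)

# Model selection: pick the ridge parameter with the lowest LOO MSE.
lams = [10.0 ** p for p in range(-6, 1)]
scores = [kernel_ridge_loo(K, y, lam)[1] for lam in lams]
best = lams[int(np.argmin(scores))]
```

For non-Gaussian likelihoods such as the logistic or Poisson cases, the leave-one-out criterion is only approximate (the paper's iteratively reweighted fitting makes an exact closed form unavailable), but the same pattern of reusing the fitted system's inverse applies.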
Original language | English |
---|---|
Pages | 1732-1737 |
Number of pages | 6 |
DOIs | |
Publication status | Published - 2007 |
Event | IEEE/INNS International Joint Conference on Neural Networks - Orlando, United States |
Duration | 12 Aug 2007 → 17 Aug 2007 |
Conference
Conference | IEEE/INNS International Joint Conference on Neural Networks |
---|---|
Abbreviated title | IJCNN-2007 |
Country/Territory | United States |
City | Orlando |
Period | 12/08/07 → 17/08/07 |