Abstract

The generalised linear model (GLM) is the standard approach in classical statistics for regression tasks where it is appropriate to measure the data misfit using a likelihood drawn from the exponential family of distributions. In this paper, we apply the kernel trick to give a non-linear variant of the GLM, the generalised kernel machine (GKM), in which a regularised GLM is constructed in a fixed feature space implicitly defined by a Mercer kernel. The MATLAB symbolic maths toolbox is used to automatically create a suite of generalised kernel machines, including methods for automated model selection based on approximate leave-one-out cross-validation. In doing so, we provide a common framework encompassing a wide range of existing and novel kernel learning methods, and highlight their connections with earlier techniques from classical statistics. Examples including kernel ridge regression, kernel logistic regression and kernel Poisson regression are given to demonstrate the flexibility and utility of the generalised kernel machine.
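The simplest instance named in the abstract, kernel ridge regression, is the GKM with a Gaussian likelihood and identity link. A minimal sketch of it is given below, including the closed-form leave-one-out residuals that are exact in the Gaussian case and underlie the kind of approximate LOO model-selection criterion the abstract mentions. This is an illustrative reconstruction, not the paper's MATLAB implementation; the function names and parameter values are assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian (RBF) Mercer kernel: k(x, z) = exp(-gamma * ||x - z||^2)."""
    sq = (X ** 2).sum(1)[:, None] + (Z ** 2).sum(1)[None, :] - 2.0 * X @ Z.T
    return np.exp(-gamma * sq)

def fit_krr(X, y, lam=0.1, gamma=1.0):
    """Kernel ridge regression: a regularised linear model in the feature
    space implicitly defined by the kernel.

    Returns the dual coefficients alpha solving (K + lam*I) alpha = y,
    plus the closed-form leave-one-out residuals e_i = alpha_i / (C^{-1})_ii,
    exact for the Gaussian likelihood.
    """
    C = rbf_kernel(X, X, gamma) + lam * np.eye(len(X))
    C_inv = np.linalg.inv(C)
    alpha = C_inv @ y
    loo_residuals = alpha / np.diag(C_inv)
    return alpha, loo_residuals

def predict_krr(X_train, alpha, X_new, gamma=1.0):
    """Predict f(x) = sum_i alpha_i k(x_i, x)."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

The other GKMs in the paper (kernel logistic and kernel Poisson regression) replace the linear solve with iteratively re-weighted least squares for the corresponding exponential-family likelihood; the Gaussian case above is the one where both the fit and the LOO residuals have closed forms.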
Original language: English
Pages: 1732-1737
Number of pages: 6
Publication status: Published - 2007
Event: IEEE/INNS International Joint Conference on Neural Networks - Orlando, United States
Duration: 12 Aug 2007 - 17 Aug 2007

Conference

Conference: IEEE/INNS International Joint Conference on Neural Networks
Abbreviated title: IJCNN-2007
Country/Territory: United States
City: Orlando
Period: 12/08/07 - 17/08/07