Over-Fitting in Model Selection with Gaussian Process Regression

Rekar O. Mohammed, Gavin C. Cawley

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

25 Citations (Scopus)
80 Downloads (Pure)

Abstract

Model selection in Gaussian Process Regression (GPR) seeks to determine the optimal values of the hyper-parameters governing the covariance function, which allows flexible customization of the GP to the problem at hand. An oft-overlooked issue in the model selection process is over-fitting the model selection criterion, typically the marginal likelihood. Over-fitting in machine learning refers to fitting the random noise present in the model selection criterion in addition to features that improve the generalisation performance of the statistical model. In this paper, we construct several Gaussian process regression models for a range of high-dimensional datasets from the UCI machine learning repository. We then compare the MSE on the test dataset with the negative log marginal likelihood (nlZ), used as the model selection criterion, to determine whether the problem of over-fitting in model selection also affects GPR. We find that the squared exponential covariance function with Automatic Relevance Determination (SEard) outperforms other kernels, including the squared exponential covariance function with an isotropic distance measure (SEiso), according to the nlZ, but it is clearly not the best according to MSE on the test data, which is an indication of over-fitting in model selection.
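The comparison the abstract describes — evaluating both the negative log marginal likelihood (nlZ) and the test-set MSE for squared-exponential kernels — can be sketched in plain numpy. This is an illustrative reconstruction, not the authors' code: the toy data, the fixed hyper-parameter values, and all function names are assumptions; SEiso corresponds to a single shared length-scale, SEard to one length-scale per input dimension.

```python
import numpy as np

def se_kernel(X1, X2, lengthscales, signal_var):
    # Squared-exponential kernel. `lengthscales` is a single shared value
    # (SEiso) or a per-dimension vector (SEard).
    sqdist = ((X1[:, None, :] - X2[None, :, :]) / lengthscales) ** 2
    return signal_var * np.exp(-0.5 * sqdist.sum(axis=-1))

def neg_log_marginal_likelihood(X, y, lengthscales, signal_var, noise_var):
    # nlZ = 0.5 y^T K^{-1} y + 0.5 log|K| + (n/2) log(2*pi),
    # computed stably via the Cholesky factor of K.
    n = len(y)
    K = se_kernel(X, X, lengthscales, signal_var) + noise_var * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * n * np.log(2 * np.pi)

def gp_predict_mean(Xtr, ytr, Xte, lengthscales, signal_var, noise_var):
    # Predictive mean of the noise-free latent function at the test inputs.
    K = se_kernel(Xtr, Xtr, lengthscales, signal_var) + noise_var * np.eye(len(ytr))
    Ks = se_kernel(Xte, Xtr, lengthscales, signal_var)
    return Ks @ np.linalg.solve(K, ytr)

# Toy 2-D regression problem (assumed data, for illustration only).
rng = np.random.default_rng(0)
Xtr = rng.uniform(-3, 3, size=(40, 2))
ytr = np.sin(Xtr[:, 0]) + 0.1 * rng.standard_normal(40)
Xte = rng.uniform(-3, 3, size=(40, 2))
yte = np.sin(Xte[:, 0])

# SEiso (one shared length-scale) vs SEard (one per dimension).
nlz_iso = neg_log_marginal_likelihood(Xtr, ytr, np.array([1.0]), 1.0, 0.01)
nlz_ard = neg_log_marginal_likelihood(Xtr, ytr, np.array([1.0, 1.0]), 1.0, 0.01)
mu = gp_predict_mean(Xtr, ytr, Xte, np.array([1.0, 1.0]), 1.0, 0.01)
mse = np.mean((mu - yte) ** 2)
```

In the paper's experiments the hyper-parameters are tuned by minimising nlZ; the point of the comparison is that the kernel achieving the lowest nlZ (here SEard, with its extra per-dimension length-scales) need not achieve the lowest test MSE.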
Original language: English
Title of host publication: Machine Learning and Data Mining in Pattern Recognition
Subtitle of host publication: 13th International Conference, MLDM 2017, New York, NY, USA, July 15-20, 2017, Proceedings
Editors: Petra Perner
Publisher: Springer
Pages: 192-205
Number of pages: 14
Volume: 10358
Edition: 1
ISBN (Electronic): 978-3-319-62416-7
ISBN (Print): 978-3-319-62415-0
DOIs
Publication status: Published - 15 Jul 2017

Publication series

Name: Lecture Notes in Computer Science
Name: Lecture Notes in Artificial Intelligence
