Model selection: Beyond the Bayesian/frequentist divide

Isabelle Guyon, Amir Saffari, Gideon Dror, Gavin Cawley

Research output: Contribution to journal › Article

139 Citations (Scopus)

Abstract

The principle of parsimony, also known as "Ockham's razor", has inspired many theories of model selection. Yet such theories, all making arguments in favor of parsimony, are based on very different premises and have developed distinct methodologies to derive algorithms. We have organized challenges and edited a special issue of JMLR and several conference proceedings around the theme of model selection. In this editorial, we revisit the problem of avoiding overfitting in light of the latest results. We note the remarkable convergence, in some approaches, of theories as different as Bayesian theory, Minimum Description Length, bias/variance tradeoff, Structural Risk Minimization, and regularization. We also present new and interesting examples of the complementarity of theories leading to hybrid algorithms, neither frequentist nor Bayesian, or perhaps both frequentist and Bayesian!
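
As a concrete illustration of the two families of criteria the editorial contrasts, the sketch below selects the degree of a polynomial regressor in two ways: by k-fold cross-validation (a frequentist, resampling-based estimate of generalization error) and by BIC (an asymptotic approximation to the Bayesian marginal likelihood). This is a minimal sketch, not from the paper; the synthetic data, function names, and parameters are all assumptions made for illustration.

```python
# Illustrative only: model selection by a frequentist criterion (k-fold CV)
# versus a Bayesian-motivated one (BIC). Not the paper's method.
import numpy as np

rng = np.random.default_rng(0)
n = 60
x = np.sort(rng.uniform(-1.0, 1.0, n))
y = np.sin(3.0 * x) + rng.normal(scale=0.3, size=n)  # hypothetical noisy target

def fit_rss(x_tr, y_tr, x_te, y_te, degree):
    """Least-squares polynomial fit; residual sum of squares on test points."""
    coeffs = np.polyfit(x_tr, y_tr, degree)
    resid = y_te - np.polyval(coeffs, x_te)
    return float(resid @ resid)

def cv_error(x, y, degree, k=5):
    """Mean squared error estimated by k-fold cross-validation."""
    folds = np.array_split(rng.permutation(len(x)), k)
    sse = 0.0
    for test_idx in folds:
        train_idx = np.setdiff1d(np.arange(len(x)), test_idx)
        sse += fit_rss(x[train_idx], y[train_idx], x[test_idx], y[test_idx], degree)
    return sse / len(x)

def bic(x, y, degree):
    """BIC under a Gaussian noise model: n*log(RSS/n) + p*log(n), p = degree + 1."""
    rss = fit_rss(x, y, x, y, degree)
    p = degree + 1  # number of polynomial coefficients
    return len(x) * np.log(rss / len(x)) + p * np.log(len(x))

degrees = range(1, 11)
best_cv = min(degrees, key=lambda d: cv_error(x, y, d))
best_bic = min(degrees, key=lambda d: bic(x, y, d))
print(f"degree chosen by 5-fold CV: {best_cv}")
print(f"degree chosen by BIC:       {best_bic}")
```

On data like the above, the two criteria often agree on a low-degree model, which echoes the convergence of theories the abstract describes; where they disagree, combining them is one route to the hybrid algorithms the editorial mentions.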
Original language: English
Pages (from-to): 61-87
Number of pages: 27
Journal: Journal of Machine Learning Research
Volume: 11
Publication status: Published - Jan 2010