Manipulation of prior probabilities in support vector classifications

Gavin C. Cawley, Nicola L. C. Talbot

Research output: Contribution to conference › Paper

3 Citations (Scopus)

Abstract

Asymmetric margin error costs for positive and negative examples are often cited as an efficient heuristic compensating for unrepresentative priors in training support vector classifiers. In this paper we show that this heuristic is well justified via simple re-sampling ideas applied to the dual Lagrangian defining the 1-norm soft-margin support vector machine. This observation also provides a simple expression for the asymptotically optimal ratio of margin error penalties, eliminating the need for the trial-and-error experimentation normally encountered. This method allows the use of a smaller, balanced training data set in problems characterised by widely disparate prior probabilities, reducing the training time. The usefulness of this method is then demonstrated on a real-world benchmark problem.
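The heuristic described in the abstract can be sketched with a library that exposes per-class margin error penalties. The following is a minimal illustration (not the authors' implementation) using scikit-learn's `SVC`, whose `class_weight` parameter multiplies the penalty `C` for each class; the toy data, the assumed operational priors, and the weight formula shown here are illustrative assumptions in the spirit of the paper's prior-ratio argument.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical imbalanced training sample: negatives outnumber
# positives 9:1, but the deployment priors are assumed to be equal.
rng = np.random.default_rng(0)
X_neg = rng.normal(loc=-1.0, size=(180, 2))
X_pos = rng.normal(loc=+1.0, size=(20, 2))
X = np.vstack([X_neg, X_pos])
y = np.hstack([np.zeros(180), np.ones(20)])

# Training-set priors versus the (assumed) true operational priors.
train_prior_pos = 20 / 200   # 0.1
true_prior_pos = 0.5

# Scale each class's margin error penalty by the ratio of its true
# prior to its training-set prior, so the class under-represented in
# the training sample is penalised more heavily for margin errors.
w_pos = true_prior_pos / train_prior_pos                 # 5.0
w_neg = (1 - true_prior_pos) / (1 - train_prior_pos)     # ~0.556

# class_weight multiplies C per class: C_i = C * class_weight[y_i].
clf = SVC(kernel="linear", C=1.0, class_weight={0: w_neg, 1: w_pos})
clf.fit(X, y)
```

With the penalties rebalanced this way, the decision boundary is no longer dragged toward the minority class by the skewed sample, which is the effect the closed-form penalty ratio in the paper is meant to achieve without trial-and-error tuning.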
Original language: English
Pages: 2433-2438
Number of pages: 6
Publication status: Published - Jul 2001
Event: 2001 International Joint Conference on Neural Networks, Washington DC, United States
Duration: 15 Jul 2001 – 19 Jul 2001

Conference

Conference: 2001 International Joint Conference on Neural Networks
Abbreviated title: IJCNN-2001
Country/Territory: United States
City: Washington DC
Period: 15/07/01 – 19/07/01
