One-Class Classification with Subgaussians

Amir Madany Mamlouk, Jan T. Kim, Erhardt Barth, Michael Brauckmann, Thomas Martinetz

Research output: Chapter in Book/Report/Conference proceeding › Chapter

3 Citations (Scopus)

Abstract

If a simple and fast solution for one-class classification is required, the most common approach is to assume a Gaussian distribution for the patterns of the single class. Bayesian classification then reduces to simple template matching. In this paper we show, for two very different applications, that classification performance can be improved significantly if a more uniform, subgaussian class distribution is assumed instead of a Gaussian one. One application is face detection; the other is the detection of transcription factor binding sites on a genome. As in the Gaussian case, the distance from a template, i.e., the distribution center, determines a pattern's class assignment. However, depending on the distribution assumed, maximum likelihood learning leads to different templates from the training data. These new templates yield significant improvements in classification performance.
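The idea in the abstract can be illustrated with a minimal sketch (not the authors' implementation): model the single class with a generalized-Gaussian density proportional to exp(-Σ|x_j - t_j|^p), so that maximum-likelihood fitting of the template t minimizes the mean p-th-power deviation. The exponent p, the gradient-descent fitting, and the function names below are assumptions for illustration; p = 2 recovers the Gaussian template (the sample mean), while p > 2 corresponds to a more uniform, subgaussian class and pulls the template toward the midrange of the data.

```python
import numpy as np

def ml_template(X, p=2.0, lr=0.05, iters=2000):
    """Maximum-likelihood template under an assumed generalized-Gaussian
    class model p(x) ~ exp(-sum_j |x_j - t_j|^p): minimize the mean of
    |x_i - t|^p coordinate-wise by gradient descent.  p = 2 yields the
    Gaussian template (sample mean); larger p models a more uniform,
    subgaussian class and shifts the template toward the midrange."""
    t = X.mean(axis=0)                      # start from the Gaussian solution
    for _ in range(iters):
        d = X - t
        grad = -p * np.sign(d) * np.abs(d) ** (p - 1)
        t = t - lr * grad.mean(axis=0)
    return t

def matches_class(x, template, radius):
    """One-class decision by template matching: accept a pattern iff its
    Euclidean distance to the template is below a fixed threshold."""
    return np.linalg.norm(x - template) <= radius
```

On toy 1-D data such as [0, 0, 0, 1], p = 2 leaves the template at the mean (0.25), whereas p = 8 moves it toward the midrange (0.5), changing which patterns fall inside the acceptance radius.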
Original language: English
Title of host publication: Pattern Recognition
Publisher: Springer
Pages: 346-353
Number of pages: 8
Volume: 2781
ISBN (Print): 978-3-540-40861-1
Publication status: Published - 2003
Event: 25th DAGM Symposium - Magdeburg, Germany
Duration: 10 Sep 2003 - 12 Sep 2003

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer Verlag, Berlin Heidelberg

Conference

Conference: 25th DAGM Symposium
Country/Territory: Germany
City: Magdeburg
Period: 10/09/03 - 12/09/03
