Probability distribution normalisation of data applied to neural net classifiers

Graham D. Tattersall

Research output: Contribution to journal › Article › peer-review

Abstract

The individual elements of pattern vectors generated by real systems often have widely different value ranges. Direct application of these patterns to a distance-based classifier such as a multilayer perceptron can cause the large value range elements to dominate in the classification decision. A commonly used remedy is to normalise the variance of each pattern element before use. However, the author shows that this approach is often inappropriate and that better results can be obtained by nonlinearly scaling the pattern elements to render their probability distributions approximately uniform as well as giving them the same variance.
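The abstract's idea of nonlinearly rescaling each pattern element to an approximately uniform distribution can be sketched with a rank (empirical CDF) transform; the paper's exact procedure is not given here, so the function name and the rank-based construction below are illustrative assumptions, not the author's implementation:

```python
import numpy as np

def uniformise(X):
    """Sketch of distribution normalisation: map each column of X
    through its empirical CDF so its distribution is approximately
    uniform, then centre and scale so every column has (roughly)
    zero mean and unit variance. Illustrative only; the paper may
    use a different construction."""
    X = np.asarray(X, dtype=float)
    n, d = X.shape
    out = np.empty_like(X)
    for j in range(d):
        # Double argsort yields the rank of each value; dividing by n
        # gives an empirical-CDF value in (0, 1).
        ranks = np.argsort(np.argsort(X[:, j]))
        out[:, j] = (ranks + 0.5) / n
    # A uniform distribution on (0, 1) has mean 1/2 and variance 1/12,
    # so this shift and scale gives zero mean and ~unit variance.
    return (out - 0.5) * np.sqrt(12.0)
```

Unlike plain variance normalisation, this also flattens heavy-tailed or skewed element distributions, so no single outlying element dominates the distance computation in the classifier.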
Original language: English
Pages (from-to): 56-57
Number of pages: 2
Journal: IEE Electronics Letters
Volume: 30
Issue number: 1
DOIs
Publication status: Published - Jan 1994