Diversity between Neural Networks and Decision Trees for Building Multiple Classifier Systems

Wenjia Wang, Phillis Jones, Derek Partridge

Research output: Chapter in Book/Report/Conference proceeding › Chapter

37 Citations (Scopus)

Abstract

A multiple classifier system can only improve performance when its members are diverse from each other. Combining methodologically different techniques is considered a constructive way to increase this diversity. This paper investigates the diversity between two different data-mining techniques: neural networks and automatically induced decision trees. Input decimation through salient feature selection is also explored in the hope of acquiring further diversity. Among the various diversity measures defined, coincident failure diversity (CFD) appears to be an effective measure of useful diversity among classifiers in a multiple classifier system when the majority-voting decision strategy is applied. A real-world medical classification problem is presented as an application of the techniques. The constructed multiple classifier systems are evaluated with a number of statistical measures in terms of reliability and generalisation. The results indicate that combined MCSs of nets and trees trained with the selected features have higher diversity and produce better classification results.
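The two quantities central to the abstract, coincident failure diversity and majority voting, can be sketched as follows. This is a minimal illustration based on the standard CFD definition of Partridge and Krzanowski (CFD = 0 when no classifier ever fails; otherwise a weighted sum of the probabilities that exactly k of the N classifiers fail on the same input, normalised by 1 − p₀); the function names and toy failure matrix are hypothetical, not taken from the paper:

```python
from collections import Counter

def coincident_failure_diversity(fail_matrix):
    """CFD for an ensemble.

    fail_matrix: rows = test inputs, columns = classifiers;
    entry 1 means that classifier misclassified that input.
    Returns 1.0 when failures never coincide (at most one
    classifier fails on any input) and 0.0 when all classifiers
    always fail together.
    """
    n_inputs = len(fail_matrix)
    n_clf = len(fail_matrix[0])
    # counts[k] = number of inputs on which exactly k classifiers fail
    counts = [0] * (n_clf + 1)
    for row in fail_matrix:
        counts[sum(row)] += 1
    p = [c / n_inputs for c in counts]  # p[k] = P(exactly k failures)
    if p[0] == 1.0:
        return 0.0
    weighted = sum((n_clf - k) / (n_clf - 1) * p[k]
                   for k in range(1, n_clf + 1))
    return weighted / (1.0 - p[0])

def majority_vote(predictions):
    """Return the most frequent label among the member predictions."""
    return Counter(predictions).most_common(1)[0][0]
```

For example, with three classifiers whose failures never overlap, `coincident_failure_diversity([[1,0,0],[0,1,0],[0,0,1],[0,0,0]])` is 1.0: a majority vote recovers the correct answer on every input, which is exactly the kind of "useful diversity" the measure is designed to capture.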
Original language: English
Title of host publication: Multiple Classifier Systems
Publisher: Springer Berlin / Heidelberg
Pages: 240-249
Number of pages: 10
Volume: 1857
ISBN (Print): 978-3-540-67704-8
DOIs
Publication status: Published - 2000
Event: First International Workshop on Multiple Classifier Systems (MCS 2000) - Cagliari, Italy
Duration: 21 Jun 2000 - 23 Jun 2000

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer Berlin / Heidelberg

Conference

Conference: First International Workshop on Multiple Classifier Systems (MCS 2000)
Country/Territory: Italy
City: Cagliari
Period: 21/06/00 - 23/06/00
