A Comparative Evaluation of Feature Set Evolution Strategies for Multirelational Boosting

Susanne Hoche, Stefan Wrobel

Research output: Chapter in Book/Report/Conference proceeding › Chapter


Boosting has established itself as a successful technique for decreasing the generalization error of classification learners by basing predictions on ensembles of hypotheses. Previous research has shown that this technique can be made to work efficiently even in the context of multirelational learning by using simple learners and active feature selection; however, such approaches have relied on simple, static methods that determine the feature selection ordering a priori and add features only in a forward manner. In this paper, we investigate whether the distributional information present in boosting can usefully be exploited in the course of learning to reweight features, and indeed to dynamically adapt the feature set by adding the currently most relevant features and removing those that are no longer needed. Preliminary results show that, surprisingly, these more informed feature set evolution strategies have mixed effects both on the number of features ultimately used in the ensemble and on the resulting classification accuracy.
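The core idea the abstract describes — using the example distribution maintained by boosting to score feature relevance and keep only the currently most useful features — can be illustrated with a minimal sketch. This is a hypothetical illustration, not the authors' implementation: the functions `weighted_feature_scores` and `select_features` and the stump-style relevance score are assumptions chosen for clarity, and real multirelational features would be more complex than the binary columns used here.

```python
import numpy as np

def weighted_feature_scores(X, y, w):
    """Score each binary feature by its weighted accuracy as a one-feature
    predictor (decision stump) under the current boosting distribution w.
    X: (n_examples, n_features) 0/1 matrix; y: 0/1 labels; w: example weights."""
    w = w / w.sum()  # normalize to a probability distribution, as in AdaBoost
    scores = []
    for j in range(X.shape[1]):
        # Weighted fraction of examples where feature j's value equals the label.
        agree = np.sum(w * (X[:, j] == y))
        # A stump may also predict the negated feature, so take the better side.
        scores.append(max(agree, 1.0 - agree))
    return np.array(scores)

def select_features(X, y, w, k):
    """Return indices of the k currently most relevant features, so the
    feature set can be adapted between boosting rounds."""
    scores = weighted_feature_scores(X, y, w)
    return np.argsort(scores)[::-1][:k]
```

Between boosting rounds, the example weights `w` shift toward hard examples, so the ranking produced by `select_features` changes over time; features that drop out of the top `k` are the candidates for removal that the paper's dynamic strategies consider.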
Original language: English
Title of host publication: Inductive Logic Programming
Editors: Tamás Horváth, Akihiro Yamamoto
Publisher: Springer Berlin / Heidelberg
Number of pages: 17
ISBN (Print): 978-3-540-20144-1
Publication status: Published - 2003
Event: 13th International Conference, ILP 2003 - Szeged, Hungary
Duration: 29 Sep 2003 – 1 Oct 2003

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer Berlin / Heidelberg


Conference: 13th International Conference, ILP 2003
