A Significantly Faster Elastic-Ensemble for Time-Series Classification

George Oastler, Jason Lines

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

The Elastic Ensemble [7] has one of the longest build times of all constituents of the current state-of-the-art algorithm for time-series classification: the Hierarchical Vote Collective of Transformation-based Ensembles (HIVE-COTE) [8]. We investigate two simple and intuitive techniques to reduce the time spent training the Elastic Ensemble and consequently reduce HIVE-COTE train time. Both techniques reduce the effort involved in tuning the parameters of each constituent nearest-neighbour classifier of the Elastic Ensemble. Firstly, we decrease the parameter space of each constituent to reduce tuning effort. Secondly, we limit the number of training series used by each nearest-neighbour classifier to reduce the cost of evaluating each parameter option during tuning. Experimentation over 10 folds of the UEA/UCR time-series classification problems shows that both techniques yield much faster build times and, crucially, that combining them yields an even greater speedup, all without a significant loss in accuracy.
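The two speed-ups described above can be illustrated on a single Elastic Ensemble constituent: a 1-NN classifier whose elastic distance (here, DTW with a warping window) is tuned by leave-one-out cross-validation. The sketch below is illustrative only and assumes a reduced candidate grid of window sizes (technique 1) and a random subsample of the training series for tuning (technique 2); the function names, toy data, and parameter choices are not the authors' implementation.

```python
# Hedged sketch: tuning one 1-NN DTW constituent with (1) a reduced
# parameter grid and (2) a subsampled training set. Illustrative only.
import random

def dtw(a, b, window):
    """DTW distance between series a and b with a Sakoe-Chiba window."""
    n, m = len(a), len(b)
    w = max(window, abs(n - m))          # window must cover length difference
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - w), min(m, i + w) + 1):
            d = (a[i - 1] - b[j - 1]) ** 2
            cost[i][j] = d + min(cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
    return cost[n][m]

def loocv_accuracy(series, labels, window):
    """Leave-one-out accuracy of 1-NN DTW on the given (sub)sample."""
    correct = 0
    for i in range(len(series)):
        best, pred = float("inf"), None
        for j in range(len(series)):
            if i == j:
                continue
            d = dtw(series[i], series[j], window)
            if d < best:
                best, pred = d, labels[j]
        correct += pred == labels[i]
    return correct / len(series)

def tune_window(series, labels, candidate_windows, sample_size, seed=0):
    """Pick a DTW window by LOOCV, using a reduced candidate grid and a
    random subsample of the training series to cut evaluation cost."""
    rng = random.Random(seed)
    idx = rng.sample(range(len(series)), min(sample_size, len(series)))
    sub_s = [series[i] for i in idx]
    sub_l = [labels[i] for i in idx]
    return max(candidate_windows, key=lambda w: loocv_accuracy(sub_s, sub_l, w))
```

Tuning cost for one constituent is roughly O(parameter options × sample size² × distance cost), so shrinking either the grid or the sample reduces it directly, and shrinking both compounds the saving, mirroring the combined speedup reported in the abstract.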
Original language: English
Title of host publication: Intelligent Data Engineering and Automated Learning – IDEAL 2019
Publisher: Springer
Pages: 446-453
ISBN (Electronic): 978-3-030-33607-3
ISBN (Print): 978-3-030-33606-6
DOIs
Publication status: Published - 2019

Publication series

Name: Proceedings of the 20th International Conference on Intelligent Data Engineering and Automated Learning
