Abstract
The Elastic Ensemble [7] has one of the longest build times of all constituents of the current state-of-the-art algorithm for time series classification: the Hierarchical Vote Collective of Transformation-based Ensembles (HIVE-COTE) [8]. We investigate two simple and intuitive techniques for reducing the time spent training the Elastic Ensemble, and consequently the HIVE-COTE train time. Both techniques reduce the effort involved in tuning the parameters of each constituent nearest-neighbour classifier of the Elastic Ensemble. Firstly, we shrink the parameter space of each constituent to reduce tuning effort. Secondly, we cap the number of training series used by each nearest-neighbour classifier to reduce the cost of evaluating each parameter option during tuning. Experimentation over 10 folds of the UEA/UCR time series classification problems shows that both techniques yield much faster build times and, crucially, that combining them gives an even greater speedup, all without significant loss in accuracy.
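The two speed-ups described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: a DTW distance with a Sakoe-Chiba window stands in for the Elastic Ensemble's elastic measures, the small `windows` grid illustrates the reduced parameter space, and `max_train` caps the number of training series used during leave-one-out tuning. All function and parameter names here are assumptions chosen for the sketch.

```python
import numpy as np


def dtw(a, b, window):
    """DTW distance between 1-D series a and b, constrained to a
    Sakoe-Chiba band of half-width `window` (a stand-in for the
    elastic measures tuned inside the Elastic Ensemble)."""
    n, m = len(a), len(b)
    w = max(window, abs(n - m))  # band must be wide enough to reach (n, m)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - w), min(m, i + w) + 1):
            d = (a[i - 1] - b[j - 1]) ** 2
            cost[i, j] = d + min(cost[i - 1, j],
                                 cost[i, j - 1],
                                 cost[i - 1, j - 1])
    return cost[n, m]


def tune_dtw_window(X, y, windows, max_train=50, seed=0):
    """Pick the best warping window by leave-one-out 1-NN accuracy.

    Speed-up 1: `windows` is a reduced grid of candidate parameters.
    Speed-up 2: at most `max_train` training series are used while
    evaluating each candidate, so each evaluation is cheaper.
    """
    rng = np.random.default_rng(seed)
    if len(X) > max_train:
        idx = rng.choice(len(X), size=max_train, replace=False)
        X, y = X[idx], y[idx]
    best_w, best_acc = windows[0], -1.0
    for w in windows:
        correct = 0
        for i in range(len(X)):
            dists = [dtw(X[i], X[j], w) if j != i else np.inf
                     for j in range(len(X))]
            correct += int(y[int(np.argmin(dists))] == y[i])
        acc = correct / len(X)
        if acc > best_acc:
            best_w, best_acc = w, acc
    return best_w, best_acc
```

In the full ensemble, tuning of this kind is repeated for every constituent elastic distance, so shrinking the grid and capping the training set compound multiplicatively in the saved build time.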
Original language | English
---|---
Title of host publication | Intelligent Data Engineering and Automated Learning – IDEAL 2019
Publisher | Springer
Pages | 446-453
ISBN (Electronic) | 978-3-030-33607-3
ISBN (Print) | 978-3-030-33606-6
DOIs | 
Publication status | Published - 2019

Publication series

Name | Proceedings of the 20th International Conference on Intelligent Data Engineering and Automated Learning
---|---
Profiles

- Jason Lines, School of Computing Sciences, Associate Professor in Computing Sciences
  - Data Science and AI, Member
  - Smart Emerging Technologies, Member
- Person: Research Group Member, Academic, Teaching & Research