Multi-Scale Ensemble Booster for Improving Existing TSD Classifiers

IEEE Transactions on Knowledge and Data Engineering (2023)

Abstract
Time Series Classification (TSC) is an essential task in Time Series Data (TSD) analysis. Ensemble-based approaches currently achieve the best performance on TSC tasks. However, because they integrate many different models, they suffer from heavy preprocessing. Even worse, non-deep-learning ensemble-based methods incur substantial computational costs because they lack GPU acceleration. Multi-scale information in TSD can improve TSC performance, but existing TSD classifiers that employ multi-scale information also struggle with heavy preprocessing and cannot give other TSD classifiers multi-scale feature extraction capabilities. Motivated by these observations, we propose a performance enhancement framework called the multi-scale ensemble booster (MEB), which helps existing TSD classifiers achieve substantial performance gains. MEB wraps an existing classifier in an easy-to-combine network structure, consisting of a multi-scale transformation and multi-output decision fusion, without changing any of the classifier's structure or hyperparameters; only one additional hyperparameter needs to be set. A probability distribution co-evolution strategy is then proposed to attain the optimal label probability distribution. We conducted extensive ablation experiments with MEB on 128 univariate datasets and 29 multivariate datasets, as well as comparative experiments against 11 state-of-the-art methods; these demonstrate, respectively, MEB's significant performance improvement ability and the state-of-the-art performance of models enhanced by MEB. Furthermore, to understand why MEB improves model performance, we provide a chain of interpretability analyses.
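The abstract gives only a high-level picture of MEB, but the two ingredients it names, a multi-scale transformation of the input series and fusion of the per-scale label-probability outputs, can be illustrated with a minimal sketch. Everything below is assumed for illustration: the names multi_scale_views and boosted_predict_proba, the average-pooling transformation, the predict_proba interface of the base classifiers, and the plain probability averaging, which merely stands in for the paper's probability distribution co-evolution strategy.

import numpy as np

def multi_scale_views(x, num_scales=3):
    """Build coarser views of a univariate series by average pooling.

    x: 1-D array of length T. Returns one array per scale, starting with
    the original series. (Hypothetical helper; the paper's exact
    multi-scale transformation may differ.)
    """
    views = [x]
    for s in range(1, num_scales):
        k = 2 ** s                        # pooling window at this scale
        T = (len(x) // k) * k             # truncate so the length divides evenly
        views.append(x[:T].reshape(-1, k).mean(axis=1))
    return views

def boosted_predict_proba(base_classifiers, x, num_scales=3):
    """Fuse per-scale label-probability outputs by simple averaging.

    base_classifiers: one fitted model per scale, each assumed to expose
    predict_proba(series) -> 1-D probability vector over the classes.
    """
    views = multi_scale_views(x, num_scales)
    probs = np.stack([clf.predict_proba(v)
                      for clf, v in zip(base_classifiers, views)])
    return probs.mean(axis=0)             # fused label probability distribution

In this reading, the only quantity a user must choose is num_scales, which is consistent with the abstract's claim that MEB requires setting a single additional hyperparameter while leaving the base classifier's structure and hyperparameters untouched.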
Keywords
Time series classification, UCR repository, multivariate time series, ensemble, multi-scale