Towards efficient similarity embedded temporal Transformers via extended timeframe analysis

Kenniy Olorunnimbe, Herna Viktor

Complex & Intelligent Systems (2024)

Abstract
Price prediction remains a crucial aspect of financial market research, as it forms the basis for various trading strategies and portfolio management techniques. However, traditional models such as ARIMA are not effective for multi-horizon forecasting, and current deep learning approaches do not take into account the conditional heteroscedasticity of financial market time series. In this work, we introduce the similarity embedded temporal Transformer (SeTT) algorithms, which extend the state-of-the-art temporal Transformer architecture. These algorithms utilise historical trends in financial time series, as well as statistical principles, to enhance forecasting performance. In this extended study, we conducted a thorough analysis of various hyperparameters, including the learning rate, local window size, and choice of similarity function, to obtain optimal model performance. We also experimented over an extended timeframe, which allowed us to more accurately assess model performance under different market conditions and across different lengths of time. Overall, our results show that SeTT improves financial market prediction, outperforming both classical financial models and state-of-the-art deep learning methods across volatile and non-volatile extrapolation periods, with varying effects of historical volatility on the extrapolation. Despite the availability of data spanning up to 13 years, optimal results were primarily attained with a historical window of 1–3 years for the extrapolation period under examination.
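The abstract does not spell out the SeTT formulation itself, but its core idea of scoring historical windows of a financial time series against the most recent local window can be illustrated with a minimal sketch. Everything below (the function name window_similarity, z-normalisation, the correlation-style score, and the synthetic price path) is an assumption made for illustration only, not the paper's actual algorithm.

```python
import numpy as np

def window_similarity(series: np.ndarray, window: int, horizon: int) -> np.ndarray:
    """Correlation-style similarity between the most recent local window
    and each earlier window of the same length in a price series.

    Illustrative sketch of the idea described in the abstract; the
    normalisation and scoring choices are assumptions, not SeTT itself.
    """
    # z-normalise the current (most recent) local window
    current = series[-window:]
    current = (current - current.mean()) / (current.std() + 1e-8)

    sims = []
    # slide over historical windows, leaving a gap of `horizon` steps
    # before the current window to avoid lookahead overlap
    for start in range(len(series) - 2 * window - horizon):
        past = series[start:start + window]
        past = (past - past.mean()) / (past.std() + 1e-8)
        # mean of the elementwise product of two z-normalised windows,
        # i.e. a Pearson-style correlation score in [-1, 1]
        sims.append(float(past @ current) / window)
    return np.array(sims)

# Example: score ~3 years of synthetic daily prices against the last 60 days.
prices = np.cumsum(np.random.randn(756)) + 100.0
scores = window_similarity(prices, window=60, horizon=5)
print(scores.argmax())  # index of the most similar historical window
```

In the paper's setting, such similarity scores would condition or weight the temporal Transformer's view of history; here the sketch only shows how a local window of 60 trading days might be compared against a longer 1–3 year history.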
Keywords
Deep learning, Financial price prediction, Temporal Transformer, Stock market forecast, Multi-horizon forecast, Hyperparameter optimisation