Modality-aware Transformer for Financial Time series Forecasting
arXiv (2023)
Abstract
Time series forecasting presents a significant challenge, particularly when
its accuracy relies on external data sources rather than solely on historical
values. This issue is prevalent in the financial sector, where the future
behavior of time series is often intricately linked to information derived from
various textual reports and a multitude of economic indicators. In practice,
the key challenge lies in constructing a reliable time series forecasting model
capable of harnessing data from diverse sources and extracting valuable
insights to predict the target time series accurately. In this work, we tackle
this challenging problem and introduce a novel multimodal transformer-based
model named the Modality-aware Transformer. Our model excels at
exploiting both categorical text and numerical time series to
forecast the target time series effectively while providing insights through
its neural attention mechanism. To achieve this, we develop feature-level
attention layers that encourage the model to focus on the most relevant
features within each data modality. Building on the proposed feature-level
attention, we develop novel intra-modal, inter-modal, and target-modal
multi-head attention (MHA) mechanisms that incorporate both feature and
temporal attention. This enables the MHAs to generate temporal attention
weights that account for modality and feature importance, leading to
more informative embeddings. The proposed modality-aware structure enables the
model to effectively exploit information within each modality as well as foster
cross-modal understanding. Our extensive experiments on financial datasets
demonstrate that Modality-aware Transformer outperforms existing methods,
offering a novel and practical solution to the complex challenges of
multi-modal financial time series forecasting.
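The abstract describes combining feature-level attention (which features within a modality matter at each timestep) with temporal attention (which timesteps matter). The following is a minimal numpy sketch of that two-stage idea, not the authors' actual architecture: all function names, the single-head simplification, and the projection `W_f` are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def feature_level_attention(X, W_f):
    # X: (T, F) series with F features; W_f: (F, F) learned projection
    # (hypothetical). Softmax over the feature axis yields per-timestep
    # feature-importance weights, which rescale the input features.
    scores = X @ W_f                  # (T, F)
    alpha = softmax(scores, axis=-1)  # feature weights, rows sum to 1
    return X * alpha, alpha

def temporal_attention(X, d_k):
    # Standard single-head scaled dot-product self-attention over the
    # time axis, here applied to the feature-reweighted inputs so that
    # temporal weights reflect feature importance.
    scores = (X @ X.T) / np.sqrt(d_k)  # (T, T)
    A = softmax(scores, axis=-1)       # temporal weights, rows sum to 1
    return A @ X, A

rng = np.random.default_rng(0)
T, F = 8, 4                       # toy sizes: 8 timesteps, 4 features
X = rng.normal(size=(T, F))
W_f = rng.normal(size=(F, F))

X_weighted, alpha = feature_level_attention(X, W_f)
out, A = temporal_attention(X_weighted, d_k=F)
print(out.shape, alpha.shape, A.shape)  # (8, 4) (8, 4) (8, 8)
```

In the paper's full model this composition would be repeated per modality (intra-modal), across modalities (inter-modal), and against the target series (target-modal), with the usual multi-head query/key/value projections that this sketch omits.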