MultiResFormer: Transformer with Adaptive Multi-Resolution Modeling for General Time Series Forecasting
CoRR (2023)
Abstract
Transformer-based models have greatly pushed the boundaries of time series
forecasting recently. Existing methods typically encode time series data into
patches using one or a fixed set of patch lengths. This, however, can limit the
ability to capture the intricate temporal dependencies present in real-world
multi-periodic time series. In this paper,
we propose MultiResFormer, which dynamically models temporal variations by
adaptively choosing optimal patch lengths. Concretely, at the beginning of each
layer, time series data is encoded into several parallel branches, each using a
detected periodicity, before going through the transformer encoder block. We
conduct extensive evaluations on long- and short-term forecasting datasets
comparing MultiResFormer with state-of-the-art baselines. MultiResFormer
outperforms patch-based Transformer baselines on long-term forecasting tasks
and also consistently outperforms CNN baselines by a large margin, while using
far fewer parameters than these baselines.
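The core idea above can be illustrated with a minimal sketch: detect dominant periodicities in the series, then segment the series into patches whose length matches each detected period, one branch per period. The FFT-based detector and the helper names (`detect_periods`, `patchify`) are assumptions for illustration; the abstract only says periodicities are "detected", without specifying the mechanism.

```python
import numpy as np

def detect_periods(x, k=2):
    """Estimate up to k dominant periods of a 1-D series from the FFT
    amplitude spectrum (an assumed detection scheme, not necessarily
    the paper's)."""
    amps = np.abs(np.fft.rfft(x))
    amps[0] = 0.0  # ignore the DC (zero-frequency) component
    top = np.argsort(amps)[-k:]  # indices of the k strongest frequencies
    return [len(x) // f for f in top if f > 0]

def patchify(x, patch_len):
    """Encode the series as non-overlapping patches of length patch_len,
    truncating any remainder; each branch would use its own patch_len."""
    n = len(x) // patch_len
    return x[: n * patch_len].reshape(n, patch_len)

# Toy series with a dominant period of 24 steps
t = np.arange(96)
x = np.sin(2 * np.pi * t / 24)
branches = [patchify(x, p) for p in detect_periods(x, k=1)]
```

In the actual model, each branch of patches would then pass through a shared Transformer encoder block, and the branch outputs would be merged before the next layer.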