PWDformer: deformable transformer for long-term series forecasting.

Wang, Zheng; Ran, Haowei; Ren, Jinchang; Sun, Meijun

Authors

Zheng Wang

Haowei Ran

Jinchang Ren

Meijun Sun



Abstract

Long-term forecasting is of paramount importance in numerous scenarios, including predicting future energy, water, and food consumption; extreme weather events and natural disasters, for instance, can profoundly impact infrastructure operations and pose severe safety concerns. Traditional CNN-based models often struggle to capture long-distance dependencies effectively, whereas Transformer-based models have shown significant promise in long-term forecasting. This paper investigates the long-term forecasting problem and identifies a common limitation of existing Transformer-based models: they tend to reduce computational complexity at the expense of temporal information aggregation. Moreover, although the order of a time series plays a crucial role in accurate prediction, current Transformer-based models lack sensitivity to that order, which undermines their forecasts. To address these issues, we propose a novel Deformable-Local (DL) aggregation mechanism that strengthens the model's ability to aggregate temporal information and lets it adaptively adjust the size of the time-aggregation window, enabling it to discern more complex temporal patterns and make more accurate predictions. Our model additionally incorporates a Frequency Selection module to reinforce effective features and reduce noise, and we introduce Position Weights to mitigate the order-insensitivity of existing methods. In extensive evaluations of long-term forecasting tasks, we ran benchmarks on six datasets covering practical applications in energy, traffic, economics, weather, and disease. Our method achieved state-of-the-art (SOTA) results with significant improvements: on the ETT dataset, for example, it improved average MSE by approximately 19% and average MAE by around 27%, and for prediction lengths of 96 and 192 it achieved MSE and MAE improvements of 32.1% and 30.9%, respectively.
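Since the abstract only names the model's components, the sketch below illustrates in plain NumPy what the three ideas could look like in their simplest form: an FFT-based frequency-selection filter, a per-step (adaptively sized) local aggregation window, and multiplicative position weights. Every function name, the top-k filtering rule, and the fixed window sizes are illustrative assumptions; the paper's actual Deformable-Local aggregation, Frequency Selection module, and Position Weights are defined inside a Transformer architecture and will differ in detail.

import numpy as np

def frequency_select(x: np.ndarray, k: int) -> np.ndarray:
    """Keep only the k strongest frequency components of a 1-D series:
    a crude stand-in for a frequency-selection step that reinforces
    dominant periodic structure and suppresses the rest as noise."""
    spec = np.fft.rfft(x)                    # real FFT -> complex spectrum
    keep = np.argsort(np.abs(spec))[-k:]     # indices of the k largest magnitudes
    filtered = np.zeros_like(spec)
    filtered[keep] = spec[keep]              # zero out every other frequency
    return np.fft.irfft(filtered, n=len(x))  # back to the time domain

def deformable_local_average(x: np.ndarray, win_sizes: np.ndarray) -> np.ndarray:
    """Aggregate each time step over its own (per-step) window size,
    a toy analogue of an adaptively sized time-aggregation window."""
    out = np.empty_like(x)
    for i, w in enumerate(win_sizes):
        out[i] = x[max(0, i - int(w)):i + 1].mean()
    return out

def apply_position_weights(x: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Scale each time step by a weight so downstream layers can tell
    *where* in the sequence a value occurred (order sensitivity)."""
    return x * w

# Toy usage on a noisy two-tone signal of length 96 (one of the
# prediction horizons reported in the abstract).
t = np.arange(96)
x = (np.sin(2 * np.pi * t / 24) + 0.5 * np.sin(2 * np.pi * t / 8)
     + 0.3 * np.random.default_rng(0).standard_normal(96))

denoised = frequency_select(x, k=4)
smoothed = deformable_local_average(denoised, win_sizes=np.full(96, 3))
weighted = apply_position_weights(smoothed, np.linspace(0.5, 1.5, 96))

In the real model the window sizes and position weights would be learned parameters trained jointly with the attention layers; the constants above are placeholders.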

Citation

WANG, Z., RAN, H., REN, J. and SUN, M. 2024. PWDformer: deformable transformer for long-term series forecasting. Pattern recognition [online], 147, article number 110118. Available from: https://doi.org/10.1016/j.patcog.2023.110118

Journal Article Type: Article
Acceptance Date: Nov 13, 2023
Online Publication Date: Nov 15, 2023
Publication Date: May 31, 2024
Deposit Date: Jan 30, 2024
Publicly Available Date: Nov 16, 2024
Journal: Pattern recognition
Print ISSN: 0031-3203
Electronic ISSN: 1873-5142
Publisher: Elsevier
Peer Reviewed: Peer Reviewed
Volume: 147
Article Number: 110118
DOI: https://doi.org/10.1016/j.patcog.2023.110118
Keywords: Long-term forecasting; Time series forecasting; Deep learning; Transformer
Public URL: https://rgu-repository.worktribe.com/output/2153037
