PWDformer: deformable transformer for long-term series forecasting.
Wang, Zheng; Ran, Haowei; Ren, Jinchang; Sun, Meijun
Abstract
Long-term forecasting is of paramount importance in numerous scenarios, such as predicting future energy, water, and food consumption. For instance, extreme weather events and natural disasters can profoundly impact infrastructure operations and pose severe safety concerns. Traditional CNN-based models often struggle to capture long-distance dependencies effectively, whereas Transformer-based models have shown significant promise in long-term forecasting. This paper investigates the long-term forecasting problem and identifies a common limitation of existing Transformer-based models: they tend to reduce computational complexity at the expense of temporal information aggregation capability. Moreover, although the order of a time series plays a crucial role in accurate prediction, current Transformer-based models are insensitive to it, which undermines their predictions. To address these issues, we propose a novel Deformable-Local (DL) aggregation mechanism. This mechanism strengthens the model's ability to aggregate temporal information and allows it to adaptively adjust the size of the aggregation window, so the model can discern more complex temporal patterns and produce more accurate predictions. Additionally, our model incorporates a Frequency Selection module to reinforce effective features and suppress noise, and we introduce Position Weights to mitigate the order-insensitivity of existing methods. In extensive evaluations of long-term forecasting tasks, we benchmarked our method on six datasets covering practical applications including energy, traffic, economics, weather, and disease, and achieved state-of-the-art (SOTA) results with significant improvements. For instance, on the ETT dataset our model improved average MSE by approximately 19% and average MAE by around 27%; for prediction lengths of 96 and 192, the MSE and MAE improvements reached 32.1% and 30.9%, respectively.
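To make two of the abstract's components concrete, the following is a minimal, illustrative PyTorch sketch of a frequency-selection step (assuming a top-k-by-amplitude rule, which the abstract does not specify) and learnable per-position weights that restore order sensitivity. All class names, shapes, and the selection rule are assumptions for illustration, not the paper's implementation.

```python
import torch
import torch.nn as nn


class FrequencySelect(nn.Module):
    """Keep only the k strongest frequency components of each series.

    Hypothetical sketch: top-k-by-amplitude is one common selection
    rule; the paper's actual rule is not given in the abstract.
    """

    def __init__(self, top_k: int = 8):
        super().__init__()
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, channels); FFT over the time axis
        spec = torch.fft.rfft(x, dim=1)
        k = min(self.top_k, spec.size(1))
        # boolean mask marking the k largest-amplitude frequencies
        idx = spec.abs().topk(k, dim=1).indices
        mask = torch.zeros(spec.shape, dtype=torch.bool, device=x.device)
        mask.scatter_(1, idx, True)
        # zero out the remaining (presumed noisy) frequencies
        return torch.fft.irfft(spec * mask, n=x.size(1), dim=1)


class PositionWeights(nn.Module):
    """Learnable per-time-step scaling to restore order sensitivity."""

    def __init__(self, length: int):
        super().__init__()
        self.w = nn.Parameter(torch.ones(length, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # broadcast (length, 1) weights over batch and channel dims
        return x * self.w


# Usage: 96 time steps, 7 variables (an ETT-like shape)
x = torch.randn(4, 96, 7)
y = PositionWeights(96)(FrequencySelect(top_k=8)(x))
print(y.shape)  # torch.Size([4, 96, 7])
```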
Citation
WANG, Z., RAN, H., REN, J. and SUN, M. 2024. PWDformer: deformable transformer for long-term series forecasting. Pattern recognition [online], 147, article number 110118. Available from: https://doi.org/10.1016/j.patcog.2023.110118
| Journal Article Type | Article |
|---|---|
| Acceptance Date | Nov 13, 2023 |
| Online Publication Date | Nov 15, 2023 |
| Publication Date | May 31, 2024 |
| Deposit Date | Jan 30, 2024 |
| Publicly Available Date | Nov 16, 2024 |
| Journal | Pattern Recognition |
| Print ISSN | 0031-3203 |
| Electronic ISSN | 1873-5142 |
| Publisher | Elsevier |
| Peer Reviewed | Peer Reviewed |
| Volume | 147 |
| Article Number | 110118 |
| DOI | https://doi.org/10.1016/j.patcog.2023.110118 |
| Keywords | Long-term forecasting; Time series forecasting; Deep learning; Transformer |
| Public URL | https://rgu-repository.worktribe.com/output/2153037 |
Files
WANG 2024 PWDformer (AAM), PDF, 2.9 MB
Publisher Licence URL: https://creativecommons.org/licenses/by-nc-nd/4.0/