Self-transformers: long-term series forecasting based on smooth series
19 July 2024
Wei Feng, Liu Yang, Si Xie, Heng Liu
Proceedings Volume 13181, Third International Conference on Electronic Information Engineering, Big Data, and Computer Technology (EIBDCT 2024); 131815Q (2024) https://doi.org/10.1117/12.3031087
Event: Third International Conference on Electronic Information Engineering, Big Data, and Computer Technology (EIBDCT 2024), 2024, Beijing, China
Abstract
This paper investigates long-term time series forecasting. Previous Transformer-based models rely on various self-attention mechanisms to capture long-range dependencies, and Autoformer introduces a novel decomposition architecture with an auto-correlation mechanism. However, these models remain less effective on long time series. In this paper, we propose Self-Transformers, a new time series forecasting model based on Non-stationary Transformers. Designed to handle time series with complex nonlinear trends and seasonality, the model improves forecasting performance and reduces the need for manual feature engineering. We validate the model on several benchmark datasets, and the results show that it achieves significant performance gains on time series forecasting tasks.
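The Non-stationary Transformers line of work that the abstract builds on "smooths" each input window by normalizing away its shifting mean and scale before the model sees it, then restores those statistics on the forecast. A minimal sketch of that stationarize/de-stationarize step is shown below; the function names are illustrative, not from the paper, and the actual model inserts a Transformer between the two calls.

```python
import numpy as np

def stationarize(x):
    """Normalize a window of shape (time, features) to zero mean / unit
    variance per feature; return the statistics needed to undo it."""
    mu = x.mean(axis=0, keepdims=True)
    sigma = x.std(axis=0, keepdims=True) + 1e-5  # avoid division by zero
    return (x - mu) / sigma, mu, sigma

def destationarize(y, mu, sigma):
    """Restore the original scale and level on the model's output."""
    return y * sigma + mu

# Usage: a random-walk series is non-stationary; after stationarize()
# its per-feature mean is ~0 and std ~1, and the round trip is lossless.
window = np.cumsum(np.random.randn(96, 1), axis=0)
z, mu, sigma = stationarize(window)
restored = destationarize(z, mu, sigma)
```

This per-window normalization is what lets the backbone operate on a "smooth" series regardless of the raw trend level.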
© (2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Wei Feng, Liu Yang, Si Xie, and Heng Liu "Self-transformers: long-term series forecasting based on smooth series", Proc. SPIE 13181, Third International Conference on Electronic Information Engineering, Big Data, and Computer Technology (EIBDCT 2024), 131815Q (19 July 2024); https://doi.org/10.1117/12.3031087
KEYWORDS: Data modeling, Performance modeling, Transformers, Matrices, Modeling, Statistical modeling, Autoregressive models
