GCformer: An Efficient Framework for Accurate and Scalable Long-Term Multivariate Time Series Forecasting

By Yanjun Zhao et al.
Published on June 10, 2023

Table of Contents

1. INTRODUCTION
2. RELATED WORK
3. METHOD
3.1 Global convolutional kernel
3.2 Efficient parameterization
3.3 Synergistic Fusion of Global and Local Branches
4. RESULTS
5. CONCLUSIONS

Summary

Transformer-based models have emerged as promising tools for time series forecasting. However, they struggle to capture long-range dependencies in time series data. To address this limitation, GCformer combines a structured global convolutional branch, which processes the entire long input sequence, with a local Transformer-based branch that captures recent signals. The framework introduces efficient parameterization methods for the global convolutional kernel and a dual-branch design to enhance the modeling of complex relationships in time series data. GCformer outperforms state-of-the-art methods, reducing average MSE by 4.38% and model parameters by 61.92%. The model can substantially enhance the performance of local models and achieves strong results on long-term time series forecasting benchmarks.
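The dual-branch idea can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names, the FFT-based circular convolution, the placeholder local branch, and the fixed fusion weight `alpha` are all illustrative assumptions. It only shows the general pattern of a global kernel spanning the full input, combined with a local branch's output.

```python
import numpy as np

def global_conv(x, kernel):
    # Global circular convolution via FFT: the kernel spans the full
    # input length L, so every output step can attend to every input
    # step, at O(L log L) cost instead of O(L^2) attention.
    L = len(x)
    return np.fft.irfft(np.fft.rfft(x) * np.fft.rfft(kernel), n=L)

def fuse(global_out, local_out, alpha=0.5):
    # Weighted fusion of the two branches; in the real model the
    # mixing weight would be learned, not fixed (assumption here).
    return alpha * global_out + (1.0 - alpha) * local_out

# Toy example: length-8 sequence with a delta kernel, which makes
# the global convolution act as the identity.
x = np.arange(8, dtype=float)
kernel = np.zeros(8)
kernel[0] = 1.0
g = global_conv(x, kernel)
local = x  # stand-in for the Transformer-based local branch's output
y = fuse(g, local)
```

A learned global kernel would replace the delta kernel above, and the paper's efficient parameterization concerns how that length-L kernel is generated compactly rather than stored as L free weights.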