Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting

By Bryan Lim et al.
Published on Sept. 29, 2020

Table of Contents

Abstract
1. Introduction
2. Related Work
3. Multi-horizon Forecasting
4. Model Architecture

Summary

The document discusses the Temporal Fusion Transformer (TFT), an attention-based architecture for multi-horizon time series forecasting. It addresses the complexity of forecasting with heterogeneous inputs, including static covariates, known future inputs, and exogenous time series observed only in the past. TFT combines recurrent layers for local processing with self-attention layers that capture long-term dependencies, yielding both high performance and interpretability. It introduces gating mechanisms to suppress unused components, variable selection networks to identify relevant inputs at each step, and static covariate encoders that condition temporal dynamics on static features. TFT offers insights into temporal dynamics and delivers significant performance improvements over existing forecasting methods.
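To make the gating idea concrete, below is a minimal NumPy sketch of the paper's Gated Residual Network (GRN), the building block behind TFT's gating mechanisms: an ELU feed-forward transform passed through a Gated Linear Unit (GLU), added back to the input via a residual connection, then layer-normalized. The static context input from the paper is omitted here for brevity, and all weight names and dimensions are illustrative, not the reference implementation.

```python
import numpy as np

def elu(x):
    # Exponential Linear Unit; min() clamp avoids overflow in exp()
    return np.where(x > 0, x, np.exp(np.minimum(x, 0.0)) - 1.0)

def glu(x, W4, b4, W5, b5):
    # Gated Linear Unit: a sigmoid gate elementwise-scales a linear projection,
    # letting the network suppress this component (gate near 0) when unneeded
    gate = 1.0 / (1.0 + np.exp(-(x @ W4 + b4)))
    return gate * (x @ W5 + b5)

def gated_residual_network(a, params):
    """Minimal GRN forward pass (static context omitted):
    GRN(a) = LayerNorm(a + GLU(eta1)),
    eta1 = eta2 @ W1 + b1,  eta2 = ELU(a @ W2 + b2).
    """
    W1, b1, W2, b2, W4, b4, W5, b5 = params
    eta2 = elu(a @ W2 + b2)
    eta1 = eta2 @ W1 + b1
    h = a + glu(eta1, W4, b4, W5, b5)        # gated residual connection
    mu = h.mean(axis=-1, keepdims=True)      # layer norm over features
    sigma = h.std(axis=-1, keepdims=True)
    return (h - mu) / (sigma + 1e-6)

# Illustrative usage with random weights (hypothetical shapes, all dims = d)
rng = np.random.default_rng(0)
d = 4
params = []
for _ in range(4):
    params += [rng.standard_normal((d, d)) * 0.1, np.zeros(d)]
out = gated_residual_network(rng.standard_normal((3, d)), params)
```

Because the GLU gate can drive its output toward zero, a GRN can fall back to a near-identity mapping, which is how TFT skips over architecture components that a given dataset does not need.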