Time Series Forecasting with Temporal Fusion Transformer in Pytorch

fornasari12/time-series-forecasting


Forecasting with the Temporal Fusion Transformer

(Figure: img.png)

From the abstract of the paper "Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting" (Lim et al.):

Multi-horizon forecasting often contains a complex mix of inputs – including static (i.e. time-invariant) covariates, known future inputs, and other exogenous time series that are only observed in the past – without any prior information on how they interact with the target. Several deep learning methods have been proposed, but they are typically 'black-box' models which do not shed light on how they use the full range of inputs present in practical scenarios. In this paper, we introduce the Temporal Fusion Transformer (TFT) – a novel attention-based architecture which combines high-performance multi-horizon forecasting with interpretable insights into temporal dynamics. To learn temporal relationships at different scales, TFT uses recurrent layers for local processing and interpretable self-attention layers for long-term dependencies. TFT utilizes specialized components to select relevant features and a series of gating layers to suppress unnecessary components, enabling high performance in a wide range of scenarios. On a variety of real-world datasets, we demonstrate significant performance improvements over existing benchmarks.
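The "gating layers to suppress unnecessary components" mentioned above are the Gated Residual Networks (GRNs) at the heart of the TFT architecture: a nonlinear transform whose output passes through a GLU-style sigmoid gate before being added back to the input, so the block can be softly skipped when it adds nothing. Below is a minimal PyTorch sketch of that idea, not code from this repository; the class name, layer sizes, and dropout rate are illustrative.

```python
import torch
import torch.nn as nn


class GatedResidualNetwork(nn.Module):
    """Illustrative sketch of a TFT-style Gated Residual Network.

    A two-layer feed-forward transform is gated (GLU) and added back to
    the input via a residual connection, letting the network suppress
    the whole block when it is not useful. Sizes are illustrative.
    """

    def __init__(self, d_model: int, d_hidden: int, dropout: float = 0.1):
        super().__init__()
        self.fc1 = nn.Linear(d_model, d_hidden)
        self.elu = nn.ELU()
        self.fc2 = nn.Linear(d_hidden, d_model)
        self.dropout = nn.Dropout(dropout)
        # GLU gate: project to 2*d_model; one half gates the other.
        self.gate = nn.Linear(d_model, 2 * d_model)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.fc2(self.elu(self.fc1(x)))
        h = self.dropout(h)
        value, gate = self.gate(h).chunk(2, dim=-1)
        # Residual + gated update, then layer normalization.
        return self.norm(x + value * torch.sigmoid(gate))


# Shape check on a (batch, time, features) tensor.
grn = GatedResidualNetwork(d_model=16, d_hidden=32)
out = grn(torch.randn(8, 24, 16))
print(out.shape)
```

When the sigmoid gate saturates near zero, the GRN reduces to (a layer-normalized copy of) its input, which is how TFT keeps unneeded components from hurting performance.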
