TY - GEN
T1 - Seq2VAR
T2 - 4th ECML PKDD Workshop on Advanced Analytics and Learning on Temporal Data, AALTD 2019
AU - Pineau, Edouard
AU - Razakarivony, Sébastien
AU - Bonald, Thomas
N1 - Publisher Copyright:
© Springer Nature Switzerland AG 2020.
PY - 2020/1/1
Y1 - 2020/1/1
N2 - Finding an understandable and meaningful feature representation of multivariate time series (MTS) is a difficult task, since information is entangled in both the temporal and spatial dimensions. In particular, MTS can be seen as the observation of simultaneous causal interactions between dynamical variables. The standard way to model these interactions is vector linear autoregression (VAR). The parameters of VAR models can be used as an MTS feature representation. Yet, VAR cannot generalize to new samples, hence independent VAR models must be trained to represent different MTS. In this paper, we propose to use the inference capacity of neural networks to overcome this limitation. We associate a relational neural network with a VAR generative model to form an encoder-decoder of MTS, denoted Seq2VAR for Sequence-to-VAR. We use recent advances in relational neural networks to build our MTS encoder by explicitly modeling the interactions between the variables of MTS samples. We also leverage reparametrization tricks for binomial sampling in neural networks in order to build a sparse version of Seq2VAR and recover the notion of Granger causality defined in sparse VAR models. We illustrate the benefits of our approach through experiments on synthetic datasets.
AB - Finding an understandable and meaningful feature representation of multivariate time series (MTS) is a difficult task, since information is entangled in both the temporal and spatial dimensions. In particular, MTS can be seen as the observation of simultaneous causal interactions between dynamical variables. The standard way to model these interactions is vector linear autoregression (VAR). The parameters of VAR models can be used as an MTS feature representation. Yet, VAR cannot generalize to new samples, hence independent VAR models must be trained to represent different MTS. In this paper, we propose to use the inference capacity of neural networks to overcome this limitation. We associate a relational neural network with a VAR generative model to form an encoder-decoder of MTS, denoted Seq2VAR for Sequence-to-VAR. We use recent advances in relational neural networks to build our MTS encoder by explicitly modeling the interactions between the variables of MTS samples. We also leverage reparametrization tricks for binomial sampling in neural networks in order to build a sparse version of Seq2VAR and recover the notion of Granger causality defined in sparse VAR models. We illustrate the benefits of our approach through experiments on synthetic datasets.
KW - Granger causality
KW - Multivariate time series
KW - Relational neural networks
KW - Vector linear autoregression
UR - https://www.scopus.com/pages/publications/85082115113
U2 - 10.1007/978-3-030-39098-3_10
DO - 10.1007/978-3-030-39098-3_10
M3 - Conference contribution
AN - SCOPUS:85082115113
SN - 9783030390976
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 126
EP - 140
BT - Advanced Analytics and Learning on Temporal Data - 4th ECML PKDD Workshop, AALTD 2019, Revised Selected Papers
A2 - Lemaire, Vincent
A2 - Malinowski, Simon
A2 - Bagnall, Anthony
A2 - Bondu, Alexis
A2 - Guyet, Thomas
A2 - Tavenard, Romain
PB - Springer
Y2 - 16 September 2019 through 20 September 2019
ER -