TY - GEN
T1 - Time Series Forecasting Models Copy the Past
T2 - 31st International Conference on Artificial Neural Networks, ICANN 2022
AU - Kosma, Chrysoula
AU - Nikolentzos, Giannis
AU - Xu, Nancy
AU - Vazirgiannis, Michalis
N1 - Publisher Copyright:
© 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2022/1/1
Y1 - 2022/1/1
N2 - Time series forecasting is at the core of important application domains, posing significant challenges to machine learning algorithms. Recently, neural network architectures have been widely applied to the problem of time series forecasting. Most of these models are trained by minimizing a loss function that measures the deviation of predictions from the real values. Typical loss functions include the mean squared error (MSE) and the mean absolute error (MAE). In the presence of noise and uncertainty, neural network models tend to replicate the last observed value of the time series, thus limiting their applicability to real-world data. In this paper, we provide a formal definition of the above problem, and we give some examples of forecasts where it is observed. We also propose a regularization term that penalizes the replication of previously seen values. We evaluate the proposed regularization term on both synthetic and real-world datasets. Our results indicate that the regularization term mitigates the aforementioned problem to some extent and gives rise to more robust models.
AB - Time series forecasting is at the core of important application domains, posing significant challenges to machine learning algorithms. Recently, neural network architectures have been widely applied to the problem of time series forecasting. Most of these models are trained by minimizing a loss function that measures the deviation of predictions from the real values. Typical loss functions include the mean squared error (MSE) and the mean absolute error (MAE). In the presence of noise and uncertainty, neural network models tend to replicate the last observed value of the time series, thus limiting their applicability to real-world data. In this paper, we provide a formal definition of the above problem, and we give some examples of forecasts where it is observed. We also propose a regularization term that penalizes the replication of previously seen values. We evaluate the proposed regularization term on both synthetic and real-world datasets. Our results indicate that the regularization term mitigates the aforementioned problem to some extent and gives rise to more robust models.
KW - Deep learning
KW - Loss functions
KW - Time-series forecasting
U2 - 10.1007/978-3-031-15919-0_31
DO - 10.1007/978-3-031-15919-0_31
M3 - Conference contribution
AN - SCOPUS:85138770233
SN - 9783031159183
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 366
EP - 378
BT - Artificial Neural Networks and Machine Learning - ICANN 2022 - 31st International Conference on Artificial Neural Networks, Proceedings
A2 - Pimenidis, Elias
A2 - Aydin, Mehmet
A2 - Angelov, Plamen
A2 - Jayne, Chrisina
A2 - Papaleonidas, Antonios
PB - Springer Science and Business Media Deutschland GmbH
Y2 - 6 September 2022 through 9 September 2022
ER -