TY - GEN
T1 - Multi-Timescale Traffic Intensity Forecasting
AU - Li, Jing
AU - Crespi, Noel
AU - Minerva, Roberto
AU - Farahbakhsh, Reza
AU - Dutta, Hrishikesh
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.
PY - 2025/1/1
Y1 - 2025/1/1
N2 - Traffic intensity prediction is a core task for smart cities amid fast-paced urbanization. However, traffic data typically exhibit complex characteristics such as non-linearity, spatiotemporal dependence, and burstiness, which pose great challenges to prediction models. Therefore, this study proposes a hybrid LSTM-IMTRAN model that combines an LSTM with an improved Transformer (IMTRAN). The LSTM gating mechanism dynamically captures the short-term and long-term dependencies of the time series, while IMTRAN extracts global features and improves model stability. The proposed model simplifies the Transformer architecture by removing the decoder module and replacing the traditional positional encoding with an LSTM module, enhancing time-series modeling while reducing computational complexity. Using the Madrid traffic dataset, this study validates multi-scenario traffic intensity data for both regular days and holidays. The experimental results show that the LSTM-IMTRAN model outperforms the LSTM, STGCN, CNN-LSTM, and Transformer models in short-term (15 min, 30 min), medium-term (60 min), and long-term (1 day) predictions, with a root mean square error (RMSE) reduction of approximately 1.87%-6.47%.
AB - Traffic intensity prediction is a core task for smart cities amid fast-paced urbanization. However, traffic data typically exhibit complex characteristics such as non-linearity, spatiotemporal dependence, and burstiness, which pose great challenges to prediction models. Therefore, this study proposes a hybrid LSTM-IMTRAN model that combines an LSTM with an improved Transformer (IMTRAN). The LSTM gating mechanism dynamically captures the short-term and long-term dependencies of the time series, while IMTRAN extracts global features and improves model stability. The proposed model simplifies the Transformer architecture by removing the decoder module and replacing the traditional positional encoding with an LSTM module, enhancing time-series modeling while reducing computational complexity. Using the Madrid traffic dataset, this study validates multi-scenario traffic intensity data for both regular days and holidays. The experimental results show that the LSTM-IMTRAN model outperforms the LSTM, STGCN, CNN-LSTM, and Transformer models in short-term (15 min, 30 min), medium-term (60 min), and long-term (1 day) predictions, with a root mean square error (RMSE) reduction of approximately 1.87%-6.47%.
KW - Intelligent transportation systems
KW - LSTM
KW - Smart city
KW - Traffic intensity prediction
KW - Transformer
UR - https://www.scopus.com/pages/publications/105014392297
U2 - 10.1007/978-3-032-00071-2_5
DO - 10.1007/978-3-032-00071-2_5
M3 - Conference contribution
AN - SCOPUS:105014392297
SN - 9783032000705
T3 - Lecture Notes in Networks and Systems
SP - 84
EP - 99
BT - Intelligent Systems and Applications - Proceedings of the 2025 Intelligent Systems Conference (IntelliSys)
A2 - Arai, Kohei
PB - Springer Science and Business Media Deutschland GmbH
T2 - 11th Intelligent Systems Conference, IntelliSys 2025
Y2 - 28 August 2025 through 29 August 2025
ER -