TY - GEN
T1 - Quantum Reupload Units
T2 - 6th IEEE International Conference on Quantum Computing and Engineering, QCE 2025
AU - Casse, Lea
AU - Ponnambalam, Sabarikirishwaran
AU - Pfahringer, Bernhard
AU - Bifet, Albert
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025/1/1
Y1 - 2025/1/1
N2 - We propose a single-qubit Quantum Machine Learning (QML) model for time series forecasting, built around the concept of a Quantum Reupload Unit (QRU), a hardware-efficient quantum circuit architecture with shallow depth. The proposed model demonstrates enhanced predictive power compared to variational methods such as variational quantum circuits (VQC), parameterized quantum circuits (PQC), and quantum residual blocks (QRB). The proposed QRU also outperforms classical learning models such as Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) networks with the same number of parameters. The novelty of this approach is its ability to model temporal patterns without relying on an extensive memory state, which reduces resource demands while preserving forecast accuracy. The expressivity of the model is evaluated through Fourier spectral decomposition. We analyze the trainability of our model using the absorption witness metric. We benchmarked the proposed model on the Mackey-Glass chaotic time series and the real-world river level dataset from TAIAO. The proposed model consistently exhibits enhanced expressivity on both datasets. These results highlight the significance of QRUs as promising candidates for learning models that can be conveniently deployed on noisy intermediate-scale quantum (NISQ) hardware.
AB - We propose a single-qubit Quantum Machine Learning (QML) model for time series forecasting, built around the concept of a Quantum Reupload Unit (QRU), a hardware-efficient quantum circuit architecture with shallow depth. The proposed model demonstrates enhanced predictive power compared to variational methods such as variational quantum circuits (VQC), parameterized quantum circuits (PQC), and quantum residual blocks (QRB). The proposed QRU also outperforms classical learning models such as Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) networks with the same number of parameters. The novelty of this approach is its ability to model temporal patterns without relying on an extensive memory state, which reduces resource demands while preserving forecast accuracy. The expressivity of the model is evaluated through Fourier spectral decomposition. We analyze the trainability of our model using the absorption witness metric. We benchmarked the proposed model on the Mackey-Glass chaotic time series and the real-world river level dataset from TAIAO. The proposed model consistently exhibits enhanced expressivity on both datasets. These results highlight the significance of QRUs as promising candidates for learning models that can be conveniently deployed on noisy intermediate-scale quantum (NISQ) hardware.
KW - expressivity
KW - quantum machine learning
KW - quantum re-uploading
KW - quantum reupload unit
KW - time series
KW - trainability
UR - https://www.scopus.com/pages/publications/105030142752
U2 - 10.1109/QCE65121.2025.00199
DO - 10.1109/QCE65121.2025.00199
M3 - Conference contribution
AN - SCOPUS:105030142752
T3 - Proceedings - IEEE Quantum Week 2025, QCE 2025
SP - 1815
EP - 1825
BT - Technical Papers Program
A2 - Culhane, Candace
A2 - Byrd, Greg
A2 - Muller, Hausi
A2 - Delgado, Andrea
A2 - Eidenbenz, Stephan
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 31 August 2025 through 5 September 2025
ER -