TY - JOUR
T1 - Ensemble Kalman filter in latent space using a variational autoencoder pair
AU - Pasmans, Ivo
AU - Chen, Yumeng
AU - Finn, Tobias Sebastian
AU - Bocquet, Marc
AU - Carrassi, Alberto
N1 - Publisher Copyright:
© 2025 The Author(s). Quarterly Journal of the Royal Meteorological Society published by John Wiley & Sons Ltd on behalf of Royal Meteorological Society.
PY - 2025/1/1
Y1 - 2025/1/1
N2 - Popular (ensemble) Kalman filter data assimilation (DA) approaches assume that the errors in both the a priori estimate of the state and the observations are Gaussian. For constrained variables, for example, sea-ice concentration or stress, such an assumption does not hold. The variational autoencoder (VAE) is a machine-learning (ML) technique that allows us to map an arbitrary distribution to/from a latent space in which the distribution is supposedly closer to a Gaussian. We propose a novel hybrid DA–ML approach in which VAEs are incorporated in the DA procedure. Specifically, we introduce a variant of the popular ensemble transform Kalman filter (ETKF) in which the analysis is applied in the latent space of a single VAE or a pair of VAEs. In twin experiments with a simple circular model, whereby the circle represents an underlying submanifold to be respected, we find that the use of a VAE ensures that a posteriori ensemble members lie close to the manifold containing the truth. Furthermore, online updating of the VAE is necessary and achievable when this manifold varies in time, that is, when it is non-stationary. We demonstrate that introducing an additional second latent space for the observational innovations improves robustness against detrimental effects of non-Gaussianity and bias in the observational errors but lessens the performance slightly if observational errors are strictly Gaussian.
KW - data assimilation
KW - ensemble Kalman filter
KW - machine learning
KW - non-Gaussianity
KW - variational autoencoder
UR - https://www.scopus.com/pages/publications/105022808829
U2 - 10.1002/qj.70070
DO - 10.1002/qj.70070
M3 - Article
AN - SCOPUS:105022808829
SN - 0035-9009
JO - Quarterly Journal of the Royal Meteorological Society
JF - Quarterly Journal of the Royal Meteorological Society
ER -