TY - JOUR
T1 - Sharp Spectral Rates for Koopman Operator Learning
AU - Kostic, Vladimir R.
AU - Novelli, Pietro
AU - Lounici, Karim
AU - Pontil, Massimiliano
N1 - Publisher Copyright:
© 2023 Neural information processing systems foundation. All rights reserved.
PY - 2023/1/1
Y1 - 2023/1/1
N2 - Nonlinear dynamical systems can be handily described by the associated Koopman operator, whose action evolves every observable of the system forward in time. Learning the Koopman operator and its spectral decomposition from data is enabled by a number of algorithms. In this work we present for the first time non-asymptotic learning bounds for the Koopman eigenvalues and eigenfunctions. We focus on time-reversal-invariant stochastic dynamical systems, including the important example of Langevin dynamics. We analyze two popular estimators: Extended Dynamic Mode Decomposition (EDMD) and Reduced Rank Regression (RRR). Our results critically hinge on novel minimax estimation bounds for the operator norm error, which may be of independent interest. Our spectral learning bounds are driven by the simultaneous control of the operator norm error and a novel metric distortion functional of the estimated eigenfunctions. The bounds indicate that both EDMD and RRR have similar variance, but EDMD suffers from a larger bias which might be detrimental to its learning rate. Our results shed new light on the emergence of spurious eigenvalues, an issue which is well known empirically. Numerical experiments illustrate the implications of the bounds in practice.
AB - Nonlinear dynamical systems can be handily described by the associated Koopman operator, whose action evolves every observable of the system forward in time. Learning the Koopman operator and its spectral decomposition from data is enabled by a number of algorithms. In this work we present for the first time non-asymptotic learning bounds for the Koopman eigenvalues and eigenfunctions. We focus on time-reversal-invariant stochastic dynamical systems, including the important example of Langevin dynamics. We analyze two popular estimators: Extended Dynamic Mode Decomposition (EDMD) and Reduced Rank Regression (RRR). Our results critically hinge on novel minimax estimation bounds for the operator norm error, which may be of independent interest. Our spectral learning bounds are driven by the simultaneous control of the operator norm error and a novel metric distortion functional of the estimated eigenfunctions. The bounds indicate that both EDMD and RRR have similar variance, but EDMD suffers from a larger bias which might be detrimental to its learning rate. Our results shed new light on the emergence of spurious eigenvalues, an issue which is well known empirically. Numerical experiments illustrate the implications of the bounds in practice.
M3 - Conference article
AN - SCOPUS:85180357902
SN - 1049-5258
VL - 36
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
T2 - 37th Conference on Neural Information Processing Systems, NeurIPS 2023
Y2 - 10 December 2023 through 16 December 2023
ER -