Dictionary learning for Kernel EDMD
1️⃣ One-sentence summary
This paper proposes a method that streamlines kernel extended dynamic mode decomposition (kEDMD) by automatically optimizing kernel parameters: instead of hand-picking a kernel function and its parameters, the method automatically selects the most suitable combination from several candidate kernels, yielding a more accurate approximation of the Koopman operator of a nonlinear dynamical system.
Studying nonlinear dynamical systems through their state space behavior can be challenging, and one possible alternative is to analyze them via their associated Koopman operator. This turns the nonlinear problem into a linear, infinite-dimensional one. To approximate the operator in finite dimensions, extended dynamic mode decomposition (EDMD) is a commonly used algorithm. It requires a finite list of functionals and a set of snapshots from the system to compute an approximation of the operator and its corresponding spectrum. Instead of choosing the list of functionals directly, it can be implicitly defined via kernels, a method known as kernel extended dynamic mode decomposition (kEDMD). However, one still needs to define the kernel and choose its parameter values. In this paper, we aim to streamline this process by extending dictionary learning for EDMD to kernel learning in kEDMD. By simplifying kEDMD we show how to perform gradient-based optimization over the learnable kernel parameters, and demonstrate that this method leads to useful kernels for the original kEDMD. The focus of our work is a method that takes a weighted list of kernels with randomly initialized values as input and outputs a list of kernels and parameter values suitable for approximating the Koopman operator of the underlying system. We demonstrate that unimportant kernels can be removed from the list by analyzing the weights in the weighted sum. We evaluate the method across several experiments, including the Duffing oscillator and the Kuramoto-Sivashinsky PDE, showcasing the method's different strengths.
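The pipeline the abstract describes can be sketched in a few lines: build a weighted sum of candidate kernels, form the kEDMD propagator from snapshot pairs, and tune the weights and bandwidths against a held-out one-step prediction error. This is a minimal illustrative sketch, not the paper's implementation: the function names, the two-Gaussian dictionary, the toy cubic map, and the finite-difference descent (standing in for the paper's gradient-based optimization) are all assumptions.

```python
import numpy as np

def gram(X, Z, weights, bandwidths):
    """Weighted sum of Gaussian kernels between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return sum(w * np.exp(-d2 / s**2) for w, s in zip(weights, bandwidths))

def koopman_matrix(X, Y, weights, bandwidths, reg=1e-6):
    """kEDMD propagator: maps kernel features of x to those of y = F(x)."""
    G = gram(X, X, weights, bandwidths)      # Gram matrix on input snapshots
    A = gram(Y, X, weights, bandwidths)      # cross-Gram with successor snapshots
    return np.linalg.solve(G + reg * np.eye(len(X)), A)

def validation_loss(params, Xtr, Ytr, Xva, Yva):
    """Held-out one-step feature-prediction error for given kernel parameters."""
    w, s = params[:2], params[2:]
    K = koopman_matrix(Xtr, Ytr, w, s)
    pred = gram(Xva, Xtr, w, s) @ K          # predicted features of F(x_va)
    true = gram(Yva, Xtr, w, s)              # observed features of y_va
    return float(np.mean((pred - true) ** 2))

def optimize_kernels(params, Xtr, Ytr, Xva, Yva, steps=30, eps=1e-4):
    """Finite-difference gradient descent with backtracking over kernel parameters."""
    params = params.copy()
    cur = validation_loss(params, Xtr, Ytr, Xva, Yva)
    for _ in range(steps):
        g = np.zeros_like(params)
        for i in range(len(params)):
            up, dn = params.copy(), params.copy()
            up[i] += eps
            dn[i] -= eps
            g[i] = (validation_loss(up, Xtr, Ytr, Xva, Yva)
                    - validation_loss(dn, Xtr, Ytr, Xva, Yva)) / (2 * eps)
        step = 1.0
        for _ in range(20):                  # backtrack until the loss improves
            cand = np.maximum(params - step * g, 1e-3)  # keep parameters positive
            new = validation_loss(cand, Xtr, Ytr, Xva, Yva)
            if new <= cur:
                params, cur = cand, new
                break
            step *= 0.5
    return params, cur
```

After optimization, candidate kernels whose weights (the first entries of `params`) have shrunk toward the floor contribute little to the weighted sum and can be dropped from the dictionary, mirroring the paper's point that unimportant kernels are identified by inspecting the learned weights.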
From arXiv: 2604.25572