arXiv submission date: 2026-02-19
📄 Abstract - Synergizing Transport-Based Generative Models and Latent Geometry for Stochastic Closure Modeling

Diffusion models recently developed for generative AI tasks can produce high-quality samples while maintaining diversity among samples to promote mode coverage, providing a promising path for learning stochastic closure models. Compared to other types of generative models, such as GANs and VAEs, however, diffusion models suffer from slow sampling. By systematically comparing transport-based generative models on a numerical example of 2D Kolmogorov flows, we show that flow matching in a lower-dimensional latent space is well suited for fast sampling of stochastic closure models, enabling single-step sampling that is up to two orders of magnitude faster than iterative diffusion-based approaches. To control latent-space distortion and thus ensure the physical fidelity of the sampled closure term, we compare the implicit regularization offered by a joint training scheme against two explicit regularizers: metric-preserving (MP) and geometry-aware (GA) constraints. Besides offering faster sampling, both explicitly and implicitly regularized latent spaces inherit key topological information from the lower-dimensional manifold of the original complex dynamical system, which enables learning stochastic closure models without demanding large amounts of training data.
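The abstract includes no code, so below is a minimal, hypothetical sketch of the core mechanism it describes: flow matching (rectified-flow style) trained on latent codes, followed by single-step Euler sampling, which is what makes it much faster than iterative diffusion sampling. The names `VelocityNet`, `flow_matching_loss`, and `sample_one_step` are illustrative, and the latent codes `z1` are assumed to come from a pretrained encoder; the authors' actual architecture, objective, and regularized training scheme may differ.

```python
import torch
import torch.nn as nn

# Minimal sketch (not the authors' code): flow matching in a low-dimensional
# latent space, then single-step Euler sampling. The latent codes z1 are
# assumed to come from a pretrained autoencoder over closure fields.

class VelocityNet(nn.Module):
    """Predicts the velocity field v(z_t, t) of the latent flow."""
    def __init__(self, latent_dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, latent_dim),
        )

    def forward(self, z: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([z, t], dim=-1))

def flow_matching_loss(v_net: VelocityNet, z1: torch.Tensor) -> torch.Tensor:
    """Rectified-flow objective on latent codes z1.

    Interpolates z_t = (1 - t) z0 + t z1 between noise z0 and data z1;
    the target velocity along this straight path is simply z1 - z0.
    """
    z0 = torch.randn_like(z1)                       # noise endpoint
    t = torch.rand(z1.shape[0], 1, device=z1.device)
    z_t = (1 - t) * z0 + t * z1                     # linear interpolation
    target = z1 - z0                                # constant path velocity
    return ((v_net(z_t, t) - target) ** 2).mean()

@torch.no_grad()
def sample_one_step(v_net: VelocityNet, n: int, latent_dim: int) -> torch.Tensor:
    """One Euler step from t=0 to t=1: z1 ≈ z0 + v(z0, 0).

    Accurate when the learned paths are near-straight, which is the property
    that enables single-step sampling in place of many diffusion iterations.
    """
    z0 = torch.randn(n, latent_dim)
    t0 = torch.zeros(n, 1)
    return z0 + v_net(z0, t0)
```

Sampled latents would then be passed through the (assumed) pretrained decoder to recover closure terms in physical space.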

Top-level tags: model training, machine learning systems
Detailed tags: generative models, stochastic closure, latent space, flow matching, diffusion models

Synergizing Transport-Based Generative Models and Latent Geometry for Stochastic Closure Modeling


1️⃣ One-sentence summary

This paper proposes a new approach that trains generative models in a low-dimensional latent space while controlling its geometric distortion, enabling fast and accurate generation of stochastic closure terms for complex physical systems, with sampling up to a hundred times faster than conventional diffusion models while requiring less training data.
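As a rough illustration of what "controlling geometric distortion" can mean in practice, here is a hypothetical metric-preserving penalty that encourages the encoder to keep pairwise distances consistent between data space and latent space. The function name and the normalization are assumptions for this sketch; the paper's actual MP and GA regularizers may be formulated differently.

```python
import torch

def metric_preserving_penalty(x: torch.Tensor, z: torch.Tensor) -> torch.Tensor:
    """Hypothetical metric-preserving (MP) regularizer sketch.

    Penalizes mismatch between pairwise distances in data space (x, shape
    [B, ...]) and in latent space (z, shape [B, D]), discouraging the encoder
    from distorting the local geometry of the data manifold.
    """
    d_x = torch.cdist(x.flatten(1), x.flatten(1))   # data-space distances
    d_z = torch.cdist(z, z)                         # latent-space distances
    # Normalize each distance matrix so the penalty is scale-invariant.
    d_x = d_x / (d_x.mean() + 1e-8)
    d_z = d_z / (d_z.mean() + 1e-8)
    return ((d_x - d_z) ** 2).mean()

# Usage: total loss = reconstruction + flow matching + lambda * MP penalty.
```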

From arXiv: 2602.17089