
arXiv submission date: 2026-02-11
📄 Abstract - Latent Generative Solvers for Generalizable Long-Term Physics Simulation

We study long-horizon surrogate simulation across heterogeneous PDE systems. We introduce Latent Generative Solvers (LGS), a two-stage framework that (i) maps diverse PDE states into a shared latent physics space with a pretrained VAE, and (ii) learns probabilistic latent dynamics with a Transformer trained by flow matching. Our key mechanism is an uncertainty knob that perturbs latent inputs during training and inference, teaching the solver to correct off-manifold rollout drift and stabilizing autoregressive prediction. We further use flow forcing to update a system descriptor (context) from model-generated trajectories, aligning train/test conditioning and improving long-term stability. We pretrain on a curated corpus of ~2.5M trajectories at 128² resolution spanning 12 PDE families. LGS matches strong deterministic neural-operator baselines on short horizons while substantially reducing rollout drift on long horizons. Learning in latent space plus efficient architectural choices yields up to 70× lower FLOPs than non-generative baselines, enabling scalable pretraining. We also show efficient adaptation to an out-of-distribution 256² Kolmogorov flow dataset under limited finetuning budgets. Overall, LGS provides a practical route toward generalizable, uncertainty-aware neural PDE solvers that are more reliable for long-term forecasting and downstream scientific workflows.
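The abstract's core mechanism, flow-matching training on latent states combined with the "uncertainty knob" that perturbs the conditioning latent, can be sketched in a few lines. The following is a minimal, hedged illustration under assumed names (`LatentDynamics`, `flow_matching_step`, `sigma_max` are hypothetical), not the authors' implementation:

```python
# Minimal sketch (not the authors' code) of flow-matching training on latent
# states with an "uncertainty knob": the previous latent is perturbed with a
# randomly scaled noise so the dynamics model learns to correct off-manifold
# drift during autoregressive rollout. All names here are illustrative.
import torch
import torch.nn as nn

class LatentDynamics(nn.Module):
    """Toy stand-in for the Transformer that predicts the flow-matching velocity."""
    def __init__(self, dim, ctx_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * dim + ctx_dim + 2, 256), nn.GELU(),
            nn.Linear(256, dim),
        )

    def forward(self, z_t, z_prev, context, t, sigma):
        # t: flow-matching time in [0, 1]; sigma: perturbation scale as conditioning
        h = torch.cat([z_t, z_prev, context, t, sigma], dim=-1)
        return self.net(h)

def flow_matching_step(model, z_prev, z_next, context, sigma_max=0.1):
    """One training step: regress the velocity that transports noise to z_next,
    conditioned on a noise-perturbed previous latent state."""
    B = z_prev.shape[0]
    # Uncertainty knob: perturb the conditioning latent with a random scale.
    sigma = sigma_max * torch.rand(B, 1)
    z_prev_noisy = z_prev + sigma * torch.randn_like(z_prev)

    # Standard conditional flow matching between Gaussian noise and the target latent.
    t = torch.rand(B, 1)
    noise = torch.randn_like(z_next)
    z_t = (1 - t) * noise + t * z_next      # linear interpolation path
    target_velocity = z_next - noise        # its time derivative

    pred = model(z_t, z_prev_noisy, context, t, sigma)
    return ((pred - target_velocity) ** 2).mean()
```

The same perturbation scale is applied at inference time, so the model is always operating on inputs of the kind it saw during training.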

Top-level tags: systems, model training, machine learning
Detailed tags: physics simulation, PDE solvers, generative models, latent dynamics, flow matching

Latent Generative Solvers for Generalizable Long-Term Physics Simulation


1️⃣ One-Sentence Summary

This paper proposes a two-stage AI framework called LGS that maps diverse physical systems into a shared latent space and learns their dynamics there, enabling more stable and efficient prediction of long-term physical evolution while substantially reducing computational cost.
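To make the "predict in latent space, then decode" loop concrete, here is a hedged sketch of an autoregressive rollout, assuming a VAE encoder/decoder and a `sample_next_latent` callable that integrates the learned flow (e.g. with a few Euler steps); all names are hypothetical:

```python
# Hedged sketch of an autoregressive latent rollout; encoder, decoder and
# sample_next_latent are assumed interfaces, not the paper's actual API.
import torch

@torch.no_grad()
def rollout(encoder, decoder, sample_next_latent, u0, context, steps, sigma=0.05):
    """Roll the solver forward in latent space and decode each predicted state."""
    z = encoder(u0)
    states = []
    for _ in range(steps):
        # Same uncertainty knob as in training: a small perturbation keeps the
        # model exercised at correcting slightly off-manifold inputs.
        z_in = z + sigma * torch.randn_like(z)
        z = sample_next_latent(z_in, context)
        states.append(decoder(z))
    return torch.stack(states, dim=1)  # (batch, steps, ...)
```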

Source: arXiv: 2602.11229