arXiv submission date: 2026-02-10
📄 Abstract - Stabilizing Physics-Informed Consistency Models via Structure-Preserving Training

We propose a physics-informed consistency modeling framework for solving partial differential equations (PDEs) via fast, few-step generative inference. We identify a key stability challenge in physics-constrained consistency training, where PDE residuals can drive the model toward trivial or degenerate solutions, degrading the learned data distribution. To address this, we introduce a structure-preserving two-stage training strategy that decouples distribution learning from physics enforcement by freezing the coefficient decoder during physics-informed fine-tuning. We further propose a two-step residual objective that enforces physical consistency on refined, structurally valid generative trajectories rather than noisy single-step predictions. The resulting framework enables stable, high-fidelity inference for both unconditional generation and forward problems. We demonstrate that forward solutions can be obtained via a projection-based zero-shot inpainting procedure, matching the accuracy of diffusion baselines at an orders-of-magnitude lower computational cost.
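The abstract includes no code, but the training recipe is concrete enough to sketch. Below is a minimal, illustrative PyTorch sketch of the two ideas it describes: freezing the coefficient decoder during physics-informed fine-tuning (stage 2), and evaluating the PDE residual on a refined two-step prediction rather than a noisy single-step one. Every name here (`ConsistencyNet`, `CoefficientDecoder`, `pde_residual`, the noise levels) is a hypothetical placeholder inferred from the abstract, not the authors' implementation; a real residual would differentiate the decoded solution through the PDE operator.

```python
import torch

class ConsistencyNet(torch.nn.Module):
    """Maps a noisy latent x_t at noise level t to a denoised latent."""
    def __init__(self, dim=128):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(dim + 1, 256),
            torch.nn.SiLU(),
            torch.nn.Linear(256, dim),
        )

    def forward(self, x_t, t):
        return self.net(torch.cat([x_t, t], dim=-1))

class CoefficientDecoder(torch.nn.Module):
    """Decodes latents into PDE solution coefficients (learned in stage 1)."""
    def __init__(self, dim=128, n_coeff=64):
        super().__init__()
        self.net = torch.nn.Linear(dim, n_coeff)

    def forward(self, z):
        return self.net(z)

def pde_residual(coeffs):
    """Stand-in for ||N[u]||^2; a real residual would differentiate the
    decoded solution through the PDE operator."""
    return (coeffs ** 2).mean()

model, decoder = ConsistencyNet(), CoefficientDecoder()

# Stage 2: physics-informed fine-tuning with the decoder frozen, so the
# residual loss cannot deform the coefficient structure learned in stage 1.
for p in decoder.parameters():
    p.requires_grad_(False)

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
x = torch.randn(32, 128)                        # batch of latent data
t1 = torch.full((32, 1), 0.8)                   # higher noise level
t2 = torch.full((32, 1), 0.3)                   # lower noise level

# Two-step residual objective: take one consistency step, re-noise to a
# lower level, refine once more, and only then evaluate the physics loss,
# so the residual is enforced on a refined, structurally valid sample.
x_t = x + t1 * torch.randn_like(x)
z1 = model(x_t, t1)                             # step 1: coarse prediction
z2 = model(z1 + t2 * torch.randn_like(z1), t2)  # step 2: refined prediction
loss = pde_residual(decoder(z2))

opt.zero_grad()
loss.backward()
opt.step()
```

The design intuition, as stated in the abstract: with the decoder frozen, the physics loss can only move latent predictions within the decoder's already-learned output manifold, which is the structure-preserving mechanism that keeps residual minimization from collapsing the model toward trivial solutions.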

Top-level tags: model training, systems, theory
Detailed tags: physics-informed ML, consistency models, partial differential equations, generative inference, training stability

Stabilizing Physics-Informed Consistency Models via Structure-Preserving Training


1️⃣ One-Sentence Summary

This paper proposes a new training method that uses staged learning and an improved physics-constrained objective to keep generative models from collapsing or distorting when they are used to rapidly solve physical equations, greatly reducing computational cost while maintaining high accuracy.
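For the forward problems mentioned above, the abstract describes a projection-based zero-shot inpainting procedure. A minimal sketch of one common way to realize such a projection during few-step consistency sampling is given below: after each denoising step, known entries (e.g. boundary or initial data) are overwritten with their measured values, with no retraining required. The sampler structure, noise schedule, and all names are assumptions for illustration, not the paper's exact algorithm; `model` refers to the `ConsistencyNet` from the sketch above.

```python
import torch

@torch.no_grad()
def inpaint_forward_solve(model, obs_mask, obs_values, dim=128, steps=2):
    """Few-step consistency sampling with a hard data-consistency projection.

    obs_mask   : boolean tensor marking known entries (e.g. boundary data)
    obs_values : tensor holding the measured values at those entries
    """
    sigmas = torch.linspace(1.0, 0.1, steps)    # illustrative noise schedule
    x = torch.randn(1, dim)                     # start from pure noise
    for i, sigma in enumerate(sigmas):
        x = model(x, sigma.view(1, 1))          # one consistency denoising step
        # Projection: overwrite the observed entries so the sample stays
        # consistent with the measurements (zero-shot, no retraining).
        x = torch.where(obs_mask, obs_values, x)
        if i < steps - 1:                       # re-noise before the next step
            x = x + sigmas[i + 1] * torch.randn_like(x)
    return x

# Example: pretend the first 16 degrees of freedom are measured to be zero.
mask = torch.zeros(1, 128, dtype=torch.bool)
mask[:, :16] = True
u = inpaint_forward_solve(model, mask, torch.zeros(1, 128))
```

Because the projection touches only the sampling loop, the same pretrained model serves both unconditional generation and forward solving, which is where the claimed orders-of-magnitude cost reduction over diffusion baselines comes from.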

Source: arXiv:2602.09303