arXiv submission date: 2026-03-16
📄 Abstract - Physics-informed fine-tuning of foundation models for partial differential equations

Foundation models for partial differential equations (PDEs) have emerged as powerful surrogates pre-trained on diverse physical systems, but adapting them to new downstream tasks remains challenging due to limited task-specific data and distribution shifts. While fine-tuning has proven transformative in natural language processing, best practices for adapting PDE foundation models remain underexplored. Although physics-informed training has successfully trained accurate solvers across a wide range of PDE problems, its potential for fine-tuning data-based foundation models has not been systematically studied. In this work, we introduce a physics-informed fine-tuning framework that adapts pre-trained PDE foundation models by incorporating physical constraints (PDE residuals and boundary conditions) directly into the fine-tuning objective. This enables effective adaptation in data-scarce regimes while promoting physical consistency. We evaluate our method on a downstream task composed of an unseen PDE class and compare it with data-driven fine-tuning counterparts. Our results demonstrate that physics-informed fine-tuning achieves competitive accuracy without requiring PDE solutions for training. Furthermore, a hybrid fine-tuning strategy yields superior generalization to out-of-distribution scenarios when only minimal training data is available. These findings establish physics-informed fine-tuning as a scalable and data-efficient paradigm, providing a physically interpretable pathway for adapting foundation models in scientific machine learning.
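The core idea of the abstract is that the fine-tuning objective replaces (or augments) a data loss with PDE residual and boundary-condition penalties, so no reference solutions are needed. A minimal sketch of such an objective, assuming a PyTorch-style model and using the 1D heat equation as an illustrative PDE (the model name, architecture, and equation here are hypothetical, not the paper's actual setup):

```python
import torch

class TinySurrogate(torch.nn.Module):
    """Stand-in for a pre-trained PDE foundation model head (hypothetical)."""
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(2, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
        )

    def forward(self, xt):
        # xt has columns (x, t); output is the predicted solution u(x, t)
        return self.net(xt)

def physics_informed_loss(model, xt_interior, xt_boundary, u_boundary, nu=0.1):
    """Residual of the heat equation u_t - nu * u_xx = 0 plus a BC penalty."""
    xt = xt_interior.clone().requires_grad_(True)
    u = model(xt)
    # First derivatives of u w.r.t. (x, t) via autograd
    grads = torch.autograd.grad(u.sum(), xt, create_graph=True)[0]
    u_x, u_t = grads[:, 0:1], grads[:, 1:2]
    # Second derivative u_xx, keeping the graph so the loss is differentiable
    u_xx = torch.autograd.grad(u_x.sum(), xt, create_graph=True)[0][:, 0:1]
    pde_loss = ((u_t - nu * u_xx) ** 2).mean()
    bc_loss = ((model(xt_boundary) - u_boundary) ** 2).mean()
    return pde_loss + bc_loss

model = TinySurrogate()
xt_int = torch.rand(64, 2)          # interior collocation points
xt_bc = torch.rand(16, 2)           # boundary points
u_bc = torch.zeros(16, 1)           # prescribed boundary values
loss = physics_informed_loss(model, xt_int, xt_bc, u_bc)
loss.backward()  # gradients flow to model parameters for fine-tuning
```

The hybrid strategy mentioned in the abstract would presumably add a supervised data term (e.g., an MSE against the few available solutions) to this physics loss, weighted against the residual and boundary terms.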

Top-level tags: machine learning, model training, systems
Detailed tags: physics-informed machine learning, partial differential equations, foundation models, fine-tuning, scientific machine learning

Physics-informed fine-tuning of foundation models for partial differential equations


1️⃣ One-sentence summary

This work proposes a new method for fine-tuning PDE foundation models with physical constraints, enabling effective adaptation to new tasks and better generalization to unseen scenarios when data is scarce.

Source: arXiv:2603.15431