Statistical Learning Analysis of Physics-Informed Neural Networks
1️⃣ One-Sentence Summary
This paper revisits physics-informed neural networks from a statistical learning perspective, interpreting their training as fitting an infinite data distribution generated by the physics equations, and uses singular learning theory to analyze their parameter estimates and predictive uncertainty.
We study the training and performance of physics-informed learning for initial and boundary value problems (IBVPs) with physics-informed neural networks (PINNs) from a statistical learning perspective. Specifically, we restrict ourselves to parameterizations with hard initial and boundary condition constraints and reformulate the problem of estimating PINN parameters as a statistical learning problem. From this perspective, the physics penalty on the IBVP residuals can be better understood not as a regularizing term but as an infinite source of indirect data, and the learning process as fitting the PINN distribution of residuals $p(y \mid x, t, w)\, q(x, t)$ to the true data-generating distribution $\delta(0)\, q(x, t)$ by minimizing the Kullback-Leibler divergence between the true and PINN distributions. Furthermore, this analysis shows that physics-informed learning with PINNs is a singular learning problem, and we employ singular learning theory tools, namely the so-called Local Learning Coefficient (Lau et al., 2025), to analyze the estimates of PINN parameters obtained via stochastic optimization for a heat equation IBVP. Finally, we discuss the implications of this analysis for the quantification of predictive uncertainty and the extrapolation capacity of PINNs.
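To make this reading concrete, below is a minimal sketch for a 1D heat equation $u_t = \alpha u_{xx}$ on $[0,1]\times[0,1]$ with initial condition $u(x,0)=\sin(\pi x)$ and zero Dirichlet boundaries; the specific equation, constraint ansatz, network size, and PyTorch implementation are illustrative assumptions, not details taken from the paper. The hard-constraint parameterization bakes the initial and boundary conditions into the architecture, and minimizing the mean squared residual over freshly sampled collocation points corresponds, up to constants under a Gaussian residual model, to minimizing the KL divergence to the point-mass distribution of zero residuals.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
alpha = 0.1  # assumed thermal diffusivity (illustrative value)

# Small MLP whose output is wrapped in a hard-constraint ansatz below.
net = nn.Sequential(
    nn.Linear(2, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 1),
)

def u(x, t):
    """Hard-constrained parameterization: u(x,0)=sin(pi x), u(0,t)=u(1,t)=0."""
    xt = torch.cat([x, t], dim=1)
    return torch.sin(torch.pi * x) + t * x * (1 - x) * net(xt)

def residual(x, t):
    """Heat-equation residual r = u_t - alpha * u_xx at collocation points."""
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    u_val = u(x, t)
    u_t = torch.autograd.grad(u_val.sum(), t, create_graph=True)[0]
    u_x = torch.autograd.grad(u_val.sum(), x, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    return u_t - alpha * u_xx

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(5000):
    # Fresh collocation points each step: samples from q(x, t), the
    # "infinite source of indirect data" in the abstract's reading.
    x = torch.rand(256, 1)
    t = torch.rand(256, 1)
    # Mean squared residual = negative log-likelihood of y = 0 under a
    # Gaussian residual model, up to additive and multiplicative constants.
    loss = residual(x, t).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Resampling new collocation points at every step is what makes the physics penalty behave as an unlimited supply of indirect data rather than a fixed finite training set; the singular-learning analysis in the paper is then applied to the parameter estimates $w$ obtained from such a run.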
Source: arXiv: 2602.11097