Distribution-Free Robust Functional Predict-Then-Optimize
1️⃣ One-Sentence Summary
This paper proposes a new method that equips neural operator surrogate models, used for fast PDE solving, with computable, distribution-free uncertainty measures, and applies this uncertainty quantification to downstream robust decision-making tasks, thereby preserving decision quality while avoiding the restrictive distributional assumptions or prohibitive computational cost of existing approaches.
The solution of PDEs in decision-making tasks is increasingly undertaken with neural operator surrogate models, driven by the need for repeated evaluation. While far more computationally favorable than their numerical counterparts, such methods fail to provide any calibrated notion of uncertainty in their predictions. Current approaches typically address this deficiency with ensembling or Bayesian posterior estimation; however, these either require distributional assumptions that fail to hold in practice or lack practical scalability. We therefore propose a novel application of conformal prediction to produce distribution-free uncertainty quantification over the function spaces mapped by neural operators. We then demonstrate how such prediction regions, when leveraged in downstream robust decision-making tasks, enable a formal regret characterization. We further demonstrate how these robust decision-making tasks can be solved efficiently using an infinite-dimensional generalization of Danskin's Theorem and the calculus of variations, and empirically demonstrate the superior performance of our proposed method over more restrictive modeling paradigms, such as Gaussian Processes, across several engineering tasks.
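To make the core idea concrete, the following is a minimal sketch of split conformal prediction applied to function-valued predictions, assuming a sup-norm nonconformity score. The `surrogate` and `truth` functions here are toy stand-ins for a trained neural operator and the true PDE solution map, not the paper's actual models:

```python
import numpy as np

rng = np.random.default_rng(0)
grid_size = 64          # discretization points of the output function
n_cal, alpha = 200, 0.1  # calibration set size, miscoverage level

# Toy stand-ins: a "neural operator" surrogate and the true solution map,
# which differ by small pointwise noise. Purely illustrative.
def surrogate(u):
    return np.cumsum(u) / len(u)

def truth(u):
    return np.cumsum(u) / len(u) + 0.05 * rng.standard_normal(len(u))

# Split conformal: score each calibration pair by the sup-norm residual
# between the surrogate's predicted function and the true output function.
scores = []
for _ in range(n_cal):
    u = rng.standard_normal(grid_size)
    scores.append(np.max(np.abs(truth(u) - surrogate(u))))

# Conformal quantile with the standard finite-sample correction.
k = int(np.ceil((n_cal + 1) * (1 - alpha)))
q_hat = np.sort(scores)[k - 1]

# Prediction band for a new input: surrogate(u_new) +/- q_hat, which
# contains the true output function (in sup norm) with probability
# at least 1 - alpha, with no assumptions on the data distribution
# beyond exchangeability.
u_new = rng.standard_normal(grid_size)
lower = surrogate(u_new) - q_hat
upper = surrogate(u_new) + q_hat
```

A downstream robust optimization would then minimize the worst-case objective over all functions lying inside this band, which is the role played by the Danskin-type result in the paper.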
Source: arXiv:2602.08215