arXiv submission date: 2026-03-17
📄 Abstract - Optimal uncertainty bounds for multivariate kernel regression under bounded noise: A Gaussian process-based dual function

Non-conservative uncertainty bounds are essential for making reliable predictions about latent functions from noisy data--and thus, a key enabler for safe learning-based control. In this domain, kernel methods such as Gaussian process regression are established techniques, thanks to their inherent uncertainty quantification mechanism. Still, existing bounds either pose strong assumptions on the underlying noise distribution, are conservative, do not scale well in the multi-output case, or are difficult to integrate into downstream tasks. This paper addresses these limitations by presenting a tight, distribution-free bound for multi-output kernel-based estimates. It is obtained through an unconstrained, duality-based formulation, which shares the same structure as classic Gaussian process confidence bounds and can thus be straightforwardly integrated into downstream optimization pipelines. We show that the proposed bound generalizes many existing results and illustrate its application using an example inspired by quadrotor dynamics learning.
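For context, the abstract states that the proposed bound shares the structure of classic Gaussian process confidence bounds, i.e. an interval of the form μ(x) ± β·σ(x) around the posterior mean. The sketch below illustrates only that classic structure on a toy 1-D problem; the kernel, data, and the scaling factor `beta` are illustrative assumptions, not the paper's construction (the paper derives its own tight, distribution-free β for the multi-output case).

```python
# Minimal sketch of classic GP confidence bounds |f(x) - mu(x)| <= beta * sigma(x).
# Not the paper's method: kernel hyperparameters, data, and `beta` are illustrative.
import numpy as np

def rbf_kernel(A, B, lengthscale=0.5, variance=1.0):
    """Squared-exponential kernel k(a, b) = variance * exp(-|a - b|^2 / (2 l^2))."""
    sq_dists = (np.sum(A**2, axis=1)[:, None]
                + np.sum(B**2, axis=1)[None, :]
                - 2.0 * A @ B.T)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def gp_posterior(X_train, y_train, X_test, noise_var=1e-2):
    """Standard GP posterior mean and standard deviation at the test inputs."""
    K = rbf_kernel(X_train, X_train) + noise_var * np.eye(len(X_train))
    K_star = rbf_kernel(X_test, X_train)
    K_ss = rbf_kernel(X_test, X_test)
    K_inv = np.linalg.inv(K)
    mean = K_star @ K_inv @ y_train
    cov = K_ss - K_star @ K_inv @ K_star.T
    std = np.sqrt(np.clip(np.diag(cov), 0.0, None))
    return mean, std

# Toy 1-D example: noisy observations of sin(x).
rng = np.random.default_rng(0)
X_train = rng.uniform(-3, 3, size=(20, 1))
y_train = np.sin(X_train[:, 0]) + 0.1 * rng.standard_normal(20)
X_test = np.linspace(-3, 3, 100)[:, None]

mean, std = gp_posterior(X_train, y_train, X_test)
beta = 2.0  # illustrative scaling; the paper replaces this with a tight, distribution-free factor
lower, upper = mean - beta * std, mean + beta * std
print("bound width at first test point:", upper[0] - lower[0])
```

Because the interval is expressed as mean ± scaled standard deviation, it can be plugged into downstream optimization or control constraints wherever classic GP confidence bounds are already used.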

Top-level tags: machine learning, theory, systems
Detailed tags: kernel regression, uncertainty quantification, Gaussian processes, distribution-free bounds, multi-output regression

Optimal uncertainty bounds for multivariate kernel regression under bounded noise: A Gaussian process-based dual function


1️⃣ One-sentence summary

This paper proposes a new, tighter, and easy-to-use mathematical bound for assessing the confidence of predictions when learning multivariate functions from noisy data; the bound does not rely on assumptions about a specific noise distribution and can be conveniently integrated into downstream optimization and control tasks.

Source: arXiv:2603.16481