arXiv submission date: 2026-02-04
📄 Abstract - Anytime-Valid Conformal Risk Control

Prediction sets provide a means of quantifying the uncertainty in predictive tasks. Using held-out calibration data, conformal prediction and risk control can produce prediction sets that exhibit statistically valid error control in a computationally efficient manner. However, in the standard formulations, the error is only controlled on average over many possible calibration datasets of fixed size. In this paper, we extend the control to remain valid with high probability over a cumulatively growing calibration dataset at any time point. We derive such guarantees using quantile-based arguments and illustrate the applicability of the proposed framework to settings involving distribution shift. We further establish a matching lower bound and show that our guarantees are asymptotically tight. Finally, we demonstrate the practical performance of our methods through both simulations and real-world numerical examples.

Top-level tags: theory, model evaluation, machine learning
Detailed tags: conformal prediction, risk control, statistical guarantees, distribution shift, prediction sets

Anytime-Valid Conformal Risk Control


1️⃣ One-Sentence Summary

This paper proposes a new statistical method that guarantees, at any point in time as calibration data accumulate, that a machine learning model's predictive uncertainty (expressed as prediction sets) satisfies a preset error-control requirement with high probability, and that remains valid even when the data distribution shifts.
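To make the setting concrete, the sketch below shows the standard split conformal prediction baseline that the paper builds on: a held-out calibration set yields a quantile of nonconformity scores, which in turn defines prediction intervals with marginal coverage at least 1 − α. This is a minimal illustration under synthetic Gaussian scores, not the paper's anytime-valid procedure (which strengthens this average-case guarantee to hold with high probability at every time point).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration data: nonconformity scores s_i = |y_i - f(x_i)|.
# In practice these come from a held-out calibration set.
n = 1000
cal_scores = np.abs(rng.normal(size=n))

alpha = 0.1  # target miscoverage level

# Split conformal quantile: the ceil((n+1)(1-alpha))/n empirical quantile
# of the calibration scores gives marginal coverage >= 1 - alpha
# (on average over calibration sets of this fixed size).
q_level = np.ceil((n + 1) * (1 - alpha)) / n
q_hat = np.quantile(cal_scores, q_level, method="higher")

def prediction_interval(y_pred, q=q_hat):
    """Prediction set for a point prediction f(x): [f(x)-q, f(x)+q]."""
    return (y_pred - q, y_pred + q)

# Empirical coverage on fresh scores from the same distribution
# should land near the nominal 1 - alpha = 0.9.
test_scores = np.abs(rng.normal(size=5000))
coverage = np.mean(test_scores <= q_hat)
```

The guarantee sketched here is only marginal over the randomness of a fixed-size calibration set; the paper's contribution is to make an analogous guarantee hold with high probability uniformly as `cal_scores` grows over time.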

Source: arXiv:2602.04364