Quantifying Epistemic Predictive Uncertainty in Conformal Prediction
1️⃣ One-sentence summary
This paper proposes a new method for efficiently quantifying, within the conformal prediction framework, the epistemic uncertainty that arises from the existence of multiple plausible predictive models. The method yields finer-grained and more informative uncertainty assessments than relying on prediction-set size alone, supporting more reliable decision-making when the model itself is uncertain.
We study the problem of quantifying epistemic predictive uncertainty (EPU) -- that is, uncertainty faced at prediction time due to the existence of multiple plausible predictive models -- within the framework of conformal prediction (CP). To expose the implicit model multiplicity underlying CP, we build on recent results showing that, under a mild assumption, any full CP procedure induces a closed and convex set of predictive distributions, commonly referred to as a credal set. Importantly, the conformal prediction region (CPR) coincides exactly with the set of labels to which all distributions in the induced credal set assign probability at least $1-\alpha$. As our first contribution, we prove that this characterisation also holds in split CP. Building on this connection, we then propose a computationally efficient and analytically tractable uncertainty measure, based on \emph{Maximum Mean Imprecision}, to quantify EPU by measuring the degree of conflicting information within the induced credal set. Experiments on active learning and selective classification demonstrate that the quantified EPU provides substantially more informative and fine-grained uncertainty assessments than reliance on CPR size alone. More broadly, this work highlights the potential of CP to serve as a principled basis for decision-making under epistemic uncertainty.
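For readers unfamiliar with the split CP procedure the abstract builds on, a minimal sketch follows. The toy three-class "model", the calibration-set size, and the nonconformity score s(x, y) = 1 - p(y | x) are illustrative assumptions for this sketch, not details taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_probs(n, rng):
    """Stand-in classifier: random softmax probabilities over 3 classes."""
    logits = rng.normal(size=(n, 3))
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

n_cal, alpha = 200, 0.1
cal_probs = toy_probs(n_cal, rng)
cal_labels = rng.integers(0, 3, size=n_cal)

# Nonconformity score on the held-out calibration set: s(x, y) = 1 - p(y | x).
scores = 1.0 - cal_probs[np.arange(n_cal), cal_labels]

# Conformal quantile with the standard finite-sample correction.
q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
qhat = np.quantile(scores, q_level, method="higher")

# CPR for a new point: every label whose score falls below the threshold.
test_probs = toy_probs(1, rng)[0]
pred_set = [y for y in range(3) if 1.0 - test_probs[y] <= qhat]
print(pred_set)
```

Under exchangeability, the resulting set covers the true label with probability at least 1 - alpha; the paper's point is that the *size* of this set is a coarse uncertainty signal, which its credal-set-based EPU measure refines.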
Source: arXiv: 2602.01667