Beyond Accuracy: Reliability and Uncertainty Estimation in Convolutional Neural Networks
1️⃣ One-Sentence Summary
By comparing two uncertainty estimation methods, this paper argues that evaluating deep learning models requires attention not only to predictive accuracy but also to the reliability and calibration of their predictions, which is critical for high-stakes decision-making.
Deep neural networks (DNNs) have become integral to a wide range of scientific and practical applications due to their flexibility and strong predictive performance. Despite their accuracy, however, DNNs frequently exhibit poor calibration, often assigning overly confident probabilities to incorrect predictions. This limitation underscores the growing need for integrated mechanisms that provide reliable uncertainty estimation. In this article, we compare two prominent approaches to uncertainty quantification: a Bayesian approximation via Monte Carlo Dropout and the nonparametric Conformal Prediction framework. Both methods are assessed using two convolutional neural network architectures, H-CNN VGG16 and GoogLeNet, trained on the Fashion-MNIST dataset. The empirical results show that although H-CNN VGG16 attains higher predictive accuracy, it tends to exhibit pronounced overconfidence, whereas GoogLeNet yields better-calibrated uncertainty estimates. Conformal Prediction additionally demonstrates consistent validity by producing statistically guaranteed prediction sets, highlighting its practical value in high-stakes decision-making contexts. Overall, the findings emphasize the importance of evaluating model performance beyond accuracy alone and contribute to the development of more reliable and trustworthy deep learning systems.
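The Monte Carlo Dropout idea mentioned above can be illustrated with a minimal NumPy sketch: dropout is kept active at inference time, and repeated stochastic forward passes are averaged to obtain a predictive distribution and an uncertainty measure. The weights and input below are random placeholders standing in for a trained network; the layer sizes, dropout rate, and number of samples are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical MC Dropout sketch: random weights stand in for a trained model.
rng = np.random.default_rng(1)
n_in, n_hidden, n_classes = 64, 128, 10
W1 = rng.normal(0, 0.1, (n_in, n_hidden))
W2 = rng.normal(0, 0.1, (n_hidden, n_classes))
x = rng.normal(0, 1, n_in)

def forward(x, p_drop=0.5):
    """One stochastic pass: the dropout mask is resampled on every call."""
    h = np.maximum(x @ W1, 0.0)               # ReLU hidden layer
    mask = rng.random(h.shape) >= p_drop      # dropout stays ON at test time
    h = h * mask / (1.0 - p_drop)             # inverted-dropout scaling
    logits = h @ W2
    e = np.exp(logits - logits.max())
    return e / e.sum()                        # softmax probabilities

T = 100                                       # number of MC samples
samples = np.stack([forward(x) for _ in range(T)])   # shape (T, n_classes)
mean_probs = samples.mean(axis=0)             # predictive mean over passes
entropy = -(mean_probs * np.log(mean_probs + 1e-12)).sum()
print(f"predicted class: {mean_probs.argmax()}, predictive entropy: {entropy:.3f}")
```

The spread across the `T` samples (here summarized by the entropy of the averaged softmax) is what serves as the approximate Bayesian uncertainty estimate.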
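The "statistically guaranteed prediction sets" of Conformal Prediction can likewise be sketched with split conformal prediction on synthetic softmax scores. The calibration/test sizes, the nonconformity score (one minus the true-class probability), and the 90% target coverage are illustrative assumptions; the synthetic score generator stands in for a trained CNN's outputs.

```python
import numpy as np

# Hypothetical split-conformal sketch on synthetic softmax-like scores.
rng = np.random.default_rng(0)
n_cal, n_test, n_classes = 500, 100, 10

def synthetic_softmax(n, n_classes, rng):
    """Generate peaked probability vectors plus their true labels."""
    labels = rng.integers(0, n_classes, size=n)
    logits = rng.normal(0, 1, size=(n, n_classes))
    logits[np.arange(n), labels] += 2.0   # make the true class more likely
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    return probs, labels

cal_probs, cal_labels = synthetic_softmax(n_cal, n_classes, rng)
test_probs, test_labels = synthetic_softmax(n_test, n_classes, rng)

# Nonconformity score: 1 - probability assigned to the true class.
cal_scores = 1.0 - cal_probs[np.arange(n_cal), cal_labels]

# Conformal quantile for target coverage 1 - alpha.
alpha = 0.1
q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
qhat = np.quantile(cal_scores, q_level, method="higher")

# Prediction set: every class whose probability clears the threshold.
pred_sets = test_probs >= 1.0 - qhat      # boolean array (n_test, n_classes)
coverage = pred_sets[np.arange(n_test), test_labels].mean()
print(f"empirical coverage: {coverage:.2f} (target {1 - alpha:.2f})")
print(f"average set size: {pred_sets.sum(axis=1).mean():.2f}")
```

Under exchangeability of calibration and test data, the set contains the true class with probability at least 1 − alpha, which is the distribution-free validity guarantee the abstract refers to; harder inputs simply receive larger sets rather than overconfident point predictions.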
Source: arXiv: 2603.10731