The Sample Complexity of Replicable Realizable PAC Learning
1️⃣ One-Sentence Summary
By constructing a particularly hard learning problem and applying novel techniques from graph theory and random walks, this paper gives the first proof that, in the replicable PAC learning framework, the required sample size has a lower bound that grows almost as the 3/2 power of the logarithm of the hypothesis class size, and that this lower bound is nearly tight.
In this paper, we consider the problem of replicable realizable PAC learning. We construct a particularly hard learning problem and prove a sample complexity lower bound that scales nearly as $(\log|H|)^{3/2}$ in the size of the hypothesis class $H$. Our proof introduces several novel techniques: it defines a particular Cayley graph associated with $H$ and analyzes a suitable random walk on this graph through the spectral properties of its adjacency matrix. Furthermore, we show an almost matching upper bound for the lower bound instance, so any stronger lower bound would have to come from a different instance of the problem.
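The paper's specific Cayley graph construction is not spelled out in this summary. As a purely illustrative sketch of the spectral-analysis step, the snippet below builds the adjacency matrix of a simple Cayley graph (the hypercube on $\mathbb{Z}_2^n$ with standard-basis generators, an assumed stand-in, not the paper's actual graph) and reads off the spectral gap of a lazy random walk on it, which controls the walk's mixing behavior.

```python
import numpy as np

def hypercube_cayley_adjacency(n: int) -> np.ndarray:
    """Adjacency matrix of the Cayley graph of Z_2^n with the n
    standard-basis generators (i.e., the n-dimensional hypercube).
    Vertices are bitstrings; v and v XOR e_i are adjacent."""
    N = 2 ** n
    A = np.zeros((N, N))
    for v in range(N):
        for i in range(n):
            A[v, v ^ (1 << i)] = 1.0
    return A

n = 4
A = hypercube_cayley_adjacency(n)
# Lazy random walk: stay put with probability 1/2, else move to a
# uniformly random neighbor. Laziness makes all eigenvalues nonnegative.
P = 0.5 * np.eye(2 ** n) + 0.5 * A / n
eigs = np.sort(np.linalg.eigvalsh(P))[::-1]  # P is symmetric here
spectral_gap = 1.0 - eigs[1]
print(f"second eigenvalue: {eigs[1]:.4f}, spectral gap: {spectral_gap:.4f}")
```

For this hypercube the eigenvalues of `P` are $1 - k/n$ for $k = 0, \dots, n$, so the spectral gap is exactly $1/n$; a small gap means slow mixing, which is the kind of quantitative handle a spectral argument over such a graph provides.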
Source: arXiv: 2602.19552