Statistical Query Lower Bounds for Smoothed Agnostic Learning
1️⃣ One-Sentence Summary
This paper proves that, when the data are subject to slight Gaussian perturbations, learning the best classifier in a class (halfspaces in particular) has very high computational complexity, nearly matching the upper bound of the best known algorithm and indicating that existing methods are close to the theoretical limit.
We study the complexity of smoothed agnostic learning, recently introduced by [CKKMS24], in which the learner competes with the best classifier in a target class under slight Gaussian perturbations of the inputs. Specifically, we focus on the prototypical task of agnostically learning halfspaces under subgaussian distributions in the smoothed model. The best known upper bound for this problem relies on $L_1$-polynomial regression and has complexity $d^{\tilde{O}(1/\sigma^2) \log(1/\epsilon)}$, where $\sigma$ is the smoothing parameter and $\epsilon$ is the excess error. Our main result is a Statistical Query (SQ) lower bound providing formal evidence that this upper bound is close to best possible. In more detail, we show that (even for Gaussian marginals) any SQ algorithm for smoothed agnostic learning of halfspaces requires complexity $d^{\Omega(1/\sigma^{2}+\log(1/\epsilon))}$. This is the first non-trivial lower bound on the complexity of this task and nearly matches the known upper bound. Roughly speaking, we show that applying $L_1$-polynomial regression to a smoothed version of the function is essentially best possible. Our techniques involve finding a moment-matching hard distribution by way of linear programming duality. This dual program corresponds exactly to finding a low-degree approximating polynomial to the smoothed version of the target function (which turns out to be the same condition required for the $L_1$-polynomial regression to work). Our explicit SQ lower bound then comes from proving lower bounds on this approximation degree for the class of halfspaces.
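To make the upper-bound side concrete, here is a minimal sketch of the $L_1$-polynomial-regression idea in one dimension: fit a low-degree polynomial to the $\pm 1$ labels under absolute (L1) loss, cast as a linear program, then classify by the sign of the fitted polynomial. The function name, degree choice, and 1-D setting are illustrative assumptions; the algorithm referenced in the abstract runs over $d$-dimensional features with degree $\tilde{O}(1/\sigma^2)\log(1/\epsilon)$.

```python
import numpy as np
from scipy.optimize import linprog

def l1_poly_regression(x, y, degree):
    """Sketch of L1 polynomial regression (illustrative, 1-D case).

    Minimizes sum_i |p(x_i) - y_i| over polynomials p of the given
    degree, phrased as an LP with slack variables s_i >= |p(x_i) - y_i|.
    """
    Phi = np.vander(x, degree + 1)          # monomial features [x^k, ..., x, 1]
    n, m = Phi.shape
    # Decision variables: [coefficients c (m), slacks s (n)]; minimize 1^T s.
    cost = np.concatenate([np.zeros(m), np.ones(n)])
    # Encode  Phi c - y <= s  and  y - Phi c <= s.
    A_ub = np.block([[Phi, -np.eye(n)], [-Phi, -np.eye(n)]])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * m + [(0, None)] * n
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:m]

# Toy usage: labels from a 1-D "halfspace" (threshold at 0.3).
rng = np.random.default_rng(0)
x = rng.standard_normal(200)
y = np.sign(x - 0.3)
c = l1_poly_regression(x, y, degree=3)
pred = np.sign(np.polyval(c, x))            # classify by the sign of p
err = np.mean(pred != y)
```

The sign of the L1-fitted polynomial achieves small error whenever the target admits a good low-degree $L_1$ approximation, which is exactly the condition the paper's lower bound shows cannot be improved in the smoothed setting.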
Source: arXiv: 2602.21191