Universal Approximation Constraints of Narrow ResNets: The Tunnel Effect
1️⃣ One-Sentence Summary
Through theoretical and numerical analysis, this paper shows that narrow residual networks (ResNets) face a fundamental limitation when approximating certain functions, manifested primarily as a "tunnel effect": the network cannot accurately represent certain critical points of the target function, and its approximation capability depends strongly on the signal ratio between the skip and residual channels.
We analyze the universal approximation constraints of narrow Residual Neural Networks (ResNets) both theoretically and numerically. For deep neural networks without input space augmentation, a central constraint is the inability to represent critical points of the input-output map. We prove that this has global consequences for target function approximations and show that the manifestation of this defect is typically a shift of the critical point to infinity, which we call the "tunnel effect" in the context of classification tasks. While ResNets offer greater expressivity than standard multilayer perceptrons (MLPs), their capability strongly depends on the signal ratio between the skip and residual channels. We establish quantitative approximation bounds for both the residual-dominant (close to MLP) and skip-dominant (close to neural ODE) regimes. These estimates depend explicitly on the channel ratio and uniform network weight bounds. Low-dimensional examples further provide a detailed analysis of the different ResNet regimes and how architecture-target incompatibility influences the approximation error.
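The two regimes the abstract contrasts can be sketched with a simple residual update of the form x ← α·x + β·σ(Wx + b), where α weights the skip channel and β the residual channel. This is an illustrative parametrization, not the paper's exact formulation; the layer width equals the input dimension, which is what "narrow" (no input space augmentation) refers to.

```python
import numpy as np

# Illustrative narrow ResNet: width equals input dimension, no augmentation.
# alpha scales the skip channel, beta the residual channel; their ratio
# interpolates between an MLP-like regime (residual-dominant) and a
# neural-ODE-like regime (skip-dominant). Hypothetical parametrization.

def resnet_layer(x, W, b, alpha, beta):
    """One residual layer with explicit skip/residual channel weights."""
    return alpha * x + beta * np.tanh(W @ x + b)

def narrow_resnet(x, weights, biases, alpha, beta):
    """Stack of narrow residual layers applied to input x."""
    for W, b in zip(weights, biases):
        x = resnet_layer(x, W, b, alpha, beta)
    return x

rng = np.random.default_rng(0)
d, depth = 2, 8
weights = [rng.normal(scale=0.5, size=(d, d)) for _ in range(depth)]
biases = [rng.normal(scale=0.1, size=d) for _ in range(depth)]
x0 = np.array([1.0, -0.5])

# Residual-dominant (close to an MLP): skip channel nearly suppressed.
y_mlp_like = narrow_resnet(x0, weights, biases, alpha=0.1, beta=1.0)

# Skip-dominant (close to a neural ODE): small residual steps, as in an
# explicit Euler discretization of x' = sigma(Wx + b).
y_ode_like = narrow_resnet(x0, weights, biases, alpha=1.0, beta=1.0 / depth)
```

In the skip-dominant setting the map is a small perturbation of the identity at each layer, which is why it behaves like a neural ODE flow; in the residual-dominant setting the skip connection contributes little and the network behaves like a plain MLP.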
Source: arXiv:2603.28591