arXiv submission date: 2026-02-17
📄 Abstract - ExLipBaB: Exact Lipschitz Constant Computation for Piecewise Linear Neural Networks

It has been shown that a neural network's Lipschitz constant can be leveraged to derive robustness guarantees, to improve generalizability via regularization or even to construct invertible networks. Therefore, a number of methods varying in the tightness of their bounds and their computational cost have been developed to approximate the Lipschitz constant for different classes of networks. However, comparatively little research exists on methods for exact computation, which has been shown to be NP-hard. Nonetheless, there are applications where one might readily accept the computational cost of an exact method. These applications could include the benchmarking of new methods or the computation of robustness guarantees for small models on sensitive data. Unfortunately, existing exact algorithms restrict themselves to only ReLU-activated networks, which are known to come with severe downsides in the context of Lipschitz-constrained networks. We therefore propose a generalization of the LipBaB algorithm to compute exact Lipschitz constants for arbitrary piecewise linear neural networks and $p$-norms. With our method, networks may contain traditional activations like ReLU or LeakyReLU, activations like GroupSort or the related MinMax and FullSort, which have been of increasing interest in the context of Lipschitz constrained networks, or even other piecewise linear functions like MaxPool.
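For reference (standard background, not quoted from the paper): the $p$-norm Lipschitz constant of a network $f$ is

$$
L_p(f) \;=\; \sup_{x \neq y} \frac{\lVert f(x) - f(y) \rVert_p}{\lVert x - y \rVert_p},
$$

and for a piecewise linear $f$ on a convex domain the Jacobian is constant on each linear region, so the constant reduces to a maximum of induced operator norms,

$$
L_p(f) \;=\; \max_{\text{linear regions } R} \lVert J_R \rVert_p .
$$

Methods in the LipBaB family obtain this maximum by branching over the (exponentially many) activation regions, which is why exact computation is expensive (NP-hard, as noted above).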

Top-level tags: theory model evaluation machine learning
Detailed tags: lipschitz constant neural networks robustness guarantees piecewise linear exact computation

ExLipBaB: Exact Lipschitz Constant Computation for Piecewise Linear Neural Networks


1️⃣ One-Sentence Summary

This paper proposes a new algorithm, ExLipBaB, that exactly computes the Lipschitz constant, under arbitrary $p$-norms, of neural networks built from piecewise linear activation functions (such as ReLU, LeakyReLU, and GroupSort), providing an exact reference tool for applications such as model robustness evaluation and invertible network design.
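To make the setting concrete, below is a minimal, self-contained sketch (illustrative only, not the paper's ExLipBaB implementation): a GroupSort activation and a sampled lower bound on the $L_2$ Lipschitz constant of a tiny network, obtained from Jacobian spectral norms at random points. Function names such as `group_sort` and `sampled_lipschitz_lower_bound` are hypothetical; ExLipBaB, by contrast, certifies the exact constant (following the branch-and-bound approach its name inherits from LipBaB) rather than relying on sampling.

```python
# Illustrative sketch only: a GroupSort activation and a Monte Carlo *lower bound*
# on the L2 Lipschitz constant of a small piecewise linear network.
# This is not the paper's ExLipBaB algorithm; names here are hypothetical.
import numpy as np

def group_sort(x, group_size=2):
    """Sort each consecutive group of `group_size` entries (piecewise linear, 1-Lipschitz)."""
    n = x.shape[-1]
    assert n % group_size == 0
    g = x.reshape(*x.shape[:-1], n // group_size, group_size)
    return np.sort(g, axis=-1).reshape(x.shape)

def forward(x, weights, biases):
    """Two-layer network with a GroupSort hidden activation."""
    h = group_sort(weights[0] @ x + biases[0])
    return weights[1] @ h + biases[1]

def numerical_jacobian(f, x, eps=1e-6):
    """Central finite differences; exact up to rounding inside a linear region."""
    y = f(x)
    J = np.zeros((y.size, x.size))
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        J[:, i] = (f(x + e) - f(x - e)) / (2 * eps)
    return J

def sampled_lipschitz_lower_bound(f, dim, n_samples=1000, seed=0):
    """Max spectral norm of the Jacobian over random points: a lower bound
    on the global L2 Lipschitz constant of a piecewise linear network."""
    rng = np.random.default_rng(seed)
    best = 0.0
    for _ in range(n_samples):
        x = rng.normal(size=dim)
        J = numerical_jacobian(f, x)
        best = max(best, np.linalg.norm(J, 2))  # induced 2-norm = largest singular value
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    W = [rng.normal(size=(4, 3)), rng.normal(size=(2, 4))]
    b = [rng.normal(size=4), rng.normal(size=2)]
    f = lambda x: forward(x, W, b)
    print("sampled lower bound on the L2 Lipschitz constant:", sampled_lipschitz_lower_bound(f, 3))
```

Sampling like this only ever yields a lower bound: the maximizing linear region may be small and easily missed, which is exactly the gap that exact methods close.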

Source: arXiv:2602.15499