Learning with Boolean threshold functions
1️⃣ One-sentence summary
This paper proposes a new method for training Boolean neural networks with nonconvex constraints and a projection algorithm. The method produces interpretable models composed of simple logic gates with ±1 weights, and it outperforms conventional gradient-based methods on a variety of discrete learning tasks.
We develop a method for training neural networks on Boolean data in which the values at all nodes are strictly $\pm 1$, and the resulting models are typically equivalent to networks whose nonzero weights are also $\pm 1$. The method replaces loss minimization with a nonconvex constraint formulation. Each node implements a Boolean threshold function (BTF), and training is expressed through a divide-and-concur decomposition into two complementary constraints: one enforces local BTF consistency between inputs, weights, and output; the other imposes architectural concurrence, equating neuron outputs with downstream inputs and enforcing weight equality across training-data instantiations of the network. The reflect-reflect-relax (RRR) projection algorithm is used to reconcile these constraints. Each BTF constraint includes a lower bound on the margin. When this bound is sufficiently large, the learned representations are provably sparse and equivalent to networks composed of simple logical gates with $\pm 1$ weights. Across a range of tasks -- including multiplier-circuit discovery, binary autoencoding, logic-network inference, and cellular automata learning -- the method achieves exact solutions or strong generalization in regimes where standard gradient-based methods struggle. These results demonstrate that projection-based constraint satisfaction provides a viable and conceptually distinct foundation for learning in discrete neural systems, with implications for interpretability and efficient inference.
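The core mechanism in the abstract is the RRR projection iteration, which alternates projections onto the two constraint sets from the divide-and-concur decomposition. Below is a minimal Python sketch of the generic RRR update for two constraint sets A and B, not the paper's implementation: the toy projections `proj_sign` and `proj_plane` are hypothetical stand-ins for the paper's BTF and concur projections, and `beta` is an assumed relaxation parameter.

```python
import numpy as np

def rrr_step(x, proj_A, proj_B, beta=0.5):
    """One generic RRR iteration for two constraint sets A and B.

    Update: x <- x + beta * (P_B(2*P_A(x) - x) - P_A(x)).
    At a fixed point x*, P_B(2*P_A(x*) - x*) = P_A(x*), so the
    readout P_A(x*) lies in both A and B.
    """
    pa = proj_A(x)
    return x + beta * (proj_B(2.0 * pa - x) - pa)

# Toy illustration (not the paper's setup): find a unit-norm vector
# with all entries +/- 1/2 (set A) whose entries also sum to zero
# (set B), e.g. (1, 1, -1, -1) / 2 in dimension 4.
def proj_sign(x):
    # Nearest point with +/-1 entries, rescaled to unit norm.
    s = np.sign(x)
    s[s == 0] = 1.0
    return s / np.sqrt(len(s))

def proj_plane(x):
    # Orthogonal projection onto the hyperplane sum(x) = 0,
    # an arbitrary stand-in for the "concur" constraint.
    return x - x.mean()

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
for _ in range(200):
    x = rrr_step(x, proj_sign, proj_plane)
print(proj_sign(x))  # candidate solution read out from P_A
```

In the paper's setting, the role of `proj_sign` would be played by the projection onto the local BTF constraints (with the margin lower bound), and `proj_plane` by the concurrence constraint that equates neuron outputs with downstream inputs and ties weights across training instances.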
Source: arXiv:2602.17493