arXiv submission date: 2026-01-29
📄 Abstract - Hebbian Learning with Global Direction

The backpropagation algorithm has driven the remarkable success of deep neural networks, but its lack of biological plausibility and high computational costs have motivated the ongoing search for alternative training methods. Hebbian learning has attracted considerable interest as a biologically plausible alternative to backpropagation. Nevertheless, its exclusive reliance on local information, without consideration of global task objectives, fundamentally limits its scalability. Inspired by the biological synergy between neuromodulators and local plasticity, we introduce a novel model-agnostic Global-guided Hebbian Learning (GHL) framework, which seamlessly integrates local and global information to scale up across diverse networks and tasks. Specifically, the local component employs Oja's rule with competitive learning to ensure stable and effective local updates. Meanwhile, the global component introduces a sign-based signal that guides the direction of local Hebbian plasticity updates. Extensive experiments demonstrate that our method consistently outperforms existing Hebbian approaches. Notably, on large-scale networks and complex datasets like ImageNet, our framework achieves competitive results and significantly narrows the gap with standard backpropagation.
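
The abstract describes GHL's two components only at a high level. The sketch below is a hypothetical NumPy illustration of how a local Oja update with winner-take-all competition could be modulated by a sign-based global signal; the function name `ghl_update`, its arguments, and the exact form of the global signal are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def ghl_update(W, x, eta=0.01, global_sign=None):
    """Single-layer update sketch: Oja's rule with winner-take-all competition,
    optionally steered by a sign-based global signal (assumed form).

    W           : (n_out, n_in) weight matrix
    x           : (n_in,) input activations
    global_sign : (n_out,) entries in {-1, +1}; placeholder for the paper's
                  global direction signal, whose exact form the abstract does not give
    """
    y = W @ x                          # linear post-synaptic activations
    winner = np.argmax(y)              # competitive learning: only the winning neuron adapts
    delta = np.zeros_like(W)
    # Oja's rule for the winner: delta_w = eta * y * (x - y * w)
    delta[winner] = eta * y[winner] * (x - y[winner] * W[winner])
    if global_sign is not None:
        # hypothetical global modulation: keep or flip the direction of the local update
        delta[winner] *= global_sign[winner]
    return W + delta

# Minimal usage example with random data
rng = np.random.default_rng(0)
W = 0.1 * rng.normal(size=(8, 16))
x = rng.normal(size=16)
W = ghl_update(W, x, global_sign=np.sign(rng.normal(size=8)))
```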

Top-level tags: machine learning model training theory
Detailed tags: hebbian learning biologically plausible backpropagation alternative global-local learning neural network training

Hebbian Learning with Global Direction


1️⃣ One-Sentence Summary

This paper proposes a novel Hebbian learning framework that combines local and global information: a global signal guides the direction of local learning, allowing the method to approach backpropagation-level performance on large networks and complex tasks and overcoming the scalability limits that traditional Hebbian learning faces due to its lack of a global objective.

Source: arXiv 2601.21367