Graph Hopfield Networks: Energy-Based Node Classification with Associative Memory
1️⃣ One-sentence summary
This paper proposes a new graph neural network model that couples associative-memory energy retrieval with graph Laplacian smoothing, performing node classification by iteratively minimizing a joint energy function. The model shows stronger performance and robustness on sparse graphs, under missing features, and on heterophilous graphs.
We introduce Graph Hopfield Networks, whose energy function couples associative memory retrieval with graph Laplacian smoothing for node classification. Gradient descent on this joint energy yields an iterative update interleaving Hopfield retrieval with Laplacian propagation. Memory retrieval provides regime-dependent benefits: up to 2.0 pp on sparse citation networks and up to 5 pp additional robustness under feature masking; the iterative energy-descent architecture itself is a strong inductive bias, with all variants (including the memory-disabled NoMem ablation) outperforming standard baselines on Amazon co-purchase graphs. Tuning enables graph sharpening for heterophilous benchmarks without architectural changes.
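The abstract describes gradient descent on a joint energy that interleaves Hopfield retrieval with Laplacian propagation. The paper's exact formulation is not given here, so the following is only a minimal sketch of what one such update step could look like, assuming a modern-Hopfield retrieval term (softmax attention over stored patterns) and a quadratic Laplacian smoothing term; the function name, parameters `beta`, `alpha`, `eta`, and the specific energy terms are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def graph_hopfield_step(X, A, M, beta=1.0, alpha=0.5, eta=0.1):
    """Hypothetical single energy-descent update (illustrative sketch).

    X: (n, d) node states, A: (n, n) symmetric adjacency matrix,
    M: (k, d) stored memory patterns.
    """
    # Hopfield retrieval: softmax attention over stored patterns
    # (a modern-Hopfield-style retrieval step, assumed here).
    scores = beta * X @ M.T                      # (n, k) similarities
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)
    retrieved = attn @ M                         # (n, d) retrieved patterns

    # Laplacian propagation: L = D - A; the gradient of the
    # smoothness term 0.5 * tr(X^T L X) with respect to X is L X.
    D = np.diag(A.sum(axis=1))
    L = D - A
    smooth_grad = L @ X

    # Joint descent: pull states toward retrieved memories while
    # smoothing over the graph; alpha trades off the two terms.
    return X - eta * ((X - retrieved) + alpha * smooth_grad)
```

Under this sketch, iterating the step alternates memory retrieval with neighborhood averaging; making `alpha` negative would instead sharpen node states against their neighbors, which is one plausible reading of the "graph sharpening" knob mentioned for heterophilous benchmarks.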
Source: arXiv:2603.03464