Beyond Parameter Finetuning: Test-Time Representation Refinement for Node Classification
1️⃣ One-Sentence Summary
This paper proposes a new method called TTReFT, which tackles the performance drop that graph neural networks suffer on unseen data in real-world settings by directly adjusting the intermediate representations the network has learned, rather than modifying the model parameters themselves. This both avoids forgetting previously acquired knowledge and substantially improves the model's ability to adapt.
Graph Neural Networks frequently exhibit significant performance degradation in out-of-distribution (OOD) test scenarios. While test-time training (TTT) offers a promising solution, the existing Parameter Finetuning (PaFT) paradigm suffers from catastrophic forgetting, hindering its real-world applicability. We propose TTReFT, a novel Test-Time Representation FineTuning framework that shifts the adaptation target from model parameters to latent representations. Specifically, TTReFT achieves this through three key innovations: (1) uncertainty-guided node selection for targeted interventions, (2) low-rank representation interventions that preserve pre-trained knowledge, and (3) an intervention-aware masked autoencoder that dynamically adjusts its masking strategy to accommodate the node selection scheme. Theoretically, we establish performance guarantees for TTReFT in OOD settings. Empirically, extensive experiments across five benchmark datasets demonstrate that TTReFT achieves consistently superior performance. Our work establishes representation finetuning as a new paradigm for graph TTT, offering both theoretical grounding and immediate practical utility for real-world deployment.
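To make the representation-finetuning idea concrete, below is a minimal PyTorch sketch of two of the listed components: an entropy-based ranking as one plausible instantiation of "uncertainty-guided node selection", and a LoReFT-style low-rank intervention of the form h' = h + Rᵀ(Wh + b − Rh). The entropy criterion and all names (`select_uncertain_nodes`, `LowRankIntervention`, `intervene`) are assumptions for illustration, not the paper's exact implementation.

```python
# Illustrative sketch, not the authors' code. The entropy criterion and all
# names here are assumptions; the paper only specifies "uncertainty-guided
# node selection" and "low-rank representation interventions".
import torch
import torch.nn.functional as F
from torch.nn.utils.parametrizations import orthogonal


def select_uncertain_nodes(logits: torch.Tensor, ratio: float = 0.1) -> torch.Tensor:
    """Rank test nodes by predictive entropy; return the top-`ratio` fraction."""
    probs = F.softmax(logits, dim=-1)
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
    k = max(1, int(ratio * logits.size(0)))
    return torch.topk(entropy, k).indices


class LowRankIntervention(torch.nn.Module):
    """LoReFT-style edit of hidden representations: h' = h + R^T (W h + b - R h),
    with R an (r x d) semi-orthogonal matrix and r << d. Only this tiny module
    is optimized at test time; the pre-trained GNN stays frozen, which is how
    representation finetuning sidesteps catastrophic forgetting."""

    def __init__(self, d: int, r: int = 4):
        super().__init__()
        # Orthogonal parametrization keeps the rows of R orthonormal.
        self.R = orthogonal(torch.nn.Linear(d, r, bias=False))
        self.W = torch.nn.Linear(d, r)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # (W h + b - R h) lives in the rank-r subspace; project back with R^T.
        return h + (self.W(h) - self.R(h)) @ self.R.weight


def intervene(h: torch.Tensor, idx: torch.Tensor, edit: LowRankIntervention) -> torch.Tensor:
    """Apply the low-rank edit only at the selected (most uncertain) nodes."""
    h = h.clone()
    h[idx] = edit(h[idx])
    return h
```

At test time, only the intervention module's parameters would be optimized, presumably against the paper's intervention-aware masked-autoencoder objective, while the GNN backbone remains frozen.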
Source: arXiv: 2601.21615