arXiv submission date: 2026-02-26
📄 Abstract - MUG: Meta-path-aware Universal Heterogeneous Graph Pre-Training

Universal graph pre-training has emerged as a key paradigm in graph representation learning, offering a promising way to train encoders that learn transferable representations from unlabeled graphs and generalize effectively across a wide range of downstream tasks. However, recent explorations of universal graph pre-training focus primarily on homogeneous graphs; it remains unexplored for heterogeneous graphs, which exhibit greater structural and semantic complexity. This heterogeneity makes it highly challenging to train a universal encoder for diverse heterogeneous graphs: (i) the diverse node and relation types with dataset-specific semantics hinder the construction of a unified representation space; (ii) the number and semantics of meta-paths vary across datasets, making encoding and aggregation patterns learned from one dataset difficult to apply to others. To address these challenges, we propose a novel Meta-path-aware Universal heterogeneous Graph pre-training (MUG) approach. Specifically, for challenge (i), MUG introduces an input unification module that integrates information from multiple node and relation types within each heterogeneous graph into a unified representation. This representation is then projected into a shared space by a dimension-aware encoder, enabling alignment across graphs with diverse dimensions. For challenge (ii), MUG trains a shared encoder to capture consistent structural patterns across diverse meta-path views rather than relying on dataset-specific aggregation strategies, while a global objective encourages discriminability and reduces dataset-specific biases. Extensive experiments on real-world datasets demonstrate the effectiveness of MUG.
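The two ideas in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration under our own assumptions, not the paper's actual architecture: function names, the per-dimension projection table, and the mean aggregator are all hypothetical stand-ins for (1) a dimension-aware projection into a shared space and (2) one shared encoder applied uniformly to every meta-path view.

```python
import numpy as np

rng = np.random.default_rng(0)
SHARED_DIM = 8  # illustrative shared-space size; not from the paper

def dimension_aware_project(x, proj_table):
    """Map features of any input dimension into the shared space,
    lazily creating one projection matrix per distinct input dimension."""
    d = x.shape[1]
    if d not in proj_table:
        proj_table[d] = rng.standard_normal((d, SHARED_DIM)) / np.sqrt(d)
    return x @ proj_table[d]

def shared_metapath_encode(z, metapath_adjs):
    """Apply the same mean-aggregation 'encoder' to every meta-path view,
    then average the views (no dataset-specific aggregation weights)."""
    views = []
    for a in metapath_adjs:
        deg = a.sum(axis=1, keepdims=True).clip(min=1)
        views.append((a @ z) / deg)  # mean over meta-path neighbors
    return np.mean(views, axis=0)

# Toy heterogeneous graph: two node types with different feature sizes.
proj = {}
z_authors = dimension_aware_project(rng.standard_normal((4, 5)), proj)
z_papers = dimension_aware_project(rng.standard_normal((3, 12)), proj)
z = np.vstack([z_authors, z_papers])  # 7 nodes, all in the shared space

# Two meta-path views given as dense 0/1 adjacency matrices over all nodes.
apa = (rng.random((7, 7)) > 0.5).astype(float)  # e.g. author-paper-author
apc = (rng.random((7, 7)) > 0.5).astype(float)  # e.g. author-paper-conf
h = shared_metapath_encode(z, [apa, apc])
print(h.shape)  # (7, 8)
```

The point of the sketch is structural: because the projection is keyed by input dimension and the aggregator is identical across views, nothing in the encoder is tied to one dataset's type vocabulary or meta-path set, which is what the abstract's transferability claim rests on.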

Top-level tags: machine learning systems, model training
Detailed tags: graph neural networks, heterogeneous graphs, pre-training, meta-paths, transfer learning

MUG: Meta-path-aware Universal Heterogeneous Graph Pre-Training


1️⃣ One-sentence summary

This paper proposes a new method, MUG, which uses a unified input representation and a shared encoder to tackle, for the first time, universal pre-training on structurally complex heterogeneous graphs, so that the trained model can adapt effectively to a variety of downstream tasks.

Source: arXiv 2602.22645