arXiv submission date: 2026-02-25
📄 Abstract - NGDB-Zoo: Towards Efficient and Scalable Neural Graph Databases Training

Neural Graph Databases (NGDBs) facilitate complex logical reasoning over incomplete knowledge structures, yet their training efficiency and expressivity are constrained by rigid query-level batching and structure-exclusive embeddings. We present NGDB-Zoo, a unified framework that resolves these bottlenecks by synergizing operator-level training with semantic augmentation. By decoupling logical operators from query topologies, NGDB-Zoo transforms the training loop into a dynamically scheduled data-flow execution, enabling multi-stream parallelism and achieving $1.8\times$-$6.8\times$ higher throughput than baselines. Furthermore, we formalize a decoupled architecture to integrate high-dimensional semantic priors from Pre-trained Text Encoders (PTEs) without triggering I/O stalls or memory overflows. Extensive evaluations on six benchmarks, including massive graphs like ogbl-wikikg2 and ATLAS-Wiki, demonstrate that NGDB-Zoo maintains high GPU utilization across diverse logical patterns and significantly mitigates representation friction in hybrid neuro-symbolic reasoning.
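The core idea of "decoupling logical operators from query topologies" can be illustrated with a minimal sketch: rather than padding whole queries into rigid query-level batches, pending operator instances are collected across all queries and grouped by operator type, so each group can run as one batched kernel. The names and data layout below are illustrative assumptions, not the actual NGDB-Zoo API.

```python
# Hypothetical sketch of operator-level batching: group operator
# instances across queries by operator type so that each group can
# execute as a single batch, independent of query topology.
from collections import defaultdict

# Each query is modeled as a list of (op_type, operand) steps, e.g. a
# chain of relation projections and set intersections over embeddings.
queries = [
    [("project", "bornIn"), ("intersect", None), ("project", "capitalOf")],
    [("project", "bornIn"), ("project", "locatedIn")],
    [("intersect", None), ("project", "capitalOf")],
]

def schedule_by_operator(queries):
    """Group operator instances across queries by operator type.

    Returns {op_type: [(query_id, step_id, operand), ...]}, so all
    instances of the same operator can be dispatched together (e.g. to
    separate CUDA streams), instead of one padded batch per query shape.
    """
    groups = defaultdict(list)
    for qid, steps in enumerate(queries):
        for sid, (op_type, operand) in enumerate(steps):
            groups[op_type].append((qid, sid, operand))
    return dict(groups)

batches = schedule_by_operator(queries)
# All five "project" steps form one batch; the two "intersect" steps
# form another, regardless of which query each step came from.
print(sorted((op, len(items)) for op, items in batches.items()))
# → [('intersect', 2), ('project', 5)]
```

In a real system each group would then be executed as one tensor operation, with a dependency tracker releasing downstream steps as their inputs complete; this sketch only shows the grouping step.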

Top-level tags: systems, model training, machine learning
Detailed tags: neural graph databases, logical reasoning, training efficiency, semantic augmentation, knowledge graphs

NGDB-Zoo: Towards Efficient and Scalable Neural Graph Databases Training


1️⃣ One-sentence summary

This paper proposes a new framework called NGDB-Zoo, which substantially improves the training efficiency and reasoning expressivity of neural graph databases by decomposing the training process into operator streams that can execute in parallel and by incorporating external semantic knowledge.

Source: arXiv: 2602.21597