Abstract - GS-Quant: Granular Semantic and Generative Structural Quantization for Knowledge Graph Completion
Large Language Models (LLMs) have shown immense potential in Knowledge Graph Completion (KGC), yet bridging the modality gap between continuous graph embeddings and discrete LLM tokens remains a critical challenge. While recent quantization-based approaches attempt to align these modalities, they typically treat quantization as flat numerical compression, resulting in semantically entangled codes that fail to mirror the hierarchical nature of human reasoning. In this paper, we propose GS-Quant, a novel framework that generates semantically coherent and structurally stratified discrete codes for KG entities. Unlike prior methods, GS-Quant is grounded in the insight that entity representations should follow a linguistic coarse-to-fine logic. We introduce a Granular Semantic Enhancement module that injects hierarchical knowledge into the codebook, ensuring that earlier codes capture global semantic categories while later codes refine specific attributes. Furthermore, a Generative Structural Reconstruction module imposes causal dependencies on the code sequence, transforming independent discrete units into structured semantic descriptors. By expanding the LLM vocabulary with these learned codes, we enable the model to reason over graph structures isomorphically to natural language generation. Experimental results demonstrate that GS-Quant significantly outperforms existing text-based and embedding-based baselines. Our code is publicly available at this https URL.
GS-Quant: Granular Semantic and Generative Structural Quantization for Knowledge Graph Completion
1️⃣ One-Sentence Summary
This paper proposes GS-Quant, a new framework that encodes knowledge-graph entities as hierarchical coarse-to-fine discrete codes (broad semantic categories first, then refined details) and imposes causal dependencies among the codes, enabling a large language model to understand graph structure the way it understands natural language and thereby substantially improving the accuracy of knowledge graph completion.
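The abstract does not spell out the quantizer, but the "earlier codes capture global categories, later codes refine attributes" idea is reminiscent of residual quantization, where each codebook level encodes only what previous levels failed to capture. The sketch below is a minimal illustration of that general pattern, not the paper's actual algorithm; all names (`codebooks`, `quantize`) and sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 quantization levels, each with an 8-entry codebook
# over 4-dimensional embeddings. In a real system these would be learned.
num_levels, codebook_size, dim = 3, 8, 4
codebooks = [rng.normal(size=(codebook_size, dim)) for _ in range(num_levels)]

def quantize(entity_emb):
    """Return a coarse-to-fine code sequence for one entity embedding."""
    codes, residual = [], entity_emb.copy()
    for cb in codebooks:
        # Pick the nearest codeword at this level.
        idx = int(np.argmin(np.linalg.norm(cb - residual, axis=1)))
        codes.append(idx)
        # Later levels only see the leftover detail, so code 0 is the
        # coarsest descriptor and subsequent codes are refinements.
        residual = residual - cb[idx]
    return codes

entity = rng.normal(size=dim)
print(quantize(entity))  # a list of 3 codebook indices, coarse to fine
```

Appending such code sequences as new tokens in the LLM vocabulary, as the abstract describes, would then let the model generate an entity the same way it generates a word: one discrete, causally dependent unit at a time.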