Mindscape-Aware Retrieval Augmented Generation for Improved Long Context Understanding
1️⃣ One-Sentence Summary
This paper proposes a new method called MiA-RAG, which significantly improves an AI system's understanding of and reasoning over long documents by mimicking the way humans build a holistic semantic picture while reading long texts.
Humans understand long and complex texts by relying on a holistic semantic representation of the content. This global view helps organize prior knowledge, interpret new information, and integrate evidence dispersed across a document, as revealed by the Mindscape-Aware Capability of humans in psychology. Current Retrieval-Augmented Generation (RAG) systems lack such guidance and therefore struggle with long-context tasks. In this paper, we propose Mindscape-Aware RAG (MiA-RAG), the first approach that equips LLM-based RAG systems with explicit global context awareness. MiA-RAG builds a mindscape through hierarchical summarization and conditions both retrieval and generation on this global semantic representation. This enables the retriever to form enriched query embeddings and the generator to reason over retrieved evidence within a coherent global context. We evaluate MiA-RAG across diverse long-context and bilingual benchmarks for evidence-based understanding and global sense-making. It consistently surpasses baselines, and further analysis shows that it aligns local details with a coherent global representation, enabling more human-like long-context retrieval and reasoning.
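To make the pipeline concrete, below is a minimal sketch of the MiA-RAG idea as described in the abstract: a mindscape is built by hierarchical summarization, and both retrieval and generation are conditioned on it. The function names (`summarize`, `embed`, `generate`), the group size, and the exact way the mindscape is fused with the query and the prompt are assumptions for illustration only; the paper's actual models, prompts, and fusion strategy are not specified here.

```python
# Hedged sketch of a mindscape-aware RAG loop (assumed interfaces, not the paper's code).
from typing import Callable, List

import numpy as np


def build_mindscape(chunks: List[str], summarize: Callable[[str], str],
                    group_size: int = 4) -> str:
    """Hierarchically summarize document chunks into one global summary (the mindscape)."""
    level = chunks
    while len(level) > 1:
        # Summarize fixed-size groups at the current level, then recurse upward.
        level = [summarize(" ".join(level[i:i + group_size]))
                 for i in range(0, len(level), group_size)]
    return level[0]


def mia_rag_answer(question: str, chunks: List[str],
                   summarize: Callable[[str], str],
                   embed: Callable[[str], np.ndarray],
                   generate: Callable[[str], str],
                   top_k: int = 5) -> str:
    """Answer a question with retrieval and generation both conditioned on the mindscape."""
    mindscape = build_mindscape(chunks, summarize)

    # Enriched query embedding: here simply prepend the mindscape to the question
    # before embedding (an assumption standing in for the paper's conditioning).
    query_vec = embed(f"{mindscape}\n\nQuestion: {question}")
    chunk_vecs = np.stack([embed(c) for c in chunks])

    # Cosine-similarity retrieval of the top-k local evidence chunks.
    sims = chunk_vecs @ query_vec / (
        np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(query_vec) + 1e-8)
    retrieved = [chunks[i] for i in np.argsort(-sims)[:top_k]]

    # Generation sees the global mindscape alongside the retrieved local evidence.
    prompt = ("Global context:\n" + mindscape + "\n\n"
              "Evidence:\n" + "\n".join(retrieved) +
              f"\n\nQuestion: {question}\nAnswer:")
    return generate(prompt)
```

Any summarizer, embedder, and generator with these call signatures can be plugged in; the point of the sketch is only that the global summary conditions both the query embedding and the final prompt, rather than retrieval operating on the bare question alone.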
Source: arXiv: 2512.17220