arXiv submission date: 2026-04-09
📄 Abstract - HyperMem: Hypergraph Memory for Long-Term Conversations

Long-term memory is essential for conversational agents to maintain coherence, track persistent tasks, and provide personalized interactions across extended dialogues. However, existing approaches such as Retrieval-Augmented Generation (RAG) and graph-based memory mostly rely on pairwise relations, which struggle to capture high-order associations, i.e., joint dependencies among multiple elements, resulting in fragmented retrieval. To this end, we propose HyperMem, a hypergraph-based hierarchical memory architecture that explicitly models such associations using hyperedges. Specifically, HyperMem structures memory into three levels: topics, episodes, and facts, and groups related episodes and their facts via hyperedges, unifying scattered content into coherent units. Leveraging this structure, we design a hybrid lexical-semantic index and a coarse-to-fine retrieval strategy, supporting accurate and efficient retrieval of high-order associations. Experiments on the LoCoMo benchmark show that HyperMem achieves state-of-the-art performance with 92.73% LLM-as-a-judge accuracy, demonstrating the effectiveness of HyperMem for long-term conversations.

Top-level tags: llm agents natural language processing
Detailed tags: long-term memory hypergraph retrieval-augmented generation conversational agents memory architecture

HyperMem: Hypergraph Memory for Long-Term Conversations


1️⃣ One-sentence summary

This paper proposes a new memory architecture called HyperMem, which organizes conversational content with a hypergraph. By capturing complex associations among multiple pieces of information at once, it lets an AI recall and retrieve relevant information more coherently and accurately over long conversations.
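To make the three-level structure concrete, here is a minimal sketch of how topics, episodes, facts, and hyperedges might fit together. All class names, fields, and the substring-based matching below are our own illustrative assumptions; the paper's actual system uses a hybrid lexical-semantic index, which plain keyword matching merely stands in for here.

```python
from dataclasses import dataclass, field

@dataclass
class Fact:
    text: str  # an atomic piece of information extracted from dialogue

@dataclass
class Episode:
    summary: str
    facts: list  # Fact objects belonging to this episode

@dataclass
class Hyperedge:
    # A hyperedge joins multiple episodes (and thus their facts) that
    # share a high-order association, not just a pairwise link.
    label: str
    episodes: list

@dataclass
class Topic:
    name: str
    hyperedges: list = field(default_factory=list)

    def add_hyperedge(self, label, episodes):
        self.hyperedges.append(Hyperedge(label, episodes))

    def retrieve(self, query_terms):
        # Coarse-to-fine: first match hyperedges by label (coarse),
        # then collect facts from all member episodes (fine).
        hits = []
        for he in self.hyperedges:
            if any(t.lower() in he.label.lower() for t in query_terms):
                for ep in he.episodes:
                    hits.extend(f.text for f in ep.facts)
        return hits

# Toy usage: two episodes grouped under one hyperedge within a topic.
trip = Episode("Planning Kyoto trip",
               [Fact("User prefers trains"), Fact("Travel is in April")])
food = Episode("Dining preferences", [Fact("User is vegetarian")])
topic = Topic("travel")
topic.add_hyperedge("Kyoto trip planning", [trip, food])
results = topic.retrieve(["Kyoto"])
```

Because the hyperedge groups both episodes, a single query about the trip retrieves the dietary fact too, illustrating how hyperedges unify scattered content that pairwise edges would leave fragmented.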

From arXiv: 2604.08256