Eval4Sim: An Evaluation Framework for Persona Simulation
1️⃣ One-sentence summary
This paper proposes an evaluation framework called Eval4Sim, which measures how closely conversations simulated by large language models match real human conversational patterns along three dimensions — adherence, consistency, and naturalness — enabling a more principled assessment of persona-simulation quality.
Large Language Model (LLM) personas with explicit specifications of attributes, background, and behavioural tendencies are increasingly used to simulate human conversations for tasks such as user modelling, social reasoning, and behavioural analysis. Ensuring that persona-grounded simulations faithfully reflect human conversational behaviour is therefore critical. However, current evaluation practices largely rely on LLM-as-a-judge approaches, offering limited grounding in observable human behaviour and producing opaque scalar scores. We address this gap by proposing Eval4Sim, an evaluation framework that measures how closely simulated conversations align with human conversational patterns across three complementary dimensions. Adherence captures how effectively persona backgrounds are implicitly encoded in generated utterances, assessed via dense retrieval with speaker-aware representations. Consistency evaluates whether a persona maintains a distinguishable identity across conversations, computed through authorship verification. Naturalness reflects whether conversations exhibit human-like flow rather than overly rigid or optimized structure, quantified through distributions derived from dialogue-focused Natural Language Inference. Unlike absolute or optimization-oriented metrics, Eval4Sim uses a human conversational corpus (i.e., PersonaChat) as a reference baseline and penalizes deviations in both directions, distinguishing insufficient persona encoding from over-optimized, unnatural behaviour. Although demonstrated on PersonaChat, Eval4Sim extends to any conversational corpus containing speaker-level annotations.
Source: arXiv: 2603.02876