Codified Foreshadowing-Payoff Text Generation
1️⃣ One-Sentence Summary
This paper proposes a new framework called CFPG, which encodes the "foreshadowing"-"payoff" relations in stories as executable causal rules, helping large language models generate narratives that are logically coherent and fulfill their early commitments, rather than merely producing surface-fluent text.
Foreshadowing and payoff are ubiquitous narrative devices through which authors introduce commitments early in a story and resolve them through concrete, observable outcomes. However, despite advances in story generation, large language models (LLMs) frequently fail to bridge these long-range narrative dependencies, often leaving "Chekhov's guns" unfired even when the necessary context is present. Existing evaluations largely overlook this structural failure, focusing on surface-level coherence rather than the logical fulfillment of narrative setups. In this paper, we introduce Codified Foreshadowing-Payoff Generation (CFPG), a novel framework that reframes narrative quality through the lens of payoff realization. Recognizing that LLMs struggle to intuitively grasp the "triggering mechanism" of a foreshadowed event, CFPG transforms narrative continuity into a set of executable causal predicates. By mining and encoding Foreshadow-Trigger-Payoff triples from the BookSum corpus, we provide structured supervision that ensures foreshadowed commitments are not only mentioned but also temporally and logically fulfilled. Experiments demonstrate that CFPG significantly outperforms standard prompting baselines in payoff accuracy and narrative alignment. Our findings suggest that explicitly codifying narrative mechanics is essential for moving LLMs from surface-level fluency to genuine narrative competence.
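To make the idea of "executable causal predicates" concrete, here is a minimal sketch of how a Foreshadow-Trigger-Payoff triple might be represented and checked against a generated story. The class name, fields, and substring-based ordering check are illustrative assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class FTPTriple:
    """A hypothetical encoding of one Foreshadow-Trigger-Payoff triple."""
    foreshadow: str  # the early narrative commitment (the "Chekhov's gun")
    trigger: str     # the condition that should activate the payoff
    payoff: str      # the concrete outcome that fulfills the commitment

def payoff_fulfilled(story: str, triple: FTPTriple) -> bool:
    """Check that foreshadow, trigger, and payoff all appear in temporal order.

    A toy predicate: each element must occur in the story text, and the
    foreshadow must precede the trigger, which must precede the payoff.
    """
    i = story.find(triple.foreshadow)
    j = story.find(triple.trigger)
    k = story.find(triple.payoff)
    return 0 <= i < j < k

story = ("A rifle hangs on the wall. Years pass. "
         "Bandits storm the farmhouse. Anna seizes the rifle and fires.")
triple = FTPTriple(
    foreshadow="rifle hangs on the wall",
    trigger="Bandits storm the farmhouse",
    payoff="fires",
)
print(payoff_fulfilled(story, triple))  # True: the commitment is resolved in order
```

In a pipeline like the one the abstract describes, such predicates could serve as structured supervision: a generated continuation that mentions the foreshadow but never reaches the payoff (or reaches it before its trigger) would fail the check.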
Source: arXiv: 2601.07033