Modular Representation Compression: Adapting LLMs for Efficient and Effective Recommendations
1️⃣ One-Sentence Summary
This paper finds that when large language models are used for recommendation tasks, representations from their middle layers outperform those from the final layer. Based on this observation, it proposes a modular representation compression method that separates the representation-learning and task-adaptation modules, significantly reducing storage and computational costs while effectively improving online advertising recommendation performance.
Recently, large language models (LLMs) have advanced recommendation systems (RSs), and recent works have begun to explore how to integrate LLMs into industrial RSs. While most approaches deploy LLMs offline to generate and pre-cache augmented representations for RSs, the high-dimensional representations from LLMs introduce substantial storage and computational costs. Thus, it is crucial to compress LLM representations effectively. However, we identify a counterintuitive phenomenon during representation compression: the Mid-layer Representation Advantage (MRA), where representations from the middle layers of LLMs outperform those from the final layers in recommendation tasks. This degraded final layer renders existing compression methods, which typically compress the final layer's output, suboptimal. We interpret this through modularity theory: LLMs spontaneously develop internal functional modularity, and the proxy training objective forces the final layer to specialize in that task. We therefore propose Modular Representation Compression (MARC) to explicitly control the modularity of LLMs. First, Modular Adjustment explicitly introduces compression and task-adaptation modules, enabling the LLM to operate strictly as a representation-learning module. Next, to ground each module in its specific task, Modular Task Decoupling uses information constraints and different network structures to decouple the tasks. Extensive experiments validate that MARC addresses MRA and produces efficient representations. Notably, MARC achieved a 2.82% eCPM lift in an online A/B test within a large-scale commercial search advertising scenario.
Source: arXiv:2604.18146