arXiv submission date: 2026-02-23
📄 Abstract - BabyLM Turns 4: Call for Papers for the 2026 BabyLM Workshop

BabyLM aims to dissolve the boundaries between cognitive modeling and language modeling. We call for both workshop papers and for researchers to join the 4th BabyLM competition. As in previous years, we call for participants in the data-efficient pretraining challenge in the general track. This year, we also offer a new track: Multilingual. We also call for papers outside the competition in any relevant areas. These include training efficiency, cognitively plausible research, weak model evaluation, and more.

Top-level tags: llm, model training, model evaluation
Detailed tags: data-efficient pretraining, multilingual, cognitively plausible, weak model evaluation, training efficiency

BabyLM Turns 4: Call for Papers for the 2026 BabyLM Workshop


1️⃣ One-sentence summary

This paper is a call for papers announcing the 4th BabyLM competition and workshop, which encourage research on data-efficient pretraining, multilingual modeling, and cognitive plausibility, with the goal of dissolving the boundaries between cognitive modeling and language modeling.

Source: arXiv 2602.20092