GeoMotionGPT: Geometry-Aligned Motion Understanding with Large Language Models
1️⃣ One-sentence summary
This paper proposes a new method that enforces orthogonality on both the motion codebook and the LLM embedding space so that the two share a unified geometric basis, significantly improving the model's ability to understand and reason about complex motions and outperforming the previous state of the art on HumanML3D by 20%.
Discrete motion tokenization has recently enabled Large Language Models (LLMs) to serve as versatile backbones for motion understanding and motion-language reasoning. However, existing pipelines typically decouple motion quantization from semantic embedding learning, linking them solely via token IDs. This approach fails to effectively align the intrinsic geometry of the motion space with the embedding space, thereby hindering the LLM's capacity for nuanced motion reasoning. We argue that alignment is most effective when both modalities share a unified geometric basis. Therefore, instead of forcing the LLM to reconstruct the complex geometry among motion tokens from scratch, we present a novel framework that explicitly enforces orthogonality on both the motion codebook and the LLM embedding space, ensuring that their relational structures naturally mirror each other. Specifically, we employ a decoder-only quantizer with Gumbel-Softmax for differentiable training and balanced codebook usage. To bridge the modalities, we use a sparse projection that maps motion codes into the LLM embedding space while preserving orthogonality. Finally, a two-stage orthonormal regularization schedule enforces soft constraints during tokenizer training and LLM fine-tuning to maintain geometric alignment without hindering semantic adaptation. Extensive experiments on HumanML3D demonstrate that our framework achieves a 20% performance improvement over current state-of-the-art methods, validating that a unified geometric basis effectively empowers the LLM for nuanced motion reasoning.
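The abstract names two concrete mechanisms: a Gumbel-Softmax quantizer for differentiable code selection, and a soft orthonormal regularizer that keeps the codebook's relational structure orthogonal. Below is a minimal NumPy sketch of both ideas under common formulations; the array shapes, temperature, and penalty form (the Frobenius norm of the Gram matrix's deviation from identity) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, tau=0.5):
    """Soft, differentiable sample from a categorical distribution
    (the standard Gumbel-Softmax relaxation used for quantization)."""
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))  # Gumbel noise
    y = (logits + g) / tau
    e = np.exp(y - y.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def orthonormal_penalty(codebook):
    """Soft orthonormality regularizer ||C C^T - I||_F^2, pushing the
    K code vectors (rows) toward an orthonormal set."""
    K = codebook.shape[0]
    gram = codebook @ codebook.T
    return float(np.sum((gram - np.eye(K)) ** 2))

# Toy codebook: K codes in d dimensions (hypothetical sizes).
K, d = 8, 16
codebook = rng.normal(size=(K, d)) / np.sqrt(d)

# Soft code assignment for a batch of 4 encoder outputs.
logits = rng.normal(size=(4, K))
probs = gumbel_softmax(logits)
quantized = probs @ codebook  # convex combination of code vectors

# Each soft assignment is a valid distribution over codes.
print(np.allclose(probs.sum(axis=-1), 1.0))

# A perfectly orthonormal codebook incurs zero penalty; a random one does not.
Q, _ = np.linalg.qr(rng.normal(size=(d, K)))  # columns orthonormal
print(np.isclose(orthonormal_penalty(Q.T), 0.0))
print(orthonormal_penalty(codebook) > 0.0)
```

In training, the penalty would be added to the reconstruction loss with a schedule weight (the paper's two-stage schedule applies it during both tokenizer training and LLM fine-tuning), so the constraint stays soft and does not block semantic adaptation.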
Source: arXiv:2601.07632