arXiv submission date: 2026-01-29
📄 Abstract - Smooth Dynamic Cutoffs for Machine Learning Interatomic Potentials

Machine learning interatomic potentials (MLIPs) have proven to be widely useful for molecular dynamics simulations, powering countless drug and materials discovery applications. However, MLIPs face two primary bottlenecks preventing them from reaching realistic simulation scales: inference time and memory consumption. In this work, we address both issues by challenging the long-held belief that the cutoff radius for the MLIP must be held to a fixed, constant value. For the first time, we introduce a dynamic cutoff formulation that still leads to stable, long-timescale molecular dynamics simulation. By introducing the dynamic cutoff, we induce sparsity in the underlying atom graph by targeting a specific number of neighbors per atom, significantly reducing both memory consumption and inference time. We show the effectiveness of a dynamic cutoff by implementing it in four state-of-the-art MLIPs: MACE, NequIP, Orb-v3, and TensorNet, leading to up to 2.26x lower memory consumption and 2.04x faster inference, depending on the model and atomic system. We also perform an extensive error analysis and find that the dynamic cutoff models exhibit minimal accuracy drop-off compared to their fixed-cutoff counterparts on both materials and molecular datasets. All model implementations and training code will be fully open-sourced.
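The core idea, as described in the abstract, is to replace a fixed cutoff radius with a per-atom cutoff chosen so that each atom keeps roughly a target number of neighbors, which sparsifies the atom graph. The paper's exact smooth formulation is not reproduced here; the sketch below is a minimal illustration of that idea, assuming a simple k-th-nearest-neighbor cutoff and a standard cosine switching function for smoothness (both common choices, not necessarily the authors').

```python
import numpy as np

def dynamic_cutoff_neighbors(positions, k=8, r_max=6.0):
    """Illustrative sketch (not the paper's formulation): set each atom's
    cutoff to the distance of its k-th nearest neighbor, capped at r_max,
    so every atom keeps roughly k neighbors instead of all atoms inside
    one fixed radius."""
    # Pairwise distances; O(N^2) for clarity (real MD codes use cell lists).
    diff = positions[:, None, :] - positions[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(dist, np.inf)  # exclude self-pairs

    n = len(positions)
    k_eff = min(k, n - 1)
    # Per-atom dynamic cutoff: distance to the k-th nearest neighbor.
    kth = np.partition(dist, k_eff - 1, axis=1)[:, k_eff - 1]
    r_cut = np.minimum(kth, r_max)

    # Sparse edge list: pairs (i, j) with |r_ij| <= r_cut[i].
    src, dst = np.nonzero(dist <= r_cut[:, None])
    return src, dst, r_cut

def smooth_envelope(r, r_cut):
    """Cosine switching function: 1 at r=0, decaying smoothly to 0 at
    r=r_cut, so edge features vanish continuously at the dynamic cutoff."""
    x = np.clip(r / r_cut, 0.0, 1.0)
    return 0.5 * (np.cos(np.pi * x) + 1.0)
```

With a fixed cutoff, dense regions can produce very large neighbor lists; targeting k neighbors per atom keeps the edge count near k*N regardless of local density, which is where the memory and inference savings come from.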

Top-level tags: machine learning systems, model training
Detailed tags: interatomic potentials, molecular dynamics, dynamic cutoff, computational efficiency, sparse graphs

Smooth Dynamic Cutoffs for Machine Learning Interatomic Potentials


1️⃣ One-sentence summary

This paper proposes a new approach for machine learning interatomic potentials: replacing the traditional fixed cutoff with a smooth dynamic cutoff, which significantly reduces memory consumption and inference time with almost no loss in accuracy, accelerating large-scale molecular dynamics simulations.

From arXiv: 2601.21147