arXiv submission date: 2026-01-27
📄 Abstract - Component-Aware Pruning Framework for Neural Network Controllers via Gradient-Based Importance Estimation

The transition from monolithic to multi-component neural architectures in advanced neural network controllers poses substantial challenges because of the high computational complexity of multi-component designs. Conventional model compression techniques, such as structured pruning that ranks distinct parameter groups by norm-based metrics, often fail to capture functional significance. This paper introduces a component-aware pruning framework that uses gradient information to compute three distinct importance metrics during training: Gradient Accumulation, Fisher Information, and Bayesian Uncertainty. Experimental results with an autoencoder and a TD-MPC agent demonstrate that the proposed framework reveals critical structural dependencies and dynamic shifts in importance that static heuristics often miss, supporting more informed compression decisions.
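A minimal sketch, not taken from the paper, of how two of the three gradient-based metrics named in the abstract (gradient accumulation and diagonal Fisher information) could be tracked per component during training in PyTorch. The ImportanceTracker class, the |g·w| saliency formula, and the grouping of parameters by name prefix are illustrative assumptions; the Bayesian Uncertainty metric is omitted here because it would require a posterior over weights that the abstract does not specify.

```python
# Sketch only: per-component importance tracking with gradient-based metrics.
# Assumes a standard PyTorch training loop; formulas and grouping are illustrative.
import torch
import torch.nn as nn

class ImportanceTracker:
    def __init__(self, model: nn.Module):
        self.model = model
        # One running score per parameter tensor, keyed by parameter name.
        self.grad_accum = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
        self.fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
        self.steps = 0

    @torch.no_grad()
    def update(self):
        """Call after loss.backward() and before optimizer.step()."""
        for name, p in self.model.named_parameters():
            if p.grad is None:
                continue
            # Gradient-accumulation importance: first-order saliency |g * w|.
            self.grad_accum[name] += (p.grad * p).abs()
            # Empirical diagonal Fisher information: running sum of g^2.
            self.fisher[name] += p.grad.pow(2)
        self.steps += 1

    def component_scores(self, metric: str = "fisher"):
        """Aggregate per-parameter scores into one scalar per top-level component."""
        store = self.fisher if metric == "fisher" else self.grad_accum
        scores = {}
        for name, s in store.items():
            component = name.split(".")[0]  # e.g. "encoder", "decoder", "dynamics"
            scores[component] = scores.get(component, 0.0) + (s.sum() / max(self.steps, 1)).item()
        return scores

# Usage inside a training loop (hypothetical):
# tracker = ImportanceTracker(model)
# loss.backward(); tracker.update(); optimizer.step()
# print(tracker.component_scores("fisher"))
```

Scores aggregated this way would let a pruning schedule compare whole components (e.g. an encoder versus a dynamics head) rather than individual weight norms, which is the kind of component-level decision the abstract describes.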

Top-level tags: model training systems machine learning
Detailed tags: structured pruning, importance estimation, gradient-based methods, neural network compression, multi-component architectures

Component-Aware Pruning Framework for Neural Network Controllers via Gradient-Based Importance Estimation


1️⃣ One-Sentence Summary

This paper proposes a new pruning method for neural network controllers that uses gradient information to dynamically evaluate the importance of different components, enabling smarter model compression than conventional static methods while reducing computational overhead.

Source: arXiv:2601.19794