arXiv submission date: 2026-05-07
📄 Abstract - Quantizing With Randomized Hadamard Transforms: Efficient Heuristic Now Proven

Uniform random rotations (URRs) are a common preprocessing step in modern quantization approaches used for gradient compression, inference acceleration, KV-cache compression, model weight quantization, and approximate nearest-neighbor search in vector databases. In practice, URRs are often replaced by randomized Hadamard transforms (RHTs), which preserve orthogonality while admitting fast implementations. The remaining issue is the performance on worst-case inputs. With a URR, each coordinate is individually distributed as a shifted beta distribution, which converges to a Gaussian distribution in high dimensions. Generally, one RHT is not suitable in the worst case, as individual coordinates can be far from these distributions. We show that after composing two RHTs on any $d$-dimensional input vector, the marginal distribution of every fixed coordinate of the normalized rotated vector is within $O(d^{-1/2})$ of a standard Gaussian, both in Kolmogorov distance and in $1$-Wasserstein distance. We then plug these bounds into the analyses of modern compression schemes, namely DRIVE and QUIC-FL, and show that two RHTs achieve performance that asymptotically matches URRs. However, we show that two RHTs may not be sufficient for Vector Quantization (VQ), which often requires weak correlation across fixed-size blocks of coordinates (as opposed to only marginal distribution convergence for single coordinates). We prove that a composition of three RHTs leads to decaying coordinate covariance. This ensures that any fixed, bounded, multi-dimensional VQ codebook optimized for URRs has the same expected error when using three RHTs, up to an additive term that vanishes with the dimension. Finally, because practical inputs are rarely adversarial, we propose a linear-time $O(d)$ check on the input's moments to dynamically adapt the number of RHTs used at runtime to improve performance.
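To make the construction concrete, here is a minimal pure-Python sketch of the RHT composition the abstract describes: each RHT multiplies the input by a random $\pm 1$ diagonal and then applies a (normalized) fast Walsh-Hadamard transform in $O(d \log d)$ time. The helper names are mine, not the paper's; a production implementation would use a vectorized FWHT.

```python
import random

def fwht(v):
    """In-place fast Walsh-Hadamard transform; len(v) must be a power of two.
    The final scaling by d**-0.5 makes the transform orthonormal."""
    d = len(v)
    h = 1
    while h < d:
        for i in range(0, d, h * 2):
            for j in range(i, i + h):
                x, y = v[j], v[j + h]
                v[j], v[j + h] = x + y, x - y
        h *= 2
    scale = d ** -0.5
    for i in range(d):
        v[i] *= scale
    return v

def rht(v, rng):
    """One randomized Hadamard transform: random +/-1 sign flips, then Hadamard.
    Both steps are orthogonal, so the vector's norm is preserved."""
    flipped = [x if rng.random() < 0.5 else -x for x in v]
    return fwht(flipped)

def rotate(v, num_rhts=2, rng=None):
    """Compose num_rhts RHTs. Per the paper: two suffice for per-coordinate
    Gaussianity, three for weak correlation across coordinate blocks (VQ)."""
    rng = rng or random.Random(0)
    for _ in range(num_rhts):
        v = rht(list(v), rng)
    return v
```

Because each RHT is orthogonal, composing two or three of them stays a rotation, which is what lets the paper's analysis substitute them for a URR.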

Top-level tag: machine learning theory
Detailed tags: randomized hadamard transform, quantization, vector quantization, gaussian approximation, rotation

Quantizing With Randomized Hadamard Transforms: Efficient Heuristic Now Proven


1️⃣ One-Sentence Summary

This paper proves that, for quantization-based compression (e.g., gradient compression and vector quantization), composing two randomized Hadamard transforms (RHTs) brings the marginal distribution of every coordinate close to a standard Gaussian, asymptotically matching the performance of an ideal uniform random rotation (URR); however, to guarantee weak correlation across fixed-size blocks of coordinates in vector quantization (VQ), three RHTs are needed. The paper also proposes a linear-time adaptive check that dynamically adjusts the number of RHTs at runtime based on the input's moments.
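The adaptive moment check could look like the sketch below. The specific statistic (an empirical kurtosis ratio) and the threshold are illustrative assumptions of mine; the paper only states that the check runs in $O(d)$ time on the input's moments.

```python
def choose_num_rhts(v, kurtosis_threshold=9.0):
    """Hypothetical O(d) moment check (not the paper's exact criterion):
    if the empirical fourth moment is large relative to the squared
    second moment, the input is 'peaky' and we apply an extra RHT."""
    d = len(v)
    m2 = sum(x * x for x in v) / d  # empirical second moment
    if m2 == 0.0:
        return 1  # zero vector: any single RHT is enough
    m4 = sum(x ** 4 for x in v) / d  # empirical fourth moment
    kurt = m4 / (m2 * m2)
    return 3 if kurt > kurtosis_threshold else 2
```

For example, a one-hot vector (the worst case for a single RHT) has kurtosis $d$ and triggers the extra transform, while a flat $\pm 1$ vector has kurtosis 1 and does not.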

Source: arXiv 2605.06014