arXiv submission date: 2026-03-09
📄 Abstract - Beyond ReinMax: Low-Variance Gradient Estimators for Discrete Latent Variables

Machine learning models involving discrete latent variables require gradient estimators to facilitate backpropagation in a computationally efficient manner. The most recent addition to the Straight-Through family of estimators, ReinMax, can be viewed from a numerical ODE perspective as incorporating an approximation via Heun's method to reduce bias, but at the cost of high variance. In this work, we introduce the ReinMax-Rao and ReinMax-CV estimators which incorporate Rao-Blackwellisation and control variate techniques into ReinMax to reduce its variance. Our estimators demonstrate superior performance on training variational autoencoders with discrete latent spaces. Furthermore, we investigate the possibility of leveraging alternative numerical methods for constructing more accurate gradient approximations and present an alternative view of ReinMax from a simpler numerical integration perspective.
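The abstract's core idea, using control variates to cut the variance of a gradient estimator for discrete variables, can be illustrated on a toy problem. The sketch below (an assumption for illustration, not the paper's ReinMax-CV algorithm) estimates the gradient of an expectation over a Bernoulli variable with the score-function (REINFORCE) estimator, then subtracts a constant baseline as a simple control variate; both versions stay unbiased, but the baseline version has far lower variance:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3                          # Bernoulli parameter we differentiate w.r.t.
f = lambda x: (x - 0.4) ** 2     # arbitrary test objective; true grad = 0.36 - 0.16 = 0.2

n = 200_000
x = rng.binomial(1, p, size=n).astype(float)
# Score function: d/dp log Bernoulli(x; p)
score = x / p - (1 - x) / (1 - p)

plain = f(x) * score             # vanilla REINFORCE gradient samples
baseline = f(x).mean()           # constant baseline (a simple control variate)
cv = (f(x) - baseline) * score   # baseline does not bias the estimate: E[score] = 0

print("means:", plain.mean(), cv.mean())   # both near the true gradient 0.2
print("vars :", plain.var(), cv.var())     # control-variate variance is much smaller
```

The same principle, subtracting a correlated quantity with known (zero) expectation, underlies the control-variate construction the paper applies to ReinMax.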

Top-level tags: machine learning model training theory
Detailed tags: gradient estimation discrete latent variables variance reduction straight-through estimator variational autoencoder

Beyond ReinMax: Low-Variance Gradient Estimators for Discrete Latent Variables


1️⃣ One-sentence summary

This paper proposes two new gradient estimators, ReinMax-Rao and ReinMax-CV, which incorporate statistical variance-reduction techniques into the existing ReinMax method. They significantly reduce the variance of gradient estimates for discrete latent variables during training, thereby improving the training of models such as variational autoencoders with discrete latent spaces.

Source: arXiv:2603.08257