arXiv submission date: 2026-02-03
📄 Abstract - Improved Analysis of the Accelerated Noisy Power Method with Applications to Decentralized PCA

We analyze the Accelerated Noisy Power Method, an algorithm for Principal Component Analysis in the setting where only inexact matrix-vector products are available, which can arise for instance in decentralized PCA. While previous works have established that acceleration can improve convergence rates compared to the standard Noisy Power Method, these guarantees require overly restrictive upper bounds on the magnitude of the perturbations, limiting their practical applicability. We provide an improved analysis of this algorithm, which preserves the accelerated convergence rate under much milder conditions on the perturbations. We show that our new analysis is worst-case optimal, in the sense that the convergence rate cannot be improved, and that the noise conditions we derive cannot be relaxed without sacrificing convergence guarantees. We demonstrate the practical relevance of our results by deriving an accelerated algorithm for decentralized PCA, which has similar communication costs to non-accelerated methods. To our knowledge, this is the first decentralized algorithm for PCA with provably accelerated convergence.
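
To make the setting concrete, here is a minimal Python sketch of a momentum-style power iteration in which the exact product `A @ v` is replaced by an inexact oracle. The names (`accelerated_noisy_power_method`, `noisy_matvec`, `beta`) and the heavy-ball update shown are illustrative assumptions, not the paper's notation or its specific algorithm.

```python
import numpy as np


def accelerated_noisy_power_method(noisy_matvec, dim, beta, num_iters, seed=0):
    """Momentum (heavy-ball) power iteration with inexact matrix-vector products.

    `noisy_matvec(v)` is assumed to return A @ v plus a bounded perturbation.
    `beta` is the momentum parameter; a common choice is (lambda_2 / 2) ** 2
    when an estimate of the second eigenvalue lambda_2 of A is available.
    """
    rng = np.random.default_rng(seed)
    x_prev = np.zeros(dim)
    x = rng.standard_normal(dim)
    x /= np.linalg.norm(x)
    for _ in range(num_iters):
        y = noisy_matvec(x) - beta * x_prev  # inexact product plus momentum correction
        norm = np.linalg.norm(y)
        x_prev = x / norm                    # rescale both iterates by the same factor
        x = y / norm
    return x                                 # estimate of the top eigenvector of A


if __name__ == "__main__":
    # Toy check: inexact products from a fixed diagonal matrix plus Gaussian noise.
    rng = np.random.default_rng(1)
    A = np.diag([1.0, 0.5, 0.3, 0.1])
    noisy = lambda v: A @ v + 1e-4 * rng.standard_normal(4)
    v = accelerated_noisy_power_method(noisy, dim=4, beta=(0.5 / 2) ** 2, num_iters=50)
    print(abs(v[0]))  # close to 1: the iterate aligns with the top eigenvector e_1
```

The abstract's question is precisely how large the perturbation returned by such an oracle may be before the momentum iteration loses its accelerated convergence rate.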

Top-level tags: machine learning theory systems
Detailed tags: decentralized pca, accelerated methods, noisy power method, convergence analysis, distributed optimization

Improved Analysis of the Accelerated Noisy Power Method with Applications to Decentralized PCA


1️⃣ One-sentence summary

This paper improves the theoretical analysis of the Accelerated Noisy Power Method, substantially relaxing the conditions imposed on the computational errors, and builds on this to propose the first decentralized PCA algorithm with provably accelerated convergence, delivering a significant efficiency gain while keeping communication costs essentially unchanged.
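
As a rough illustration of where inexact matrix-vector products arise in decentralized PCA (this is not the paper's protocol), the sketch below builds an approximate product with the network-average covariance using a few rounds of gossip averaging; the residual consensus error is exactly the kind of perturbation a noisy power method must tolerate. All names here (`gossip_average`, `decentralized_noisy_matvec`, the ring mixing matrix `W`) are hypothetical.

```python
import numpy as np


def gossip_average(rows, W, num_rounds):
    """A few rounds of gossip averaging over a network.

    Row i of `rows` is node i's local vector and `W` is a doubly stochastic
    mixing matrix of the communication graph; after `num_rounds` multiplications
    each node holds an inexact estimate of the network-wide average.
    """
    Z = np.asarray(rows, dtype=float)
    for _ in range(num_rounds):
        Z = W @ Z
    return Z


def decentralized_noisy_matvec(local_covs, W, num_rounds, node=0):
    """Inexact product with the average covariance (1/n) * sum_i C_i.

    The consensus error left after finitely many gossip rounds plays the role
    of the perturbation in the noisy power method.
    """
    def matvec(v):
        products = np.stack([C @ v for C in local_covs])    # local computation
        averaged = gossip_average(products, W, num_rounds)   # communication
        return averaged[node]                                # this node's estimate
    return matvec


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 4, 5
    data = [rng.standard_normal((20, d)) for _ in range(n)]
    local_covs = [X.T @ X / 20 for X in data]
    # Metropolis-style weights on a ring graph: symmetric and doubly stochastic.
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 0.5
        W[i, (i + 1) % n] = 0.25
        W[i, (i - 1) % n] = 0.25
    matvec = decentralized_noisy_matvec(local_covs, W, num_rounds=5)
    print(matvec(np.ones(d)))  # inexact product with the average covariance
```

In this sketch, the returned `matvec` can be passed directly to the power-iteration example above; each call costs one local product plus `num_rounds` rounds of neighbor communication, which is why the per-iteration communication of the accelerated sketch matches that of a non-accelerated one.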

Source: arXiv:2602.03682