📄
Abstract - On Higher-Order Geometric Refinements of Classical Covariance Asymptotics: An Approach via Intrinsic and Extrinsic Information Geometry
Classical Fisher-information asymptotics describe the covariance of regular efficient estimators through the local quadratic approximation of the log-likelihood, and thus capture first-order geometry only. In curved models, including mixtures, curved exponential families, latent-variable models, and manifold-constrained parameter spaces, finite-sample behavior can deviate systematically from these predictions. We develop a coordinate-invariant, curvature-aware refinement by viewing a regular parametric family as a Riemannian manifold \((\Theta,g)\) with the Fisher–Rao metric, immersed in \(L^2(\mu)\) through the square-root density map. Under suitable regularity and moment assumptions, we derive an \(n^{-2}\) correction to the leading \(n^{-1}I(\theta)^{-1}\) covariance term for \(\sqrt{n}\)-consistent, first-order efficient estimators. The correction is governed by a tensor \(P_{ij}\) that decomposes canonically into three parts: an intrinsic Ricci-type contraction of the Fisher–Rao curvature tensor, an extrinsic Gram-type contraction of the second fundamental form, and a Hellinger discrepancy tensor encoding higher-order probabilistic information not determined by the immersion geometry alone. The extrinsic term is positive semidefinite, the full correction is invariant under smooth reparameterization, and it vanishes identically for full exponential families. We then extend the picture to singular models, where the Fisher information degenerates. Using resolution of singularities under an additive normal crossing assumption, we describe the resolved metric, the role of the real log canonical threshold in learning rates and posterior mean-squared error, and a curvature-based covariance expansion on the resolved space that recovers the regular theory as a special case. This framework also suggests geometric diagnostics of weak identifiability and curvature-aware principles for regularization and optimization.
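In schematic form (a sketch based on the abstract's notation; the superscript labels for the three components are illustrative, not the paper's), the expansion reads

\[
\operatorname{Cov}_\theta\!\bigl(\hat\theta_n\bigr)
= \frac{1}{n}\, I(\theta)^{-1} + \frac{1}{n^{2}}\, P(\theta) + o\!\left(n^{-2}\right),
\qquad
P_{ij} = P^{\mathrm{int}}_{ij} + P^{\mathrm{ext}}_{ij} + P^{\mathrm{Hel}}_{ij},
\]

where \(P^{\mathrm{int}}\) is the Ricci-type contraction of the Fisher–Rao curvature, \(P^{\mathrm{ext}}\) the positive-semidefinite Gram-type contraction of the second fundamental form of the square-root immersion, and \(P^{\mathrm{Hel}}\) the Hellinger discrepancy tensor; for a full exponential family \(P \equiv 0\).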
On Higher-Order Geometric Refinements of Classical Covariance Asymptotics: An Approach via Intrinsic and Extrinsic Information Geometry
1️⃣ One-Sentence Summary
By viewing a statistical model as a geometric space, this paper systematically derives, for the first time, higher-order correction terms for the covariance of statistical estimators, revealing how model curvature affects estimation accuracy and unifying the covariance asymptotics of regular and singular models.
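As a concrete illustration of the curvature effect (a minimal sketch, not from the paper: the model \(N(\theta,\theta^2)\) is a hypothetical curved exponential family chosen because its MLE has a closed form), the simulation below compares the scaled MLE variance \(n\operatorname{Var}(\hat\theta_n)\) with the first-order prediction \(I(\theta_0)^{-1}=\theta_0^2/3\); the residual gap shrinks like \(1/n\), which is the kind of \(n^{-2}\) covariance correction the paper quantifies.

```python
import numpy as np

# Minimal sketch, not the paper's method: N(theta, theta^2) is a 1-D
# curve inside the 2-parameter Gaussian family, with Fisher information
# I(theta) = 3/theta^2, so first-order theory predicts
# Var(theta_hat) ~ theta0^2 / (3n).

rng = np.random.default_rng(0)
theta0 = 1.0
I_inv = theta0**2 / 3.0            # inverse Fisher information at theta0

def mle(x):
    """Closed-form MLE for N(theta, theta^2): the positive root of the
    score equation n*theta^2 + S1*theta - S2 = 0."""
    n, s1, s2 = x.size, x.sum(), np.sum(x**2)
    return (-s1 + np.sqrt(s1**2 + 4.0 * n * s2)) / (2.0 * n)

reps = 100_000
for n in (25, 50, 100, 200):
    est = np.array([mle(rng.normal(theta0, theta0, size=n))
                    for _ in range(reps)])
    gap = n * est.var() - I_inv    # residual beyond first-order theory
    # If Cov = I^{-1}/n + P/n^2 + o(n^{-2}), then n*gap should be roughly
    # constant across n (up to Monte Carlo noise, which dominates for
    # large n at this number of replications).
    print(f"n={n:4d}  n*Var={n * est.var():.4f}  "
          f"gap={gap:+.4f}  n*gap={n * gap:+.3f}")
```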