Affine-Invariant Global Non-Asymptotic Convergence Analysis of BFGS under Self-Concordance

2 citations · #1951 of 5858 papers in NeurIPS 2025

Abstract

In this paper, we establish global non-asymptotic convergence guarantees for the BFGS quasi-Newton method without requiring strong convexity or Lipschitz continuity of the gradient or Hessian. Instead, we consider the setting where the objective function is strictly convex and strongly self-concordant. For an arbitrary initial point and any positive-definite initial Hessian approximation, we prove global linear and superlinear convergence guarantees for BFGS when the step size is determined by a line search scheme satisfying the weak Wolfe conditions. Moreover, all our global guarantees are affine-invariant, with the convergence rates depending solely on the initial error and the strong self-concordance constant. Our results extend the global non-asymptotic convergence theory of BFGS beyond traditional assumptions and, for the first time, establish affine-invariant convergence guarantees that align with the inherent affine invariance of the BFGS method.
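
To make the analyzed setting concrete, below is a minimal sketch (not from the paper) of BFGS with a bisection-style weak Wolfe line search and an arbitrary positive-definite initial inverse-Hessian approximation. The function names, Wolfe constants c1/c2, and the test objective are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def weak_wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection search for a step size satisfying the weak Wolfe conditions:
    sufficient decrease (Armijo) and the curvature condition grad(x+ad)^T d >= c2 * grad(x)^T d."""
    alpha, lo, hi = 1.0, 0.0, np.inf
    fx, slope = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * slope:      # Armijo fails: shrink interval
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * slope:          # curvature fails: grow step
            lo = alpha
            alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def bfgs(f, grad, x0, H0=None, tol=1e-8, max_iter=200):
    """BFGS with an arbitrary positive-definite initial inverse-Hessian approximation H0."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x)) if H0 is None else np.array(H0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                                          # quasi-Newton search direction
        alpha = weak_wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        s, y = x_new - x, grad(x_new) - g
        if y @ s > 1e-12:                                   # curvature y^T s > 0 holds under Wolfe
            rho = 1.0 / (y @ s)
            I = np.eye(len(x))
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s)
        x = x_new
    return x

# Illustrative strictly convex test objective (assumed for demonstration only).
if __name__ == "__main__":
    A = np.array([[1.0, 2.0], [-1.0, 1.0], [0.5, -1.5]])
    f = lambda x: np.log(np.sum(np.exp(A @ x))) + 0.5 * x @ x
    grad = lambda x: A.T @ (np.exp(A @ x) / np.sum(np.exp(A @ x))) + x
    print("approx. minimizer:", bfgs(f, grad, np.array([3.0, -2.0])))
```

Note that the analysis in the paper covers any step size satisfying the weak Wolfe conditions; the bisection routine above is just one common way to find such a step.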

Citation History

Jan 25, 2026: 1
Jan 30, 2026: 1
Feb 13, 2026: 2