An Enhanced Levenberg--Marquardt Method via Gram Reduction
Abstract
This paper studies the problem of solving the system of nonlinear equations ${\bf F}({\bf x})={\bf 0}$, where ${\bf F}:{\mathbb R}^{d}\to{\mathbb R}^d$. We propose the Gram-Reduced Levenberg--Marquardt method, which updates the Gram matrix ${\bf J}(\cdot)^\top{\bf J}(\cdot)$ only once every $m$ iterations, where ${\bf J}(\cdot)$ is the Jacobian of ${\bf F}(\cdot)$. Our method enjoys a global convergence guarantee without relying on line-search or solving sub-problems. We prove that our method takes at most $\mathcal{O}(m^2+m^{-0.5}\epsilon^{-2.5})$ iterations to find an $\epsilon$-stationary point of $\frac{1}{2}\|{\bf F}(\cdot)\|^2$, which leads to an overall computational cost of $\mathcal{O}(d^3\epsilon^{-1}+d^2\epsilon^{-2})$ by taking $m=\Theta(\epsilon^{-1})$. This is strictly better than the $\mathcal{O}(d^3\epsilon^{-2})$ cost of existing Levenberg--Marquardt methods. We also show that the proposed method enjoys a local superlinear convergence rate under a non-degeneracy assumption. Experiments on real-world applications in scientific computing and machine learning validate the efficiency of the proposed method.