Local Steps Speed Up Local GD for Heterogeneous Distributed Logistic Regression
arXiv:2501.13790
1 citation
#3046 of 3827 papers in ICLR 2025
Abstract
We analyze two variants of Local Gradient Descent applied to distributed logistic regression with heterogeneous, separable data and show convergence at the rate $O(1/KR)$ for $K$ local steps and sufficiently large $R$ communication rounds. In contrast, all existing convergence guarantees for Local GD applied to any problem are at least $\Omega(1/R)$, meaning they fail to show the benefit of local updates. The key to our improved guarantee is showing progress on the logistic regression objective when using a large stepsize $\eta \gg 1/K$, whereas prior analysis depends on $\eta \leq 1/K$.
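To make the setup in the abstract concrete, below is a minimal sketch of vanilla Local GD on distributed logistic regression with heterogeneous, separable synthetic data. The data generation, the number of clients, and the concrete values of $K$, $R$, and $\eta$ are illustrative assumptions, not details from the paper.

```python
# A minimal sketch of Local GD on distributed logistic regression with
# heterogeneous, separable synthetic data. Client data, K, R, and eta are
# illustrative assumptions, not values from the paper.
import numpy as np

def logistic_grad(w, X, y):
    """Gradient of the average logistic loss (1/n) * sum_i log(1 + exp(-y_i <x_i, w>))."""
    margins = y * (X @ w)
    # -y_i * sigmoid(-margin_i), written via logaddexp for numerical stability
    coeffs = -y * np.exp(-np.logaddexp(0.0, margins))
    return (X.T @ coeffs) / len(y)

def local_gd(clients, K, R, eta, d):
    """R communication rounds; each client runs K full-gradient steps from the
    shared iterate, then the server averages the client models."""
    w = np.zeros(d)
    for _ in range(R):
        local_models = []
        for X, y in clients:
            w_local = w.copy()
            for _ in range(K):
                w_local = w_local - eta * logistic_grad(w_local, X, y)
            local_models.append(w_local)
        w = np.mean(local_models, axis=0)  # communication step: average the models
    return w

rng = np.random.default_rng(0)
d, n_clients = 5, 4
w_star = rng.normal(size=d)
clients = []
for m in range(n_clients):
    X = rng.normal(loc=m, size=(50, d))   # client-dependent shift -> heterogeneity
    y = np.sign(X @ w_star)               # labels from a shared separator -> separable
    clients.append((X, y))

# Large stepsize in the spirit of the abstract (eta well above 1/K).
w = local_gd(clients, K=32, R=20, eta=2.0, d=d)
avg_loss = np.mean([np.logaddexp(0.0, -y * (X @ w)).mean() for X, y in clients])
print(f"average logistic loss after R rounds: {avg_loss:.4f}")
```

Increasing $K$ while holding $R$ fixed should drive the loss down faster in this sketch, which is the qualitative behavior the $O(1/KR)$ rate describes.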
Citation History
[Citation count chart: 0 citations through Jan 27, 2026; 1 citation from Jan 31, 2026 onward.]