Parameter-free Algorithms for the Stochastically Extended Adversarial Model

0 citations · ranked #3347 of 5,858 papers in NeurIPS 2025

Abstract

We develop the first parameter-free algorithms for the Stochastically Extended Adversarial (SEA) model, a framework that bridges adversarial and stochastic online convex optimization. Existing approaches for the SEA model require prior knowledge of problem-specific parameters, such as the diameter of the domain $D$ and the Lipschitz constant of the loss functions $G$, which limits their practical applicability. Addressing this, we develop parameter-free methods by leveraging the Optimistic Online Newton Step (OONS) algorithm to eliminate the need for these parameters. We first establish a comparator-adaptive algorithm for the scenario with unknown domain diameter but known Lipschitz constant, achieving an expected regret bound of $\tilde{O}\big(\|u\|_2^2 + \|u\|_2(\sqrt{\sigma^2_{1:T}} + \sqrt{\Sigma^2_{1:T}})\big)$, where $u$ is the comparator vector and $\sigma^2_{1:T}$ and $\Sigma^2_{1:T}$ denote the cumulative stochastic variance and the cumulative adversarial variation, respectively. We then extend this to the more general setting where both $D$ and $G$ are unknown, obtaining a comparator- and Lipschitz-adaptive algorithm. Notably, its regret bound exhibits the same dependence on $\sigma^2_{1:T}$ and $\Sigma^2_{1:T}$, demonstrating that our methods remain effective even when both parameters are unknown in the SEA model.
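The building block named in the abstract is the Optimistic Online Newton Step (OONS). As a rough illustration only, the sketch below implements a generic optimistic Newton-style FTRL update over an L2 ball; it assumes a known radius and uses the previous gradient as the optimism hint, so it is not the paper's parameter-free construction, whose point is precisely to remove the dependence on $D$ and $G$.

```python
# Minimal sketch of a generic optimistic Online Newton Step / FTRL update.
# Assumptions (not from the paper): a known L2-ball radius, a fixed
# regularizer eps, and the previous gradient as the optimism hint.
import numpy as np

def project_l2_ball(x, radius):
    """Project x onto the Euclidean ball of the given radius."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

class OptimisticONS:
    def __init__(self, dim, radius=1.0, eps=1.0):
        self.radius = radius             # assumed known domain radius (proxy for D)
        self.A = eps * np.eye(dim)       # regularized second-moment matrix
        self.grad_sum = np.zeros(dim)    # running sum of observed gradients
        self.hint = np.zeros(dim)        # optimism hint m_t (here: last gradient)

    def predict(self):
        # Optimistic FTRL step: minimize the quadratic surrogate built from
        # past gradients plus the hint, then project back onto the ball.
        x = -np.linalg.solve(self.A, self.grad_sum + self.hint)
        return project_l2_ball(x, self.radius)

    def update(self, grad):
        # Incorporate the observed gradient and refresh the hint.
        self.grad_sum += grad
        self.A += np.outer(grad, grad)
        self.hint = grad

# Example round on the quadratic loss f_t(x) = 0.5 * ||x - z_t||^2.
learner = OptimisticONS(dim=3)
x_t = learner.predict()
z_t = np.array([0.5, -0.2, 0.1])
learner.update(x_t - z_t)  # gradient of f_t at x_t
```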

Citation History

0 citations recorded at every tracked date from Jan 26 to Feb 13, 2026.