Tomer Koren
20 papers · 262 total citations

Papers (20)
Prediction with Corrupted Expert Advice · NeurIPS 2020 · arXiv · 43 citations
Asynchronous Stochastic Optimization Robust to Arbitrary Delays · NeurIPS 2021 · arXiv · 40 citations
Can Implicit Bias Explain Generalization? Stochastic Convex Optimization as a Case Study · NeurIPS 2020 · arXiv · 26 citations
Benign Underfitting of Stochastic Gradient Descent · NeurIPS 2022 · arXiv · 22 citations
Better Best of Both Worlds Bounds for Bandits with Switching Costs · NeurIPS 2022 · arXiv · 20 citations
Bandit Linear Control · NeurIPS 2020 · arXiv · 18 citations
Never Go Full Batch (in Stochastic Convex Optimization) · NeurIPS 2021 · arXiv · 15 citations
Stochastic Optimization with Laggard Data Pipelines · NeurIPS 2020 · arXiv · 12 citations
How Free is Parameter-Free Stochastic Optimization? · ICML 2024 · arXiv · 11 citations
Algorithmic Instabilities of Accelerated Gradient Descent · NeurIPS 2021 · arXiv · 11 citations
Optimal Rates for Random Order Online Optimization · NeurIPS 2021 · arXiv · 10 citations
Rate-Optimal Policy Optimization for Linear Markov Decision Processes · ICML 2024 · arXiv · 9 citations
Rate-Optimal Online Convex Optimization in Adaptive Linear Control · NeurIPS 2022 · arXiv · 9 citations
Tight Risk Bounds for Gradient Descent on Separable Data · NeurIPS 2023 · arXiv · 8 citations
Dueling Convex Optimization with General Preferences · ICML 2025 · arXiv · 5 citations
Optimal Rates in Continual Linear Regression via Increasing Regularization · NeurIPS 2025 · arXiv · 2 citations
Convergence of Policy Mirror Descent Beyond Compatible Function Approximation · ICML 2025 · arXiv · 1 citation
Multiclass Loss Geometry Matters for Generalization of Gradient Descent in Separable Classification · NeurIPS 2025 · arXiv · 0 citations
Towards Best-of-All-Worlds Online Learning with Feedback Graphs · NeurIPS 2021 · 0 citations
Faster Stochastic Optimization with Arbitrary Delays via Adaptive Asynchronous Mini-Batching · ICML 2025 · 0 citations