Poster "kullback-leibler divergence" Papers

11 papers found

Better Estimation of the Kullback--Leibler Divergence Between Language Models

Afra Amini, Tim Vieira, Ryan Cotterell

NEURIPS 2025 · arXiv:2504.10637
4 citations

Bounds on $L_p$ Errors in Density Ratio Estimation via $f$-Divergence Loss Functions

Yoshiaki Kitazawa

ICLR 2025
2 citations

Connecting Jensen–Shannon and Kullback–Leibler Divergences: A New Bound for Representation Learning

Reuben Dorent, Polina Golland, William (Sandy) Wells

NEURIPS 2025 · arXiv:2510.20644

DKDR: Dynamic Knowledge Distillation for Reliability in Federated Learning

Yueyang Yuan, Wenke Huang, Guancheng Wan et al.

NEURIPS 2025

Provable Benefit of Annealed Langevin Monte Carlo for Non-log-concave Sampling

Wei Guo, Molei Tao, Yongxin Chen

ICLR 2025 · arXiv:2407.16936
19 citations

Provable Robust Overfitting Mitigation in Wasserstein Distributionally Robust Optimization

Shuang Liu, Yihan Wang, Yifan Zhu et al.

ICLR 2025 · arXiv:2503.04315

Stochastic variance-reduced Gaussian variational inference on the Bures-Wasserstein manifold

Hoang Phuc Hau Luu, Hanlin Yu, Bernardo Williams et al.

ICLR 2025 · arXiv:2410.02490
1 citation

Variational Inference with Mixtures of Isotropic Gaussians

Marguerite Petit-Talamon, Marc Lambert, Anna Korba

NEURIPS 2025 · arXiv:2506.13613

A Diffusion Model Framework for Unsupervised Neural Combinatorial Optimization

Sebastian Sanokowski, Sepp Hochreiter, Sebastian Lehner

ICML 2024 · arXiv:2406.01661
53 citations

DistiLLM: Towards Streamlined Distillation for Large Language Models

Jongwoo Ko, Sungnyun Kim, Tianyi Chen et al.

ICML 2024 · arXiv:2402.03898
73 citations

Theoretical Guarantees for Variational Inference with Fixed-Variance Mixture of Gaussians

Tom Huix, Anna Korba, Alain Oliviero Durmus et al.

ICML 2024 · arXiv:2406.04012
10 citations