Spotlight "knowledge distillation" Papers
2 papers found
Distillation Robustifies Unlearning
Bruce W. Lee, Addie Foote, Alex Infanger et al.
NeurIPS 2025 (spotlight) · arXiv:2506.06278
6 citations
Large Language Models are Efficient Learners of Noise-Robust Speech Recognition
Yuchen Hu, Chen Chen, Chao-Han Huck Yang et al.
ICLR 2024 (spotlight) · arXiv:2401.10446
36 citations