"multi-teacher distillation" Papers
3 papers found
Learning Task-Agnostic Representations through Multi-Teacher Distillation
Philippe Formont, Maxime Darrin, Banafsheh Karimian et al.
NeurIPS 2025 · arXiv:2510.18680
Exploring Efficient Asymmetric Blind-Spots for Self-Supervised Denoising in Real-World Scenarios
Shiyan Chen, Jiyuan Zhang, Zhaofei Yu et al.
CVPR 2024 · arXiv:2303.16783
19 citations
UNIC: Universal Classification Models via Multi-teacher Distillation
Yannis Kalantidis, Diane Larlus, Mert Bulent Sariyildiz et al.
ECCV 2024 · arXiv:2408.05088
19 citations