"multi-teacher distillation" Papers
5 papers found
Learning Task-Agnostic Representations through Multi-Teacher Distillation
Philippe Formont, Maxime Darrin, Banafsheh Karimian et al.
NeurIPS 2025 · arXiv:2510.18680
Noisy Node Classification by Bi-level Optimization Based Multi-Teacher Distillation
Yujing Liu, Zongqian Wu, Zhengyu Lu et al.
AAAI 2025 · arXiv:2404.17875
2 citations
Exploring Efficient Asymmetric Blind-Spots for Self-Supervised Denoising in Real-World Scenarios
Shiyan Chen, Jiyuan Zhang, Zhaofei Yu et al.
CVPR 2024 · arXiv:2303.16783
19 citations
Let All Be Whitened: Multi-Teacher Distillation for Efficient Visual Retrieval
Zhe Ma, Jianfeng Dong, Shouling Ji et al.
AAAI 2024 · arXiv:2312.09716
12 citations
UNIC: Universal Classification Models via Multi-teacher Distillation
Yannis Kalantidis, Diane Larlus, Mert Bulent Sariyildiz et al.
ECCV 2024 · arXiv:2408.05088
19 citations