"knowledge distillation" Papers

210 papers found • Page 5 of 5

SinSR: Diffusion-Based Image Super-Resolution in a Single Step

Yufei Wang, Wenhan Yang, Xinyuan Chen et al.

CVPR 2024 • arXiv:2311.14760
226 citations

SpikingBERT: Distilling BERT to Train Spiking Language Models Using Implicit Differentiation

Malyaban Bal, Abhronil Sengupta

AAAI 2024 • arXiv:2308.10873
73 citations

Summarizing Stream Data for Memory-Constrained Online Continual Learning

Jianyang Gu, Kai Wang, Wei Jiang et al.

AAAI 2024 • arXiv:2305.16645
23 citations

Sunshine to Rainstorm: Cross-Weather Knowledge Distillation for Robust 3D Object Detection

Xun Huang, Hai Wu, Xin Li et al.

AAAI 2024 • arXiv:2402.18493
15 citations

Synchronization is All You Need: Exocentric-to-Egocentric Transfer for Temporal Action Segmentation with Unlabeled Synchronized Video Pairs

Camillo Quattrocchi, Antonino Furnari, Daniele Di Mauro et al.

ECCV 2024 • arXiv:2312.02638
18 citations

Teacher as a Lenient Expert: Teacher-Agnostic Data-Free Knowledge Distillation

Hyunjune Shin, Dong-Wan Choi

AAAI 2024 • arXiv:2402.12406
7 citations

UNIC: Universal Classification Models via Multi-teacher Distillation

Yannis Kalantidis, Diane Larlus, Mert Bulent Sariyildiz et al.

ECCV 2024 • arXiv:2408.05088
19 citations

UNIKD: UNcertainty-Filtered Incremental Knowledge Distillation for Neural Implicit Representation

Mengqi Guo, Chen Li, Hanlin Chen et al.

ECCV 2024 • arXiv:2212.10950
3 citations

Weakly Supervised Monocular 3D Detection with a Single-View Image

Xueying Jiang, Sheng Jin, Lewei Lu et al.

CVPR 2024 • arXiv:2402.19144
12 citations

Weak-to-Strong 3D Object Detection with X-Ray Distillation

Alexander Gambashidze, Aleksandr Dadukin, Maksim Golyadkin et al.

CVPR 2024 • arXiv:2404.00679
6 citations