Poster "knowledge distillation" Papers

165 papers found • Page 3 of 4

Adversarially Robust Distillation by Reducing the Student-Teacher Variance Gap

Junhao Dong, Piotr Koniusz, Junxi Chen et al.

ECCV 2024
10 citations

AMD: Automatic Multi-step Distillation of Large-scale Vision Models

Cheng Han, Qifan Wang, Sohail A Dianat et al.

ECCV 2024 • arXiv:2407.04208
15 citations

AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation

Zihao Tang, Zheqi Lv, Shengyu Zhang et al.

ICLR 2024 • arXiv:2403.07030
4 citations

Bayesian Knowledge Distillation: A Bayesian Perspective of Distillation with Uncertainty Quantification

Luyang Fang, Yongkai Chen, Wenxuan Zhong et al.

ICML 2024

BKDSNN: Enhancing the Performance of Learning-based Spiking Neural Networks Training with Blurred Knowledge Distillation

Zekai Xu, Kang You, Qinghai Guo et al.

ECCV 2024 • arXiv:2407.09083
13 citations

Bootstrapping Chest CT Image Understanding by Distilling Knowledge from X-ray Expert Models

Weiwei Cao, Jianpeng Zhang, Yingda Xia et al.

CVPR 2024 • arXiv:2404.04936
15 citations

Bridge Past and Future: Overcoming Information Asymmetry in Incremental Object Detection

Qijie Mo, Yipeng Gao, Shenghao Fu et al.

ECCV 2024 • arXiv:2407.11499
14 citations

Data-free Distillation of Diffusion Models with Bootstrapping

Jiatao Gu, Chen Wang, Shuangfei Zhai et al.

ICML 2024

De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts

Yuzheng Wang, Dingkang Yang, Zhaoyu Chen et al.

CVPR 2024 • arXiv:2403.19539
17 citations

DeiT-LT: Distillation Strikes Back for Vision Transformer Training on Long-Tailed Datasets

Harsh Rangwani, Pradipto Mondal, Mayank Mishra et al.

CVPR 2024 • arXiv:2404.02900
18 citations

DetKDS: Knowledge Distillation Search for Object Detectors

Lujun Li, Yufan Bao, Peijie Dong et al.

ICML 2024

DFD: Distilling the Feature Disparity Differently for Detectors

Kang Liu, Yingyi Zhang, Jingyun Zhang et al.

ICML 2024

Direct Distillation between Different Domains

Jialiang Tang, Shuo Chen, Gang Niu et al.

ECCV 2024 • arXiv:2401.06826
7 citations

Distilling Knowledge from Large-Scale Image Models for Object Detection

Gang Li, Wenhai Wang, Xiang Li et al.

ECCV 2024
4 citations

Distilling ODE Solvers of Diffusion Models into Smaller Steps

Sanghwan Kim, Hao Tang, Fisher Yu

CVPR 2024 • arXiv:2309.16421
10 citations

Distilling Semantic Priors from SAM to Efficient Image Restoration Models

Quan Zhang, Xiaoyu Liu, Wei Li et al.

CVPR 2024 • arXiv:2403.16368
36 citations

DistiLLM: Towards Streamlined Distillation for Large Language Models

Jongwoo Ko, Sungnyun Kim, Tianyi Chen et al.

ICML 2024 • arXiv:2402.03898
73 citations

Do Topological Characteristics Help in Knowledge Distillation?

Jungeun Kim, Junwon You, Dongjin Lee et al.

ICML 2024

DSD-DA: Distillation-based Source Debiasing for Domain Adaptive Object Detection

Yongchao Feng, Shiwei Li, Yingjie Gao et al.

ICML 2024 • arXiv:2311.10437
8 citations

DSMix: Distortion-Induced Saliency Map Based Pre-training for No-Reference Image Quality Assessment

Jinsong Shi, Xiaojiang Peng et al.

ECCV 2024
5 citations

DεpS: Delayed ε-Shrinking for Faster Once-For-All Training

Aditya Annavajjala, Alind Khare, Animesh Agrawal et al.

ECCV 2024 • arXiv:2407.06167
1 citation

Efficient Multitask Dense Predictor via Binarization

Yuzhang Shang, Dan Xu, Gaowen Liu et al.

CVPR 2024 • arXiv:2405.14136
6 citations

Embodied CoT Distillation From LLM To Off-the-shelf Agents

Wonje Choi, Woo Kyung Kim, Minjong Yoo et al.

ICML 2024 • arXiv:2412.11499
11 citations

Enhanced Sparsification via Stimulative Training

Shengji Tang, Weihao Lin, Hancheng Ye et al.

ECCV 2024 • arXiv:2403.06417
2 citations

Enhancing Class-Imbalanced Learning with Pre-Trained Guidance through Class-Conditional Knowledge Distillation

Lan Li, Xin-Chun Li, Han-Jia Ye et al.

ICML 2024

From Coarse to Fine: Enable Comprehensive Graph Self-supervised Learning with Multi-granular Semantic Ensemble

Qianlong Wen, Mingxuan Ju, Zhongyu Ouyang et al.

ICML 2024

Good Teachers Explain: Explanation-Enhanced Knowledge Distillation

Amin Parchami, Moritz Böhle, Sukrut Rao et al.

ECCV 2024 • arXiv:2402.03119
19 citations

GraphDreamer: Compositional 3D Scene Synthesis from Scene Graphs

Gege Gao, Weiyang Liu, Anpei Chen et al.

CVPR 2024 • arXiv:2312.00093
87 citations

Harmonizing Knowledge Transfer in Neural Network with Unified Distillation

Yaomin Huang, Faming Fang, Zaoming Yan et al.

ECCV 2024 • arXiv:2409.18565
1 citation

Human Motion Forecasting in Dynamic Domain Shifts: A Homeostatic Continual Test-time Adaptation Framework

Qiongjie Cui, Huaijiang Sun, Bin Li et al.

ECCV 2024
1 citation

Improving Plasticity in Online Continual Learning via Collaborative Learning

Maorong Wang, Nicolas Michel, Ling Xiao et al.

CVPR 2024 • arXiv:2312.00600
21 citations

Is Retain Set All You Need in Machine Unlearning? Restoring Performance of Unlearned Models with Out-Of-Distribution Images

Jacopo Bonato, Marco Cotogni, Luigi Sabetta

ECCV 2024 • arXiv:2404.12922
21 citations

Keypoint-based Progressive Chain-of-Thought Distillation for LLMs

Kaituo Feng, Changsheng Li, Xiaolu Zhang et al.

ICML 2024 • arXiv:2405.16064
16 citations

Knowledge Distillation with Auxiliary Variable

Bo Peng, Zhen Fang, Guangquan Zhang et al.

ICML 2024

LASS3D: Language-Assisted Semi-Supervised 3D Semantic Segmentation with Progressive Unreliable Data Exploitation

Jianan Li, Qiulei Dong

ECCV 2024
1 citation

Learning Modality-agnostic Representation for Semantic Segmentation from Any Modalities

Xu Zheng, Yuanhuiyi Lyu, Lin Wang

ECCV 2024 • arXiv:2407.11351
28 citations

LEROjD: Lidar Extended Radar-Only Object Detection

Patrick Palmer, Martin Krüger, Stefan Schütte et al.

ECCV 2024 • arXiv:2409.05564
2 citations

Less or More From Teacher: Exploiting Trilateral Geometry For Knowledge Distillation

Chengming Hu, Haolun Wu, Xuan Li et al.

ICLR 2024 • arXiv:2312.15112
3 citations

LiDAR-based All-weather 3D Object Detection via Prompting and Distilling 4D Radar

Yujeong Chae, Hyeonseong Kim, Changgyoon Oh et al.

ECCV 2024
6 citations

Make a Strong Teacher with Label Assistance: A Novel Knowledge Distillation Approach for Semantic Segmentation

Shoumeng Qiu, Jie Chen, Xinrun Li et al.

ECCV 2024 • arXiv:2407.13254
9 citations

Markov Knowledge Distillation: Make Nasty Teachers trained by Self-undermining Knowledge Distillation Fully Distillable

En-Hui Yang, Linfeng Ye

ECCV 2024
8 citations

MH-pFLID: Model Heterogeneous personalized Federated Learning via Injection and Distillation for Medical Data Analysis

Luyuan Xie, Manqing Lin, Tianyu Luan et al.

ICML 2024 • arXiv:2405.06822
15 citations

Mitigating Background Shift in Class-Incremental Semantic Segmentation

Gilhan Park, WonJun Moon, SuBeen Lee et al.

ECCV 2024 • arXiv:2407.11859
12 citations

MobileDiffusion: Instant Text-to-Image Generation on Mobile Devices

Yang Zhao, Zhisheng Xiao, Yanwu Xu et al.

ECCV 2024 • arXiv:2311.16567
36 citations

Multi-scale Cross Distillation for Object Detection in Aerial Images

Kun Wang, Zi Wang, Zhang Li et al.

ECCV 2024
2 citations

Online Speculative Decoding

Xiaoxuan Liu, Lanxiang Hu, Peter Bailis et al.

ICML 2024 • arXiv:2310.07177
92 citations

On the Road to Portability: Compressing End-to-End Motion Planner for Autonomous Driving

Kaituo Feng, Changsheng Li, Dongchun Ren et al.

CVPR 2024 • arXiv:2403.01238
15 citations

Open Vocabulary 3D Scene Understanding via Geometry Guided Self-Distillation

Pengfei Wang, Yuxi Wang, Shuai Li et al.

ECCV 2024 • arXiv:2407.13362
10 citations

Overcoming Data and Model Heterogeneities in Decentralized Federated Learning via Synthetic Anchors

Chun-Yin Huang, Kartik Srinivas, Xin Zhang et al.

ICML 2024 • arXiv:2405.11525
19 citations

PartDistill: 3D Shape Part Segmentation by Vision-Language Model Distillation

Ardian Umam, Cheng-Kun Yang, Min-Hung Chen et al.

CVPR 2024 • arXiv:2312.04016
26 citations