"large language model distillation" Papers
3 papers found
Distilling LLM Prior to Flow Model for Generalizable Agent’s Imagination in Object Goal Navigation
Badi Li, Ren-Jie Lu, Yu Zhou et al.
NeurIPS 2025, arXiv:2508.09423
EA-KD: Entropy-based Adaptive Knowledge Distillation
Chi-Ping Su, Ching-Hsun Tseng, Bin Pu et al.
ICCV 2025, arXiv:2311.13621
3 citations
Weighted Multi-Prompt Learning with Description-free Large Language Model Distillation
Sua Lee, Kyubum Shin, Jung Ho Park
ICLR 2025, arXiv:2507.07147
1 citation