"data-free distillation" Papers
5 papers found
Data-Free Black-Box Federated Learning via Zeroth-Order Gradient Estimation
Xinge Ma, Jin Wang, Xuejie Zhang
AAAI 2025 · arXiv:2503.06028
Guided Score identity Distillation for Data-Free One-Step Text-to-Image Generation
Mingyuan Zhou, Zhendong Wang, Huangjie Zheng et al.
ICLR 2025 · arXiv:2406.01561
4 citations
Toward Efficient Data-Free Unlearning
Chenhao Zhang, Shaofei Shen, Weitong Chen et al.
AAAI 2025 · arXiv:2412.13790
3 citations
What Makes a Good Dataset for Knowledge Distillation?
Logan Frank, Jim Davis
CVPR 2025 · arXiv:2411.12817
4 citations
Score identity Distillation: Exponentially Fast Distillation of Pretrained Diffusion Models for One-Step Generation
Mingyuan Zhou, Huangjie Zheng, Zhendong Wang et al.
ICML 2024 · arXiv:2404.04057
154 citations