"dataset distillation" Papers
29 papers found
Adaptive Dataset Quantization
Muquan Li, Dongyang Zhang, Qiang Dong et al.
Beyond Modality Collapse: Representation Blending for Multimodal Dataset Distillation
Xin Zhang, Ziruo Zhang, Jiawei Du et al.
Beyond Random: Automatic Inner-loop Optimization in Dataset Distillation
Muquan Li, Hang Gou, Dongyang Zhang et al.
Boost Self-Supervised Dataset Distillation via Parameterization, Predefined Augmentation, and Approximation
Sheng-Feng Yu, Jia-Jiun Yao, Wei-Chen Chiu
Dataset Distillation for Pre-Trained Self-Supervised Vision Models
George Cazenavette, Antonio Torralba, Vincent Sitzmann
Dataset Distillation via Knowledge Distillation: Towards Efficient Self-Supervised Pre-training of Deep Networks
Siddharth Joshi, Jiayi Ni, Baharan Mirzasoleiman
Dataset Distillation via Vision-Language Category Prototype
Yawen Zou, Guang Li, Duo Su et al.
DELT: A Simple Diversity-driven EarlyLate Training for Dataset Distillation
Zhiqiang Shen, Ammar Sherif, Zeyuan Yin et al.
Distilling Dataset into Neural Field
Donghyeok Shin, HeeSun Bae, Gyuwon Sim et al.
Does Training with Synthetic Data Truly Protect Privacy?
Yunpeng Zhao, Jie Zhang
Efficient Multimodal Dataset Distillation via Generative Models
Zhenghao Zhao, Haoxuan Wang, Junyi Wu et al.
Enhancing Dataset Distillation via Non-Critical Region Refinement
Minh-Tuan Tran, Trung Le, Xuan-May Le et al.
Flowing Datasets with Wasserstein over Wasserstein Gradient Flows
Clément Bonet, Christophe Vauthier, Anna Korba
GIFT: Unlocking Full Potential of Labels in Distilled Dataset at Near-zero Cost
Xinyi Shang, Peng Sun, Tao Lin
Group Distributionally Robust Dataset Distillation with Risk Minimization
Saeed Vahidian, Mingyu Wang, Jianyang Gu et al.
Heavy Labels Out! Dataset Distillation with Label Space Lightening
Ruonan Yu, Songhua Liu, Zigeng Chen et al.
Hierarchical Features Matter: A Deep Exploration of Progressive Parameterization Method for Dataset Distillation
Xinhao Zhong, Hao Fang, Bin Chen et al.
Hyperbolic Dataset Distillation
Wenyuan Li, Guang Li, Keisuke Maeda et al.
Influence-Guided Diffusion for Dataset Distillation
Mingyang Chen, Jiawei Du, Bo Huang et al.
Towards Adversarially Robust Dataset Distillation by Curvature Regularization
Eric Xue, Yijiang Li, Haoyang Liu et al.
Towards Stable and Storage-efficient Dataset Distillation: Matching Convexified Trajectory
Wenliang Zhong, Haoyu Tang, Qinghai Zheng et al.
Data-to-Model Distillation: Data-Efficient Learning Framework
Ahmad Sajedi, Samir Khaki, Lucy Z. Liu et al.
Distill Gold from Massive Ores: Bi-level Data Pruning towards Efficient Dataset Distillation
Yue Xu, Yong-Lu Li, Kaitong Cui et al.
Large Scale Dataset Distillation with Domain Shift
Noel Loo, Alaa Maalouf, Ramin Hasani et al.
Low-Rank Similarity Mining for Multimodal Dataset Distillation
Yue Xu, Zhilin Lin, Yusong Qiu et al.
SelMatch: Effectively Scaling Up Dataset Distillation via Selection-Based Initialization and Partial Updates by Trajectory Matching
Yongmin Lee, Hye Won Chung
Teddy: Efficient Large-Scale Dataset Distillation via Taylor-Approximated Matching
Ruonan Yu, Songhua Liu, Jingwen Ye et al.
Unlocking the Potential of Federated Learning: The Symphony of Dataset Distillation via Deep Generative Latents
Yuqi Jia, Saeed Vahidian, Jingwei Sun et al.
What is Dataset Distillation Learning?
William Yang, Ye Zhu, Zhiwei Deng et al.