Yuzhang Shang
19 papers, 828 total citations

Papers (19)
Post-Training Quantization on Diffusion Models. CVPR 2023 (arXiv). 270 citations.
LLaVA-PruMerge: Adaptive Token Reduction for Efficient Large Multimodal Models. ICCV 2025 (arXiv). 234 citations.
PB-LLM: Partially Binarized Large Language Models. ICLR 2024 (arXiv). 82 citations.
QuEST: Low-bit Diffusion Model Quantization via Efficient Selective Finetuning. ICCV 2025 (arXiv). 44 citations.
Lipschitz Continuity Guided Knowledge Distillation. ICCV 2021 (arXiv). 30 citations.
A Closer Look at Time Steps is Worthy of Triple Speed-Up for Diffusion Model Training. CVPR 2025 (arXiv). 25 citations.
Network Binarization via Contrastive Learning. ECCV 2022 (arXiv). 25 citations.
Lipschitz Continuity Retained Binary Neural Network. ECCV 2022 (arXiv). 25 citations.
MIM4DD: Mutual Information Maximization for Dataset Distillation. NeurIPS 2023 (arXiv). 23 citations.
Robin3D: Improving 3D Large Language Model via Robust Instruction Tuning. ICCV 2025 (arXiv). 21 citations.
DKDM: Data-Free Knowledge Distillation for Diffusion Models with Any Architecture. CVPR 2025 (arXiv). 19 citations.
Causal-DFQ: Causality Guided Data-Free Network Quantization. ICCV 2023 (arXiv). 8 citations.
Efficient Multitask Dense Predictor via Binarization. CVPR 2024 (arXiv). 6 citations.
CaO2: Rectifying Inconsistencies in Diffusion-Based Dataset Distillation. ICCV 2025. 5 citations.
Distilling Long-tailed Datasets. CVPR 2025 (arXiv). 5 citations.
EA-Vit: Efficient Adaptation for Elastic Vision Transformer. ICCV 2025 (arXiv). 3 citations.
Efficient Multimodal Dataset Distillation via Generative Models. NeurIPS 2025 (arXiv). 2 citations.
DLFR-Gen: Diffusion-based Video Generation with Dynamic Latent Frame Rate. ICCV 2025. 1 citation.
Enhancing Post-training Quantization Calibration through Contrastive Learning. CVPR 2024. 0 citations.