Xiaoxia Wu
5 papers · 773 total citations

Papers (5)

- ZeroQuant: Efficient and Affordable Post-Training Quantization for Large-Scale Transformers (NeurIPS 2022, arXiv) · 636 citations
- Exploring Post-training Quantization in LLMs from Comprehensive Study to Low Rank Compensation (AAAI 2024, arXiv) · 71 citations
- DeepSpeed Data Efficiency: Improving Deep Learning Model Quality and Training Efficiency via Efficient Data Sampling and Routing (AAAI 2024, arXiv) · 40 citations
- Implicit Regularization and Convergence for Weight Normalization (NeurIPS 2020, arXiv) · 26 citations
- XTC: Extreme Compression for Pre-trained Transformers Made Simple and Efficient (NeurIPS 2022) · 0 citations