Conglong Li
4 papers · 727 total citations

Papers (4)
ZeroQuant: Efficient and Affordable Post-Training Quantization for Large-Scale Transformers
NeurIPS 2022 · arXiv · 636 citations

The Stability-Efficiency Dilemma: Investigating Sequence Length Warmup for Training GPT Models
NeurIPS 2022 · arXiv · 51 citations

DeepSpeed Data Efficiency: Improving Deep Learning Model Quality and Training Efficiency via Efficient Data Sampling and Routing
AAAI 2024 · arXiv · 40 citations

XTC: Extreme Compression for Pre-trained Transformers Made Simple and Efficient
NeurIPS 2022 · 0 citations