Papers matching "activation sparsity" (5 found)
Activity Pruning for Efficient Spiking Neural Networks
  Tong Bu, Xinyu Shi, Zhaofei Yu · NeurIPS 2025
DuoGPT: Training-free Dual Sparsity through Activation-aware Pruning in LLMs
  Ruokai Yin, Yuhang Li, Donghyun Lee et al. · NeurIPS 2025 · arXiv:2506.20194 · 2 citations
Spark Transformer: Reactivating Sparsity in Transformer FFN and Attention
  Chong You, Kan Wu, Zhipeng Jia et al. · NeurIPS 2025 · 2 citations
Training-Free Activation Sparsity in Large Language Models
  James Liu, Pragaash Ponnusamy, Tianle Cai et al. · ICLR 2025 · arXiv:2408.14690 · 39 citations
Exploring the Benefit of Activation Sparsity in Pre-training
  Zhengyan Zhang, Chaojun Xiao, Qiujieli Qin et al. · ICML 2024 · arXiv:2410.03440 · 6 citations