Spotlight "linear attention" Papers
2 papers found
ZeroS: Zero‑Sum Linear Attention for Efficient Transformers
Jiecheng Lu, Xu Han, Yan Sun et al.
NeurIPS 2025 · spotlight · arXiv:2602.05230
Simple linear attention language models balance the recall-throughput tradeoff
Simran Arora, Sabri Eyuboglu, Michael Zhang et al.
ICML 2024 · spotlight · arXiv:2402.18668
140 citations