Papers by Jack Cai
3 papers found
Attention-Level Speculation
Jack Cai, Ammar Vora, Randolph Zhang et al.
ICML 2025
Everything Everywhere All at Once: LLMs can In-Context Learn Multiple Tasks in Superposition
Zheyang Xiong, Jack Cai, John Cooper et al.
ICML 2025 (Spotlight) · arXiv:2410.05603
9 citations
Self-Improving Transformers Overcome Easy-to-Hard and Length Generalization Challenges
Nayoung Lee, Jack Cai, Avi Schwarzschild et al.
ICML 2025 · arXiv:2502.01612
22 citations