Looped Transformers for Length Generalization

41 citations · #415 of 3827 papers in ICLR 2025

Abstract

Recent work has shown that Transformers trained from scratch can successfully solve various arithmetic and algorithmic tasks, such as adding numbers and computing parity. While these Transformers generalize well to unseen inputs of the same length, they struggle with length generalization, i.e., handling inputs of unseen lengths. In this work, we demonstrate that looped Transformers with an adaptive number of steps significantly improve length generalization. We focus on tasks with a known iterative solution, involving multiple iterations of a RASP-L operation (a length-generalizable operation that can be expressed by a finite-sized Transformer). We train looped Transformers using our proposed learning algorithm and observe that they learn highly length-generalizable solutions for various tasks.
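The core idea, reusing one block for an input-dependent number of loop iterations, can be sketched outside of any learned model. The toy below is not the paper's Transformer: it substitutes a hand-written XOR-fold step for the learned block on the parity task, with the loop count set to the sequence length, to show why an adaptive step count length-generalizes by construction.

```python
import numpy as np

def step(state: np.ndarray, bits: np.ndarray, t: int) -> np.ndarray:
    # One loop iteration: fold bit t into the running parity.
    # In the paper this role is played by a learned Transformer
    # block applied repeatedly; here it is a hand-written stand-in.
    new = state.copy()
    new[0] ^= bits[t]
    return new

def looped_parity(bits: np.ndarray) -> int:
    # Adaptive number of steps: one iteration per input token, so the
    # same step function handles any sequence length without retraining.
    state = np.zeros(1, dtype=np.int64)
    for t in range(len(bits)):
        state = step(state, bits, t)
    return int(state[0])

print(looped_parity(np.array([1, 0, 1, 1])))  # parity of 4 bits -> 1
print(looped_parity(np.ones(101, dtype=np.int64)))  # 101 ones -> 1
```

Because the loop count tracks the input length rather than being fixed at training time, the same per-step computation extends to longer inputs; the paper's contribution is learning such a step with a Transformer block.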

Citation History

Jan 26, 2026: 33
Feb 1, 2026: 33
Feb 6, 2026: 37
Feb 13, 2026: 41