Grokking as a First Order Phase Transition in Two Layer Networks

arXiv:2310.03789 · 38 citations · #635 of 2297 papers in ICLR 2024

Abstract

A key property of deep neural networks (DNNs) is their ability to learn new features during training. This aspect of deep learning stands out most clearly in the recently reported Grokking phenomenon. While mainly reflected as a sudden increase in test accuracy, Grokking is also believed to be a phenomenon beyond lazy learning/Gaussian Process (GP) behavior, one involving feature learning. Here we apply a recent development in the theory of feature learning, the adaptive kernel approach, to two teacher-student models with cubic-polynomial and modular-addition teachers. We provide analytical predictions for the feature-learning and Grokking properties of these models and demonstrate a mapping between Grokking and the theory of phase transitions. We show that after Grokking, the state of the DNN is analogous to the mixed phase following a first-order phase transition. In this mixed phase, the DNN generates useful internal representations of the teacher that are sharply distinct from those before the transition.
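
The sketch below illustrates the modular-addition teacher-student setup the abstract describes: a two-layer (one-hidden-layer) student trained on (a + b) mod p. This is a minimal, assumed reconstruction, not the paper's implementation; the modulus, hidden width, one-hot encoding, cross-entropy loss, optimizer, and weight decay are illustrative choices commonly used in grokking experiments, and the paper's analytical treatment may use a different parameterization.

```python
# Minimal sketch (assumed hyperparameters) of a two-layer student learning
# the modular-addition teacher (a + b) mod p, the setup named in the abstract.
import torch
import torch.nn as nn

p = 23  # modulus of the addition teacher
torch.manual_seed(0)

# Teacher: every pair (a, b) labeled by (a + b) mod p, inputs one-hot encoded.
a, b = torch.meshgrid(torch.arange(p), torch.arange(p), indexing="ij")
a, b = a.reshape(-1), b.reshape(-1)
x = torch.cat(
    [nn.functional.one_hot(a, p), nn.functional.one_hot(b, p)], dim=1
).float()
y = (a + b) % p

# Train on a fraction of all pairs; grokking appears as delayed
# generalization on the held-out remainder.
perm = torch.randperm(p * p)
n_train = int(0.4 * p * p)
train, test = perm[:n_train], perm[n_train:]

# Student: two-layer network, matching the architecture class in the title.
model = nn.Sequential(nn.Linear(2 * p, 256), nn.ReLU(), nn.Linear(256, p))
opt = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1.0)
loss_fn = nn.CrossEntropyLoss()

for step in range(20000):
    opt.zero_grad()
    loss = loss_fn(model(x[train]), y[train])
    loss.backward()
    opt.step()
    if step % 1000 == 0:
        with torch.no_grad():
            acc = (model(x[test]).argmax(-1) == y[test]).float().mean().item()
        # Test accuracy typically sits near chance long after the training
        # loss is small, then jumps abruptly: the sudden increase the
        # abstract identifies with a first-order-like transition.
        print(f"step {step:6d}  train loss {loss.item():.4f}  test acc {acc:.3f}")
```

The sizeable weight decay is a deliberate choice: it is a standard ingredient for eliciting grokking in this kind of experiment, producing the long plateau at chance-level test accuracy followed by a sharp jump that the abstract maps onto a first-order phase transition.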

Citation History

Jan 28, 2026: 0
Feb 13, 2026: 38