Hyper-Connections

arXiv:2409.19606 · ICLR 2025 (#533 of 3827 papers) · 33 citations

Abstract

We present hyper-connections, a simple yet effective method that can serve as an alternative to residual connections. This approach specifically addresses common drawbacks observed in residual connection variants, such as the seesaw effect between gradient vanishing and representation collapse. Theoretically, hyper-connections allow the network to adjust the strength of connections between features at different depths and dynamically rearrange layers. We conduct experiments focusing on the pre-training of large language models, including dense and sparse models, where hyper-connections show significant performance improvements over residual connections. Additional experiments conducted on vision tasks also demonstrate similar improvements. We anticipate that this method will be broadly applicable and beneficial across a wide range of AI problems.
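To make the abstract's description concrete, here is a minimal numpy sketch of the core idea as described above: instead of a single residual stream, the network keeps several parallel streams, and learnable weights control how strongly each stream feeds the layer input and how the layer output is distributed back across streams. All names (`HyperConnection`, `beta`, `alpha_out`, `alpha_res`) and the static-weight simplification are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

class HyperConnection:
    """Hypothetical static hyper-connection sketch (not the paper's code).

    Maintains n parallel residual streams. `beta` mixes the streams into
    the layer input; `alpha_res` mixes streams into each other (identity
    = plain carry-through); `alpha_out` scatters the layer output back
    across the streams. All three would be learnable in practice."""

    def __init__(self, n_streams, dim):
        self.n = n_streams
        # Weight of each stream's contribution to the layer input.
        self.beta = np.full(n_streams, 1.0 / n_streams)
        # How strongly the layer output feeds each stream.
        self.alpha_out = np.ones(n_streams)
        # Stream-to-stream residual mixing matrix.
        self.alpha_res = np.eye(n_streams)

    def forward(self, streams, layer_fn):
        # streams: (n, dim). Collapse streams into one layer input.
        x = self.beta @ streams                 # (dim,)
        y = layer_fn(x)                         # layer output, (dim,)
        # Residual mixing of streams plus weighted broadcast of y.
        return self.alpha_res @ streams + np.outer(self.alpha_out, y)
```

With `n_streams=1` and these initial weights the update reduces to an ordinary residual connection, `x + layer_fn(x)`; larger `n` and learned weights are what give the network freedom to re-weight connections across depths, as the abstract describes.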

Citation History

Jan 26, 2026: 20
Jan 27, 2026: 20
Feb 3, 2026: 25 (+5)
Feb 13, 2026: 33 (+8)