Poster "feature learning" Papers
19 papers found
A Rainbow in Deep Network Black Boxes
Florentin Guth, Brice Ménard, Gaspar Rochette et al.
Deep Networks Learn Features From Local Discontinuities in the Label Function
Prithaj Banerjee, Harish G Ramaswamy, Mahesh Yadav et al.
Do Mice Grok? Glimpses of Hidden Progress in Sensory Cortex
Tanishq Kumar, Blake Bordelon, Cengiz Pehlevan et al.
From Linear to Nonlinear: Provable Weak-to-Strong Generalization through Feature Learning
Junsoo Oh, Jerry Song, Chulhee Yun
How Data Mixing Shapes In-Context Learning: Asymptotic Equivalence for Transformers with MLPs
Samet Demir, Zafer Dogan
Learning Hierarchical Polynomials of Multiple Nonlinear Features
Hengyu Fu, Zihao Wang, Eshaan Nichani et al.
On the Feature Learning in Diffusion Models
Andi Han, Wei Huang, Yuan Cao et al.
Revisiting Residual Connections: Orthogonal Updates for Stable and Efficient Deep Networks
Giyeong Oh, Woohyun Cho, Siyeol Kim et al.
Robust Feature Learning for Multi-Index Models in High Dimensions
Alireza Mousavi-Hosseini, Adel Javanmard, Murat A Erdogdu
TS-MOF: Two-Stage Multi-Objective Fine-tuning for Long-Tailed Recognition
Zhe Zhao, Zhiheng Gong, Pengkun Wang et al.
A Theory of Non-Linear Feature Learning with One Gradient Step in Two-Layer Neural Networks
Behrad Moniri, Donghwan Lee, Hamed Hassani et al.
Catapults in SGD: spikes in the training loss and their impact on generalization through feature learning
Libin Zhu, Chaoyue Liu, Adityanarayanan Radhakrishnan et al.
DeiT-LT: Distillation Strikes Back for Vision Transformer Training on Long-Tailed Datasets
Harsh Rangwani, Pradipto Mondal, Mayank Mishra et al.
Diffusion Models Demand Contrastive Guidance for Adversarial Purification to Advance
Mingyuan Bai, Wei Huang, Li Tenghui et al.
LoRA+: Efficient Low Rank Adaptation of Large Models
Soufiane Hayou, Nikhil Ghosh, Bin Yu
Mean-field Analysis on Two-layer Neural Networks from a Kernel Perspective
Shokichi Takakura, Taiji Suzuki
Provable Benefits of Local Steps in Heterogeneous Federated Learning for Neural Networks: A Feature Learning Perspective
Yajie Bao, Michael Crawshaw, Mingrui Liu
Provable Multi-Task Representation Learning by Two-Layer ReLU Neural Networks
Liam Collins, Hamed Hassani, Mahdi Soltanolkotabi et al.
Provably Neural Active Learning Succeeds via Prioritizing Perplexing Samples
Dake Bu, Wei Huang, Taiji Suzuki et al.